Ubuntu :: 10.04 Hangs During Large File Transfers?

Aug 1, 2010

I recently built a home media server and decided on Ubuntu 10.04. Everything is running well except when I try to transfer my media collection from other PCs where it's backed up to the new machine. Here's my build and various situations:

Intel D945GSEJT w/ Atom N270 CPU
2GB DDR2 SO-DIMM (this board uses laptop chipset)
External 60W AC adapter in lieu of internal PSU
133x CompactFlash -> IDE adapter for OS installation
2x Samsung EcoGreen 5400rpm 1.5TB HDDs formatted for Ext4

Situation 1: Transferring 200+GB of files from an old P4-based system over gigabit LAN. Files transferred at 20MBps (megabytes, so there's no confusion). Took all night but the files got there with no problem. I thought the speed was a little slow, but didn't know what to expect from this new, low-power machine.

Situation 2: Transferring ~500GB of videos from a modern gaming rig (i7, 6GB of RAM, running Windows7, etc etc). These files transfer at 70MBps. I was quite impressed with the speed, but after about 30-45 minutes I came back to find that Ubuntu had hung completely.

I try again. Same thing. Ubuntu hangs after a few minutes of transferring at this speed. It seems completely random. I've taken to transferring a few folders at a time (10GB or so); so far it has hung once and been fine the other three times. One thing I changed: my network MTU is set to 9000 instead of automatic. Could this cause Ubuntu to hang like this? When I say hang, I mean it freezes completely, requiring a reboot: the cursor stops blinking in a text field, the mouse is no longer responsive, etc.
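A non-standard MTU is a plausible suspect: if any hop (NIC, driver, or switch) does not fully support 9000-byte jumbo frames, sustained transfers can misbehave in hard-to-diagnose ways. A quick hedged check, assuming the interface is eth0 and 192.168.1.100 stands in for the sending machine:

```
# Confirm the MTU actually in effect:
ip link show eth0 | grep -o 'mtu [0-9]*'

# Test whether 9000-byte frames survive the whole path unfragmented.
# Payload is 9000 - 28 bytes (20-byte IP header + 8-byte ICMP header);
# -M do forbids fragmentation, so silence here means jumbo frames fail.
ping -c 3 -M do -s 8972 192.168.1.100
```

If that ping fails while a plain ping succeeds, dropping the MTU back to 1500 is the first experiment to try before digging into drivers.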


Ubuntu Networking :: Large File Transfers & How To Set Up A Netbook Desktop Combo

Jul 11, 2011

I tried using SSH between my netbook and desktop, but it was going to take around 30 hours to transfer 39GB over the home network. SSH also seems flaky and often drops connections. I've been messing with it all day and I'm quite frustrated. What I'm looking to do is use my netbook as more of a primary computer and the desktop as a storage computer. Not quite a server, because I'd like to still keep a GUI on it. I'd like to keep my music and movies on the desktop and stream them to the netbook (SSH is poor for this; it keeps dropping connections). I've already set up the web client for the Transmission BitTorrent client so I can torrent on a machine that's almost always on and connected.

Is there a better setup for all of this? I like the netbook because of the portability; I like the desktop because it's always connected (for torrents) and it has a larger storage capacity. It would be mainly used around the house. I would like to back up a file or two while abroad, but I'm not looking to stream music while I'm across town or anything.


Networking :: Large File Transfers Start Fast Then Drop To A Crawl?

Jul 19, 2010

I need to transfer 330G of data from a hard drive in my workstation to my NAS device. The entire network is gigabit and being run with new HP procurve switches. All machines have static IP addresses. The NAS is a Buffalo Terastation PRO which has the latest firmware, is set to jumbo frames, and has just been upgraded with 4 brand new 500G drives giving us a 1.4TB raid 5 setup. My workstation is a dual Quad core xeon box running on an Intel S5000XVN board with 8G of ram. My OS is Ubuntu 10.04 x64 running on a pair of Intel X25 SSDs in a raid mirror. The data drive is a 500G SATA drive connected to my onboard controller. The file system on the SATA drive is XFS. This problem was ongoing before I got my new workstation, before we had the GB switches, and before the NAS got new drives. When I transfer a small file or folder (less than 500M) it reaches speeds of 10-11 MB/sec. When I transfer a file or folder larger than that the speed slows to a crawl (less than 2MB/sec). It has always been this way with this NAS. Changing to jumbo frames speeds up the small transfers but makes little difference in the big ones. I verified with HP that the switches are jumbo frame capable.
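The shape of this (full speed below roughly 500MB, a crawl above) is consistent with the first burst landing in the workstation's 8GB page cache at wire speed and throughput then falling to whatever the Terastation can actually sustain, assuming the NAS share is mounted as a network filesystem. One hedged experiment is to shrink the write-back cache so the measured rate reflects the NAS from the start; the values below are illustrative, not recommendations:

```
# Run as root; typical defaults are in the 10-20 range.
sysctl -w vm.dirty_ratio=5
sysctl -w vm.dirty_background_ratio=2
```

If the fast start disappears and the whole transfer runs at ~2MB/s, the bottleneck is the NAS itself rather than the network or the workstation.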


Server :: Large FTP Transfers Hanging CentOS?

Feb 3, 2011

I've got a server running CentOS 5.5. I used the automated iptables config tool included in the operating system to allow traffic for vsftpd, Apache and UnrealIRCd. When I send large files to FTP, even from the local network, it works fine for a while and then completely times out... on everything. IRC disconnects, FTP can't find it and when I try to ping it I get "Reply from 10.1.10.134: Destination host unreachable" where ..134 is the host address for the Win7 box I'm pinging from. This is especially frustrating as it's a headless server, and as I can't SSH into it to reboot I'm forced to resort to the reset switch on the front, which I really don't like doing.

Edit: the timeouts are global, across all machines both on the local network and users connecting in from outside.
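Two hedged observations. First, stateful firewalls need the FTP helper module loaded, or FTP's separate data connections can stall partway; the sketch below covers that. Second, the machine becoming unreachable to ping from every host at once points less at iptables and more at the NIC or its driver wedging under load, which no firewall rule will fix.

```
# CentOS 5 module name for FTP connection tracking:
modprobe ip_conntrack_ftp

# Accept packets belonging to, or related to, established sessions:
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
```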


OpenSUSE Network :: 11.3 NFS Client Hangs On Large File Transfer?

Jan 2, 2011

When accessing an NFS mount for a large (200MB+) file transfer, the transfer starts rapidly, then becomes slower and slower until it hangs. On several occasions, it has frozen the client machine. Both client and server are set to default to nfs version 3. Slowdown and hang also occur when connecting to FreeBSD NFS mounts.

Presumably (I hope) there is some sort of configuration on the client that needs to be set. What should be changed in the configuration? This worked out of the box in OpenSUSE 11.0.
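There is no single documented fix here, but a common first experiment is to pin the transport and transfer sizes in the mount options rather than relying on negotiation. A hedged /etc/fstab sketch; the server name and paths are placeholders:

```
server:/export/media  /mnt/media  nfs  vers=3,proto=tcp,rsize=32768,wsize=32768,hard,intr  0  0
```

`hard,intr` makes a stalled server block retryably instead of erroring out, and forcing TCP rules out UDP fragment loss as the cause of the progressive slowdown.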


Ubuntu :: Slow File Transfers On 10.10?

Oct 30, 2010

Has anyone else noticed that file transfers on Maverick are significantly slower than on Lucid? A review of Maverick on Tom's Hardware finds that it is considerably slower. My own test says: On 10.10, file copy between two NTFS drives maxes at 30 MBytes/s; on 10.04.1, the same operations clock in at 40 MBytes/s. Both drives are capable of rw speeds of 70-90 MBytes/s, which I got on WinXP, and hdparm's cached read results back that up.

Any ideas why file transfers are so slow, and especially why Maverick is even slower than Lucid?
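To separate raw disk speed from filesystem-driver effects (NTFS goes through ntfs-3g in userspace, which costs CPU), it helps to benchmark the drives locally first. A sketch; the demo writes only 10MB, but for a meaningful figure you would use something like `bs=1M count=1024`:

```shell
# Sequential-write test; conv=fdatasync forces data to disk so the page
# cache does not inflate the reported speed.
dd if=/dev/zero of=ddtest.bin bs=1M count=10 conv=fdatasync

# For the drive's raw read speeds, "hdparm -tT /dev/sda" (as root, with
# your actual device) is the usual companion test.
```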


Ubuntu Networking :: NFS File Transfers Are Very Slow?

Jun 10, 2010

I have a server set up as an NFS share, and the share mounted on my laptop, using a Linksys Wireless-G router and a 15Mb internet connection. The laptop is on the wireless connection and the server is wired. While transferring files from the laptop to the server I only get about 55KB/s. Is this normal for Wireless-G?
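For scale: Wireless-G's nominal 54Mbit/s is a signalling rate, and real TCP throughput typically lands well under half of it, but even a pessimistic estimate leaves 55KB/s roughly 50x too slow. That points at signal quality, interference, or driver problems rather than an inherent G limit. The arithmetic:

```shell
# Theoretical payload ceiling of 802.11g in bytes per second:
echo $((54 * 1000000 / 8))            # 6750000 B/s, about 6.7 MB/s
# Typical real-world TCP throughput is nearer 40% of nominal:
echo $((54 * 1000000 / 8 * 4 / 10))   # 2700000 B/s, about 2.7 MB/s
```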



Networking :: Slow File Transfers Using Samba?

Feb 2, 2010

I am running Samba on a debian Lenny box on a wireless home network. I find that file transfers to the samba share are very slow. It takes over a minute to copy a 40MB file to the linux box, but only 20 seconds to copy the same file to a windows XP box on the same network.

Anyways, I could use a little direction on how to proceed with this, I'm really not sure where to start,
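The same file copying three times faster to an XP box over the same wireless network suggests the Samba/server side rather than the radio. Tuning advice for older Samba versions often starts with socket options; a hedged smb.conf fragment for the [global] section, worth measuring before and after since modern Samba defaults are usually already sensible:

```
[global]
   socket options = TCP_NODELAY
```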


Fedora Networking :: 12 Hangs On Transfer Of Large Files?

Dec 9, 2009

I have Fedora 12 (with all the latest patches, including the 2.6.31.6-162 kernel) installed on a new Supermicro SYS-5015A-H 1U Server [Intel Atom 330 (1.6GHz) CPU, Intel 945GC NB, Intel ICH7R SB, 2x Realtek RTL8111C-GR Gigabit Ethernet, Onboard GMA950 video]. This all works great until I try to transfer a large file over the network, then the computer hard locks, forcing a power-off reset.

Some info about my setup:

[root@Epsilon ~]# uname -a
Linux Epsilon 2.6.31.6-162.fc12.i686.PAE #1 SMP Fri Dec 4 00:43:59 EST 2009 i686 i686 i386 GNU/Linux
[root@Epsilon ~]# dmesg | grep r8169
r8169 Gigabit Ethernet driver 2.3LK-NAPI loaded

[code]....

I'm pretty sure this is an issue with the r8169 driver (what I'm seeing is somewhat reminiscent of the bug reported here). The computer will operate fine for days as a (low-volume) web server, and is reasonably stable transferring small files, but as soon as I try to transfer a large file (say, during a backup to a NAS or an NFS share), the computer will hard lock (no keyboard, mouse, etc.) at some point during the transfer. It doesn't seem to matter how the file is transferred (sftp, rsync to NFS share, etc.).
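The r8169 hard-lock reports from this era often centre on hardware offload handling in the driver. A hedged work-around sketch, useful mainly to narrow down the cause (the interface name and the offload list are assumptions, not a documented fix):

```
# Disable hardware offloads so the NIC does less work per large transfer:
ethtool -K eth0 tso off gso off gro off
```

If the lock-ups stop, the in-kernel r8169 driver is the place to keep digging; some users in this situation also test Realtek's out-of-tree r8168 vendor driver as a replacement.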


Ubuntu :: SANE / HPLIP And F300 Hangs With Large Scan Areas

May 5, 2011

SANE + HPLIP 3.11.3a-1 + Tomato v1.28.8754 ND USB Std + HP Deskjet F380 (F300 series) only works with small scan areas. This error affects SANE running on my Tomato router, and therefore every computer that tries to use that scanner, regardless of platform; in fact, it affects scans originating from the router itself. I believe it would also happen on Ubuntu if I had the same configuration as the router; it seems to be an error that would occur on any platform with the same SANE/HPLIP settings and hardware (HP Deskjet F380). The problem is also related to DPI, but I can't manage to get one full page even at 75 DPI, which is the lowest possible. If I select a small area (around 1/4 of the full glass) it will sometimes scan.

If I increase the DPI, it won't scan; I get an error during device I/O, even if I scan directly from scanimage. I would like to have the scanner function of my AiO printer working, at least to scan the full glass at low resolutions (<=150). I can't load previews either. I thought it could be a memory problem, since it does scan small areas, but it is not: memory usage doesn't increase much during the scan. I even created a 128MB swap partition on my flash drive (I was wary of doing so, since it would wear the drive quickly, and will probably remove it now), but it didn't change anything; the scan still fails even with more than 55% of the router's memory (16MB) free and more than 97% of swap free.

Here are my configuration files:
/opt/etc/sane.d/dll.conf

Code:
hpaio
/opt/etc/sane.d/saned.conf

Code:
192.168.0.0/24
/opt/etc/xinetd.d/saned

Code:
service saned
{
port = 6566
socket_type = stream
server = /opt/sbin/saned
protocol = tcp
user = root
group = root
wait = no
disable = no
}
/opt/etc/xinetd.conf

Code:
# Copyright 1999-2004 Gentoo Foundation
# Distributed under the terms of the GNU General Public License v2
# Sample configuration file for xinetd
defaults
{
only_from = localhost 192.168.0.0/24
instances = 60
log_type = FILE /opt/var/xinetd.log
log_on_success = HOST PID
log_on_failure = HOST
cps = 25 30
}
includedir /opt/etc/xinetd.d

Everything is configured properly; I have dbus and CUPS installed, since both are required for HPLIP to work properly. Here is some other output that shows what happens:
sane-find-scanner

Code:
# sane-find-scanner will now attempt to detect your scanner. If the result is different from what you expected, first make sure your scanner is powered up and properly connected to your computer.

# No SCSI scanners found. If you expected something different, make sure that you have loaded a kernel SCSI driver for your SCSI adapter. Also you need support for SCSI Generic (sg) in your operating system. If using Linux, try "modprobe sg".

found USB scanner (vendor=0x03f0 [HP], product=0x5511 [Deskjet F300 series]) at libusb:001:006
# Your USB scanner was (probably) detected. It may or may not be supported by SANE. Try scanimage -L and read the backend's manpage. Not checking for parallel port scanners.

# Most Scanners connected to the parallel port or other proprietary ports can't be detected by this program.
scanimage -L

Code:
device `hpaio:/usb/Deskjet_F300_series?serial=CN6BSGK1M104KH' is a Hewlett-Packard Deskjet_F300_series all-in-one
hp-probe

Code:
root@white:/opt/bin# ./hp-probe -busb
warning: python-dbus not installed.
warning: hp-probe should not be run as root/superuser.

HP Linux Imaging and Printing System (ver. 0.0.0)
Printer Discovery Utility ver. 4.1

Copyright (c) 2001-9 Hewlett-Packard Development Company, LP. This software comes with ABSOLUTELY NO WARRANTY. This is free software, and you are welcome to distribute it under certain conditions. See COPYING file for more details.

Found 1 printer(s) on the 'usb' bus.
Done.
scanimage > test.jpg

Code:
scanimage: sane_start: Error during device I/O
saned -d5
[saned] process_request: waiting for request
saned doesn't report any error, but starts waiting for another request.


Slackware :: Amarok Hangs At 10% When Scanning Large Music Collection?

Jun 6, 2010

After installing Slackware 13.1 I start up Amarok and configure the settings, and when it starts to scan the folder it either hangs at 10%, stops responding altogether, or crashes. The library is about 130 gigs of MP3s. I do not know where to start on this one. Amarok version 2.3.0.


CentOS 5 :: 5.2 Installation Hangs On Server With Large Number Of Drives?

Apr 6, 2009

I am attempting to upgrade a system from 4.7 to 5.2 using a (now) DVD drive attached to the onboard IDE. Originally I had tried using a remote NFS image and a USB stick, but I thought maybe there was a problem with the image. I can get as far as selecting the keyboard for the system, and then the installer freezes and never goes any further. It doesn't appear to be a kernel panic, since I can still switch between consoles.

I've got an MSI K9NGM2-FID with 14 drives in it. It serves as a file server for our backup server. It's got a secondary 4-port Silicon Image SII 3114 SATA card using the sata_sil module, and an old IDE Promise FastTrak TX2000. Technically I could have 16 drives, but the 750W PSU is walking a fine line on tripping its self-breaker with the 14 drives and 7 fans. I would like to NOT have to disconnect all of this to do the upgrade.

I thought maybe that running the install using the "noprobe" option would help so it didn't detect and load the modules for the Silicon Image or the Promise cards and detect all of the drives but it still gets stuck on the step after selecting the keyboard. The installation info console and the dmesg console don't really provide any useful information. The installation console says:

INFO : moving (1) to step welcome
INFO : moving (1) to step language
INFO : moving (1) to step keyboard
INFO : moving (1) to step findrootparts

And the last lines of the dmesg console says:

<6>device-mapper: multipath: version 1.0.5 loaded
<6>device-mapper: multipath round-robin: version 1.0.0 loaded
<6>device-mapper: multipath emc: version 0.0.3 loaded

Is there a hidden "debug" option that will turn on a lot of extra logging?


Fedora :: File Transfers (cp, Mv And Rsync) Slow System Tremendously

Aug 24, 2011

Whenever I transfer large files using cp, mv, rsync or Dolphin, the system will slow down to the point that it's unusable. It will sometimes hang completely and not accept any input until the file is transferred. I have no clue what could be causing this problem, but I know it shouldn't be happening. I am using Fedora 15 (2.6.40.3-0.fc15.x86_64) with KDE 4.6. I have a Phenom II 955 processor, 6GB of system RAM, and the OS and swap file are on an 80GB SSD. Copying files within the SSD doesn't cause any problem, but moving files between my other two large HDDs causes the extreme slowdown. Using htop I can see that my system load jumps to between 3 and 4, but my RAM and CPU usage stay low during the transfer. Here are two commands that take about 10 mins to run and make the system unusable while running; each usually transfers around 2-20GB worth of data:

cp -a /media/data.1.5/backup/Minecraft/backups/* /media/data.0.5/backup/Minecraft/backups/
rsync -a /media/data.1.5/backup/ /media/data.0.5/backup/
/media/data.1.5/ is the mount point for a 1.5 TB internal SATA drive, and /media/data.0.5/ is the mount point for a 500 GB internal SATA drive.


Ubuntu Servers :: Samba Configuration - File Transfers Between The Machines Are Extremely Slow

Nov 1, 2010

1. When I'm not logged into the server, only the shares are visible on my Windows computer. Clicking on the share folder displays an error message. As soon as I log in at the server, the files within the shares become accessible on the Windows box.

2. File transfers between the machines are extremely slow. Watching the system monitor, there's a brief burst of network activity followed by 10-30 seconds of nothing...on a gigabit network, the effective transfer rate is ~120kbs. There's no other network activity going on that would account for this behavior.


Ubuntu Networking :: Slow File Transfers (FTP, SCP) With Intel Gigabit Or Atheros WLAN?

Nov 16, 2010

I just installed Ubuntu as my primary OS; I still have the disk with XP on it, but I don't want to go back. However, I need faster network connectivity. I have a T60p with Intel Gigabit jacked into my Gigabit router, which also has my desktop (running XP) and my NAS. If I FTP files from my NAS (or SCP), I get transfer speeds around 250-500 KB/s (which is not very fast). On this same switch, from my XP desktop I get transfer speeds around 12 MB/s. I get the same speeds using my 802.11n card (Atheros) as with the Ethernet NIC (250-500 KB/s). The drivers for the Ethernet card and the Atheros card are e1000e and ath9k respectively. I have disabled IPv6. Since the problem occurs using either interface, I am just going to concentrate on fixing it for the Ethernet interface (since I believe it to be a system-wide problem).

Code:

skinnersbane@albert:~$ sudo ethtool eth0
Settings for eth0:
Supported ports: [ TP ]
Supported link modes: 10baseT/Half 10baseT/Full

[code]....

Clearly my card is running at Gigabit, but why the bad transfer speeds? I am using filezilla for FTP (technically FTPES). I closed every other program. My CPU utilization does seem high and I wonder if this is part of the problem. I had no problems with throughput using either interface in Windows XP just one week ago.


Ubuntu Networking :: D-link DWA-552 Master Mode On 10.04 - File Transfers Or Bandwidth Consuming Processes

May 20, 2010

after successfully configuring the dwa-552 to work in master mode in ubuntu 10.04 (ath9k driver) I ran some file transfer tests. The download speed is very good (~50mbps) but the upload speed spikes at about 10-20mbps for the first few KB and then it's nonexistent (0-1kbps). This only affects file transfers or otherwise bandwidth consuming processes. Normal web browsing or ssh is not affected. After running a speedtest of my internet connection which is routed through the AP I could upload to the internet with 1mbps which is my inet connection maximum so apparently this is not affected. Tried the same file transfers with netcat to eliminate any other factors and had the same problem. dmesg and hostapd debug did not report anything unusual


Ubuntu Multimedia :: Large *.mp4 Files In Gtkpod - Hangs At "Copying Tracks" At 0%

Jul 27, 2010

I am dual booting XP and Ubuntu 10.04, but in the future I will be getting a new machine and I will only be running Ubuntu and won't have access to iTunes. Because I have an iPod Touch, I have been trying to find workarounds for syncing everything that iTunes took care of in the past. One problem I have is managing movies. I have looked through various media players/iPod management tools (Amarok, Rhythmbox, gtkpod) and I am using Rhythmbox to sync my music and and attempting to use gtkpod to sync my movies.

gtkpod is able to sync songs (tested with a few-minute test clip) and short *.mp4 files (15MB, I know for sure, from a test). I am unable, however, to get it to sync a movie (~700MB). I am able to drag it onto my iPod in gtkpod, but when I try to save the changes and write the files, it hangs at "Copying Tracks" at 0%. It eventually crashes, during the couple of times I have tried to wait it out. So this being my situation, my question is: is there a size limit to the *.mp4 files I can sync to my iPod Touch via gtkpod? Are there any other tools that I can sync videos to my iPod with?


General :: View A Particular Ten Lines In A Large File Where Can't Open The File In Vi

May 12, 2010

I am using RHEL 5. I have a very large text file which cannot be opened in vi. The file has some 8000 lines. I need to view the lines from 5680 to 5690. How can I view these particular lines in a large file? What command and options do I need to use?
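No editor is needed for this; sed prints an arbitrary line range directly. A sketch using a generated stand-in file (note that 5680 through 5690 inclusive is eleven lines):

```shell
# Stand-in for the real file: 8000 numbered lines.
seq 8000 > bigfile

# Print lines 5680 through 5690 and nothing else:
sed -n '5680,5690p' bigfile

# Equivalent with head and tail:
head -n 5690 bigfile | tail -n 11
```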


Ubuntu :: "Error Splicing File: File Too Large" - Copy To A 500G USB Drive?

Sep 10, 2010

I have seen this three times now. It's an updated Lucid install with ext4, trying to copy to a 500G USB drive.


Ubuntu :: Large .tar.gz File That Trying To Extract?

Jan 4, 2011

I've got a large .tar.gz file that I am trying to extract. I have had a look around at the problem, and it seems other people have had it too, but I've tried their solutions and they haven't worked. The command I am using is:

Code:
tar zxvf file.tar.gz
and the error is:

[code]...


Ubuntu :: Copying A Large File From The Network?

Feb 17, 2010

I am trying to copy a file from a network resource on my Local Area Network, which is about 4.5 GB. I copy this file through GNOME copying utilities by first going to Places --> Network and then selecting the Windows Share on another computer on my network. I open it and start copying the file to my FAT32 drive by Ctrl+C and Ctrl+V. It copies well up-to 4 GB and then it hangs.

After trying it almost half a dozen times I got really annoyed and left it hung and went to bed. Next morning when I checked a message box saying "file too large to write" has appeared.

I am very annoyed. I desperately need that file. It's an ISO image and it is not damaged at all; it copies fine to any Windows system. Also, I have sufficient space on the drive to which I am trying to copy the file.
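A copy to a FAT32 drive failing at the 4GB mark is the filesystem's per-file ceiling rather than damage to the ISO: FAT32 records file sizes in a 32-bit field. Reformatting the destination as NTFS (usable from Ubuntu via ntfs-3g) or ext3 sidesteps the limit. The arithmetic, under that assumption:

```shell
# Largest file FAT32 can represent: one byte short of 4 GiB.
echo $((4 * 1024 * 1024 * 1024 - 1))   # 4294967295 bytes
```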


Ubuntu :: Bzip2 Compressed File Too Large

Feb 26, 2010

I have been having a recurring problem backing up my filesystem with tar, using bzip2 compression. Once the file reached a size of 4GB, an error message appeared saying that the file was too large (I closed the terminal, so I do not have the exact message; is there a way to retrieve it?). I was under the impression that bzip2 can support pretty much any size of file. It's rather strange: I have backed up files of about 4.5GB before without trouble.

At the same time, I have had this problem before, and it's definitely not a memory problem: I am backing up onto a 100G external hard drive.

That reminds me, in fact (I hadn't thought of this), that one time I tried to move an archived backup of about 4.5GB to an external drive (it may have been the same one) and it said that the file was too large. Could it be that there is a maximum file size I can transfer to the external drive in one go? Before I forget: I have Ubuntu Karmic, and my bzip2 version is 1.0.5 (and tar 1.22, though maybe this is superfluous information).
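bzip2 itself has no 4GB ceiling, so a hard failure at 4GB almost always means the destination filesystem (typically FAT32, as external drives are commonly sold) cannot hold a file that large. If reformatting the external drive is not an option, the archive can be streamed through split so no piece crosses the limit. A sketch on demo data; with a real backup the split size would be something like `-b 3900m`:

```shell
# Demo tree standing in for the real filesystem backup:
mkdir -p data && echo hello > data/a.txt

# Archive straight into limit-sized pieces:
tar cjf - data | split -b 1k - backup.tar.bz2.part-

# Restore by concatenating the pieces back into tar:
mkdir -p restore
cat backup.tar.bz2.part-* | tar xjf - -C restore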


Ubuntu :: 699MB 10.04 File Too Large For 700MB CD?

May 2, 2010

I am attempting to burn the ISO for Lucid Lynx final onto a 700MB CD. The ISO file is 699MB, but Windows reports that the size on disk is 733MB and thus CD Burner XP refuses to burn the file, stating that it's too large for a CD.

Why this discrepancy in file sizes? I've noticed this with other files as well; suddenly it's a bit of a problem, as you can see!
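One plausible explanation (an assumption, since it depends on which tool reported which number) is the mebibyte/megabyte split: 699 "MB" as commonly displayed is 699MiB, and the very same bytes expressed in decimal megabytes come to about 733. A standard 80-minute "700MB" CD actually holds 737,280,000 bytes (about 703MiB), so the image should fit on it:

```shell
# 699 MiB in bytes:
echo $((699 * 1024 * 1024))             # 732954624
# The same size in decimal megabytes (rounds to the "733MB" Windows shows):
echo $((699 * 1024 * 1024 / 1000000))   # 732
```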


Ubuntu Servers :: Large File Transfer On LAN?

Nov 11, 2010

I'm trying to create an Ubuntu Server file server that will handle large file transfers (up to 50GB) from the LAN with Windows clients. We've been using a Windows server on our LAN, and the file transfers will occasionally fail, though the server is used for other services as well.

The files will be up to 50gb. My thoughts are to create a VLAN (or separate physical switch) to ensure maximum bandwidth. Ubuntu server will be 64bit with 4tb of storage in a RAID 5 config.


Ubuntu :: Error "File Too Large" Copying 7.3gb File To USB Stick

Nov 24, 2010

I am trying to copy a 7.3gb .iso file to an 8gb USB stick and I get the following error when it hits 4.0gb

Error while copying "xxxxxx.iso": "There was an error copying the file into /media/6262-FDBB. Error splicing file: File too large." The file is to be used by a Windows user, and I'm just trying to do a simple copy, not a burn to USB or anything fancy. Using 10.4.1 LTS, AMD dual core, all latest patches.
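"Error splicing file: File too large" at exactly 4.0GB, together with the bare volume label in /media/6262-FDBB, suggests a FAT32-formatted stick, whose per-file limit is just under 4GiB. Since the stick must stay Windows-readable, either reformat it as NTFS, or carry the image in pieces and rejoin it on the far side. A sketch of the second option on demo data; for the real image the split size would be something like `-b 3900m`:

```shell
# Demo file standing in for the 7.3GB ISO:
printf 'abcdefghij' > demo.iso

# Cut into pieces small enough for FAT32:
split -b 4 demo.iso iso.part-

# Rejoin on the destination machine (on Windows, "copy /b" also works):
cat iso.part-* > rejoined.iso
cmp demo.iso rejoined.iso
```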


Server :: NFS Large File Copies Fail - Error Writing To File: Input/output Error?

Jun 27, 2009

I recently upgraded my file/media server to Fedora 11. After doing so, I can no longer copy large files to the server. The files begin to transfer, but typically after about 1gb of the file has transferred, the transfer stalls and ultimately fails with the message:

"Error writing to file: Input/output error"

I've run out of ideas as to what could cause this problem. I have tried the following:

1. Different NFS versions: NFS3 and NFS4
2. Tried copying the files to different physical drives on the server.
3. Tried copying the files from different physical drives on the client.
4. Tried different rsize and wsize block sizes when mounting the NFS share
5. Tried copying the files via a different protocol. SSH in this case. The file transfers are always successful when I use SSH.

Regardless of what I do, the result is the same. The file transfers always fail after approximately 1gb.

Some other notes.

1. Both the client and the server are running Fedora 11 kernel 2.6.29.5-191.fc11.x86_64

I am out of ideas. Has anyone else experienced something similar?


Ubuntu :: Can't Copy A Large 30gig Image File?

Jan 3, 2010

I have some large image files that are 30 gig and more. I am running Ubuntu 9.10, and whenever I try to copy one of these files to another drive I get an error saying the file is too large. Copying from an external hard drive or from a slave drive does the same thing. I have a friend who has expressed the same issue. This must be a widespread bug.


Ubuntu :: File System Format For Mac OSX 10.5 For Large Files?

Sep 19, 2010

Is there a file system that both Mac OS X 10.5 and Linux can read/write for large files (like 4GB files)? My desktop is Ubuntu and I run most things from there, but I want to back up my MacBook and Linux box to the same external hard drive. It seems there are some (paid) apps for Mac that will mount NTFS, but I'm wondering if there is just a shared file system that will work for both.


Ubuntu :: Large File Size With Xsane Save To PDF?

Jun 17, 2011

CanoScan LiDE 210 running under 10.10 on a Toshiba Tecra M11-130 laptop. Currently trying out xsane to archive some paperwork in monochrome, as the bundled Simple Scan utility can only save in either colour or greyscale. The problem is that the same A4 page saved as monochrome has a file size about three times larger in Ubuntu than in Windows.

The scan mode options are either 'Colour', 'Greyscale' or 'Lineart'. There is no 'halftone' setting available, as shown in some of the xsane manuals; I don't know whether this is significant to this issue. Xsane's main option window shows 3508 x 2480 x 1 bit for a 300 dpi A4 monochrome scan, but the intermediate file size is 8.3MB instead of just over 1MB before packing into the PDF. This is consistent with each pixel being recorded not as a single 1 or 0, but as a greyscale 11111111 or 00000000, i.e. stored in an eight-bit field. How can I tweak xsane to produce true monochrome intermediate .pnm files and saved PDFs?
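The numbers in the post check out: at one bit per pixel the page should be just over 1MB, and the same pixels stored one byte apiece come to the ~8.3MiB observed, so xsane does appear to be writing the 'lineart' scan as 8-bit data in the intermediate .pnm:

```shell
# 3508 x 2480 pixels at a true 1 bit per pixel:
echo $((3508 * 2480 / 8))   # 1087480 bytes, just over 1 MB
# The same pixels stored as one 8-bit byte each:
echo $((3508 * 2480))       # 8699840 bytes, about 8.3 MiB
```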

Copyrights 2005-15 www.BigResource.com, All rights reserved