Networking :: Large File Transfers Start Fast Then Drop To A Crawl?
Jul 19, 2010
I need to transfer 330G of data from a hard drive in my workstation to my NAS device. The entire network is gigabit and runs on new HP ProCurve switches. All machines have static IP addresses. The NAS is a Buffalo TeraStation Pro with the latest firmware, set to jumbo frames, and just upgraded with 4 brand new 500G drives, giving us a 1.4TB RAID 5 setup. My workstation is a dual quad-core Xeon box on an Intel S5000XVN board with 8G of RAM. My OS is Ubuntu 10.04 x64 running on a pair of Intel X25 SSDs in a RAID mirror. The data drive is a 500G SATA drive connected to my onboard controller; its file system is XFS.

This problem was ongoing before I got my new workstation, before we had the gigabit switches, and before the NAS got new drives. When I transfer a small file or folder (less than 500M) it reaches speeds of 10-11 MB/sec. When I transfer a file or folder larger than that, the speed slows to a crawl (less than 2 MB/sec). It has always been this way with this NAS. Changing to jumbo frames speeds up the small transfers but makes little difference on the big ones. I verified with HP that the switches are jumbo frame capable.
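Before blaming the switches or frames, it helps to benchmark the local disk separately from the network leg. A minimal sketch (file sizes and paths are examples, not from the post):

```shell
# Create a 100 MB test file, then read it back; dd reports throughput itself,
# giving a local-disk baseline to compare against the ~2 MB/s seen to the NAS.
dd if=/dev/zero of=/tmp/xfer_test.bin bs=1M count=100 conv=fsync 2>/dev/null
dd if=/tmp/xfer_test.bin of=/dev/null bs=1M    # prints MB/s on stderr

# Then repeat the write against the NAS mount (mount point is an example):
#   dd if=/tmp/xfer_test.bin of=/mnt/nas/xfer_test.bin bs=1M conv=fsync
```

If the local read stays fast while the NAS write collapses past a few hundred megabytes, the bottleneck is the TeraStation (its RAID 5 write speed or its write cache filling), not the workstation or switches.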
View 2 Replies
Jul 11, 2011
I tried using SSH between my netbook and desktop, but it was going to take around 30 hours to transfer 39GB over the home network. Also, SSH is very sketchy and often drops connections. I've been messing with it all day and I'm quite frustrated. What I'm looking to do is use my netbook as more of a primary computer and the desktop as a storage computer. Not quite a server, because I'd like to still keep a GUI on it. I'd like to be able to keep my music and movies on the desktop and stream them to the netbook (SSH sucks for this, always drops connections). I've already set up the web client for the Transmission BitTorrent client so I can torrent on a machine that's almost always on and connected.
Is there a better setup for all of this? I like the netbook because of the portability; I like the desktop because it's always connected (for torrents) and it has a larger storage capacity. It would be mainly used around the house. I would like to back up a file or two while abroad, but I'm not looking to stream music while I'm across town or anything.
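One thing worth trying before abandoning SSH-based copies: stream the whole tree as a single tar pipe instead of per-file scp, which removes per-file round-trip overhead. A sketch, demonstrated local-to-local (the hostname and media paths are examples):

```shell
# Copy a directory as one continuous tar stream. Over the network you would
# insert ssh in the pipe, e.g.:
#   tar -C ~/Music -cf - . | ssh desktop 'tar -C /srv/media/Music -xf -'
# ("desktop" and the paths are hypothetical). Shown locally here:
mkdir -p /tmp/pipe_src /tmp/pipe_dst
echo "hello" > /tmp/pipe_src/a.txt
tar -C /tmp/pipe_src -cf - . | tar -C /tmp/pipe_dst -xf -
cat /tmp/pipe_dst/a.txt
```

For streaming media around the house, an NFS or Samba share on the desktop is usually a better fit than SSH, since players can seek within files instead of copying them first.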
View 2 Replies
View Related
Aug 1, 2010
I recently built a home media server and decided on Ubuntu 10.04. Everything is running well except when I try to transfer my media collection from other PCs where it's backed up to the new machine. Here's my build and various situations:
Intel D945GSEJT w/ Atom N270 CPU
2GB DDR2 SO-DIMM (this board uses laptop chipset)
External 60W AC adapter in lieu of internal PSU
133x CompactFlash -> IDE adapter for OS installation
2x Samsung EcoGreen 5400rpm 1.5TB HDDs formatted ext4
Situation 1: Transferring 200+GB of files from an old P4-based system over gigabit LAN. Files transferred at 20MBps (megabytes, so there's no confusion). Took all night but the files got there with no problem. I thought the speed was a little slow, but didn't know what to expect from this new, low-power machine.
Situation 2: Transferring ~500GB of videos from a modern gaming rig (i7, 6GB of RAM, running Windows7, etc etc). These files transfer at 70MBps. I was quite impressed with the speed, but after about 30-45 minutes I came back to find that Ubuntu had hung completely.
I try again. Same thing. Ubuntu hangs after a few minutes of transferring at this speed. It seems completely random. I've taken to transferring a few folders at a time (10GB or so), and so far it has hung once and been fine the other three times. Now, I have my network MTU set from automatic to 9000. Could this cause Ubuntu to hang like this? When I say hang, I mean it freezes completely, requiring a reboot. The cursor stops blinking in a text field, the mouse is no longer responsive, etc.
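An MTU of 9000 is a plausible suspect: jumbo frames only work if every hop agrees on them. A quick sketch for checking what each interface is actually using (the ping probe needs a live peer, so it is shown as a comment with an example address):

```shell
# Print the MTU each interface is actually using; 9000 must hold end to end
# or large frames get dropped or fragmented.
for iface in /sys/class/net/*; do
    printf '%s mtu %s\n' "$(basename "$iface")" "$(cat "$iface/mtu")"
done

# To probe whether 9000-byte frames survive the path, ping with the
# don't-fragment bit set: 8972 = 9000 minus 28 bytes of IP+ICMP headers.
# (Peer address is an example.)
#   ping -M do -s 8972 -c 3 192.168.1.10
```

If the probe fails, dropping the MTU back to 1500 (or the automatic setting) is an easy way to test whether jumbo frames are implicated in the hangs.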
View 4 Replies
View Related
Feb 3, 2011
I've got a server running CentOS 5.5. I used the automated iptables config tool included in the operating system to allow traffic for vsftpd, Apache and UnrealIRCd. When I send large files to FTP, even from the local network, it works fine for a while and then completely times out... on everything. IRC disconnects, FTP can't find it and when I try to ping it I get "Reply from 10.1.10.134: Destination host unreachable" where ..134 is the host address for the Win7 box I'm pinging from. This is especially frustrating as it's a headless server, and as I can't SSH into it to reboot I'm forced to resort to the reset switch on the front, which I really don't like doing.
Edit: the timeouts are global, across all machines both on the local network and users connecting in from outside.
View 4 Replies
View Related
Jun 10, 2010
I have a server set up as an NFS share, and the share mounted on my laptop, using a Linksys wireless-G router and a 15mb internet connection. The laptop is on the wireless connection and the server is wired. While transferring files from the laptop to the server I only get about 55kb/s. Is this normal for wireless G?
View 1 Replies
View Related
Feb 2, 2010
I am running Samba on a debian Lenny box on a wireless home network. I find that file transfers to the samba share are very slow. It takes over a minute to copy a 40MB file to the linux box, but only 20 seconds to copy the same file to a windows XP box on the same network.
Anyways, I could use a little direction on how to proceed with this, I'm really not sure where to start,
View 9 Replies
View Related
Nov 16, 2010
I just installed Ubuntu as my primary OS, but I have the disk with XP on it and I don't want to go back, but I need faster network connectivity. I have a T60p with Intel Gigabit jacked into my gigabit router, which also has my desktop (running XP) and my NAS. If I FTP files from my NAS (or SCP), I get transfer speeds around 250-500 KB/s (which is not very fast). On this same switch, from my XP desktop I get transfer speeds around 12 MB/s. I get the same speeds using my 802.11n card (Atheros) as with the ethernet NIC (250-500 KB/s). The drivers for the ethernet card and the Atheros card are e1000e and ath9k respectively. I have disabled IPv6. Since the problem occurs using either interface, I am just going to concentrate on fixing it for the Ethernet interface (since I believe it to be a systemwide problem).
Code:
skinnersbane@albert:~$ sudo ethtool eth0
Settings for eth0:
Supported ports: [ TP ]
Supported link modes: 10baseT/Half 10baseT/Full
[code]....
Clearly my card is running at Gigabit, but why the bad transfer speeds? I am using filezilla for FTP (technically FTPES). I closed every other program. My CPU utilization does seem high and I wonder if this is part of the problem. I had no problems with throughput using either interface in Windows XP just one week ago.
View 1 Replies
View Related
May 20, 2010
After successfully configuring the DWA-552 to work in master mode in Ubuntu 10.04 (ath9k driver), I ran some file transfer tests. The download speed is very good (~50mbps) but the upload speed spikes at about 10-20mbps for the first few KB and then it's nonexistent (0-1kbps). This only affects file transfers or other bandwidth-consuming processes. Normal web browsing or SSH is not affected. After running a speedtest of my internet connection, which is routed through the AP, I could upload to the internet at 1mbps, which is my connection's maximum, so apparently that is not affected. I tried the same file transfers with netcat to eliminate any other factors and had the same problem. dmesg and the hostapd debug output did not report anything unusual.
View 2 Replies
View Related
Mar 26, 2010
I've started having problems with large file downloads. A download will start and after a while freeze. The downloads window reports the correct connection speed and gives an estimated time to complete, but it stays frozen. Small downloads, torrents and surfing are not affected. I can do everything else normally even when the download is frozen. I've checked with my ISP and everything with my equipment checks out.
View 9 Replies
View Related
May 30, 2010
When copying files to USB drives, the file progress bar moves in 'bursts', sometimes doing nothing for long periods, then moving forward quickly and stopping again. It's almost like it is showing the transfer to the cache, not the transfer to the actual drive.
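That is essentially what happens: the kernel accepts writes into the page cache in RAM and flushes them to the slow USB device in the background, which produces the bursty progress bar. A sketch for inspecting the writeback knobs involved:

```shell
# The progress bar tracks writes into the page cache; the kernel flushes dirty
# pages to the USB drive in bursts. These knobs control how much dirty data
# may accumulate before writeback kicks in:
cat /proc/sys/vm/dirty_ratio             # % of RAM dirty before writers block
cat /proc/sys/vm/dirty_background_ratio  # % at which background flushing starts

# Lowering them (e.g. sysctl -w vm.dirty_background_ratio=1) makes the bar
# track the device more honestly; running 'sync' after the copy guarantees
# everything has actually reached the drive before you unplug it.
```

The suggested sysctl values above are illustrative, not tuned recommendations.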
View 5 Replies
View Related
Aug 4, 2010
I started unarchiving a RAR file that is several gigabytes big. The computer is now going really slow; it's almost frozen. Sometimes I can move the mouse a little, but that's it. The unarchiving process seems to have halted, so now all I can do is restart the system. I don't think I can unarchive this file in Linux.
View 5 Replies
View Related
May 2, 2011
I have a linux server that has seemingly random network slow downs. This server is mainly my dvr. I'm starting to think it's a hardware problem, but that's just a gut feeling. I don't really know how to determine if it isn't.
Slow summary: SSH, HTTP, VNC incoming traffic are all affected
Outgoing traffic seems ok. I haven't tested this as much.
Rebooting mostly helps.
Stopping/starting network doesn't help
Load average is below 1.0
Updated Kernel with no change
[Code]...
View 6 Replies
View Related
Oct 30, 2010
Has anyone else noticed that file transfers on Maverick are significantly slower than on Lucid? A review of Maverick on Tom's Hardware finds that it is considerably slower. My own test says: On 10.10, file copy between two NTFS drives maxes at 30 MBytes/s; on 10.04.1, the same operations clock in at 40 MBytes/s. Both drives are capable of rw speeds of 70-90 MBytes/s, which I got on WinXP, and hdparm's cached read results back that up.
Any ideas why file transfers are so slow, and especially why Maverick is even slower than Lucid?
View 1 Replies
View Related
May 25, 2011
Has anyone else noticed that file transfers on Maverick are significantly slower than on Lucid? A review of Maverick on Tom's Hardware finds that it is considerably slower. My own test says: On 10.10, file copy between two NTFS drives maxes at 30 MBytes/s; on 10.04.1, the same operations clock in at 40 MBytes/s. Both drives are capable of rw speeds of 70-90 MBytes/s, which I got on WinXP, and hdparm's cached read results back that up. Any ideas why file transfers are so slow, and especially why Maverick is even slower than Lucid?
View 9 Replies
View Related
Sep 6, 2010
I have a number of computers working just fine on my wireless network, so I don't believe it is a router issue. I am running 10.04 on a Dell GX280. I plug in the WUSB54G and Ubuntu sees it and it connects to the internet just as it should. I can surf the internet through different browsers, and although a little sluggish, it works just fine. The problem is when I download a file. It starts out working as it should but continues to slow down to a crawl. Update Manager had a 54MB update; it started out downloading, but then said it would take 2 hours so I just cancelled. I have several WUSB54Gs and they all have the same issue. I changed to a different type of adapter, and the update manager only took a couple of minutes.
View 3 Replies
View Related
Aug 24, 2011
Whenever I transfer large files using cp, mv, rsync or Dolphin, the system will slow down to the point that it's unusable. It will sometimes hang completely and not accept any input until the file is transferred. I have no clue what could be causing this problem, but I know it shouldn't be happening. I am using Fedora 15 (2.6.40.3-0.fc15.x86_64) with KDE 4.6. I have a Phenom II 955 processor, 6 GB of system RAM, and the OS and swap are on an 80 GB SSD. Copying files on the SSD doesn't cause any problem, but moving files between my other two large HDDs causes the extreme slowdown. Using htop I can see that my system load jumps to between 3 and 4, but my RAM and CPU usage stay low during the transfer. Here are two commands that take about 10 mins to run and make the system unusable while running; they usually transfer around 2-20GB of data:
cp -a /media/data.1.5/backup/Minecraft/backups/* /media/data.0.5/backup/Minecraft/backups/
rsync -a /media/data.1.5/backup/ /media/data.0.5/backup/
/media/data.1.5/ is the mount point for a 1.5 TB internal SATA drive, and /media/data.0.5/ is the mount point for a 500 GB internal SATA drive.
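A commonly suggested mitigation for this symptom is to run the bulk copy at idle CPU and I/O priority, so interactive processes keep winning the disk. A sketch using ionice from util-linux, demonstrated on a small temp file (substitute your real source and destination paths):

```shell
# Run a bulk copy in the "idle" I/O class (-c3, honored by the CFQ/BFQ
# schedulers) and at the lowest CPU priority, so the desktop stays responsive.
dd if=/dev/zero of=/tmp/bulk_src.bin bs=1M count=10 2>/dev/null
ionice -c3 nice -n 19 cp /tmp/bulk_src.bin /tmp/bulk_dst.bin
```

The same wrapper works in front of rsync, e.g. `ionice -c3 nice -n 19 rsync -a SRC/ DST/`. This doesn't fix the underlying cause (often dirty-page writeback pressure), but it usually keeps the machine usable during the transfer.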
View 6 Replies
View Related
Jun 6, 2010
I have a problem that I can't seem to fix. When I try to transfer a large file, let's say 700Mb or so, my wireless shuts down and I have to restart my Ubuntu computer; the other computer is Vista. Ubuntu is on a WUSB54G ver4 and Vista is going through a WRT54G-TM running DD-WRT mega. I have tried everything I know with no luck.
View 4 Replies
View Related
Dec 31, 2010
A little over a year ago I was using SCP to successfully transfer large files over my LAN (exact same hardware). I can't seem to do this any more, and I'm not sure why. I think it's either something with iptables, or a network card driver problem. I use the same driver on both computers (b43 wireless). I can't do FTP transfers either: they start going, but quickly stall. I've used Firestarter (an iptables GUI) to allow all the correct connections. One last thing: when I tried to connect over SSH using an Alfa wireless card (not sure of the drivers), I couldn't even connect to SSH, period. The same settings were used.
View 1 Replies
View Related
Nov 1, 2010
1. When I'm not logged into the server, only the shares are visible on my Windows computer. Clicking on the share folder displays an error message. As soon as I log in at the server, the files within the shares become accessible on the Windows box.
2. File transfers between the machines are extremely slow. Watching the system monitor, there's a brief burst of network activity followed by 10-30 seconds of nothing...on a gigabit network, the effective transfer rate is ~120kbs. There's no other network activity going on that would account for this behavior.
View 9 Replies
View Related
Apr 14, 2009
I am having a problem with slow data transfers with both Samba and scp. I have gigabit NICs on all three machines that I am transferring to and from, connected to a gigabit switch. My data transfers under both smb and scp average around 21 MBit/s (I am using nload to monitor transfer speeds). The machines are configured as follows:
1) desktop
AMD Athlon 64 X2 6000+
6 gig Corsair memory
Realtek RTL8168C(P) gigabit NIC (on board)
[code]....
View 2 Replies
View Related
Nov 11, 2010
When clicking on a program to start from the drop down menu (latest example being Quadrapassel) the program starts but then the tab at the bottom of the screen disappears and the program doesn't load.
View 4 Replies
View Related
May 3, 2010
I have a script that writes the following type of lines to a file. The lines include some HTML tags. Here is an example:
<p>Total = 5</h2></b></center>
I want to keep the <p> tag, but remove the ending 3 tags. Sometimes the line can have a 1, 2, or perhaps a 3 digit total, so I could have a Total = 125. Anyway, after the removal of the ending tags the result would be:
<p>Total = 5
There are some lines in the file that end with </h2></b></center> that I want to keep. These lines generally look like this:
<p><b><center><h2>Events Summary for Sun May 02 2010</h2></center></b>
As you can see, I am using sed to change the beginning and end of lines to make a very simple web page. The file actually displays OK in a browser with the extra tags on the end, but it is just sloppy work to have these unneeded tags in the output file.
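One way to do this with sed is to anchor the substitution on the lines you want to change, so the summary lines pass through untouched. A sketch using the example lines from the post:

```shell
# Strip the trailing </h2></b></center> only from lines beginning with
# "<p>Total"; other lines (like the Events Summary heading) are left as-is.
printf '%s\n' \
  '<p>Total = 125</h2></b></center>' \
  '<p><b><center><h2>Events Summary for Sun May 02 2010</h2></center></b>' |
sed '/^<p>Total/s|</h2></b></center>$||'
```

The `/^<p>Total/` address restricts the substitution to Total lines, and the `$` anchor ensures only a trailing run of those three tags is removed, so the digit count in the total doesn't matter.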
View 2 Replies
View Related
Jun 4, 2010
I have a dns-323 Linux device that's running pure-ftpd with SSL/TLS authentication. Pure-ftpd is sitting behind a Linksys router with IP 192.168.1.51. Pure-ftpd is configured for port 8021 and passive port range 55562-55663. The Linksys router is configured to forward port 8021 and the passive port range to 192.168.1.51.
From outside my network I can connect to the ftp server using the WAN address of the router. I'm using filezilla 2.2.32 as my client and I choose FTP w/ explicit TLS (no other option will connect). The client will authenticate successfully with pure-ftpd but once it sets up the passive data connection and tries to do a LIST of the root directory, there's a timeout. I'm assuming this is because the passive data connection is not working. In pure-ftpd, I tried changing the passive address that it reports, to be the WAN address of the router, but it did not make a difference. I included the log from filezilla below.
[Code]...
View 5 Replies
View Related
Jul 28, 2011
I have a Tenda wireless adapter running on 11.04 Natty. The USB ID is 148F:3070.
Using the already-installed rt2800usb driver it can connect to all 13 channels, but it is very flaky, with transfers slowing and stopping regularly. The rt2870sta driver is a lot more stable, but it will only connect to channels 1-11, and I need to use channel 13 due to massive wifi congestion on the other channels...
I've tried iwpriv but it says there are no ioctls for the device.
Is there any way to get the installed driver rt2870sta to scan and connect to channel 13?
I've also tried to install the latest drivers from the Ralink website: rt2870sta says it can see all 14 channels, but fails to scan. I also tried the rt3070 driver, but I cannot insmod it, as there are errors...
View 9 Replies
View Related
May 12, 2010
I am using RHEL 5. I have a very large test file which cannot be opened in vi. The file has some 8000 lines. I need to view the lines from 5680 to 5690. How can I view these particular lines in a large file? What command and options do I need to use?
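sed can print an arbitrary line range without loading the file into an editor. A sketch, demonstrated on a generated stand-in file so the line numbers are verifiable:

```shell
# Print only lines 5680-5690 of a big file. The 'q' after the range makes sed
# stop reading at line 5690, which matters on genuinely large files.
seq 1 8000 > /tmp/bigfile.txt              # stand-in for the real file
sed -n '5680,5690p;5690q' /tmp/bigfile.txt

# awk equivalent: awk 'NR>=5680 && NR<=5690' /tmp/bigfile.txt
```

A combination of head and tail (`head -n 5690 file | tail -n 11`) works too; all three are available on a stock RHEL 5 install.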
View 1 Replies
View Related
May 13, 2011
I thought I would give Ubuntu One a try, even though I have used DropBox for ages. I think it is all set up ok, yet the box is open and it says along the top:
Using 102.6 MB of 2.0 GB (5%)
File sync in progress......
It has said that for 5 hours now... is it actually doing anything? I have closed it a couple of times and reopened it, and it goes right back to the same thing. What is weirder is that there are no send/receive lights flashing on my modem either.
View 1 Replies
View Related
Jul 29, 2011
On Windows, I can drop a file on a batch script file; the dropped file is then accepted as a script parameter and the script is automatically executed. Trying the same in Nautilus, it seems this is not possible. Is there some other way of using this approach in Debian? I hope it's clear what I'm after - I don't want to write Nautilus scripts as a workaround, and I want to avoid opening a terminal, cd'ing to the bash script, and typing the script name and parameter file to execute it. Instead, as already mentioned, I would like to drop the filename expected as a parameter onto the bash script, and have the script execute automatically.
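One possible workaround, hedged since drag-and-drop behavior varies by file manager and version: a .desktop launcher whose Exec line takes `%F` receives dropped files as arguments in many desktop environments. A sketch (the script path and name are hypothetical examples):

```
[Desktop Entry]
Type=Application
Name=Run My Script
Exec=/home/user/myscript.sh %F
Terminal=false
```

Saved as e.g. `~/Desktop/runscript.desktop` and marked executable, dropping a file onto the launcher icon should invoke the script with the dropped path as `$1`, which is the closest equivalent to the Windows batch-file behavior I'm aware of.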
View 3 Replies
View Related
Nov 3, 2010
I have noticed that lately when I drag a file into another folder to replace a file of the same name, nautilus crashes. I need to restart nautilus to fix this. I can copy and paste to replace the file without a problem though.
View 2 Replies
View Related
Jun 1, 2010
I'm trying to set up apache2 to drop a core file when it crashes. I know that you need to set the CoreDumpDirectory directive in /etc/apache2/apache2.conf and run "ulimit -c unlimited" from the command line (and restart Apache after the ulimit command). But on a reboot, even though the output of "ulimit -a" shows unlimited, apache2 will not create a core dump file unless you set ulimit -c unlimited again and restart apache2. There must be a way to configure apache2.conf or something so that ulimit -c unlimited is set prior to apache2 starting, no?
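On Debian/Ubuntu, apache2ctl sources /etc/apache2/envvars before starting the daemon, so a ulimit placed there applies to the Apache process itself and survives reboots. A sketch of the usual fix (verify the file exists on your release):

```
# Append to /etc/apache2/envvars (sourced by apache2ctl before apache2 starts),
# so the limit is set for the daemon rather than for your login shell:
ulimit -c unlimited
```

This pairs with CoreDumpDirectory pointing at a directory writable by the Apache user; a ulimit set in an interactive shell only affects processes launched from that shell, which is why it doesn't persist across reboots.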
View 2 Replies
View Related
Sep 10, 2010
I have seen this 3 times now - it's an updated Lucid with EXT4, trying to copy to a 500G USB drive?
View 3 Replies
View Related