Ubuntu :: Network Shuts Down When Trying To Transfer A Large File?
Jun 6, 2010
I have a problem that I can't seem to fix. When I try to transfer a large file, let's say 700 MB or so, my wireless shuts down and I have to restart my Ubuntu computer. The other computer is Vista. Ubuntu is on a WUSB54G ver. 4, and Vista is going through a WRT54G-TM running DD-WRT Mega. I have tried everything I know with no luck.
When accessing an NFS mount for a large (200MB+) file transfer, the transfer starts rapidly, then becomes slower and slower until it hangs. On several occasions, it has frozen the client machine. Both client and server are set to default to nfs version 3. Slowdown and hang also occur when connecting to FreeBSD NFS mounts.
Presumably (I hope), there is some sort of client-side configuration that needs to be set. What should be changed in the configuration? This worked out of the box in OpenSUSE 11.0.
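One client-side tweak that often helps with NFSv3 transfers that start fast and then stall is pinning the transport to TCP and setting explicit read/write sizes. A minimal sketch of an /etc/fstab entry; the server name, export path, and mount point are placeholders, and the sizes are common starting points rather than tested values:

```shell
# /etc/fstab — hypothetical server and paths; rsize/wsize are starting points.
# proto=tcp avoids UDP fragmentation problems on lossy links;
# hard,intr retries forever but lets the mount be interrupted.
server:/export/data  /mnt/data  nfs  vers=3,proto=tcp,rsize=32768,wsize=32768,hard,intr  0  0
```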
I'm trying to create an Ubuntu Server file server that will handle large file transfers (up to 50 GB) from the LAN with Windows clients. We've been using a Windows server on our LAN, but file transfers will occasionally fail... though that server is used for other services as well.
The files will be up to 50 GB. My thought is to create a VLAN (or a separate physical switch) to ensure maximum bandwidth. The Ubuntu server will be 64-bit with 4 TB of storage in a RAID 5 config.
I am trying to transfer a file from my live Linux machine to a remote Linux machine. It is a mail server, and a single .tar.gz file holds all the data, but during the transfer it stops working. How can I troubleshoot this? Is there a better way to transfer a huge 14 GB file over a network/VPN/WAN link? The link speed is 1 Mbps; on a rerun it copies the rest of the file.
[root@sa1 logs_os_backup]# less remote.log
Wed Mar 10 09:12:01 AST 2010
building file list ... done
bkup_1.tar.gz
deflate on token returned 0 (87164 bytes left)
rsync error: error in rsync protocol data stream (code 12) at token.c(274)
building file list ... done
I'm setting up a htpc system (Zotac IONITX-F based) based upon a minimal install of ubuntu 9.10, with no GUI other than xbmc. It's connected to my router (d-link dir-615) over a wifi connection configured for static IP (ath9k driver), with the following /etc/network/interfaces:
auto lo
iface lo inet loopback

# The primary network interface
#auto eth0
The network is fine, and the Samba share to the media directory works, until I try to upload a large file to it from my desktop system. It uploads a couple of percent at a really nice speed, but then it stalls and the box becomes unpingable (Destination Host Unreachable), even after canceling the transfer, requiring a restart of the network.
Same thing when I scp the file from my desktop system to the htpc, same thing when I ssh into the htpc, and scp the file from there. Occasionally (rarely) the file does pass through, but most of the time the problem repeats itself. Transfer of small text files causes no problems, and the same goes for the fanart downloads done by xbmc. I tried the solution proposed in this thread, and set mtu to 800 in the interfaces file, but the problem persists.
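For reference, the per-interface `mtu` keyword goes inside the interface's own stanza in /etc/network/interfaces. A sketch with illustrative addresses (the post's actual values are not shown, so these are placeholders):

```shell
# /etc/network/interfaces — addresses and MTU value are illustrative only.
auto wlan0
iface wlan0 inet static
    address 192.168.0.10
    netmask 255.255.255.0
    gateway 192.168.0.1
    mtu 1200
```

The change takes effect after the interface is brought down and up again (ifdown/ifup) or after a reboot.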
I am trying to copy a file from a network resource on my Local Area Network, which is about 4.5 GB. I copy this file through the GNOME copying utilities by first going to Places --> Network and then selecting the Windows share on another computer on my network. I open it and start copying the file to my FAT32 drive with Ctrl+C and Ctrl+V. It copies well up to 4 GB and then it hangs.
After trying it almost half a dozen times, I got really annoyed, left it hung, and went to bed. The next morning when I checked, a message box saying "file too large to write" had appeared.
I am very annoyed. I desperately need that file. It's an ISO image and it is not damaged at all; it copies fine to any Windows system. Also, I have sufficient space on the drive to which I am trying to copy the file.
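The hang at exactly 4 GB points at the destination filesystem rather than the file: FAT32 cannot store a file of 4 GiB or larger, regardless of free space. A quick sanity check (the mount point below is just the current directory; substitute the FAT32 drive's mount point):

```shell
# FAT32's maximum file size is 2^32 - 1 bytes, one byte under 4 GiB:
echo $(( 2**32 - 1 ))    # 4294967295

# Confirm the destination filesystem type; "vfat" means FAT32.
# Replace . with the mount point of the destination drive.
df -T .
```

If the drive really is vfat, the fix is to reformat it as NTFS or ext3/ext4, or to split the ISO into sub-4 GiB pieces before copying.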
I've started having problems with large file downloads. A download will start and after a while freeze. The downloads window reports the correct connection speed and gives an estimated time to complete, but it stays frozen. Small downloads, torrents and surfing are not affected. I can do everything else normally even when the download is frozen. I've checked with my ISP and everything with my equipment checks out.
- Intel P4 3.2GHz
- Gigabyte G32M-S2L
- 3GB RAM DDR2
- 3 x 1TB RAID5 (MDADM)
- 1 x 1TB OS Drive
New Linux user here. I'm running 11.3 x64 GNOME (runlevel 3) on my headless home file server. I've mounted my RAID array at /shared and have set up Samba (through YaST) for file sharing. However, I am having an issue: when I transfer multiple small files over the network (i.e. music), it crashes the box. I am unable to reach it through ping or SSH. It is also physically unresponsive, whereby connecting a keyboard or mouse to the system does not work.
Looking through /var/log/messages does not indicate any sort of issue. I've tried turning off unnecessary programs (rTorrent etc) to make sure it isn't overloading the connections however the box still crashes. Although transferring files locally from HDD to HDD does not crash the server.
Ok, the only thing I remember doing is changing out my hard drive because it went bad. I swapped in a new one and reformatted with Windows 7. (I can hear the boos and hisses as I type.) Anyway, now file transfers are exceedingly slow. They are ~10KB/s when they should be ~40MB/s. The odd thing is that it mostly occurs on large files, but not all large files, and sometimes smaller files are affected. It happens when copying files from one hard disc to another on the server, as well as when copying from the server to Win7. Local files are not affected. The problem does not occur on my laptop, which has Slackware Linux on it.
Another thing about the problem is that when I mount a disc image on a virtual drive, access time is very slow and windows explorer is very unresponsive and often locks up completely.
I have a Samba server with Slackware Linux on it and a Windows 7 client. DHCP is configured through dnsmasq (dhcp-host=<mac address>,<computer name>). Every now and again, dnsmasq seems to conk out when I try to access the network with my Win7 machine. I then have to reboot the server, as my only access to it is via SSH and I don't have a video card in it. After the server reboots, everything is fine until accessing the network with my Win7 machine conks it out again.
I've tried different MTU settings, different network cards on the Win7 machine(Dlink and realtek), various regedit hacks, but none of them produce any results.
How do I transfer large files from my laptop to an external hard drive? The problem occurs when I'm sending Blu-ray films (4.4 GB) to the external drive: it gets to 4 GB and then comes up with an error. Is there any way of breaking the file up and then merging it when it reaches the hard drive, or is there a way of sending it as one whole file?
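An error at exactly 4 GB strongly suggests the external drive is FAT32, which caps files at one byte under 4 GiB. Splitting before copying and rejoining afterwards works around it; a sketch with `split` (the filename is an example, and 3900m is just a size safely under the ceiling):

```shell
# Split the film into FAT32-safe chunks (filename and chunk size illustrative).
split -b 3900m film.m2ts film.m2ts.part-

# Copy the film.m2ts.part-* chunks to the external drive, then rejoin them
# on the destination machine; the glob expands in the right order:
cat film.m2ts.part-* > film.m2ts
```

Alternatively, reformatting the external drive as NTFS or ext4 removes the limit and lets the file travel whole.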
I am currently running Windows Xp with service pack 3 and OpenSuse 11.0 (i586) on the same machine on separate hard drives. here are the specs on the machine. (Note: it is only a single core machine. I have no idea why both Suse and Windows say it is dual core as the processor was bought and installed months before dual cores went on the market.)
OS Information
OS: Linux 188.8.131.52-0.5-default i686
Current user: telknor@linux-l3e2
System: openSUSE 11.0 (i586)
KDE: 4.0.4 (KDE 4.0.4 >= 20080505) "release 15.4"
Now on to my problem. About a year ago I had to wipe my Windows drive, and I backed up about 32 GB of stuff from it to my OpenSuse drive. I did this from the Suse OS using Dolphin. In fact, I can see, open, change, delete, rearrange, and move stuff around on the Windows drive all day long from Suse, and, as I said, I can pull stuff over to the Suse drive from the Windows drive while in Suse and save it to the Suse drive.

Now I need the stuff that was moved, but Windows does not see the Suse drive for what it is. It just says 'unknown partition type', gives the size in GB, and otherwise ignores it. From Suse, if I try to move stuff back to the Windows drive, it starts to transfer, then stops and tells me it does not have permission to access the destination drive, folder, or location, depending on how I tried to save it to the Windows drive. Adding new hard drives to the system does not help, and as for the external 1 TB drive we have, Suse can see it but will not do anything with it. Thumb drives, on the other hand, Suse has no problem with: it will open, read, and write to them all day long. Suse will not access the network to dump files to the network file server, and it does not see any other systems on the network, not even the ones running the same OS. I want to do the following:
1. Move the Windows files from the Suse drive to the Windows drive in same machine or to the network file server (also a windows machine)
2. Move all the Suse files to another machine running same Suse OS.
3. Wipe the Suse drive in this Machine so that Windows can use it.
I've fixed almost everything except this problem. My network card worked fine on Windows, but since I switched from Windows, file transfer over the network using my network card is horribly slow: 30 kbps max on a 100 Mbps wired connection. I know my network card is working fine, and I'm trying to copy files between my Ubuntu PC and a Windows PC. I've set up Samba as recommended by these forums, and I can transfer files when using a wireless connection, but I still can't transfer files faster than 30 kbps on a wired connection. And the funny part is, it's fast when copying files from Ubuntu to Windows, but slow the other way. I've tried this with more than 4 different Windows computers. lspci returns this for the network controller:

Silicon Integrated Systems [SiS] 191 Gigabit Ethernet Adapter

Is there any way of updating drivers, or something else I can do to fix this problem?
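A wired link that crawls in only one direction is a classic symptom of a failed speed/duplex auto-negotiation between the NIC and the switch, so that is worth ruling out before hunting for drivers. A quick check (the interface name `eth0` is an example; substitute the SiS NIC's actual name):

```shell
# Interface name is an example; substitute the SiS NIC's actual name.
IFACE=eth0
# Link state and MTU from sysfs:
cat /sys/class/net/"$IFACE"/operstate
cat /sys/class/net/"$IFACE"/mtu
# Negotiated speed and duplex (wired NICs; may need root or ethtool):
cat /sys/class/net/"$IFACE"/speed /sys/class/net/"$IFACE"/duplex 2>/dev/null
ethtool "$IFACE" 2>/dev/null | grep -E 'Speed|Duplex'
```

If this reports 10 Mb/s or half duplex on a link that should be 100 Mb/s full duplex, forcing the setting (or replacing the cable) is the usual next step.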
I would like to transfer my music library and movie collection from my desktop computer running Windows Vista to my laptop running Debian Squeeze. I have the laptop connected via wireless, but it's possible to connect the two either directly with a CAT5e cable or through the router. I'm just wondering what the best way to do this would be.
I have Fedora 12 (with all the latest patches, including the 184.108.40.206-162 kernel) installed on a new Supermicro SYS-5015A-H 1U Server [Intel Atom 330 (1.6GHz) CPU, Intel 945GC NB, Intel ICH7R SB, 2x Realtek RTL8111C-GR Gigabit Ethernet, Onboard GMA950 video]. This all works great until I try to transfer a large file over the network, then the computer hard locks, forcing a power-off reset.
Some info about my setup:
[root@Epsilon ~]# uname -a
Linux Epsilon 220.127.116.11-162.fc12.i686.PAE #1 SMP Fri Dec 4 00:43:59 EST 2009 i686 i686 i386 GNU/Linux
[root@Epsilon ~]# dmesg | grep r8169
r8169 Gigabit Ethernet driver 2.3LK-NAPI loaded
I'm pretty sure this is an issue with the r8169 driver (what I'm seeing is somewhat reminiscent of the bug reported here). The computer will operate fine for days as a (low-volume) web server, and is reasonably stable transferring small files, but as soon as I try to transfer a large file (say, during a backup to a NAS or an NFS share), the computer will hard lock (no keyboard, mouse, etc.) at some point during the transfer. It doesn't seem to matter how the file is transferred (sftp, rsync to an NFS share, etc.).
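A workaround some users of RTL8111/8168-family chips reported in this era was to install Realtek's out-of-tree r8168 driver and blacklist the in-kernel r8169 module so it stops binding the NIC. This is a hedged suggestion, not a confirmed fix for this board; the file path follows the usual modprobe.d convention:

```shell
# /etc/modprobe.d/blacklist-r8169.conf
# Only useful after installing Realtek's out-of-tree r8168 driver;
# prevents the in-kernel r8169 module from claiming the NIC.
blacklist r8169
```

After adding the file, the initramfs usually needs regenerating (e.g. dracut on Fedora) so the blacklist takes effect at boot.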
it has Adobe Illustrator, I would like to get a file or three across to it. How? The only Mac bit that seems to know what a network card is is something called 'airport'. DHCP is all around it. "Network" means 56k modem, which can't connect to anything. If I show it a usb disk, it wants drivers, which apple hardly have up still for os 9.0.4.
I just bought an HP 3085dx laptop with an Intel 5100 AGN wireless card. The problem: copying a big file over wireless to a computer hardwired to the router over gigabit only gives an average 3.5 MB/second transfer rate. If I do the same copy from my wireless-N MacBook Pro to the same computer, I get a transfer rate of about 11 MB/sec. Why the big difference? I noticed the HP always connects to the 2.4 GHz band instead of the 5 GHz bands...
[jerry@bigbox ~]$ iwconfig wlan0
wlan0  IEEE 802.11abgn  ESSID:"<censored>"
       Mode:Managed  Frequency:2.412 GHz  Access Point: 00:24:36:A7:27:A3
       Bit Rate=0 kb/s   Tx-Power=15 dBm
       Retry long limit:7   RTS thr:off   Fragment thr:off
       Power Management:off
       Link Quality=70/70  Signal level=-8 dBm  Noise level=-87 dBm
       Rx invalid nwid:0  Rx invalid crypt:0  Rx invalid frag:0
       Tx excessive retries:0  Invalid misc:0   Missed beacon:0
I am not getting any errors. I don't know why the bit rate is not known. My AirPort Extreme base station typically reports that the 'rate' for the HP is around 250~300 Mbps, and about the same for the MacBook Pro. The HP is about 6 inches away from the base station. Is there any way to get the rascal to go mo' faster? Is there any way to get the rascal to use the 5 GHz band?
Having a bit of an issue with Debian Squeeze and transferring files to the Sony PSP. I hook up the PSP to the USB port and Debian mounts it. I go to drag a 125 MB MP4 to the video folder; the copy window takes about 10 seconds to transfer it. I exit USB mode and there is no video there. I go back into USB mode and look at the video folder on the PSP memory stick, and there is no video; it vanished. Another time, after the copy progress closed, I right-clicked the PSP and unmounted it.
It errored, saying the device was busy and could not unmount. Looking at the light on the PSP, I saw the memory stick was still being written to. I waited for the light to stop flashing, about a minute or so, and then was able to unmount it. I went to PSP video and there was the video, ready to be watched. Debian isn't accurately showing the copy progress: it shows complete when it isn't. I have to watch the light on the PSP to know when it is truly finished.
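What's happening here is write-back caching: the copy dialog closes once the data is in the kernel's page cache, not when it has actually reached the slow memory stick. Running `sync` before unmounting forces the flush and only returns when the writes are done. A sketch (the file and mount point are examples):

```shell
# Copy, then force cached writes out to the device before unmounting.
# File name and mount point are examples.
cp video.mp4 /media/PSP/VIDEO/
sync                      # blocks until dirty pages are written to the device
# umount /media/PSP       # safe once sync has returned
```

A clean `umount` also flushes pending writes, which is why waiting for the busy error to clear worked; `sync` just makes the wait explicit instead of watching the PSP's light.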
I just recently erased my hard drive and installed Ubuntu 9.10 all over again. I updated as expected and everything was going well until my internet stopped working, maybe 40 minutes or so after a successful boot. Now this same thing happens every 40 minutes, and I have to reboot the computer by forcing my laptop to shut down, holding down the power button. This also made me aware of another problem when restarting or logging off my laptop: the laptop hangs at that terminal-styled screen where the underscore blinks, and fails to shut the computer down/log off. I resort to holding down the power button to restart. I am using a wireless connection with WEP key protection.
Atheros network chip Laptop Acer Aspire 5532 It just happened again..... I am typing now in tomboy notes.... Time to restart again...... I also try to open terminal and it fails to function (after the internet stop working), any reason for that? I would also like to note that the signal strength decreases as time passes by.......
I have two servers; one has an empty / and the other has a subdirectory with a large amount of data (4 GB) in many, many files. I need a way to transfer the files en masse from the server with the large number of files to the one that is essentially blank. I don't have space on the used host to simply gzip all the files. I've googled this and see that there may be some combination of tar and/or gzip that will let me do this with some sort of redirection.
I really need an example line of how this can be accomplished. If my explanation seems rather sparse, I can supply more details.
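A tar stream piped over SSH does exactly this: nothing is archived on the source's disk, so no extra space is needed there. A sketch; the hostname and directory paths are examples, not from the post:

```shell
# Stream the tree over SSH; no intermediate archive is written locally.
# -z compresses on the wire (trades CPU for bandwidth); -C sets the directory
# to archive from / extract into. Host and paths are examples.
tar -czf - -C /srv/data . | ssh user@newserver 'tar -xzf - -C /srv/data'
```

The same pattern works in reverse (run the `ssh ... tar -czf -` side remotely and extract locally) if it is easier to initiate from the empty server.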
I am using RHEL 5. I have a very large text file which cannot be opened in vi. The file has some 8000 lines, and I need to view the lines from 5680 to 5690. How can I view these particular lines in a large file? What command and options do I need to use?
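sed can print an absolute line range without loading the whole file into an editor; the range below matches the lines asked about (the filename is a placeholder):

```shell
# Print lines 5680-5690 of the file; the "5690q" makes sed quit after the
# range instead of scanning the rest of a large file.
sed -n '5680,5690p; 5690q' bigfile.txt

# Equivalent with head/tail (5690 - 5680 + 1 = 11 lines):
head -n 5690 bigfile.txt | tail -n 11
```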
I've got a large .tar.gz file that I am trying to extract, I have had a look around at the problem and seems other people have had it, but I've tried their solutions and they haven't worked.The command I am using is:
How efficient and effective are Snort, Argus, OSSEC, etc. for an organization with a 3500-PC network, connected through 700+ Cisco devices (Layer 2 and Layer 3), and scattered across 130 different sites (geographically)? What should the combination of products be, and what should the architecture be, for efficient forensics activity?
I have been having a recurring problem backing up my filesystem with tar, using bzip2 compression. Once the file reached a size of 4 GB, an error message appeared saying that the file was too large (I closed the terminal, so I do not have the exact message; is there a way to retrieve it?). I was under the impression that bzip2 can support pretty much any size of file. It's rather strange: I have backed up files of about 4.5 GB before without trouble.
At the same time, I have had this problem before, and it's definitely not a memory problem: I am backing up onto a 100G external hard drive.
That reminds me, in fact (I hadn't thought of this), that one time I tried to move an archived backup of about 4.5 GB to an external drive (it may have been the same one), and it said that the file was too large. Could it be that there is a maximum size of file I can transfer to the external drive in one go? Before I forget, I have Ubuntu Karmic, and my bzip2 version is 1.0.5 (and tar 1.22, though maybe this is superfluous information).
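There is such a maximum if the external drive is FAT32: nothing of 4 GiB or larger can be written, regardless of free space, and bzip2 itself is not the limit. If reformatting the drive isn't an option, piping tar through split keeps every piece under the ceiling. A sketch; the source directory, piece size, and names are illustrative:

```shell
# Write the backup as FAT32-safe pieces (3900m stays under the 4 GiB cap).
tar -cjf - /home/me | split -b 3900m - backup.tar.bz2.part-

# To restore, concatenate the pieces back into one bzip2 stream:
cat backup.tar.bz2.part-* | tar -xjf -
```

`df -T` on the drive's mount point will confirm whether it really is vfat before going to this trouble.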
I am attempting to burn the ISO for Lucid Lynx final onto a 700MB CD. The ISO file is 699MB, but Windows reports that the size on disk is 733MB and thus CD Burner XP refuses to burn the file, stating that it's too large for a CD.
Why this discrepancy in file sizes? I've noticed this with other files as well; suddenly it's a bit of a problem, as you can see!
I recently built a home media server and decided on Ubuntu 10.04. Everything is running well except when I try to transfer my media collection from other PCs where it's backed up to the new machine. Here's my build and various situations:
- Intel D945GSEJT w/ Atom N270 CPU
- 2GB DDR2 SO-DIMM (this board uses a laptop chipset)
- External 60W AC adapter in lieu of internal PSU
- 133x CompactFlash -> IDE adapter for OS installation
- 2x Samsung EcoGreen 5400rpm 1.5TB HDDs formatted as Ext4
Situation 1: Transferring 200+GB of files from an old P4-based system over gigabit LAN. Files transferred at 20MBps (megabytes, so there's no confusion). Took all night but the files got there with no problem. I thought the speed was a little slow, but didn't know what to expect from this new, low-power machine.
Situation 2: Transferring ~500GB of videos from a modern gaming rig (i7, 6GB of RAM, running Windows7, etc etc). These files transfer at 70MBps. I was quite impressed with the speed, but after about 30-45 minutes I came back to find that Ubuntu had hung completely.
I try again. Same thing. Ubuntu hangs after a few minutes of transferring at this speed. It seems completely random. I've taken to transferring a few folders at a time (10 GB or so), and so far it has hung once and been fine the other three times. Now, I have my network MTU set from automatic to 9000. Could this cause Ubuntu to hang like this? When I say hang, I mean it freezes completely, requiring a reboot: the cursor stops blinking in a text field, the mouse is no longer responsive, etc.
I have 2 ASUS boxes (one with 8 GB of RAM, one with 4 GB). When both MTUs are set at 7200, using scp to copy a 56 MB file takes 2:06. If I reduce either MTU to 1500, the copy takes 2 seconds. I'm wondering if this is some kind of kernel bug, or driver bug, or what. For the moment, I've lowered the MTU to 1500 to get the performance out of the machines, but I find it interesting that what should make things faster is actually slowing them down. Where should I post this to get it looked at? Is anyone else seeing it? I see a similar performance issue with smbclient too.
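Jumbo frames only help if every hop on the path (both NICs, their drivers, and any switch in between) actually forwards them; if one hop silently drops or fragments oversized frames, transfers degrade exactly like this. A quick way to test the path is a don't-fragment ping sized to the MTU: the ICMP payload is the MTU minus 20 bytes of IP header and 8 bytes of ICMP header. A sketch (the target host is an example):

```shell
# Max ICMP payload for a given MTU: MTU - 20 (IP header) - 8 (ICMP header).
mtu=7200
payload=$(( mtu - 28 ))
echo "$payload"    # 7172

# Don't-fragment ping at that size; if jumbo frames break anywhere on the
# path this fails, while a 1472-byte payload (MTU 1500) succeeds.
# Host is an example:
# ping -c 3 -M do -s "$payload" 192.168.1.20
```

If the large ping fails between the two boxes, the slowdown is the path rejecting jumbo frames rather than a kernel bug, and 1500 (or the largest size that passes) is the right setting.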