Ubuntu Networking :: Transfer A Large File, Let's Say 700MB Or So, And Wireless Shuts Down?
Jun 6, 2010
I have a problem that I can't seem to fix. When I try to transfer a large file, let's say 700MB or so, my wireless shuts down and I have to restart my Ubuntu computer. The other computer is Vista. Ubuntu is on a WUSB54G ver4 and Vista is going through a WRT54G-TM running DD-WRT Mega. I have tried everything I know with no luck.
I am attempting to burn the ISO for Lucid Lynx final onto a 700MB CD. The ISO file is 699MB, but Windows reports that the size on disk is 733MB and thus CD Burner XP refuses to burn the file, stating that it's too large for a CD.
Why this discrepancy in file sizes? I've noticed it with other files as well; suddenly it's a bit of a problem, as you can see!
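For what it's worth, a likely source of the 699MB-vs-733MB gap is binary versus decimal megabytes rather than any change to the file: 699 MiB (1024-based) is the same number of bytes as roughly 733 MB (1000-based), and an 80-minute "700MB" CD actually holds about 737 million bytes. A quick sanity check of the arithmetic:

```shell
# 699 binary megabytes (MiB) expressed in bytes...
echo $((699 * 1024 * 1024))     # 732954624

# ...is almost exactly "733 MB" in decimal units:
echo $((733 * 1000 * 1000))     # 733000000

# An 80-minute "700MB" CD holds 360000 sectors of 2048 bytes,
# so the image should fit:
echo $((360000 * 2048))         # 737280000
```

So the ISO likely fits; the burner is comparing a 1000-based figure against a 1024-based capacity (or vice versa).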
I'm setting up an HTPC system (Zotac IONITX-F based) on a minimal install of Ubuntu 9.10, with no GUI other than XBMC. It's connected to my router (D-Link DIR-615) over a wifi connection configured for a static IP (ath9k driver), with the following /etc/network/interfaces:
auto lo
iface lo inet loopback

# The primary network interface
#auto eth0
The network is fine and the Samba share to the media directory works, until I try to upload a large file to it from my desktop system. It downloads a couple of percent at a really nice speed, but then it stalls and the box becomes unpingable (Destination Host Unreachable), even after canceling the transfer, requiring a restart of the network.
Same thing when I scp the file from my desktop system to the htpc, same thing when I ssh into the htpc, and scp the file from there. Occasionally (rarely) the file does pass through, but most of the time the problem repeats itself. Transfer of small text files causes no problems, and the same goes for the fanart downloads done by xbmc. I tried the solution proposed in this thread, and set mtu to 800 in the interfaces file, but the problem persists.
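For reference, a sketch of how an MTU override might look in /etc/network/interfaces; the interface name and all addresses below are placeholders, not taken from the actual config:

```text
# Hypothetical static wlan0 stanza with a reduced MTU;
# all addresses are placeholders.
auto wlan0
iface wlan0 inet static
    address 192.168.0.10
    netmask 255.255.255.0
    gateway 192.168.0.1
    mtu 800
```

The same value can also be applied immediately, without restarting networking, with `ip link set dev wlan0 mtu 800`, which makes it easier to try several MTU values before committing one to the file.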
I'm trying to create an Ubuntu Server file server that will handle large file transfers (up to 50GB) from the LAN with Windows clients. We've been using a Windows server on our LAN, but the file transfers will occasionally fail... though that server is used for other services as well.
The files will be up to 50GB. My thought is to create a VLAN (or a separate physical switch) to ensure maximum bandwidth. The Ubuntu server will be 64-bit with 4TB of storage in a RAID 5 config.
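As a starting point on the Samba side, a couple of smb.conf settings that are often tried for large sequential transfers; the option names are real Samba options, but the values are only starting points to benchmark, not guarantees:

```text
[global]
    # serve file data via the kernel sendfile() path
    use sendfile = yes
    # disable Nagle batching for bulk streams
    socket options = TCP_NODELAY
```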
I just bought an HP 3085dx laptop with an Intel 5100 AGN wireless card. The problem: copying a big file over wireless to a computer hardwired to the router (gigabit) only gives an average 3.5MB/second transfer rate. If I do the same copy from my wireless-n MacBook Pro to the same computer, I get a transfer rate of about 11MB/sec. Why the big difference? I noticed the HP always connects to the 2.4GHz band instead of the 5GHz bands...
[jerry@bigbox ~]$ iwconfig wlan0
wlan0     IEEE 802.11abgn  ESSID:"<censored>"
          Mode:Managed  Frequency:2.412 GHz  Access Point: 00:24:36:A7:27:A3
          Bit Rate=0 kb/s   Tx-Power=15 dBm
          Retry long limit:7   RTS thr:off   Fragment thr:off
          Power Management:off
          Link Quality=70/70  Signal level=-8 dBm  Noise level=-87 dBm
          Rx invalid nwid:0  Rx invalid crypt:0  Rx invalid frag:0
          Tx excessive retries:0  Invalid misc:0   Missed beacon:0
I am not getting any errors. I don't know why the bit rate is not known. My AirPort Extreme base station typically reports that the 'rate' for the HP is 250~300Mbps, and about the same for the MacBook Pro. The HP is about 6 inches away from the base station. Is there any way to get the rascal to go faster? Is there any way to get the rascal to use the 5GHz band?
When accessing an NFS mount for a large (200MB+) file transfer, the transfer starts rapidly, then becomes slower and slower until it hangs. On several occasions, it has frozen the client machine. Both client and server are set to default to nfs version 3. Slowdown and hang also occur when connecting to FreeBSD NFS mounts.
Presumably (I hope), there is some sort of configuration for the client that needs to be set. What should be changed in the configuration? This worked out of the box in OpenSUSE 11.0.
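One common first step on the client side is to pin the NFS version and transfer sizes in the mount options; `vers`, `rsize`, and `wsize` are standard NFS mount options, and the values below are just frequently used examples, not a known fix for this hang:

```text
# /etc/fstab entry -- server name and paths are placeholders
nfsserver:/export/media  /mnt/media  nfs  vers=3,rsize=32768,wsize=32768,hard  0  0
```

The same options can be tested without editing fstab: `mount -o vers=3,rsize=32768,wsize=32768 nfsserver:/export/media /mnt/media`.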
I have Fedora 12 (with all the latest patches, including the 2.6.31.6-162 kernel) installed on a new Supermicro SYS-5015A-H 1U server [Intel Atom 330 (1.6GHz) CPU, Intel 945GC NB, Intel ICH7R SB, 2x Realtek RTL8111C-GR Gigabit Ethernet, onboard GMA950 video]. This all works great until I try to transfer a large file over the network; then the computer hard locks, forcing a power-off reset.
Some info about my setup:
[root@Epsilon ~]# uname -a
Linux Epsilon 2.6.31.6-162.fc12.i686.PAE #1 SMP Fri Dec 4 00:43:59 EST 2009 i686 i686 i386 GNU/Linux
[root@Epsilon ~]# dmesg | grep r8169
r8169 Gigabit Ethernet driver 2.3LK-NAPI loaded
I'm pretty sure this is an issue with the r8169 driver (what I'm seeing is somewhat reminiscent of the bug reported here). The computer will operate fine for days as a (low-volume) web server, and is reasonably stable transferring small files, but as soon as I try to transfer a large file (say during a backup to a NAS or an NFS share), the computer will hard lock (no keyboard, mouse, etc.) at some point into the transfer. It doesn't seem to matter how the file is transferred (sftp, rsync to an NFS share, etc.).
I have Windows 7 dual-booted with Ubuntu. I'm having issues with Ubuntu, however, and I am very new to using Linux (sorry). When I installed Ubuntu and ran it on a separate partition, everything seemed to work fine until I tried connecting to the wireless internet at my university. This is what's driving me crazy. When I first turn Ubuntu on, it connects normally to the internet, and I can usually get onto Google for 30 seconds. After 30 seconds, the wireless loses its connection and the internet shuts off, even though it can still detect the wifi in the area. I try to reconnect but it consistently fails... I also get some sort of .local avahi error message (I don't remember what it said exactly) for a brief moment, and then it fades away. Can someone tell me what to do? By the way, I have a Realtek RTL8191 WLAN NIC driver installed on Windows, so I'm assuming that's the driver installed on Linux too. Do I need to install a different version like Kubuntu or Xubuntu?
I have been getting dropped connections from my wireless adapter. This has been going on for quite some time, but only now am I able to see a pattern. The behavior seems to be the following: on startup it connects to the network and works fine until I suspend my machine. On wake-up it works again, but after a seemingly random amount of time the connection is lost and I can no longer find any networks. Attached is dmesg output after startup and connecting to a wireless network (dmesgAfterConecting.txt) and the same after the connection has been lost (dmesgAfterConnectLost.txt). I get this error which seems important:
Is it possible to split a NIC in two, let's say? I've seen how in VirtualBox you set an option (I don't remember which one) so that the guest OS uses the same NIC (virtualized?) and gets an IP assigned from the router. For example, the host connects to the router using wlan0 and receives IP 192.168.1.2; then the guest uses the same NIC wlan0 (I think, correct me if I'm wrong, that's why I ask) and gets 192.168.1.3.
My question is: is it possible to split, let's say, wlan0 in two in the same OS to get different IP addresses? And if it is, can you connect to two different SSIDs with the same NIC using wpa_supplicant?
I downloaded a file that was 40MB compressed and almost 700MB when fully extracted. It was inside a .rar file that in turn was inside another .rar file. How can this be done in Ubuntu? Can this also be done with .zip and .7z files?
I am trying to transfer a file from my live Linux machine to a remote Linux machine. It is a mail server, and a single .tar.gz file includes all the data, but during the transfer it stops working. How can I troubleshoot the matter? Is there any better way than this to transfer a huge 14GB file over the network/VPN/WAN? The link speed is 1Mbps, and it copies part of the file before stopping.
[root@sa1 logs_os_backup]# less remote.log
Wed Mar 10 09:12:01 AST 2010
building file list ... done
bkup_1.tar.gz
deflate on token returned 0 (87164 bytes left)
rsync error: error in rsync protocol data stream (code 12) at token.c(274)
building file list ... done
How can I transfer large files from my laptop to an external hard drive? The problem occurs when I'm sending Blu-ray films (4.4GB) to the external drive: the copy gets to 4GB and then comes up with an error. Is there any way of breaking the file up and then merging it when it reaches the hard drive, or is there a way of sending it as one whole file?
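An error at almost exactly 4GB usually points at the external drive being formatted FAT32, whose per-file limit is 4GiB minus one byte (an assumption about this particular drive). The standard break-up-and-merge workaround uses `split` and `cat`; the filename is a placeholder:

```shell
# Split a big file into 2GB pieces that fit under FAT32's
# 4GiB per-file limit (the filename is a placeholder).
split -b 2G film.mkv film.mkv.part_

# ...copy the film.mkv.part_* pieces onto the external drive...

# Re-join them later; cat preserves order because split names
# the pieces alphabetically (part_aa, part_ab, ...).
cat film.mkv.part_* > film.mkv
```

Alternatively, reformatting the external drive with a filesystem without the 4GiB limit (e.g. NTFS or ext4) lets the film go across as one whole file.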
I would like to transfer my music library and movie collection from my desktop computer running Windows Vista to my laptop running Debian Squeeze. I have the laptop connected via wireless, but it's possible to connect the two either directly with a CAT5e cable or through the router. I'm just wondering what the best way to do this would be.
During NFS transfers my laptop crashes my entire wireless network. The laptop is a Samsung R20, Ubuntu 32-bit 10.04 with an Atheros AR5001 wifi card and the ath5k driver. What works: ->NFS is working. I have 3 other machines on wired connections that use NFS happily. And the NFS shares work fine if I plug the laptop into a wired connection. ->Wifi works. I can browse the internet, and a large download (700MB) with BitTorrent takes 8 minutes and doesn't down the wireless.
What doesn't work: ->NFS and wireless together. Any NFS transfer over wireless eventually downs the wifi. For example, streaming one 3MB MP3 in Rhythmbox crashes wireless after playing for about 30 seconds. When the wireless fails it breaks for ALL computers, and the only way to re-establish it is by restarting the router. Wired connections remain OK even when wireless is down (both from the same router). ->I have a 64-bit desktop with wifi that also suffers the same problem, but that doesn't matter so much since it also has a wired connection.
Every time I download torrents in Ubuntu 11.04 on my desktop, using Transmission or Deluge clients, the wireless transfer speed abruptly drops off, becomes very choppy, and in general doesn't remain stable. It's not only the torrent download that slows down, but it's everything including web surfing, even though I'm not maxing out my connection speed. It's not that I'm getting disconnected from my router, it's just that the wireless transfer speed drops off, becomes intermittent, and never gets back up to full speed.
At first I thought that my ISP was throttling my connection, but this issue doesn't happen with a direct cable connection to my router, nor did it happen when I was running Ubuntu 10.04 before. It also doesn't happen when I'm downloading the exact same torrents over wireless using my netbook running Ubuntu Netbook Remix. This issue ONLY seems to happen when using a bittorrent client while on wireless on Ubuntu 11.04 on my desktop. I can max out my internet connection speed by streaming videos, downloading from FTP, etc. using wireless just fine, and the transfer speed remains stable. This also seems to happen with a couple of different wireless USB adapters (rt73usb or rtl8187 drivers).
I tried using ssh between my netbook and desktop, but it was going to take around 30 hours to transfer 39GB over the home network. Also, SSH is very sketchy and often drops connections. I've been messing with it all day and I'm quite frustrated. What I'm looking to do is use my netbook as more of a primary computer and the desktop as a storage computer. Not quite a server, because I'd like to still keep a GUI on it. I'd like to be able to keep my music and movies on the desktop and stream them to the netbook (SSH sucks for this, always drops connections). I've already set up the web client for the Transmission BitTorrent client so I can torrent on a machine that's almost always on and connected.
Is there a better setup for all of this? I like the netbook because of the portability; I like the desktop because it's always connected (for torrents) and it has a larger storage capacity. It would be mainly used around the house. I would like to back up a file or two while abroad, but I'm not looking to stream music while I'm across town or anything.
I need to transfer 330G of data from a hard drive in my workstation to my NAS device. The entire network is gigabit and run with new HP ProCurve switches. All machines have static IP addresses. The NAS is a Buffalo Terastation PRO which has the latest firmware, is set to jumbo frames, and has just been upgraded with 4 brand new 500G drives, giving us a 1.4TB RAID 5 setup. My workstation is a dual quad-core Xeon box running on an Intel S5000XVN board with 8G of RAM. My OS is Ubuntu 10.04 x64 running on a pair of Intel X25 SSDs in a RAID mirror. The data drive is a 500G SATA drive connected to my onboard controller. The file system on the SATA drive is XFS.

This problem was ongoing before I got my new workstation, before we had the GB switches, and before the NAS got new drives. When I transfer a small file or folder (less than 500M) it reaches speeds of 10-11 MB/sec. When I transfer a file or folder larger than that, the speed slows to a crawl (less than 2MB/sec). It has always been this way with this NAS. Changing to jumbo frames speeds up the small transfers but makes little difference in the big ones. I verified with HP that the switches are jumbo frame capable.
I managed to use ssh over the LAN for file transfer, but now I want to connect over ssh for file transfer from another city, not the LAN. What IP do I type in Host IP? I mean, I can tell them to start the ssh server, but then I need to know the external IP.
I have Ubuntu 10.04 on my netbook and all my files on my XP desktop. How can I connect the two so that I can transfer files from the XP machine to the Ubuntu netbook? Both have WiFi and ethernet. I am not fluent in Windows networking, but I could manage to join two XP machines using an ethernet cable, naming the two with the same workgroup and setting a folder as shared!
I am trying to set up a computer so that I can remotely torrent files and then transfer the files to my PC while I am at college. I set up my computer so that it has a dynamic dns as well as hosting via DynDns. I have recently hit a wall when trying to find a way to transfer my files from the computer at home to my Windows 7 computer at school.
Having a bit of an issue with Debian Squeeze and transferring files to the Sony PSP. Hook up the PSP to a USB port and Debian mounts it. I go to drag a 125MB MP4 to the video folder. The copy window takes about 10 seconds to transfer it. Exit USB mode and there is no video there. Go back into USB mode and look at the video folder on the PSP memory stick and there is no video. It vanished. Another time, after the copy progress closed, I right-clicked the PSP and unmounted it.
It errored saying the device was busy and could not unmount. Looking at the light on the PSP I see the memory stick is still being written to. I wait for the light to stop flashing, about a minute or so. Then I am able to unmount it. Go to PSP video and there's the video, ready to be watched. Debian isn't accurately showing the copy progress. It's showing complete when it isn't. I have to watch the light on the PSP to know when it is truly finished.
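What the post describes is normal write-back caching: the copy dialog finishes when the data reaches RAM, not the memory stick. Rather than watching the PSP's light, `sync` (or a proper unmount, which blocks until the flush finishes) gives a definitive signal; the paths below are placeholders:

```shell
# Copy the file, then force all cached writes out to the device;
# sync returns only once the data has actually been handed to
# the hardware (paths are placeholders).
cp video.mp4 /media/psp/VIDEO/
sync

# umount also flushes and only returns when writing is done, so
# a successful unmount means the stick is safe to unplug:
umount /media/psp
```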
I have a LAN. I want to transfer a file from one PC (connected to the LAN) to another one. I know that the best way to do this is ssh, because it's very safe, but I know Samba too. Are there other ways to transfer a file on a LAN?
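Besides ssh (scp/sftp) and Samba, common LAN options include NFS, FTP/HTTP, and the tar-over-netcat pattern for fast one-off copies on a trusted LAN (no encryption, unlike ssh). The sketch below demonstrates the tar pipe between two local directories; the networked variants are in the comments, with host and port as placeholders:

```shell
# The tar-pipe pattern: pack on one end, unpack on the other.
# Demonstrated here between two local directories.
mkdir -p destdir
tar -C srcdir -cf - . | tar -C destdir -xf -

# Networked variants (host/port are placeholders):
#   receiver:  nc -l -p 9000 | tar -xf -
#   sender:    tar -cf - . | nc receiver-host 9000
# or, authenticated and encrypted over ssh:
#   tar -cf - . | ssh user@host 'tar -C /dest -xf -'
```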
I am unable to transfer files in pidgin. I am working under the assumption that I need to create an exception in iptables in order to rectify this problem. I have done a bunch of google searches trying to figure out how to do this specifically to allow file transfers using pidgin.
I am unfamiliar with iptables (I know, I know, read the man page) and I'm not sure which port Pidgin uses to transfer files. Can anyone help me figure out how to add an exception so I can transfer files?
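One way to avoid guessing the port: Pidgin's Preferences -> Network tab has an option to manually specify the range of ports it listens on for incoming transfers (wording may vary by version). Once a range is pinned there, a matching rule can be added; the 6891-6900 range below is only an example choice, not Pidgin's default:

```text
# Fragment for an iptables-restore rules file (e.g.
# /etc/sysconfig/iptables on Fedora); allows inbound TCP on the
# example range 6891-6900 -- match whatever range you set in
# Pidgin's preferences.
-A INPUT -p tcp --dport 6891:6900 -j ACCEPT
```

The equivalent one-off command, run as root, is `iptables -I INPUT -p tcp --dport 6891:6900 -j ACCEPT`.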
I have connected two computers to each other, both with FC installed. Now, when I try to transfer a file from one computer to another using the scp command, sometimes the file transfers very slowly and sometimes very fast. I want to know why it sometimes transfers slowly. By slow I mean much slower than a file downloaded from a DSL.
We have a server and we have installed OpenSUSE 10.3 on it. We created a Samba server also, and made a shared folder that we access over the network from other computers that run XP.
The problem is that if we try to copy from the server it is very slow, only 100-300KB/s. The strange thing is that if I copy one file it's slow, but if I start to copy another one, the speed goes up to 10-15MB/s. Every time I want to copy something or install from that server I need to start another copy. If I copy from a computer to that server the speed is normal; only copying from the server is slow.