When I try to transfer files between flash drives, the copy starts normally and then, after a few seconds, begins to lag. The transfer rate drops by about 0.2 MB/s each second until it reaches 5 MB/s, at which point it hangs completely. This happens with every single flash drive I plug in, which is annoying because I keep all my data on an external hard drive connected over USB: when I try to transfer files from that hard drive to a flash drive, it lags and hangs. The same thing happens when I copy from the internal hard drive to a flash drive. Copying from a flash drive to the internal hard drive works fine, though, and everything is fine when the flash drive or the external hard drive is connected alone (no other flash drives or devices plugged in).
I am using Ubuntu 10.04. I also tried transferring files on my netbook running Ubuntu, because I thought it might be a USB port issue, but the same thing happened on the netbook too...
I am running an old P4 system with the latest Ubuntu. It had been working wonderfully until today. All of a sudden it starts to hang when copying, or even just accessing, files from one specific hard drive. I have read that this is an error with ext4. Is there any way to fix this?
It would be extremely difficult for me to transfer all of these files over if I don't know whether or not it will hang.
I am attempting to migrate from Mandriva to Ubuntu and tried to install the 11.04 and 10.04 CD distros on a 7-year-old computer (Elite Group L7VMM3 motherboard, 1.4 GHz AMD Duron, 1.2 GB RAM, 40 GB IDE HDD).
The 11.04 installation CD makes it through the HD partitioning (erase everything), computer name, time zone, and keyboard detection, but hangs after starting the "Copying files" stage of the install. The progress bar stops just below the text "Copying files...".
The 10.04 installation CD makes it through "Installing the base system" but stalls at "Retrieving console-terminus" with the progress bar stuck at 6%. Although the computer is old, Vista installs on it successfully.
I built a centralized storage system a year ago (Aug. 2010), and for the first 10 months it performed flawlessly. About two months ago (July 2011) I began having problems writing to it. Copies (cp -rpv) fail whether the disks are NFS- or Samba-mounted to my centralized storage.
Attempting to copy either frame sequences (.dpx, .tiff, .targa) or .mov files now locks up my system and knocks my storage offline.
The error that keeps echoing back when this happens is:
I have not made any changes or updates to my storage configuration since building it in August 2010. I have reached out to Areca, and they found nothing wrong with the RAID controller or my RAID settings.
I tried to troubleshoot all the basics (swapped out cables, replaced the WD storage disks, changed switches), but the problem persists.
If anyone else has experienced this issue please let me know what you did to fix it. I can also post any error logs or message logs if that would be of assistance.
I currently have a problem running rsync on 64-bit Debian Jessie (the problem also occurred with 64-bit Debian Wheezy). I am trying to use rsync to archive my home directory (which is on a hard disk) to a USB memory stick. The home directory is about 18 GB in size and the memory stick holds 32 GB.
Unfortunately, rsync hangs after copying a certain number of files and the process eventually has to be killed. Rerunning rsync hangs again at about the same point as before. This has now happened several times; each time the hang occurs at roughly the same point. Running strace after the hang shows that rsync appears to be processing a PDF file at the time, although not always the same PDF file. I originally had the rsync hang problem on a PC running 64-bit Wheezy that used a USB 2.0 port.
I am now running rsync on another PC, which runs 64-bit Jessie and uses a USB 3.0 port. I have also tried three different USB sticks, two from one manufacturer and the third from another manufacturer. All give similar rsync hangs.
I am dual booting XP and Ubuntu 10.04, but in the future I will be getting a new machine on which I will only run Ubuntu, so I won't have access to iTunes. Because I have an iPod Touch, I have been trying to find workarounds for syncing everything that iTunes took care of in the past. One problem is managing movies. I have looked through various media players/iPod management tools (Amarok, Rhythmbox, gtkpod); I am using Rhythmbox to sync my music and am attempting to use gtkpod to sync my movies.
gtkpod is able to sync songs (tested with a few-minute test clip) and short .mp4 files (15 MB, I know for sure from testing). I am unable, however, to get it to sync a movie (~700 MB). I can drag it onto my iPod in gtkpod, but when I try to save the changes and write the files, it hangs at "Copying Tracks" at 0%. It eventually crashed during the couple of times I tried to wait it out. So, this being my situation, my question is: is there a size limit on the .mp4 files I can sync to my iPod Touch via gtkpod? Are there any other tools I could use to sync videos to my iPod?
I once transferred web files from Red Hat 9 to the /var/www/html/ directory on CentOS 5.5. LAMP is installed and working fine: httpd/Apache is running and mysqld is installed without a root password. Now I want to transfer everything again, from CentOS 5 to Fedora 13, to secure the database. But when I browse the web server to check the web files, I can only see these:
Problem: CentOS 5.3 hangs while data is remotely copied onto the server.
I have CentOS 5.3 x86 installed on a quad-core Xeon with 8 GB of DDR2 RAM and RAID 1 configured on SATA drives. This is an old installation and has many things running on it; therefore, I can't just reinstall it.
I have set up a Samba PDC + LDAP on it. The system hangs when I copy data onto this server from LAN computers.
My troubleshooting steps:
1. I have checked for network issues; my network works fine when I transfer data between other LAN computers.
2. I have even tried configuring a separate LAN...
3. I thought there might be some problem with my Samba configuration, so I also tried transferring the data over SFTP.
I've resolved to move to Ubuntu, but I haven't done so quite yet. One question that sprang to mind was transferring files from the Windows OS to the Linux OS on the same machine, if that's possible. I'm thinking of something like a dual-partition setup, though I suppose copy/paste wouldn't work.
Another possible solution was using a dual partition and then Windows SCP, but everything I've read on the topic indicates that SCP works by finding the (different) host address of the other computer; thus it would see my machine as a single host, even if I'm running Ubuntu on it as well.
In short: how do I transfer these friggin' things?
Part of the reason for my query is that I am quite strapped financially and am unable to backup my files on an internet service or on discs.
I want to convert my wife's computer from MS XP 2003 to Ubuntu 10.10. I have saved "My Documents" to an external hard drive. How do I go about transferring these files to Ubuntu? Do I just pick individual files and use "Save As"? Can I transfer them en masse?
I am looking to set up a home network server on an old PC. I want to be able to access files on the network hard drive in the PC, stream video or music from the "server", and use the computer as a regular desktop with a TV as a monitor. I tried setting up an FTP server, which worked for transferring files, but I failed to get a good connection for streaming or fast transfers.
Transfers were less than 1 MB/s. I tried setting up a shared folder but ran into several permission errors between the host PC and the laptop that connected to the shared folder. Which way should I go with this to get what I want out of it? I know it can be done; I'm just not sure how to do it.
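For what it's worth, the kind of Samba share I was attempting looks roughly like this; the share name [media], the path, and the account name are placeholders, and forcing a single user is just one approach I've seen suggested for sidestepping host/laptop permission mismatches:

```ini
; Hypothetical smb.conf stanza; [media], /srv/media and mediauser
; are placeholder names, not my real configuration.
[media]
   path = /srv/media
   read only = no
   guest ok = yes
   ; map all access through one local account so the host PC and
   ; the laptop don't fight over file ownership
   force user = mediauser
```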
The computer and phone are paired, and the transfer starts, but after 4096 bytes it says: "Transfer finished with an error: Error sending object." I'm running 64-bit Lucid and Maemo 5 on the phone. Here are some of the things I've tried, following the docs, and the errors I get:
Code: ~$ hcitool info 3D:F7:2A:60:7A:40
Requesting information ...
Every time I attempt to transfer a large file (4 GB) via any protocol, my server restarts. On the rare occasion that it doesn't restart, it spits out a few error messages saying "local_softirq_pending 08" and then promptly freezes. Small files transfer fine.
Ubuntu Server 10.10. Four hard drives in a RAID 5 configuration. CPU/HD temperatures are within normal range.
I've got a quad-core PC with 2 GB of RAM running Ubuntu 11.04, with Samba set up on a shared folder; the plan is to share files and media between myself and my housemates. However, the machine is not attached to the router by Ethernet: I connect to the router (in another room) with a D-Link wireless dongle.
I can access the share without real issue from my Windows laptop and other computers in the house share I'm part of, and I can happily transfer a text file here and there, but it seems anything over 20 MB will not go. All the files I am trying to transfer are over 1 GB, and it's driving me mad.
After some decent transfer rates of around 1 MB/s, it just freezes up. The screen attached to the actual server goes blank and the host becomes unreachable by ping. I'll be happy to provide any command output or anything else needed to get this working; any assistance would be welcome!
I have two different Linux operating systems (on two separate partitions) on my laptop: one is openSUSE and the second is PCLinuxOS, and I frequently have to update files and folders from one to the other (the consequences of trying to keep both current!). When I am using openSUSE and want to access folders/files on the other OS, openSUSE asks for SU authorisation to access the other /home. Until recently I could open ALL of the folders under the other /home, but now I can open some folders and not others. The Dolphin screen shows a warning at the bottom: "Could not enter folder /media......". I can then only access the folder by going into the menu and using 'File manager - Super User' mode... AGAIN.
I suspect this is necessary because I previously had to open these specific folders in root mode to be able to paste the moved files. I would like to avoid this messy double step; how can I change or avoid it? My system is 32-bit openSUSE 11.4 with KDE 4.6.0, and the second OS also runs KDE 4.6.
This problem is not exclusive to Ubuntu; I've experienced it in Windows and OS X as well. It seems that almost every time I transfer a large number of files (e.g. my music collection) between my desktop computer and laptop via my external hard drive, I end up losing files for no apparent reason. I usually don't notice the files are missing until later, because I am never informed of any data loss. Now, every time I make a large transfer of files, I just do it two or three times to make sure I don't lose any.
I downloaded Amarok 2.4 today and am attempting to use it to sync my music to my iPod Touch. .mp3 files transfer perfectly, but for some reason every .m4a file is being given a .mp4 extension and relegated to the 'Video' app on the iPod. I do realise that an .m4a is just AAC audio in an MP4 wrapper, but why are Amarok and/or the iPod deciding to treat them as video? It happens both with my own ffmpeg-encoded .m4a files and with unprotected .m4a files from the iTunes Store (relics of my time on Windows). Does anyone know what might be going on? Ubuntu 10.10 NE, Amarok 2.4, iPod Touch 2G (possibly 3G, but I'm pretty sure it's a 2G) running firmware 4.2.1.
I am trying to download an EBCDIC file from z/OS to UNIX using FTP in ASCII mode. The problem I am seeing is that when the UNIX FTP client issues a get, the UNIX CPU goes to 100% utilization while transferring the file in ASCII mode. I have done packet traces with Wireshark and noted that the z/OS server translates the file to ASCII before putting it on the wire; however, I believe the UNIX FTP client is then translating the file to ASCII again, causing the 100% CPU utilization. I need verification that this is what is happening. Also, is there a way around it other than transferring the file in binary mode and then doing an EBCDIC-to-ASCII conversion afterwards?
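To be explicit about the workaround I mention, here is a sketch of a local conversion after a binary-mode transfer; IBM1047 is just one common z/OS EBCDIC code page and may not match my dataset (the first line fakes the "binary get" by producing EBCDIC bytes locally so the example runs anywhere):

```shell
ebcdic_file=$(mktemp)
# Stand-in for a binary-mode FTP get: create some EBCDIC data locally.
printf 'HELLO WORLD' | iconv -f ASCII -t IBM1047 > "$ebcdic_file"
# One-time conversion on the UNIX side, instead of on-the-fly
# translation during the transfer:
iconv -f IBM1047 -t ASCII "$ebcdic_file"
```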
I have a Linux box that I plan to use primarily as a server. I also have another machine that dual-boots Windows/Linux. I would like a way to back up my files from the Windows/Linux box onto the Linux server. In other words, I am assuming the hard drive on the Windows/Linux box could fail at any time, and I want a backup of the important files. Should I set up an FTP server to do this? Are there any security issues I need to be concerned about if the files contain sensitive information?
Desktop and laptop, both with static IPs, can ping each other without issue; both run 10.04 LTS. I am attempting to transfer files using FileZilla (which worked when I tried it about two years ago!) and the overall response is 'No route to host'. I have scoured the interwebs and found no solution. I'm pretty sure I'm putting in the correct details.
I want to transfer an arbitrarily large file (say >20GB) between 2 servers. I have several considerations:
- Must use port 22 (SSH) because of firewall restrictions
- Cannot tax the CPU (it's a production server)
- Should be memory-efficient
- Would prefer a checksum check, but that could be done manually
- Time is not of the essence
Two scenarios: (a) Server A and Server B are on the same private network (sharing a switch) and data security is not a concern; (b) Server A and Server B are not on the same network and the transfer will go over the public internet, so data security is a concern. My first thought was using nice on an scp command with a non-CPU-intensive cipher (Blowfish?), but I thought I'd ask the SU community for recommendations.
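My current sketch is below; the hosts and paths are placeholders, and the manual checksum step is demonstrated on local stand-in files since that is the part that can run anywhere:

```shell
# The transfer itself (placeholder host/paths, shown for shape only):
#   nice -n 19 scp -P 22 -l 80000 bigfile.img user@serverB:/data/
# (-l caps bandwidth in Kbit/s, which also bounds CPU spent on crypto)
#
# Manual checksum check, demonstrated with a local "remote" copy:
src=$(mktemp); dst=$(mktemp)
head -c 65536 /dev/urandom > "$src"
cp "$src" "$dst"                       # stands in for the scp'd remote file
a=$(sha256sum "$src" | awk '{print $1}')
b=$(sha256sum "$dst" | awk '{print $1}')
[ "$a" = "$b" ] && echo "checksums match"
```

In practice the second sha256sum would be run on Server B over ssh and the two digests compared by eye or with a small script.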
I am running a Linux server and am somewhat familiar with scp, having used it successfully in the past, but only when talking to another Linux server. Here I need to talk to a Windows server. How do I do that? In addition, their server may require a VPN, which seems to add another layer of complexity.
Sometimes when transferring large files using scp between my desktop running Maverick and other servers running Ubuntu, Debian or CentOS, I get the following error message:
77% 258MB 11.3MB/s 00:06 ETA
Received disconnect from xxx.xxx.xxx.xxx: 2: Packet corrupt
I've found a seemingly related bug report on Launchpad here: but the "ethtool" fix provided there did not help. I'd be most grateful for any ideas on how to solve this issue. Some more info:
Linux lotus 2.6.35-22-generic-pae #35-Ubuntu SMP Sat Oct 16 22:16:51 UTC 2010 i686 GNU/Linux
lspci | grep eth -i
00:19.0 Ethernet controller: Intel Corporation 82567LM-3 Gigabit Network Connection (rev 02)
I've discovered that Dolphin seems to lose random files when copying many large folders.
I first noticed this a few months ago when I tried to copy my music library from one folder to another on the same HDD. It consisted of around 600 folders and 6,500 files. During the copy there were no errors, but afterwards I found that some of the newly copied folders were missing files. I put it down to human error or a glitch.
Yesterday I tried to copy 13 folders containing rips of some of my DVDs. Each folder basically held one film of either 700 MB or 1.4 GB. Again no errors showed up during the copy, but I found 3 of the newly copied folders were empty.
It's not so critical with music or films but I can't afford to lose work data like this.
Has anyone experienced or seen a similar problem with Dolphin? I'm going to have to do some more extensive testing but this is not good.
The first time I noticed the problem I was running KDE 4.3.4 (I think); the latest occurrence was with KDE 4.4.0.
I have been using gFTP to transfer files to/from a Windows machine, but I ran into a problem and wanted to try FileZilla. Using YaST (Package Search) I found and selected "filezilla - A GUI FTP and SFTP Client" and attempted to install it. The outcome is:
Attempting to install FileZilla under YaST:
Download failed: File '/repodata/repomd.xml' not found on medium url.
I need to copy some files from an OpenBSD 4.5 server to my Ubuntu setup via a flash disk. Can someone tell me what commands to enter on each system to do this? I believe OpenBSD uses UFS, and I'm running 9.10, so Ubuntu is on ext4.
I am trying to copy files across a network using the terminal. I know this is possible, but I can't find a way. I can, however, paste smb://xxx.xxx.xxx.xxx/myfolder into the address bar of a file manager window and navigate to the correct area. Is there some other way I can do this?
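To be concrete, the sort of thing I was hoping for is below; the IP and share name are from my example, "user" is a placeholder, and I have not confirmed these exact commands against my setup:

```shell
SHARE=//xxx.xxx.xxx.xxx/myfolder   # the share from my example above
# Option 1: smbclient gives an ftp-like interface, no mounting needed:
#   smbclient "$SHARE" -U user -c 'get remote.txt /tmp/remote.txt'
# Option 2: mount the share (needs cifs-utils and root), then use plain cp:
#   sudo mount -t cifs "$SHARE" /mnt/share -o username=user
#   cp /mnt/share/remote.txt ~/
echo "would fetch from $SHARE"
```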
I had to leave rather suddenly (the decision was Germany's, not my fiancée's), and now I am in the States again, but my files are still there. There are lots of things (mainly photos) on there that I would like to have here. SOME stuff is on CD/DVD, but some isn't.
Is there some way I could access her computer and just copy files to mine? (I picture two folders, one on hers and one on mine, and me just dragging an icon from one to the other; but I don't suppose anything's THAT easy...)
Naturally, her computer would have to be on, and I assume she could not be using it herself. Naturally also, I would get her permission first...
I assume it would take a while, but if it were as simple as the icon thing I described above, I could just go away and come back later. I assume something would be running to make sure the destination file and source file were identical before proceeding to the next file.
[Before you suggest it: why doesn't she just copy the files to DVDs and mail them to me? She is not all that computer-savvy, I'm afraid, and trying to explain it all, including the exact folder names and such, plus how to use a DVD/CD burner and its software, plus translating it all into German, would be too much. If I were rich enough I could just go there and do what I needed, but that is not in the cards at present.]
Is there a way to copy stuff from her computer to mine?