Ubuntu :: Data Loss When Transferring Large Number Of Files?
Jul 20, 2010
This problem is not exclusive to Ubuntu; I've experienced it in Windows and OS X as well. But it seems that almost every time I transfer a large number of files (e.g. my music collection) between my desktop computer and laptop via my external hard drive, I end up losing files for no apparent reason. I usually don't notice the files are missing until later on, because I am never informed of any data loss. Now, every time I make a large transfer of files, I just do it two or three times to ensure that I don't lose any files.
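Rather than repeating the whole transfer, one way to catch silent losses is to let rsync re-compare the copy against the source by checksum. A minimal sketch, assuming the collection lives in ~/Music and the external drive is mounted at /media/external (both paths are placeholders):
Code:
# Initial copy (or re-copy of anything missing/changed)
rsync -av ~/Music/ /media/external/Music/
# Verification pass: -c forces a full checksum comparison, --dry-run only reports differences
rsync -avc --dry-run ~/Music/ /media/external/Music/
Anything the second command lists is a file that didn't survive the transfer intact.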
View 2 Replies
Feb 6, 2011
Every time I attempt to transfer a large file (4 GB) over any protocol, my server restarts. On the rare occasions when it doesn't restart, it spits out a few error messages saying "local_softirq_pending 08" and then promptly freezes. Small files transfer fine.
Relevant information:
Ubuntu server 10.10
Four hard drives in RAID 5 configuration
CPU/HD temperatures are within normal range
View 7 Replies
Oct 5, 2010
I want to transfer an arbitrarily large file (say >20GB) between 2 servers. I have several considerations:
Must use port 22 (ssh) because of firewall restrictions
Cannot tax the CPU (production server)
Memory efficiency
Would prefer a checksum check but that could be done manually
Time is not of the essence
There are two scenarios: (a) Server A and Server B are on the same private network (sharing a switch), so data security is not a concern; (b) Server A and Server B are not on the same network and the transfer will go over the public internet, so data security is a concern. My first thought was using nice on an scp command with a non-CPU-intensive cipher (blowfish?), but I thought I'd refer to the SU community for recommendations.
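Along the lines of the nice + scp idea, here is a rough sketch for the same-switch case, with the file path, host name and bandwidth cap all placeholders; rsync over ssh is included because it can resume a 20 GB transfer and makes a later checksum pass easy, and cipher availability depends on the OpenSSH version:
Code:
# Low CPU priority, cheaper cipher, bandwidth cap (scp -l is in Kbit/s)
nice -n 19 scp -l 200000 -c blowfish-cbc /data/bigfile user@serverB:/data/
# Or rsync over ssh: --partial lets an interrupted transfer resume
nice -n 19 rsync -av --partial --bwlimit=20000 -e "ssh -c blowfish-cbc" /data/bigfile user@serverB:/data/
# Manual end-to-end check: run on both hosts and compare
sha1sum /data/bigfile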
View 1 Replies
Nov 12, 2010
Sometimes when transferring large files using scp between my desktop running Maverick and other servers running Ubuntu, Debian or CentOS, I get the following error message:
77% 258MB 11.3MB/s 00:06 ETA
Received disconnect from xxx.xxx.xxx.xxx: 2: Packet corrupt
I've found a seemingly related bug report on Launchpad here: but the provided "ethtool" fix did not help. I'd be most grateful for any ideas on how to solve this issue. Some more info:
Linux lotus 2.6.35-22-generic-pae #35-Ubuntu SMP Sat Oct 16 22:16:51 UTC 2010 i686 GNU/Linux
lspci | grep eth -i
00:19.0 Ethernet controller: Intel Corporation 82567LM-3 Gigabit Network Connection (rev 02)
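For what it's worth, the usual form of the ethtool workaround for this symptom is to turn off the NIC's offload features and retest; the interface name and the particular offloads to toggle are assumptions here:
Code:
# Assumes the Intel NIC is eth0; show current offload settings, then disable the common ones
sudo ethtool -k eth0
sudo ethtool -K eth0 tso off gso off gro off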
View 1 Replies
Jun 16, 2011
I am using the diff command with the -r option to compare a large number of files and files in subdirectories. My main interest is to find out which files have been changed, not what the actual changes are, and since a lot of files have been changed, it would be much easier to view the file names only. Is there an option for diff that might do this, or does there exist a similar tool/command that could do the job?
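diff itself has a brief mode that does exactly this. A minimal sketch, with old/ and new/ as placeholder directory names:
Code:
# -q/--brief reports only which files differ (or exist on one side), not the changes themselves
diff -rq old/ new/
# Keep only the "files differ" lines:
diff -rq old/ new/ | grep '^Files'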
View 1 Replies
Dec 6, 2010
In the middle of my script I am using the tar command to tar some 1000 images, and the size of each image is 5 MB.
All the images are provided as arguments, as in: tar -cvf file.tar <all images as arguments>
But my tar file file.tar does not contain all the images.
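When the images are handed to tar on the command line, the shell's argument-length limit can silently truncate the list. One common workaround (a sketch, assuming the images are *.jpg in the current directory) is to feed the file names to tar on stdin instead:
Code:
# Build the archive from a NUL-separated list produced by find, not from shell arguments
find . -maxdepth 1 -name '*.jpg' -print0 | tar --null -T - -cvf file.tar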
View 4 Replies
Feb 21, 2011
We recovered a large number of files from an HD I messed up. I am attempting to move large numbers of files of a given type (e.g. .txt, .jpg) into a folder by type, to more easily sort through them.
Here are the commands I have mainly been trying with various edits:
Code:
So far the most common complaint I have gotten is "missing arguments to execdir".
This is on Ubuntu 10.04
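"missing arguments to execdir" usually means find never saw a complete command plus the terminating \; after -execdir. The original commands aren't shown above, so this is only a generic sketch of the shape that works, with the extension and destination folder as placeholders (the destination must be an absolute path, because -execdir runs from each file's own directory):
Code:
# Move every .jpg into a single sorted folder; note the trailing \; that closes -execdir
mkdir -p /home/user/sorted/jpg
find . -type f -iname '*.jpg' -execdir mv {} /home/user/sorted/jpg/ \;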
View 6 Replies
Feb 10, 2010
I've a directory containing around 2.8 lakh (280,000) files. I want to move them to another directory. If I use cp or mv then I get an 'argument list too long' error. If I write a script like
for file in $(ls); do
    cp "$file" {destination}/
done
then, because of the ls command, its performance degrades. How can I do this?
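The "argument list too long" error comes from the shell expanding the * itself, so one sketch of a workaround (source and destination paths are placeholders) is to let find hand the names to mv in batches, which also avoids spawning one cp per file:
Code:
# Move all regular files from the source directory into the destination, batching the arguments
find /path/to/source -maxdepth 1 -type f -exec mv -t /path/to/destination {} +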
View 7 Replies
Mar 15, 2011
I am facing a problem copying a very large number of files, 18 lakh (1,800,000), from my personal hard disk to another hard disk. Each file is very small and the size of the folder is around 3.95 GB. Copying the files with the copy function in Windows is frustrating, and I am not even able to compress the folder; it gives me an error saying it is not readable. The other problem is that I am not able to open this drive in Linux: it shows an error telling me to run a disk check in Windows, and Windows' disk check is also not able to repair this drive and goes into some unsolvable mode. Is there any way to open the disk with this error, and if not, is there any way I can copy the data faster? The error in Linux reads roughly: "Disk labled EDU is corrupt go to windows and chkdsk /f there and reboot into window 2 times."
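When chkdsk can't repair the volume, one way to at least get the data off is to force a read-only mount of the NTFS partition under Linux and copy from there; the device name and mount point below are assumptions, and ntfs-3g may still refuse if the volume is flagged as dirty:
Code:
# Assumes the EDU partition shows up as /dev/sdb1; mount read-only and copy the files off
sudo mkdir -p /mnt/edu
sudo mount -t ntfs-3g -o ro /dev/sdb1 /mnt/edu
rsync -a /mnt/edu/ /path/to/destination/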
View 3 Replies
Aug 17, 2010
I understand that chroot is usually used to provide security; however, for my issue, security is a big don't-care. I am very new to using chroot and don't fully understand how the chroot'd env works.
problem: Trying to use a vendor supplied cross compile environment. The environment runs as a chroot'd env and works just fine. I have a large number of additional modules that I wish to compile in the chroot'd environment. FYI, these modules are also (successfully) compiled for other targets not using chroot'd envs. Copying the source files into the chroot environment is not an option (I don't have hours to wait for copies to finish and it would break the make system). Having them live in the environment is also not an option (the chroot build is a tiny part of the build process and we cannot revamp our entire source tree to accommodate it).
I am looking for a way to have the compiler in the chroot'd env have access to a path that is outside of the env, typically higher up in the same path that holds the chroot'd env. I have tried soft links (they don't work as expected). Hard links only work for single files, and there are tens of thousands of files that would need to be linked. I am not sure how I would go about exporting the additional files and then mounting the exported files in the chroot'd env (or if that would even work).
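A bind mount is the usual way to make a directory outside the chroot visible at a path inside it, without copying or linking anything. A minimal sketch, with /src/modules and /build/chroot as placeholder paths:
Code:
# Expose the external source tree inside the chroot at the path the build expects
sudo mkdir -p /build/chroot/src/modules
sudo mount --bind /src/modules /build/chroot/src/modules
# ... run the cross compile inside the chroot ...
sudo umount /build/chroot/src/modules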
View 2 Replies
May 28, 2011
I have the standard problem of trying to count a large number of files in a directory (>100k)
I have tried: ls ~/user/images/* -l | wc -l and find ~/user/images/* -maxdepth 1 -type f | wc -l
In both cases, I get the argument list too long error message.
I have tried using xargs but I can't seem to get it to work right.
The command
returns a valid answer but it includes all the subdirectories in the file count.
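The "argument list too long" error comes from the shell expanding the * before ls or find ever runs; a sketch of a count that avoids the glob entirely (using the path from the post):
Code:
# Count only regular files directly inside the directory; no glob, so no argument limit
find ~/user/images -maxdepth 1 -type f | wc -l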
View 4 Replies
Nov 18, 2010
I'm trying to extract the sender id from a fairly large number of files and am having trouble assigning variables from a file. Here is what I have so far (which is fairly kludgy, I know, but it's been some years since I've done any scripting or programming, and I find that I have lost the knack to a large degree).
[Code]...
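The script itself didn't survive the post, so the following is only a generic sketch of pulling a field out of each file into a shell variable, assuming the sender id sits on a line like "Sender-ID: 12345" (the file glob and field name are made-up examples):
Code:
# For every file, grab the value after "Sender-ID:" and use it
for f in *.msg; do
    sender_id=$(grep -m1 '^Sender-ID:' "$f" | awk '{print $2}')
    echo "$f -> $sender_id"
done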
View 1 Replies
Sep 18, 2010
I want to move files from a $SOURCEDIR to a $DESTBASE/$DESTDIR. Under $DESTBASE there are many directories, and I need to test beforehand if a file from $SOURCEDIR already exists in any of them.
This is obviously extremely slow, and the real use case involves dozens of dirs and thousands of files. Creating a temporary "index" file for the find command (instead of running it every iteration) speeds it up a little, but it's still very clumsy.
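One version of that index idea, sketched with the $SOURCEDIR/$DESTBASE/$DESTDIR names from above: build the list of existing file names once with GNU find, then check each candidate against it with grep instead of re-running find per file:
Code:
# Index of file names already present anywhere under $DESTBASE
find "$DESTBASE" -type f -printf '%f\n' | sort -u > /tmp/dest-index.txt

# Move only files whose names are not already in the index
for f in "$SOURCEDIR"/*; do
    name=$(basename "$f")
    if ! grep -qxF "$name" /tmp/dest-index.txt; then
        mv "$f" "$DESTBASE/$DESTDIR/"
    fi
done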
View 14 Replies
Oct 20, 2010
I have two servers; one has an empty /, and the other has a subdirectory with a large amount of data (4 GB) in many, many files. I need a way to transfer the files en masse from the server with the large number of files to the one that is essentially blank. I don't have space on the full host to simply gzip all the files. I've googled this and see that there may be some combination of tar and/or gzip that will let me do this with some sort of redirection.
I really need an example line of how this can be accomplished. If my explanation seems rather sparse, I can supply more details.
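One sketch of the tar-over-ssh idea: nothing is ever written to disk on the full host, because tar streams to stdout, ssh carries the stream, and tar on the other side unpacks it (directory, user and host names are placeholders):
Code:
# Push from the full server to the empty one
tar czf - /path/to/data | ssh user@emptyserver 'tar xzf - -C /destination'
# Or pull, run from the empty server
ssh user@fullserver 'tar czf - /path/to/data' | tar xzf - -C /destination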
View 3 Replies
Jul 6, 2011
I'm experiencing a connection problem when transferring a large file from Windows 7 (Home Premium) to my Ubuntu 11.04 machine. The transfer starts, but after a couple of seconds the connection drops and all the shares become unavailable. I'm also unable to connect to the server over ssh, and the only thing I can do to restore the connection is to reboot the server. The strange part is that this was never a problem a couple of weeks ago, and I've not done anything to the setup on either machine besides installing security updates.
View 9 Replies
Jun 15, 2010
I would like to transfer a massive amount of data between my laptops simultaneously. Is there a way to attach a cable via one flash port on one laptop to another and transfer like that?
View 4 Replies
May 30, 2011
I was dual booting Ubuntu 8.10 and Windows Vista on an Acer Aspire 5920. My laptop went in for service and the company said they will give me a fresh laptop instead of the one I am using now, since they cannot set it right.
When I plugged my hard disk in through an external cable to a Windows machine, it detected only the data present in the Vista partition (which is almost negligible in my case). I would like to transfer all the data I have on the hard disk, at least the data in the Ubuntu 8.10 partition, to my new laptop, on which I plan to run Ubuntu. Can someone guide me on how to go about doing it? Also, is it possible to transfer all the installed software so that it runs on the new laptop without any extra effort?
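Windows can't read ext3/ext4, which is why only the Vista partition showed up. From the new laptop running Ubuntu, the old Ubuntu partition can be mounted from the external enclosure and the data copied across. A sketch, where the device name is an assumption to be checked with fdisk first (installed software generally has to be reinstalled through the package manager rather than copied):
Code:
# Find the old Ubuntu partition on the external disk, mount it read-only, copy the home data
sudo fdisk -l
sudo mkdir -p /mnt/oldubuntu
sudo mount -o ro /dev/sdb1 /mnt/oldubuntu
rsync -a /mnt/oldubuntu/home/youruser/ /home/youruser/from-old-laptop/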
View 9 Replies
Jun 13, 2011
I am not sure where to post this, so I figure that general help is as good as any. I have just bought myself a 32 GB micro SD card and I want to transfer all my old stuff (apps, contacts, music) off my existing card. How do I do this?
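For the plain files on the card (music, photos and the like), one sketch is to put both cards in a reader, or mount them one after the other, and mirror the old card onto the new one. The mount points are assumptions, and apps/contacts managed by the phone itself may need the phone's own backup/restore instead:
Code:
# Assumes the old card is mounted at /media/oldcard and the new one at /media/newcard
rsync -av /media/oldcard/ /media/newcard/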
View 2 Replies
Mar 6, 2010
openSuse 11.2, KDE 4 desktop. User = newbie. I was having some trouble with a hard drive that I store my photos and other personal data on. Firstly, it was messing my OS up, so that it would not boot to the desktop but only to the command line, where it would report that this drive was somehow corrupted and needed to be repaired. Secondly, the OS was reporting that certain files were 380,000.2 TB in size; this drive is only 200 GB. I didn't want to do the repair until I found a way to back up my data. I found another hard drive to do the backup to, but I had to take some data off of it first, which I did.
In the spirit of trying to learn the command line, I decided to partition and format the hard drive using fdisk. I found some easy-to-use directions and proceeded to partition the drive. Unfortunately, I failed to realize the directions were only for the partitioning part [my fault for not reading further]. What I missed was that after using fdisk to partition, you must use mkfs to create the filesystem [I thought I had done that in fdisk using the "l" and "t" commands within fdisk].
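For reference, fdisk's "t" command only sets the partition type code; the filesystem itself is created with mkfs, and mkfs erases whatever is on the partition, so it has to run before any data is copied on. A minimal sketch of the missed step, assuming the new partition is /dev/sdb1 and ext4 is wanted (device and filesystem type are assumptions):
Code:
# Create the filesystem on the freshly partitioned drive, then mount it
sudo mkfs.ext4 /dev/sdb1
sudo mkdir -p /mnt/backup
sudo mount /dev/sdb1 /mnt/backup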
My questions are:
1] Can I do this mkfs command after I have put my data on the new hard drive, or should I start fresh and redo the partitioning? [I have not deleted the other drive, so my data is still there.]
2] What would be the cause of the incorrectly reported file sizes?
View 2 Replies
Dec 9, 2009
I'm using Mandriva 08 and Windows 7 on my system. When I log in to Linux and try to copy data from the Linux drive to the Windows drive, or vice versa, it shows:
[root@lenovo mnt]# cp /home/simer/Desktop/*.avi win_e/
cp: cannot create regular file `win_e/xyz.avi': Read-only file system
cp: cannot create regular file `win_e/abc.avi': Read-only file system
I tried rsync and scp, but they didn't work either.
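"Read-only file system" usually means the NTFS partition was mounted read-only, either by the kernel's old read-only ntfs driver or because Windows didn't shut down cleanly. A sketch of the usual fix is to remount it read-write with the ntfs-3g driver; the device name here is an assumption, and the ntfs-3g package may need to be installed first:
Code:
# Remount the Windows partition read-write using the ntfs-3g driver
sudo umount /mnt/win_e
sudo mount -t ntfs-3g /dev/sda5 /mnt/win_e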
View 2 Replies
Jan 7, 2010
I have installed Ubuntu Server 64-bit with LAMP, so all in one (Apache, MySQL and PHP), and I need to downgrade MySQL because of serious problems with MySQL 5.1; this is the only solution. But I need Apache and PHP to keep working, and I want to keep my my.ini as I have it now (I can back it up). So is there any safe way to do that?
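Whatever route the downgrade takes, here is a sketch of the safety net to put in place first: dump all databases and save the config so the data can be reloaded into whichever MySQL version ends up installed. The package name/version in the last lines is an assumption and depends on what the repositories actually offer:
Code:
# Back up every database and the current configuration before touching packages
mysqldump -u root -p --all-databases > /root/all-databases.sql
sudo cp -a /etc/mysql /root/mysql-config-backup
# Then swap the server package, if the repositories provide an older version
sudo apt-get remove mysql-server-5.1
sudo apt-get install mysql-server-5.0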
View 9 Replies
Jun 14, 2010
I installed Ubuntu 10.04 on the laptop and it looks pretty good. I currently run 9.10 on the main desktop and would like to upgrade to 10.04, by pressing "upgrade" in the update manager, but I have some questions before I do, namely about data loss.
If I upgrade, will stuff like Thunderbird keep my emails, FF keep its profile (cookies, bookmarks, addons etc.), and the Documents folder keep all the documents? I have an Apache server installed with a few websites - will they still be there after an upgrade? I also have a virtual machine with windoze on it; what about all the stuff in there and VMware itself?
Or will I need to back everything up onto an external hard drive (I'm not sure how to back up Thunderbird and FF), then reinstall everything and transfer all the documents, websites etc. back over again?
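A release upgrade keeps /home intact, so mail and browser data normally survive, but backing the profiles up first is cheap insurance. A sketch, assuming the external drive is mounted at /media/backup; on Linux, Thunderbird keeps its profile in ~/.thunderbird and Firefox in ~/.mozilla:
Code:
# Mail and browser profiles
mkdir -p /media/backup/profiles
rsync -a ~/.thunderbird ~/.mozilla /media/backup/profiles/
# Apache sites and config
sudo mkdir -p /media/backup/server
sudo rsync -a /var/www /etc/apache2 /media/backup/server/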
View 3 Replies
Jun 27, 2011
My company needs to send sensitive data across to another company: 800 GB of .dpx files. The way I have thought of doing it is:
eSATA enclosure with a 1 TB WD Black drive.
TrueCrypt-encrypted, with hardware-accelerated AES (3x machines being built with 2600Ks).
sha1sum on each file (a sketch of that step follows after the goals below).
The main goal is to make sure that
1. The files that were transferred off the server onto the drive, are exactly the same.
2. Secure.
3. Fast.
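A minimal sketch of the per-file sha1sum step mentioned above: build a manifest on the source server, send it along with (or separately from) the drive, and verify on the receiving end; the paths are placeholders:
Code:
# On the source server: one checksum line per file
cd /data/dpx && find . -type f -exec sha1sum {} + > /data/dpx-manifest.sha1

# On the receiving machine, from the root of the copied tree; prints only mismatches
cd /copied/dpx && sha1sum -c /path/to/dpx-manifest.sha1 | grep -v ': OK$'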
View 9 Replies
Nov 18, 2010
I have a 1 TB external HD that at the time of purchase was used with my PS3, which only allowed FAT32 HDs. But now I am using it for other things, and I have come across the 4 GB file size limit that FAT32 has. The problem is I have about 200 GB of data on this HDD and wish to convert it to NTFS with no data being lost. Is this possible, and if so, how?
Edit: BTW no Microsoft just Ubuntu
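There's no in-place FAT32-to-NTFS conversion tool on Linux (that's a Windows "convert" feature), so the usual Ubuntu-only route is: copy the 200 GB somewhere else, reformat the drive as NTFS, and copy it back. A sketch; the device name is an assumption and must be double-checked with fdisk, because mkfs.ntfs wipes the partition:
Code:
# 1. Copy everything off the FAT32 drive first
rsync -a /media/external/ /home/user/external-backup/

# 2. Reformat as NTFS (DESTROYS the contents; verify the device with: sudo fdisk -l)
sudo umount /dev/sdb1
sudo mkfs.ntfs -Q -L EXTERNAL /dev/sdb1

# 3. Remount the drive and copy the data back
rsync -a /home/user/external-backup/ /media/external/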
View 2 Replies
Aug 3, 2011
This forum might not be the best place for this question, but some people here are pretty knowledgeable and may have more insight than I do about this. Anyways, I'm thinking about expanding an NTFS (Windows 7) partition on my desktop computer into unallocated space. I know that there is a risk when shrinking an NTFS partition due to fragmentation, but are there any risks of data loss from expanding an NTFS partition? My common sense tells me there isn't a risk, but I want to be 100% sure I won't lose any files.
View 4 Replies
Feb 23, 2010
I used 9.04 for months and it worked fine until I restarted my PC. After I restarted, memory consumption goes up to 4.2 GB right after login. However, I cannot find any process that consumes such a large amount of memory.
[code]....
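The output above didn't make it into the post, so here is only a generic sketch of how the usage is normally broken down; a large chunk of "used" memory on Linux is often just the page cache, which free reports separately on the buffers/cache line:
Code:
# Overall usage, with and without buffers/cache
free -m
# Top memory consumers by resident size
ps aux --sort=-%mem | head -15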
View 3 Replies
Jun 29, 2010
I'm running 64-bit 10.04, upgraded from 9.10. The problem I am experiencing is that any user accounts aside from my main account are problematic. This includes any accounts I add, as well as the GDM guest session. The specific problems that I have thus far experienced are as follows:
1. The desktop often loads improperly. In the latest instance of this, the graphics on the right side of the top panel were randomly chopped up, leaving parts of my clock on either side of the volume control, among other things.
2. If I make ANY customizations to the desktop at all, the desktop takes nearly a full minute to load on log-in.
3. Flash videos don't work properly in Firefox. Sometimes they only play after refreshing a page; often they will not load at all. Also, attempting to load or play a Flash video sometimes causes Gnome or Firefox to crash.
4. (And this is the one that REALLY has me stumped.) Whenever I log into my main account after logging out of another account, the IBus control appears in my system tray. However, when I open the IBus preferences, the associated check box is (and has always remained) unchecked.
Not sure where to go with this one. More than anything, the IBus bug makes me unsure of where to even begin looking for the problem.
View 1 Replies
Dec 7, 2010
I have a Dell workstation with two HDDs. HDD 1 has Red Hat 5.3 installed with LVM, and HDD 2 is empty; RAID 1 is not set up. I want to set up RAID 1 (hardware RAID), but there is a problem: I don't want to lose the data on HDD 1 when I set up the RAID. I tried to ghost or back it up, but when I restore, it errors out because LVM is set up on that disk.
View 4 Replies
Aug 11, 2010
How can I recover data from a mounted RAID 5 array?
View 5 Replies
Oct 25, 2010
I have a laptop running slackware-current. The disk is /dev/sda and the root 'sda1' is xfs formatted (there is also linux swap at sda2).
Recently I was trying to set up openvpn and had to copy a folder with configuration files from /usr/doc/openvpn_<version>/easy-rsa to /etc/openvpn.
I am sure the copying completed, because I got a prompt back, but a few seconds later the battery died on me. When I got mains power and booted it up, I could see the directory I copied under /etc/openvpn, and the files were all there too, but they all contained nothing, i.e. they had a size of 0.
I read [URL] that an external journal filesystem for root is not supported. I am not sure if it applies to my situation though. As in, does it use an internal journal instead?
And the bottom line is: shouldn't the copying have completed successfully? Shouldn't I be worried that this copy failed?
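A likely explanation is XFS's delayed allocation: the metadata (the file names) had been journaled, but the data blocks were still only in memory when the battery died, so the files reappeared with a size of 0. When a copy has to survive a sudden power loss, one sketch is to force the data to disk before trusting it (the openvpn path keeps the <version> placeholder from above):
Code:
# Copy, then flush everything to disk before relying on the result
cp -r /usr/doc/openvpn_<version>/easy-rsa /etc/openvpn/ && sync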
View 7 Replies