General :: Copying Large Number Of Files From One Directory To Another

Feb 10, 2010

I have a directory containing around 2.8 lakh (280,000) files and I want to move them to another directory. If I use cp or mv I get the error 'argument list too long'. If I write a script like

for file in `ls *`; do
    cp "$file" {destination}
done

then, because of the ls command and copying one file at a time, its performance degrades badly. How can I do this efficiently?
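
One common workaround (a minimal sketch; the source and destination paths are placeholders) is to pass the directory to find and let it hand the names to mv in batches, or to use rsync, neither of which runs into the shell's argument-list limit:

Code:

# Move the files in batches; the shell never has to expand a huge glob
find /path/to/source -maxdepth 1 -type f -print0 | xargs -0 mv -t /path/to/destination/

# Or copy the whole directory with rsync and remove the source afterwards if desired
rsync -a /path/to/source/ /path/to/destination/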

View 7 Replies



General :: Copying Large Number Of Files In Windows?

Mar 15, 2011

I am facing a problem copying a large number of files - about 18 lakh (1,800,000) - from my personal hard disk to another hard disk. Each file is very small and the folder is around 3.95 GB in total. Copying the files with the copy function built into Windows is frustratingly slow, and I am not even able to compress the folder; it gives me an error saying it is not readable. The other problem is that I am not able to open this drive in Linux: it shows an error telling me to run a disk check in Windows, but the Windows disk check is also unable to repair the drive and gets stuck in some unrecoverable state. Is there any way to open a disk with errors, and if not, is there any way I can copy the data faster? ERROR: Disk labeled EDU is corrupt, go to Windows, run chkdsk /f there and reboot into Windows 2 times.

View 3 Replies View Related

Server :: Difference After Copying Large Directory To A New Directory?

Apr 4, 2010

I am running a RHEL-5 server. The ABC directory is 57 GB in size, but after taking a backup on the same disk under the name ABC.bkp, the copy shows 56 GB. I used the command below to copy/back up: # cp -r ABC ABC.bkp (different sizes after copying). I checked both directory sizes with #du -sh ABC and #du -ks ABC.bkp; in both GB and KB there is a large difference (about 200 MB). Why does this happen when copying? What is the solution, and what is the correct way to copy one directory to a new directory exactly?
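
The difference is usually explained by how the two trees are stored rather than by missing data: cp -r does not preserve sparse files, hard links, or special attributes, and du reports allocated blocks, which can differ between copies of identical content. A minimal sketch for copying more faithfully and then comparing contents rather than block usage (directory names as in the post):

Code:

# Preserve permissions, timestamps, symlinks and sparseness
cp -a --sparse=always ABC ABC.bkp

# Or use rsync, which also preserves hard links with -H
rsync -aH ABC/ ABC.bkp/

# Compare logical sizes and actual contents instead of allocated blocks
du -sh --apparent-size ABC ABC.bkp
diff -rq ABC ABC.bkp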

View 4 Replies View Related

General :: Tar Not Working With Large Number Of Files?

Dec 6, 2010

In the middle of my script I am using the tar command to archive some 1000 images, each around 5 MB in size.

All of the images are passed as arguments: tar -cvf file.tar <all images as arguments>

But my tar file file.tar does not contain all of the images.
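
If the file list is built by a shell glob, the likely cause is the argument-list limit silently truncating the set of names. A hedged sketch that streams the file names to tar instead of passing them as arguments (the image directory path is a placeholder):

Code:

# GNU tar reads NUL-separated names from stdin with --null -T -
find /path/to/images -maxdepth 1 -type f -name '*.jpg' -print0 | tar --null -T - -cvf file.tar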

View 4 Replies View Related

General :: Cp Adds Exclamation Points When Copying Very Large Text Files?

Jul 13, 2009

For my research I have some very large files that are basically millions of lines of ten columns of numbers. These files can be up to 5 GB in size. Recently I noticed that when I made a copy of one of my files, some exclamation points appeared in it where there should not be any: in front of random numbers throughout the file. Making another copy of the file would result in exclamation points in front of different numbers in different parts of the file. Doing this many times has given me up to four exclamation points in different parts of the file. Sometimes the file copies just fine without producing any extraneous exclamation points.

Additionally, I have occasionally seen a "^K" where there should be a newline (the data that should have been on the next line was instead on the previous line with a ^K in front of it) in copies that I have made of my files. I don't know if this is related or not.
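
It is worth noting that '!' (0x21) differs from '1' (0x31) by a single bit, and ^K (0x0B) differs from a newline (0x0A) by a single bit, so random single-bit changes like this usually point to failing RAM, disk, or cabling rather than to cp itself. A simple way to confirm that copies really differ from the source (the file names here are placeholders):

Code:

# Checksums should match; repeated mismatches on fresh copies indicate hardware trouble
md5sum original.dat copy.dat

# Show the exact byte offsets that differ
cmp -l original.dat copy.dat | head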

View 7 Replies View Related

General :: Share A Large Number Of Files Into Chroot Env?

Aug 17, 2010

I understand that chroot is usually used to provide security; however, for my purposes security is not a concern at all. I am very new to using chroot and don't fully understand how a chroot'd environment works.

The problem: I am trying to use a vendor-supplied cross-compile environment. The environment runs as a chroot'd env and works just fine. I have a large number of additional modules that I wish to compile in the chroot'd environment. FYI, these modules are also (successfully) compiled for other targets that don't use chroot'd envs. Copying the source files into the chroot environment is not an option (I don't have hours to wait for the copies to finish, and it would break the make system). Having them live inside the environment is also not an option (the chroot build is a tiny part of the build process and we cannot revamp our entire source tree to accommodate it).

I am looking for a way to let the compiler in the chroot'd env access a path that is outside of the env, typically higher up in the same path that holds the chroot'd env. I have tried soft links (they don't work as expected). Hard links only work for single files, and there are tens of thousands of files that would need to be linked. I am not sure how I would go about exporting the additional files and then mounting the exported files in the chroot'd env (or whether that would even work).
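
A bind mount is the usual way to make an existing directory tree visible inside a chroot without copying or linking anything; it works for whole trees with any number of files. A minimal sketch, assuming placeholder paths for the real source tree and the chroot location:

Code:

# Expose the external source tree inside the chroot at /src
sudo mkdir -p /path/to/chroot/src
sudo mount --bind /path/to/real/source /path/to/chroot/src

# Optionally make it read-only from inside the chroot
sudo mount -o remount,bind,ro /path/to/chroot/src

# The compiler inside the chroot now sees the files under /src
sudo chroot /path/to/chroot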

View 2 Replies View Related

OpenSUSE :: Dolphin Losing Files When Copying Many Files Or Large Folders?

Feb 14, 2010

I've discovered that Dolphin seems to lose random files when copying many large folders.

I first noticed this a few months ago when I tried to copy my music library from one folder to another on the same HDD. It consisted of around 600 folders and 6500 files. During the copy there were no errors but after the copy I found that some of the newly copied folders were missing files. I put it down to human error or a glitch.

Yesterday I tried to copy 13 folders containing rips of some of my DVDs. Each folder basically had one film of either 700MB or 1.4GB. Again no errors showed up during the copy but I found 3 of the newly copied folders were empty.

It's not so critical with music or films but I can't afford to lose work data like this.

Has anyone experienced or seen a similar problem with Dolphin? I'm going to have to do some more extensive testing but this is not good.

The first time I noticed the problem I was running KDE4.3.4 (I think) and now the latest was with KDE4.4.0.

View 9 Replies View Related

Ubuntu :: Command With The -r Option To Compare A Large Number Of Files And Files In Subdirectories

Jun 16, 2011

I am using the diff command with the -r option to compare a large number of files and files in subdirectories. My main interest is to find out which files have been changed, not what the actual changes are, and since a lot of files have been changed it would be much easier to view the file names only. Is there an option for diff that does this, or is there a similar tool/command that could do the job?
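
diff itself can do this; a minimal sketch (directory names are placeholders):

Code:

# -r recurses, -q (--brief) prints only which files differ or exist in only one tree
diff -rq /path/to/old /path/to/new

# Alternative: a checksum-based dry run with rsync lists files whose contents changed
rsync -rcn --out-format='%n' /path/to/old/ /path/to/new/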

View 1 Replies View Related

Fedora Hardware :: Low Transfer Rate When Copying Large Files Over Wireless

Jan 11, 2010

I just bought a HP 3085dx laptop with an intel 5100 agn wireless card.
The problem: copying a big file over wireless to a computer that is hardwired to the router over gigabit Ethernet only gives an average transfer rate of 3.5 MB/second. If I do the same copy from my wireless-n MacBook Pro to the same computer, I get a transfer rate of about 11 MB/sec. Why the big difference? I noticed the HP always connects to the 2.4 GHz band instead of the 5 GHz band...

On the HP:
[jerry@bigbox ~]$ ifconfig wlan0
wlan0 Link encap:Ethernet HWaddr 00:24:D6:36:AC:C4
inet addr:192.168.1.75 Bcast:192.168.1.255 Mask:255.255.255.0
inet6 addr: fe80::224:d6ff:fe36:acc4/64 Scope:Link
UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1
RX packets:639243 errors:0 dropped:0 overruns:0 frame:0
TX packets:1293049 errors:0 dropped:0 overruns:0 carrier:0
collisions:0 txqueuelen:1000
RX bytes:53832795 (51.3 MiB) TX bytes:1888619922 (1.7 GiB)

[jerry@bigbox ~]$ iwconfig wlan0
wlan0 IEEE 802.11abgn ESSID:"<censored>"
Mode: Managed Frequency:2.412 GHz
Access Point: 00:24:36:A7:27:A3
Bit Rate=0 kb/s Tx-Power=15 dBm
Retry long limit: 7 RTS thr: off Fragment thr:off
Power Management: off
Link Quality=70/70 Signal level=-8 dBm Noise level=-87 dBm
Rx invalid nwid:0 Rx invalid crypt:0 Rx invalid frag:0
Tx excessive retries:0 Invalid misc:0 Missed beacon:0

I am not getting any errors. I don't know why the bit rate is not shown. My AirPort Extreme base station typically reports that the 'rate' for the HP is around 250-300 Mbps, and about the same for the MacBook Pro. The HP is about 6 inches away from the base station. Is there any way to get the rascal to go faster? Is there any way to get it to use the 5 GHz band?

View 3 Replies View Related

Ubuntu :: Copying Files To A Directory And Skip The Files That Already Exist In The Directory?

Jun 30, 2011

How would I go about copying files to a directory, yet skip the files that already exist in the directory, and also remove the files that are in the directory? For example:

Code:

$ls /dir1
img001.jpg
img002.jpg

[code]....

Now I would like to copy from dir1 to dir2, but the contents of dir2 would then be:

Code:

$ls /dir2
img003.jpg
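
If the goal is to copy only files that are not already present in the destination, cp -n or rsync --ignore-existing does that directly; the last line sketches the removal step implied by the example above. Directory names follow the post; treat this as a sketch rather than a finished script:

Code:

# Copy from dir1 to dir2 without overwriting anything that already exists in dir2
cp -n /dir1/* /dir2/

# The same with rsync, which also copes with very large file counts
rsync -a --ignore-existing /dir1/ /dir2/

# Remove from dir2 every file that also exists in dir1
for f in /dir1/*; do rm -f "/dir2/$(basename "$f")"; done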

View 7 Replies View Related

General :: Transfer Large Number Of Files Host To Host

Oct 20, 2010

I have two servers: one has an essentially empty filesystem, and the other has a subdirectory holding a large amount of data (about 4 GB) spread over many, many files. I need a way to transfer the files en masse from the server with the large number of files to the one that is essentially blank. I don't have space on the source host to simply gzip all the files first. I've googled this and see that there may be some combination of tar and/or gzip that will let me do this with some sort of redirection.

I really need an example line of how this can be accomplished. If my explanation seems rather sparse, I can supply more details.
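
The classic way to do this without creating an intermediate archive is to pipe tar through ssh, so compression happens on the fly and nothing extra is written on the source host. A minimal sketch (host names and paths are placeholders):

Code:

# Stream the subdirectory to the remote host and unpack it there in one pass
tar -czf - -C /path/to/parent subdir | ssh user@remotehost 'tar -xzf - -C /destination'

# rsync is an alternative that compresses in transit and can resume if interrupted
rsync -az /path/to/parent/subdir/ user@remotehost:/destination/subdir/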

View 3 Replies View Related

General :: Find Number Of Files In A Directory?

Feb 22, 2010

I need to know how to find the number of files in a directory. Are there any system calls for this in Fedora 12? I also need to know how to perform an operation when that count increases by one.
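
There is no dedicated system call for "number of files", but counting and watching are both easy from the shell. A hedged sketch (the directory path is a placeholder; inotifywait comes from the inotify-tools package):

Code:

# Count regular files directly inside the directory
find /path/to/dir -maxdepth 1 -type f | wc -l

# Run a command every time a new file appears
inotifywait -m -e create /path/to/dir | while read -r dir event name; do
    echo "New file: $name (total now: $(ls -1 /path/to/dir | wc -l))"
done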

View 14 Replies View Related

Debian :: Rsync Hangs After Copying A Certain Number Of Files

Nov 9, 2015

I currently have a problem running rsync on 64-bit Debian Jessie (although the problem also occurred with 64-bit Debian Wheezy). I am trying to use rsync to archive my home directory (which is on a hard disk) to a USB memory stick. The home directory is about 18GB in size and the memory stick has 32GB.

Unfortunately, rsync hangs after copying a certain number of files and the process eventually has to be killed. Rsync was rerun but hung again at about the same point as before. This has now happened several times; each time the hang occurs at about the same point. Use of strace after the hang shows that rsync appears to be processing a pdf file at the time, although not always the same pdf file. I originally had the rsync hang problem on a PC which ran 64-bit Wheezy and which used a USB 2.0 port.

I am now running rsync on another PC, which runs 64-bit Jessie and which uses a USB 3.0 port. I have also tried three different USB sticks, two from one manufacturer and the third from another manufacturer. All give similar rsync hangs.

View 11 Replies View Related

Ubuntu :: Data Loss When Transferring Large Number Of Files?

Jul 20, 2010

This problem is not exclusive to Ubuntu, I've experienced it in Windows and OSX as well, but it seems that almost every time I transfer a large number of files (i.e. my music collection) between my desktop computer and laptop via my external hard drive, I end up losing files for no reason. I usually don't notice the files are missing until later on, because I am never informed of any data loss. Now, every time I make a large transfer of files, I just do it two or three times to ensure that I don't lose any files.

View 2 Replies View Related

Ubuntu :: Find And Move Commands On Large Number Of Files ?

Feb 21, 2011

We recovered a large number of files from an HD I messed up. I am attempting to move large numbers of files of a given type (e.g. .txt, .jpg) into a folder by type, to more easily sort through them.

Here are the commands I have mainly been trying with various edits:

Code:

Code:

So far the most common complaint I have gotten is "missing arguments to -execdir".

This is on Ubuntu 10.04
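
The "missing arguments to -execdir" complaint usually means the -exec/-execdir action was not terminated with \; or +. A hedged sketch of the sorting step (the source and destination directories are placeholders):

Code:

mkdir -p /recovered/jpg /recovered/txt

# -exec ... {} + batches many files per mv invocation; -iname matches case-insensitively
find /recovered/raw -type f -iname '*.jpg' -exec mv -t /recovered/jpg {} +
find /recovered/raw -type f -iname '*.txt' -exec mv -t /recovered/txt {} +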

View 6 Replies View Related

Applications :: Extract The Sender Id From A Fairly Large Number Of Files?

Nov 18, 2010

I'm trying to extract the sender id from a fairly large number of files and am having trouble assigning variables from a file. Here is what I have so far, (which is fairly kludgy I know, but it's been some years since I've done any scripting or programming, and I find that I have lost the knack to a large degree).

[Code]...

View 1 Replies View Related

General :: Count The Number Of Lines In All Files In This Directory?

Jul 5, 2011

I want to count the lines of all files in this directory and all its subdirectories, but exclude directories "public", "modules", and "templates".
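
A minimal sketch using find's -prune to skip the excluded directories entirely (directory names as given in the post):

Code:

# Total line count of all files under the current directory,
# never descending into public, modules or templates
find . \( -name public -o -name modules -o -name templates \) -prune -o -type f -print0 \
    | xargs -0 cat | wc -l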

View 2 Replies View Related

Ubuntu :: Large Number Of Files To Count - Argument List Too Long

May 28, 2011

I have the standard problem of trying to count a large number of files in a directory (>100k)

I have tried: ls ~/user/images/* -l | wc -l and find ~/user/images/* -maxdepth 1 -type f | wc -l

In both cases, I get the argument list too long error message.

I have tried using xargs but I can't seem to get it to work right.

The command

returns a valid answer but it includes all the subdirectories in the file count.
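
The error comes from the shell expanding the * into one enormous argument list before ls or find ever runs; passing the directory itself avoids it. A minimal sketch:

Code:

# Let find walk the directory; -type f keeps subdirectories out of the count
find ~/user/images -maxdepth 1 -type f | wc -l

# Or list with ls -p (directories get a trailing /) and filter them out
ls -p ~/user/images | grep -v / | wc -l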

View 4 Replies View Related

Programming :: BASH Script Optimization For Testing Large Number Of Files

Sep 18, 2010

I want to move files from a $SOURCEDIR to a $DESTBASE/$DESTDIR. Under $DESTBASE there are many directories, and I need to test beforehand if a file from $SOURCEDIR already exists in any of them.

This is obviously extremely slow, and the real use case involves dozens of dirs and thousands of files. Creating a temporary "index" file for the find command (instead of running it every iteration) speeds it up a little, but it's still very clumsy.
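
One common speed-up is to build the destination index once into an in-memory lookup table and test each source file against it, instead of re-running find for every file. A hedged sketch along those lines, reusing the variable names from the post (paths are placeholders; bash 4 or later is needed for the associative array):

Code:

#!/bin/bash
SOURCEDIR=/path/to/source
DESTBASE=/path/to/destbase
DESTDIR=incoming

# Index every file name under $DESTBASE exactly once
declare -A seen
while IFS= read -r -d '' f; do
    seen["$(basename "$f")"]=1
done < <(find "$DESTBASE" -type f -print0)

# Move only the files whose names were not seen anywhere under $DESTBASE
for f in "$SOURCEDIR"/*; do
    name=$(basename "$f")
    [[ -z ${seen[$name]} ]] && mv "$f" "$DESTBASE/$DESTDIR/"
done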

View 14 Replies View Related

General :: Ls Command And Displaying Number Of Files In Current Directory?

Oct 15, 2010

What command will provide you with the number of files in your current directory?
Choose one answer.
A. ls -c
B. ls | wc -w (this one)
C. ls -n | count
D. ls -wc (this one ?)

View 5 Replies View Related

Ubuntu Multimedia :: Large *.mp4 Files In Gtkpod - Hangs At "Copying Tracks" At 0%

Jul 27, 2010

I am dual booting XP and Ubuntu 10.04, but in the future I will be getting a new machine and I will only be running Ubuntu and won't have access to iTunes. Because I have an iPod Touch, I have been trying to find workarounds for syncing everything that iTunes took care of in the past. One problem I have is managing movies. I have looked through various media players/iPod management tools (Amarok, Rhythmbox, gtkpod) and I am using Rhythmbox to sync my music and and attempting to use gtkpod to sync my movies.

gtkpod is able to sync songs (tested with a test clip a few minutes long) and short *.mp4 files (15 MB, I know for sure from testing). I am unable, however, to get it to sync a movie (~700 MB). I am able to drag it onto my iPod in gtkpod, but when I try to save the changes and write the files, it hangs at "Copying Tracks" at 0%. It eventually crashes during the couple of times I have tried to wait it out. So this being my situation, my question is: is there a size limit to the *.mp4 files I can sync to my iPod Touch via gtkpod? Are there any other tools that I can sync videos to my iPod with?

View 9 Replies View Related

Ubuntu :: Pulling Of Large Files From A Mounted Directory Into RAM?

Oct 18, 2010

I'm having a bit of an issue with Lucid installed via Wubi. I stuck the OS on its own partition (30 GB in size), and don't store any large files in the Ubuntu file system (when I download something large I move it to another hard drive.) I don't have anything wacky or esoteric installed on my system.

I've been consistently having a problem where, after a few hours or a few days of being booted up, Ubuntu warns me that my available HD space is dangerously small. The amount of available HD space Ubuntu sees then shrinks from a few GB to nothing within a few minutes, and the only way I can seem to solve this is to reboot. Taking a closer look at what's happening, my Home folder balloons in size until there's no more writable space recognized. But there are no files being created or added to, so it looks like there's a bug of some sort. This SEEMS to be correlated with watching videos (or maybe it's the pulling of large files from a mounted directory into RAM? My videos are all on another HD, as mentioned before). I can generally go a few days without getting the "low space" message, but I can't seem to make it through a full 2-hour movie without getting the error.

View 3 Replies View Related

General :: Using Sed To Replace A *large* Number Of Variables In A File?

Jul 28, 2011

I have a large number of log files on a Linux box that I need to cleanse of sensitive data before sending them to a third party. I have used the script below on previous occasions to perform this task, and it has worked brilliantly (the script was built with some help from here :-)

#!/bin/bash
help_text () {
cat <<EOF
Usage: $0 [log_directory] [client_name(s)]
EOF

[Code]...

However, now one of our departments has sent me a CLIENT_FILE.txt with 425000+ variables! I think I may have hit some internal limit. I have tried splitting the client file into 4 files with around 100000 variables in each, but this still doesn't work. I'm loath to keep splitting, though, as I have 20 directories with up to 190 files in each directory to run through. The more client files I make, the more passes I have to do.
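
If the script builds one giant sed expression (or command line) out of all the variables, the limit being hit is most likely the maximum command-line/argument length. Writing the substitutions into a sed script file and applying it with sed -f sidesteps that limit entirely, since the rules are read from a file rather than passed as arguments. A hedged sketch, assuming CLIENT_FILE.txt holds one client name per line and that replacing names with REDACTED is acceptable:

Code:

# Escape regex metacharacters in each name, then wrap it as s/NAME/REDACTED/g
sed 's/[]\/$*.^[]/\\&/g; s#.*#s/&/REDACTED/g#' CLIENT_FILE.txt > cleanse.sed

# One pass per log file applies all 425000+ substitutions
for log in /path/to/logs/*.log; do
    sed -f cleanse.sed "$log" > "${log}.cleansed"
done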

View 2 Replies View Related

General :: Convert A Large Number Of Non-Standard File Types To Text?

Sep 28, 2009

On my Windows machine I have several hundred files in the .nc/.ncs format used by a CNC machine. I need to convert them to .txt, which is as easy as opening each one in Notepad and saving it as .txt, but there are so many files that doing this by hand would take far too long.

The reason I am writing to LinuxQuestions is that I would feel more comfortable loading a live CD and using some sort of terminal command to do this than I would downloading one of the many "freeware"-type programs I have found for Windows (even more so since I have had a rootkit before and had to start all the way over to get rid of it).

I need to know:

1. Is this possible to do with the terminal without super advanced knowledge.

2. Can one please point me in the right direction; something to read or an example
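
Since the .nc/.ncs files are already plain text, "converting" them is just a matter of copying or renaming them with a .txt extension; no advanced knowledge is needed. A hedged sketch from a live-CD terminal, assuming the Windows partition is mounted at a placeholder path /mnt/windows:

Code:

shopt -s nullglob
cd /mnt/windows/path/to/cnc/files
# Copy each file alongside itself with a .txt extension; contents are untouched
for f in *.nc *.ncs; do
    cp -- "$f" "${f%.*}.txt"
done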

View 2 Replies View Related

Ubuntu :: Moving/copying Files And Directories To Base Directory Of User?

Mar 12, 2011

How do I copy and/or move files to the base folder of a user? I don't know what it is called, so I do not know what to put in my mv command. I know you would normally write mv filename /directoryname, but what is the user's base directory called?
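
The "base folder" of a user is the home directory, /home/username, which the shell abbreviates as ~ for the current user. A short sketch (file names are placeholders):

Code:

# Move a file into your own home directory
mv somefile ~/

# Copy a file into another user's home directory (may need sudo)
sudo cp somefile /home/username/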

View 1 Replies View Related

Fedora :: How To Limit The Number Of Files In A Particular Directory

Aug 31, 2011

I was nosing around in my /home folder and I noticed that the .thumbnails directory had 38,000+ files in it. That number seems a bit excessive to me. Is there a way to limit the number of files that are allowed to be in that directory, and maybe delete the oldest files automatically when the directory reaches its limit in order to make room for the new incoming files, so there are no "directory full" type errors?

View 8 Replies View Related

OpenSUSE :: Maximum Number Of Files Per Directory?

Apr 28, 2011

Is there any maximum allowed number of files per folder in Linux (without risking losing everything)?

I am using openSUSE 11.4 with the latest KDE (4.6?).

I am trying something fast and dirty, and it may be that one folder will contain something like 10^6 files.

Is there anything I should be warned about? Of course filesystem safety is still the most important thing.

View 9 Replies View Related

Ubuntu :: Copying A Large File From The Network?

Feb 17, 2010

I am trying to copy a file from a network resource on my Local Area Network, which is about 4.5 GB. I copy this file through GNOME copying utilities by first going to Places --> Network and then selecting the Windows Share on another computer on my network. I open it and start copying the file to my FAT32 drive by Ctrl+C and Ctrl+V. It copies well up-to 4 GB and then it hangs.

After trying it almost half a dozen times I got really annoyed, left it hanging, and went to bed. The next morning when I checked, a message box saying "file too large to write" had appeared.

I am very annoyed. I desperately need that file. It's an ISO image and it is not damaged at all; it copies fine to any Windows system. Also, I have sufficient space on the drive to which I am trying to copy the file.
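
The 4 GB stall is almost certainly the FAT32 maximum file size (just under 4 GiB) rather than damage to the ISO: the copy simply cannot complete on that filesystem. Either copy to an NTFS- or ext-formatted drive, or split the image to fit FAT32 and rejoin it later; a hedged sketch with placeholder paths:

Code:

# Split the ISO into 2 GiB pieces that FAT32 can hold...
split -b 2G /path/to/image.iso /media/fat32drive/image.iso.part_

# ...and reassemble it later on a filesystem without the 4 GiB limit
cat /media/fat32drive/image.iso.part_* > /path/to/restored/image.iso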

View 8 Replies View Related

Ubuntu :: Everything Freezes When Copying Large Amount Of Data?

May 20, 2010

Well, when I copy a large amount of data, applications other than Nautilus freeze until the copy is done...

So what can I do? When I'm backing up data this is really annoying =/

View 6 Replies View Related

Server :: Speed Up Single Large File Copying?

Apr 22, 2010

I'm planning to copy a production MySQL InnoDB data file from one server to another; the file size is around 300 GB. As the file keeps changing all the time, I have to shut down the MySQL instance and copy the large data file to the other server as quickly as possible. I need to find a way to speed up the copy... I'm wondering whether there's a way to copy the file block by block: if the block on the destination side already has the same content, skip it.
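
rsync does exactly this kind of block-level delta copy: with --inplace it rewrites only the blocks of the destination file that differ, so after one full initial transfer, subsequent runs are fast. A hedged sketch (the host name and the data-file path are placeholders for the actual InnoDB file):

Code:

# Pre-seed the target while MySQL is still running (this copy will be inconsistent, which is fine)
rsync -av --inplace --partial /var/lib/mysql/ibdata1 user@otherserver:/var/lib/mysql/

# After shutting MySQL down, run the same command again; only changed blocks are transferred,
# so the downtime window is short
rsync -av --inplace --partial /var/lib/mysql/ibdata1 user@otherserver:/var/lib/mysql/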

View 4 Replies View Related






