Ubuntu :: Find And Move Commands On Large Number Of Files?

Feb 21, 2011

We recovered a large number of files from an HD I messed up. I am attempting to move files of a given type, e.g. .txt or .jpg, into a folder per type so I can sort through them more easily.

Here are the commands I have mainly been trying with various edits:

Code:

Code:

So far the most common complaint I have gotten is "missing arguments to execdir".

This is on Ubuntu 10.04
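
That error usually means find never saw the terminating ';' for -execdir, typically because the semicolon was not escaped from the shell. A hedged sketch of the general shape that should work (the recovered/ and sorted/ paths are hypothetical):

Code:
# create one destination folder per type, then move by extension
mkdir -p ~/sorted/txt ~/sorted/jpg
find ~/recovered -type f -iname '*.txt' -execdir mv -- '{}' ~/sorted/txt/ \;
find ~/recovered -type f -iname '*.jpg' -execdir mv -- '{}' ~/sorted/jpg/ \;

With -execdir, '{}' expands to ./filename inside the file's own directory, so an absolute destination path is the safe choice.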

View 6 Replies


Ubuntu :: Command With The -r Option To Compare A Large Number Of Files And Files In Subdirectories

Jun 16, 2011

I am using the diff command with the -r option to compare a large number of files, including files in subdirectories. My main interest is to find out which files have been changed, not what the actual changes are, and since a lot of files have been changed, it would be a lot easier to view the file names only. Is there an option for diff that might do this, or does a similar tool/command exist that could do the job?
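
diff itself has this: the --brief (-q) option reports only which files differ. A sketch (directory names illustrative):

Code:
diff -rq old_tree/ new_tree/
# just the names of files present in both trees but changed:
diff -rq old_tree/ new_tree/ | awk '/^Files .* differ$/ {print $2}'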

View 1 Replies View Related

Ubuntu :: Can't Find Any Process That Consumes Such A Large Amount Of Memory

Feb 23, 2010

I used 9.04 for months and it worked fine until I restarted my PC. After I restarted my PC, memory consumption goes up to 4.2 GB after login. However, I cannot find any process that consumes such a large amount of memory.

[code]....
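
Not from the thread, but a hedged sketch of the usual first checks: on kernels of that era, cached file data is counted as "used", and the "-/+ buffers/cache" row of free shows what processes actually hold:

Code:
free -m                            # check the -/+ buffers/cache row, not the top row
ps aux --sort=-%mem | head -n 15   # biggest resident processes first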

View 3 Replies View Related

General :: Tar Not Working With Large Number Of Files?

Dec 6, 2010

In the middle of my script I am using the tar command to archive some 1000 images, and the size of each image is 5MB.

All the images are provided as arguments, as in tar -cvf file.tar <all images as arguments>,

but my tar file file.tar does not contain all the images.
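
If the names are being batched (for example through xargs), each tar -c run overwrites the previous archive. A hedged sketch that sidesteps the argument list entirely (paths illustrative):

Code:
# hand tar a list of names instead of putting them on the command line
find /path/to/images -name '*.jpg' -print > /tmp/imagelist.txt
tar -cvf file.tar -T /tmp/imagelist.txt
# or, if batching with xargs, append (-r) rather than recreate (-c):
find /path/to/images -name '*.jpg' -print0 | xargs -0 tar -rvf file.tar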

View 4 Replies View Related

Ubuntu :: Data Loss When Transferring Large Number Of Files?

Jul 20, 2010

This problem is not exclusive to Ubuntu; I've experienced it in Windows and OS X as well. It seems that almost every time I transfer a large number of files (e.g. my music collection) between my desktop computer and laptop via my external hard drive, I end up losing files for no reason. I usually don't notice the files are missing until later on, because I am never informed of any data loss. Now, every time I make a large transfer of files, I just do it two or three times to ensure that I don't lose any.
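
rsync is worth a try here, as a hedged sketch; the second, checksum-based dry run prints anything that did not arrive intact, which beats blindly copying twice (paths hypothetical):

Code:
rsync -avh ~/Music/ /media/external/Music/
rsync -avhcn ~/Music/ /media/external/Music/   # -c compares checksums, -n (dry run) lists mismatches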

View 2 Replies View Related

General :: Copying Large Number Of Files From One Directory To Another

Feb 10, 2010

I've a directory containing around 2.8 lakh (280,000) files. I want to move them to another directory. If I use cp or mv then I get the error 'argument list too long'. If I write a script like

for file in $(ls {source}); do
    cp "{source}/$file" "{destination}/"
done

then, because of the ls command, its performance degrades. How can I do this?
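
The "argument list too long" error comes from the shell expanding the glob past the kernel's ARG_MAX limit; letting find batch the names avoids both that and the ls parsing. A sketch (directories hypothetical):

Code:
# '{}' + batches as many names per mv invocation as ARG_MAX allows
find /source/dir -maxdepth 1 -type f -exec mv -t /dest/dir -- '{}' +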

View 7 Replies View Related

General :: Copying Large Number Of Files In Windows?

Mar 15, 2011

I am facing a problem copying a large number of files, 18 lakh (1,800,000), from my personal hard disk to another hard disk. Each file is very small and the size of the folder is around 3.95 GB. Copying the files with the copy facility Windows provides is frustrating, and I am not even able to compress them; it gives me an error that the data is not readable. The other problem is that I am not able to open this drive in Linux: it shows an error there saying to run a disk check in Windows, and Windows disk check is also not able to repair this drive and goes into some unsolvable mode. ERROR: Disk labeled EDU is corrupt, go to Windows and run chkdsk /f there and reboot into Windows 2 times. Is there any way to open the disk with errors, and if not, any way I can copy the data faster?
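
From the Linux side, the ntfs-3g tools can sometimes clear the dirty flag enough to mount the disk read-only and copy the data off; a hedged sketch, assuming the partition is NTFS on /dev/sdb1 (the device name is hypothetical):

Code:
sudo ntfsfix /dev/sdb1                        # minor NTFS repairs, clears the dirty flag
sudo mkdir -p /mnt/edu
sudo mount -t ntfs-3g -o ro /dev/sdb1 /mnt/edu   # read-only, to be safe
rsync -avh /mnt/edu/ /media/otherdisk/backup/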

View 3 Replies View Related

General :: Share A Large Number Of Files Into Chroot Env?

Aug 17, 2010

I understand that chroot is usually used to provide security, however, for my issue, security is a big don't care. I am very new to using chroot and don't fully understand how the chroot'd env works.

problem: Trying to use a vendor-supplied cross-compile environment. The environment runs as a chroot'd env and works just fine. I have a large number of additional modules that I wish to compile in the chroot'd environment. FYI, these modules are also (successfully) compiled for other targets not using chroot'd envs. Copying the source files into the chroot environment is not an option (I don't have hours to wait for copies to finish, and it would break the make system). Having them live in the environment is also not an option (the chroot build is a tiny part of the build process and we cannot revamp our entire source tree to accommodate it).

I am looking for a way to have the compiler in the chroot'd env access a path that is outside of the env, typically higher up in the same path that holds the chroot'd env. I have tried soft links (they don't work as expected). Hard links only work for single files, and there are tens of thousands of files that would need to be linked. I am not sure how I would go about exporting the additional files and then mounting the exported files in the chroot'd env (or if that would even work).
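
A bind mount is the usual answer: it makes an outside directory visible inside the chroot with no copying and no per-file links. A sketch (paths hypothetical):

Code:
sudo mkdir -p /path/to/chroot/src
sudo mount --bind /big/source/tree /path/to/chroot/src
# to persist across reboots, in /etc/fstab:
# /big/source/tree  /path/to/chroot/src  none  bind  0 0

Soft links fail here because they are resolved inside the chroot, where the link target does not exist; a bind mount is resolved by the kernel, so it works from both sides.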

View 2 Replies View Related

Ubuntu :: Large Number Of Files To Count - Argument List Too Long

May 28, 2011

I have the standard problem of trying to count a large number of files in a directory (>100k).

I have tried: ls ~/user/images/* -l | wc -l and find ~/user/images/* -maxdepth 1 -type f | wc -l

In both cases, I get the argument list too long error message.

I have tried using xargs but I can't seem to get it to work right.

The command

returns a valid answer but it includes all the subdirectories in the file count.
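
The failures above come from the shell expanding ~/user/images/* before ls or find ever runs; pass the directory itself and there is no argument list to overflow. A sketch:

Code:
find ~/user/images -maxdepth 1 -type f | wc -l   # regular files only, no recursion
ls -1A ~/user/images | wc -l                      # fast, but counts subdirectory entries too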

View 4 Replies View Related

Applications :: Extract The Sender Id From A Fairly Large Number Of Files?

Nov 18, 2010

I'm trying to extract the sender id from a fairly large number of files and am having trouble assigning variables from a file. Here is what I have so far (which is fairly kludgy, I know, but it's been some years since I've done any scripting or programming, and I find that I have lost the knack to a large degree).

[Code]...
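
Without the script it is hard to say what broke, but here is a hedged sketch of one way to pull a sender out of each file, assuming mbox/maildir-style messages with a From: header (the path and the header layout are both assumptions):

Code:
for f in /path/to/messages/*; do
    # first From: header in the file, header name stripped off
    sender=$(grep -m1 -i '^From:' "$f" | sed 's/^[^:]*:[[:space:]]*//')
    printf '%s\t%s\n' "$f" "$sender"
done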

View 1 Replies View Related

Programming :: BASH Script Optimization For Testing Large Number Of Files

Sep 18, 2010

I want to move files from a $SOURCEDIR to a $DESTBASE/$DESTDIR. Under $DESTBASE there are many directories, and I need to test beforehand if a file from $SOURCEDIR already exists in any of them.

This is obviously extremely slow, and the real use case involves dozens of dirs and thousands of files. Creating a temporary "index" file for the find command (instead of running it every iteration) speeds it up a little, but it's still very clumsy.
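
One way to keep the index idea but make each lookup O(1): load every basename under $DESTBASE into a bash 4 associative array once, then test each source file against it. A sketch using the post's variable names:

Code:
declare -A seen
# index every file already present anywhere under $DESTBASE
while IFS= read -r -d '' f; do
    seen["${f##*/}"]=1
done < <(find "$DESTBASE" -type f -print0)

# move only files whose basename is not yet indexed
for f in "$SOURCEDIR"/*; do
    [[ -z "${seen[${f##*/}]:-}" ]] && mv -- "$f" "$DESTBASE/$DESTDIR/"
done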

View 14 Replies View Related

General :: Transfer Large Number Of Files Host To Host

Oct 20, 2010

I have two servers; one has an empty / and the other has a subdirectory with a large amount of data (4 GB) in many, many files. I need a way to transfer the files en masse from the server with the large number of files to the one that is essentially blank. I don't have space on the full host to simply gzip all the files. I've googled this and see that there may be some combination of tar and/or gzip that will let me do this with some sort of redirection.

I really need an example line of how this can be accomplished. If my explanation seems rather sparse, I can supply more details.
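
The classic redirection trick is to never write the archive to disk at all: tar writes to stdout, ssh carries the stream, and tar on the far side unpacks it. A sketch (host name and paths hypothetical):

Code:
tar -C /path/with/files -czf - . | ssh user@emptyhost 'tar -C /destination -xzf -'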

View 3 Replies View Related

General :: Find Number Of Files In A Directory?

Feb 22, 2010

I need to know how to find the number of files in a directory. Are there any system calls for this in Fedora 12? And I need to know how to perform an operation when that count increases by one.
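
There is no single system call for "count of files", but counting is one command, and inotify (available on Fedora 12 via the inotify-tools package) can trigger an action whenever a file is added; a sketch:

Code:
ls -1A /watched/dir | wc -l    # current count
# react to each newly created file without polling:
inotifywait -m -e create /watched/dir | while read -r dir event file; do
    echo "count increased: new file $file"
done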

View 14 Replies View Related

General :: Find And Move Files Greater Than 15GB?

Jan 23, 2010

I have an Ubuntu NAS set up with two 1.5TB drives in a mirrored array. We recently needed more storage and will constantly be adding to this machine, so we added two 2TB drives in a striped array. What I'd like to do is find all directories totaling 10GB+ on the mirrored array and move them over to the striped array, to free space on the mirrored array for smaller, more important data. I've tried:

Code:
find /mirror -maxdepth 1 -size 10G
find /mirror -size 10

[code]....
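
find's -size tests individual files, which is why those attempts fail; du is the tool that sums a directory. A sketch that lists top-level directories at or over 10 GB (rounded up to whole GB) and moves them; the /striped destination is hypothetical, and paths with spaces would need more careful handling:

Code:
du -s --block-size=1G /mirror/*/ | awk '$1 >= 10 {print $2}' | while read -r d; do
    mv -- "$d" /striped/
done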

View 2 Replies View Related

General :: Find Files With Specific Extension And Move To A Directory?

Apr 18, 2011

I need a little help. I want to find all files with the extensions "*.tar", "*.gz" and "*.zip" and move all those files into the "/opt/old" directory. I've tried this command:

Quote:

find . -type f -name "*.tar" "*.gz" "*.zip" -print0 | xargs -0 -r mv /opt/test

It's not working; something is wrong after "mv", I guess.
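
Two things are off: -name takes a single pattern, so the alternatives need -o inside escaped parentheses, and mv needs -t so the destination comes before the xargs-supplied file list (using /opt/old, the directory named in the question). A sketch:

Code:
find . -type f \( -name '*.tar' -o -name '*.gz' -o -name '*.zip' \) -print0 \
    | xargs -0 -r mv -t /opt/old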

View 10 Replies View Related

Ubuntu :: Write A Bash Script Which Will Find Files Then Move Them To A Specific Directory?

Sep 1, 2011

I'm trying to write a bash script which will find files then move them to a specific directory.

So far I have:

Code:
#!/bin/bash
#script to find and move files
src_dir="/path/to/source/directory"
des_dir_mov="/path/to/destination/directory/for/movies"
des_dir_img="/path/to/destination/directory/for/images"
find "$src_dir" -iname '*.avi' -type f -exec mv '{}' "$des_dir_mov" ';'

I'd like to have all the possible movie file types, then the image file types, checked in a loop.

Every time I try to include an array in this script, it breaks.
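
Arrays do work for this; the breakage is usually quoting. A hedged sketch extending the script above (the extension lists are illustrative):

Code:
movie_exts=(avi mkv mpg mp4)
image_exts=(jpg jpeg png gif)

for ext in "${movie_exts[@]}"; do
    find "$src_dir" -type f -iname "*.${ext}" -exec mv -- '{}' "$des_dir_mov" \;
done
for ext in "${image_exts[@]}"; do
    find "$src_dir" -type f -iname "*.${ext}" -exec mv -- '{}' "$des_dir_img" \;
done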

View 3 Replies View Related

General :: Back Up Scrip - Find / Cp / Md5sum / Rm - Move All Files And Directories

Oct 22, 2010

I want to move all files and directories that are 1 month old out into a separate backup folder. There will be a lot of files, and I want to make sure they copy properly. The problem I'm having is integrating an md5sum check to verify integrity. md5sum is not recursive, so I figured it would work in a loop: as the script copies each individual file, I'll do an md5sum on each file and delete that md5 once it's verified the copy is OK.

[Code]...

I also need some sort of error handling to output all md5s that didn't pass the hash check.
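
A hedged sketch of the copy-verify-delete loop with a failure log; the /data and /backup roots are hypothetical, and "1 month" is approximated as -mtime +30:

Code:
failures=/tmp/md5_failures.log
: > "$failures"
find /data -type f -mtime +30 | while IFS= read -r f; do
    rel=${f#/data/}
    mkdir -p "/backup/$(dirname "$rel")"
    cp -p -- "$f" "/backup/$rel"
    src=$(md5sum < "$f" | awk '{print $1}')
    dst=$(md5sum < "/backup/$rel" | awk '{print $1}')
    if [ "$src" = "$dst" ]; then
        rm -- "$f"                 # verified, remove the original
    else
        echo "$f" >> "$failures"   # hash mismatch, keep original and log it
    fi
done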

View 3 Replies View Related

General :: Find A Proper Command To Move A Certain Set Of Files According To Date/time Range?

Mar 18, 2009

I'm trying to find a proper command to move a certain set of files according to a date/time range. I am thinking that the command should be something like:

Code:
ls -l | grep 'date/time range' | mv /folder
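
mv cannot read file names from a pipe, so that pipeline cannot work as written; find can do both the selecting and the moving in one pass. With GNU findutils 4.3.3+, -newermt takes a timestamp directly (dates and paths illustrative):

Code:
find /source -type f -newermt '2009-03-01' ! -newermt '2009-03-18 12:00' \
    -exec mv -t /folder -- '{}' +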

View 6 Replies View Related

Ubuntu :: Adding Additional Users Causes Large Number Of Errors?

Jun 29, 2010

I'm running 64-bit 10.04, upgraded from 9.10. The problem I am experiencing is that any user accounts aside from my main account are problematic. This includes any accounts I add, as well as the GDM guest session. The specific problems that I have thus far experienced are as follows:

1. The desktop often loads improperly. In the latest instance of this, the graphics on the right side of the top panel were randomly chopped up, leaving parts of my clock on either side of the volume control, among other things.

2. If I make ANY customizations to the desktop at all, the desktop takes nearly a full minute to load on log-in.

3. Flash videos don't work properly in Firefox. Sometimes they only play after refreshing a page; often they will not load at all. Also, attempting to load or play a Flash video will sometimes cause Gnome or Firefox to crash.

4. (And this is the one that REALLY has me stumped.) Whenever I log into my main account after logging out of another account, the IBus control appears in my system tray. However, when I open the IBus preferences, the associated check box is (and has always remained) unchecked. I'm not sure where to go with this one. More than anything, the IBus bug makes me unsure of where to even begin looking for the problem.

View 1 Replies View Related

General :: Using Sed To Replace A *large* Number Of Variables In A File?

Jul 28, 2011

I have a large number of log files on a Linux box that I need to cleanse of sensitive data before sending to a third party. I have used the below script on previous occasions to perform this task, and it has worked brilliantly (the script was built with some help from here :-)

#!/bin/bash
help_text () {
cat <<EOF
Usage: $0 [log_directory] [client_name(s)]
EOF

[Code]...

However, now one of our departments has sent me a CLIENT_FILE.txt with 425,000+ variables! I think I may have hit some internal limit. I have tried splitting the client file into 4 with around 100,000 variables in each; this still doesn't work. I'm loath to keep splitting, though, as I have 20 directories with up to 190 files in each directory to run through. The more client files I make, the more passes I have to do.
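
That limit is most likely the size of the sed program itself growing with the variable count. An alternative that scales is a single awk pass with the substitutions in a hash, so the client list only costs memory, not per-line work. A sketch assuming CLIENT_FILE.txt holds "sensitive replacement" pairs and the sensitive values appear as whole whitespace-delimited tokens (both assumptions):

Code:
awk 'NR==FNR { map[$1] = $2; next }
     { for (i = 1; i <= NF; i++) if ($i in map) $i = map[$i]; print }' \
    CLIENT_FILE.txt input.log > cleansed.log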

View 2 Replies View Related

Software :: Filter A Large Document By Line Number?

Feb 17, 2010

I have a 50,000-line(ish) set of records in a file. I have another file where I have filtered out all the line numbers of those which have an error of various types, e.g. column count, field type, etc. I want to get all those lines into a separate file so I can sanitise them. There are about 3,000-4,000 of them.

How can I access those lines which I want to isolate into a single file? I have all the usual Linux stuff available and a bit of understanding of regexps.
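
If the error line numbers sit one per line in, say, badlines.txt (file names assumed), awk can load them in a first pass and print exactly those records from the data file in a second:

Code:
awk 'NR==FNR { want[$1]; next } FNR in want' badlines.txt records.txt > to_fix.txt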

View 5 Replies View Related

CentOS 5 Networking :: Configuring For Large Number Of Tcp Connections?

Jan 24, 2011

I've got a select-based application that wants to support a large number of mostly idle connections. The code is Java and works on Windows, SUSE Enterprise Linux, and Mac OS X. It does not work on CentOS 5.5 (32-bit, 2.6.18 kernel, 1G of memory).

I've read and followed the directions in various articles about tuning Linux for large numbers of connections (including the C10K problem), and have gotten the number of sockets up to 3200.

These didn't make any apparent difference:

[URL]

On Windows, I can get up to around 78,000.

On SUSE Enterprise Linux (a few years ago), I got up to 90,000; that's where I got bored and stopped.

On my Mac laptop with OS X (Snow Leopard), I got up to 10,500.

I have used ulimit -n 10240.

My current goal is 10k sockets.

The test is that I'm opening one socket at a time until it fails. When it fails, many of the sockets which have already been opened also fail, in one giant cascade. It sounds like a buffer / memory problem.

Each group of 64 sockets gets a thread to manage select calls for them; thus I'm only using around 61 threads total when it fails.
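
For reference, these are the usual CentOS 5 knobs for this; the values are illustrative, and the cascade symptom also fits running out of file descriptors system-wide rather than per process:

Code:
ulimit -n 20000                       # per-process descriptor limit, per shell
# persistent per-user limit, in /etc/security/limits.conf:
#   appuser  hard  nofile  20000
sysctl -w fs.file-max=200000          # system-wide descriptor ceiling
sysctl -w net.ipv4.ip_local_port_range='1024 65535'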

View 3 Replies View Related

CentOS 5 :: 5.2 Installation Hangs On Server With Large Number Of Drives?

Apr 6, 2009

I am attempting to upgrade a system from 4.7 to 5.2 using a (now) DVD drive attached to the onboard IDE. Originally I had tried using a remote NFS image and a USB stick, but I thought maybe there was a problem with the image. I can get up to the point in the installation of selecting the keyboard for the system, and then it freezes and never goes any further. It doesn't appear to be a kernel panic, since I can still switch between consoles.

I've got an MSI K9NGM2-FID with 14 drives in it. It serves as a file server for our backup server. It's got a secondary 4-port Silicon Image SII 3114 SATA card using the sata_sil module, and an old IDE Promise FastTrak TX2000. Technically I could have 16 drives, but the 750W PS is walking the fine line on tripping its self-breaker with the 14 drives and 7 fans. I would like to NOT have to disconnect all of this to do the upgrade.

I thought maybe that running the install using the "noprobe" option would help so it didn't detect and load the modules for the Silicon Image or the Promise cards and detect all of the drives but it still gets stuck on the step after selecting the keyboard. The installation info console and the dmesg console don't really provide any useful information. The installation console says:

INFO : moving (1) to step welcome
INFO : moving (1) to step language
INFO : moving (1) to step keyboard
INFO : moving (1) to step findrootparts

And the last lines of the dmesg console says:

<6>device-mapper: multipath: version 1.0.5 loaded
<6>device-mapper: multipath round-robin: version 1.0.0 loaded
<6>device-mapper: multipath emc: version 0.0.3 loaded

Is there a hidden "debug" option that will turn on a lot of extra logging?

View 7 Replies View Related

General :: No Space On Root After Large Move Operation?

Feb 16, 2010

I tried to move 2.7TB of data from my /var/webroot/ partition (4.5TB total in size). I left it to run overnight, and this morning when I came to check I saw that all space on the / partition is used up and no operations can be done because of the "no space left on device" message.

Code:

Filesystem Size Used Avail Use% Mounted on
/dev/cciss/c0d0p7 911G 911G 0 100% /
tmpfs 7.9G 0 7.9G 0% /lib/init/rw

[code]....

I freed up several hundred MB from /, but the usage is still at 100% and I can't free up any more space or complete the transfer?
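
Two usual suspects when / stays at 100% after deleting, as a sketch: space held by deleted-but-still-open files, and data that actually landed on / underneath a mount point:

Code:
lsof +L1 | head -n 20        # deleted files some process still holds open
du -xsh /* 2>/dev/null       # -x stays on the root filesystem only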

View 1 Replies View Related

General :: Convert A Large Number Of File Types From Non-Standard To Text?

Sep 28, 2009

I have on my Windows machine several hundred files in the .nc / .ncs format for a CNC machine. I need to convert them to .txt, which is something as easy as opening one in Notepad and then saving as .txt, but there are so many that this kind of action would take way too long.

The reason I am writing to LinuxQuestions is that I would feel more comfortable loading a live CD and using some sort of terminal command to do this than downloading one of the many "freeware" type programs I have found for Windows (even more so since I have had a rootkit before and had to start all the way over to get rid of it).

I need to know:

1. Is this possible to do with the terminal without super advanced knowledge?

2. Can one please point me in the right direction: something to read, or an example?
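
It is possible, and since the files already open as plain text in Notepad, no real conversion is needed, just a renamed copy; a hedged sketch (the path is assumed):

Code:
for f in /path/to/cnc/*.nc /path/to/cnc/*.ncs; do
    [ -e "$f" ] || continue          # skip if a pattern matched nothing
    cp -- "$f" "${f%.*}.txt"         # keep the original, add a .txt copy
done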

View 2 Replies View Related

General :: Move Large Amounts Of Music Within A File Structure?

Dec 20, 2009

I have a car stereo that reads a USB drive with all my music on it; however, to sort through the music it finds folders containing music, then displays them all in a list. I find this interface annoying because in order to sort the music by artist I have to manually move it out of the album folders by hand. This takes a long time for 11+ GB of music, so I was trying to use the Linux CLI to quicken the process. I used a command like this:

Code:

mv /media/usb/music/*/*/* /media/usb/music/*/

But for some reason this moves all my music into the last folder alphabetically on my drive. The music is all pre-arranged like this: /media/usb/music/artist/album/song
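
The trailing wildcard is the problem: the shell expands /media/usb/music/*/ to every artist directory, and mv treats the last one as the destination. Looping per artist keeps each song under its own artist; a sketch (colliding track names across albums would need extra handling):

Code:
for artist in /media/usb/music/*/; do
    find "$artist" -mindepth 2 -type f -exec mv -t "$artist" -- '{}' +
done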

View 5 Replies View Related

Debian Installation :: Dividing A Large Upgrade \ Move From Squeeze To Unstable?

Oct 6, 2010

Trying to move from squeeze to unstable; my downloads add up to some 700 MB or so, so I am trying to batch the upgrade. Some of the big fellas are openoffice and texlive, so I did:

Code:
sudo aptitude hold '?name(openoffice)'
sudo aptitude hold '?name(texlive)'

Is that fine, or are there some pitfalls to this?
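
One pitfall worth noting, as a hedged sketch: the holds persist after the first pass, so they have to be released before the big packages can come in on the second:

Code:
aptitude search '~ahold'                     # confirm what is currently held
sudo aptitude unhold '?name(openoffice)'
sudo aptitude unhold '?name(texlive)'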

View 5 Replies View Related

General :: Commands To Display Running Processor Number Id?

Jan 24, 2011

I have an 8-core system, but I only want to use 4 cores of it. Are there any commands to show which cores are running tasks?
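
A sketch of the usual tools: the PSR column of ps shows which core each task last ran on, mpstat (sysstat package) shows per-core load, and taskset confines a program to chosen cores (the program name is hypothetical):

Code:
ps -eo pid,psr,comm | sort -k2 -n | head    # which core each task is on
mpstat -P ALL 1                             # per-core utilization, 1-second interval
taskset -c 0-3 ./myprogram                  # run on cores 0-3 only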

View 1 Replies View Related

Software :: Bash Script To Run Commands For Specified Number Of Seconds?

Dec 2, 2009

I want to make a very simple bash script to run a command for a user-specified number of seconds and then kill it. The purpose is to limit the amount of time the program runs. Example in pseudocode:

Code:
#!/bin/bash
#$1 is the user input number of seconds

[code]...
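
A hedged sketch: newer coreutils ships timeout(1), which does exactly this, with a plain-bash fallback for systems without it:

Code:
#!/bin/bash
# usage: ./limit.sh SECONDS COMMAND [ARGS...]
if command -v timeout >/dev/null; then
    timeout "$1" "${@:2}"
else
    "${@:2}" & pid=$!       # run the command in the background
    sleep "$1"
    kill "$pid" 2>/dev/null # kill it once the time is up
fi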

View 14 Replies View Related

General :: Vi Editor - Move Cursor To A Specified Line Number?

Dec 6, 2010

I am using the vi editor for editing and configuring my files. I am facing a little problem when there is a long file, like 3,000 lines. Normally I use

Code:

:set number

in vi to make line numbers visible. The problem is that to go to the top of the file, say line number one, I use k, and to move down I use j, which is too time-consuming. How can I jump the cursor directly to line number 2333 or line number 2600?
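
A sketch of the direct jumps:

Code:
:2333          " command mode: jump to line 2333
2333G          " normal mode: G with a count goes to that line
1G             " first line (gg also works in vim)
G              " last line
vi +2600 file.conf    # from the shell: open already positioned at line 2600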

View 4 Replies View Related
