General :: Using Find With File Name And Mtime To Remove Files Gets Arg List Too Long Error?
Dec 25, 2009
I need to delete all *.trc files that are older than 30 days, but I am getting an "Argument list too long" error. There are other files that should not be deleted, which is why I am matching on "*.trc", and newer files need to be kept as well. I have seen other postings, but they do not cover both of these conditions. Below are 2 of my many attempts at doing this, but I cannot get it to work.
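The "Argument list too long" error usually means the shell expanded *.trc itself before find or rm ever ran. A minimal sketch that avoids the expansion by quoting the pattern and letting find do both the matching and the deleting (the directory path here is a placeholder):
Code:
# Quote the glob so the shell passes the pattern to find unexpanded;
# -mtime +30 matches files last modified more than 30 days ago.
find /path/to/tracedir -type f -name '*.trc' -mtime +30 -exec rm -f {} +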
I'm trying to do a find /photos/* -type f -mtime +365 to find all my pictures that are over a year old, but I keep getting "argument list too long". How can I view all the results, even if it just dumps them to a file that I have to open?
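Here the shell is expanding /photos/* into more arguments than it can pass. Passing the directory itself to find, which descends into it on its own, should avoid this; a sketch that also saves the results to a file:
Code:
# find recurses into /photos by itself, so no shell expansion is needed
find /photos -type f -mtime +365 > old-photos.txt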
I have some files on the server dated several months ago, but they are invisible to a `find -mtime 7` search. When I list them with `ls -l` they look perfectly normal:
-rw-r--r-- 1 root root 347253 Jun 12 16:26 pedia_main.2010-06-12-04-25-02.sql.gz
-rw-r--r-- 1 root root 490144578 Nov 24 16:26 gsmforum_main.2010-11-24-04-25-02.sql.gz
Does "find -mtime" not work as expected on files with different timezones?
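One likely explanation, assuming GNU find: `-mtime 7` matches only files modified exactly 7 days ago (in whole 24-hour periods), not files older than that, so files from months back never match. To catch everything older than a week the number needs a plus sign:
Code:
# +7 means "more than 7 whole 24-hour periods ago"
find . -type f -mtime +7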
I've got a script that has to parse out the last-modified time for a large number of files. Piping the output of "ls" into "cut" seems to work most of the time, but the output is unpredictable. The "fields" argument doesn't find the date-modified columns consistently, and using a character count is unreliable as well, since the output can vary in width depending on the file name.
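Rather than parsing ls, tools that print the timestamp in a fixed format are much easier to script against. Two sketches, assuming GNU stat and GNU find (the paths are placeholders):
Code:
# stat prints one well-defined field per format specifier
stat -c '%Y %n' /path/to/files/*    # mtime as epoch seconds, then the name
# or let find emit a fixed-format timestamp for a whole tree
find /path/to/files -type f -printf '%TY-%Tm-%Td %TH:%TM %p\n'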
I am writing a bash script that uses the find command (with rm via -exec) and the options -mtime -15 or +15, so that files older than 15 days are deleted from the home directory. I think -mtime +15 should be right, but in which case does the option -mtime -15 really come in handy? Does it go back 15 days from today's date?
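As I read the find man page: -mtime +15 matches files last modified more than 15 days ago, -mtime -15 matches files modified within the last 15 days, and a bare -mtime 15 matches exactly 15 days (in whole 24-hour periods). So for the cleanup described here, +15 is the one to use:
Code:
# delete files under $HOME not modified in the last 15 days
find "$HOME" -type f -mtime +15 -exec rm -f {} +
# by contrast, -mtime -15 would match the files modified *within*
# the last 15 days -- the ones this script wants to keep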
I've got a directory with thousands of files and I want to delete those that contain specific text. When I try: Code: ls | grep -l "specific text" * | xargs rm I get the error: Code: /bin/grep: Argument list too long Is there an easy way to get around this without having to move files into separate folders and process them in batches? I found an article on getting around this problem, but I'm kind of new to Linux and don't know how to apply it to my specific problem.
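The * in that pipeline expands to every file name at once, which is what overflows the argument list (the leading ls is also redundant, since grep ignores its stdin once it has file arguments). Letting find hand the names to grep in manageable batches avoids both problems; a sketch using NUL-separated names so unusual file names survive:
Code:
# -exec ... + passes files to grep in batches that fit the argument limit;
# grep -lZ prints NUL-terminated names, which xargs -0 passes safely to rm
find . -maxdepth 1 -type f -exec grep -lZ "specific text" {} + | xargs -0 rm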
I have Ubuntu Karmic 9.10, and when I try to update or install anything, a very similar error occurs: "(Reading database ... 55%" dpkg: unrecoverable fatal error, aborting: files list file for package `com.palm.net.precoddr.fcoaster' contains empty filename. It repeats this message 3 times and then gives up, I believe.
I want to source this file but I am getting the error message "word too long". Kindly help me solve this. set path=(/user/lib/usr/bin /bin/usr/ucb/etc/usr/ccs/bin/$path)
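That looks like csh/tcsh syntax, and the directories appear to have lost their separating spaces, which would fuse several of them into one enormous word. Assuming the intended entries are the usual Solaris-style directories (a guess reconstructed from the fragments above), the corrected line might look like:
Code:
# each directory must be a separate word inside the parentheses
set path=(/usr/lib /usr/bin /bin /usr/ucb /etc /usr/ccs/bin $path)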
I'm very, very tired; I worked all night long and haven't slept for hours... So I'm like a zombie now... half awake and half asleep.
I was trying to clear out a directory. I ran cd to enter the directory, and then, before thinking, I ran this dangerous command on my Linux server: "find / -mtime +1 -exec rm {} ;"
I got a lot of:
Could this command have deleted something inside these directories? I'm afraid that on the next reboot the server won't start up...
How can I find and list the files and directories in the current directory that were created in, say, 2005, 2006, or 2009, and then move them to some other location, for example /backup? Yes, I need to list and move them simultaneously. We can use:
Code:
find . -mtime n -print -exec mv {} /backup \;
but that n is troublesome: I would have to work out which values cover files/directories created in 2005, 2006, and 2009. Is there any way to match directly on the year rather than calculating n (days x 24 hours)?
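GNU find can compare against a calendar date directly with -newermt, which sidesteps the day arithmetic entirely. Strictly this tests the modification time, since Unix file systems generally don't record a creation date. A sketch for one year (repeat per year, or chain the tests with -o):
Code:
# files last modified during 2005: on/after Jan 1 2005 but before Jan 1 2006;
# -print lists each match before mv -t moves it into /backup
find . -newermt '2005-01-01' ! -newermt '2006-01-01' -print -exec mv -t /backup {} +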
Whenever I try to install or remove a program, I get the following error:
Code: dpkg: warning: files list file for package `libavahi-common-data' missing, assuming package has no files currently installed.
dpkg: warning: files list file for package `libgtk2.0-common' missing, assuming package has no files currently installed.
dpkg: warning: files list file for package `libxres1' missing, assuming package has no files currently installed.
(Reading database ... 55%
dpkg: unrecoverable fatal error, aborting: files list file for package 'ubuntu-mono' is missing final newline
E: Sub-process /usr/bin/dpkg returned an error code (2)
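A commonly suggested workaround, offered as a sketch rather than gospel since it edits dpkg's database directly: recreate the missing list files as empty files so dpkg stops warning, append the final newline the corrupted list wants, then reinstall the affected packages so dpkg regenerates the real lists:
Code:
# recreate a missing files list (repeat for each package named in the warnings)
sudo touch /var/lib/dpkg/info/libavahi-common-data.list
# add the missing final newline to the corrupted list
sudo sh -c 'echo >> /var/lib/dpkg/info/ubuntu-mono.list'
# reinstall so dpkg rebuilds the genuine file lists
sudo apt-get install --reinstall libavahi-common-data ubuntu-mono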
I have recently taken delivery of a Dell Inspiron Mini netbook with Ubuntu on it, and I am new to it. To install updates I clicked the (orange down-arrow) button, and it complained: "E: /var/cache/apt/archives/linux-image-2.6.24-22-lpia_2.6.24-22.45netbook9_lpia.deb: files list file for package `libxcb-shape0' is missing final newline"
When I try to list the files in a directory I get an I/O error: # ls -l /test. Why am I getting this error, and what are these I/O errors?
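I/O errors from ls usually mean the kernel failed to read from the underlying device, not that anything is wrong with ls itself. A first diagnostic sketch (the device name is a placeholder; smartctl comes from the smartmontools package):
Code:
# recent kernel messages often name the failing block device
dmesg | tail -n 30
# query the drive's own health report (assumes the disk is /dev/sda)
sudo smartctl -a /dev/sda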
I have installed some programs from source and found no trace of where and what was installed, and I would like to remove those files. So I am looking for a script or app to list all orphaned files (I mean files not related to any installed package). I am using Ubuntu Server 9.10 without any fancy X11 stuff, so a console version is preferred. I have found BleachBit and Computer Janitor on this forum, but they are X11 apps.
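One console-only approach, sketched here with the caveat that it is slow and only as reliable as dpkg's database: walk the directories of interest and ask dpkg -S whether each file belongs to a package. The directory list below is an assumption; source installs usually land under /usr/local, which dpkg deliberately leaves alone, so that is the first place to look:
Code:
# print every file under these trees that no installed package claims
find /usr/local/bin /usr/local/lib -type f -print0 |
while IFS= read -r -d '' f; do
    dpkg -S "$f" >/dev/null 2>&1 || printf 'orphan: %s\n' "$f"
done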
I would like to create a cron job that will delete all files within a directory 1 hour after they are created. I found this cron line: find /path/to/file/* -ctime +1 -exec rm {} ; but it deletes all the files. I want to make an exception: every file should be deleted except one (let's say a.zip).
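Two things look off in that line: -ctime +1 means older than one day, not one hour (GNU find measures minutes with -mmin, or -cmin if the change time is really what's wanted), and there is no exclusion for a.zip. A sketch of a corrected command, keyed on the modification time:
Code:
# delete regular files older than 60 minutes, but never a.zip
find /path/to/file -type f ! -name 'a.zip' -mmin +60 -exec rm -f {} +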
I just started Ubuntu 9.04 and it said: "File system check failed. A log is being saved in /var/log/fsck/checkfs if that location is writable. Please repair the file system manually. A maintenance shell will now be started. CONTROL-D will terminate this shell and resume system boot. Give root password for maintenance or type Control-D to continue." I pressed Ctrl+D, and after login it said that it cannot find /home. I started with the live CD:
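From the live CD the usual next step is to run fsck by hand against the unmounted partition that holds /home. A sketch, with the device name purely a placeholder (check the partition table first to find the real one):
Code:
# list partitions to identify the one that holds /home
sudo fdisk -l
# then check and repair it while it is NOT mounted
sudo fsck -f /dev/sda5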
The find command is taking too long to complete on my machine. When I use the time command, I find that sys time and user time are very small compared to real time. Is my find process not getting scheduled properly?
I interrupted the never-ending find command and got the following statistics:
Real time: 5 min
Sys time: 1.1 sec
User time: 3 sec
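A gap like that between real time and CPU time usually means the process is sleeping while it waits for disk I/O, not that the scheduler is neglecting it. Watching the disks while find runs can confirm this; iostat ships in the sysstat package:
Code:
# %iowait and the per-device utilization column show whether
# the disk, rather than the CPU, is the bottleneck
iostat -x 2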
I have a hard drive with several thousand photos. These photos are in different formats: some are tif, some jpg, some raw (cr2). The files are spread across dozens of directories. What I want to do is produce a list of all the files, in all of the directories, sorted by file name (not by path), listing the location, file name, size, and date created. For instance, I may have a file called photo1.jpg in /photos/pics/, a file called photo1.cr2 in /photos/misc/, and a file called photo1.tif in /photos/processed/summer/.
I would like a text file that would look like this:
/photos/misc/photo1.cr2 2536658 2010-07-09 13:17
/photos/pics/photo1.jpg 320046 2010-07-07 14:47
/photos/processed/summer/photo1.tif 234456689 2010-07-10 09:22
Of course I want it to do this for all of the photos. I'm pretty sure there is a way to do this with a minimum amount of work. I have no problem with using the command line.
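A sketch with GNU find, sort, and cut: print the bare file name first as a disposable sort key, sort on it, then cut the key back off. Strictly this uses the modification time, since Unix file systems don't store a creation date:
Code:
# %f = bare name (sort key), %p = full path, %s = size in bytes, %T* = mtime
find /photos -type f \( -iname '*.jpg' -o -iname '*.tif' -o -iname '*.cr2' \) \
    -printf '%f\t%p %s %TY-%Tm-%Td %TH:%TM\n' | sort | cut -f2- > photolist.txt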
I'm always hesitant to use /var/tmp/, because I never quite know exactly how long the files are kept there for, or even what the directory is used for. What determines when a file gets removed from /var/tmp/, and how is the directory intended to be used?
I am trying to run the i386 tests in the gcc testsuite using the DejaGnu runtest command, and it fails with the error given below. I can see that gcc-dg.exp is in the folder gcc-4.6-20100911/gcc/testsuite/lib, but runtest is not searching this folder.
How can I resolve this issue and run only the i386 tests?
$ cd /gcc-4.6-20100911/gcc/testsuite/gcc.target
$ runtest -a -tool i386 -verbose
....
....
Looking for library file /usr/local/share/dejagnu/lib/gcc-dg.exp
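Invoked bare like that, runtest only knows the system-wide DejaGnu library path, which is why it looks under /usr/local/share/dejagnu instead of inside the gcc tree. The usual route is to drive the testsuite through the build tree, which sets the library paths up for you; a sketch, where the build directory path is a placeholder:
Code:
# from the gcc *build* directory (not the source tree)
cd /path/to/gcc-build/gcc
make check-gcc RUNTESTFLAGS="i386.exp"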