General :: How Long Are Files Kept In /var/tmp/, And How To Use The Directory
Jul 26, 2010
I'm always hesitant to use /var/tmp/, because I never quite know how long files are kept there, or even what the directory is for. What determines when a file gets removed from /var/tmp/, and how is the directory intended to be used?
For reasons I won't get into, I need to copy directories so long as the average system load is low. Can someone help me write a BASH script that will copy the contents of a directory, but check to make sure the average system load is below X before copying each file, and if not, wait Y seconds and try again?
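One way to sketch this: read the 1-minute load average from /proc/loadavg before each copy, and sleep-and-retry while it is above the threshold. The directory names, threshold, and retry interval below are placeholders; the demo at the end runs against throwaway directories with a deliberately high threshold so it completes immediately.

```shell
# Hedged sketch: copy the files in $1 to $2 one at a time, but before
# each copy check that the 1-minute load average is below $3; if not,
# sleep $4 seconds and check again.
copy_when_idle() {
    local src=$1 dst=$2 max_load=$3 retry=$4 f
    for f in "$src"/*; do
        [ -f "$f" ] || continue
        # first field of /proc/loadavg is the 1-minute load average
        while awk -v max="$max_load" '$1 >= max {exit 0} {exit 1}' /proc/loadavg; do
            sleep "$retry"
        done
        cp -- "$f" "$dst/"
    done
}

# demo run against throwaway directories (threshold 50 is deliberately high)
src=$(mktemp -d); dst=$(mktemp -d)
echo hello > "$src/a.txt"
copy_when_idle "$src" "$dst" 50 5
cat "$dst/a.txt"
```

In a real run you would call `copy_when_idle /path/to/source /path/to/dest 2.0 60` or similar.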
I need to delete all *.trc files that are older than 30 days, and I am getting an "Argument list too long" error. There are other files that should not be deleted, which is why I am matching "*.trc", and newer files need to be kept as well. I have seen other postings, but they do not cover both conditions. Below are two of my many attempts, but I cannot get this to work.
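The usual way around "Argument list too long" is to let find do both the matching and the deleting, so the shell never expands a huge glob. A sketch, using a throwaway directory in place of the real one:

```shell
# Delete only *.trc files modified more than 30 days ago; newer .trc
# files and all other files are left alone.  DIR is a stand-in.
DIR=$(mktemp -d)

# demo setup: one old .trc, one new .trc, one unrelated file
touch -d '40 days ago' "$DIR/old.trc"
touch "$DIR/new.trc" "$DIR/keep.log"

find "$DIR" -maxdepth 1 -type f -name '*.trc' -mtime +30 -delete

ls "$DIR"    # old.trc is gone; new.trc and keep.log remain
```

Drop `-maxdepth 1` if old .trc files in subdirectories should go too.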
I want to run a cronjob every 15 minutes that checks a directory for files. If the directory contains more than ten files I want it to send an email to me.
All I have is this...
*/15 * * * * ls -l | wc -l | [filename] | mail -s "This is just a test" [email address]
I would rather not write a bash script. Is there an easier way to do this? I was looking into some commands like find and grep.
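No script file is needed; the whole check fits in the crontab entry itself. A sketch, where the watched directory, subject, and address are placeholders; the demo below builds a throwaway directory with twelve files and only echoes instead of mailing:

```shell
# Count regular files in WATCH_DIR and warn when there are more than ten.
WATCH_DIR=$(mktemp -d)                  # stand-in for the real directory
for i in $(seq 1 12); do touch "$WATCH_DIR/f$i"; done

count=$(find "$WATCH_DIR" -maxdepth 1 -type f | wc -l)
if [ "$count" -gt 10 ]; then
    echo "directory over threshold: $count files"
    # in the real crontab, pipe a message to mail instead of echoing:
    # */15 * * * * [ $(find /some/dir -maxdepth 1 -type f | wc -l) -gt 10 ] \
    #   && echo "too many files" | mail -s "This is just a test" you@example.com
fi
```

Using `find -maxdepth 1 -type f` rather than `ls -l | wc -l` avoids counting the `total` line, subdirectories, and dotfile quirks.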
I'm quite new to Linux, but I have configured a simple FTP server and it's working great. I have an FTP-Shared folder with upload and download subfolders, and identical category subfolders (mp3s, movies, software, etc.) in both. As people upload, I would like a one-line crontab entry that moves everything under /FTP-Shared/upload/mp3/ older than 14 days to /FTP-Shared/download/mp3/, recursively (like the cp command can). The timestamp must be checked on the first-level directory only, not on the files inside it, for example: /mp3/Club Dance/CD1/Hallo world.mp3. This is how far I got: [root@clients ~]# /usr/bin/find /FTP_Shared/upload/Mp3s/ -depth -mindepth 1 -mtime +14 -type d -exec mv -f {} /FTP_Shared/download/Mp3s/ ; This command moves the directories and files, but not recursively.
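Two likely problems with that attempt: the `;` must be escaped as `\;` so the shell passes it to find, and `-depth` makes find visit the deepest subdirectories first, so they get moved out individually. Since `mv` already moves a directory with all its contents, restricting find to the first level should be enough. A sketch with stand-in paths:

```shell
# Move first-level directories older than 14 days from upload to
# download; mv carries each directory's contents along automatically.
SRC=$(mktemp -d); DST=$(mktemp -d)       # stand-ins for the FTP paths

# demo setup: a nested album directory, aged at the top level only
mkdir -p "$SRC/Club Dance/CD1"
touch "$SRC/Club Dance/CD1/track.mp3"
touch -d '20 days ago' "$SRC/Club Dance"

find "$SRC" -mindepth 1 -maxdepth 1 -type d -mtime +14 \
    -exec mv -f {} "$DST/" \;

ls "$DST"        # "Club Dance" moved over intact
```

Note that a directory's mtime changes whenever entries are added directly inside it, so "older than 14 days" here means "nothing added at the top level for 14 days".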
If I have a directory /foo with a few files in it, how do I symlink each entry in /foo into /bar/? For instance, if /foo has the files a, b and c, I want to create three symlinks:
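With GNU ln this is a single command, since ln accepts multiple sources when the last argument (or the `-t` argument) is a directory. A sketch against throwaway directories standing in for /foo and /bar:

```shell
# Create a symlink in bar for every entry in foo.
foo=$(mktemp -d); bar=$(mktemp -d)       # stand-ins for /foo and /bar
touch "$foo/a" "$foo/b" "$foo/c"

ln -s -t "$bar" "$foo"/*
# equivalently: ln -s "$foo"/* "$bar"/

ls -l "$bar"
```

Because the glob expands to absolute paths here, the links resolve from anywhere; with relative sources the link targets would be relative to /bar.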
How would I go about copying files to a directory, yet skip the files that already exist in the directory, and also remove the files that are in the directory. For example:
Code:
$ ls /dir1
img001.jpg  img002.jpg
[code]....
Now I would like to copy from dir1 to dir2, but the contents of dir2 would be:
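Assuming the goal is "copy only what the destination doesn't already have, and never overwrite it", GNU cp's no-clobber flag is a minimal sketch (rsync --ignore-existing does the same job with more control):

```shell
# Copy from dir1 to dir2 without overwriting files dir2 already has.
dir1=$(mktemp -d); dir2=$(mktemp -d)     # stand-ins for /dir1 and /dir2
echo new  > "$dir1/img001.jpg"
echo new  > "$dir1/img002.jpg"
echo kept > "$dir2/img001.jpg"           # pre-existing file must survive

# -n: never overwrite; some coreutils versions return nonzero when
# files are skipped, hence the || true
cp -n "$dir1"/* "$dir2"/ || true

cat "$dir2/img001.jpg"                   # still the original content
```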
I've been looking high and low for a utility program, Perl script, or something that can take a Linux directory structure as input and convert it to an MS-DOS 8.3 directory structure.
The purpose of this is to conform to the path format that is expected on my rather old Creative Zen Neeon MP3 player for m3u play lists.
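I don't know of a standard utility for this, but the name-mangling half is small enough to sketch in shell. This is an assumption-laden simplification: real DOS short-name generation also resolves collisions with `~1`-style suffixes and strips more special characters, which this helper does not attempt.

```shell
# Crude 8.3 conversion: uppercase, drop spaces, keep at most 8
# characters of the base name and 3 of the extension.  Collisions
# (two long names mapping to the same short name) are NOT handled.
to_8_3() {
    local name=${1##*/} base ext
    base=${name%.*}; ext=${name##*.}
    [ "$base" = "$name" ] && ext=""      # the name had no extension
    base=$(printf %s "$base" | tr 'a-z' 'A-Z' | tr -d ' ' | cut -c1-8)
    ext=$(printf %s "$ext" | tr 'a-z' 'A-Z' | cut -c1-3)
    if [ -n "$ext" ]; then printf '%s.%s\n' "$base" "$ext"
    else printf '%s\n' "$base"; fi
}

to_8_3 "my long playlist.m3u"    # MYLONGPL.M3U
```

Wrapped in a `find ... -exec` loop that also rewrites the paths inside the .m3u files, something like this could produce a player-friendly copy of the tree.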
I've got a directory with thousands of files and I want to delete those that contain specific text. When I try: Code: ls | grep -l "specific text" * | xargs rm I get the error: Code: /bin/grep: Argument list too long. Is there an easy way to get around this without having to move files into separate folders and process them in batches? I found an article on getting around this problem, but I'm kind of new to Linux and don't know how to apply it to my specific problem.
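The `*` glob is what overflows the argument list (the leading `ls` is also a no-op there, since grep gets the filenames from the glob). Letting find feed the files to grep in batches avoids the limit entirely. A sketch against a throwaway directory:

```shell
# Delete regular files in DIR whose contents match a pattern, without
# ever expanding a huge glob on the command line.
DIR=$(mktemp -d)                          # stand-in for the real directory
echo 'some specific text here' > "$DIR/bad1"
echo 'more specific text too'  > "$DIR/bad2"
echo 'harmless'                > "$DIR/good"

find "$DIR" -maxdepth 1 -type f -exec grep -l 'specific text' {} + \
    | xargs -r rm --

ls "$DIR"    # only "good" remains
```

If any filenames contain spaces or newlines, use `grep -lZ` with `xargs -r0` instead so the names survive the pipe intact.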
There are millions of files in many directories. Whenever I try rm *, or use find with xargs, I get 'argument list too long' and the command exits. How can I delete the files in a directory with so many files, without deleting the directory itself?
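`rm *` fails because the shell expands the glob into one enormous argument list before rm ever runs. find's `-delete` removes each entry itself as it walks the tree, so no argument list is built, and with `-mindepth 1` the starting directory is never touched. A small-scale sketch:

```shell
# Delete the files inside DIR without removing DIR itself.
DIR=$(mktemp -d)
for i in 1 2 3; do touch "$DIR/file$i"; done   # stand-in for millions

find "$DIR" -mindepth 1 -maxdepth 1 -type f -delete

ls -A "$DIR"                     # empty
test -d "$DIR" && echo "directory still exists"
```

Drop `-maxdepth 1` (and `-type f`, if empty subdirectories should go too) to clean out nested directories as well.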
I am using my media server as my podcast collector. I am in the process of learning the ins and outs of NFS so I can mount an NFS directory and transfer my podcasts from server to player. For now I am using scp to transfer podcasts from the server to the desktop, then to the player. The problem is that the path to the directory of one of the podcasts is /home/user/gpodder-downloads/The BILL&TIMMY Show Podcast.
Whenever I try to run my scp command it fails, because the shell treats the & as a background operator and TIMMY as a command to run. I have tried backslash-escaping the character, and I've tried single- and double-quoting it, and I still get the same problem. As it sits now I have to move all podcasts to another directory and then transfer them to my desktop, but I would like to transfer the podcasts without unnecessary steps.
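The catch with scp is that the remote path is parsed twice: once by the local shell and once by a shell on the remote side. Quoting it once only survives the first pass, which is why single backslashes and plain quotes both fail. Wrapping the remote path in single quotes inside the double-quoted argument protects it on the remote side too. A sketch that only builds the command (the host and local destination are placeholders, so it isn't executed here):

```shell
# Quote the remote path for BOTH shells: double quotes for the local
# shell, single quotes inside for the remote one.
remote_dir='/home/user/gpodder-downloads/The BILL&TIMMY Show Podcast'

# The command you would actually run (host and destination are
# placeholders) -- built as an array so each word stays intact:
cmd=(scp -r "user@server:'$remote_dir'" /tmp/podcasts/)
printf '%q ' "${cmd[@]}"; echo
```

The remote shell then sees the path single-quoted, so the spaces and the & are taken literally.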
When I try to copy more than one file, each around 300MB on average, Ubuntu copies one file, then the data transfer stalls for some time, then continues and stalls again near the end of the file. I did not face this problem while copying data to my external hard drive.
I just started Ubuntu 9.04 and it said: "File system check failed. A log is being saved in /var/log/fsck/checkfs if that location is writable. Please repair the file system manually. A maintenance shell will now be started. Ctrl+D will terminate this shell and resume system boot. Give root password for maintenance or type Ctrl+D to continue." I pressed Ctrl+D, and after login it said that it cannot find /home. I started with the live CD:
I'm wondering what might be causing some VERY long delays when I move groups of files from one directory to another on the same drive. In the GUI, I simply multi(shift)select a few dozen items at once (a set of JPEGs previously downloaded from my camera) and drag them together from the source directory window (where I downloaded all the images from the memory card) into a new/empty folder/window specific to that group of images. Just routine sorting of files basically... Once I 'let go' ('dropping' the items into their destination) there's often a SURPRISINGLY long delay before I can do something else within the GUI... open another file, or rename an item, etc... This delay can take a few seconds to more than a minute (if moving a couple hundred files at once)... this 'wait' during such a routine 'housekeeping' task seems surprising to me. During these delays, I CAN open/use other programs such as System Monitor or a browser... it just seems that additional GUI/filesystem tasks must wait out the delay before proceeding. If I go ahead and try to do another filesystem task during the delay anyway, it gets buffered... the file won't open/next files don't get moved/etc... UNTIL the delay from the first operation is complete (updated item counts resulting from file moves aren't reflected in List View until the delay is finished too).
According to System Monitor (see image attachment for screen cap during one of these delays...) one CPU is pegged (the "gvfsd-metadata" process, which I guess corresponds to the file/move) and the other 3 cores are relatively idle, and there's plenty of free RAM/no swap. I'd have thought such a delay wouldn't be an issue with Ubuntu/my PC... maybe I have something set up incorrectly? Other clues: intermittently, during these delays/file operations, the GUI shuts down all open windows (folders)... as if the delays/buffered tasks caused some sort of reset. The hard drive is internal SATA formatted regular Ext4 and the Ubuntu on my PC is the 32-bit version since I figured my Dell is too old (about 4-5 years) to justify the 64-bit version. Like most folks, there are LOTS of files/folders on my drives, but I only have 3 file windows open at once most of the time, and am only displaying the item names and 'sizes'... no other columns. Nothing other than the 'move' itself is running at the time which could help explain the delay.
When I copy a file to my pen drive it takes a long time. How can I troubleshoot this? For example, the dialog shows 13 seconds remaining but takes 20 minutes to finish!
I have several audiobooks that are each split into many small chapters, and I would like to string together about ten of them at a time, so that I don't have 60 4-5 minute mp3 files per audiobook. If I were to do this all by hand, with Audacity or something similar, it would get very tedious, so I'd like to know if there's already a program that would do it for me when told which files to string together.
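For MP3 specifically, plain `cat` usually produces a playable file, since each MP3 frame carries its own header; the caveat is that ID3 tags from the later files end up embedded mid-stream, so a tool like ffmpeg's concat demuxer is cleaner if tags matter. A cat-based sketch using dummy files in place of real chapters:

```shell
# Join chapter files in sorted name order into one output file.
# Works tolerably for MP3 frames; for clean tag handling use ffmpeg.
src=$(mktemp -d); out=$(mktemp -d)
for i in 01 02 03; do echo "chapter$i" > "$src/ch$i.mp3"; done  # dummies

cat "$src"/ch*.mp3 > "$out/book-part1.mp3"
wc -c < "$out/book-part1.mp3"
```

Grouping by tens is then just a matter of which globs or file lists you feed to cat (e.g. `ch0*.mp3`, `ch1*.mp3`, ...).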
My work requires me to have a lot of PostScript files, hence the elaborate names for easy identification. For example: AP-1_Jul-Ctrl-noEqSA_bg25C_precip,U250-150_xy.ps. I am unable to open them using evince; I get a blank evince window with "Loading" written on it. The same file renamed to 1.ps opens in a jiffy. Evince quickly opens PDF files with long names, but I would rather not convert the PS files to PDF.
Even kghostview does not open these long-named files.
I have a USB drive that I boot using Syslinux. I then select one of several options to complete a task. I do not have access to edit those kernels. I need to add an option to the Syslinux menu that deletes all the files in a specific directory.
I'm writing a Perl script which runs Linux commands. I have a directory with a load of files.
Code: exec_cmd('rm $(ls * | grep -v file1)'); This command deletes everything except file1. How can I modify it to delete all files except file1 and file2?
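The minimal change is to give grep a second `-v` pattern with `-e`; the sturdier alternative is to let find do the exclusion, which also survives odd filenames that would break the `ls | grep` pipeline. A sketch in a throwaway directory:

```shell
# Delete everything in DIR except file1 and file2.
DIR=$(mktemp -d)
touch "$DIR/file1" "$DIR/file2" "$DIR/junk1" "$DIR/junk2"

# the quick fix, mirroring the original pipeline:
( cd "$DIR" && rm $(ls | grep -v -e '^file1$' -e '^file2$') )

# safer equivalent, immune to spaces/newlines in names:
#   find "$DIR" -maxdepth 1 -type f ! -name file1 ! -name file2 -delete

ls "$DIR"    # file1 and file2 survive
```

Anchoring the patterns with `^...$` stops `file1` from also sparing names like `file10`.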
So I am taking a half online, half in-person class. I am working on an assignment that is due tonight by midnight. I am kind of confused on this question and am waiting on an email from my instructor, but only have 2.5 hours left until it's due. Here is the question: Use file globbing to copy all the files in the /labs/data directory that end with .out to the lab07 directory. How many files are in the lab07 directory? When I ls -l that directory, I don't see any files with that extension. Is there an easier way that I am missing, such as cp /labs/data *.out > lab07? Someone help?
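Two things to adjust in that guess: the glob belongs inside the source path (`/labs/data/*.out`, not `/labs/data *.out`), and cp takes the destination directory as its last argument rather than a `>` redirect, which is for command output. A sketch with stand-in directories:

```shell
# Copy every *.out file from the data directory into lab07, then count.
data=$(mktemp -d); lab07=$(mktemp -d)    # stand-ins for /labs/data, lab07
touch "$data/a.out" "$data/b.out" "$data/notes.txt"

cp "$data"/*.out "$lab07"/

ls "$lab07" | wc -l      # 2
```

If the glob matches nothing in the real /labs/data, cp reports "No such file or directory", which would explain seeing no .out files with ls -l.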
I am installing openSUSE 11.2 on a laptop. After installation, reboot, and auto-configuration, the screen shows lines of text following "Starting SuSEconfig". The latest line is "Creating cache files for fontconfig". I don't understand why this specific operation takes so long; almost an hour has passed. Is there a way to speed up and complete the process so that I can log in and start using openSUSE? I did try hitting the spacebar and moving the mouse, but nothing happens.
I have a PHP script in the cron directory that generates 5 text files. After the files are generated, I want to create a script that will move the 5 text files to another folder named "web".
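If the five files share an extension, a single mv with a glob does it; the sketch below uses throwaway paths in place of the real cron output directory:

```shell
# Move the generated text files into the "web" folder.
outdir=$(mktemp -d)                      # stand-in for the cron output dir
webdir="$outdir/web"
mkdir -p "$webdir"
for i in 1 2 3 4 5; do echo "report $i" > "$outdir/report$i.txt"; done

mv "$outdir"/*.txt "$webdir"/

ls "$webdir" | wc -l     # 5
```

In the crontab this can simply follow the PHP command, e.g. `php /path/to/job.php && mv /path/to/out/*.txt /path/to/web/`, so the move only runs if the generation succeeded.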
I created a directory somewhere with permissions rwxrwxr-x so that other users in my group can create files and directories in it.
I do need to be able to delete the contents of this "public" directory, but it seems that while I am able to remove any files in this directory, I cannot remove any subdirectories under it.
Is there a way to remove such subdirectories owned by others under a directory owned by me?