I have a directory tree with many subdirectories, each containing many files. I want to recursively find the 5 oldest files starting from the base directory, not 5 from each subdirectory. I am writing a shell script which collects them with ls -lRtur|egrep "txt|jpg" > /tmp/file1. Now, from this /tmp/file1 file, I want to sort the entries the same way the ls -ltr command does, i.e. from oldest file time to newest file time. How do I sort based on the Linux timestamp? The files themselves also have Linux timestamps embedded in them, so I could extract those and sort on them instead if that is easier. My /tmp/file1 has entries like below.
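If GNU find is available, a simpler route is to skip parsing ls output entirely and have find print each file's epoch timestamp; a sketch, with /base/dir as a placeholder path:

Code:
# print mtime as seconds-since-epoch plus path, oldest first, keep 5
find /base/dir -type f \( -name '*.txt' -o -name '*.jpg' \) \
    -printf '%T@ %p\n' | sort -n | head -n 5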
Actually I am creating a watch on one directory into which files are continuously arriving. Is there a command which can list all files that have arrived in the last 24 hours?
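find's -mtime test covers this; a minimal sketch, assuming the watched directory is /watched/dir:

Code:
# list regular files modified within the last 24 hours
find /watched/dir -type f -mtime -1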
I am trying to add a command to my backup script that deletes the oldest file in the destination folder before adding a new .tar.gz file. I found this information at .html, which I thought would work fine, and added the following line to my backup script: ls -t -r -l /backups/Scalix_Backup* | head --lines 1 | xargs rm. However, when I tried this I got an error: rm: invalid option -- w / Try `rm --help' for more information.
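The -l flag is the likely culprit: it makes ls emit the full long listing, so xargs hands rm the permission string (e.g. -rw-r--r--), which rm tries to parse as options. Dropping -l should fix it; a sketch using the same glob:

Code:
# oldest first (-tr), take the first name only, then remove it
ls -tr /backups/Scalix_Backup* | head -n 1 | xargs rm --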
I've got a whole lot of video files spread among hundreds of folders, and I'm wondering whether Linux has a program that can scan the modification and access dates of them all and display what's been accessed most recently.
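GNU find can print access times directly; a sketch, assuming a base directory of /videos:

Code:
# newest access time first, show the top 20
find /videos -type f -printf '%A@ %Ac %p\n' | sort -rn | head -n 20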
I'm using openSUSE 11.4 (x86_64) with KDE 4.6.00 (4.6.0) "release 6". Every time I visit a directory, a hidden file ".directory" is created. How do I disable that? Is there a possibility to disable that behaviour only for the public_html directory?
I've just discovered that crontab is creating a new file in the root directory every time it executes a cron job, and it doesn't overwrite the old file, so there are thousands of files in the root directory. They have the same name as the script file (appended with a numeral) but are all blank. Here is what one of the cron jobs looks like: [URL]
I have a user account on a remote machine, but it doesn't have a home directory on that machine. Is it possible to create a home directory without having root account details? If yes, how can it be done?
I have heard that creating a hard link to a directory is not possible. However, the man page for "ln" says of the "-d/-f" options: hard link directories (super-user only). Does this mean the super-user, i.e. root, can create a hard link to a directory while a normal user cannot? Even when specifying the above options, I get "operation not permitted" as the super-user.
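For what it's worth, on most modern filesystems the kernel itself refuses the link() call on directories, so even root sees the failure. Roughly what an attempt looks like (paths are placeholders, and the exact error wording varies by coreutils version):

Code:
# ln -d /srv/data /srv/data-link
ln: failed to create hard link '/srv/data-link' => '/srv/data': Operation not permitted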
This may be a rookie mistake, but I created a new user on an Ubuntu system and didn't actually create the home directory for this user. Now, when I log in, it says there are problems. If I delete the path /home/<new user> and try to log in, the system tells me I can use root as the home directory but will likely experience problems, and then it won't let me log in. What is the best way to create this directory with the appropriate permissions? Should I just create another user and delete this one?
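There should be no need to recreate the user; making the directory by hand and seeding it from /etc/skel is usually enough. A sketch, assuming the username is newuser:

Code:
sudo mkdir /home/newuser
sudo cp -rT /etc/skel /home/newuser           # copy the default dotfiles
sudo chown -R newuser:newuser /home/newuser
sudo chmod 750 /home/newuser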
I jumped into a Linux class in college with only 3 weeks left in the course. I thought I would be able to catch on and, go figure, it didn't exactly happen that way. I was given an assignment to do, and I am so far lost it isn't even funny. I need to create a directory structure, set up file security, create a step-by-step instruction manual on how to copy/delete said files, and create a guide to common Linux commands. How would I create these files as root and share them with the other users? And where can I find a list of common commands and their functions?
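For the sharing part, the usual pattern is a common group plus group-writable, setgid directories; a sketch with made-up names:

Code:
sudo groupadd students                        # shared group (name is made up)
sudo mkdir -p /srv/coursework/docs
sudo chgrp -R students /srv/coursework
sudo chmod 2775 /srv/coursework /srv/coursework/docs   # setgid keeps group ownership on new files
sudo usermod -aG students alice               # repeat for each user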
I want to run a cronjob every 15 minutes that checks a directory for files. If the directory contains more than ten files I want it to send an email to me.
All I have is this...
Code:
*/15 * * * * ls -l | wc -l | [filename] | mail -s "This is just a test" [email address]
I would rather not write a bash script. Is there an easier way to do this? I was looking into commands like find and grep.
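A single crontab line can do it without a separate script; a sketch, assuming the directory is /path/to/dir (note that a literal % would need escaping in a crontab, which this line avoids):

Code:
*/15 * * * * [ $(find /path/to/dir -maxdepth 1 -type f | wc -l) -gt 10 ] && echo "directory has more than 10 files" | mail -s "Directory alert" you@example.com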
I'm quite new to Linux, but I have configured a simple FTP server and it's working great. I have an FTP-Shared folder with upload and download subfolders. Under upload and download I have identical category subfolders like mp3s, movies, software etc. As people upload, I would like a one-line crontab entry that moves everything under /FTP-Shared/upload/mp3/ older than 14 days to /FTP-Shared/download/mp3/, recursively (like the cp command). But the timestamp must be checked on the first-level directory only, not on the files inside it, for example: /mp3/Club Dance/CD1/Hallo world.mp3. This is how far I got:
[root@clients ~]# /usr/bin/find /FTP_Shared/upload/Mp3s/ -depth -mindepth 1 -mtime +14 -type d -exec mv -f {} /FTP_Shared/download/Mp3s/ \;
This command moves the directories and files, but not recursively as one unit.
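Restricting the match to the first level with -maxdepth 1 may be what's missing; mv then carries each whole tree across in one go. A sketch, keeping the paths from the command above:

Code:
find /FTP_Shared/upload/Mp3s/ -mindepth 1 -maxdepth 1 -type d -mtime +14 \
    -exec mv -f {} /FTP_Shared/download/Mp3s/ \;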
If I have a directory /foo with a few files in it, how do I symlink each entry in /foo into /bar/? For instance, if /foo has the files a, b and c, I want to create three symlinks: /bar/a, /bar/b and /bar/c, each pointing at the corresponding file in /foo.
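ln accepts multiple sources when the last argument is a directory, so a glob covers this:

Code:
# creates /bar/a -> /foo/a, /bar/b -> /foo/b, /bar/c -> /foo/c
ln -s /foo/* /bar/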
I have a full snapshot of the ArchLinux repo for x86_64. I want to use this as my restore backup should I need to reinstall Arch without network support. How do I build several tar volumes of my /mount/my_repo to fit onto 4.5 GB DVDs? The thing is 18 GB in size. And how do I extract all the tar volumes to a folder later on? Is it the same as extracting a single tar? Will tar find all volumes at the same directory level so as to continue extracting, or do I need to merge them in some way?
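One common approach is to pipe tar through split rather than use tar's own multi-volume mode; reassembly is then just a cat of the pieces in order. A sketch, assuming GNU tar/split and the path above:

Code:
# create ~4.4 GB pieces that each fit on a 4.5 GB DVD
tar -cf - /mount/my_repo | split -b 4480M - repo.tar.part-

# later: concatenate the pieces (shell sorts the glob in order) and extract
cat repo.tar.part-* | tar -xf - -C /restore/target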
How can I randomly write / create a 1 GB file in bash to test disk / network I/O? I was told I could use the 'dd' command, but I don't know whether there are better ways, or what the 'dd' invocation looks like.
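The classic dd forms are below; /dev/zero is fast but highly compressible, while /dev/urandom gives random content at the cost of CPU time:

Code:
# 1 GB of zeros (1024 blocks of 1 MiB)
dd if=/dev/zero of=testfile bs=1M count=1024

# 1 GB of random data (slower)
dd if=/dev/urandom of=testfile bs=1M count=1024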
Concatenating two files without creating a newline between them: how is that possible? I've tried the following:
Code:
echo 123 > file1
echo 456 > file2
cat file1 file2 > file3
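The newline comes from echo itself: it appends a trailing \n to each file, and cat reproduces it. One way around it is to strip file1's trailing newline first; a sketch:

Code:
printf '%s' "$(cat file1)" > file3   # $(...) strips trailing newlines
cat file2 >> file3                   # file3 now contains 123456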
I want to delete the first two partitions (a 243 MB Linux swap partition and a 5.87 GB root partition). Problem is, every time I deleted them, Windows would also delete Drive G: and H:, leaving only drive F: intact. Deleting the two Linux partitions using GParted from a Live CD also gave the same result. I've attempted this multiple times now and, after each attempt, TestDisk always managed to recover all the deleted partitions.
Any ideas on how to delete these two partitions without affecting the rest?
After launching gnome-keyring-daemon, my mounted MP3 player is no longer accessible. In /var/log/messages I get the message "gnome-keyring-daemon removes usb device". As long as gnome-keyring-daemon is running, I cannot remount the device, though it is visible using lsusb. I'm running an FC12 system.
I am directly creating "qf" and "df" files in the sendmail queue folder and then processing the queue from the command line. This is the only way to export data and email from the old application I am stuck with. This works quite well in my test environment, but I am really new to Linux/sendmail and am looking for any feedback on this process. Is this direct creation of queue files safe, and are there any pitfalls I should be aware of?
How would I go about copying files to a directory, yet skipping the files that already exist in the directory, and also removing the files that are in the directory? For example:
Code:
$ ls /dir1
img001.jpg img002.jpg
Now I would like to copy from dir1 to dir2, but the contents of dir2 would be:
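For the copy-without-overwriting part, cp's no-clobber flag may be what's wanted; a sketch using the directories above:

Code:
cp -n /dir1/* /dir2/    # -n: never overwrite an existing file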
Why is my scanning always creating huge 50 MB to 100 MB PDF files? Each A4 PNM file is about 6.5 MB at a resolution of 150. If I decrease the resolution below 100, my text starts to become unreadable.
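PNM is uncompressed, so the size likely comes from the raw pages being embedded as-is; recompressing the pages as JPEG inside the PDF usually shrinks it dramatically. A sketch using ImageMagick (the quality value is a guess to tune against readability):

Code:
convert page.pnm -compress jpeg -quality 60 page.pdf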