General :: Shell Script For Identifying The File And Zip All Files, Move The Files To Target Dir?
May 7, 2011
1. Every Sunday
2. Find all files older than 1 day
3. Gzip these files
4. Tar up the gzipped files into one tar file
5. Name the tarball with a date stamp indicating what day it was created, so we know that week's files are in the file
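A minimal sketch of that weekly job, assuming the files live in a hypothetical /var/log/myapp directory; adjust the path and the cron schedule to taste:

Code:
#!/bin/sh
# Weekly archive job: gzip files older than 1 day, then tar the
# results into a date-stamped tarball. Run from cron on Sundays, e.g.:
#   0 2 * * 0 /usr/local/bin/weekly_archive.sh
# /var/log/myapp is an assumed example directory.
cd /var/log/myapp || exit 1
# Gzip every regular file older than 1 day (skip already-gzipped files)
find . -maxdepth 1 -type f -mtime +0 ! -name '*.gz' -exec gzip {} \;
# Bundle the gzipped files into one date-stamped tarball
tar -cf "archive-$(date +%Y%m%d).tar" ./*.gz && rm -f ./*.gz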
An external service I don't manage pushes media files into a shared directory on my server. I need to move these files into their correct directories automatically. The problem is that if I run my script as a cron job once every 3 minutes, I notice that the script copies files which are still on their way into my server. So I need to figure out how to have the script check that the files are complete (done downloading) before it moves them. This is what I have so far:
Code:
#!/bin/sh
# Script by proximity 280709.
# Locate correct directory
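For the completeness check itself, one hedged approach is to skip any file that lsof reports open and to double-check that the size has stopped changing between two reads; /shared and /target below are assumptions:

Code:
#!/bin/sh
# Move only files that appear to be fully downloaded.
# /shared and /target are example paths (assumptions).
for f in /shared/*; do
    [ -f "$f" ] || continue
    # Skip files still held open by the uploading process
    lsof "$f" >/dev/null 2>&1 && continue
    # Skip files whose size is still growing
    size1=$(stat -c %s "$f"); sleep 5; size2=$(stat -c %s "$f")
    [ "$size1" -eq "$size2" ] || continue
    mv -- "$f" /target/
done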
I have to move files between two file systems, /inst and /inst2. When I perform 'cp -a /inst /inst2' it copies everything, even hidden files, and preserves access permissions. But when I perform 'mv /inst /inst2' it preserves access permissions yet moves everything besides hidden files. Questions: why is this so? And what tool should be used when moving files from one file system to another (rsync)?
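If the move was actually done as 'mv /inst/* /inst2', the likely explanation is that the shell glob * skips dotfiles by default, so the hidden files were never passed to mv at all. For copying a tree between file systems, rsync is the usual tool; a sketch:

Code:
# -a preserves permissions, ownership, timestamps, and symlinks;
# -H hard links, -A ACLs, -X extended attributes. Hidden files are
# included because rsync copies directory contents, not shell globs.
rsync -aHAX /inst/ /inst2/
# After verifying the copy, remove the source:
# rm -rf /inst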
The system currently has a directory holding all the invalid files. How bad is it to move a single file into a directory that already contains 3 million files?
I am supposed to take some small files, and print them to a specific printer, such that the small files are concatenated into one file. The file name has to be included in the file that gets printed.
Should I be looking to concatenate the files into one file with the file names included, and then print them?
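Concatenating with a header per file and then printing the result is the simplest route. A sketch, assuming plain-text files and an lpr queue name of your own:

Code:
#!/bin/sh
# Concatenate small files, prefixing each with its name, then print.
# PRINTER_NAME and the *.txt glob are assumptions.
out=$(mktemp)
for f in *.txt; do
    printf '===== %s =====\n' "$f" >> "$out"
    cat "$f" >> "$out"
    printf '\n' >> "$out"
done
lpr -P PRINTER_NAME "$out"
rm -f "$out"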
Being relatively new both to Linux and this forum, I am sorry if I make a post that already exists, even though I couldn't find it. My problem is that I can't move downloaded files over to the root filesystem. I downloaded and unpacked a skin to change the look of aMSN. I open root, go to usr → amsn → share → skins, and now I am supposed to copy the skin folder over to that directory, but I can't. I also tried Alt+F2 and writing 'sudo konqueror', as advised, but there was no difference.
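File managers launched from a normal session have no root rights, which is why the copy into the system directory fails. From a terminal, a hedged one-liner using the path as described in the post; 'MySkin' and the download location are placeholders:

Code:
# Copy the unpacked skin into aMSN's skins directory with root rights.
# 'MySkin' and ~/Downloads are placeholders (assumptions); the skins
# path follows the folder order described in the post.
sudo cp -r ~/Downloads/MySkin /usr/amsn/share/skins/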
What I've got: from a crontab, run a script (I understand that part); this script needs to count the number of files in /outgoing/, subtract that number from 30, and move that many files from /readycalls/. I need to keep the Asterisk outgoing queue full of .call files without having too many in there at any given time.
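A sketch of that top-up logic, using the /outgoing and /readycalls paths from the post (and assuming .call file names without spaces):

Code:
#!/bin/sh
# Keep /outgoing filled with up to 30 .call files, pulling from
# /readycalls. Run from cron; paths are as described in the post.
count=$(ls /outgoing | wc -l)
need=$((30 - count))
if [ "$need" -gt 0 ]; then
    ls /readycalls | head -n "$need" | while read -r f; do
        mv -- "/readycalls/$f" /outgoing/
    done
fi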
I am looking for a script, or advice and guidance on how to write one, so that when I use a 'del' command it removes/sends the files/folders to a directory I specify, for example a 'dustbin'.
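A minimal sketch: a shell function named del that moves its arguments into a dustbin directory instead of deleting them (~/dustbin is an assumed location):

Code:
# Add to ~/.bashrc; ~/dustbin is an example location (assumption).
del() {
    mkdir -p ~/dustbin
    mv -- "$@" ~/dustbin/
}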
I'm using the command below to sync two directories. The problem is that instead of deleting the files on the target directory, it simply appends a ~ character to the end of the file name. Not sure why this is happening? I'd like to have all deletes on the source replicated on the target.
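A trailing ~ is the signature of rsync's --backup (-b) option, which renames files instead of removing them. Dropping -b and keeping --delete should make removals propagate. A sketch with placeholder paths:

Code:
# Mirror src to dest; --delete removes files on dest that no longer
# exist on src. Without -b/--backup nothing gets a ~ suffix.
rsync -av --delete /path/to/src/ /path/to/dest/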
I have a directory with a bunch of scripts and other random files. I want to move the files with the executable flag set to another folder en masse (or, if it's easier, the ones without the x flag). Is there an easy way?
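find can select on the execute bit directly; a sketch with an assumed destination folder:

Code:
# Move executable regular files from the current directory to ../bin
find . -maxdepth 1 -type f -perm -u+x -exec mv -- {} ../bin/ \;
# Or the inverse: move the ones WITHOUT the execute bit
# find . -maxdepth 1 -type f ! -perm -u+x -exec mv -- {} ../other/ \;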
In my bash script I need to move the files in a folder, but only those that are not in use.
Code:
for entry in `ls /root/shared_storage/input`; do
    echo $entry
    run=`lsof /root/shared_storage/input/$entry`
    ru=${run:0:5}
    echo $entry
    if [ "$ru" == "" ]; then
        ........
It worked fine sometimes, but sometimes it just gets stuck at lsof. Is there any other way I can check whether $entry is being used by another process?
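If lsof can hang (for instance on a stale network mount), two hedged alternatives: bound it with timeout, or use fuser, which exits non-zero when no process has the file open:

Code:
# Option 1: give lsof at most 5 seconds (GNU coreutils timeout)
if ! timeout 5 lsof "/root/shared_storage/input/$entry" >/dev/null 2>&1; then
    echo "$entry not in use (or the check timed out)"
fi
# Option 2: fuser -s is silent and exits non-zero if nothing has it open
if ! fuser -s "/root/shared_storage/input/$entry"; then
    echo "$entry not in use"
fi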
I have a PHP script in the cron directory that generates 5 text files; after the files are generated, I want to create a script that will move the 5 text files to another folder named "web".
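A sketch of the mover; the output path, the .txt extension, and the web folder location are all assumptions:

Code:
#!/bin/sh
# Move the five generated text files into the web folder.
# Both paths and the .txt extension are assumptions.
mv -- /path/to/output/*.txt /path/to/web/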
So I was wondering: if I capture this output into a file (i.e. one file name per line), can anyone help me write a command which iterates through the file and moves each listed file, one by one, to a specified directory?
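A sketch of the read-and-move loop; filelist.txt and /target are assumptions:

Code:
# Read one file name per line and move each to the target directory.
# Quoting handles names containing spaces.
while IFS= read -r f; do
    mv -- "$f" /target/
done < filelist.txt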
I have this script that I use to find log files in the /var/log directory that are 2 days old, move them to /var/log/tmp, rename them to the system date plus the filename, and move them back to /var/log. Everything seems to work as planned, except that the files don't get moved out of tmp, and they keep getting renamed. This leads to very long filenames such as:
What is it about this script that isn't moving the files back to /var/log? Also, is there a better way of doing this than what I'm doing? Basically, I'm just trying to set up an audit trail on some of the files in /var/log, so that at the end of the month I can tar them and have our syslog server pick up the one giant monthly log.
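The script itself isn't shown above, but the repeated-renaming symptom usually means the already-stamped files still match the script's search pattern on the next run. One hedged alternative that avoids the tmp round trip entirely is to rename in place and exclude stamped names from the find:

Code:
#!/bin/sh
# Stamp logs older than 2 days in place; names already starting with
# a date are excluded so they are not renamed again. The exclusion
# pattern is an assumption; adjust it to the real naming scheme.
find /var/log -maxdepth 1 -type f -mtime +1 ! -name '20*.*' |
while IFS= read -r f; do
    mv -- "$f" "/var/log/$(date +%Y%m%d).$(basename "$f")"
done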
I have Ubuntu 10.10 installed on my laptop. I recently installed the KDE desktop from the Software Center. Today I noticed something strange. I tried to move a file to the trash and got this error message: "The trash has reached its maximum size! Cleanup the trash manually." I don't have any files in the trash. I went back to GNOME and was able to delete the file. I opened up Dolphin while still in GNOME and couldn't delete anything, so I know that this isn't a KDE problem.
I use AVG Free as my antivirus scanner. I looked for some time for a scanner that can scan for and remove viruses. The problem is that AVG 8.5 will only detect viruses, not remove them... I use a one-liner looking like this:
I have 2 external HDDs on which I have all my files.... Yesterday I copied all the files from hdd2 to hdd1, and I want to eliminate the duplicates, so I used FSlint to find them. Now I have a txt file that looks like this:
Code:
/media/My Book/!!!MIS DOCUMENTOS/Documentos/2 sep2003-jun2009 USB/!TESIS/TESIS/TESIS CVT LABVIEW Y CODEWARRIOR/LabVIEW85RuntimeEngineFull.exe
/media/My Book/HDD_Toshiba/Borrable/Pen_Drive_4GB/Tesis/Super CD de la tesis/LabView/LabVIEW85RuntimeEngineFull.exe
...multiplied by millions of entries.
Now I want to make a shell script to delete all the files/entries (read from the log file) that begin with:
Code:
/media/My Book/HDD_Toshiba/****, since HDD_Toshiba is the folder on hdd1 (My Book) that contains all the files from hdd2.
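A sketch of that cleanup, reading the FSlint output line by line and deleting only paths under the hdd2 copy; duplicates.txt is an assumed name for the log, and swapping rm for echo first makes a safe dry run:

Code:
#!/bin/sh
# Delete duplicates listed in the FSlint log, but only those under
# /media/My Book/HDD_Toshiba/. duplicates.txt is an assumed filename.
while IFS= read -r path; do
    case "$path" in
        "/media/My Book/HDD_Toshiba/"*)
            rm -v -- "$path"   # replace rm with echo for a dry run
            ;;
    esac
done < duplicates.txt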
I recently had data recovered and it was sent back to me on what I think is an NTFS drive. I copied all the files over to a file share I have on a Linux box, that's ext4. Now I have that share mounted on my OSX machine, and I can't move or rename most of the files. However, in a couple cases I was able to rename a folder after the third try. Another time I was able to rename a folder once, but not again. All the permissions are showing up the same on the command-line -- I can't see any differences between the permissions on any of the files/folders. Note that I can create new folders and add files no problem, and then rename and move those all I want.
I have an Ubuntu NAS set up with two 1.5TB drives in a mirrored array. We recently needed more storage and will constantly be adding to this machine. We added two 2TB drives in a striped array. What I'd like to do is find all directories totaling 10GB+ on the mirrored array and move them over to the striped array, to free space on the mirrored array for smaller, more important data. I've tried:
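The attempted command didn't survive above; one hedged way to list the big directories is du, with awk splitting on the tab that du emits (the mount points are assumptions):

Code:
# List first-level directories on the mirror that total 10GB or more
# (sizes reported in whole GB). /mnt/mirror is an assumed mount point.
du -B1G --max-depth=1 /mnt/mirror | \
    awk -F'\t' '$1 >= 10 && $2 != "/mnt/mirror" {print $2}'
# Once the list looks right, feed it to mv:
# ... | while IFS= read -r d; do mv -- "$d" /mnt/stripe/; done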
I'm totally new to Linux; in fact I'm a Windows admin. Scenario: I need to run a script that will automatically move files that are 30 days old from one particular folder to another particular folder.
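find's -mtime test handles the age check; a sketch with placeholder paths:

Code:
# Move files last modified more than 30 days ago.
# Both paths are placeholders (assumptions).
find /path/to/source -maxdepth 1 -type f -mtime +30 \
    -exec mv -- {} /path/to/archive/ \;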
I'd like to move a selection of files from all the sub-directories within an overall directory to a single destination. I don't want any of the directory structure, just the files themselves. This is what I tried so far:
mv /dir1/*/igs*.sp3.Z /dir2
There are other .sp3.Z files in the * directories within /dir1, but I just need the ones that start with igs.
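The mv above should work when the files sit exactly one level down; if they can be nested deeper, find flattens the structure. A sketch:

Code:
# Collect igs*.sp3.Z from any depth under /dir1 into /dir2,
# discarding the directory structure.
find /dir1 -type f -name 'igs*.sp3.Z' -exec mv -- {} /dir2/ \;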
I'm trying to move font files (.ttf and .otf) from the download folder to a folder Inkscape can find them in. I tried dragging and dropping them in Dolphin, but I don't have permission! So I tried in the terminal:
Code:
~$ mv ~/downloads/fonts/*.*tf /usr/share/fonts
mv: cannot stat `/home/bryan/downloads/fonts/*.*tf': No such file or directory
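The unexpanded glob in the error means nothing matched the pattern; on Ubuntu the download folder is usually ~/Downloads with a capital D (an assumption here), and writing to /usr/share/fonts needs root. A hedged retry:

Code:
# ~/Downloads (capital D) is an assumption; sudo is needed because
# /usr/share/fonts is root-owned.
sudo mv ~/Downloads/fonts/*.ttf ~/Downloads/fonts/*.otf /usr/share/fonts/
# Alternatively, per-user fonts can go in ~/.fonts without sudo.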
A user will be FTPing some files to an upload directory. I need to move those files to another directory. I also need to mail a list of the just-moved files to the user. This job will need to run every 10 minutes. I also need to keep a log that holds all the files moved during the day, renaming it with a date/timestamp. I have this below but I just can't put it all together. Can anyone help me turn it into a workable script?
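A sketch tying those pieces together; the paths, the recipient address, and the use of mailx are assumptions:

Code:
#!/bin/sh
# Run from cron every 10 minutes, e.g.:
#   */10 * * * * /usr/local/bin/move_uploads.sh
# Paths and the mail recipient are assumptions.
UPLOAD=/home/ftp/upload
DEST=/home/ftp/processed
LOG=/var/log/moved_files.$(date +%Y%m%d)

moved=$(mktemp)
for f in "$UPLOAD"/*; do
    [ -f "$f" ] || continue
    mv -- "$f" "$DEST"/ && basename "$f" >> "$moved"
done
if [ -s "$moved" ]; then
    cat "$moved" >> "$LOG"            # daily, date-stamped log
    mailx -s "Files moved $(date)" user@example.com < "$moved"
fi
rm -f "$moved"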
Our client has a website (Joomla 1.5 based) to promote participation in and report results of a 10k race for runners in Scandinavia...
The race is scheduled for late June and will be covered by at least 5 still photographers at various points along the race course.
There are to be about 5,000 runners, each entered under a shirt number like bib_1000 through bib_5999, and this means there will be a significant number of digital jpg files submitted by the photographers...
The client has in place some software which will "automagically" resize, watermark, and rename the individual files in the form:
And so on, with the possibility of as many as 10 or 12 images in aggregate bearing the same alphanumeric "shirt number" at the beginning of the filename.
Our presentation software does a great job of handling thumbnail image generation and displaying a slideshow in a lightbox of all files within a given folder on the webserver.
And now finally the question...
Given a folder containing 25,000 or so *.jpg files, how can we write a script that:
1. parses the filenames to unique "shirt numbers"
2. makes folders with each "shirt number" as the folder name
3. moves the files from the root of the original folder into the appropriate "shirt number" folder
Note that the order of my list above is not important, and if you know of a better way to organize the task we are fine with that.
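The exact renamed form didn't survive above, so this sketch assumes each filename begins with the shirt number followed by an underscore (e.g. a hypothetical bib_1234_photo01.jpg); adjust the cut fields to match the real pattern:

Code:
#!/bin/sh
# Sort ~25,000 race photos into per-shirt-number folders.
# Assumes names like bib_1234_whatever.jpg (hypothetical format).
cd /path/to/images || exit 1    # assumed image folder
for f in bib_*.jpg; do
    [ -e "$f" ] || break                         # no matches
    shirt=$(printf '%s' "$f" | cut -d_ -f1-2)    # e.g. bib_1234
    mkdir -p "$shirt"
    mv -- "$f" "$shirt"/
done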
I have an rsnapshot backup that I need to move off of a corrupt Linux file system, and I need to preserve the internal hardlinks. I've tried rsync -H, and using a newer rsync, and neither preserves the hardlinks on OS X.
I tried to get rsync -H working and have isolated it to the mounted file system. I can preserve hard links copying locally (HFS to HFS), but rsync doesn't preserve them off of an SMB or AFP file system mount. Is there some mount option that would get OS X rsync to obey -H?
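Probably not a mount option: rsync detects hard links by matching device and inode numbers, and SMB/AFP mounts don't expose the underlying file system's real inodes and link counts. The usual workaround, sketched here, is to run rsync on the machine that sees the disk natively and push over ssh (host and paths are assumptions):

Code:
# Run on the Linux box, which sees the real inodes, pushing to the Mac.
# Host name and both paths are assumptions.
rsync -aH /backup/rsnapshot/ user@mac.local:/Volumes/Backup/rsnapshot/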
I'm looking for a script that can be run regularly with cron. It should:
1. Check a folder for rar files every few minutes, unrar them if present, and delete the leftover archive files once done.
2. Be able to specify within the script the directory of the folder to watch.
3. Apply an extension whitelist (.avi, .mkv, .mp4) and a blacklist (rar files) for the files to be moved.
4. Specify within the script which folder to move found files to.
I've seen a few online that do some of this, or much more than this, but I'm looking for something that just does this in a simple and efficient way... (Also, for the life of me, I just can't figure out how to edit those to do what I'm looking for.)
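A minimal sketch of points 1-4; both directories are placeholders, and unrar must be installed:

Code:
#!/bin/sh
# Example cron entry (every 5 minutes):
#   */5 * * * * /usr/local/bin/unrar_watch.sh
WATCH=/path/to/watch    # folder to check for rar files (assumption)
DEST=/path/to/media     # folder to move found files to (assumption)

cd "$WATCH" || exit 1
for rar in *.rar; do
    [ -e "$rar" ] || break            # no rar files present
    # Extract, then remove the archive and any split parts (.r00, .r01...)
    if unrar x -o+ "$rar" >/dev/null; then
        rm -f -- "$rar" "${rar%.rar}".r[0-9][0-9]
    fi
done
# Whitelist: move only the extracted media files to the destination
find "$WATCH" -maxdepth 1 -type f \
    \( -name '*.avi' -o -name '*.mkv' -o -name '*.mp4' \) \
    -exec mv -- {} "$DEST"/ \;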