I have an Ubuntu NAS set up with two 1.5TB drives in a mirrored array. We recently needed more storage and will constantly be adding to this machine, so we added two 2TB drives in a striped array. What I'd like to do is find all directories totaling 10GB+ on the mirrored array and move them over to the striped array, to free up the mirrored array for smaller, more important data. I've tried:
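The attempted command isn't shown, but one way to approach it is to let du list the large directories and feed them to mv. This is only a sketch: the mount points /mnt/mirror and /mnt/stripe are assumptions, and it expects directory names without spaces.
Code:
# List immediate subdirectories of the mirror that use 10GB or more, then move them
du -B1 --max-depth=1 /mnt/mirror |
awk -v limit=$((10*1024*1024*1024)) '$1 >= limit && $2 != "/mnt/mirror" {print $2}' |
while read -r dir; do
    mv "$dir" /mnt/stripe/      # run the pipeline without this mv first to review the list
done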
I need a little help. I want to find all files with the extensions "*.tar", "*.gz" and "*.zip" and move all those files into the "/opt/old" directory. I've tried this command:
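The command itself didn't make it into the post, but a working version might look like this sketch (it assumes GNU mv for the -t option and searches from the current directory):
Code:
# Move every .tar, .gz and .zip file found below the current directory into /opt/old
find . -type f \( -name '*.tar' -o -name '*.gz' -o -name '*.zip' \) -exec mv -t /opt/old {} +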
I want to move all files and directories that are one month old out into a separate backup folder. There will be a lot of files, and I want to make sure they copy properly. The problem I'm having is integrating md5sum into it to check integrity. md5sum is not recursive, so I figured it would work in a loop: as it copies each individual file, I'll run md5sum on each file and delete that md5 once it's verified the file copied OK.
[Code]...
I also need some sort of error handling to output all MD5s that didn't pass the hash check.
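A minimal sketch of that loop, assuming hypothetical /data/current and /data/archive paths and treating "one month" as anything older than 30 days; files that fail the comparison are collected in a log instead of being deleted:
Code:
src=/data/current
dst=/data/archive
failed=/tmp/md5_failures.log
: > "$failed"

find "$src" -type f -mtime +30 -print0 |
while IFS= read -r -d '' file; do
    rel=${file#"$src"/}                       # path relative to the source tree
    mkdir -p "$dst/$(dirname "$rel")"
    sum_src=$(md5sum "$file" | awk '{print $1}')
    cp -p "$file" "$dst/$rel"
    sum_dst=$(md5sum "$dst/$rel" | awk '{print $1}')
    if [ "$sum_src" = "$sum_dst" ]; then
        rm -f "$file"                         # only remove the original after the sums match
    else
        echo "$file" >> "$failed"             # record anything that failed the hash check
    fi
done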
I'm trying to find the proper command to move a certain set of files according to a date/time range. I am thinking that the command should be something like:
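One hedged possibility uses GNU find's -newermt test; the timestamps and paths below are placeholders:
Code:
# Move files last modified between the two timestamps into /destination
find /source -type f -newermt '2011-01-01 00:00' ! -newermt '2011-02-01 00:00' -exec mv -t /destination {} +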
We recovered a large number of files from a HD I messed up. I am attempting to move large numbers of files of a given type, e.g. .txt or .jpg, into a folder by type to more easily sort through them.
Here are the commands I have mainly been trying with various edits:
Code:
Code:
So far the most common complaint I have gotten is "missing arguments to -execdir".
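That message usually means the -exec/-execdir action wasn't terminated with \; or +. A sketch of the per-type moves, with hypothetical source and destination folders:
Code:
mkdir -p /recovered/txt /recovered/jpg
find /recovered/raw -type f -iname '*.txt' -exec mv -t /recovered/txt {} +
find /recovered/raw -type f -iname '*.jpg' -exec mv -t /recovered/jpg {} +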
I have a problem with TFTP transfers of files greater than 32MB on Ubuntu. I haven't been able to find a fix for this issue. This has been a known issue for years and was corrected in the Windows world (augh!). When I did an apt-get install this morning, it said my tftp was up to date.
1. Every Sunday
2. Find all files older than 1 day
3. Gzip these files
4. Tar up the gzipped files into one tar file
5. Name the tarball with a date stamp indicating what day it was created, so we know that week's files are in the file
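A sketch of a script that cron could run every Sunday; the log directory and filename pattern are assumptions:
Code:
# crontab entry, 03:00 every Sunday:
# 0 3 * * 0  /usr/local/bin/weekly-archive.sh

#!/bin/sh
dir=/var/log/myapp                          # hypothetical directory to sweep
stamp=$(date +%Y-%m-%d)
cd "$dir" || exit 1
find . -maxdepth 1 -type f -name '*.log' -mtime +1 -exec gzip {} \;    # steps 2-3
tar -cf "archive-$stamp.tar" ./*.log.gz && rm -f ./*.log.gz            # steps 4-5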
Being relatively new both to Linux and this forum, I am sorry if I make a post that already exists, even though I couldn't find it. My problem is that I can't move downloaded files over to the root filesystem. I have downloaded and unpacked the files. To change aMSN's looks I downloaded a skin; I open root, go to usr ---> amsn ---> share ---> skins, and now I am supposed to copy the skin's folder over to that directory, but I can't. I also tried Alt+F2 and running sudo konqueror, as advised, but there was no difference.
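Copying into a system directory needs root privileges, so a terminal copy with sudo is the usual route. A sketch, assuming the skin was unpacked to ~/Downloads/MySkin; adjust the destination to wherever your aMSN skins folder actually lives:
Code:
sudo cp -r ~/Downloads/MySkin /usr/share/amsn/skins/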
Here's what I've got: from a crontab, run a script (I understand that part); this script needs to count the number of files in /outgoing/, subtract that count from 30, and move that many files over from /readycalls/. I need to keep the Asterisk outgoing queue full of .call files without having too many in there at any given time.
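A sketch of that script, using the paths from the post and a ceiling of 30 files:
Code:
#!/bin/sh
outgoing=/outgoing
ready=/readycalls
count=$(ls -1 "$outgoing" | wc -l)          # how many files are queued right now
need=$((30 - count))                        # how many more we may add
if [ "$need" -gt 0 ]; then
    ls -1 "$ready"/*.call 2>/dev/null | head -n "$need" |
    while read -r f; do
        mv "$f" "$outgoing"/
    done
fi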
I'm getting an error when I try to zip a large file on Linux because it is too large for zip to deal with. What commands can I use to get around this?
This is the error I'm getting:
zip error: Entry too big to split, read, or write (file exceeds Zip's 4GB uncompressed size limit)
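The 4GB ceiling belongs to the classic zip format; the usual workarounds are a different archiver or a zip build with Zip64 support. A couple of hedged options (the filenames are placeholders):
Code:
# tar + gzip has no 4GB limit
tar -czf bigfile.tar.gz bigfile
# p7zip handles large files as well, if it is installed
7z a bigfile.7z bigfile
Info-ZIP 3.0 and later can also write Zip64 archives that go past 4GB, so upgrading zip itself may be enough.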
We are running a Red Hat Enterprise Linux ES release 3 (Taroon Update 5), kernel 2.4.21-32.ELsmp, and have been for several years. The server hosts an old ERP system that will be replaced at the end of the year. However, it is necessary that some colleagues are able to write files to that server regularly. Since we are running Windows 7 on several machines, those users are no longer able to write to the Samba share. Getting files from the share works fine.
But the problem doesn't seem to lie with the Samba service, because transfers over SSH (WinSCP) from any Win7 system to the server don't work either. During testing we noticed that transferring files smaller than 1kB works fine, while any file greater than 1kB ends up in a connection abort. This happens with Samba and also over SSH. All the workarounds editing registry entries in Win7 to improve interoperability between Vista/Win7 and Samba don't work for us, and also don't seem to be the source of the problem. Is there a generally known incompatibility between our RHEL version/kernel and Windows 7 regarding file transfers?
I have a directory with a bunch of scripts and other random files. I want to move the files with the executable flag set to another folder en masse (or, if it's easier, the ones without the x flag). Is there an easy way?
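One hedged way, using GNU find's symbolic -perm test and mv -t; the ./scripts destination is a placeholder:
Code:
# Move regular files with the user-executable bit set (current directory only)
find . -maxdepth 1 -type f -perm -u+x -exec mv -t ./scripts {} +
# For the opposite set, negate the permission test:
# find . -maxdepth 1 -type f ! -perm -u+x -exec mv -t ./other {} +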
In my bash script I need to move the files in a folder, but only those that are not in use.
Code:
for entry in `ls /root/shared_storage/input`; do
    echo $entry
    run=`lsof /root/shared_storage/input/$entry`
    ru=${run:0:5}
    echo $entry
    if [ "$ru" == "" ]; then
        ........
It worked fine sometimes, but sometimes it just gets stuck at lsof. Is there any other way I can check whether $entry is being used by some other process?
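An alternative is fuser from psmisc, which simply sets its exit status when a file is open, so there is no output to capture. A sketch with a hypothetical destination folder:
Code:
for entry in /root/shared_storage/input/*; do
    if ! fuser -s "$entry"; then            # exit status 0 means some process has it open
        mv "$entry" /root/shared_storage/processed/
    fi
done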
I have a PHP script in the cron directory that generates 5 text files. After the files are generated, I want to create a script that will move the 5 text files to another folder named "web".
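If the filenames are predictable, a one-line move is enough; both paths below are assumptions:
Code:
mv /path/to/cron/output/*.txt /path/to/web/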
So I was wondering: if I capture this output into a file (i.e. one filename per line), can anyone help me write a command that iterates through the file and moves the files one by one to a specified directory?
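A simple read loop does this; filelist.txt and the destination are placeholders:
Code:
while IFS= read -r file; do
    mv "$file" /path/to/destination/
done < filelist.txt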
I have this script that I use to find log files in the /var/log directory that are 2 days old, move them to /var/log/tmp, rename them to the system date.filename, and move them back to /var/log. Everything seems to work as planned, except that the files don't get moved out of tmp, and they keep getting renamed. This leads to very long filenames such as:
What is it about this script that isn't moving it back to /var/log? Also, is there a better way of doing this than what I'm doing? Basically, I'm just trying to set up an audit trail on some of the files in /var/log, so that at the end of the month I can tar them, and then have our syslog server pick up the one giant monthly log.
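Without the script itself it's hard to say exactly where the move back fails, but one simpler shape for the same audit trail is to rename in place and skip files that already carry a date stamp. A sketch, assuming undated log names don't start with a digit:
Code:
#!/bin/sh
stamp=$(date +%Y%m%d)
find /var/log -maxdepth 1 -type f -mtime +2 ! -name '[0-9]*' |
while read -r f; do
    mv "$f" "/var/log/$stamp.$(basename "$f")"    # no round-trip through /var/log/tmp
done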
I have Ubuntu 10.10 installed on my laptop. I recently installed the KDE desktop from the Software Center. Today, I noticed something strange. I tried to move a file to the trash when I got this error message: "The trash has reached its maximum size! Cleanup the trash manually." I don't have any files in the trash. I went back to Gnome and was able to delete the file. I opened up Dolphin while still in Gnome and couldn't delete anything, so I know that this isn't a KDE problem.
I use AVG Free as an antivirus scanner. I looked for some time for a scanner that can scan and remove viruses. The problem is AVG 8.5 will only detect viruses, not remove them... I use a one-liner looking like this:
I have to move files between two file systems, /inst and /inst2. When I perform 'cp -a /inst /inst2' it copies everything, even hidden files, and preserves access permissions. But when I perform 'mv /inst /inst2' it also preserves access perms but moves everything besides hidden files. Questions: why is that so? What tool should be used when moving file systems from one fs to another (rsync)?
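rsync is the usual answer for copying a tree between filesystems while keeping permissions, ownership, timestamps and hidden files; hard links, ACLs and extended attributes need extra flags:
Code:
rsync -aHAX /inst/ /inst2/
The trailing slashes copy the contents of /inst into /inst2 rather than nesting /inst inside it.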
I am trying to do a find/grep/wc command to find matching files, print the filename and then the word count of a specific pattern per file. Here is my best (non-working) attempt so far:
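The attempt isn't shown, but one hedged way to print each file followed by how often a pattern occurs in it (grep -c would count matching lines, so grep -o piped to wc -l is used instead; the *.log glob and PATTERN are placeholders):
Code:
find . -type f -name '*.log' -exec sh -c \
    'printf "%s %s\n" "$1" "$(grep -o "PATTERN" "$1" | wc -l)"' _ {} \;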
Is there a way to specify to find that I only want text files (and not binary files)? Grep has an option to exclude binary files, so I thought find probably has a similar feature, but I've been unable to find it.
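find has no built-in text/binary test, but it can hand each candidate to file(1) and keep only those reported with a text MIME type:
Code:
find . -type f -exec sh -c 'file -b --mime-type "$1" | grep -q "^text/"' _ {} \; -print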
The system currently has a directory with all the invalid files. How bad is it to move a single file into a directory that already contains 3 million files?
I recently had data recovered and it was sent back to me on what I think is an NTFS drive. I copied all the files over to a file share I have on a Linux box, that's ext4. Now I have that share mounted on my OSX machine, and I can't move or rename most of the files. However, in a couple cases I was able to rename a folder after the third try. Another time I was able to rename a folder once, but not again. All the permissions are showing up the same on the command-line -- I can't see any differences between the permissions on any of the files/folders. Note that I can create new folders and add files no problem, and then rename and move those all I want.
I'm totally new to Linux; in fact, I'm a Windows admin. Scenario: I need to run a script that will automatically move files that are 30 days old from one particular folder to another particular folder.
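A sketch using GNU find's -mtime test; both paths are placeholders:
Code:
# Move files older than 30 days from the source folder into the archive folder
find /path/to/source -maxdepth 1 -type f -mtime +30 -exec mv -t /path/to/archive {} +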
I'd like to move a selection of files from all the sub-directories within an overall directory to a single destination. I don't want any of the directory structure, just the files themselves. This is what I tried so far:
mv /dir1/*/igs*.sp3.Z /dir2
There are other .sp3.Z files in the * directories within /dir1, but I just need the ones that start with igs.
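The glob in that mv only reaches one level of subdirectories; find descends the whole tree and can do the same move regardless of depth:
Code:
find /dir1 -type f -name 'igs*.sp3.Z' -exec mv -t /dir2 {} +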
I'm trying to move font files (.ttf and .otf) from the download folder to a folder Inkscape can find them in. I tried dragging and dropping them in Dolphin, but I don't have permission! So I tried in the terminal:
Code:
~$ mv ~/downloads/fonts/*.*tf /usr/share/fonts
mv: cannot stat `/home/bryan/downloads/fonts/*.*tf': No such file or directory
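"cannot stat" means the glob matched nothing, which usually points at the wrong path (on many setups the folder is ~/Downloads with a capital D); writing into /usr/share/fonts also needs root. A hedged retry:
Code:
sudo mv ~/Downloads/fonts/*.{ttf,otf} /usr/share/fonts/
sudo fc-cache -f      # refresh the font cache so applications pick up the new fonts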