General :: Automate Archiving Of Log Files Using Tar?
Mar 29, 2011
I need to tar these logs, but I don't know how to make it simpler for myself. Every day the same five log files are created. At the end of the month I need to make five tar files for every day from these files.
Until now I have tarred them manually (copied every file).
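A minimal sketch of how the end-of-month run might look, assuming the five logs live under /var/log/myapp and carry the date in their names (both of those are assumptions to adapt):
Code:
#!/bin/bash
# hypothetical monthly archiver: one tarball per day holding that day's logs
logdir=/var/log/myapp            # assumed location of the five daily logs
month=$(date +%Y-%m)             # archive the current month
for day in $(seq -w 1 31); do
    files=( "$logdir"/*"$month-$day"*.log )   # assumed naming: app.YYYY-MM-DD.log
    [ -e "${files[0]}" ] || continue          # skip days with no logs
    tar -czf "$logdir/logs-$month-$day.tar.gz" "${files[@]}"
done
Adjust the glob to however the five files are actually named; the loop structure is the point. If you need one tarball per log instead, add an inner loop over the files.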
Jun 4, 2011
I have a script which periodically backs up a directory using the command "tar -czvf [name] [directory]", but my problem is that the script has recently been putting a lot of stress on the server (Minecraft SMP) and tends to lag players as it backs up, which lately has been taking nearly 5 minutes. So I need to know: is there a way to control the gzip compression level at the same time that it archives and backs up the files? I understand that I could first tar the files and then gzip them separately at a different compression level afterwards, but this would not work because the script names the files with the current server time, which sometimes changes between the two commands.
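One approach (a sketch, not tested against this particular setup) is to take the timestamp once, then let tar stream to gzip with an explicit compression level, so archiving and compressing remain a single step:
Code:
#!/bin/bash
# grab the name first, then archive and compress in one pipeline
name="backup-$(date +%Y%m%d-%H%M%S).tar.gz"   # hypothetical naming scheme
tar -cf - /path/to/world | gzip -1 > "$name"  # -1 = fastest, lightest compression
GNU gzip also honours a GZIP environment variable for default options, and prefixing the job with nice/ionice can further soften the impact on players.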
May 17, 2011
I am using Red Hat Linux 2.4. I have three folders, dir1, dir2 and dir3, and I have tarred them like this:
1. tar cvfz tarball_1.tgz dir1 dir2 dir3
2. tar cvfz tarball_2.tgz dir1 dir2 dir3 2> /dev/null (so that it does not display any error message or operation details to the user)
[usr@machine]$ ls -lrt
-rw-r--r-- 1 usr grp 199843988 May 17 13:39 tarball_1.tgz
-rw-r--r-- 1 usr grp 199837488 May 17 13:53 tarball_2.tgz
Can anyone explain the size difference seen in the ls output?
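Two common causes come to mind: the directory contents may have changed in the 14 minutes between the two runs, and gzip records a timestamp (and optionally a name) in its header, so even identical input can yield slightly different archives. One way to check whether the underlying tar streams match (a sketch, assuming bash):
Code:
cmp <(zcat tarball_1.tgz) <(zcat tarball_2.tgz) && echo "contents identical"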
Feb 28, 2011
I need a script that accepts two parameters inputDir and outputDir.
This script should copy all the log files in the inputDir to a folder like <BackupLogs-currentDaysDate>
The new folder with the log files should be tarred and gzipped <BackupLogs-currentDaysDate>.tgz
And this new <BackupLogs-currentDaysDate>.tgz file should be copied to the outputDir.
Also all the log files in the inputDir should be deleted.
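A minimal sketch of such a script, assuming the logs match *.log and that the delete should only happen after a successful copy (both assumptions):
Code:
#!/bin/bash
# usage: backup_logs.sh <inputDir> <outputDir>
inputDir="$1"; outputDir="$2"
workdir="BackupLogs-$(date +%Y-%m-%d)"

mkdir -p "$workdir"
cp "$inputDir"/*.log "$workdir"/        # 1. copy the logs
tar -czf "$workdir.tgz" "$workdir"      # 2. tar and gzip the folder
cp "$workdir.tgz" "$outputDir"/ &&      # 3. ship the archive
    rm -f "$inputDir"/*.log             # 4. delete originals only on success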
Sep 30, 2009
Description: I am a newly appointed system engineer taking care of Linux servers. We have a new set of data coming in which needs the configuration below. How do I write a script with a function that does this?
For files with ".txt" in sm:
copy each of the files to folders sm1 and sm2 (log every copy)
if successful:
remove the original
log it to the log file
if not successful (copying one particular file to all the folders):
retain the file and retry
log it to the log file
mail the admin with that particular file name
I have already tried a bit:
cd /export/home/
for dir in sm1 sm2; do
cp -p sm/*.txt $dir/
done
Is my start right? How do I do the rest?
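The start looks right. A sketch of one way to fill in the rest, with the log path and admin address clearly assumed:
Code:
#!/bin/bash
# copy each .txt to sm1 and sm2, log everything, keep failures for retry, mail the admin
cd /export/home/ || exit 1
log=/export/home/copy.log        # assumed log location
admin=admin@example.com          # assumed address

for f in sm/*.txt; do
    ok=true
    for dir in sm1 sm2; do
        if cp -p "$f" "$dir"/; then
            echo "$(date) copied $f -> $dir" >> "$log"
        else
            echo "$(date) FAILED $f -> $dir" >> "$log"
            ok=false
        fi
    done
    if $ok; then
        rm -f "$f"               # remove the original only if every copy worked
    else                         # retained file gets retried on the next run
        echo "copy failed for $f" | mail -s "copy failure: $f" "$admin"
    fi
done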
Mar 4, 2011
I'm trying to zip or rar more than 100,000 files into a single file so that I can upload it to my server much faster than FTP could move them one at a time. In total they're only 4 GB, but because of the number of files, Nautilus freezes just opening the folder they're in. They're all .jpgs in the same folder, and I've tried a few commands but I keep getting error messages.
Does anyone have a command that will archive all the files from a folder into a single zip (or rar, tar, etc.)? I can't just archive the folder, because then I would have to move all the files out of that folder, and just opening the folder to move them would crash it; and I don't have SSH access to that server.
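Since tar is driven entirely from the command line, nothing ever has to open the folder in a file manager. A sketch (archive name and path are placeholders):
Code:
tar -czf photos.tar.gz -C /path/to/folder .
The -C flag makes tar change into the directory itself, so the shell never has to expand 100,000 file names either.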
Oct 30, 2010
I'm trying to automate the transfer and processing of files between two systems to help test and compare a new server installation. The workflow is a bit complex but I'm basically modifying a script on server 'A' to push a file to server 'B' as standard input to another script.
[Code]...
But no luck. I've tried it without the port in the server_args parameter and without the '-l' option; I've also tried setting the server parameter to 'tcpd' and moving the call to '/bin/nc' into server_args. Still no success. Can anyone point out what I'm doing wrong with the config? P.S. I've restarted xinetd, and server B is listening on port 1112 and accepting connections, but nothing gets piped into the script on server B.
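Since the original config was elided above, here is a generic xinetd entry of the kind described, with the script path as a placeholder. The usual gotcha: when server= points at the script directly, no nc is needed on the receiving side at all, because xinetd attaches the socket straight to the script's stdin/stdout:
Code:
# hypothetical /etc/xinetd.d/pipetest on server B
service pipetest
{
    type         = UNLISTED
    port         = 1112
    socket_type  = stream
    protocol     = tcp
    wait         = no
    user         = someuser
    server       = /usr/local/bin/receive.sh   # reads the pushed file on stdin
    disable      = no
}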
Jul 28, 2010
Here is what I currently do and want to automate:
1) I manually enter a particular web site address in the browser.
2) When the page displays on my machine, it shows a number of links I need to visit, one after the other.
3) Each of these links display another page (file) from which I "cut and paste" information. I do this by highlighting manually the wanted info, click "copy" then select an open file on my computer, select "undo" if necessary to remove any previous content, click "paste" and then "save".
4) I then call a Yabasic program that reads the saved file and trims unwanted info.
5) At the completion of the Yabasic program, I click the web page tab, click the "back" button to return to the first page (since I am on the second) and click the next link on this first page, until all links have been visited.
6) visit the next known web site and repeat 1 to 5
In an automated program, what I need to do is:
1) Visit the known page of the web site showing the links
2) save the page showing the links (the first page)
3) make a list of those links
4) visit each link one after the other and save its page from which I will programmatically (Yabasic) select info.
5) repeat 1 to 4
I can do this in Yabasic (which can issue Linux commands) or in PHP, although I do not know PHP well. The purpose of all this is to associate towns and cities of the world with their respective political/geographical divisions and their respective time zones. This has to be done often because that data changes regularly.
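The fetch-and-list part can be done with standard tools before handing off to the Yabasic trimmer. A rough sketch (URL and file names are placeholders; relative links would need the site prefix prepended):
Code:
#!/bin/bash
# save the index page, extract its links, then save each linked page
site="http://www.example.com/index.html"
wget -qO index.html "$site"
grep -o 'href="[^"]*"' index.html | sed 's/^href="//; s/"$//' > links.txt
n=0
while read -r url; do
    n=$((n+1))
    wget -qO "page$n.html" "$url"
done < links.txt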
Dec 14, 2010
I'm relatively new to Linux in general but have learned to do the basics with the CLI.
Well my main problem is writing my first "real" script in VIM. I just have no idea where to start. I was hoping you guys could point me in the right direction.
Well this is what the script needs to do.
"As the IT administrator for a large manufacturing company you have been tasked with producing a script for archiving the daily log files. Each week the daily log files are placed in a directory named weekXX, where XX is the number of the week. Each week directory should be archived and compressed and stored in a folder called log_directories. When the script has completed the task it should display on the monitor exactly what files and directories it has archived.
The script should be started by a user with the required week numbers added as arguments (e.g. prog 13 14 15 should start the program and archive the daily log files in week13, week14 and week15).
A basic manual is required, showing how to start the program, the hardware and / or software requirements, a hard copy of the script and a brief description of the test strategy and test data."
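A starting-point sketch that matches the brief (directory layout assumed exactly as described):
Code:
#!/bin/bash
# usage: prog <week numbers...>   e.g.  prog 13 14 15
mkdir -p log_directories
for wk in "$@"; do
    dir="week$wk"
    if [ -d "$dir" ]; then
        tar -czvf "log_directories/$dir.tgz" "$dir"   # -v prints exactly what was archived
    else
        echo "no such directory: $dir" >&2
    fi
done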
Aug 14, 2011
I would like to back up important files (totaling about 400 GB) on my ext4 RAID 5 array to an ext4 external hard drive over USB (the external drive is mounted at /mnt). In the future I'd like to automate the process using rsync and cron, so for now I'm using rsync to transfer the files. My problem is that when using the rsync command like this: # rsync -Pr "/dir1" "/dir2" "/dir3" "/dir4" /mnt
rsync shows me the checks and transfers for a while and then throws up an I/O error (wish I had a screenshot to show, but I don't). When I ls /mnt I get a similar I/O error. I then check /dev for the drive and find that it no longer shows up. Originally the partition was /dev/sdc1. I tried unplugging the USB at that point, plugging it back in and mounting the drive back to /mnt; however, it was then assigned to (you guessed it) /dev/sdd1. I got the drive mounted and tried the original rsync command again, hoping the first error was a fluke or some kind of one-time drive fart. This time it made it quite a bit further and then threw up the exact same problem. Am I doing something terribly wrong here? As I said, I'm very new to bash, so I hope I'm not making some absolutely moronic newbie mistake.
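For what it's worth, the symptoms (the device node vanishing, then reappearing as /dev/sdd1) point at the drive or the USB link rather than the rsync command. Once that is stable, the eventual automation is a one-line crontab entry, sketched here with assumed paths and schedule:
Code:
# hypothetical crontab entry: nightly run at 2am
0 2 * * * rsync -a /dir1 /dir2 /dir3 /dir4 /mnt >> /var/log/usb-backup.log 2>&1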
Jun 23, 2010
Which open source "file and email archiving" packages for both Linux and Windows are equivalent to Symantec Enterprise Vault?
Feb 24, 2010
I am a little confused about the distinction between backing up, compressing and archiving data. Help me figure out how each of these is useful.
Apr 30, 2011
I have a bunch of .7z files in a directory, and I need to put each one of them into a separate directory named after the file (without the extension). The command line I use:
Code:
find . -type f | mkdir `sed -e "s:..(.*)...:1:"` ; ls | grep .7z | cp * `sed -e "s:(.*)...:./1/:"`
Copying fails though:
[Code]....
PS. I don't want to use scripts; I want to do it with simple commands and piping.
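A single loop typed at the prompt does this with no script file (a sketch, assuming the .7z files are in the current directory):
Code:
for f in *.7z; do d="${f%.7z}"; mkdir -p "$d" && cp -v "$f" "$d"/; done
Swap cp for mv if the originals should move rather than stay behind.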
Aug 4, 2010
I started unarchiving a RAR file that is several gigabytes big. The computer is now going really slow; it's almost frozen. Sometimes I can move the mouse a little, but that's it. The unarchiving process seems to have halted, so now all I can do is restart the system. I don't think I can unarchive this file in Linux.
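For what it's worth, a large extraction can be kept from starving the desktop by running it from a terminal at low CPU and I/O priority (a sketch; ionice's idle class needs the CFQ scheduler):
Code:
nice -n 19 ionice -c3 unrar x bigfile.rar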
Oct 6, 2010
I need to back up a folder on a remote machine to my local box; the remote HD does not have enough space to archive it, and neither does my local box. I know there's a cantrip to pipe scp through gzip (or similar), but I don't remember the syntax.
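The usual trick uses ssh rather than scp, compressing on the remote side so the uncompressed archive never lands on either disk (a sketch with placeholder names):
Code:
ssh user@remotehost 'tar -czf - /path/to/folder' > folder-backup.tar.gz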
Jun 22, 2011
I sometimes get confused by the varying command line options I need to run common Unix archiving and compression software (e.g. gzip, bzip2, zip, tar).
Is there a program out there that can just Do What I Mean for common cases? For example, a single verb that extracts any archive regardless of its type.
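One well-known workaround is a small shell wrapper that dispatches on the extension (a sketch, not a packaged tool):
Code:
# hypothetical "extract" function for ~/.bashrc
extract() {
    case "$1" in
        *.tar.gz|*.tgz)   tar -xzf "$1" ;;
        *.tar.bz2|*.tbz2) tar -xjf "$1" ;;
        *.tar)            tar -xf  "$1" ;;
        *.gz)             gunzip   "$1" ;;
        *.bz2)            bunzip2  "$1" ;;
        *.zip)            unzip    "$1" ;;
        *) echo "don't know how to extract '$1'" >&2; return 1 ;;
    esac
}
Packaged tools along these lines exist too; atool's aunpack, for instance, if memory serves.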
Jun 7, 2011
I currently have a setup which allows me to connect to all computers on my home network via SSH and RSA keys. I'm very security-conscious, so all of my keys are passphrase protected. I'd like to essentially set something up where I'm running Unison on a cron job to back up to a file server on my network, which we'll call timmy. I've noticed that the first time I try to use a key on my Ubuntu laptop teeks, I get a dialog which pops up asking me to type in my key passphrase. I've heard that for servers needing to make automated backups like this that one should use ssh-agent to ask for the key passphrase on login/server start. How can I set this up on teeks?
I'd essentially like the following to happen: when I boot and come into the OS, prompt visually for the passphrase, as is done when I first use a key; and if I SSH into this computer (as it's internet-facing) and haven't provided the SSH passphrase yet, prompt for it then. (Sometimes I might need to remotely reboot the machine over SSH, so I'll be SSH'ing into it after it reboots, and I'd like to be able to authenticate the key without having to VNC in and do it manually.)
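The keychain wrapper around ssh-agent is one common way to get exactly this: it prompts once after boot, then lets later logins (including SSH ones) reuse the running agent. A sketch of the usual profile hook, assuming the key is ~/.ssh/id_rsa:
Code:
# in ~/.bashrc (or ~/.bash_profile) on teeks
eval "$(keychain --eval --quiet ~/.ssh/id_rsa)"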
Feb 10, 2011
I'd like information about automating tasks in Linux, more specifically in Fedora with a minimal installation (no graphical interface and so on). An application (developed in C) needs to be started automatically when the computer boots. I've read about cron, but I guess it's not the solution in this case, because the application must run once whenever the computer starts. I have also read about the /etc/rc.local file, but I made some test changes to it and they didn't work: the computer starts and asks me for login and password, but nothing runs after that. Inittab gave the same result. So, does anyone know how to launch an application developed in C when the computer starts?
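For reference, a typical rc.local of this kind looks like the sketch below (the binary path is a placeholder). The classic pitfalls on Fedora are a missing executable bit on /etc/rc.d/rc.local and a foreground command that blocks the rest of the boot:
Code:
#!/bin/sh
# /etc/rc.d/rc.local -- runs once at the end of multi-user boot
/usr/local/bin/myapp &    # hypothetical path; & keeps it from blocking the boot
exit 0
Also check that the file is executable: chmod +x /etc/rc.d/rc.local.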
May 24, 2011
I would like to automate a definition-lookup process. I have 11,581 words to define. :|
I would like to read each word, fetch its definition(s) and append them next to or below the word, or save the definitions to a file.
The word list has one word per line.
Python script?
Bash?
Perl?
It's an interesting project, but I can't seem to figure it out for lack of programming skills.
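Since the post leaves the source of definitions open, here is a bash sketch using the dict client (it assumes a reachable dictd server and a wordlist.txt with one word per line):
Code:
#!/bin/bash
# look up every word and collect the results in one file
while read -r word; do
    echo "== $word =="
    dict "$word" 2>/dev/null || echo "(no definition found)"
done < wordlist.txt > definitions.txt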
Feb 25, 2010
I have a RHEL machine in a production environment where we get the usual alerts that logs are filling up disk space. We archive those logs, and if they grow to the maximum we delete the old ones. Any idea how we can automate the process?
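This is exactly what logrotate is for; a sketch of a drop-in rule (path and retention count are assumptions to adapt):
Code:
# hypothetical /etc/logrotate.d/myapp
/var/log/myapp/*.log {
    weekly
    rotate 4          # keep four archived generations, delete older ones
    compress
    missingok
    notifempty
}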
Oct 16, 2010
I need to source my /home/me/.bashrc file every time I "su -" to root. Is there any way to automate this? I cannot edit anything in root's environment, as it is shared by other people.
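One workaround that leaves root's shared environment untouched is a personal alias that starts an interactive bash with your own rcfile after the switch (an untested sketch):
Code:
# in /home/me/.bashrc
alias suroot='su - root -c "bash --rcfile /home/me/.bashrc -i"'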
Mar 11, 2010
I need to FTP some files nightly from my Linux box to an arbitrary FTP server not controlled by me.
The FTP server admin has granted me an account for the purpose, but does not wish me to store the plain username or password in any script files, for security reasons. How can I do that?
The wrong way would be:
Code:
$ cat my_script
open server_address
user plane_ftp_username plane_ftp_password
put a.txt
[Code].....
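A common compromise is ~/.netrc: the credentials still live on disk, but outside any script, readable only by their owner, and picked up by the ftp client automatically (placeholder values below):
Code:
# ~/.netrc  (then: chmod 600 ~/.netrc)
machine ftp.example.com
login myuser
password mypass
The nightly script then just runs ftp against the server and issues its put commands, without ever containing the credentials itself.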
Apr 19, 2010
I have a heavily used file server that I want to restart, and then, if any volume requires an e2fsck, to run those checks after it restarts. The only problem is that the server is rarely rebooted, and I've been told it might kernel panic because it's been up so long. I've heard there's a way to get past a kernel panic if it does happen, but I'm not sure how to do that, or the rest of it. If this were a Windows server, I would schedule a shutdown with the force switch and have the chkdsks already scheduled for each volume on reboot; but for RHEL I really don't know. I'm hoping this can be done so that I can have it kick off at, say, 7am; then when I get in at 8am it will probably be near the end of the e2fscks and I can see what's going on.
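On a sysvinit-era RHEL box the rough equivalent is the /forcefsck flag file plus a scheduled shutdown (a sketch; it does not by itself guard against the feared kernel panic, so test on something less critical first):
Code:
touch /forcefsck          # ask init to fsck the volumes on the next boot
shutdown -r 07:00         # schedule the reboot for 7am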
Dec 31, 2010
I've been using Windows Home Server as a backup appliance, media server and share server at home for some time. I decided it was costing me a lot of juice, so very early on I added the "Lights Out" add-on to ensure it was only running as and when needed.
I'm now looking to switch to a Linux based server and I'm looking for a similar tool/set of tools for advanced power management.
Now the question;
Anyone got any all-in-one suggestions (i.e with client parts for both Windows and Linux and a server part for the Linux server), or can anyone simply verify that I'll need to set-up all the individual bits for this myself separately?
Feb 9, 2011
I want to automate the "sudo su - user" command from a script. It then asks for a password; how do I automate this?
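The usual answer is not to feed the password in at all, but to grant that one transition passwordlessly in sudoers (a sketch; user names are placeholders and edits must go through visudo):
Code:
# hypothetical /etc/sudoers line (edit with visudo)
myuser ALL = (root) NOPASSWD: /bin/su - targetuser
The script can then run sudo su - targetuser without a prompt (sudo -u targetuser -i achieves the same switch, given a matching sudoers rule).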
Jul 30, 2010
I've inherited the following virtual machine scenario and am new to Linux administration and patch management. The host operating system is Windows 2003 Enterprise, which has VMware Server 2.0.2 installed. Under VMware Server 2.0.2 I have a 32-bit Ubuntu (9.10) web server running Apache2. When I log onto the Ubuntu server, I see the following two lines just above the new mail/last logon lines:
85 packages can be updated
55 updates are security updates
I would like to see at least a summary of each update and its urgency so I can notify the various developers/server owners to get their input regarding whether we should or should not apply that particular update to the server. We apply the patches in our test/dev environment first then once vetted there we roll them out to our production servers. What I am looking for is a way to automate the gathering of the information and once approval has been received automating the actual patching process so that I do not have to manually perform the apt-get process for each separate package needed/approved.
Ideally I would like a recommendation for a GUI based package to manage this process and that is capable of generating the appropriate reports for the 'powers that be' regarding the current security/patch management environment. For proof of concept I would like a free version that is not hamstrung in functionality but is not too costly to procure the production version with no limitations.
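For the information-gathering half, apt can already produce a dry-run report listing every pending package and version, which can be mailed around for approval (a sketch):
Code:
apt-get update -qq
apt-get -s upgrade | grep '^Inst' > pending-updates.txt   # -s = simulate, apply nothing
For the reporting/GUI side, tools such as apticron (mailed summaries) and Canonical's Landscape may be worth evaluating against the requirements.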
Jul 22, 2010
I'm not familiar enough with sed and awk to solve this problem on my own, so I'm calling on you for a bit of assistance. I'm writing a Nagios plugin to check our Oracle tablespaces, and the output is given on one line like this: 1.04007771 TEMP 0 UNDOTBS1 .005340579 USERS 0 7 rows selected. I've been playing around with sed like below to delete the obsolete info and change every second space into a newline:
[code]...
The problem is that I don't know how many tablespaces there are, so I'd have to check all the databases and 'hardcode' the tablespaces in my script. Is there any way to 'automate' this, knowing that 'rows selected' preceded by a number is always the last part, and using a sort of counter to auto-adjust the number in the -e 's/ /\n/2' part?
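Rather than counting substitutions, it may be simpler to strip the trailer and let xargs rebuild the pairs, which works for any number of tablespaces (a sketch against the sample line above):
Code:
out="1.04007771 TEMP 0 UNDOTBS1 .005340579 USERS 0 7 rows selected."
echo "$out" | sed 's/ *[0-9]* rows selected\.//' | xargs -n 2   # two tokens per line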
Mar 14, 2011
This is my sample code in the /etc/httpd/conf.d/applications.conf file. Currently we create every new subdomain manually. I want to automate this process through a bash script; how is that possible?
Code:
<VirtualHost *:80>
ServerName google.com
ServerAlias google.com
[code]....
In this code, the part I marked in bold is the only content that changes for each subdomain. While doing this manually, errors creep in most of the time; to avoid that, I need to automate the process.
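A sketch of a generator script; the domain, DocumentRoot and reload command are assumptions to adapt to the real template:
Code:
#!/bin/bash
# usage: addvhost.sh <subdomain>
sub="$1"
[ -n "$sub" ] || { echo "usage: $0 <subdomain>" >&2; exit 1; }

cat >> /etc/httpd/conf.d/applications.conf <<EOF

<VirtualHost *:80>
    ServerName  $sub.example.com
    ServerAlias $sub.example.com
    DocumentRoot /var/www/$sub
</VirtualHost>
EOF

apachectl configtest && service httpd reload   # validate before reloading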
Jul 22, 2010
I'm looking for an application that will give me some advanced tools for editing PDFs.
Here are the features I'm looking for:
Editing metadata (tagging with keywords)
Merge multiple PDFs
Rearrange page order
PDF bookmarking
Optical Character Recognition
I can give further clarification on these items if needed. My goal is to convert all of my paper files into digital files that I can store on my server. In order to effectively do this, I need the tools listed above.
Is there anything in the Linux world that will give me these PDF editing abilities?
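Not a full answer, but as one data point the pdftk command-line tool covers the merging and page-rearranging items (a sketch; OCR needs a separate package):
Code:
pdftk a.pdf b.pdf c.pdf cat output merged.pdf        # merge several PDFs
pdftk in.pdf cat 3 1-2 4-end output reordered.pdf    # move page 3 to the front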
Jan 31, 2011
I want to create retrievable archives of my old emails, say monthly, to avoid the old emails running me out of memory. If I use the 'Backup Settings' procedure as explained in Evolution Help, what happens when I wish to consult the /home/dbus/evolution-backup.tar.gz archive file?
1- Will it simply overwrite my current Evolution data? [in which case it's not what I need]
2- If not, how do I return the archive file to dead storage and resuscitate my current data?
3- If it will overwrite via the 'Restore Evolution' procedure given in Evolution Help, is there a workaround... perhaps by...
3a- renaming the archive file,
3b- or 'restoring' it in another version of Evolution,
3c- or archiving CURRENT data as a 2nd backup with a different name [eg: /home/dbus/evolution-backupJan11.tar.gz?] then restoring that?
4- Will I be able to retrieve successive archives if I rename them, say '/home/dbus/evolution-backup.tarDec10.gz' etc once Evolution's saved them?
Alternatively, the following came from a dead thread [from commonlyUNIQU3]... is it still valid? Does it avoid the problem of potentially running out of memory?
A. Make a subfolder to the "Inbox" under the "On This Computer" tree (I call mine "Archive")
B. Drag and drop the emails you want to archive into this folder.*
This will move the selected emails off your Exchange server/account and into this folder (and into local storage), unless you do a copy & paste instead of drag & drop. You may need to have the setting for downloading emails for offline access enabled for this to work as desired. If I recall, the new integrated backup feature creates a compressed (.zip) archive from which you can later restore the email (I haven't tried that just yet).
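On points 3a and 4: once Evolution has written the archive it is an ordinary file, so a date-stamped rename from the shell keeps each month retrievable and stops the default name from ever being overwritten (a sketch):
Code:
mv /home/dbus/evolution-backup.tar.gz "/home/dbus/evolution-backup-$(date +%b%y).tar.gz"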