General :: Backup Of The Access Log File Without Disturbing The Server Performance

May 20, 2010

I am administering a live web server and I want to keep a backup of the access log file without disturbing the server's performance. Can anyone guide me on how to do this? The size of the log file runs into gigabytes, so I will need to take a daily backup.
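
One low-impact approach is to copy the log at idle I/O priority so the copy never competes with the web server for the disk; a minimal sketch, assuming an Apache-style log path and a /backup directory (both hypothetical), run nightly from cron:

# copy the live log at idle I/O priority and lowest CPU priority, then compress
ionice -c3 nice -n19 cp /var/log/apache2/access.log /backup/access.log.$(date +%F)
gzip /backup/access.log.$(date +%F)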

View 7 Replies



CentOS 5 Server :: NFS Performance With Files Not Yet Cached By The Server File System?

Feb 25, 2009

I have a weird performance issue with a CentOS 5 machine running an NFS server and a RH8 client. I think the fact that it is a RH8 client should be downplayed; it is just that with the RH8 client the performance degradation seems clearer. See the test details below. The OS on the server is CentOS 5 x86_64, kernel 2.6.18-92.1.22.el5.

There is a 1 Gb/s connection between the machines, and the file used to test over NFS is a 1 GB file. First of all I wanted to measure how the network alone performs while using NFS, so on the server side I ran a "cat" of the 1 GB file to /dev/null. Please note that the disk read speed is about 98 MB/s. At this point the file system has the 1 GB file cached in memory. On the client side, a "cat" on the same file gives me a speed of about 113 MB/s. It seems then that the bottleneck in this instance is the network, and it is very close to nominal speed, so the network performance is really good. (BTW, I know that the server served that file from cache because vmstat and iostat show no disk activity.)

The second test is reading from disk with no caching involved. On the server I flushed the 1 GB file from memory, for instance by reading another 5 GB file, and I repeated the same thing as above on the client (a cat on the 1 GB file). Now the server has to go to disk (vmstat and iostat show the disk activity). However, the performance now is about 20 MB/s; I was expecting something closer to 90 MB/s, since the reading speed on the server in the first test showed 98 MB/s.

This second test was repeated for ext2, ext3, and xfs with no significant differences. A similar test using a RH8 NFS server and client gets me close to 60 MB/s for a 1 GB file not cached by the file system on the server. Since network speed and disk read speed are not the bottlenecks, what or where is the limiting factor?
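
For anyone reproducing the cold-read test, the kernel offers a cleaner way to flush the page cache than reading a large decoy file; a sketch of both server-side measurements, assuming the test file lives at /data/testfile (hypothetical path, run as root):

# warm read: file served from the page cache
dd if=/data/testfile of=/dev/null bs=1M
# drop the page cache (kernel 2.6.16 and later), then measure the cold read
sync; echo 3 > /proc/sys/vm/drop_caches
dd if=/data/testfile of=/dev/null bs=1M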

View 4 Replies View Related

General :: File Is An Automated Backup Script, Backup.sh?

Sep 13, 2010

Can someone give me a sample crontab for backing up a directory, please? The system is Ubuntu 9.04. Quote:

#!/bin/bash
# this file is an automated backup script, backup.sh.
# this backs up my domain site.

[code]....
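
A minimal sketch of the crontab side (edit with crontab -e); the script location is hypothetical, and the redirection keeps a simple run log:

# run backup.sh every night at 02:30
30 2 * * * /home/user/backup.sh >> /home/user/backup.log 2>&1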

View 7 Replies View Related

Server :: Sync File Server Data Into Backup Server Machine By Command- Rsync -avu?

Jun 21, 2011

I am trying to sync file server data to a backup server machine with the command rsync -avu path/of/data ipaddress-of-backup-server:/path/where/to/save. After running it, it asks for the root password, and done manually it is successful, but I want to make it automatic. For that I also tried a cron job and generated an authentication key, but I have not succeeded in logging in automatically. Does anybody know how to authenticate root so it can log in and store data on the backup server?
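
A sketch of the usual key-based setup, assuming root logins over SSH are permitted on the backup server (host name hypothetical):

# 1) on the file server, generate a key pair with an empty passphrase
ssh-keygen -t rsa
# 2) install the public key on the backup server
ssh-copy-id root@backup-server
# 3) once "ssh root@backup-server" logs in without a password, cron can run:
0 1 * * * rsync -avu /path/of/data root@backup-server:/path/where/to/save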

View 14 Replies View Related

Ubuntu Servers :: Slow Performance On File Server (LVM/ReiserFS/SoftwareRAID)?

Aug 18, 2011

It stores all my important stuff, as well as some music and movies. I use a second Linux box in my living room to "stream" content via an NFS or Samba share. The streaming tends to stop several times during playback and needs to fill up its buffer again before continuing to play. I also have some Windows XP and 7 based computers that connect to this file server. I have noticed that directory listing is VERY slow, and there is a huge lag when I want to save/read a file to/from my home directory.

This is my setup:
Ubuntu Server 10.10 64-bit (I have the same problem with 32-bit Ubuntu)
3 RAID5 arrays with 4 hard drives each
LVM on top of the 3 RAID5 arrays
A logical volume of about 6.5 TB using the ReiserFS file system

This LVM has grown over the years and has had some disks replaced, so I have used the pvmove and extend commands a bit. I have tried using iotop and top to check whether resources are running out, but that doesn't seem to be the problem. I haven't been able to find out why streaming over the network stops, but I know it is the server that causes the problem. Does ReiserFS have any performance problems with large logical volumes? Would changing to EXT4 or some other FS give any performance gain?
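
Two quick checks that might narrow this down; a sketch assuming the sysstat and nfs-common packages are installed (package names vary by release):

# per-device I/O while a stream stalls: one member disk pinned near 100% %util
# with a high await would point at the arrays rather than the file system
iostat -x 5
# NFS server counters, including retransmissions
nfsstat -s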

View 9 Replies View Related

General :: Access File On Different Server From Local?

Nov 19, 2010

I have a file 'my_file.txt' stored on 'myserver1.col.edu'. Now I am using a different server, 'myserver2.col.edu', to do some work, and I want to access 'my_file.txt' on 'myserver1.col.edu' to read (and possibly edit) it WITHOUT physically copying the entire file across. Is there a way to do this, perhaps through ssh?
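
One option is sshfs, which mounts a remote directory over SSH so files can be read and edited in place; a sketch assuming the sshfs package is installed (the mount point is arbitrary):

mkdir -p ~/remote1
sshfs username@myserver1.col.edu: ~/remote1    # mounts the remote home directory
vi ~/remote1/my_file.txt                       # read/edit without copying the whole file
fusermount -u ~/remote1                        # unmount when finished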

View 2 Replies View Related

General :: Remote Desktop Without Disturbing Local User Or Other Remotes?

Jul 13, 2011

I would like to know how I can have a remote desktop on a Linux box without any disturbance to the local user or to others who are logged in like me. I mean exactly like Remote Desktop on Windows 2003 or 2008, where every user who logs in remotely gets their own desktop without disturbing the others. And is it possible to do this from Fedora to Ubuntu and vice versa?
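
One common way to get independent per-user desktops is VNC, where each remote user starts a private server on their own display number; a sketch with arbitrary display and geometry values, which works between Fedora and Ubuntu since both ship VNC packages:

# on the Linux box, as the user who wants a remote desktop:
vncserver :1 -geometry 1280x800
# from the other machine:
vncviewer remotehost:1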

View 2 Replies View Related

Server :: Command To Backup An Image Or .iso File?

Mar 3, 2011

Is there a way/command to back up all data from a Red Hat Linux 4 server (including user profiles, data, group info, and encrypted passwords) either to a Red Hat Linux 5.4 machine, or as an image file or some other manageable resource?
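
A minimal sketch of one option: archive the whole system over SSH to the RHEL 5.4 machine, skipping pseudo-filesystems (host and target path hypothetical). The -p flag preserves permissions, and the user, group, and password data comes along automatically because it lives in /etc/passwd, /etc/group, and /etc/shadow:

tar czpf - --exclude=/proc --exclude=/sys --exclude=/mnt / \
  | ssh user@rhel54-host 'cat > /backup/rhel4-full.tgz'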

View 1 Replies View Related

Server :: Unable To Decompress TGZ Backup File

Jan 3, 2011

I have recently upgraded to Bugzilla 3 and I wanted to restore my Bugzilla database from my backup, but when I attempt tar -xvvzf file.tgz I get the error:
gzip: stdin: not in gzip format
tar: Child returned status 1
tar: Error exit delayed from previous errors

My script that creates the backup is:
#!/bin/sh
# archive each Bugzilla component as a gzipped .tgz, then bundle everything
datestr=`date +%m-%d-%Y`
bakdirpart="bugzilla.backup.$datestr"
bakdir="$HOME/$bakdirpart"
mkdir "$bakdir"

(cd /etc; tar cvzf $bakdir/mysql.conf.tgz mysql)
(cd /etc; tar cvzf $bakdir/apache2.conf.tgz apache2)
(cd /usr/share; tar cvzf $bakdir/bzreport.share.tgz bzreport)
(cd /usr/share; tar cvzf $bakdir/bugzilla.share.tgz bugzilla)
(cd /var/lib; tar cvzf $bakdir/mysql.hotdb.tgz mysql)
(cd /var; tar cvzf $bakdir/www.tgz www)
# note: the outer bundle is written with plain tar cvf, without the z (gzip) flag
(cd "$HOME"; tar cvf "${bakdir}.tar" "$bakdirpart")

Is there any way to recover my backup copy?
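
One thing worth checking, going by the script itself: the inner archives are gzipped (tar cvzf), but the outer bundle is written with plain tar cvf, so it is not in gzip format even if it later picked up a .tgz name. A sketch of inspecting and extracting under that assumption:

file file.tgz          # should report "POSIX tar archive" if it is plain tar
tar -xvvf file.tgz     # extract without the z flag
# the inner .tgz members then extract normally, for example:
tar -xvzf mysql.hotdb.tgz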

View 7 Replies View Related

General :: Boosting Application Performance By Editing Kernel File?

Jun 15, 2010

Which kernel file can be edited to boost application performance at boot?
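
There is no kernel source file to edit for this; runtime kernel tunables are normally set in /etc/sysctl.conf, which is applied at boot. A sketch with one illustrative (not prescriptive) tunable:

# /etc/sysctl.conf -- swap application memory out less eagerly
vm.swappiness = 10

It can also be applied immediately with sysctl -p instead of rebooting.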

View 2 Replies View Related

General :: Using The /proc File Systems To Increase Performance And Functionality?

Jul 5, 2010

I'm interested in knowing which entries under /proc could be altered to improve performance and functionality on my system, and which may be the best one to alter.
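
Entries under /proc/sys can be read and written like ordinary files, which is how most runtime tuning is done; a sketch with one illustrative tunable (the value is arbitrary):

cat /proc/sys/fs/file-max             # current system-wide open-file limit
echo 200000 > /proc/sys/fs/file-max   # raise it until the next boot (as root)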

View 2 Replies View Related

Ubuntu :: Auto Backup To Samba File Server

May 8, 2010

I just switched back to Ubuntu after running Windows for about six months again: new laptop, programs needed Windows, but either way I'm back. What I had set up in Windows was specific files that would automatically back up to my Samba file server when the network was detected. I'm looking to do the same in Ubuntu now. Basically I'm thinking of writing a script to back up the files; the only thing I'm stuck on is how to tell the script to run when I connect to the network at home. Is there software already designed for this?
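
One way to trigger a script on connection is a NetworkManager dispatcher hook, which runs whenever an interface changes state; a sketch assuming the Samba share is listed in /etc/fstab with a mount point of /mnt/backup (the script name and paths are hypothetical; the dispatcher directory is standard):

#!/bin/sh
# /etc/NetworkManager/dispatcher.d/90-home-backup (must be executable)
STATUS="$2"   # the dispatcher passes the interface as $1 and the action as $2
if [ "$STATUS" = "up" ]; then
    mount /mnt/backup 2>/dev/null
    rsync -av /home/user/Documents/ /mnt/backup/Documents/
fi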

View 1 Replies View Related

Ubuntu :: Set Up Home File And Backup Server From An Old Computer?

Dec 8, 2010

Through the Black Friday shuffle of getting new hardware, I now have a 500 GB external drive, a 1 TB external drive, and an old computer I want to set up as a home server. My family has a lot of photos that are currently stored on many different computers and are not backed up; I want 500 GB of space for photos, and for those photos to be backed up. That would leave the other half of the 1 TB drive for assorted things like personal backups and general file storage. I know how to set up Ubuntu Server Edition on the computer, but the options for how I can set up the storage are stumping me.

To recap, I have 1.5 TB of storage total, split 1 TB/500 GB. I want 500 GB to be used as central storage for the 10+ computers in my house (mostly running Windows), and that 500 GB would be automatically backed up. The 500 GB that's left would be used for non-critical files and wouldn't be backed up.

What is the best way of backing up the files? (script once a day that copies files? Some backup program?)
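
The script option can be as small as one rsync line in cron; a sketch with hypothetical paths (rsync only transfers changed files, so the daily run stays cheap):

# mirror the photo share to the backup drive at 03:00 every day
0 3 * * * rsync -av --delete /srv/photos/ /mnt/backup/photos/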

Would the 500 GB drive be best for backing up to (having the 1 TB be where people put the pictures), or the other way around? Does it really matter?

Any tips on the cleanest way to have this work with Windows, Linux, and Mac? How well do photo programs (Picasa, Shotwell, iPhoto) like a setup like this? Is it possible to have different programs on different machines all reference the same file system without their automatic sorting (into folders, usually by date) messing each other up?

View 1 Replies View Related

Software :: File Server, Backup Windows PC's Across The Network?

May 16, 2010

I have used Linux (SUSE, Ubuntu, Red Hat) at different times for different things. My newest goal is a file server. Here are the specs; I have already built the box and am just choosing the OS. Here is what it needs to be able to do, in order from most important to least important.

Specs:
CPU - AMD Phenom II X4 945
RAM - 4GB DDR3 1600MHz
Mobo - Asus M4A79XTD EVO
Video - ATI 4650
PSU - Corsair 650W Modular

[Code]...

View 1 Replies View Related

Server :: Rsync Error - Command To Backup Image File

Oct 6, 2010

When I use the rsync command to back up my image files, it shows the following error message:

bash: line 1: /usr/bin/rsync: Argument list too long
rsync: connection unexpectedly closed (0 bytes received so far) [receiver]
rsync error: remote command could not be run (code 126) at io.c(463) [receiver=2.6.8]

The command I used is: rsync -avrl -e ssh cms@server:/data/cms/data/images/* /mnt/Backup/Intranet_cms_backup/images
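
The remote glob images/* is expanded into one argument per file by the remote shell, and with enough files that overflows the kernel's argument-list limit before rsync even starts. A hedged fix is to pass the directory itself (the trailing slash copies its contents) and let rsync handle the recursion:

rsync -avrl -e ssh cms@server:/data/cms/data/images/ /mnt/Backup/Intranet_cms_backup/images/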

View 7 Replies View Related

General :: Does Scheduled Reboot Of Server Improve Performance?

Nov 12, 2009

I just want to know whether, if a Linux server is rebooted on a schedule (every 2-3 months), its performance improves, and if it improves, why.

View 1 Replies View Related

Debian :: Backup Purge Script - `/mnt/backup/subfolder': No Such File Or Directory

Nov 10, 2010

This script simply deletes files older than a certain age (in this case 7 days) from a certain location; I use it to purge old backups nightly, and it works as expected:

# delete backups older than 7 days
find /mnt/backup/* -mtime +7 -exec rm -Rf {} \;

The problem is, every morning I get an email with an error message something like this:

find: `/mnt/backup/subfolder': No such file or directory
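
The message appears because find lists a subdirectory for deletion, rm -Rf removes it, and find then tries to descend into the now-missing path. A hedged rewrite that only examines the top level, so nothing is deleted out from under the traversal:

find /mnt/backup/ -mindepth 1 -maxdepth 1 -mtime +7 -exec rm -Rf {} \;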

View 2 Replies View Related

Server :: Amanda Unable To Copy Backup File From Holding Disk To Tapes

May 14, 2010

I have an issue with my Amanda backup server, which is connected to a Quantum Scalar i500 via FC. I got the error below 3 days ago.
These dumps were to tape 000289.
*** A TAPE ERROR OCCURRED: [No more writable valid tape found].

Normally I load the proper tapes and run amflush to push stuff from the holding disk to the tapes manually. However, this time amflush did not help; Amanda immediately responded with an out-of-tape error again.

Meanwhile I got some errors from dmesg as well:
st3: Error 18 (sugg. bt 0x0, driver bt 0x0, host bt 0x0).
scsi1 (0,3,0) : reservation conflict

View 8 Replies View Related

CentOS 5 Server :: Estimate The Amount Of Time The Backup Will Take And The Size Of The Image File?

Jul 26, 2009

I have a CentOS 5 server with a 1 TB hard drive. There is only 80 GB of data on that huge drive, and now I want to make a bare-metal recovery backup using Acronis. My question is, how can I estimate the amount of time the backup will take and the size of the image file? Is it based on the size of my drive, or on the amount of data on the drive?
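
Imaging tools that understand the file system generally copy only used blocks, so both figures scale with the 80 GB of data rather than the 1 TB drive; a rough estimate under that assumption and a guessed 50 MB/s of sustained throughput:

image size ≈ 80 GB before compression (often much less after)
time ≈ 80,000 MB / 50 MB/s ≈ 1,600 s ≈ 27 minutes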

View 1 Replies View Related

Ubuntu :: Restore Backup Server - Where Is The Location Backup Snapshots Or Files Are Saving

May 10, 2011

I installed and tested the Restore EE backup server on a test PC with a basic configuration, and it's working fine.

[URL]

The issue I have is: where are these backup snapshots or files being saved? I want to add separate storage to hold the backups.

View 1 Replies View Related

Server :: In Apache Server, Change Log File Location And Log Format For Access Log File?

Aug 19, 2009

I installed the Apache server with Debian 5.0.2 Lenny. I am trying to write a script that analyzes web log files. I found the log files in /var/log/apache2; there is an access log file, `access.log`. My question is: which configuration file determines the location and the name of the access log file, and how can I change them? I used CustomLog in /etc/apache2/apache2.conf like below:

LogFormat "%h %l %u %t \"%r\" %>s %b" common
CustomLog /home/test/my_log_file common

Apache2 created /home/test/my_log_file, but no logs were written to the file even after I ran `/etc/init.d/apache2 restart`. I changed the log file location; it still didn't work. However, Apache2 still wrote logs to `/var/log/apache2/access.log`.
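
A hedged guess at the cause: on Debian the default virtual host (in /etc/apache2/sites-available/default) defines its own CustomLog pointing at /var/log/apache2/access.log, and when a virtual host carries its own CustomLog, Apache uses it instead of one defined globally in apache2.conf. A sketch of moving the directives into that vhost:

<VirtualHost *:80>
    # ... existing directives ...
    LogFormat "%h %l %u %t \"%r\" %>s %b" common
    CustomLog /home/test/my_log_file common
</VirtualHost>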

View 1 Replies View Related

Software :: Backup A Few Servers And A Bunch Of Desktops Onto One Backup Server?

May 10, 2010

Does anyone know of any decent enterprise-level backup solutions for Linux? I need to back up a few servers and a bunch of desktops onto one backup server. Using rsync/tar.gz won't cut it; I need bi-monthly full HDD backups and things like that, with a nice GUI interface to add/remove systems from the backup list. I basically need something similar to CommVault or Veritas. Veritas I've used before, but it has its issues, such as leaving 30 GB cache files. CommVault, I have no idea how much it costs, and whether it supports backing up to a hard drive rather than tape.

View 7 Replies View Related

General :: Tar Backup File Keeps Growing?

Jan 7, 2010

I am new to Linux. I am using tar to back up my emails to a server. I would like to automate this process to routinely back up my emails periodically; however, I keep running into a problem. I start in the directory in which I would like to create the tar file (directory size = 240 MB) and enter the following command:

tar cf bup.mail.llc.tar "/Users/d/Library/Mail/INBOX.mbox/Messages"
(file size = 234 MB)

When I want to back up my emails into the previously created tar file, I use the following command:

tar uf bup.mail.llc.tar "/Users/d/Library/Mail/INBOX.mbox/Messages"
(file size = 462 MB)

The update command works, except that the size of the tar file grows to around twice the original size. When I extract the updated tar file (462 MB), the unarchived tree is 240 MB, the same size as the original directory.

Why does the size of the tar keep growing each time I perform 'tar uf'? I don't understand this.
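
A likely explanation, hedged since it depends on the messages changing between runs: tar -u never rewrites an archive in place. It appends a fresh copy of any file whose modification time is newer than the copy already stored, so the archive accumulates superseded copies and grows on every run, while extraction keeps only the last copy of each path, which is why the restored tree is back to 240 MB. A quick check:

# list paths that appear more than once in the archive
tar tf bup.mail.llc.tar | sort | uniq -d | head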

View 6 Replies View Related

Server :: Access Ubuntu File Server If It's Connected To A Router?

Feb 13, 2011

I want to access files on my Ubuntu server wirelessly. Is there a way I can do that? I'm sorry if this is a stupid question, but I'm kind of new to this whole server thing.
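
If the server runs OpenSSH, any machine on the same wireless network can reach it by its LAN address; a sketch with a hypothetical address:

sftp user@192.168.1.10    # browse and transfer files interactively over SSH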

View 5 Replies View Related

General :: Fstab File And Backup Products

Aug 18, 2010

The fifth field in the /etc/fstab file specifies the dump frequency, doesn't it? And this value can, in theory, be used by backup products, can't it?
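
For reference, a sample fstab line; the fifth field (here 1) is the flag that dump(8) reads to decide whether the file system should be dumped:

/dev/sda1  /  ext3  defaults  1  2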

View 2 Replies View Related

General :: Rsync Command - For File Backup ?

Mar 23, 2011

I'm going to be using this command to back up my files:

Should I change anything or is it ok?
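
For comparison, a common rsync backup pattern, purely a hypothetical sketch with made-up paths (--delete mirrors removals; drop it to keep deleted files in the backup):

rsync -avh --delete /home/user/ /mnt/backup/home/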

View 4 Replies View Related

General :: How To Get Backup And Restore For F12 File System

Apr 13, 2010

O/S: Fedora 12
I am a newbie in Linux. What I want to do is make a backup of my file system, because I am learning how to configure servers. So if I do something wrong, I want to be able to restore the default settings for my files instead of installing a new O/S.
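
Since most server configuration lives under /etc, a minimal sketch is to snapshot that directory before experimenting (archive path hypothetical):

tar czpf /root/etc-backup.tgz /etc
# roll a broken configuration back later with:
tar xzpf /root/etc-backup.tgz -C /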

View 4 Replies View Related

General :: SSD Backup - NFS Stale File Handle

Oct 19, 2009

I did a backup of the SSD on my eeepc using the following command from Linux Mint running on a USB key:
dd if=/dev/sda1 of=/media/disk/eeepc_save/SYSTEM/system.bck
(/media/disk is on an external USB disk)

I deleted the ext2 partition using GParted on the live USB key and recreated it. I rebooted Linux Mint and restored the filesystem using the opposite command:
dd if=/media/disk/eeepc_save/SYSTEM/system.bck of=/dev/sda1

I mounted /dev/sda1, and when I "ls" the root directory I get several "NFS stale file handle" messages concerning directories (/dev and others). I tried "e2fsck -y" and got a bundle of corrections that resulted in the deletion of those directories. I don't use NFS. I did the same for the user filesystem and had no problem (it's an ext3 partition). The two filesystems are the ones that came with the original Xandros installed on my eeepc, which were mounted with unionfs.

View 3 Replies View Related

General :: Backup Large File To Multiple DVDs

Nov 2, 2009

I work for a school consulting company. We helped a school deploy about 1,500 computers. The computers have Windows XP, but we have been using G4L for the restore partition on the drives. So far the software works great. We did, however, run into a problem: many of the computers we deployed are missing the restore partition. The reason they are missing is long and convoluted and not really that important. What I have been charged with is trying to fix the restore partition problem. One solution I had, which I'm not even sure will work, was to back up the recovery file that G4L created to DVD and write a basic script to recreate the partition and then copy the file over. This process would need to be as automated as possible, since this disc will be inserted by the end users (the students). The backup file that G4L created is 5.9 GB, so it won't fit on just one disc, and dual-layer discs are too expensive to use for this project, so the file will either need to be compressed again (not sure if that's a good idea or not) or split across two DVDs.

I have searched the forums here and was not able to find anything to fix this problem. I was able to find some info on splitting files across two discs, but I'm not sure how to use that to fix my problem.
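
Splitting is the simpler of the two options and needs no recompression; a sketch using split(1) with single-layer-DVD-sized pieces (file names hypothetical):

# cut the 5.9 GB image into pieces that each fit on one disc
split -b 4300M recovery.img recovery.img.part.
# at restore time, join the pieces copied back from the discs:
cat recovery.img.part.* > recovery.img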

View 5 Replies View Related

General :: Backup A Configuration File Like Sources.list?

Feb 11, 2011

It's always good to back up a configuration file like sources.list before you edit it. To do so, issue the following command: sudo cp /etc/apt/sources.list /etc/apt/sources.list.backup. Where does it back up to, and how do I access it? I want to put the backup on a removable disk or upload it.
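
The destination is right in the command: the copy lands at /etc/apt/sources.list.backup, alongside the original. To place it on removable media instead, point the second argument at the mounted disk (mount point hypothetical):

sudo cp /etc/apt/sources.list /media/usbdisk/sources.list.backup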

View 2 Replies View Related






