I'm going to be launching my website soon, and I found a company to host it on one of their dedicated servers. I think I'm going to go with Fedora as the OS, and my problem is I'm having trouble finding a company to back up my files that both supports Fedora and is reasonably priced.
I wrote a script to wake up my Windows machine and do an rsync backup of some of my files. I wanted to make this command accessible through the local bin, so I made it executable. However, the problem is that when it copies files it copies them with root permissions, and I can't edit or delete them. How can I set the files so they transfer with the proper permissions for my Ubuntu user?
Code:
#!/bin/bash
# Description: This script first wakes up the client machine and syncs the appropriate folders.
# Finally the script shuts down the client if it was off to begin with.
if [ "$(whoami)" != "root" ]; then
    echo "Permission Denied"
    exit 1
fi
.....
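One hedged approach, since the script has to run as root for the wake-up and shutdown parts: let rsync hand the files to your normal account as it copies. The user name and paths below are placeholders, and --chown needs rsync 3.1 or newer; on older versions, a chown after the sync does the same job.

Code:
# Sync as root, but give the result to a normal user
# ('myuser' and both paths are placeholders):
rsync -av --chown=myuser:myuser user@winbox:/share/ /home/myuser/backup/

# On older rsync, fix ownership afterwards instead:
# chown -R myuser:myuser /home/myuser/backup/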
I am researching how to make an effective backup on Ubuntu Server. This server will have vsftpd, a VPN, Samba, many other added packages, plus many printers, many users, and data. I know I can use tar for the data under /u, no problem.

1. I was testing tar on the /home directory with a few user directories. I then created a new directory and restored the users' directories into it. I noticed the /home/user owner and group were root; the files in each directory remained the same. This gave me concern: if I had a crash and had to restore these to a new HD, I would have to change these. What else would I need to change?

2. Since I have many config files, how do I back them up? I know I can do a dump, but then users shouldn't be on the system. The system files will change as users, printers, etc. are added, and asking users not to work while dump is running is not really an option. I thought I could do a tar of the whole system (cron, late at night, when there are not as many users). Then, in the event of a crash of the HD:
1. Boot from a live CD
2. Format the new drive
3. tar the whole system back in
Will this work right? Is there something I am missing?
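On question 1: tar does record owner and group for every member; whether they come back depends on how the archive is created and extracted. Extracting as root normally restores ownership, and --numeric-owner avoids surprises when user names don't yet exist, or map to different UIDs, on the freshly installed system (so recreate users with the same UIDs, or rely on the stored numeric values). A minimal sketch, with the archive path assumed:

Code:
# Create, preserving permissions and storing numeric UIDs/GIDs:
tar --numeric-owner -cpzf /backup/home.tar.gz /home

# Restore as root from / (tar strips the leading / on creation);
# --same-owner is the default for root, shown here for clarity:
cd /
tar --numeric-owner --same-owner -xpzf /backup/home.tar.gz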
I have a scheduled backup that runs on our server at work, and since 7/12/09 it has been making 592 KB files instead of 10 MB files. In mysql-admin (the GUI tool) I have a stored connection for the user 'backup'; the user has SELECT and LOCK rights on the databases being backed up. I have a backup profile called 'backup_regular', and in the third tab along it's scheduled to back up at 2 in the morning every weekday. If I look at one of the small backup files generated, I see the following:
Code:
-- MySQL Administrator dump 1.4
--
-- ------------------------------------------------------
-- Server version
....
It seems that MySQL can open and write to the file fine; it just can't dump.
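A hedged suggestion: run the same dump by hand as the 'backup' user. MySQL Administrator tends to write the header and then stop silently when a privilege is missing, while mysqldump prints the actual error. The database name and host below are placeholders.

Code:
# Run the dump manually to see the real error message:
mysqldump -u backup -p --databases mydb > /tmp/test_dump.sql

# If it reports a privilege problem, grant what a dump needs:
mysql -u root -p -e "GRANT SELECT, LOCK TABLES ON mydb.* TO 'backup'@'localhost';"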
Does anyone know of any decent enterprise-level backup solutions for Linux? I need to back up a few servers and a bunch of desktops onto one backup server. Using rsync/tar.gz won't cut it; I need things like bi-monthly full HDD backups, with a nice GUI interface to add/remove systems from the backup list. Basically something similar to CommVault or Veritas. Veritas I've used before, but it has its issues, such as leaving 30GB cache files. CommVault, I have no idea how much it costs, and whether it supports backing up to a hard drive rather than tape.
I'm just setting up a partition on a separate HDD in my system. I plan to use the partition to back up the important files on my main HDD (to guard against a HD crash).
The question I have is about the typical location to auto-mount this partition. Which would it be normal to go for:
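For what it's worth, /mnt/backup (or a directory under /media) is a common choice. A minimal /etc/fstab sketch; the device name and mount point are assumptions, and the UUID from blkid is safer than a raw device name:

Code:
# /etc/fstab - auto-mount the backup partition at boot
# (/dev/sdb1 and /mnt/backup are placeholders):
/dev/sdb1   /mnt/backup   ext3   defaults   0   2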
I have several machines from which I'd like to back up various folders through the Linux box onto DVD-RW media. I want to keep log files of what was written to DVD and when on one machine, and have it automagically assign a unique serial number that I can print on the DVD in case I need to recover. I'd like a user-friendly UI that I can point and click to schedule the backup and its type. Is there a good Fedora backup application (read: easy to use/understand and configure) I can use to back up machines across a network and across multiple DVDs (if needed)? The host machine is an F11 box. These are a mix of Win Server 2008, Win 7, Win XP and several Fedora boxes. Speaking of DVD media, is this a good idea, and how many erase/write cycles are they good for?
I need to install a backup server in my work environment. We have a Windows 2008 server and an old Dell PowerEdge 1750 server that has no OS on it yet. I would like to install Fedora on it and then back up the Windows server data onto the Fedora server using rsync or something else. Do you think it's a good idea? If not, what would you use to back up the Windows server data, preferably on a Linux system?
I want to back up my Apache website using tar. I want to make sure all the permissions and such stay the same, so that if I restore the website it's ready to go exactly as it had been. I am using the following command and would like to know if there is anything else I need to do:
tar --xattrs -cvzf backup.tgz /path-to-webserver-files
Now my other question: since tar doesn't store absolute paths, when I run the restore do I have to be in the root directory? In other words, if I run the command in a user directory, will tar do something odd like encode the paths as ../path-to-webserver, or will it just be path-to-webserver, so that when I run the restore from the / folder it will automatically go into the right place?
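For what it's worth, tar stores member names exactly as given at creation time (minus the stripped leading /), regardless of where you later run the restore from. Since the archive was created from an absolute path, the members are stored as path-to-webserver-files/..., so extracting from / puts everything back. A quick check and a restore sketch:

Code:
# See how the member paths were actually stored:
tar -tzf backup.tgz | head

# Restore into place without cd'ing first; -C switches directory,
# -p restores permissions:
tar --xattrs -xpzf backup.tgz -C /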
How can I back up in such a way that data is backed up every hour, but each time not the full data, only the files which have been modified within the last hour? This is local. And how do I back up local data to a remote system? I have tried with ssh-keygen, but every time it still asks for a password.
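A sketch of the usual approach: rsync already transfers only the files that changed, so an hourly cron job gives the incremental behaviour, and ssh-copy-id installs the public key so no password is asked (a password prompt after ssh-keygen usually means the key was never copied to the remote side, or it was created with a passphrase). Host, user, and paths are placeholders.

Code:
# One-time setup: key without a passphrase, then install it remotely
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
ssh-copy-id user@remote.example.com

# crontab -e: at the top of every hour, sync only what changed
0 * * * * rsync -av /data/ user@remote.example.com:/backup/data/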
I want to back up my server that GoDaddy is hosting and install it on my Linux system at home. In other words, I want to have a clone of my server at home, like what Time Machine does on a Mac (carries files and programs from one Mac to another to make a clone).
How do I do that? Do I create an image file, an ISO file, or what? How?
I want to do this in case my server burns up over there. I don't want to reinstall all the programs and do the setups manually again.
How is it possible to store and back up our files online? I don't know how this concept actually works. Is there any kind of device needed for storing data at another location?
I am somewhat new to Linux and I am looking for a way to back up my HD with all my Linux files. I have a Toshiba laptop running Windows 7. The HD has been partitioned so that the computer can run Red Hat Scientific Linux. Using GRUB I can dual-boot to either Windows 7 or Linux on start-up. I want a simple way of backing up the entire contents of my HD (both partitions, everything), so that in the event of my laptop being damaged I can reconstruct my setup and data as before, with all my files and settings in both Windows 7 and Linux intact. Is there a simple program that will enable me to copy everything to an external HD for backup? Can anyone recommend a package that will do this?
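Clonezilla is the usual recommendation for exactly this dual-boot, whole-disk case; it images both partitions plus the bootloader to an external drive. If a bare command-line sketch helps instead, dd from a live CD/USB does the same job. /dev/sda and the mount point below are assumptions, and the disk should never be imaged while booted from it.

Code:
# From a live CD/USB, external drive mounted at /mnt/external:
dd if=/dev/sda of=/mnt/external/laptop-full.img bs=4M conv=sync,noerror

# Restore later (this overwrites the entire disk!):
dd if=/mnt/external/laptop-full.img of=/dev/sda bs=4M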
Attempting to create a backup script to copy files from one file system to a remote file system.
When I try this I get:
Quote:
# tar -cf - /mnt/raid_md1 | gzip -c | ssh -i ~/.ssh/key -l user@192.168.1.1 "cat > /mnt/backup/fileserver.md1.tar.gz"
tar: Removing leading `/' from member names
Pseudo-terminal will not be allocated because stdin is not a terminal.
ssh: Could not resolve hostname cat > /mnt/backup/fileserver.md1.tar.gz: Name or service not known
[Code].....
I know that the remote file system dir is RW and the access is working fine. I am stumped...
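The last line of the error actually gives it away: -l expects a bare username as its own argument, so ssh read "user@192.168.1.1" as the login name and then took the quoted command as the hostname, which it could not resolve. Dropping -l (or giving it a bare username) should fix it:

Code:
tar -cf - /mnt/raid_md1 | gzip -c | ssh -i ~/.ssh/key user@192.168.1.1 "cat > /mnt/backup/fileserver.md1.tar.gz"

# equivalently, -l with a bare username and the host on its own;
# -T also silences the pseudo-terminal warning:
tar -czf - /mnt/raid_md1 | ssh -T -i ~/.ssh/key -l user 192.168.1.1 "cat > /mnt/backup/fileserver.md1.tar.gz"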
Is it possible to back up and restore the system files of Fedora 10 x86_64, so that if there is any problem with the OS I can easily recover it from the previous backup files?
As root, I use crontab to run mirrordir to back up directories. Everything gets copied over properly, but owner information isn't preserved and root is the owner of all the backed-up files. I can deal with that, but crontab reports tons of chown/chgrp errors from mirrordir every time I do backups (which is every day), and the multiple emails to root with thousands of chown/chgrp errors are very annoying. The error is "Operation not permitted", but that doesn't make sense to me, because the job runs as root (right?) and clearly the job is permitted to create the backup files, so why would it fail to chown and chgrp?
I've had the exact same setup on another server for years, and crontab has always run mirrordir without error. Any suggestions how to clear the errors on my new server?
I'm currently learning to use rsync to back up my music collection. I have a Firefox tab open to the rsync manual page(s) and have been reading man rsync and running experimental rsync operations. I've been doing this for the last 3-4 hours. I've used rsync for this purpose in the past with disastrous results: what was, and is once again (due to a month and a half of file pruning), a 9,000-file music collection had mysteriously grown to over 25,000 music files and 80GB of data! This was likely because I didn't really know what I was doing with rsync and had never spent much time learning about all the parameters, what their functions are, and how they relate to my goal. Here are the particulars:
* Source drive is a 500GB disk, /media/sata500/music/.
* Destination drive is a 250GB USB disk, /media/FreeAgent/music, connected to the same computer that houses the 500GB disk.
* I want to copy or backup files from /media/sata500/music to /media/FreeAgent/music.
* I do not want to create ANY duplicates of files that exist.
* I only want to add files to the destination drive if they are new on the source drive, like if I rip a CD and add the contents to the source. I want them copied over next time I run rsync.
Here's the rsync command in its most recently used form, and probably very immature at this point.
This appears to have copied all files and folders, and I'm satisfied that my goal has been met with some success. To convince myself of this, I ran the command, and once it was complete I added 2 new songs, putting them in their respective folders on the source drive, and ran the same command again. The resulting output was
[code]....
Two files transferred. Exactly what I want. Both folders now house 20,931 files and use 40.6GB. Identical as far as I can tell. What I'm concerned about are time stamps and play-count data, etc.: anything that changes the original file. I don't want this data to cause a file to be transferred, as I'm afraid that the new file will be created alongside the old file of the same name, thereby starting this whole music-collection expansion all over again. I've invested a lot of time and effort to get it pruned down to where there are virtually no duplicates, and albums are correct in that they contain the proper songs in the proper order.
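One reassurance and one check: rsync updates a changed file in place at the same destination path rather than creating a second copy, so a changed timestamp or play count can cause a re-transfer but not a duplicate; duplicates usually come from trailing-slash mistakes that nest a second directory level. A dry run shows exactly what would happen before anything is touched (paths as above):

Code:
# Preview: -n is --dry-run, -i itemizes what would change and why
rsync -avni /media/sata500/music/ /media/FreeAgent/music/

# Mind the trailing slash on the source: without it, rsync creates
# /media/FreeAgent/music/music/, which is a classic way collections
# end up doubled.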
I currently have a group of 3 servers connected to a local network. One is a web server, one is a MySQL server, and the other is used for a specific function on my site (calculation of soccer matches!).
Anyway, I have been working on the site a lot lately but it is tedious connecting my USB hard drive to each computer and copying the files. This means I am not backing up as often as I should...
I have a laptop connected to this same network that I use for development, so I can SSH into the computers. Is there any software for Ubuntu that can take backups of files that I choose on multiple computers? I know I could rsync, but is there something with more of a GUI?
Then I can just move the most recent backup from my laptop to the USB drive every 2 days. Then I will have the backup stored in 2 places if things go kaboom somewhere.
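Grsync and luckyBackup are GUI front-ends for rsync worth a look. Even without a GUI, a small script on the laptop can pull from all three machines over SSH in one pass; hostnames and paths below are placeholders:

Code:
#!/bin/bash
# Pull the chosen directories from each server onto the laptop
# (hostnames and paths are placeholders):
for host in webserver dbserver calcserver; do
    rsync -avz "user@$host:/srv/site/" "/home/me/backups/$host/"
done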
My proxy finally had a hard drive failure after 4 years of flawless service, so I'm building a new one. Got everything up and running except DansGuardian.
If I do a yum install dansguardian it does the install and all looks great until I go to the /etc/dansguardian folder. I'm missing all my folders and configuration files. For example:
exceptionsitelist bannedsitelist.... yadaa yadaa
Is there a problem with the repo (I'm using the default repo for Fedora 11), or has a step changed since the last time I installed this that I'm unaware of and unable to find a fix for?
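One quick way to tell whether the package or your expectations changed: ask RPM what the installed package actually shipped and where its config files went.

Code:
# Every file the package installed:
rpm -ql dansguardian

# Only the files RPM marks as configuration:
rpm -qc dansguardian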
I would like to set up a system for backing up files, and even possibly use it to keep music on and listen to it over the network. I am wondering which would be better: a separate FTP server, or one of the NAS enclosures with a couple of hard drives in it. I am assuming that the NAS would be accessed via NFS. I have never run an FTP server and I have never used NAS. I am just looking for pros and cons of each, and opinions as to which service (FTP/NFS) would be better for this task.
I've set up an FTP server, but now I would also like to give users the ability to access files through a web interface, like the ones you can find in many NAS devices. Does anyone know of software that does this? I can't find anything useful.
I have a server with a RedHat distro which stores and updates some files, and a second computer with Windows 7 on it. I must configure both machines so one can get those files from the server to the Win7 machine using ssh, but I have no idea how to do it.
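A hedged sketch: the server side only needs sshd running; on the Windows 7 side, WinSCP provides a GUI over SSH, and PuTTY's pscp.exe can script the copy (for example, pscp.exe -r user@server:/srv/files/* C:\incoming\ with host and paths as placeholders). Server-side setup on RedHat-style systems:

Code:
# Make sure the SSH daemon is running and starts at boot:
service sshd start
chkconfig sshd on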
I'm trying to set up awstats for my web server which runs ISPConfig3. Due to ISPConfig, my log-rotated files have the extension .log.gz, and the naming syntax of DATE-access.log.gz.
According to the awstats documentation, I need their tool to merge the log files; however, I cannot get it to work. I always get "file not found" or pipe-error-like messages.
Code:
I took a look at permissions; the log files are world-readable. Checked the paths 1000 times, no typo. When I try to find out what's wrong, the problems usually begin when I try to use the * character in the LogFile variable, ..
Anyone got experience with multiple log files and awstats? ...
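In case it helps: the awstats merge tool is logresolvemerge.pl, and LogFile accepts a command ending in a pipe character, so the wildcard gets expanded by the tool rather than by awstats itself (which is where the "file not found" often comes from). The tool path and log path below are assumptions based on a typical install; logresolvemerge.pl reads .gz files directly.

Code:
# In the awstats config - paths are assumptions for a typical setup:
LogFile="perl /usr/share/awstats/tools/logresolvemerge.pl /var/log/ispconfig/httpd/example.com/*-access.log.gz |"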
I have been hassling with this for several days now. I have 64-bit Ubuntu Server 10.04 running on an Acer Aspire EasyStore H340. I have windows 7 running on a 64-bit desktop pc and on a laptop. I mainly wanted to use the Ubuntu server for a file server, so I installed Samba and created three shares. These do show up in Windows explorer, and I can read and write to them. My windows applications seem to be able to see the shares and open & save files.
My next step was to try and set up a backup of the Windows 7 pc to the Ubuntu server. Windows integrated backup sees the shares and sub-directories within the shares, and the initial part of the backup seems to run OK, but when it tries to save an image of the 'C:" drive it works for a long time and then ends with an error (cannot complete backup).
So, I looked for some free backup programs to try, but these do not allow me to select the shares as a destination (invalid destination). The dialog sees the drives I have mapped the shares to in Windows, but does not show any sub-directories, and selecting the mapped drive letter is not accepted as a destination. If I try to browse down through "Network" in the destination dialog, it selects "Network" but does not expand it or accept it as a destination.
So, I partitioned my 2nd 1TB SATA drive on the server, formatted it as ext3, and mounted it as "storage." I set this up as a share in Samba and gave everyone read-write access, but still no luck selecting it as a backup destination. After some Googling, I downloaded and installed Ext2Fsd-0.48 (a Windows "driver" for Ext2/Ext3). This installed correctly, but when I open the program, neither "Network", the shares, nor the mapped drives show up anywhere.
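For reference, a minimal guest-writable Samba share definition looks like the sketch below (path and share name assumed). Whether Windows 7's integrated backup accepts a Samba share as an image destination is a separate, notoriously finicky issue, so third-party tools that write plain files to the share tend to be the safer bet.

Code:
# /etc/samba/smb.conf - 'storage' share, everyone read-write
# (path is a placeholder):
[storage]
   path = /mnt/storage
   read only = no
   guest ok = yes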
Is it possible with dd to have the output file stored at some remote location? I do not have free space on the LVM partition whose backup I want to make via dd.
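It is: leave out of= and dd writes to stdout, so the image can be piped straight over ssh and never touches the local disk. Device, host, and remote path below are placeholders; gzip keeps the transfer smaller.

Code:
# Stream a dd image to a remote machine over ssh:
dd if=/dev/vg0/lv_data bs=4M | gzip -c | ssh user@backuphost "cat > /backup/lv_data.img.gz"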
I have a file server running in an office that's mostly used for file sharing; a scanner saves PDF files to the server. I'm running the latest LTS Ubuntu Server edition, and I really only have ssh and Samba installed. My question: I've done so much to the server as far as permissions and configuration that I'd like to make a clone of it on another computer, but I'm not sure how I would do this.
I'm not sure if Clonezilla or something like it can perform this task? I basically just have a very old computer, and now I have another very old computer that I want to make into a spare just in case something happens to the original. Any recommendations on how I would accomplish this?
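Clonezilla can image the old machine and restore it to the spare, provided the hardware is similar. Since ssh is already installed, an rsync-based clone is another option; a sketch, assuming the spare is reachable as 'spare', already runs a base system, and permits root logins over ssh:

Code:
# Copy the whole filesystem, keeping permissions, ACLs and hard links;
# skip pseudo-filesystems ('spare' is a placeholder hostname):
rsync -aAXH --exclude={"/proc/*","/sys/*","/dev/*","/tmp/*","/run/*","/mnt/*"} / root@spare:/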