CentOS 5 :: Can't Get A Backup Database Script To Work
Oct 21, 2009
I am trying to get a script to run that will back up all my databases. The script is called automysqlbackup.sh (chmod 744); I have filled in the blanks and placed it into /etc/cron.daily, and I have checked that crond is running. Nothing happens: the databases are not backed up. I am at a loss; can anyone tell me what I am doing wrong? The script is shown below:
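One thing worth checking first (a general debugging sketch, not specific to automysqlbackup): on some distributions, run-parts silently skips files in /etc/cron.daily whose names contain a dot, so a script named automysqlbackup.sh may never run; renaming it without the extension, confirming the executable bit, and running it by hand usually narrows things down. The path below is a placeholder for wherever your script lives:

```shell
# Hypothetical path; adjust to match your system.
SCRIPT=/etc/cron.daily/automysqlbackup.sh

# Some run-parts implementations only execute names made of
# letters, digits, underscores and hyphens (no dots).
is_runparts_safe() {
    case "$1" in
        *[!A-Za-z0-9_-]*) return 1 ;;   # contains a character run-parts may reject
        *) return 0 ;;
    esac
}

name=$(basename "$SCRIPT")
if is_runparts_safe "$name"; then
    echo "$name: OK for run-parts"
else
    echo "$name: may be skipped by run-parts; try renaming without the .sh"
fi
```

Also run the script by hand with `sudo /etc/cron.daily/automysqlbackup.sh` and watch /var/log/cron to confirm crond is actually invoking the directory.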
I had a problem with my Ubuntu installation, but I don't want to get into that (the OS wouldn't load). Now that I have the Ubuntu partition mounted from a live CD, is there a way I can access or back up my database without booting the system? I'm pretty sure the database is stored somewhere on disk, but how can I find it and back it up?
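For MySQL, the raw data files normally live under /var/lib/mysql on the mounted partition, so from the live CD you can simply tar that directory without booting the installed system (the server isn't running, so the files are consistent). A minimal sketch; a scratch directory stands in for the mounted partition, whose real mount point (e.g. /media/disk) is an assumption:

```shell
# Stand-in for the mounted root partition of the broken install; on a real
# live CD this would be the mount point, e.g. /media/disk (assumed name).
MOUNT=$(mktemp -d)
mkdir -p "$MOUNT/var/lib/mysql/mydb"
echo "dummy" > "$MOUNT/var/lib/mysql/mydb/table.frm"

# Archive the whole MySQL data directory, preserving permissions (-p).
BACKUPDIR=$(mktemp -d)
tar -C "$MOUNT/var/lib" -czpf "$BACKUPDIR/mysql-files.tar.gz" mysql

# Verify the archive actually contains the database files.
tar -tzf "$BACKUPDIR/mysql-files.tar.gz" | grep mydb
```

Restoring is the reverse: extract into /var/lib on the repaired system and fix ownership with `chown -R mysql:mysql`.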
I'm looking for a backup solution for multiple web applications that exist as code (flat files) plus an associated MySQL database. I'd like the code backups to be grouped with the DB backups by name/location. I'd also like redundancy, with backups going to a local drive and a remote network drive. I'm just about done writing a bash script that does a combination of tar/gzip and mysqldump per application. Once it's done, I'll throw it in a cron job and enable/disable sites by name in the script. Before I go further down this path, does anyone recommend a solid package that already exists?
I'm trying to back up my database daily at 2:30am. Is this the right format? 30 2 * * * mysqldump -u root -pPassword database > backup_$(date +%y%m%d).sql
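Almost; the schedule fields are right, but cron treats a bare % specially (everything after the first unescaped % is fed to the command as stdin), so the date format must be escaped inside the crontab entry. A sketch of the corrected line (credentials and database name as in the question; the output path is an assumption, and an absolute path is safer since cron's working directory may not be what you expect):

```shell
# In a crontab, every % must be written as \% or the command line is cut short.
30 2 * * * mysqldump -u root -pPassword database > /var/backups/backup_$(date +\%y\%m\%d).sql
```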
After I spent some time discovering the Big Bang of the Universe and the Meaning of Life:
I managed somehow to create a script that backs up files on the server, tars the result, FTPs the archive to another FTP server, and then emails the outcome.
It also measures the time needed to complete, deletes archives older than XX days (set via find -mtime +20), and makes an incremental backup every weekday and a FULL one on Sundays (which suits me because there is no heavy load then).
The include and exclude lists for tar are kept in text files, one name per line:
</gr-replace>
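For the retention step described above, here is a self-contained sketch of the `find -mtime` pruning, using a scratch directory and `touch -d` to fake an old archive (the 20-day threshold matches the post; names are stand-ins):

```shell
ARCHIVES=$(mktemp -d)

# Fake one archive older than the threshold and one recent one.
touch -d "25 days ago" "$ARCHIVES/backup-old.tar"
touch "$ARCHIVES/backup-new.tar"

# Delete archives older than 20 days, as the script described above does.
find "$ARCHIVES" -name 'backup-*.tar' -mtime +20 -delete

ls "$ARCHIVES"   # only backup-new.tar should remain
```

Using `-delete` (GNU find) avoids the quoting pitfalls of `-exec rm {} \;`; pin the `-name` pattern down so the prune can never match anything but your own archives.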
Now I want to create a rescue disk for my CentOS 5.5. I think mondoarchive is the best tool for this job. I installed mondo using `yum install mondo` and it installed successfully, but I can't see it in Applications > System Tools. How can I run it in GUI mode?
Does anybody know of a database program that works with a barcode scanning system? It would be like a library system, but used to track inventory. A library program could work, though (we would just treat "checked out" as sold, or something like that).
I'm looking for a simple solution to back up my CentOS server (5.x) on a daily basis to a mounted disk. I found the glastree tool, but I have no clue whether it will work on CentOS. All recommendations, tips, hints, and maybe scripts are welcome. Unfortunately I'm a Linux newbie; I started with CentOS a couple of weeks ago.
This situation arose after a mains power failure took the server down. I log onto the server as root, go to Admin > Users, and get this message: "The user database cannot be read. This problem is most likely caused by a mismatch between /etc/passwd and /etc/shadow or /etc/group and /etc/gshadow. The program will exit now."
I am looking for advice on what I am sure is a very basic procedure, but I have never had to set this up before and am not sure where to start.
I am running two CentOS 5.3 boxes. I have one web server connected to the internet behind a firewall. I would like to set up the second server (the database) behind the web server, completely inaccessible from the internet: ideally it could only be reached by first ssh'ing onto the web server and then ssh'ing from there onto the database box, with the web server connecting to the database locally.
From what I have seen it looks like I need to set this up using NAT, but I have never done this and do not know what is involved. Can someone point me in the right direction and, ideally, outline the steps I need to take to hook this up? Do I need to worry about any specific hardware configuration as well?
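For the SSH part, NAT may not even be necessary: if the database box has only a LAN address and no route to the internet, reaching it through the web server is a plain two-hop SSH. A sketch of an ~/.ssh/config fragment for the admin workstation; the hostnames and the LAN address are placeholders, and the `nc` form is used because CentOS 5's OpenSSH predates `ssh -W`:

```shell
# ~/.ssh/config on the admin workstation (hypothetical names/addresses).
# "ssh dbserver" then transparently hops through the web server.
Host dbserver
    HostName 192.168.1.20                 # LAN-only address of the database box
    ProxyCommand ssh webserver nc %h %p   # tunnel via the internet-facing host
```

The web server itself talks to MySQL over the LAN interface; bind mysqld to that address in my.cnf so nothing listens on the public side.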
I have a dedicated server running CentOS 5.2 with a WordPress MU site set up on Apache 2, MySQL 5, and PHP 5; I use phpMyAdmin as well. I'm able to read the database (my website appears perfectly fine); however, I cannot write to it. What exactly do I need to install on my CentOS server to allow my site to interface with the database? I've used shared servers and VPSes with no problems, but this just won't work. The support team seems clueless, and I'm ready to ask them to reinstall my entire OS and everything else again.
I need to create a server for a database that has a .db file over 800 GB. My first disk is 50 GB and I use the standard layout:
100 MB /boot (ext2), 2000 MB swap, remainder / (ext3)
Now I have a second 1.2 TB disk that I want to be /opt. Is there a way to set up a 4 KiB block size for ext3 during the OS install, or should I do it after the install? Is this block size even big enough?
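The installer's partitioning UI does not expose the block size, but formatting the second disk after install is easy, and 4 KiB is in fact the default mke2fs chooses for a filesystem this large. With 4 KiB blocks, ext3 supports individual files up to 2 TiB, so an 800 GB file fits. A sketch using a small file-backed image in place of the real disk (the actual device, e.g. /dev/sdb1, is an assumption):

```shell
# File-backed image standing in for the real 1.2 TB partition (/dev/sdb1 assumed).
IMG=$(mktemp)
dd if=/dev/zero of="$IMG" bs=1M count=64 2>/dev/null

# -b 4096 forces a 4 KiB block size; -F allows formatting a plain file.
mkfs.ext3 -q -F -b 4096 "$IMG"

# Confirm the block size that was written to the superblock.
tune2fs -l "$IMG" | grep 'Block size'
```

On the real device you would drop `-F`, run `mkfs.ext3 -b 4096 /dev/sdb1`, and add the /opt mount to /etc/fstab.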
I have an application running on a CentOS machine that needs to access a database on a CentOS server. I granted access to all users with a certain username and password. I opened up port 3306 as well with the following command: /sbin/iptables -A INPUT -i eth0 -p tcp --destination-port 3306 -j ACCEPT. But whenever I try to connect from my machine I get an error: ERROR 2003 (HY000): Can't connect to MySQL server on '172.16.102.129' (113). (The trailing 113 is the OS error code "No route to host", which points at a network or firewall problem rather than MySQL privileges.) I am using MySQL 5.0.77 on CentOS 5.5.
I just installed MySQL and need to move the databases to a new location. I say "no problem": I shut down MySQL (service mysqld stop) and configure my.cnf to point to the new location, which in my case is /mnt/data/mysql (I know, not very original naming). I do the old chown -R mysql:mysql /mnt/data/mysql AND I copy all the files over (cp -R /var/lib/mysql /mnt/cgsvol/mysql), then chmod 777 /mnt/data/mysql. I verify the chown worked (ls -al) and the files from the old directory are there. But I can't start the daemon again. I check the log (/var/log/mysqld.log) and it can't write a test file to that directory.
(exact log data):
091107 23:22:21 mysqld started
091107 23:22:22 [Warning] option 'max_join_size': unsigned value 18446744073709$
091107 23:22:22 [Warning] option 'max_join_size': unsigned value 18446744073709$
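Two things commonly bite here on CentOS. First, SELinux: the mysqld_db_t context does not follow copied files, so mysqld is denied write access to the new directory until you relabel it (restorecon after a `semanage fcontext` rule, or `chcon -R -t mysqld_db_t` as a quick test; exact type name assumed from CentOS 5 targeted policy). Second, a subtle cp pitfall: when the destination directory already exists, `cp -R /var/lib/mysql /mnt/data/mysql` nests the data in /mnt/data/mysql/mysql, one level deeper than the datadir in my.cnf points. A sandboxed sketch of that difference (all paths are stand-ins):

```shell
SRC=$(mktemp -d); DST1=$(mktemp -d); DST2=$(mktemp -d)
mkdir -p "$SRC/mysql"
echo data > "$SRC/mysql/ibdata1"

# Pitfall: the destination exists, so the source directory nests inside it.
cp -R "$SRC/mysql" "$DST1"
ls "$DST1"            # shows: mysql   (files ended up at $DST1/mysql/ibdata1)

# What was intended: copy the *contents* into the destination.
cp -R "$SRC/mysql/." "$DST2"
ls "$DST2"            # shows: ibdata1
```

Check with `ls /mnt/data/mysql` whether you see database directories directly or an extra `mysql` level first.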
I would like to install this software on a Linux server for the purpose of system reporting. I have all the installation software, and the system's specification matches what the software needs.
I have installed phpMyAdmin 3.3.1 and MySQL 5.1.46 on a remote CentOS 5.4 server.
Any user newly created with phpMyAdmin is unable to log in, both from the mysql command line (on localhost) and through phpMyAdmin (locally and remotely).
There is no problem when the user is created from the mysql command line (on localhost). None of the users created with phpMyAdmin are visible when I list users from the mysql command line. The settings I used when creating users were the same as for the command-line creation.
The error message I get when connecting from the mysql command line (on localhost) is: "Access denied for user ... @localhost (using password: yes)"
I want to make a daily backup of my websites from an Ubuntu server over FTP to another server I own. The backup schedule and process work; the problem is the restore. WinRAR says the file is corrupt, and 7-Zip crashes.
The backup archive looks OK (the same size as the original folder), and you can also extract it with WinRAR if you ignore the error. But the extracted folder only contains one or two subfolders and one file (usually an image), and that's all.
If I try to restore from Webmin it doesn't report any error, and it looks like restore worked. But restored files are nowhere.
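An archive that is the right size but unreadable is the classic symptom of an FTP transfer made in ASCII mode, which rewrites bytes that look like line endings; issuing `binary` in the FTP client (or using a client that defaults to it) usually fixes this. Either way, it is worth verifying every transfer with a checksum before trusting it; a minimal sketch, with a local copy standing in for the FTP round-trip:

```shell
WORK=$(mktemp -d)
echo "website content" > "$WORK/index.html"
tar -czf "$WORK/site.tar.gz" -C "$WORK" index.html

# Record the checksum before upload...
before=$(md5sum "$WORK/site.tar.gz" | awk '{print $1}')

# ...transfer (a plain cp stands in for the FTP round-trip here)...
cp "$WORK/site.tar.gz" "$WORK/site-downloaded.tar.gz"

# ...and compare after download. A mismatch means the transfer corrupted it.
after=$(md5sum "$WORK/site-downloaded.tar.gz" | awk '{print $1}')
[ "$before" = "$after" ] && echo "transfer OK" || echo "CORRUPTED"
```

Comparing checksums on both ends after each nightly run turns silent corruption into an immediate, loggable failure.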
I am using the Zabbix open-source solution for systems monitoring. I am facing a problem and discussed it on the Zabbix forum. My post was: "My Zabbix server is behaving abnormally; approximately daily from 9 to 12 the server stops accumulating logs. I observed that the server reports RUNNING but it does not accumulate log values, and the machine has no extra load. It's shown in the attached graph image." I got the following reply: "database performance? are you monitoring database IO and available database threads?" So does anyone have an idea how I can do this, as I am using MySQL as the backend database on RHEL 3?
Here's what I want to do: Copy the whole Ubuntu 10.04 partition/installation from my old laptop to the new one.
What I tried: Used Simple backup to back up my Ubuntu installation to a USB hard drive. It yields a 10.4 gig folder containing 7 files. Installed 10.04 on the new laptop, used Synaptic to install Simple Backup, plugged in USB drive, started Simple Backup Restore, tagged the backup directory in Simple Backup Restore, and get the error:
Error: no backups found in the target directory.
I also tried copying the backup to the local drive, with the same result.
I'm running CentOS 5.4 on a Dell Xeon PowerEdge 1850. I want to back up an RHEL 4.x web/mail server, 2 WinXP workstations, and 2 Ubuntu desktops, and I want them to go to my Dell LTO2 tape drive. What are my best options? I want a series of incremental daily backups and then a different series of weekend full backups.
I should have clarified that this is all in a small office LAN environment and that the RHEL web & mail server is another PowerEdge 1850 that is co-located with the CentOS server (same rack).
I am planning to implement a backup strategy covering my base system, installed applications, and their configurations, as well as my personal data (from documents and music to photos, 3D design files, etc.). I have a 120 GB SSD for the system (8 GB swap, 30 GB /, 60 GB /home) and one 750 GB HDD for music, photos, images, and other things not related to 3D design. Tomorrow I am going to buy two more drives: a 1 TB disk used only for 3D design (final images, models, textures, 3D-related documentation, and the like), and a 1.5 TB disk to use as a backup of the personal data and the system (for example, one 500 GB partition for system backups and Clonezilla images, and one for the data backups).
So far I have used Clonezilla to save an image of / and another of /home on the other hard disk (I saved an image of the system installation on ext3 before migrating to XFS, and later one with the system migrated to XFS), and I have used rsync to copy both / and /home to two partitions on that disk, so I also have a copy of the Clonezilla images. Taking a Clonezilla image every so often is useful as a fast way to recover the system, but I would like to implement a daily or weekly backup of my data and system, for example a full backup followed by incremental backups.
Perhaps this is a good approach for backing up data, but what about configuration files and the system in general? I mean, if I back things up while using the system, could some files be skipped because they are in use?
If I decide to install, for example, my new Wacom Intuos4 M tablet, I need to install the kernel source and compile some modules and things by hand (because the repository's Wacom packages are a bit older than the new Intuos4 needs), which will change my system and add new libraries and files.
If I make a full backup and then periodic incremental backups, and I then have a problem or make a mistake while installing something, I only have to restore the backups from a live CD, no?
But if some files were in use and therefore not backed up, then after a restore those files will be newer than the rest of the backup (perhaps they would never have been modified later, but who knows), and I could have problems, no? Perhaps a solution is to keep a Clonezilla image of the system from before the first full backup, restore it first to get a fresh installation with all files included (even the in-use ones), and then restore the full backup taken an hour later plus the incrementals up to the moment everything went bad, to get the system working again?
That way I would not have to worry about files in use by the system: besides doing online backups, I would also have an exact image of my system that lets me restore it in minutes without reinstalling everything by hand and remembering what I had installed. It would also let the full and incremental backups cover only the directories that actually change, with the static ones backed up just once.
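The full-plus-incremental scheme described above is exactly what GNU tar's `--listed-incremental` mode does: a snapshot file records what each run saved, so the next run stores only what changed. A self-contained sketch in a scratch directory (the paths stand in for /home and the backup disk):

```shell
DATA=$(mktemp -d); BK=$(mktemp -d)
echo one > "$DATA/file1"

# Level 0 (full) backup; the snapshot file records the state that was saved.
tar -czg "$BK/snapshot" -f "$BK/full.tar.gz" -C "$DATA" .

# Something changes between runs...
echo two > "$DATA/file2"

# Level 1 (incremental): work on a copy of the snapshot so the full
# backup's snapshot stays reusable for the next level-1 run.
cp "$BK/snapshot" "$BK/snapshot.1"
tar -czg "$BK/snapshot.1" -f "$BK/incr1.tar.gz" -C "$DATA" .

# The incremental archive stores only the new file.
tar -tzf "$BK/incr1.tar.gz"
```

To restore, extract the full archive first and then each incremental in order, all with `-g /dev/null` so tar replays deletions too.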
I have a CentOS 5.4 installation with 2 Hard Drives [Disk 1 and Disk 2]. I have mounted the Disk 2 as a ext3 partition called /backups within the OS installed in Disk 1 [other partitions on Disk1 include / and SWAP].
I have a CRON job that creates specific backups of Disk 1 onto Disk 2.
This works fine.
My problem: I wish to move Disk 2 to a new box, on which I am planning to load the same CentOS 5.4.
My Questions :
1. Will Disk 2 be immediately recognized by the new install, so I can add it as another partition?
2. Do I have to perform any specific set of operations to integrate Disk 2 into the new box?
3. Or can a data partition on a separate hard disk simply not be migrated to a newly installed CentOS box?
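For what it's worth on questions 1 and 2: an ext3 partition carries its filesystem intact, so on the new box it normally just needs a mount entry. The kernel may assign the disk a different device name (e.g. /dev/sdb1 instead of /dev/hdb1), so mounting by UUID is safer than by device. A sketch; the device name and UUID below are placeholders:

```shell
# On the new box, find the moved disk's device name and UUID:
blkid
#   e.g.  /dev/sdb1: UUID="0a1b2c3d-..." TYPE="ext3"

# /etc/fstab entry using the UUID (values are placeholders):
# UUID=0a1b2c3d-...  /backups  ext3  defaults  1 2

# Then create the mount point and mount everything in fstab:
mkdir -p /backups && mount -a
```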
I am thinking about upgrading to 64-bit Ubuntu 10.10 because I have heard it's good for gaming and makes things run more smoothly. Also, I know it's the new thing and I'm going to have to switch eventually. The point is, I don't want to lose my data with the clean install, and I was wondering: if I back up my current system, which is 32-bit (assuming there is a backup utility; I thought I saw one), will I be able to load that on my new 64-bit install?
On my server I have atMail and an intranet site hosted using MySQL for the database locally, and obviously backing it up would be a good thing. I got the backup package from the Add/Remove Software list, called dump-0.4.etc., but I can't seem to find much on how to use it or set it up, nothing. Granted, I could be looking for the wrong thing, which I'm sure is the case. As of now a USB drive looks like my only option; that will change in the future as this "project" goes further (we are a Windows shop trying out Linux, and the next step is adding another server as a file server; at that point we'll probably go to NAS for the backup and then transfer that to USB for an offsite/secondary backup).
I have re-installed our operating system (CentOS 5). What is the best way to restore our Samba server? Can it be as simple as copying the smb.conf and smbpasswd files back into the /etc/samba directory? That's what I am hoping. If I just copy smb.conf and smbpasswd back into the Samba directory, will the machine trusts, users, and passwords just work? If not, what is the proper procedure for restoring? Actually, I want to make CentOS 5.3 my domain controller, and I want to test every scenario in case of a DC disaster before putting it into production. I have some queries:
1. Is CentOS 5 more stable than RHEL 5?
2. How can I take a backup of the entire Samba server in case of disaster, and how can I restore it?
3. How can I use logon scripts, like GPOs on Windows servers?
4. How can I add users to the Samba server and to Linux at the same time with one command?
5. Can you suggest any web-based Samba administration tool other than SWAT?
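On question 2: besides smb.conf and smbpasswd, a domain controller's identity (machine trusts, the domain SID) lives in Samba's tdb files, typically secrets.tdb under /etc/samba and the other databases under /var/lib/samba, so those must go into the backup too or the trusts will not survive a restore. A self-contained sketch using a scratch tree as a stand-in for the real paths:

```shell
ROOT=$(mktemp -d); OUT=$(mktemp -d)

# Stand-in tree for the real / (on the server: /etc/samba, /var/lib/samba).
mkdir -p "$ROOT/etc/samba" "$ROOT/var/lib/samba"
touch "$ROOT/etc/samba/smb.conf" "$ROOT/etc/samba/smbpasswd" \
      "$ROOT/etc/samba/secrets.tdb"
touch "$ROOT/var/lib/samba/passdb.tdb"

# One archive holding both trees; -p preserves permissions for the restore.
tar -czpf "$OUT/samba-backup.tar.gz" -C "$ROOT" etc/samba var/lib/samba

tar -tzf "$OUT/samba-backup.tar.gz"
```

Restore by stopping smb, extracting with `tar -xzpf samba-backup.tar.gz -C /`, and starting smb again; do the backup itself with the services stopped so the tdb files are consistent.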
Is there anywhere that gives detailed instructions (suitable for non-experts) on how to install rdiff-backup?
Only ancient versions seem to exist in any repository, which appears to rule out yum. Finding RPM packages also seems extremely difficult, and I'm not too clear what would be required to install from source.
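For what it's worth, on CentOS 5 rdiff-backup is carried in the EPEL add-on repository, which sidesteps both problems (hunting down RPMs and building from source). A sketch; the exact epel-release URL and version are assumptions, since the package moves as EPEL updates, so check the EPEL wiki for the current one:

```shell
# Enable EPEL on CentOS 5 (URL/version are assumptions; verify before use).
rpm -Uvh http://download.fedoraproject.org/pub/epel/5/i386/epel-release-5-4.noarch.rpm

# A plain yum install then works:
yum install rdiff-backup

# Basic usage: mirror a directory while keeping reverse increments,
# so older versions remain restorable by date.
rdiff-backup /home/user /backups/user
rdiff-backup --restore-as-of 3D /backups/user /tmp/user-three-days-ago
```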