Server :: MySQL Backup - Deal With Large Amounts Of Data?
Feb 15, 2011
We've been trying to become a bit more serious about backup. It seems the better way to do MySQL backup is to use the binlog. However, that binlog is huge! We seem to produce something like 10 GB per month. I'd like to copy the backup somewhere off the server, as I don't feel there is much to be gained by just copying it to another location on the same machine. I recently made a full backup which, after compression, amounted to 2.5 GB and took me 6.5 hours to copy to my own computer... so that solution doesn't seem practical for the binlog backup. Should we rent another server somewhere? Is it possible to find a server like that really cheap? Or is there some other solution? What are other people's MySQL backup practices?
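One common pattern, sketched below with placeholder paths and hostnames (and credentials assumed to live in ~/.my.cnf so nothing prompts), is a periodic full dump plus shipping the finished binlogs off-server with rsync over SSH, so only new files have to travel each night:
Code:
#!/bin/bash
# Sketch only: paths, remote host and binlog base name are assumptions.
BACKUP_DIR=/backups/mysql
BINLOG_DIR=/var/log/mysql
REMOTE=backup@offsite.example.com:/backups/mysql

# Weekly full dump, compressed; --flush-logs starts a fresh binlog at the same point
mysqldump --all-databases --single-transaction --flush-logs --master-data=2 \
    | gzip > "$BACKUP_DIR/full-$(date +%F).sql.gz"

# Nightly: rotate the binlog so finished files can be copied safely, then ship them
mysqladmin flush-logs
rsync -az "$BACKUP_DIR/" "$BINLOG_DIR"/mysql-bin.[0-9]* "$REMOTE"
Because rsync only transfers files that changed, the nightly run moves a few hundred MB of new binlog rather than the whole 10 GB month.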
Are there any tools out there that let me select a bunch of data and burn it to multiple CDs or DVDs? I'm using k3b but have to manually select CD- and DVD-sized amounts.
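If an archive format is acceptable, one low-tech approach is to tar the data and cut it into disc-sized pieces, then burn each piece with k3b; a sketch, where the 4300M size is an assumption for single-layer DVDs:
Code:
# Cut a tar stream into DVD-sized chunks (backup.tar.aa, backup.tar.ab, ...)
tar cf - /path/to/data | split -b 4300M - backup.tar.

# Restore later by concatenating the pieces back into one stream
cat backup.tar.* | tar xf -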
We're load testing some of our larger servers (16 GB+ RAM), and when memory starts to run low they kick off the OOM killer instead of swapping. I've checked swapon -s (which says we're using 0 bytes out of 16 GB of swap), I've checked swappiness (60), and I've tried upping the swap to 32 GB, all to no avail. If we pull some RAM and configure the box with 8 GB of physical RAM and 16 (or more) GB of swap, sure enough it dips into swap and is more stable than a 16 GB box with 16 or 32 GB of swap.
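For what it's worth, the kernel's swappiness and overcommit knobs are worth watching while the test runs; a quick, read-only look except for the last line, which is only an example of a tweak to experiment with:
Code:
# Watch memory and swap while the load test runs
free -m
vmstat 5

# Current VM tuning
sysctl vm.swappiness vm.overcommit_memory vm.overcommit_ratio

# Example tweak (assumption: you want the kernel to swap more aggressively
# before it reaches for the OOM killer)
sysctl -w vm.swappiness=80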
I am trying to move a large number of files (over 30,000, 86 GB) to another HDD, but I get an "Argument list too long" error. I tried rsync, cp and mv and still get the same error.
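That limit is on the length of the expanded argument list (a shell glob like *), not on the tools themselves, so passing the files via find, or letting rsync walk the tree itself, avoids it. A sketch with /src and /mnt/newdisk as stand-in paths:
Code:
# Copy in batches without ever building a huge argument list
find /src -mindepth 1 -maxdepth 1 -exec cp -a -t /mnt/newdisk {} +

# Or give rsync the directory itself instead of a glob
rsync -aH --progress /src/ /mnt/newdisk/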
I have a car stereo that reads a USB drive with all my music on it; however, to sort through the music it finds folders containing music and then displays them all in a list. I find this interface annoying, because in order to sort the music by artist I have to manually move it out of the album folders, which takes a long time for 11+ GB of music, so I was trying to use the Linux CLI to speed up the process, using a command like this:
Code:
mv /media/usb/music/*/*/* /media/usb/music/*/
but for some reason this moves all my music into the last folder (alphabetically) on my drive. The music is all pre-arranged like this: /media/usb/music/artist/album/song
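That behaviour comes from the destination glob: the shell expands /media/usb/music/*/ into every artist folder and mv treats only the last one as the destination, which matches what you are seeing. A loop over the artist folders avoids that; this is only a sketch assuming the artist/album/song layout described above:
Code:
# Move every song up one level, into its artist folder
for artist in /media/usb/music/*/; do
    find "$artist" -mindepth 2 -type f -exec mv -t "$artist" {} +
    # Remove the now-empty album directories
    find "$artist" -mindepth 1 -type d -empty -delete
done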
I installed MediaWiki the other day and went with the default InnoDB option. However, a week later something went wrong, and since I have scripts that back up /var/ nightly, I just copied back the backup of /var/lib/mysql/wikidb/ (as I've done with MyISAM). Now when I connect to the wikidb database I can see the tables (via "show tables"), but when I run any query against them (check table X, select * from X) I get:
Code:
Table 'wikidb.X' doesn't exist
I've since read that you can't just copy the database directory like you can with MyISAM, and there appears to be no way that I can find to restore or fix InnoDB without a dump of the data. And I never got a chance to do a mysqldump of the data. So has anybody got any idea how I can at least view the "page" table from the files I've backed up in /var/lib/mysql/wikidb/ ?
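Because InnoDB keeps its data dictionary (and, by default, the table data) in the shared ibdata1 file, the wikidb/ directory alone isn't enough; but since the nightly job backs up all of /var/, the matching ibdata1 and ib_logfile* should be in the same backup. A hedged sketch, with the backup path and init script name assumed, restoring the whole data directory as one consistent set:
Code:
# Stop MySQL, set the broken datadir aside, restore the full backed-up set
/etc/init.d/mysql stop
mv /var/lib/mysql /var/lib/mysql.broken
cp -a /path/to/backup/var/lib/mysql /var/lib/mysql   # must include ibdata1, ib_logfile*, wikidb/
chown -R mysql:mysql /var/lib/mysql
/etc/init.d/mysql start

# Then dump what you need, e.g. the page table
mysqldump -u root -p wikidb page > wikidb-page.sql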
The company that I work for has massive amounts of data on our file server, and that amount continues to grow. What we are looking for is a search appliance that will make it easier to search all documents on the file server, and also search the content of those documents. I don't really like the idea of everyone using an app like X1 and searching the share drives that way on their individual PCs; I would like a search appliance.
Sometimes I need to copy a huge directory to another directory (local filesystem), and usually I will use the "cp" or "rsync" commands. These commands are good, but depending on the size of the data being copied, the copy is painfully slow. I realize we are limited by the hardware and its limitations, i.e. I/O speed, and by the filesystem (which is usually ext3). Are there any other utilities, perhaps not well known, that can handle copying large amounts of data (mostly in the TB range)?
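For local copies the disks are usually the bottleneck rather than cp itself, but a tar pipe is a classic alternative that streams lots of small files efficiently, and rsync can resume if interrupted. A sketch with placeholder paths:
Code:
# tar pipe: preserves ownership/permissions, streams many small files well
( cd /source && tar cf - . ) | ( cd /destination && tar xpf - )

# rsync alternative: restartable, shows per-file progress
rsync -aH --progress /source/ /destination/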
I am trying to sync file server data to a backup server with the command rsync -avu path/of/data ipaddress-of-backup-server:/path/where/to/save. After running, it asks for the root password, and done manually it is successful, but I want to make it automatic. For that I also tried a cron job and generated an authentication key, but I am not able to log in automatically. Does anybody know how to authenticate root to log in for storing data on the backup server?
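The usual way is key-based SSH authentication, so rsync runs without a password prompt; a sketch, with the backup server's address as a placeholder:
Code:
# On the file server, as the user that will run the cron job:
ssh-keygen -t rsa                 # accept the defaults, leave the passphrase empty
ssh-copy-id root@192.168.1.50     # placeholder address of the backup server

# Test: this should no longer ask for a password
rsync -avu path/of/data root@192.168.1.50:/path/where/to/save

# Then schedule it, e.g. every night at 01:30 (crontab -e):
# 30 1 * * * rsync -avu path/of/data root@192.168.1.50:/path/where/to/save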
I'm trying to recover a compressed MySQL backup. As the backup is extremely large, I don't want to decompress it before importing. How can I make a MySQL variable take effect before I load this compressed file into the database?
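You can stream the file through zcat and prepend the SET statements in the same pipe, so nothing is ever decompressed to disk; a sketch where the variable names and database name are just examples:
Code:
# Prepend session variables, then stream the compressed dump straight into mysql
( echo "SET SESSION foreign_key_checks=0; SET SESSION unique_checks=0;"; \
  zcat backup.sql.gz ) | mysql -u root -p targetdb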
When a user logs into my webmail application, it creates entries in a MySQL table called "identities", which works great; however, once the user is deleted from Linux with the "userdel -r" command, the data still sits in MySQL. My question is: how can I remove the table data for old users? It appears as shown below:
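userdel only touches the system account, so the rows have to be removed from MySQL separately; the database and column names below are assumptions to adapt to your webmail schema:
Code:
# Check the actual column names first: mysql -u root -p webmaildb -e "DESCRIBE identities"
mysql -u root -p webmaildb -e "DELETE FROM identities WHERE username = 'olduser';"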
I have a scheduled backup running on our server at work, and since 7/12/09 it has been making 592 KB files instead of 10 MB files. In mysql-admin (the GUI tool) I have a stored connection for the user 'backup'; the user has SELECT and LOCK rights on the databases being backed up. I have a backup profile called 'backup_regular', and in the third tab it's scheduled to back up at 2 in the morning every weekday. If I look at one of the small backup files generated I see the following:
Code:
-- MySQL Administrator dump 1.4
--
-- ------------------------------------------------------
-- Server version
[code]....
It seems that MySQL can open and write to the file fine, it just can't dump the data.
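A dump that stops after the header often means it failed on the first table it tried to read, so it may be worth re-checking the backup user's grants and running an equivalent dump by hand to see the real error. A sketch, with the database name assumed:
Code:
# Confirm what the backup user is actually allowed to do
mysql -u root -p -e "SHOW GRANTS FOR 'backup'@'localhost';"

# Run an equivalent dump by hand and watch stderr for the first error
mysqldump -u backup -p --databases yourdb > /tmp/test-dump.sql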
I already have an Ubuntu backup server at my location and need this one server to be backed up remotely in another state. The other location is a helpdesk, so there's a danger that they could gain access to confidential data. I'll be setting up this new server as an FTP server, but need to set the FTP folder to only allow access to the backup server and me. Because it's remote, on the helpdesk side they'll need some access to the file system, but they need to be completely blocked off from the FTP folder where all the data is. How can I make sure I keep them away from my data and am still able to retrieve or copy files over without permission issues between both servers?
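One way, sketched here with made-up account and path names, is to put the FTP tree under a dedicated group that only the backup account (and your own) belong to, and close it to everyone else:
Code:
# Dedicated group that owns the backup area
groupadd backupftp
usermod -aG backupftp backupuser      # the account the remote backup server uses
usermod -aG backupftp youradmin       # your own account

# Owner and group get full access, everyone else (the helpdesk) gets nothing
chown -R backupuser:backupftp /srv/ftp/backups
chmod -R 770 /srv/ftp/backups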
I am trying to sync my MySQL Server database and HLDS data over the LAN; one machine is Windows Server 2008 and the other is Ubuntu Linux 9.10. I have tried to use the remote address (192.168.0.4:3306) but can't connect, and the error code is 10060. I have checked that the connection is normal and OK, and the account is allowed to connect from any address.
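Error 10060 is a plain TCP timeout, and the stock Ubuntu MySQL config only listens on 127.0.0.1, so a remote connection to 3306 never answers. The usual checks, sketched with assumed database, user and password names:
Code:
# On the Ubuntu box: is mysqld listening on all interfaces or only localhost?
netstat -ltnp | grep 3306

# In /etc/mysql/my.cnf, change (or comment out) the bind-address line:
#   bind-address = 127.0.0.1   ->   bind-address = 0.0.0.0
/etc/init.d/mysql restart

# Allow the account in from the Windows machine's subnet
mysql -u root -p -e "GRANT ALL ON hlds.* TO 'user'@'192.168.0.%' IDENTIFIED BY 'password'; FLUSH PRIVILEGES;"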
My computer has broken and I cannot log in. I don't know what caused it.
I am using Fedora 14 and so it is easy to retrieve my files with the Fedora 14 installation disk under the 'restore' option. I cannot however, work out how to retrieve my MySQL data.
Would anyone be able to shed some light on this matter?
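From the rescue environment the MySQL data is just files under /var/lib/mysql on the old root filesystem, so one approach (device and target names assumed) is to mount that filesystem and copy the whole directory somewhere safe, then point a working MySQL install at it later:
Code:
# In the rescue shell: mount the old root (adjust /dev/sda2 to your layout)
mkdir /mnt/oldroot
mount /dev/sda2 /mnt/oldroot

# Copy the entire MySQL data directory, preserving ownership and permissions
cp -a /mnt/oldroot/var/lib/mysql /mnt/usbdrive/mysql-rescue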
I set up the following script to run each night (12am) as a cron job on the main server:
if mount | grep -q '/home'; then rsync -ranv --delete /home/ 138.73.56.12:/home;
[code]....
The main server is a Dell PowerEdge 2600. The script rsyncs to a virtualised duplicate running on an HP DL380. When I set this script up and it began running, data started going missing from the main server: new files created by staff would go missing, and if data was added to existing files that were there before activating the script, the new changes made to those files would be lost. I just can't understand why this happened. After a few days I turned the script off and everything went back to normal, but the data that had gone missing was still gone.
I just wanted to know if this could be a disk read/write issue. Was the script running too soon, not allowing data to be written before it could be backed up? Could it be memory? I just don't know. Another development occurred a few days after all this: one of the hard disks in the main server started misbehaving and flashing amber (attention).
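For comparison, a slightly more defensive version of the same one-liner (placeholders kept from the original script) logs what it does and uses a lock so overlapping nightly runs can't interfere; note in passing that the n in -ranv is rsync's --dry-run switch, so as quoted the command would not actually transfer anything:
Code:
#!/bin/bash
# Sketch only: same source and destination as the original script.
LOG=/var/log/home-backup.log
if mount | grep -q '/home'; then
    # flock prevents two overlapping runs of the same backup
    flock -n /var/lock/home-backup.lock \
        rsync -rav --delete /home/ 138.73.56.12:/home >> "$LOG" 2>&1
fi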
I am now preparing to upgrade Lenny to Squeeze and decided to do a backup of my system first. I used backup-manager to do the job and it worked fine. How do you restore said backup data?
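backup-manager normally just writes tarballs into its repository (by default /var/archives), so restoring comes down to picking the right archive and extracting it; a sketch, with the archive file name as a placeholder:
Code:
# See what the backup job produced
ls -lh /var/archives/

# Inspect an archive before touching anything (placeholder name)
tar tzf /var/archives/myhost-etc.20110215.tar.gz | less

# Extract it where you want it; extracting straight to / overwrites live files
tar xzf /var/archives/myhost-etc.20110215.tar.gz -C /restore/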
I need help with an error on my website. I get the following error:
Code:
user warning: Got error 134 from storage engine query: SELECT data, created, headers, expire, serialized FROM cache WHERE cid = 'theme_registry:database1' in /var/www/html/web/includes/cache.inc on line 26.
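Error 134 ("record was already deleted, or record file crashed") usually points at a corrupted MyISAM table, here the Drupal cache table, and cache tables are disposable, so repairing or emptying them is generally safe. A sketch with the database name assumed:
Code:
# Repair the corrupted table (works for MyISAM tables)
mysql -u root -p database1 -e "REPAIR TABLE cache;"

# Or, since it is only a cache, simply empty it
mysql -u root -p database1 -e "TRUNCATE TABLE cache;"

# mysqlcheck can do the repair from the shell as well
mysqlcheck -u root -p --repair database1 cache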
I work for a school consulting company. We helped a school deploy about 1500 computers. The computers have Windows XP, but we have been using G4L for the restore partition on the drives. So far the software works great. We did, however, run into a problem in that many of the computers we deployed are missing the restore partition. The reason they are missing is long and convoluted and not really that important. What I have been charged to do is try and fix the restore partition problem. One solution that I had, which I'm not even sure will work, was to back up the recovery file that G4L created to DVD and write a basic script to recreate the partition and then copy the file over. This process would need to be as automated as possible, since this disc will be inserted by the end users (the students). The backup file that G4L created is 5.9 GB, so it won't fit on just one disc, and dual-layer discs are too expensive to use for this project, so the file will either need to be compressed again (not sure if that's a good idea or not) or split across two DVDs.
I have searched the forums here and I was not able to find anything to fix this problem. I was able to find some info on splitting files across two discs, but I'm not sure how to use that to fix my problem.
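split can cut the 5.9 GB image into DVD-sized pieces, and cat joins them back bit-for-bit in the restore script; a sketch with assumed file names, plus a checksum so the script can verify the rejoin:
Code:
# Cut the G4L image into pieces that fit single-layer DVDs
split -b 4300M backup.img backup.img.part.
md5sum backup.img > backup.img.md5

# In the restore script, after copying both pieces off the DVDs:
cat backup.img.part.* > backup.img
md5sum -c backup.img.md5     # confirm the rejoined file is intact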
I have a large collection of pictures (12 GB and growing), way too big to fit on one CD or DVD. I want to back them up to CDs or DVDs in a standard format (I think it's ISO 9660) that Windows can read. I know how to do this the hard way: by manually selecting a pile of pictures that will fit on one disc, burning it, and then going on to the next pile. There must be a way to tell k3b or a similar program to do this for me, automatically making a backup of the whole thing using as many discs as necessary. Can anyone tell me how to do this?
I don't want to use tar or another archive/compression scheme because I want the pictures accessible to someone with minimal technical expertise who doesn't even know how to spell "Linux".
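One way to keep the pictures as plain files is to bin them into sets below a disc's capacity and build one ISO 9660 image per set with genisoimage, using Joliet and Rock Ridge extensions so both Windows and Linux can read them. This is only a sketch: the source directory is a placeholder, the 4300 MiB limit is a single-layer DVD assumption, and it flattens the folder structure (so it assumes file names don't collide). If your distribution ships it, the dirsplit tool from the genisoimage/cdrkit package automates the same idea.
Code:
#!/bin/bash
# Bin the pictures into ~4.3 GB sets, then build one ISO per set.
SRC=/home/me/Pictures             # placeholder source directory
LIMIT=$((4300 * 1024 * 1024))     # bytes per disc
n=1; used=0
mkdir -p disc$n

find "$SRC" -type f | while read -r f; do
    size=$(stat -c %s "$f")
    if [ $((used + size)) -gt "$LIMIT" ]; then
        n=$((n + 1)); used=0; mkdir -p disc$n
    fi
    cp "$f" "disc$n/"             # plain copies keep the pictures as ordinary files
    used=$((used + size))
done

# One ISO 9660 image per set, with Joliet (-J) and Rock Ridge (-r) extensions
for d in disc*; do
    genisoimage -J -r -o "$d.iso" "$d"
done
Each resulting disc*.iso can then be burned from k3b (or with any burner) and reads as ordinary folders of pictures on Windows.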
I think this goes here, but I'm not sure. I decided that XAMPP had been troublesome enough; MySQL never worked. So I decided to install the LAMP stack offered by YaST. I went about installing it thinking that it would all work, but it seems that I was wrong. So I try to start MySQL, and here's what I get:
Code:
the-matrix:~ # mysql start
ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/mysql/mysql.sock' (2)
or
Code:
the-matrix:~ # rcmysql start
Starting service MySQL warning: /var/mysql/mysql.sock didn't appear within 30 seconds
chmod: cannot access `/var/run/mysql/mysqld.pid': No such file or directory
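The second error suggests /var/run/mysql doesn't exist, so mysqld can't write its PID file and never creates the socket; the error log usually confirms it. A sketch of the usual checks, with the log location assumed (it may differ on your system):
Code:
# See why mysqld gave up (log path is an assumption)
tail -n 50 /var/log/mysql/mysqld.log

# Recreate the runtime directory the PID/socket errors point at
mkdir -p /var/run/mysql
chown mysql:mysql /var/run/mysql
rcmysql restart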