Server :: MySQL Backup - Deal With Large Amounts Of Data?

Feb 15, 2011

We've been trying to become a bit more serious about backup. It seems the better way to do MySQL backup is to use the binlog. However, that binlog is huge! We seem to produce something like 10GB per month. I'd like to copy the backup somewhere off the server, as I don't feel there is much to be gained by just copying it to another location on the same machine. I recently made a full backup which, after compression, amounted to 2.5GB and took 6.5 hours to copy to my own computer, so that solution doesn't seem practical for the binlog backup. Should we rent another server somewhere? Is it possible to find a server like that really cheap? Or is there some other solution? What are other people's MySQL backup practices?
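One common pattern, sketched here with hypothetical paths, hostnames, and credentials, is a weekly full dump plus incremental binlog shipping: --flush-logs rotates the binlog at dump time, so only files created since the last full dump need to go offsite, and rsync keeps those transfers small.

Code:

# Weekly full dump; --flush-logs closes the current binlog, so the
# binlog files copied below contain only changes made since this dump
mysqldump -u root -pSECRET --all-databases --single-transaction --flush-logs | gzip > /backup/full_`date +%F`.sql.gz

# Nightly incremental: ship only new/changed binlog files offsite
# (binlog location varies; check the log-bin setting in my.cnf)
rsync -avz /var/log/mysql/mysql-bin.* backup@offsite.example.com:/backup/binlogs/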

View 8 Replies



General :: Writing Large Amounts Of Data To Multiple CD/DVDs?

Sep 1, 2009

Are there any tools out there that let me select a bunch of data and burn it to multiple CDs or DVDs? I'm using k3b but have to manually select files in CD- or DVD-sized amounts.
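k3b itself won't span discs, but dirsplit (shipped with the cdrkit/genisoimage tools on many distros) can partition a directory tree into volume-sized catalogues first; the exact flags may differ between versions, so treat this as a sketch and check the man page.

Code:

# Split ~/data into DVD-sized volumes; dirsplit writes vol_N.list
# catalogue files, each describing one disc's contents
dirsplit -s 4400M ~/data

# Turn a catalogue into an ISO image that k3b (or growisofs) can burn
genisoimage -r -J -graft-points -path-list vol_1.list -o disc1.iso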

View 1 Replies View Related

Server :: PAE Kernel (2.6.18) Fails To Swap With Large Amounts Of Physical Ram?

Dec 8, 2009

We're load testing some of our larger servers (16GB+ RAM), and when memory starts to run low they kick off the OOM killer instead of swapping. I've checked swapon -s (which says we're using 0 bytes out of 16GB of swap), I've checked swappiness (60), and I've tried upping the swap to 32GB, all to no avail. If we pull some RAM and configure the box with 8GB of physical RAM and 16 (or more) GB of swap, sure enough it dips into swap and is more stable than a 16GB box with 16 or 32GB of swap.
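On a 32-bit PAE kernel the OOM killer can fire when low memory (the region the kernel itself allocates from) is exhausted, even while swap and highmem are free, which would match this behaviour; the symptom improving with less RAM also points that way. A few hedged diagnostics to run while reproducing the load:

Code:

# Is swap really active and sized as expected?
swapon -s
free -m

# On 32-bit/PAE kernels, watch LowFree: lowmem exhaustion can
# trigger the OOM killer even when plenty of swap is free
grep -i -e lowtotal -e lowfree -e hightotal -e highfree /proc/meminfo

# Tunables worth recording alongside the test results
sysctl vm.swappiness vm.overcommit_memory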

View 6 Replies View Related

Ubuntu :: Moving Large Amounts Of Files

Mar 6, 2010

I am trying to move a large number of files (over 30k and 86GB) to another HDD, but I get an "Argument list too long" error. I tried rsync, cp, and mv and still get the same error.
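That message comes from the shell, not the copy tools: a wildcard expanded past the kernel's limit on exec arguments. Letting find enumerate the files, or pointing rsync at the directory instead of a glob, avoids it; the paths below are examples.

Code:

# Let find hand the files to mv in batches instead of one giant glob
find /data/src -mindepth 1 -maxdepth 1 -exec mv -t /mnt/newdisk/ {} +

# Or let rsync walk the tree itself (no shell glob involved)
rsync -a /data/src/ /mnt/newdisk/ && rm -rf /data/src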

View 1 Replies View Related

Ubuntu Multimedia :: Prepare Large Amounts Of Images For Web?

Oct 27, 2010

I've been using GIMP's 'save for the web' tool to reduce the file sizes of images.

I now have a directory with about 50 images. I'd like to avoid processing them all by hand.

I have a (very) basic knowledge of programming, and I'm comfortable with the commandline. I don't mind doing some homework on how to use new tools.

All I'm really concerned with here is reducing the file sizes of the images I have.

What possible pathways are there for me to prepare large amounts of images for the web?
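One common command-line route is ImageMagick's mogrify, which applies the same resize/recompress step to every image in a directory; the size and quality below are just starting points to tune.

Code:

# Write web-ready copies into ./web rather than overwriting originals;
# '1024x1024>' only shrinks images larger than that, -strip drops metadata
mkdir -p web
mogrify -path web -resize '1024x1024>' -quality 85 -strip *.jpg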

View 4 Replies View Related

General :: Move Large Amounts Of Music Within A File Structure?

Dec 20, 2009

I have a car stereo that reads a USB drive with all my music on it; however, to sort through the music it finds folders containing music and then displays them all in a list. I find this interface annoying because, in order to sort the music by artist, I have to manually move it out of the album folders by hand, which takes a long time for 11+ GB of music. So I was trying to use the Linux CLI to quicken the process, with a command like this:

Code:

mv /media/usb/music/*/*/* /media/usb/music/*/

but for some reason this moves all my music into the last folder alphabetically on my drive. The music is all pre-arranged like this: /media/usb/music/artist/album/song
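That is expected behaviour, if unhelpful: the destination glob expands too, so mv receives many directories and treats the alphabetically last one as the target for everything else. Looping over each artist directory keeps source and destination paired; a minimal sketch:

Code:

# For every artist, lift the songs out of the album subfolders,
# then delete the emptied album directories
for artist in /media/usb/music/*/; do
    find "$artist" -mindepth 2 -type f -exec mv -t "$artist" {} +
    find "$artist" -mindepth 1 -type d -empty -delete
done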

View 5 Replies View Related

Server :: Recover MySQL Innodb From /var/lib/mysql/ Backup?

Jan 30, 2011

I installed mediawiki the other day and went with the default InnoDB option. However, a week later something went wrong, and since I have scripts that back up /var/ nightly, I just copied back the backup of /var/lib/mysql/wikidb/ (as I've done with MyISAM). Now when I connect to the wikidb database I can see the tables (via "show tables"), but when I run any query against them (CHECK TABLE X, SELECT * FROM X) I get:

Code:

Table 'wikidb.X' doesn't exist

I've since read that you can't just copy the database directory as you can with MyISAM, and there appears to be no way that I can find to restore or fix InnoDB without a dump of the data. And I never got a chance to run mysqldump. So has anybody got any idea how I can at least view the "page" table from the files I've backed up in /var/lib/mysql/wikidb/?
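The reason the copy fails is that, with default settings, InnoDB keeps the actual row data in the shared tablespace (ibdata1) and its log files, not in the per-database directory, so wikidb/ alone holds only the .frm table definitions. If the nightly /var/ backup captured all of /var/lib/mysql, one hedged recovery sketch (stop mysqld first; the filenames are MySQL defaults):

Code:

service mysqld stop
# Restore the shared tablespace and logs together with the db directory
cp -a /backup/var/lib/mysql/ibdata1 /var/lib/mysql/
cp -a /backup/var/lib/mysql/ib_logfile* /var/lib/mysql/
cp -a /backup/var/lib/mysql/wikidb /var/lib/mysql/
chown -R mysql:mysql /var/lib/mysql
service mysqld start
# If it still won't start cleanly, innodb_force_recovery = 1..6 in
# my.cnf may allow a read-only start, long enough to mysqldump "page"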

View 1 Replies View Related

Server :: Remote MySQL Server Connection Dies After Wget Large File

Feb 3, 2011

We have two servers: one is the webserver and the other is the MySQL server. When transferring a 2GB file from the webserver to the MySQL server, the webserver's connection to the MySQL DB server dies completely, and we need to restart the MySQL process in order for it to come back online. During this connection downtime, using phpMyAdmin on the MySQL server itself shows no problem running queries etc.

View 2 Replies View Related

Software :: Searching Through Massive Amounts Of Data?

Feb 27, 2009

The company that I work for has massive amounts of data on our file server, and that amount continues to grow. What we are looking for is a search appliance that will make it easier to search all documents on the file server, including the content of those documents. I don't really like the idea of everyone using an app like X1 to search the share drives from their individual PCs; I would like a search appliance.

View 5 Replies View Related

Hardware :: Finding Utility To Copy Massive Amounts Of Data?

Jul 16, 2010

Sometimes I need to copy a huge directory to another directory (local filesystem), and usually I will use the "cp" or "rsync" commands. These commands are good, but depending on the size of the data being copied, the copy is painfully slow. I realize we are limited by the hardware and its limitations, i.e., I/O speed, and the filesystem (which is usually ext3). Are there any other utilities, maybe not well known, that can handle copying large amounts of data (mostly in the TB range)?
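On a single machine the disks are usually the real bottleneck, but a tar pipe is a classic, lesser-known alternative that streams the tree instead of copying file by file, and it often beats cp on huge trees of small files; the paths are examples.

Code:

# Stream-copy a tree: one reader, one writer, minimal per-file overhead
(cd /data/src && tar cf - .) | (cd /data/dst && tar xf -)

# For a first full copy, rsync can skip its delta algorithm entirely
rsync -a --whole-file --inplace /data/src/ /data/dst/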

View 2 Replies View Related

Server :: Sync File Server Data Into Backup Server Machine By Command- Rsync -avu?

Jun 21, 2011

I am trying to sync file server data to a backup server machine with the command rsync -avu path/of/data ipaddress-of-backup-server:/path/where/to/save. After running, it asks for the root password, and done manually it is successful, but I want to make it automatic. I tried a cron job and also generated an authentication key, but I have not succeeded in logging in automatically. Does anybody know how to authenticate root to log in so the data is stored on the backup server unattended?
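The usual answer is SSH public-key authentication, so the cron job never sees a password prompt. A sketch, with the key path and hostname as examples:

Code:

# One-time setup: a passphrase-less key, installed on the backup server
ssh-keygen -t rsa -f /root/.ssh/backup_key -N ""
ssh-copy-id -i /root/.ssh/backup_key.pub root@backup-server

# Cron entry (crontab -e): unattended nightly sync at 01:30
30 1 * * * rsync -avu -e "ssh -i /root/.ssh/backup_key" /path/of/data root@backup-server:/path/where/to/save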

View 14 Replies View Related

Server :: Tar Backup And Restore In Mysql?

May 6, 2011

How do I take a tar backup and restore it for MySQL? I am new to MySQL; I searched Google but did not find exactly what I need.
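A tar (file-level) backup of MySQL is only consistent if mysqld is stopped, or all tables are locked, while the data directory is archived. A minimal sketch using default Red Hat-style paths and service names, to adapt to your system:

Code:

# Backup: stop the server so the on-disk files are consistent
service mysqld stop
tar czf /backup/mysql-`date +%F`.tar.gz -C /var/lib mysql
service mysqld start

# Restore: unpack over the data directory and fix ownership
service mysqld stop
tar xzf /backup/mysql-2011-05-06.tar.gz -C /var/lib
chown -R mysql:mysql /var/lib/mysql
service mysqld start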

View 4 Replies View Related

Server :: Mysql Backup And Restore Scripts?

Apr 11, 2011

I am writing shell scripts; I searched Google but could not find exactly what I need. Can anyone help with how to take a MySQL backup and restore it using shell scripts?
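A minimal pair of scripts built around mysqldump, with the password, paths, and retention period as placeholders to adapt:

Code:

#!/bin/bash
# backup.sh - dump all databases, compressed and dated
BACKUP_DIR=/home/dbbackup
mkdir -p "$BACKUP_DIR"
mysqldump -u root -pSECRET --all-databases | gzip > "$BACKUP_DIR/all_`date +%F`.sql.gz"
# keep two weeks of dumps
find "$BACKUP_DIR" -name 'all_*.sql.gz' -mtime +14 -delete

#!/bin/bash
# restore.sh - reload a chosen dump: ./restore.sh all_2011-04-11.sql.gz
gunzip < "$1" | mysql -u root -pSECRET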

View 3 Replies View Related

Server :: Recover A Compressed Mysql Backup?

Dec 6, 2009

I'm trying to recover a compressed MySQL backup. As the backup is extremely large, I don't want to decompress it before importing. How can I make a MySQL variable take effect before I load this compressed file into the database?
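The dump can be streamed through zcat so it never hits the disk uncompressed, and a SET statement prepended to the same stream runs on the same connection before the data loads. Which variable you need depends on your case; foreign_key_checks is shown only as a common example for large imports.

Code:

# Prepend the variable, then stream the dump through the pipe
{ echo "SET foreign_key_checks=0;"; zcat backup.sql.gz; } | mysql -u root -p database_name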

View 7 Replies View Related

Server :: Stop Mysql In Order To Preform A Backup?

Apr 12, 2010

I'm working on an old Red Hat server with MySQL installed, and I need to find a way to stop the service in order to perform a backup.

Some info that I found on the server...
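On older Red Hat releases MySQL is managed by its init script; depending on the package the service may be named mysqld or mysql, so treat the names below as likely rather than certain.

Code:

/sbin/service mysqld stop      # or: /etc/init.d/mysqld stop
# ...run the backup of /var/lib/mysql while it is down...
/sbin/service mysqld start
mysqladmin -u root -p status   # confirm it is accepting connections again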

View 9 Replies View Related

Server :: MySQL Backup Cron Job Not Executing Correctly?

Mar 2, 2011

I own a CentOS 5 VPS. I typed crontab -e, and then I added the following line to have my server back up MySQL automatically:

0 * * * * mysqldump -u root -p password --all-databases | gzip > /home/dbbackup/database_`date '+%m-%d-%Y_%H'`.sql.gz

When I go in and look, it doesn't place any files in /home/dbbackup. But when I run the same command by hand, it works:

mysqldump -u root -p password --all-databases | gzip > /home/dbbackup/database_`date '+%m-%d-%Y_%H'`.sql.gz
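Two things commonly break exactly this job, offered as likely suspects rather than certainties: with a space after -p, mysqldump prompts interactively for a password (which cron cannot answer), and cron expands every unescaped % into a newline, truncating the command at the date format. A corrected entry:

Code:

# No space after -p, and every % escaped for cron
0 * * * * mysqldump -u root -ppassword --all-databases | gzip > /home/dbbackup/database_`date '+\%m-\%d-\%Y_\%H'`.sql.gz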

View 3 Replies View Related

CentOS 5 :: Backup Data On Server To HDD Box?

Aug 19, 2010

I am now considering this solution: back up the data on the CentOS server to an HDD box. Then, if the server fails, I can restore the server from the HDD box.

View 1 Replies View Related

General :: How To Backup Server For Preserving Data

Nov 10, 2010

How do I back up my Linux server to preserve its data in case the hard disk crashes?

View 2 Replies View Related

Server :: Data Copy From Mysql To Samba?

Jul 18, 2011

How can I copy data from a MySQL server to a local system or Samba server? I want to know the command.
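One way is to mount the Samba share and dump the database straight onto it; the share name, mount point, and credentials below are examples.

Code:

# Mount the Samba share, dump onto it, unmount
mkdir -p /mnt/backups
mount -t cifs //fileserver/backups /mnt/backups -o username=backupuser
mysqldump -u root -p mydatabase > /mnt/backups/mydatabase_`date +%F`.sql
umount /mnt/backups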

View 5 Replies View Related

Server :: Removing Old Table Data From MySQL?

Jul 6, 2009

When a user logs into my webmail application, it creates entries in a MySQL table called "identities", which works great. However, once the user is deleted from Linux with the "userdel -r" command, the data still sits in MySQL. My question is: how can I remove the table data for old users? It appears as shown below:

[code].....
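If the webmail schema links identities to a users table through a user_id column (Roundcube's schema does, but verify yours), the orphaned rows can be removed with a join against the user list; the table and column names here are assumptions to adapt.

Code:

# Remove identity rows whose owning user no longer exists
mysql -u root -p webmaildb -e "
  DELETE i FROM identities AS i
  LEFT JOIN users AS u ON i.user_id = u.user_id
  WHERE u.user_id IS NULL;"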

View 1 Replies View Related

Ubuntu Servers :: Mysql-admin Won't Schedule Backup But Will Manual Backup

Jan 19, 2010

I have a scheduled backup that runs on our server at work, and since 7/12/09 it has been making 592K files instead of 10MB files. In mysql-admin (the GUI tool) I have a stored connection for the user 'backup'; the user has SELECT and LOCK rights on the databases being backed up. I have a backup profile called 'backup_regular', and on the third tab it is scheduled to back up at 2 in the morning every weekday. If I look at one of the small backup files generated, I see the following:

Code:

-- MySQL Administrator dump 1.4
--
-- ------------------------------------------------------
-- Server version

[code]....

It seems that MySQL can open and write to the file fine; it just can't dump the data into it.

View 3 Replies View Related

Ubuntu Servers :: Remote Backup Ftp Server With Confidential Data?

Feb 15, 2011

I already have an Ubuntu backup server at my location and need this one server to be backed up remotely in another state. The other location is a helpdesk, so there's a danger that they could gain access to confidential data. I'll be setting up this new server as an FTP server, but I need to set the FTP folder to allow access only to the backup server and me. Because it's remote, the helpdesk side will need some access to the file system but must be completely blocked from the FTP folder where all the data is. How can I make sure I keep them away from my data and am still able to retrieve or copy files over without permission issues between both servers?
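One hedged approach with vsftpd: jail every FTP login inside its home directory, whitelist only the backup account, and lock the data tree down with ordinary ownership and permissions. The directives are standard vsftpd options; the paths and account names are examples.

Code:

# /etc/vsftpd.conf
local_enable=YES
write_enable=YES
chroot_local_user=YES                  # jail each user in their home dir
userlist_enable=YES
userlist_deny=NO                       # only listed users may log in
userlist_file=/etc/vsftpd.user_list    # contains just the backup account

# On disk, keep the backup area readable only by its owner:
# chown backupuser:backupuser /home/backupuser/ftp
# chmod 700 /home/backupuser/ftp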

View 9 Replies View Related

Server :: Sync MySQL Database And HLDS Data On Lan?

Mar 4, 2010

I am trying to sync my MySQL server database and HLDS data over the LAN; one machine is Windows Server 2008 and the other is Ubuntu Linux 9.10. I have tried to use the remote address (192.168.0.4:3306), but I can't connect and the error code is 10060. I have checked that the connection is otherwise normal, and the account is allowed to connect from any address.
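Error 10060 is a plain TCP connection timeout, which usually means mysqld is only listening on localhost or a firewall is dropping port 3306, rather than a privilege problem. Hedged checks on the Ubuntu side (the database and account names are examples):

Code:

# 1. mysqld must listen on the LAN, not just 127.0.0.1
grep -n bind-address /etc/mysql/my.cnf   # set: bind-address = 0.0.0.0

# 2. The account needs a host pattern matching the Windows box
mysql -u root -p -e "GRANT ALL ON hldsdb.* TO 'hlds'@'192.168.0.%' IDENTIFIED BY 'secret'; FLUSH PRIVILEGES;"

# 3. Nothing should be blocking port 3306
iptables -L -n | grep 3306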

View 1 Replies View Related

Server :: Retrieve MySQL Data From Command Line?

Aug 17, 2011

My computer has broken and I cannot login. I don't know what caused it.

I am using Fedora 14, so it is easy to retrieve my files with the Fedora 14 installation disk under the 'restore' option. I cannot, however, work out how to retrieve my MySQL data.

Would anyone be able to shed some light on this matter?
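From the rescue environment the databases are just files under /var/lib/mysql; copying that directory (including ibdata1 and the ib_logfile* files if any tables are InnoDB) to external media preserves everything for import on a working machine. /mnt/sysimage is where Fedora's rescue mode mounts the installed system; the USB path is an example.

Code:

# Inside the rescue shell, with a USB drive mounted at /media/usbdrive
cp -a /mnt/sysimage/var/lib/mysql /media/usbdrive/mysql-rescue

# Later, on a healthy machine running the same MySQL version, place
# the files in its (stopped) data directory and fix ownership:
# chown -R mysql:mysql /var/lib/mysql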

View 1 Replies View Related

General :: Backup Script Causing Data To Be Lost On Main Server- RHEL?

Feb 15, 2010

I set up the following script to run each night (12am) as a cron job on the main server:

if mount | grep -q '/home';
then
rsync -ranv --delete /home/ 138.73.56.12:/home;

[code]....

The main server is running on a Dell PowerEdge 2600. The script rsyncs to a virtualised duplicate running on an HP DL380. When I set this script up and began running it, data started going missing from the main server: if new files had been created by staff, they would go missing, and if data had been added to files that existed before the script was activated, the new changes made to those files would be lost. I just can't understand why this happened. As soon as I turned the script off, after a few days it was all back to normal, but the data that had gone missing was gone for good.

I just wanted to know whether this could be a disk read/write issue. Was the script running too soon, not allowing data to be written before it was backed up? Could it be memory? I just don't know. Another development occurred a few days after all this: one of the hard disks in the main server started misbehaving and flashing amber (attention).

View 5 Replies View Related

Debian :: Used Backup-manager - Restore The Backup Data?

Feb 4, 2011

I am now preparing to upgrade Lenny to Squeeze and decided to do a backup of my system first. I used backup-manager to do the job and it worked fine. How do you restore said backup data?
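backup-manager has no restore command of its own: it writes ordinary archives (tar.gz by default) into its repository, /var/archives unless BM_REPOSITORY_ROOT was changed, so restoring is just extracting. The archive name below is an example.

Code:

# See what backup-manager produced
ls /var/archives

# Extract into a scratch directory first, inspect, then copy back
mkdir -p /tmp/restore
tar xzf /var/archives/myhost-etc.20110204.tar.gz -C /tmp/restore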

View 4 Replies View Related

Server :: MySQL Error 134 From Storage Engine Query: Select Data, Created, Headers?

Mar 1, 2011

I need help with an error on my website. I get the following error:

Code:
user warning: Got error 134 from storage engine query: SELECT data, created, headers, expire, serialized FROM cache WHERE cid = 'theme_registry:database1' in /var/www/html/web/includes/cache.inc on line 26.
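Error 134 ("record was already deleted") from the storage engine usually indicates a corrupted MyISAM table, and the query shows it is Drupal's cache table, which Drupal rebuilds automatically. Hedged repair steps, with the database name taken from the cid in the error message:

Code:

# Try repairing the corrupted table first
mysql -u root -p database1 -e "CHECK TABLE cache; REPAIR TABLE cache;"

# If the repair doesn't take, emptying a Drupal cache table is safe;
# it is repopulated on the next page load
mysql -u root -p database1 -e "TRUNCATE TABLE cache;"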

View 2 Replies View Related

General :: Backup Large File To Multiple DVDs

Nov 2, 2009

I work for a school consulting company. We helped a school deploy about 1500 computers. The computers have Windows XP, but we have been using G4L for the restore partition on the drives. So far the software works great. We did, however, run into a problem: many of the computers we deployed are missing the restore partition. The reason they are missing is long and convoluted and not really that important. What I have been charged to do is try to fix the restore partition problem. One solution I had, which I'm not even sure will work, was to back up the recovery file that G4L created to DVD and write a basic script to recreate the partition and then copy the file over. This process needs to be as automated as possible, since the disc will be inserted by the end users (the students). The backup file that G4L created is 5.9GB, so it won't fit on just one disc, and dual-layer discs are too expensive to use for this project, so the file will either need to be compressed again (not sure if that's a good idea or not) or split across two DVDs.

I have searched the forums here and was not able to find anything to fix this problem. I was able to find some info on splitting files across two discs, but I'm not sure how to use that to fix my problem.
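split can cut the 5.9GB image into DVD-sized pieces that the restore script reassembles with cat, which avoids recompressing anything; the filenames and sizes are examples.

Code:

# Cut the image into pieces that fit single-layer DVDs
split -b 4300m recovery.img recovery.img.part_
md5sum recovery.img > recovery.img.md5

# At restore time, with the pieces copied off both discs:
cat recovery.img.part_* > recovery.img
md5sum -c recovery.img.md5   # verify the reassembled image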

View 5 Replies View Related

Software :: Backup Large Sets Of Files To Multiple CD's / DVDs

Jun 5, 2010

I have a large collection of pictures (12GB and growing), way too big to fit on one CD or DVD. I want to back them up to CDs or DVDs in a standard format (I think it's ISO 9660) that Windows can read. I know how to do this the hard way: by manually selecting a pile of pictures that will fit on one disc, burning it, and then going on to the next pile. There must be a way to tell k3b or a similar program to do this for me, to automatically make a backup of the whole thing using as many discs as necessary. Can anyone tell me how to do this?

I don't want to use tar or another archive/compression scheme because I want the pictures accessible to someone with minimal technical expertise who doesn't even know how to spell "Linux".

View 3 Replies View Related

OpenSUSE Network :: ERROR 2002 (HY000): Can't Connect To Local MySQL Server Through Socket '/var/mysql/mysql.sock'

Jun 7, 2011

I think this goes here, but I'm not sure. I decided that XAMPP had been troublesome enough; MySQL never worked. So I decided to install the LAMP stack offered by YaST. I went about installing it thinking that it would all work, but it seems that I was wrong. When I try to start MySQL, here's what I get:

Code:
the-matrix:~ # mysql start
ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/mysql/mysql.sock' (2) or
Code:
the-matrix:~ # rcmysql start
Starting service MySQL warning: /var/mysql/mysql.sock didn't appear within 30 seconds
chmod: cannot access `/var/run/mysql/mysqld.pid': No such file or directory

[Code]...
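For what it's worth, "mysql start" isn't a valid invocation (mysql is the client), so rcmysql is the right tool; the missing socket and pid file mean mysqld is dying during startup, and the error log normally says why. Hedged first steps (the log location varies by setup):

Code:

rcmysql start
tail -n 50 /var/log/mysql/mysqld.log   # or look for /var/lib/mysql/*.err

# A missing or wrongly-owned run directory is a common culprit:
mkdir -p /var/run/mysql
chown -R mysql:mysql /var/run/mysql /var/lib/mysql
rcmysql restart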

View 6 Replies View Related






