Software :: MySQL Full Backup Script Not Archiving The Dump File?
Mar 20, 2011
I have a problem with a script I wrote. It runs fine when executed manually, but it doesn't run fully when executed via cron.
Here's the script:
Code:
#!/bin/bash
FILENAME=mysql_full_dump_`date '+%m.%d.%y'`.sql
`which mysqldump` --all-databases -uroot -p************ -h127.0.0.1 > /root/$FILENAME
RETVAL=$?
[code]....
the script resides in /root/bin and the cron entry is as follows:
Code:
0 0 * * * root "/root/bin/mysql_daily.sh"
the result is the .sql file, but it doesn't archive it.
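A hedged sketch of what the complete script might look like, assuming the elided [code] section was meant to gzip the dump on success. Cron runs with a minimal PATH, so anything resolved from the environment in a manual run should be made explicit; the PATH line and the gzip step here are assumptions, not the original code:
Code:
#!/bin/bash
PATH=/usr/local/bin:/usr/bin:/bin   # cron's PATH is minimal; set it explicitly
FILENAME=mysql_full_dump_$(date '+%m.%d.%y').sql
mysqldump --all-databases -uroot -p************ -h127.0.0.1 > "/root/$FILENAME"
RETVAL=$?
if [ $RETVAL -eq 0 ]; then
    gzip "/root/$FILENAME"          # assumed archive step
fi
Also note that the user field (root) in the entry above is only valid in /etc/crontab or /etc/cron.d/; a personal crontab entry would omit it.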
View 2 Replies
Mar 17, 2010
I have a script (below) which works OK, but I have tried to modify it because I want to keep the older files for a restore if needed. I tried adding a date suffix to the newly created files (second lump of code), but it doesn't seem to work. I get the error:
$SOURCEDIR/p1db_$DATEVAR.sql: ambiguous redirect
The working original script:
Code:
#!/bin/bash
[code].....
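In bash, an "ambiguous redirect" usually means the variable after the > is unset or expands with whitespace at the moment of redirection, which points at $SOURCEDIR or $DATEVAR not being defined where they are used. A hedged sketch of the dated version, with names guessed from the error message rather than taken from the elided script:
Code:
#!/bin/bash
SOURCEDIR=/path/to/backups        # assumption: wherever the dumps are written
DATEVAR=$(date '+%d-%m-%Y')       # must be defined before it is used
mysqldump -uroot -pPASSWORD p1db > "$SOURCEDIR/p1db_$DATEVAR.sql"   # quotes guard against spaces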
View 1 Replies
May 22, 2010
Does the dump command back up entire file-systems or is it capable of backing up subsets of a file-system? And is tar capable of taking device names (for file systems) as input to be archived?
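For what it's worth, dump operates on whole filesystems given as a device or mount point (subdirectories only work at level 0), while tar archives paths through the filesystem and does not read raw device contents; naming a device gives you the device node, not the data. A hedged sketch of the contrast (device and paths assumed):
Code:
dump -0u -f /backup/home.dump /dev/sda5   # dump: a device or mount point
tar -cf /backup/etc.tar /etc              # tar: any path, never a raw device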
View 1 Replies
Sep 6, 2010
I have used the dump command to back up the application files. For a full backup, level 0 works fine. For an incremental backup at level 1 or 2, I get this error:
DUMP: Only level 0 dumps are allowed on a subdirectory
DUMP: The ENTIRE dump is aborted.
The code I used
===============================
#!/bin/bash
#Full Day Backup Script
#application folders backup
#test is the username
now=$(date +"%d-%m-%Y")
[Code]...
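The error is documented behavior: dump only supports incremental levels against a whole filesystem, and a subdirectory argument is limited to level 0. A hedged sketch that assumes the application folders live on their own filesystem (here /home), so that level 1 runs against the mount point:
Code:
#!/bin/bash
now=$(date +"%d-%m-%Y")
# level 1 incremental against the mount point, not a subdirectory
dump -1u -f /backup/home_inc_$now.dump /home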
View 2 Replies
Jan 31, 2011
I want to create retrievable archives of my old emails, say monthly, to avoid the old emails running me out of memory. If I use the 'Backup Settings' procedure as explained in Evolution Help, what happens when I wish to consult the /home/dbus/evolution-backup.tar.gz archive file?
1- Will it simply over-write my current Evolution data? [in which case it's not what I need]
2- If not, how do I return the archive file to dead storage and resuscitate my current data?
3- If it will overwrite using the 'Restore Evolution' procedure given in Evolution Help, is there a workaround ... perhaps by ...
3a- renaming the archive file,
3b- or 'restoring' it in another version of Evolution,
3c- or archiving CURRENT data as a 2nd backup with a different name [eg: /home/dbus/evolution-backupJan11.tar.gz?] then restoring that?
4- Will I be able to retrieve successive archives if I rename them, say '/home/dbus/evolution-backup.tarDec10.gz' etc., once Evolution has saved them?
Alternatively, the following came from a dead thread [from commonlyUNIQU3] ... is it still valid? Does it avoid the problem of potentially running out of memory?
A. Make a subfolder to the "Inbox" under the "On This Computer" tree (I call mine "Archive")
B. Drag and drop the emails you want to archive into this folder.*
This will move the selected emails off your Exchange server/account and into this folder (and into local storage), unless you do a copy & paste instead of drag & drop. You may need to have the setting for downloading emails for offline access enabled for this to work as desired. If I recall, the new integrated backup feature creates a compressed (.zip) archive from which you can later restore the email (I haven't tried that just yet).
View 5 Replies
Feb 24, 2010
I am a little confused about the distinction between backing up, compressing, and archiving data. Help me figure out how each of these is useful.
View 2 Replies
Jan 28, 2014
I would like to create a full system backup to an ISO/IMG file. I've been searching and found mondorescue.org, but something is wrong with the package for Debian 6.
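For reference, a hedged sketch of the usual mondoarchive invocation once the packaging issue is sorted out; the flags are from memory of the mondorescue documentation, so verify them against mondoarchive(8) before relying on this:
Code:
# -O backup, -i write ISO images, -d output directory, -E exclude paths
mondoarchive -Oi -d /var/cache/mondo -E "/var/cache/mondo /tmp"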
View 14 Replies
Apr 28, 2009
How do I get a full kernel dump?
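A hedged sketch of the common kdump route on Red Hat style systems; package names and the crashkernel= syntax vary by distro and kernel version, so treat this as a starting point rather than a recipe:
Code:
# 1. reserve memory for the crash kernel on the kernel command line:
#      crashkernel=128M
# 2. enable the kdump service so a panic writes out a vmcore
chkconfig kdump on
service kdump start
# 3. after a crash, the full dump appears under /var/crash/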
View 4 Replies
Oct 7, 2009
I want to generate core dump files from my program when it crashes. It's a pretty big process and has about 10-11 threads in it. I have followed the documentation to enable core dumps by setting ulimit to unlimited, etc. I quickly tried "A demo program creating a core dump" from the webpage I was following, which succeeds in segfaulting and dumping a core file in the directory that I configured. However, when I ran my original program and caused it to crash (by making calls to kill(), raise(), or the same null pointer access as shown in the webpage above), in each case my program crashed but did not generate a core dump file. Am I missing something? My program is in C++ and my environment is Red Hat 9.0 (kernel 2.4.20).
Going through the "Why do I NOT get a core dump?" section on the same webpage, I can see two potential problems. One, there are issues with suid/sgid (bullet #6); I am not able to change any settings there because my system contains neither /proc/sys/fs/suid_dumpable nor /proc/sys/kernel/suid_dumpable. Two, my program has threads in it, and bullet #8 is the problem.
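A hedged sanity check worth running from the very shell that launches the program, since ulimit settings do not propagate from other sessions; note also that with LinuxThreads on a 2.4 kernel each thread is a separate process, so the core may land in the crashing thread's working directory rather than where you expect:
Code:
ulimit -c unlimited                       # must be set in the launching shell
echo 1 > /proc/sys/kernel/core_uses_pid   # 2.4 supports tagging cores with the PID
./myprogram                               # assumption: your crashing binary
find . -name 'core*'                      # look wherever the threads were running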
View 1 Replies
Sep 23, 2010
I am manually backing up my server now with mysqldump, and that works, but I was wondering about mechanizing the process.
Nirvana would be to dump to a file named with the date.
That way I'd have backups going back over time.
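A minimal sketch of that; the credentials, paths, and 30-day retention are assumptions:
Code:
#!/bin/bash
BACKUP_DIR=/var/backups/mysql
FILENAME=all_databases_$(date +%F).sql.gz        # e.g. all_databases_2010-09-23.sql.gz
mysqldump --all-databases -uroot -pPASSWORD | gzip > "$BACKUP_DIR/$FILENAME"
find "$BACKUP_DIR" -name '*.sql.gz' -mtime +30 -delete   # prune dumps older than 30 days
A cron entry such as 0 2 * * * /root/bin/mysql_backup.sh would then run it nightly.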
View 9 Replies
Mar 27, 2011
I would like to have dump back up just my home directory, but I am having problems: the command I am using wants to back up everything and takes hours upon hours; it has been running for about 10 hours and only 21% is done. This is the command:
Code:
dump -0u -f dp_hd /media/CENTON USB/ /
How can I get this to back up only my home directory?
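Hedged reading of that command: the unquoted space in /media/CENTON USB makes the shell split it into two arguments, and the trailing / tells dump to dump the entire root filesystem, which is why it is grinding through everything. A sketch that quotes the output path and names /home as the thing to dump:
Code:
dump -0u -f "/media/CENTON USB/home.dump" /home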
View 7 Replies
May 19, 2009
Using tar, is it possible to back up different types of file system, e.g. ext3, ufs, or any other? I know it is not possible using dump, because dump reads through the raw device. What about tar? Where can I get more info about this? I mean, suppose I want to back up files from different file systems using tar; is that possible?
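For what it's worth, tar reads files through the kernel's VFS layer rather than the raw device, so the on-disk format does not matter to it. A sketch spanning two mounted filesystems (mount points assumed):
Code:
tar -czf /backup/mixed.tar.gz /mnt/ext3_data /mnt/ufs_data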
View 1 Replies
Mar 15, 2011
I am currently using a script to back up my Ubuntu 10.04.1 system. The MySQL databases are backed up separately from the system/data.
My problem is with the MySQL incremental/binary log backups.
The problem is that the binary log file(s) are always named xxxx-bin.1.
Up to about a month ago the binary logs were named xxxx-bin.000001, xxxx-bin.000002, etc.
I did make some changes at about the time this change in file naming occurred, but I cannot identify what setting, if any, I may have changed that causes all of the binary log files to always have the same name.
My backup script uses both mysqldump and mysqladmin flush-logs to create the binary logs.
All of the settings for mysqldump and mysqladmin are contained in the my.cnf file.
The my.cnf file contents that are relevant are as follows:
Code:
The statements in the backup script that do the backup are:
mysqladmin flush-logs
or
mysqldump | gzip > $DB_BACKUP_DIR/$ARCHIVE_FILE #Note: delete-master-logs in my.cnf
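One hedged thing to check, since the relevant my.cnf contents were elided above: the value of log-bin, because MySQL derives the numbered .000001 suffix from that basename, and an unexpected value there is a plausible reason for the naming change. A sketch of the conventional setting:
Code:
[mysqld]
log-bin = /var/log/mysql/xxxx-bin   # basename only; MySQL appends .000001, .000002, ...
expire_logs_days = 10               # optional housekeeping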
View 3 Replies
Apr 14, 2011
One of my clients needs a backup of his svn repository. I see that this is possible using the svnadmin dump command. I see where the location of the source repository is, but I don't see anything in the documentation as to where the actual dump file is located. I need to know where the dump file is so I can scp or rsync it to another server for backup.
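The documentation is silent on the location because svnadmin dump writes the dump stream to stdout; the file lives wherever you redirect it. A sketch (paths and hosts assumed):
Code:
svnadmin dump /var/svn/repos > /tmp/repos.dump
rsync -av /tmp/repos.dump backupuser@backuphost:/backups/
# or skip the intermediate file entirely:
svnadmin dump /var/svn/repos | ssh backuphost 'cat > /backups/repos.dump'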
View 5 Replies
Aug 8, 2010
I've got a CentOS 5 machine running with a RAID 1 SSD hard drive combo. As I don't know how, or even if, it's possible yet to wipe the free disk space clean, I need to be careful not to fill all the free disk space too quickly, and I was wondering if it is possible to pipe the result of a mysqldump to an FTP client, writing it only to RAM and never to the local disk.
I've done a bit of research on the subject and have found the two following commands:
Code: mysqldump <mysqldump options> | gzip > outputfile.sql.gz
Code: tar cf - / | ncftpput -c sonic.sega.co.jp /usr/local/backup.tar
I would like to combine the two to make something like this:
Code: mysqldump mysqldump_options > | ncftpput ncftpput_options -c SERVER_IP backup.sql
I haven't actually tried my code, as it seems too easy and I'm sure I've got something wrong! If this command is even correct, will it prevent the SQL file from being written to the hard drive on my local machine?
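Hedged reading: the stray > in the attempt above is the main problem; drop it and the pipeline never touches the local disk, since with -c ncftpput reads the upload from stdin. A sketch combining both snippets (credentials assumed):
Code:
mysqldump --all-databases -uroot -pPASSWORD | gzip | ncftpput -u ftpuser -p ftppass -c SERVER_IP backup.sql.gz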
View 4 Replies
Dec 11, 2010
Does anyone have a simple-to-use bash script, or some such, that will convert MSSQL dump files to MySQL-formatted dump files?
View 6 Replies
Jan 19, 2010
I have a scheduled backup that runs on our server at work, and since 7/12/09 it has been making 592K files instead of 10MB files. In mysql-admin (the GUI tool) I have a stored connection for the user 'backup'; the user has select and lock rights on the databases being backed up. I have a backup profile called 'backup_regular', and in the third tab along it's scheduled to back up at 2 in the morning every weekday. If I look at one of the small backup files generated, I see the following:
Code:
-- MySQL Administrator dump 1.4
--
-- ------------------------------------------------------
-- Server version`
[code]....
It seems that MySQL can open and write to the file fine; it just can't dump the data.
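A dump header that stops right after the server version suggests mysqldump aborted immediately after connecting, which usually means a privilege or option error. One hedged way to see the actual message is to run the same dump by hand as the backup user (database name assumed):
Code:
mysqldump -u backup -p --databases mydb > /tmp/test_dump.sql
# any error (e.g. a missing LOCK TABLES privilege) prints to stderr here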
View 3 Replies
Jan 30, 2011
I installed MediaWiki the other day and went with the default InnoDB option. However, a week later something went wrong, and since I have scripts that back up /var/ nightly, I just copied back the backup of /var/lib/mysql/wikidb/ (as I've done with MyISAM). When I connect to the wikidb database I can see the tables (via "show tables"), but when I run any query against them (check table X, select * from X) I get:
Code:
Table 'wikidb.X' doesn't exist
I've since read that you can't just copy the database directory like with MyISAM, and there appears to be no way that I can find to restore or fix InnoDB without a dump of the data. And I never got a chance to do a mysqldump. So has anybody got any idea how I can at least view the "page" table from the files I've backed up in /var/lib/mysql/wikidb/?
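Hedged explanation: InnoDB keeps its data dictionary in the shared ibdata1 tablespace at the top of the datadir, so restoring only wikidb/ brings back the .frm files (hence SHOW TABLES works) but not the data the dictionary points at. If the nightly /var/ backup captured all of /var/lib/mysql, restoring the whole lot with mysqld stopped may recover the tables; a sketch (paths assumed):
Code:
service mysql stop
# restore the shared tablespace and logs along with the database directory
cp -a /backup/var/lib/mysql/ibdata1 /var/lib/mysql/
cp -a /backup/var/lib/mysql/ib_logfile0 /backup/var/lib/mysql/ib_logfile1 /var/lib/mysql/
cp -a /backup/var/lib/mysql/wikidb /var/lib/mysql/
service mysql start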
View 1 Replies
Jun 23, 2010
Which open source "file and email archiving" packages for both Linux and Windows are equivalent to Symantec's Enterprise Vault?
View 2 Replies
Aug 4, 2010
I started unarchiving a RAR file that is several gigabytes big. The computer is now going really slowly; it's almost frozen. Sometimes I can move the mouse a little, but that's it. The unarchiving process seems to have halted, so now all I can do is restart the system. I don't think I can unarchive this file in Linux.
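It may be worth one more try from a terminal with the extractor's CPU and I/O priority lowered, which usually keeps the desktop responsive. A hedged sketch (the unrar package name and ionice availability vary by distro, and the paths are assumptions):
Code:
nice -n 19 ionice -c3 unrar x bigfile.rar /tmp/extracted/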
View 5 Replies
Mar 4, 2011
I'm trying to zip or rar more than 100,000 files into a single file so that I can upload it to my server much faster than FTP transferred them individually. In total they're only 4GB, but because of the number of files, Nautilus freezes just opening the folder they're in. They're all .jpgs, all in the same folder, and I've tried a few commands but I keep getting error messages.
Does anyone have a command that will archive all the files from a folder into a single zip (or rar, tar, etc.)? I can't just archive the folder itself, because then I would have to move all the files out of it first, and just opening the folder to move them would crash it, and I don't have ssh access to that server.
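A hedged sketch from the command line, which sidesteps Nautilus entirely: tar takes the folder as a single argument, so the 100,000 names never hit the shell's argument-list limit, and find piped into zip -@ (read names from stdin) does the same when a .zip is required (paths assumed):
Code:
tar -czf photos.tar.gz /path/to/photos/
# or, for a .zip:
cd /path/to/photos && find . -name '*.jpg' -print | zip ../photos.zip -@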
View 3 Replies
Oct 6, 2010
I need to back up a folder on a remote machine to my local box; the remote HD does not have enough space to archive it, and neither does my local box. I know there's a cantrip to pipe scp through gzip (or similar), but I don't remember the syntax.
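A sketch of the usual cantrip, using ssh rather than scp (scp itself can't stream to stdout): the remote folder is read straight into a compressed tarball on the local side, with no staging space needed on either machine (host and paths assumed):
Code:
ssh user@remote 'tar -cf - /path/to/folder' | gzip > folder.tar.gz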
View 1 Replies
Mar 25, 2010
I had asked Red Hat support how to do a full system backup of a server; they said to use dd to make a full copy of the disk, which looks straightforward enough:
Code:
dd if=/dev/c0d0 of=/path/to/file/system/backup.img
However, Red Hat have said that backups and restores are not supported. I just wanted to find out whether anyone has successfully done this, and whether anyone has tried creating a clone using this method. Is it as simple as it appears, or are there any points of note?
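One hedged point of note: dd should run while the source disk's filesystems are unmounted (boot from rescue media), or the image can be inconsistent, and compressing in flight keeps it smaller. A sketch of the round trip:
Code:
# backup, from rescue media with /dev/c0d0 unmounted
dd if=/dev/c0d0 bs=4M | gzip > /path/to/backup.img.gz
# restore
gunzip -c /path/to/backup.img.gz | dd of=/dev/c0d0 bs=4M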
View 5 Replies
Feb 18, 2010
I got a backup file whose first 1000 bytes are as follows:
Code:
00000000 54 41 50 45 00 00 03 00 8c 00 0e 01 00 00 00 00 |TAPE............|
00000010 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 |................|
*
00000030 02 00 87 05 51 64 50 48 01 00 00 00 01 00 00 00 |....QdPH........|
[code]....
I tried to restore it:
Code:
$ restore -r -f zzz_labxxxx
Checksum error 32615101403, inode 0 file (null)
restore: Tape is not a dump tape
What application could I use, or even try? The man who made the backup can't be reached anymore, and we have been left with just the backup.
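A hedged guess from the header: the leading TAPE magic is the signature of Microsoft Tape Format (MTF), as written by NTBackup and Backup Exec, which would explain why restore rejects it as not a dump tape. The open source mtftar tool converts MTF streams to tar (availability varies by distro, so treat this as a sketch to verify):
Code:
mtftar < zzz_labxxxx > recovered.tar
tar -tvf recovered.tar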
View 1 Replies
Oct 1, 2010
I just moved to Ubuntu. I have been configuring and installing applications so I can finally call this OS my new home; I'm not quite there yet, but I have been doing a lot of work on it. I was originally on Windows XP. I am a backup freak: I backed up my documents, Firefox bookmarks, and other important information every 5 minutes to another drive, and every 24 hours I would have Acronis True Image back up the whole Windows partition (drive C). That way, if my hard disk died, I knew I could restore the partition from an image and lose only the last 24 hours of configuration/modifications made to the OS.
And for personal data such as bookmarks, messenger chat logs, documents, pictures, etc., I would only lose the last 5 minutes of data, since it was backed up every 5 minutes. It was very fast too.
So my questions are: 1. What's the best application to save a whole image of the Ubuntu partition so that it can be restored with minimal (hopefully only 24 hours of) data loss?
2. What's the best application to save selected directories and have it configured so that those directories are backed up periodically?
3. What's the best application to do number 2 (the above question) with versioning? That is, if it is backing up periodically, it will save file changes, so we can revert to an older version of a file.
I have tried Simple Backup, suggested by the Ubuntu site, and did not like it at all. It just said it would run in the background, and you couldn't even cancel it or monitor it!
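For questions 2 and 3, one common approach is rsync with hard-linked snapshots: every run looks like a full backup, unchanged files cost almost no space, and each snapshot is a browsable version. Tools such as rsnapshot and Back In Time package the same idea with a friendlier face. A minimal sketch (paths assumed), run from cron as often as you like:
Code:
#!/bin/bash
SRC=/home/user
DEST=/mnt/backup
NOW=$(date +%F_%H%M)
# unchanged files become hard links into the previous snapshot
rsync -a --link-dest="$DEST/latest" "$SRC/" "$DEST/$NOW/"
ln -sfn "$DEST/$NOW" "$DEST/latest"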
View 9 Replies
Jun 3, 2011
I am a backup noob. My idea of backing something up is finding a big enough flash drive and copying the necessary files over.
So I really need to learn now. I'm wiping a Vista laptop for a friend to install Windows 7. But first, I want to do a whole-drive backup in case something goes wrong. It's a 100GB drive with 50GB of data.
Is it possible that I could do this via my home network or via a direct ethernet connection? I have a desktop with a 1TB drive I could back up to. Like I say, I'm a noob so I'm open to anything.
One more thing: I'd like this backup to be in a form that I can retrieve individual files from it if necessary. If everything goes right, I'll probably want to pull My Documents out of the backup and drop it into Windows 7.
Oh, and why am I asking on UbuntuForums instead of a Windows forum? Because I'm betting I'll end up booting a live CD on the laptop to do the backup. But I'm just guessing. At any rate, I'm sure I'll use Ubuntu tools, because that's what I know.
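A hedged sketch of the live CD route: boot the laptop from Ubuntu live media, mount the Windows partition read-only, and copy it file-by-file over the network to the desktop, which keeps individual files (My Documents included) retrievable afterwards (device and host names assumed):
Code:
sudo mkdir -p /mnt/windows
sudo mount -o ro /dev/sda1 /mnt/windows        # assumption: the Vista system partition
rsync -a /mnt/windows/ user@desktop:/backups/vista-laptop/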
View 9 Replies
Aug 23, 2009
I'm trying to add a scheduled full backup to the crontab file, but the full backup never completes; it always stops somewhere in the file system. I guess it's because the OS is updating those files or has them open. I've tried the --exclude options, but it always hangs somewhere else. This is what I'm using:
Code:
tar -zcvpf /mnt/storage/backup/fullbackup1.tar.gz --exclude=/mnt --exclude=/sys --exclude=/proc --exclude=/lost+found --exclude=/net --exclude=/srv / > /mnt/storage/backup/fullbackup.log
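Two hedged suggestions: --one-file-system keeps tar out of pseudo and network filesystems with a single flag (though it also skips separately mounted filesystems such as a separate /home, which would then need listing explicitly), and dropping the v flag avoids writing a huge verbose listing from cron:
Code:
tar -zcpf /mnt/storage/backup/fullbackup1.tar.gz --one-file-system / > /mnt/storage/backup/fullbackup.log 2>&1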
View 1 Replies
Oct 21, 2009
How do I create a dump of an existing file, and how do I restore it, from the command line?
View 6 Replies
May 13, 2010
We are going through the motions of testing the backup and restore configuration of our Postgres database. One idea was testing the viability of the dump file. Does anyone know of a way to test the dump file to determine whether there is any corruption in it?
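Two hedged checks, depending on the dump format (file names assumed): for custom-format dumps, pg_restore can read the archive's table of contents without touching any database, which exercises the file's structure; for plain SQL dumps, restoring into a throwaway database is the definitive test:
Code:
# custom-format dump: list the table of contents
pg_restore --list mydb.dump > /dev/null && echo "dump is readable"
# plain SQL dump: full restore into a scratch database
createdb dump_test && psql -d dump_test -f mydb.sql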
View 2 Replies
May 5, 2010
I want to be able to recover from a disaster by simply inserting a CD of my entire system, booting from it, and reinstalling my system back to the way it was before the disaster. After much research here, I feel the need to ask this question directly, but as a new user I find it somewhat difficult to locate information.
I have seen references to all sorts of backup software. I am trying to use Simple Backup. Each time I run this utility, it gives me a process ID and then apparently vanishes. I don't see the process running in System Monitor, or anything recognizable in /var/backups. Perhaps, being as new to Linux as I am, I am simply overlooking something. I must say, though, that these are the friendliest user groups I have ever seen. It amazes me that so many people are so willing to post long, complicated solutions to someone's problems.
View 8 Replies