I would like the backup script I am writing to create an SQL dump of my database and send it straight into a tar file. Does anyone know how I could do this with one command?
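A minimal sketch of one way this can be done (database name, credentials and file names are only placeholders): dump, tar and clean up on a single command line, or skip tar entirely and compress the stream directly if a plain .sql.gz is acceptable.
Code:
mysqldump -u backup_user -p'secret' mydb > mydb.sql && tar czf mydb_backup.tar.gz mydb.sql && rm mydb.sql
mysqldump -u backup_user -p'secret' mydb | gzip > mydb_backup.sql.gz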
My cron job is executing the mysqldump command below, but it produces an empty SQL file. However, when I run it from the command line, it works as expected.
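A common cause is cron's minimal environment: the PATH and credentials differ from an interactive shell. A sketch of a crontab entry using absolute paths and capturing stderr so the real error becomes visible; the paths and option file are examples, not the poster's actual setup:
Code:
0 2 * * * /usr/bin/mysqldump --defaults-extra-file=/home/me/.my.cnf mydb > /home/me/backups/mydb.sql 2> /home/me/backups/mydb.err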
I am going crazy with a gzip file. I can decompress the file in Windows using WinRAR, but it is impossible on any UNIX operating system. The file seems to be OK. If I run file the_name_of_the_file.gz
I get: the_name_of_the_file.gz: gzip compressed data, from Unix, last modified: Sun Jan 30 14:10:21 2011
But if I do gunzip -f the_name_of_the_file.gz I always get: gzip: the_name_of_the_file.gz: unexpected end of file. The same problem happens when I try to extract the file using the GUI tool in Ubuntu or Mac OS X.
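An "unexpected end of file" usually means the file is shorter than the compressed stream says it should be, i.e. it was truncated somewhere in transit, and WinRAR is simply more tolerant of the missing trailer. A sketch for testing the archive and salvaging whatever decompresses up to the truncation point (the file name is the poster's placeholder):
Code:
gzip -t the_name_of_the_file.gz
gunzip -c the_name_of_the_file.gz > recovered_output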
I have been having a recurring problem backing up my filesystem with tar, using bzip2 compression. Once the file reached a size of 4 GB, an error message appeared saying that the file was too large (I closed the terminal so I don't have the exact message; is there a way to retrieve it?). I was under the impression that bzip2 can handle pretty much any size of file. It's rather strange: I have backed up files of about 4.5 GB before without trouble.
At the same time, I have had this problem before, and it's definitely not a space problem: I am backing up onto a 100 GB external hard drive.
That reminds me, in fact (I hadn't thought of this), that one time I tried to move an archived backup of about 4.5 GB to an external drive (it may have been the same one) and it said that the file was too large. Could it be that there is a maximum file size I can transfer to the external drive in one go? Before I forget, I have Ubuntu Karmic and my bzip2 version is 1.0.5 (and tar 1.22, though maybe this is superfluous information?).
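A hard ceiling around 4 GB usually points at the filesystem on the external drive rather than at tar or bzip2: FAT32, the usual factory format for external disks, caps individual files at 4 GB. A sketch for checking the filesystem type and, if it is FAT32, streaming the archive through split so no single file exceeds the limit (the mount point and sizes are examples):
Code:
df -T /media/external
tar cjf - /home | split -b 3900m - /media/external/backup.tar.bz2.part_
cat /media/external/backup.tar.bz2.part_* | tar xjf -    # to restore later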
I am trying to be careful in case my system crashes, and although that is highly unlikely, my first question is whether there is a way to compress my Linux partitions. After running the diskutil command in OS X's Terminal, I basically end up with this partition scheme:
Quote:
Macintosh HD = 130 GB
disk0s3 = 1 MB
disk0s4 = 30 GB
Linux Swap = 1.3 GB
I am sure there is a way in the Terminal to compress disk0s3, disk0s4, and the Linux swap, and then write the compressed partitions to my external hard drive. I have already read suggestions that only /HOME, /etc/fstab, a list of installed packages, /opt, and /var/cache/apt/archives/ (where all downloaded packages are stored) are what I should back up. Please correct me if I'm wrong, but wouldn't it take quite a while to install all those packages again after a system failure? Or would it just be easier to untar all of them into their directories once Linux has been reinstalled? The closest command I have found so far for achieving this is:
Quote:
sudo tar cvf - files | (cd target_directory ; tar xpf -)
The above is very close to what I am looking for because it lets you copy files into another location using tar; in my case the new location would be my external hard drive. My external hard drive already has its own Linux partition, which I am able to mount in Linux and which Linux sees as free space.
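If the goal is a compressed image of a whole partition rather than a file-level copy, a dd-into-gzip pipeline is one sketch (device names are examples; under Linux the partitions will show up as /dev/sdXN, not under the diskutil names listed above):
Code:
sudo dd if=/dev/sda4 bs=4M | gzip > /media/external/linux_root.img.gz
gunzip -c /media/external/linux_root.img.gz | sudo dd of=/dev/sda4 bs=4M    # to restore later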
I'm trying to figure out how to access compressed files without uncompressing them beforehand, and also without modifying the application/script I am using. Named pipes do the trick, but they only seem to work once.
In one terminal I do this:
Code:
$ echo "This is a file I'd like to be able to read." >> my_file
$ gzip my_file
$ mkfifo my_named_pipe
$ ls
my_file.gz  my_named_pipe
$ gunzip -c my_file.gz >> my_named_pipe
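A FIFO delivers EOF to the reader as soon as the writer closes its end, which is why it only works once. One sketch is simply to keep re-filling the pipe in a loop, so every new reader gets a fresh copy of the decompressed data:
Code:
while true; do gunzip -c my_file.gz > my_named_pipe; done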
When trying to create a new compressed/archive file in Gnome Commander, the file is created but the selected files are not added. I can open the new (empty) archive file and then add files to be compressed. I have tried several different formats (zip, tar.bz and others) with the same results. The "file roller" plugin is shown, but it has no configuration other than the compressed file type.
I want to create a compressed ISO image file, mount that file on one of my virtual drives, and access the content (read-only) without worrying about manual decompression/extraction. This is for both Windows and Linux (Ubuntu).
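On the Linux side, one sketch is to pack the content into a SquashFS image and loop-mount it read-only (the file and directory names are examples); Windows would need its own tooling, which isn't covered here:
Code:
mksquashfs my_data/ my_data.sqsh
sudo mkdir -p /mnt/sqsh
sudo mount -o loop -t squashfs my_data.sqsh /mnt/sqsh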
On a Linux CD/DVD there are compressed filesystem images for the live version (KDE or GNOME, for example). They have no file extension, but they are clearly image files: the compressed filesystems the live version runs from before installation.
I was wondering: how do I mount these compressed filesystem images after I copy the ISO content of the CD/DVD onto my system? I want to edit some files or packages and make some changes, for example to customize a GNOME live version. (I know you might be tempted to tell me to use KIWI etc. to do the customizing.) But I want to be able to mount the compressed filesystem image and then edit it, read and write, while it sits in a subdirectory of its own. I want to open it! Is there a way to do this? These files have no extension.
Assuming I can open this compressed filesystem image and edit it read-write before rolling it back again, and if and when I succeed, what should I watch out for? Will the same compressed image, slightly modified, still work?
PS: the same question could be rephrased or extended as: how do I use the unionfs/squashfs tools on the command line to mount these extensionless image files in read-write mode?
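These extensionless live images are usually SquashFS, which is read-only by design, so the usual cycle is unpack, edit, repack rather than editing in place. A sketch (the path casper/filesystem.squashfs is an Ubuntu-style example; other distributions keep the image elsewhere under another name):
Code:
sudo mount -o loop -t squashfs casper/filesystem.squashfs /mnt/squash    # read-only look inside
sudo unsquashfs -d edit_root casper/filesystem.squashfs                  # unpack for editing
# ... edit files under edit_root/ ...
sudo mksquashfs edit_root filesystem.squashfs.new                        # repack, then swap it into the CD tree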
I have a problem: I'm trying to make my own LiveCD, but I can't mount the compressed SquashFS filesystem. Here is my stripped-down LiveCD version, if somebody would take a look: [URL]
I have a LAMP website with a MySQL backend using the InnoDB engine for its tables. I would like to use mysqldump to take periodic dumps of the database WITHOUT having to stop the MySQL server (i.e. shutting down the website) for the duration of the backup. I can't find how to do this anywhere, not even in the MySQL documentation. A lot of mention is made of mysqlhotcopy, but that only works for MyISAM tables and is therefore of no interest/use to me. Does anyone know if (and how) I can use mysqldump to take a copy/dump of a database that is still being used? A link to the official documentation would be very useful, since I want to make sure I get this absolutely right. I am running Ubuntu 10.04 LTS.
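For InnoDB tables, mysqldump's --single-transaction option takes a consistent snapshot without locking the tables or stopping the server; it is described on the mysqldump page of the MySQL reference manual. A sketch (user and database names are placeholders):
Code:
mysqldump --single-transaction -u backup_user -p mydb > mydb.sql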
Is it possible to compress the mysqldump output into, say, db_backup.sql.tgz, and then add that to an existing archive (e.g. backup.tgz) in one command or on the fly, deleting the intermediate file to save space?
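One caveat: tar cannot append to a gzip-compressed archive, so a workable sketch keeps backup.tar uncompressed, appends an already-gzipped dump to it, and removes the intermediate file afterwards (names are examples):
Code:
mysqldump mydb | gzip > db_backup.sql.gz && tar -rf backup.tar db_backup.sql.gz && rm db_backup.sql.gz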
I have one doubt: is copying /var/lib/mysql a good alternative to mysqldump?
I use rsync to copy /var/lib/mysql for backup without dumping the database; rsync does a differential backup, copying /var/lib/mysql to /var/tmp every minute.
We are running an ad server with MySQL as the database. We back up the database every hour, and lately we noticed that ads don't get delivered at the time of the backup. The database is 1.2 GB, of which a single table is itself 1.1 GB. Googling suggests this may be due to mysqldump's table locking, and that disabling it might solve the issue, but I'm afraid of what that might break. Can anyone explain what locking the tables means and whether it will be a problem if I disable it? Any other alternative solutions/assistance (other than master/slave replication, which we cannot afford at this time) would be really helpful.
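By default mysqldump locks the tables it is dumping so the copy is internally consistent, which is what stalls ad delivery during the backup. If the big table is InnoDB (or can be converted), --single-transaction gives a consistent dump with no table locks; for MyISAM, --skip-lock-tables removes the stall, but rows changed during the dump may end up inconsistent. A sketch of both (the database name is a placeholder):
Code:
mysqldump --single-transaction mydb > hourly.sql    # InnoDB: consistent, no table locks
mysqldump --skip-lock-tables mydb > hourly.sql      # MyISAM: no locks, weaker consistency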
When I execute the command below, does mysqldump lock all tables while the backup is in progress? I want to make sure this is the appropriate type of backup command for a MyISAM database.
I use Ubuntu Linux. I tried to copy a DB and typed a mysqldump command, but the terminal shows:
The program 'mysqldump' can be found in the following packages:
* mysql-client-5.0
* mysql-client-5.1
Try: sudo apt-get install <selected package>
bash: mysqldump: command not found
How do I get mysqldump on my PC? I've tried to follow the instructions, but I get the output below:
Err http://ubuntu-ashisuto.ubuntulinux.jp jaunty/main libnet-daemon-perl 0.43-1 Could not connect to ubuntu-ashisuto.ubuntulinux.jp:80 (122.216.218.146), connection timed out
I have this script which takes a backup of the MySQL database through mysqldump. I need to send an automated email after mysqldump succeeds or fails. How can I add this feature to the script? Here's the script:
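The script itself isn't shown here, so only a generic sketch: test mysqldump's exit status and mail the result either way (the address, database and paths are placeholders):
Code:
#!/bin/sh
if mysqldump mydb > /backups/mydb.sql 2> /backups/mydb.err; then
    echo "mysqldump of mydb succeeded at $(date)" | mail -s "DB backup OK" admin@example.com
else
    mail -s "DB backup FAILED" admin@example.com < /backups/mydb.err
fi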
I have some tables that contain special characters from different languages like German, Italian, Russian, Spanish etc. They are stored and displayed correctly.
When I want to back up my DB with "mysqldump -h localhost -u root -p dbname > dbname.sql", the special characters are lost; they are neither correctly stored nor displayed in the SQL file. This means no restoration is possible.
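This is usually a character-set mismatch between the dump connection and the tables. A sketch that forces the same charset on both the dump and the restore, assuming the data is actually stored as utf8:
Code:
mysqldump -h localhost -u root -p --default-character-set=utf8 dbname > dbname.sql
mysql -h localhost -u root -p --default-character-set=utf8 dbname < dbname.sql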
I just backed up my MySQL DB using mysqldump. I ran it around 6 pm in the evening, but I noticed that the last part of the dump file says "Dump completed on 2010-01-14 11:30:01". The time was "Jan 14 19:30" when I ran mysqldump. The content is correct, but I still want to know why the dump reports completion with the wrong time. I don't have a my.cnf in /etc, so everything is at MySQL's defaults. If I'm not mistaken, the default timezone in MySQL is GMT.
My /etc/sysconfig/clock is ZONE="Asia/Manila" UTC=true ARC=false
When I do "select now();" on mysql shell, it was the same with my system. I want to prove my backup was right but this thing confuse me a bit.
I noticed in phpMyAdmin that we can export a database as UPDATE statements; indeed we get UPDATE ... in place of INSERT .... Does anyone know if this is possible from the command line or with simple PHP code?
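As far as I know mysqldump has no UPDATE export mode; the closest command-line equivalent (in newer versions) is --replace, which writes REPLACE INTO statements instead of INSERT, replacing existing rows by key rather than updating individual columns. A sketch, with --no-create-info so only the data rows are emitted:
Code:
mysqldump --replace --no-create-info mydb > mydb_replace.sql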
I have a 1 GB mysqldump file. When I try to restore this dump file, it gives the error "ERROR 1153 (08S01) at line 289: Got a packet bigger than 'max_allowed_packet' bytes", and I am not able to restore the database. The mysqld version is 5.0.77.
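The limit that matters here is the server's max_allowed_packet; raising it (and optionally the client's) before replaying the dump is the usual fix. A sketch, assuming you have privileges to change the global variable or edit my.cnf (the 256M figure is just an example):
Code:
mysql -u root -p -e "SET GLOBAL max_allowed_packet = 268435456;"
mysql --max_allowed_packet=256M -u root -p mydb < dump.sql
# or make it permanent in /etc/my.cnf under [mysqld]:  max_allowed_packet = 256M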
I am trying to create an Ubuntu install disk from a downloaded ISO. Every time I have tried (3 or 4 times now) to download the ISO, I get a compressed folder with a zillion files (including the Wubi installer). However, there is no actual ISO in the compressed folder anywhere. I have tried downloading it from the direct link as a Windows-handled download, and I have downloaded it using BitTorrent.
I am trying to download and burn the ISO on a Windows XP box. Could my problem be that my browser is auto-detected and I get the Wubi installer download instead? Is WinRAR messing with me? Or am I just missing something: when I launch Infra Recorder it can't see any ISOs anywhere in the decompressed Ubuntu download, so is there some magic way of burning the image that I just don't get?
I just want to install Ubuntu on an older box and play with it. Then I am thinking I might reformat my laptop and make it dual boot with Ubuntu as the default. But I can't even seem to figure out how to burn an install disk.
I tried to compress a PDF document by opening it in vim and holding down D for a few seconds. That seemed to work, but now the document won't open any more. How do I decompress it?