Ubuntu Servers :: Gzip And Tar Hang Around And Don't Die Locks Up Backup Drive?
Apr 20, 2011
So this is really the result of another problem. There seems to be an issue with the CPU spiking to 99% forever (until reboot) if I run apt-get, Synaptic, or Update Manager while an external USB drive is plugged in. Note, other USB peripherals are no problem, just an external HD.
So my workaround was to eject the drive when doing apt-get or other installation work, then reattach it to remount. Now, on to the present problem. I'm using the basic backup script (the rotating one) found in the Ubuntu Server manual. It uses tar and gzip to store a compressed version of the desired directories on my external USB drive (which sits in a fireproof safe - this is for a business).
However, it seems tar and gzip, which run nightly 6 days a week via cron as root, don't ever want to die, and they don't release the drive. I have to reboot the system (I can't log off) to release the drive and unplug it; then I can do update/install work.
Of course, if apt etc. would work fine without conflicts with the external device, I wouldn't care about the tar/gzip problem, other than that it generally isn't a proper way for them to function and it chews up some CPU cycles (they run at about 0.6 and 1.7 percent respectively). I also can't kill them via kill or killall. They seem undead.
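One quick diagnostic (a sketch on my part, not something from the original post): processes stuck in uninterruptible I/O sleep on a dead USB path show state D in ps, and no signal, not even kill -9, can reach them until the I/O completes or the device comes back, which matches the "undead" behaviour described above.

Code:
# Show the state (STAT) and kernel wait channel (WCHAN) of the stuck processes;
# a "D" in STAT means uninterruptible sleep, usually blocked on device I/O.
ps -eo pid,stat,wchan:20,cmd | grep -E '[t]ar|[g]zip'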
BACKUP_DIRS="/etc /boot /root /home"
BACKUP_FOLDER="/tmp/system_backup/"
for DIR in ${BACKUP_DIRS}
do
[code]....
All the folders get dumped into separate gzip files. Now I want to combine all the gzip files in the backup folder into one final gzip or bzip2 file. My goal is to get one file instead of many so I can scp or ftp that one file to another file share; it's easier to send one file than a bunch of files.
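One way to do that (a rough sketch; it assumes the per-directory archives live in /tmp/system_backup and end in .tgz, and user@fileserver is a placeholder, so adjust names and paths to match your script):

Code:
# Bundle the already-compressed archives into a single tar file for transfer.
# They are gzipped already, so plain tar (no -z) is enough here.
cd /tmp/system_backup
tar -cf full_backup_$(date +%Y%m%d).tar *.tgz
scp full_backup_$(date +%Y%m%d).tar user@fileserver:/backups/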
I am using software RAID in Ubuntu Server Edition 9.10 to mirror (RAID 1) two 1 TB hard drives. These are used for data storage and websites. I also have an 80 GB hard drive for the operating system. This drive has no backup or RAID at all. Should this drive crash and the system therefore become unbootable, will I be able to recover the data on the 1 TB drives, or should I back up the 80 GB drive as well?
I have limited access to several servers (key-based auth), but the cron facility is not available to me. Those servers are getting filled up by large Apache logs, and I have to log in to each node manually and clean them every day.
I tried to write a script to run from the login box, but when I try that it looks like it is looking for logs on the local server (the login box).
So the current situation is:
How can I modify this so that the script on server1 will look for files on that server and zip them?
Google showed another command called rsh, but in my environment it is also not available.
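Since key-based SSH works but cron and rsh don't, one option (a sketch; the host names and the Apache log path are placeholders, not taken from your environment) is to keep the script on the login box but run the actual commands remotely through ssh, so they act on each server's own logs:

Code:
#!/bin/bash
# Compress rotated Apache logs on each remote server rather than locally:
# everything inside the quotes is executed on $HOST, not on the login box.
for HOST in server1 server2 server3; do
    ssh "$HOST" 'find /var/log/apache2 -name "*.log.1" -exec gzip {} \;'
done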
I have an interesting issue with Ubuntu Server 8.04. The server has been running for quite a while (not designed or put together by me), but recently it has started segfaulting and now will not boot except in read-only mode. I see the following errors in dmesg.
I'm trying to dump a mysql database on a small web server without killing performance. I tried using the nice command to give the mysqldump and gzip a low priority, but gzip is still taking up 100% CPU. Pages on the web server are loading incredibly slow. Here's my command:
Code:
nice -n 19 mysqldump -u USER -pPASSWORD DATABASE | nice -n 19 gzip -9 > OUTFILE.sql.gz

How do I get gzip to run without taking up 100% CPU? I've attached a screenshot of top about 8 seconds into the dump.
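nice only helps when something else is competing for the CPU; on an otherwise idle core, gzip will still show 100% even at the lowest priority. What usually hurts page loads is disk contention and the cost of maximum compression, so one thing to try (a sketch, not a guaranteed fix; --single-transaction assumes InnoDB tables) is the idle I/O class plus gzip's fastest level:

Code:
# ionice -c3 puts the dump's disk reads in the idle class so the web server's
# I/O wins, and gzip -1 trades a larger file for far less CPU than -9.
ionice -c3 nice -n 19 mysqldump --single-transaction -u USER -pPASSWORD DATABASE \
  | nice -n 19 gzip -1 > OUTFILE.sql.gz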
I have a scheduled backup to run on our server at work, and since 7/12/09 it has been making 592 KB files instead of 10 MB files. In mysql-admin (the GUI tool) I have a stored connection for the user 'backup'; the user has SELECT and LOCK rights on the databases being backed up. I have a backup profile called 'backup_regular', and in the third tab along it is scheduled to back up at 2 in the morning every weekday. If I look at one of the small backup files generated, I see the following:
Code:
-- MySQL Administrator dump 1.4
--
-- ------------------------------------------------------
-- Server version
[code]....
It seems that MySQL can open and write to the file fine; it just can't dump.
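One way to see what the scheduled job is actually hitting (a sketch; it assumes the backup user connects from localhost, and YOUR_DATABASE is a placeholder) is to check the account's grants and run the same dump by hand, where any permission error is printed instead of silently truncating the output:

Code:
# The backup account needs at least SELECT and LOCK TABLES on the databases
# it dumps; SHOW GRANTS reveals what it really has.
mysql -u root -p -e "SHOW GRANTS FOR 'backup'@'localhost';"

# Run the dump manually with the same credentials and watch for errors.
mysqldump -u backup -p --databases YOUR_DATABASE > /tmp/test_dump.sql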
I'm just setting up a partition on a separate HDD in my system. I plan to use the partition to back up the important files on my main HDD (to guard against a HD crash).
The question I have is about the typical location to auto-mount this partition. Which would it be normal to go for:
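There's no single required location; /mnt/backup and /media/backup are both common choices. A sketch of an fstab entry (the UUID is a placeholder for the one blkid prints, and ext4 is an assumption about how the partition is formatted):

Code:
# Find the partition's UUID first with: sudo blkid
# Then mount it at a fixed point such as /mnt/backup:
UUID=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx  /mnt/backup  ext4  defaults  0  2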
I've just installed 10.04 as a dual boot with Windows 7. I did a clean install, removing my old 9.10 install. I really like all the changes I've seen so far, and everything seems to work smoothly, except that when the HD is being used a lot everything freezes and then un-freezes, again and again, until the file operation has stopped. The problem really only comes up with heavy operations like moving large files (I moved a 10 GB vdi from my Windows partition to Ubuntu) and when backintime does its daily backup. Sometimes I can continue using it, then it'll freeze, I wait, I can use it again for a bit, then it will freeze again. I've never had problems transferring large files on the Windows 7 install, so I don't think there's a hardware problem. I can't seem to find anything; I've been searching for two days now. I did find something about a problem with backintime/ext4 partitions, but the solution hasn't helped.
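Not a confirmed fix for this machine, but one mitigation often suggested for stalls during large copies is lowering the kernel's dirty-page thresholds so writes are flushed in smaller bursts; a sketch:

Code:
# Flush dirty pages earlier and in smaller batches so one huge copy doesn't
# stall the whole desktop; to keep the settings across reboots, add the same
# two values to /etc/sysctl.conf.
sudo sysctl vm.dirty_background_ratio=5 vm.dirty_ratio=10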
Does anyone know of any decent enterprise level backup solutions for Linux? I need to backup a few servers and a bunch of desktops onto one backup server. Using rsync/tar.gz won't cut it. I need like bi-monthly full HDD backups, and things such as that, with a nice GUI interface to add/remove systems from the backup list. I need basically something similar to CommVault or Veritas. Veritas I've used before but it has its issues, such as leaving 30GB cache files. CommVault, I have no idea how much it is, and if it supports backing up to a hard drive rather than tape.
I am currently running LTSP on Ubuntu 10.04. It is a dual gigabit NIC setup with 16 GB of RAM and dual AMD quad-core 2.4s. I installed all the latest updates as well as likewise-open so we can use AD authentication. When testing the configuration in the lab, I boot 32 clients that successfully reach the login screen. Here comes the interesting part: I can log in, using all unique AD accounts, on up to 22 clients. When I attempt to log in on the 23rd client, LTSP hangs. If I restart a client it will retrieve an IP address, but TFTP will eventually time out. All of the clients are connected to a Gigabit switch along with the server, so network speeds are not an issue. When I run System Monitor it only shows 3.6 GB of memory in use and the processors are all under 10% utilization. I have been beating my head against this issue for 2 days.
The culprit is a Toshiba Samsung Storage Technology 222A CD/DVD RW drive, aka SH-S222A, sold under the Samsung brand. It occasionally locks up my entire system when I'm ripping an audio CD to a *.toc/*bin image with Brasero. I've had this happen with several different audio CDs, while ripping discs using both Imgburn under Windows XP and Brasero under Ubuntu 10.04. The display freezes completely, the mouse cursor doesn't move, and I can't get any response to keyboard input. Any sounds playing loop endlessly, repeating the last 0.5 sec or so before the cursor freeze. Meanwhile, the hard drive activity light stays on, but the optical drive light does not. I've let the system sit this way for several minutes, with no sign of change. To recover, I must press the system reset button.
I ripped the same discs without incident, using my other optical drive. It is a different brand, Lite-On, but otherwise similar to the Samsung drive: PATA interface, CD/DVD RW, etc. Anyone else have the Samsung SH-S222A? I'm wondering whether there is a bug in the drive's firmware, or I just have a defective drive. It works for other things. I can play audio CDs, access CD-ROMs, and rip audio CDs to individual tracks (rather than a disc image). I can also rip DVDs. Is there some way to recover my system when it locks up from drive misbehavior? I haven't found a way so far. I'm surprised that Ubuntu can be incapacitated so easily.
The SH-S222A has the most recent firmware revision, SB01. I tried to install the newer ID01 firmware from Samsung's website, but got a message that the installer couldn't find a "suitable" drive. I take that to mean that the ID01 firmware is meant for a slightly different variant of the -S222A, perhaps one only sold overseas. Yep, that's pretty much it. My drive's customer code is BEBE. Firmware ID01 is for drives with a different customer code.
I hate it when the optical drive is locked even though I'm not burning anything to the disc! I just want to remove the disc by pressing the eject button on the drive, but this isn't working because Linux locks it. I know I could eject it via the Device Notifier, but I want to be able to eject it at any time by pressing the eject button on the drive.
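If your version of the eject utility supports it (check man eject; this is an assumption about your setup, not something from the post), the hardware-button lock can be switched off from a terminal:

Code:
# Let the drive's physical eject button work even while a disc is mounted;
# /dev/sr0 is an assumption, your device node may differ.
eject -i off /dev/sr0
# The command-line equivalent of the Device Notifier's eject:
eject /dev/sr0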
I was having this issue with my server when I tried upgrading (fresh install) to 10.04 from 9.04. But to test it out after going back to 9.04 I installed 10.04 server in a virtual machine and found the same issues. I was using the AMD64 version for everything. In the virtual machine I chose openssh and samba server in the initial configuration. After the install I ran a dist-upgrade and installed mdadm. I then created an fd partition on 3 virtual disks and created a RAID5 array using the following command:
This is the same command I ran on my physical RAID5 quite a while ago which has been working fine ever since. After running a --detail --scan >> mdadm.conf (and changing the metadata from 00.90 to 0.90) I rebooted and found that the array was running with only one drive which was marked as a spare. On the physical server I kept having this issue with one drive which would always be left out when I assembled the array and would work fine after resyncing until I rebooted. After I rebooted the array would show the remaining 6 drives (of 7) as spares.
I updated mdadm to 3.1.1 using a deb from debian experimental and the RAID was working fine afterward. But then the boot problems started again. As soon as I added /dev/md0 to the fstab the system would hang on boot displaying the following before hanging:
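One thing worth checking after any change to /etc/mdadm/mdadm.conf (a sketch of the usual follow-up, not a diagnosis of this specific hang) is that the initramfs carries the same array definition, since that is what assembles md0 before fstab is processed:

Code:
# Record the current array definition and rebuild the initramfs so the array
# is assembled the same way at boot as it is now.
sudo sh -c 'mdadm --detail --scan >> /etc/mdadm/mdadm.conf'
sudo update-initramfs -u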
I added another disk to the server and created a mount point in fstab: /dev/VolGroup00/LogVol00 /opt ext3 defaults 1 2. Everything works perfectly... halt, boot, the system... but when I want to reboot with the command sudo reboot, it hangs at the end of all the initialization while rebooting, showing some number. If I remove the disk from fstab, then the reboot works.
So basically what I am trying to do is install Ubuntu 10.04 on a Dell PowerEdge 2600 server. Right now I don't care whether I get the server version or the desktop version, I just need to get it on there. This is what I have run into: when installing the regular Ubuntu 10.04 desktop for 32-bit, the server just locks up on the very first Ubuntu install screen and the keyboard lights start blinking non-stop. When I use the server version, the server gets to Detecting Network Hardware and locks up at 94%, and the keyboard also starts blinking non-stop. I have looked all over for an answer to this problem and have yet to find one. This guy figured out how to make it work, but he never mentioned what version he was using. [URL]..
On this server I have 6 HDs; 5 of them are RAID 0 with 1 hot spare. I have also tried using ubuntu-10.04.1-alternate-i386.iso.torrent and still had the same problem. I have spent the last 3 days searching all over Google and the forum for an answer but had no luck.
I currently have a group of 3 servers connected to a local network. One is a web server, one is a mysql server, the other used for a specific function on my site (calculation of soccer matches!).
Anyway, I have been working on the site a lot lately but it is tedious connecting my USB hard drive to each computer and copying the files. This means I am not backing up as often as I should...
I have a laptop connected to this same network that I use for development, so I can SSH into the computers. Is there any software for Ubuntu that can take backups of files that I choose on multiple computers? I know I could rsync, but is there something with more of a GUI?
Then every 2 days I can just move the most recent backup from my laptop to the USB drive. Then I will have the backup stored in 2 places if things go kaboom somewhere.
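Even if you end up with a GUI tool, an rsync pull over SSH from the laptop is a common baseline; a minimal sketch (the host names and paths are placeholders, not your real ones):

Code:
#!/bin/bash
# Pull the chosen directories from each server into per-host folders on the
# laptop; -a preserves permissions and timestamps, -z compresses in transit.
DEST="$HOME/backups"
for HOST in webserver mysqlserver calcserver; do
    rsync -az "$HOST:/var/www/" "$DEST/$HOST/"
done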
I'm running Xubuntu 9.10 on an older machine. I've got a flash drive (called "TF_FLASH") plugged into a USB hub. I am using simplebackup to backup my documents (I'm writing my thesis and that is really the only important thing on this computer).
The problem I am having is this: simplebackup will run and backup my files once or twice (I have it set up to go overnight). After that, though, the name of the flash drive changes (from "TF_FLASH" to "TF_FLASH_" - note the extra underscore at the end). So, simplebackup cannot find the drive. If I go into the settings of simplebackup and change the backup destination to "TF_FLASH_" it will work once or twice, but the drive will change to "TF_FLASH__" - again an additional underscore.
I don't know if the name change is being caused by Xubuntu, simplebackup, or some other method. The USB hub is a cheap one, but I don't think that's the problem (my mouse is plugged in and continues to function, etc.).
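One common cause (an assumption, not confirmed from your setup) is that the drive wasn't unmounted cleanly, so the old /media/TF_FLASH directory is left behind and the next mount gets a suffixed name. A sketch of checking for and clearing the stale mount point:

Code:
# A leftover, empty, unmounted TF_FLASH directory under /media forces the
# next mount to become TF_FLASH_; remove it only if it is not a mount point.
ls -l /media
mountpoint -q /media/TF_FLASH || sudo rmdir /media/TF_FLASH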
Is there a way to create an APTonCD backup on a pen drive? Meaning: when I insert a CD created with APTonCD, an autorun dialog box appears asking if I want to start the package manager. But when I copy those files to a pen drive, the above-mentioned autorun feature doesn't work. In short, I want my apt cache backup on a pen drive with the autorun feature.
I just got a removable hard drive! Now I need to format it because I need it to hold files larger than 4 GB. What format should I use? I'll only need compatibility with other Linux computers. So should I use NTFS, BTRFS, or EXT4? And how would I take ownership of the backup drive's file system rather than leaving it under root's permissions?
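For Linux-only use, ext3 or ext4 both handle files well over 4 GB. A sketch of formatting and taking ownership (it assumes the drive shows up as /dev/sdb1; confirm with sudo fdisk -l before running anything destructive):

Code:
# Format the partition as ext4 with a label, mount it, then hand the top of
# the filesystem to your own user instead of root.
sudo mkfs.ext4 -L backup /dev/sdb1
sudo mkdir -p /mnt/backup
sudo mount /dev/sdb1 /mnt/backup
sudo chown $USER:$USER /mnt/backup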
I have a fresh install of Ubuntu 10.04 and have configured it the way I like. Is there a way I can make a restore image to use for backups? I know there's software like Acronis for Windows to make bootable images, can you do something similar in Linux?
I have spent considerable time installing and getting my Ubuntu 10.04 LTS (Lucid Lynx) to where I want it. I am looking for something that would BackUp my entire hard drive to a CD/DVD (preferably bootable) --so-- if I crash -or- want to clone to another hard drive I would have the ability to a 'Restore' of the CD/DVD and simply be able to load the CD/DVD to an old / new hard drive and be back-in-business without a lot of hassle.
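Bootable imaging tools like Clonezilla cover this; the bare-bones equivalent (a sketch only, with the device and destination as assumptions) is a raw dd image taken from a live CD so the source filesystem isn't mounted while it's copied:

Code:
# From a live CD session: image the whole disk to an external drive.
# /dev/sda is the disk to back up and /media/external the destination,
# both assumptions; double-check with: sudo fdisk -l
sudo dd if=/dev/sda of=/media/external/lucid-full.img bs=4M
# Restoring later is the same command with if= and of= swapped.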
How do I back up from an old MySQL installation on a hard drive? I have this HD which used to have an Ubuntu 7.04 install... However, the system crashed and it is not possible to boot anymore. I have moved the HD to another computer and made a copy of every important file and folder... But I am having trouble backing up MySQL... Copying /etc/lib/mysql doesn't work, probably because it is from an older version of MySQL than the one I use on Ubuntu 10.10.
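The data directory is normally /var/lib/mysql rather than anything under /etc. One recovery path (a sketch under several assumptions: the old datadir was copied intact, /path/to/copied/mysql is a placeholder, and the version jump is one MySQL can upgrade in place) is to drop the old datadir into the 10.10 install, run mysql_upgrade, and then take a proper mysqldump:

Code:
# Swap the copied datadir into place (keeping the fresh one aside), fix
# ownership, upgrade the old tables, then dump everything to portable SQL.
sudo service mysql stop
sudo mv /var/lib/mysql /var/lib/mysql.fresh
sudo cp -a /path/to/copied/mysql /var/lib/mysql
sudo chown -R mysql:mysql /var/lib/mysql
sudo service mysql start
sudo mysql_upgrade -u root -p
mysqldump -u root -p --all-databases > old_server_dump.sql

Note that after the swap the MySQL root password is the old server's, not the fresh install's.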
I am a backup noob. My idea of backing something up is finding a big enough flash drive and copying the necessary files over.
So I really need to learn now. I'm wiping a Vista laptop for a friend to install Windows 7. But first, I want to do a whole-drive backup in case something goes wrong. It's a 100GB drive with 50GB of data.
Is it possible that I could do this via my home network or via a direct ethernet connection? I have a desktop with a 1TB drive I could back up to. Like I say, I'm a noob so I'm open to anything.
One more thing: I'd like this backup to be in a form that I can retrieve individual files from it if necessary. If everything goes right, I'll probably want to pull My Documents out of the backup and drop it into Windows 7.
Oh, and why am I asking on UbuntuForums instead of a Windows forum? Because I'm betting I'll end up booting a live CD on the laptop to do the backup. But I'm just guessing. At any rate, I'm sure I'll use Ubuntu tools, because that's what I know.
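One approach that keeps individual files retrievable (a sketch; the device name, user, and desktop path are placeholders, and it assumes the desktop is running an SSH server) is to boot the laptop from the live CD, mount the Windows partition read-only, and rsync it to the desktop over the network:

Code:
# On the laptop, booted from the Ubuntu live CD:
sudo mkdir -p /mnt/windows
sudo mount -o ro /dev/sda1 /mnt/windows        # the Vista partition, read-only
rsync -avh /mnt/windows/ user@desktop:/backups/vista-laptop/

Because this is a file-level copy, pulling My Documents back out later is just a matter of copying that folder from the backup directory.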
I just got a 1 TB external USB hard drive to back up my comedy shows. On my smaller USB 'pen' drives, I set the file system to ext2 (occasional reads/writes), but should I do the same for this bigger and more frequently accessed drive (daily reads/occasional writes), or should I go with ext3 for the journaling?
Also, regarding security, I was thinking about making the drive writable only by root, so that when I mount the drive as a normal user, which will be for a few hours daily, if someone does get onto my system they couldn't write to the drive from my user account. That should just be a simple case of setting the device to 755 (and owner=root), should it not?
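Pretty much, yes; a sketch assuming the drive is mounted at /media/backup:

Code:
# Root owns the top of the filesystem; everyone else can read and traverse
# but not write, so your normal user account can't modify the backups.
sudo chown root:root /media/backup
sudo chmod 755 /media/backup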
I've been working at this for the past 2 days now. My computer got some kind of virus or something that has caused it to loop at startup and continually reset. I run an XP OS on a Gateway. I desperately need to back up my files, because the person who had my backup absent-mindedly deleted my stuff. I was able to boot up using an Ubuntu disc and I'm in it right now; I've found my files, and I have an external hard drive. The problem: first, it won't let me paste onto the hard drive. If I drag, it says "Error while copying to "/media/FreeAgent Drive". You do not have permissions to write to this folder." I've mounted the external drive; nothing changes.
I've gone into Properties, and it says under Permissions that the owner is root, folder access is "Access files", and at the bottom it says "You are not the owner, so you can't change these permissions." The drop-downs where I need to change permissions are greyed out, so I can't change them. So next, I tried "gksu nautilus", went to the drive through there, and it let me use the drop-down selection under Permissions. I tried to change the folder access and I got this message: "The permissions could not be changed. Couldn't change the permissions of "FreeAgent Drive" because it is on a read-only disk." So I tried changing the file access to Read and Write. It didn't give an error, so I thought perhaps it finally worked. I hit Apply and tried to put my files in. Once again I got the message from before that said I didn't have permission. I tried to change the owner so it was no longer root and I got "The owner could not be changed. Couldn't change the owner of "FreeAgent Drive" because it is on a read-only disk." I'm getting so frustrated right now. These files are VERY important to me! The hard drive I have is a Seagate FreeAgent Desk 500GB.
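If the FreeAgent drive is NTFS (likely for a drive prepared for Windows, but an assumption here), the read-only behaviour usually means the volume is marked dirty from an unclean Windows shutdown. A sketch, assuming the partition is /dev/sdb1 (check with sudo fdisk -l) and that ntfsprogs is installed (sudo apt-get install ntfsprogs):

Code:
# Unmount, clear the NTFS dirty flag, then remount read-write with ntfs-3g.
sudo umount "/media/FreeAgent Drive"
sudo ntfsfix /dev/sdb1
sudo mkdir -p /mnt/freeagent
sudo mount -t ntfs-3g /dev/sdb1 /mnt/freeagent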
I was busy making backups to my external hard drive just now, but Ubuntu crashed 10 minutes into the backups. After rebooting, the affected folders are now read-only, and I cannot add or remove anything. This is extremely annoying; I already threw away two USB flash disks because the same thing happened to me in the past. I don't want to throw away the external drive because it's far more expensive and packed with backups.
Symptoms: I can write or delete a file on the hard drive, in any folder, except the folders that were being accessed when the computer crashed. I have tried to change permissions, but I get an error. I tried opening a terminal and sudo rm -r on that folder, but I get an input/output error. I'm running Karmic. The backups were made by Back In Time.
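Input/output errors on specific folders after a crash usually point to filesystem damage rather than a dead drive. A sketch, assuming the external drive is ext3/ext4 and shows up as /dev/sdb1 (confirm the device with sudo fdisk -l, and unmount it first):

Code:
# Check and repair the filesystem; -f forces a full check even if the
# filesystem claims to be clean.
sudo umount /dev/sdb1
sudo fsck -f /dev/sdb1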
Recently, Ubuntu was doing a standard update. It got stuck in some kind of strange loop, so I put the boot disk in, cleared the master drive, and reinstalled Ubuntu 10.04. I have a backup 500 GB drive that I used to keep the contents of my important information for my file server. After the install completed, I found the backup drive still named "FILESERVER" and it still has my folders: our pictures, our music, and our video. I opened them up and they're all empty. Am I missing some information? I swear I didn't format the drive. I couldn't have, since the folders are still there. Where are all my files?
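One quick check (the mount path is an assumption) is whether the space on the drive is still in use, which would mean the data exists but isn't where the folders are being browsed, for example because the folders were only mount points or the files ended up in lost+found:

Code:
# Compare the space used on the backup drive with what the visible folders hold.
df -h /media/FILESERVER
sudo du -sh /media/FILESERVER/*
sudo ls /media/FILESERVER/lost+found | head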