Software :: Use Wget To Retrieve Some Data From Tape Backup Utility?
Sep 30, 2010
I'm trying to use wget to retrieve some data from our tape backup utility (HP Command View 1/8 G2 Autoloader). The URL requires two parameters for the info I want to retrieve. I have searched for a few hours and have tried numerous combinations to get the data but the parameters aren't being executed. I have escaped the URL as well.
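In case it helps: the usual culprit with two-parameter URLs is the shell splitting the command at & before wget ever sees the second parameter, so quoting the whole URL often fixes it. A minimal sketch, with a hypothetical host, path, and parameter names:

Code:
# Quote the URL so the shell does not treat & as "run in background"
wget -O status.xml "http://autoloader.example.com/inventory.cgi?frame=status&item=drive"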
Currently I have Bacula backing up the contents of my server on a daily basis (Mon-Fri) onto tape. There are nine backup tapes in total so far: one for each day of the week (Mon-Thurs) and one for every Friday in the month (so there are five Friday tapes). The tapes are all from the same pool, and once the tapes are full they should get recycled. As I understand it, and from what I have seen, the data from my server backs up to the tape, which is then marked with Append status. Every time that tape is used, data is appended onto it until the tape is full. At that point it is marked Full, but it won't be recycled until all the tapes in the pool are marked Full (so until all nine tapes are full).
My question is: is there a way of overwriting the data on the tape on the next backup, rather than appending, or of not requiring that all tapes in a pool be full before a tape can be recycled? As it stands, the Mon-Thurs tapes can't be recycled until all the Friday tapes have been used. The reason I don't have separate pools for the Friday backups and the other days of the week, or even separate pools for each day of the week, is that if a tape is accidentally not changed, or someone puts in the incorrect tape by accident, a backup will still occur. Which is better than no backup!
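For what it's worth, Bacula's Pool resource has directives aimed at exactly this. A hedged sketch for bacula-dir.conf (the pool name and retention period are assumptions to adjust; check the manual's description of each directive before relying on it, since pruning discards job records):

Code:
Pool {
  Name = DailyPool
  Pool Type = Backup
  Recycle = yes
  AutoPrune = yes
  Volume Retention = 6 days     # jobs older than this may be pruned
  Recycle Current Volume = yes  # reuse the mounted tape once its jobs expire
}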
I am running Ubuntu 10.04, and I recently purchased a Tandberg LTO-4 SAS tape drive. I want to access it and back up data onto it. Do I simply plug it into the server and then I should be able to back up/transfer data to the tape drive? Or are there intermediate steps before I can do that? Here are some results from commands that I have typed:
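Broadly, yes: once the SAS HBA is seen, the kernel should expose the drive as /dev/st0 (and the non-rewinding /dev/nst0). A hedged sanity-check sequence, with device names and paths as assumptions:

Code:
dmesg | grep -i -e tape -e st0    # did the kernel attach the st driver?
cat /proc/scsi/scsi               # the drive should be listed here
mt -f /dev/st0 status             # talk to the drive
tar -cvf /dev/st0 /home/me/data   # write a test archive to the tape
tar -tvf /dev/st0                 # then list it back (st0 rewinds on close)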
I am looking at getting a DLT drive for my network; however, I have never used the tar command with a tape drive. What happens if the data is larger than one tape? Does tar automatically span tapes, or do I need to use switches so it spans multiple tapes? Right now my full backup will take 2 or 3 tapes.
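GNU tar does not span tapes by default; the --multi-volume (-M) switch enables it, and tar then prompts for the next tape at end of media. A hedged sketch (the device and path are assumptions):

Code:
# -M / --multi-volume: prompt for a new tape when the current one fills up
tar -cv -M -f /dev/st0 /data
# note: GNU tar multi-volume archives cannot be compressed with -z/-j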
How many times should a backup tape be used before being discarded? I can't find an answer to this question. Does anyone know? I'm thinking once, but I have almost no knowledge of tape backup, so I could be dead wrong.
I have CentOS 5.9 running on my server, and I have to take a backup of all the data from a different server; this one I want to make into a backup server. I need a few pieces of information about tape drives:
1. Which tape drive is good and compatible with Linux (CentOS)? Please send me a link.
2. How do I take a backup onto a tape drive? It would be good if you could send any doc (a basic sketch follows this list).
3. Is there any backup software that is open source?
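On the "how to take a backup" part, a minimal hedged sketch with tar and mt (the device and paths are assumptions); for scheduled jobs, Amanda and Bacula are the usual open-source options:

Code:
mt -f /dev/st0 rewind         # position at the start of the tape
tar -cvf /dev/st0 /srv/data   # write the archive
mt -f /dev/st0 rewind
tar -tvf /dev/st0             # verify by listing the contents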
I am setting up a tape drive backup, but I am having "fun" with the Bacula configuration. Basically the drive is working; I ran the test with the btape program and all was correct. I am mostly tangled up in the jargon and the very large Bacula documentation. I created two volumes and labeled the mounted tape with the name of one of them, "tapevol2"... Now I cannot relabel the tape. I deleted the volume "tapevol2", but the tape drive is still mounted with that name, even though the volume is deleted. If I create more volumes, I cannot make the tape mount as anything other than "tapevol2". I would like to go back to zero, delete all volumes, and mount the tape with one of the newly created volumes so the backup jobs can run.
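A hedged sketch of the bconsole sequence that usually untangles this; the storage and volume names here are assumptions, and relabel only works on a volume whose catalog status is Purged:

Code:
unmount storage=Tape
purge volume=tapevol2
relabel storage=Tape oldvolume=tapevol2 volume=tapevol3 pool=Default
mount storage=Tape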
We have a server that runs a backup (cron job) at 9:15 every night. When I log on in the morning I have a mail message that gives me a long list of all the files that were backed up the night before. For a couple of weeks now, the mail message has given me an empty list. Yet when I run the same job manually from a # prompt, it runs. I am not able to run this job with cron in the daytime because too many users are on the system. I want to browse the tape to see if the backup is really failing to copy the files, or if they are on the tape and the mail message is bogus.
Since the backup was done with cpio instead of tar, I'm not sure I can browse the tape with restore -i anyway. What would be the best way to browse the tape on /dev/rmt/1 without actually restoring anything? This is an ancient DG/UX system, not Linux, and I'm not a Unix expert; I just inherited this server recently. But a lot of things are very similar to Linux, and it looked like this might be a good place to ask.
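cpio itself can list a tape without extracting anything, which should work on a SysV-flavored system too. A hedged sketch (skip the rewind if DG/UX lacks an mt command, and the block size may need to match the one used for the backup):

Code:
mt -f /dev/rmt/1 rewind
cpio -itv < /dev/rmt/1   # -t lists the table of contents, -v gives ls -l detail; nothing is written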
I have a script that standard users use to back up USB drives to LTO-4 tapes. It asks for a JobNumber and sends an email upon success. I recently expanded it to use multiple tapes. My specific question: if the system reaches the end of the tape, it happily mentions in the terminal that one needs to put in a second tape. What I would like is for it to send an email saying it needs a second tape. It presently sends an email only upon successful completion, so the only way a person knows that a second tape is needed is if they:
1. know that the drive they are backing up is larger than 800 GB (the LTO-4 capacity), or
2. don't get a completion email and think to walk to the basement and see the request for a second tape.
I wish for an email that would tap them on the shoulder: "you need to put in a second tape". So, if I know the exact prompt that shows in the terminal, which is: Prepare volume #2 for `/dev/st0' and hit return: then can I test for that somehow?
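There's a cleaner hook than scraping the terminal: GNU tar's -F / --info-script option runs a program of your choice at each end-of-tape (it implies -M). A hedged sketch; the mail address, device, and script path are assumptions, and since tar skips its own prompt when -F is used, the script also waits for the operator:

Code:
#!/bin/sh
# next-tape-mail.sh : tar runs this at each end-of-tape
echo "Backup job needs the next tape in /dev/st0" | mail -s "Insert next tape" operator@example.com
echo "Insert the next tape, then press return" > /dev/tty
read line < /dev/tty

Invoked from the backup script as something like:

Code:
tar -cv -M -F /usr/local/bin/next-tape-mail.sh -f /dev/st0 "$USB_MOUNT"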
I need help writing a script that will copy everything from tape to a system directory. I have a Linux box with 3 TB of disk space. I am using the following commands:
1) mt -f /dev/st0 rewind
2) tar xvf /dev/st0
3) mt -f /dev/st0 fsf 1 (using this to move to the next segment of the tape; note fsf is an mt operation, not a tar one) and then
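One gotcha in the commands above: /dev/st0 is the auto-rewinding device, so the tape rewinds every time a command closes it, and fsf never gets you anywhere. The non-rewinding /dev/nst0 keeps the position between commands. A hedged loop sketch (the device and restore path are assumptions, and the fsf step may need adjusting depending on where the driver leaves the head after a read):

Code:
#!/bin/sh
mt -f /dev/nst0 rewind
cd /restore || exit 1
# extract segment after segment until tar fails at end of data
while tar xvf /dev/nst0; do
    mt -f /dev/nst0 fsf 1 || break   # step over the file mark to the next segment
done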
Does anyone know of any existing software to aid in this task? We already offer backed-up data storage to our users, but no means of, say, archiving original data sets for long-term storage (NIH rules, etc.). Something that could be user-initiated, where they log in to the archive server, tell it "I want such-and-such files archived", and on its own it takes those files and gives them to a tape library (drive + robot).
I am now preparing to upgrade Lenny to Squeeze and decided to do a backup of my system. I used backup-manager to do the job and it worked fine. How do you restore said backup data?
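Restore is manual: backup-manager just leaves dated tarballs in its repository (BM_REPOSITORY_ROOT, which defaults to /var/archives), so you extract them with tar. A hedged sketch; the archive file name here is only an example of the pattern:

Code:
ls /var/archives
sudo tar xzf /var/archives/myhost-home.20101015.master.tar.gz -C /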
How many of you guys use Back In Time as your backup utility? I tried using it, and it doesn't copy all of the folder contents to the backup drive in one pass. For example, it will copy 26 out of 80-ish gigs of data. To complete the backup, I have to hit the "Take a snapshot" button to do another pass and add more data to the snapshots. I have to do this a couple of times to get all the data. Does anyone else have this issue?
[UPDATE] It appears to copy all of the files at once, as long as you only select one backup location at a time. I was backing up an entire multimedia drive, my home directory, and my USB drive. When I had it set to do only the multimedia drive, it copied all of the files, whereas it wouldn't when I had it set up to back up all three locations at the same time. I guess the lesson here is to back up one location, then add another, take another snapshot, and repeat.
I have a computer running Ubuntu 9.10 as a server (it is standard Ubuntu, not the Ubuntu Server edition). I have four 1 TB hard drives, three of which I want to back up to on certain days of the week. I have tried using luckyBackup and rsync, but neither seems to be able to handle the amount of data (there is currently about 400 GB). Does anyone know of a program that can run scheduled backups of this size?
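Plain rsync driven from cron handles volumes like this fine; it is usually the GUI front-ends that fall over. A hedged crontab sketch, with paths and schedule as assumptions:

Code:
# m h dom mon dow  command   (runs at 01:00 on Mon, Wed, Fri)
0 1 * * 1,3,5  rsync -aH --delete /mnt/disk1/ /mnt/backupdrive/disk1/ >> /var/log/disk1-backup.log 2>&1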
Can anyone recommend a good native backup utility for 10.04? I would like compression, the ability to image partitions and/or drives, and a simple way to restore in the event of total drive failure. A nice incremental backup facility would be good too. I would be backing up to an external USB hard drive of a smaller size than the source drive, so compression and the ability to choose what to back up and what to skip are needed.
I am using Ubuntu 10.04. When I download something using wget, like wget [URL], the page gets downloaded. Second thing: with sudo apt-get install perl-doc I installed the documentation for Perl, and I have the same for PostgreSQL... How do I use this Perl documentation for learning Perl?
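The perl-doc package installs the perldoc reader, which is the intended way in; a few starting points:

Code:
perldoc perl        # index of all the documentation sections
perldoc perlintro   # a beginner's introduction
perldoc -f print    # documentation for one built-in function
perldoc List::Util  # documentation for an installed module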
I've been pulling my hair out trying to get wget to post data to a webpage to automatically download some files. I've tried many variations of the syntax, but wget always downloads the HTML for the login page. A snippet of code I found in the login HTML page is below. Some of the characters are Japanese, because it's a Japanese website.
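Getting the login page back usually means the session cookie was never kept between requests. A hedged sketch: the form field names must be taken from the login page's HTML, and the URLs here are placeholders:

Code:
# log in once, keeping the session cookie...
wget --save-cookies cookies.txt --keep-session-cookies \
     --post-data 'username=me&password=secret' \
     -O /dev/null http://example.jp/login.php
# ...then download with that cookie
wget --load-cookies cookies.txt http://example.jp/download/file.zip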
I want to back up data and upload it to online hosting services. Since I'm uploading stuff online, I only want to upload encrypted data (so that even the hosting service admins cannot look at it). Thus, I first want to encrypt locally the data that I want to back up. Since I will be making changes locally to the data, I want some sort of incremental imaging system where the incremental changes are stored in separate files, so that I only have to upload the incremental encrypted changes.
Duplicity is an option, but it uses GPG, which makes it a bit complicated, and I was wondering if there was a simpler alternative, as I am only doing the encryption and backup locally.
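One middle ground: duplicity falls back to symmetric encryption when no GPG key is given, so a passphrase in the environment is all the GPG setup needed. A hedged sketch, with paths as assumptions; each run after the first writes only incremental volumes:

Code:
export PASSPHRASE='a-long-secret'
duplicity /home/me/foo file:///home/me/back-foo     # first run is full, later runs incremental
duplicity verify file:///home/me/back-foo /home/me/foo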
I have been using an LTO-5 Ultrium 3000 tape drive connected to an ATTO HBA without problems. I can control the tape drive using "mt -f /dev/nst0" and have been able to make successful backups using cpio, tar, and dump/restore. I followed some instructions on the web about how to install the HPE Library and Tape Tools application (version 4.21), which relies on converting an rpm to a deb file. The software seems to have been installed correctly and runs. However, the hardware scan function does not recognize my tape drive. The following is suggested in the user manual if the tape device is not recognized by the software under Linux:
1. Log in as root.
2. Edit the following file: vi /etc/modules.conf
3. Add the following line as appropriate: add options scsi_mod max_scsi_luns=128
4. Reboot the computer.
The problem is I don't have an /etc/modules.conf, and I am not sure which file would be the equivalent, or whether this is even the correct solution.
My tape drive is controllable and functions well using "mt -f /dev/nst0 status", so it seems to be a matter of getting the LT&T software to detect it.
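On kernels from the 2.6 era onward, /etc/modules.conf was replaced by the /etc/modprobe.d/ directory, and the scsi_mod parameter is max_luns rather than max_scsi_luns. A hedged sketch of the modern equivalent of the manual's advice (file name is an arbitrary choice):

Code:
echo 'options scsi_mod max_luns=128' | sudo tee /etc/modprobe.d/scsi-luns.conf
# the same knob can often be poked at runtime, avoiding a reboot:
echo 128 | sudo tee /sys/module/scsi_mod/parameters/max_luns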
At the risk of providing too much info, here is some possibly relevant output from lshw:
*-pci:3
     description: PCI bridge
     product: 6 Series/C200 Series Chipset Family PCI Express Root Port 1
     vendor: Intel Corporation
     physical id: 1c
     bus info: pci@0000:00:1c.0
     version: b5
I'm trying to download all the data under this directory, using wget: [URL] I would like to achieve this using wget, and from what I've read it should be possible using the --recursive flag. Unfortunately, I've had no luck so far. The only files that get downloaded are robots.txt and index.html (which doesn't actually exist on the server), but wget does not follow any of the links on the directory list. The code I've been using is: Code: wget -r *ttp://gd2.mlb.***/components/game/mlb/year_2010/
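The fact that robots.txt arrives first is the clue: wget honors robots.txt during recursive runs, and the server evidently forbids crawling, so the links are skipped. A hedged sketch (the real hostname is kept elided, as in the post above):

Code:
wget -r -np -nH -e robots=off http://<server>/components/game/mlb/year_2010/
# -e robots=off overrides the exclusion; -np stops wget climbing above
# year_2010/; -nH drops the hostname directory from the local paths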
To make an RPC call I need to send an XML file as POST data. I know how to do this with wget; it works fine when I have the XML already filled in (depending on the node values, the response from the call is different). However, I want to be able to edit part of this file and then send it as POST data using wget. I can edit the file using sed (I don't want to rewrite the files each time this gets used; and it does get used a lot, with a lot of different values).
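A hedged sketch of that sed-then-post pipeline; the node name, value, and URL are all assumptions:

Code:
# fill the template on the fly, then send it as the POST body
sed 's|<value>.*</value>|<value>42</value>|' template.xml > /tmp/call.xml
wget --header='Content-Type: text/xml' \
     --post-file=/tmp/call.xml \
     -O response.xml http://server.example.com/RPC2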
I just got a 2TB drive with the intention of backing up multiple Ubuntu machines to it. What would be the best way to do this, keeping ease of restoration in mind? Should I just copy each drive image to the BU drive, or use a utility like Back in Time?
I'm running Ubuntu 9.04 64-bit server and am looking to back up my entire OS drive. I've got a 200 GB main drive and a 1 TB storage drive mounted at /storage. I'm already set as far as backups of my data go, but redoing all of my settings and software would be a nightmare in the event of an HD failure.
So what I'm looking for is a command-line utility to make an image of the main 200 GB drive onto an external USB drive. The software needs to function similarly to the Windows Vista/7 System Image utility or DriveImage XML and be able to make the images without shutting down. The best I've found so far was [URL], but it uses a GUI and doesn't support large files.
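For a pure command-line route, dd piped through gzip is the closest analogue, with the caveat that imaging a mounted filesystem can produce an inconsistent copy unless the disk is quiesced first (an LVM snapshot sidesteps that). A hedged sketch; device names are assumptions:

Code:
dd if=/dev/sda bs=64K conv=sync,noerror | gzip -c > /mnt/usb/server-sda.img.gz
# restore later (from a live CD, not the running system):
# gunzip -c /mnt/usb/server-sda.img.gz | dd of=/dev/sda bs=64K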
I have a Dell PowerEdge 830 server with a tape drive and RHEL 4 running on it... The issue I am facing is that I am unable to insert a tape, as I had forcefully ejected the previous tape from the drive...
I tried to do a listing of the contents backed up on the tape, and it got stuck in the middle, throwing the error below:
/dev/st0: device input/output error. After that I was unable to eject the tape using mt -f /dev/st0 rewoffl.
I removed the tape by holding the eject button, and now when I try to insert another tape, the drive won't pull it in...
I've tried everything except rebooting the server. Can anyone help me out with this issue? I hope the below information may help in debugging it... code....
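Before falling back to a reboot, a SCSI reset sometimes clears a wedged drive. A hedged sketch; the sg node is an assumption (find the drive's with lsscsi -g or cat /proc/scsi/scsi):

Code:
mt -f /dev/st0 status    # does the driver still answer?
sg_reset -d /dev/sg1     # SCSI device reset, from the sg3_utils package
mt -f /dev/st0 offline   # then retry the eject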
I am working from a laptop where all my work is stored on an 80 GB drive. I am now also the owner of an external 250 GB USB hard drive, formatted with FAT32. I want to keep it FAT32 so that I can offer some of my files to people who run Mac OS or Windows, and I don't want to make them install ext3 for Windows and whatnot. I need a strategy that will allow me to keep a mirror of my laptop drive on my new external drive, i.e. no history/versioning required. However, I do care about file permissions. The files don't have to be stored as-is; they can be stored within a large (80 GB?) tar file, that is fine. It would be easier for me to coerce people to open a .tar file than to install an ext3 driver for their OS, I suppose. I don't think I can keep file permissions otherwise, can I?
I have previously used a self-written sh script that used rsync to keep an up-to-date copy of my laptop filesystem on a USB flash drive, but in that case I had the flash drive formatted with ext3, so there was no problem with file permissions. This time, it's trickier.
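One wrinkle: FAT32 caps a single file at 4 GiB, so an 80 GB tar has to be split into pieces; tar itself keeps ownership and permissions inside the archive regardless of the filesystem it sits on. A hedged sketch, with paths as assumptions:

Code:
tar -cpf - /home/me | split -b 3900m - /media/usb250/laptop.tar.part_
# restore on a Linux box with:
# cat /media/usb250/laptop.tar.part_* | sudo tar -xpf - -C /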
I want to back up data and upload it to online hosting services. I first want to encrypt locally the data that I want to back up. Since I will be making changes locally to the data, I want some sort of incremental imaging system where the incremental changes are stored in separate files, so that I only have to upload the incremental encrypted changes. Duplicity is an option, but it uses GPG, which makes it a bit complicated, and I was wondering if there was a simpler alternative, as I am only doing the encryption and backup locally.
EDIT: I have only ONE computer, on which the data resides and on which the backup image is made. That is, I have a directory foo on my computer, the backup of which will be made to back-foo on the same computer. I want back-foo to be in an encrypted form. Then back-foo will be uploaded (unencrypted) to Microsoft Live storage or SpiderOak storage, etc. Since back-foo is encrypted, my upload is secure. And since I'm uploading, I want incremental backup support; that is, the backup utility should create new files which contain the incremental changes, so that I can upload only the new files which contain the changes.
I have a laptop that had Windows XP on it. XP crashed out and I couldn't retrieve my data off it; despite booting up in safe mode, it just didn't want to do anything. So I thought I would load Ubuntu instead and then try to recover my data. Bad move. Now my question is: can I still retrieve the Windows data off the hard drive despite having Ubuntu installed on the same drive? I know on Windows there is an app called File Scavenger that recovers deleted files even after loading a clean slate of Windows on top. Is there something like that for Linux? I had photos on it that have some real sentimental value. The rest of the data I don't care about, just the photos. I didn't think before I reloaded it.
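On the "File Scavenger for Linux" question: photorec, from the testdisk package, does raw carving for photos and many other file types, and can still find data the Ubuntu install didn't overwrite. A hedged sketch; the device name is an assumption:

Code:
sudo apt-get install testdisk
sudo photorec /dev/sda   # interactive; point it at the old Windows area of the disk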
My son's laptop with Ubuntu on it went belly up, so before we dumped it I salvaged the hard disk... Now, some weeks later, he remembers there are some things he wants on the disk. Luckily I had done nothing with it. Having just connected the disk to my system, I can see everything above (should that be below?) home, but not anything in his account. Any bright ideas for getting at his data? Is there a way of mass-changing permissions, or whatever?
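Reading the files as root gets around the old UIDs, and copying them out then taking ownership of the copy is usually simpler than mass-chmodding the salvaged disk. A hedged sketch; the mount point and user name are assumptions:

Code:
sudo ls /media/olddisk/home/son                        # root can read regardless of old permissions
sudo cp -a /media/olddisk/home/son /home/dad/son-files
sudo chown -R dad:dad /home/dad/son-files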