General :: Backup Large File To Multiple DVDs

Nov 2, 2009

I work for a school consulting company. We helped a school deploy about 1500 computers. The computers run Windows XP, but we have been using G4L for the restore partition on the drives. So far the software works great. We did, however, run into a problem: many of the computers we deployed are missing the restore partition. The reason they are missing is long and convoluted and not really that important. What I have been charged to do is try to fix the restore partition problem. One solution I had, which I'm not even sure will work, was to back up the recovery file that G4L created to DVD and write a basic script to recreate the partition and then copy the file over. This process would need to be as automated as possible, since the disc will be inserted by the end user (the students). The backup file that G4L created is 5.9GB, so it won't fit on just one disc, and dual-layer discs are too expensive for this project; the file will either need to be compressed again (not sure if that's a good idea or not) or split across two DVDs.

I have searched the forums here and was not able to find anything to fix this problem. I did find some info on splitting files across two discs, but I'm not sure how to use that to solve my problem.
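One way to handle the split-and-rejoin step is with the standard split and cat tools. A minimal sketch, assuming the G4L image is named backup.img and the usual /dev/cdrom and mount-point paths (all names here are hypothetical):

Code:

# On the build machine: cut the 5.9GB image into DVD-sized pieces
# (produces backup.img.part.aa and backup.img.part.ab; burn one per disc)
split -b 4300m backup.img backup.img.part.

# In the restore script: rebuild the image, prompting for each disc
> /restore/backup.img
for disc in 1 2; do
    echo "Insert restore DVD $disc and press Enter"
    read
    mount /dev/cdrom /mnt/dvd
    cat /mnt/dvd/backup.img.part.* >> /restore/backup.img
    umount /mnt/dvd
    eject /dev/cdrom
done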




Software :: Backup Large Sets Of Files To Multiple CDs / DVDs

Jun 5, 2010

I have a large collection of pictures (12GB and growing), way too big to fit on one CD or DVD. I want to back them up to CDs or DVDs in standard (I think it's ISO 9660) format that Windows can read. I know how to do this the hard way: manually selecting a pile of pictures that will fit on one disc, burning it, and then going on to the next pile. There must be a way to tell k3b or a similar program to do this for me, automatically making a backup of the whole thing using as many discs as necessary. Can anyone tell me how to do this?

I don't want to use tar or another archive/compression scheme because I want the pictures accessible to someone with minimal technical expertise who doesn't even know how to spell "Linux".
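If k3b won't do it automatically, one workaround is a short script that bins the pictures into DVD-sized staging directories, each of which can then be burned as a plain ISO 9660 disc readable anywhere. A sketch, with the paths and the 4.3GB limit as assumptions:

Code:

#!/bin/bash
# Copy files into /tmp/disc1, /tmp/disc2, ... each holding at most ~4.3GB
LIMIT=$((4300 * 1024 * 1024))
n=1
size=0
mkdir -p /tmp/disc$n
find /home/user/pictures -type f | while read -r f; do
    s=$(stat -c %s "$f")
    if [ $((size + s)) -gt $LIMIT ]; then
        n=$((n + 1))
        size=0
        mkdir -p /tmp/disc$n
    fi
    cp "$f" /tmp/disc$n/    # note: this flattens the directory layout
    size=$((size + s))
done

The dirsplit tool shipped with genisoimage reportedly automates the same binning, if it is available in the distribution's repositories.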


General :: Writing Large Amounts Of Data To Multiple CD/DVDs?

Sep 1, 2009

Are there any tools out there that let me select a bunch of data and burn it to multiple CDs or DVDs? I'm using k3b but have to manually select CD- and DVD-sized amounts.


General :: Splitting Large Directory Over Multiple Blank DVDs?

May 19, 2010

I am currently trying to copy a directory of roughly 400GB to DVD and have gotten myself stuck. I tried to tar and then split; however, I don't have enough room on my hard drive to make a compressed tar, split it up, and then burn to disc. So I need a way to tar and compress the directory, split it, and burn to disc every 4.3GB.

I went ahead and installed DAR as an alternative, as I hear it is designed for this type of task, but I can't make heads or tails of it.

My OS is the newest version of Ubuntu 10.
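For the space problem specifically, tar can be piped straight into split so the full-size archive is never written to disk on its own; only the slices exist. A sketch with hypothetical names:

Code:

# Compress and slice in one pass, 4.3GB per piece
tar czf - /path/to/directory | split -b 4300m - backup.tar.gz.
# Restore later by joining the pieces back into tar
cat backup.tar.gz.* | tar xzf -

DAR can also produce DVD-sized slices directly with its -s option (e.g. dar -c backup -R /path/to/directory -s 4300M), and its -p option pauses between slices so each can be burned and deleted before the next one is written.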


Ubuntu :: Program To Use To Backup Across Multiple DVDs?

Jan 19, 2011

I need to back up about 25GB of stuff onto 6 discs. I was going to use deja-dup, but it doesn't seem to have that feature. Does anybody know of a program (with a GUI) that can do this for me?


Fedora :: Join Multiple MP3s Into A Single Large File?

Apr 15, 2011

Is there an app that runs in Fedora, with a GUI, which can join multiple MP3s into a single large file?

I know Audacity can do this, but it's not really suitable for multiple files, and I have voice notes that are in 10-minute chunks.
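For what it's worth, ffmpeg's concat demuxer can do this losslessly from the command line, so the GUI could be skipped entirely. A sketch with hypothetical file names:

Code:

# list.txt contains the chunks in order, one per line:
#   file 'note-001.mp3'
#   file 'note-002.mp3'
ffmpeg -f concat -safe 0 -i list.txt -c copy joined.mp3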


General :: NAS - Most Effective Backup Software When Dealing With Large Numbers Of Files?

Jul 18, 2010

I have two NASes. I work off of one, and the other is used as a backup. As I have it set up now, it's slow: running a backup takes a week. Even for 7TB, with 1,979,407 files, this seems a bit outlandish, particularly as both systems are RAID-5 and the network is all gigabit. I've been digging about in the rsync man pages, and I really don't understand what differentiates the various topologies. Right now, all the processing is being done on the backup NAS, which has the main volume from the main NAS mounted locally over SMB. I suspect that the SMB overhead is killing me, particularly when dealing with lots of files.

I think what I need is to set up rsync on the main NAS as a daemon and then run a local rsync client to connect to it, which would hopefully allow me to completely avoid the whole SMB-in-the-middle affair. But aside from mentioning that it exists, I can find very little information on why one would want to use daemon mode for rsync.

Here's my current rsync command line: rsync -r --progress --delete /cifs/Thecus/ /mnt/Storage/input. Is there a better way/tool to do this? Edit: OK, to address the additional questions: The "main" NAS is a Thecus N7700. I have additional modules installed that give me SSH, and it has rsync, but it's not in the $PATH, and I haven't figured out how to edit the local $PATH in a way that persists between reboots. The "backup" NAS is a DIY affair, built around a 1.6GHz VIA motherboard with an Adaptec hardware RAID card. It's running CentOS 5 with a full desktop environment. It's the hardware I'm running rsync from. (Gigabit is through an additional PCI card.)

Further edit: OK, got rsync over SSH working (thanks, lajuette!). I had to do a bit of tweaking on my command line; I'm running rsync with the args: rsync -rum --inplace --progress --delete --rsync-path=/opt/bin/rsync sys@10.1.1.10:/raid/data/Storage /mnt/Storage (Note: I'm specifically not using -a, because I want to change the ownership to the local account, so as to not freak out SELinux.)
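For reference, daemon mode just means running rsync --daemon on the main NAS with an /etc/rsyncd.conf along these lines (the module name and uid are assumptions), after which the backup NAS pulls over the rsync:// protocol and SMB drops out of the picture entirely:

Code:

# /etc/rsyncd.conf on the main NAS
[storage]
    path = /raid/data/Storage
    read only = yes
    uid = root
    gid = root

# On the backup NAS, pull directly from the daemon
rsync -rum --inplace --progress --delete rsync://10.1.1.10/storage /mnt/Storage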


General :: File Is An Automated Backup Script, Backup.sh?

Sep 13, 2010

Can someone give me a sample of a crontab for backing up a directory, please? The system is Ubuntu 9.04.

#!/bin/bash
# this file is an automated backup script, backup.sh.
# this backs up my domain site.

[code]....
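A minimal sample, assuming the script lives at /home/user/backup.sh and should run nightly at 2:00; add it with crontab -e:

Code:

# m  h  dom mon dow  command
0  2  *   *   *   /home/user/backup.sh >> /var/log/backup.log 2>&1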


Ubuntu :: Need Program To Backup Across DVDs

Mar 9, 2010

Back in the DOS days (yes, I am dating myself) there was a backup command that would back up the selected files across a series of floppy disks. Is there such a command in Karmic to back up across multiple DVDs?

I would like to do some backups of my data and other important files, but I have more than 4GB of information. I have seen posts on ways to write a compressed file, but that still stores it on the HDD, and I would like the files on DVDs in case the HDD crashes.


General :: View A Particular Ten Lines In A Large File That Can't Be Opened In Vi

May 12, 2010

I am using RHEL 5. I have a very large test file which cannot be opened in vi. The content of the file has some 8000 lines. I need to view the ten lines between 5680 and 5690. How can I view these particular lines in a large file? What command and options do I need to use?
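sed can print just that range without opening the file at all, and head piped into tail does the same. Both sketches assume the file is named big.txt:

Code:

# Print only lines 5680 through 5690
sed -n '5680,5690p' big.txt
# Equivalent: take the first 5690 lines, then the last 11 of those
head -n 5690 big.txt | tail -n 11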


Software :: Backup DVDs Including Multi-title And Subtitles

Jul 9, 2010

I'm looking for a semi-automated way to back up my 300+ DVDs. Most are single-title movies, but some are TV series with multiple episodes. I will also need English subtitles, if they are available, since a number of my movies are foreign imports. All are region 1. I'd like it all command-line so I can script it. Possibly also get the extra features like commentary, outtakes, etc., but that's not necessary. What DVD rip/encode software will handle single- and multi-title DVDs, with subtitles when available, and run via command line? Export to Ogg or Xvid. Speed is secondary to these features. Distro is unimportant, as I have a new 2TB build specifically for this.
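HandBrakeCLI is one scriptable candidate, though it targets MP4/MKV containers rather than Ogg or Xvid, so this is a swap of output format rather than an exact fit. A sketch with hypothetical device path and track numbers:

Code:

# List every title and its tracks on the disc
HandBrakeCLI -i /dev/dvd -t 0 --scan
# Rip title 1, including the first subtitle track
HandBrakeCLI -i /dev/dvd -t 1 --subtitle 1 -o movie.mkv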


General :: Backup Script: Multiple Gzips Into One Bzip2 Or Gzip?

Feb 18, 2010

I have a backup script that is basically this:

Code:

BACKUP_DIRS="/etc /boot /root /home"
BACKUP_FOLDER="/tmp/system_backup/"
for DIR in ${BACKUP_DIRS}
do

[code]....

All the folders get dumped into separate gzip files. Now I want all the gzip files in the backup folder combined into one final gzip or bzip2 file. My goal is to get one file instead of many, so I can scp or FTP that one file to another file share; it is easier to send one file than a bunch of files.
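Since the per-directory dumps are already gzipped, compressing them again gains almost nothing; a plain tar at the end of the script is enough to get a single file to push out (the remote host here is hypothetical):

Code:

# Bundle all the .gz dumps into one uncompressed tar for transfer
tar cf /tmp/system_backup.tar -C /tmp/system_backup .
scp /tmp/system_backup.tar user@fileshare:/backups/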


Server :: MySQL Backup - Deal With Large Amounts Of Data?

Feb 15, 2011

We've been trying to become a bit more serious about backup. It seems the better way to do MySQL backup is to use the binlog. However, that binlog is huge! We seem to produce something like 10GB per month. I'd like to copy the backup somewhere off the server, as I don't feel there is much to be gained by just copying it to somewhere else on the same server. I recently made a full backup which, after compression, amounted to 2.5GB and took me 6.5 hours to copy to my own computer... so that solution doesn't seem practical for the binlog backup. Should we rent another server somewhere? Is it possible to find a server like that really cheap? Or is there some other solution? What are other people's MySQL backup practices?
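One common baseline, for comparison: a nightly full dump compressed on the fly, with the binlog then only needed to cover the gap since the last dump (paths and credentials are omitted as assumptions):

Code:

# Nightly full logical backup, compressed as it is written
mysqldump --all-databases --single-transaction | gzip > /backup/mysql-$(date +%F).sql.gz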


General :: Best Way To Copy A Large File Over NFS?

Aug 24, 2011

I want to transfer a huge file (60GB) over the NFS network on Linux. Is cp the best option?
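cp works but gives no feedback on a 60GB transfer; rsync is a common alternative mainly for its progress display (paths hypothetical):

Code:

# Same copy, but with transfer progress shown
rsync --progress /local/huge.img /mnt/nfs/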


General :: Can't Copy Large File?

Mar 26, 2010

I'm trying to copy a 6GB file across from my laptop to an external USB drive, but it quits at about 4.2GB every time with a "file size limit exceeded" error. I have checked the output of ulimit -a, and there is no limit there on file size. I'm using the Slax live CD for this, as it always gets the job done.
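A hard stop at about 4.2GB is the classic signature of a FAT32-formatted drive: FAT32 caps individual files at 4GiB no matter what ulimit says. A check and a workaround sketch (the mount point is hypothetical):

Code:

# Confirm the filesystem type of the USB drive
df -T /mnt/usb
# If it reports vfat, split the file into pieces under the 4GiB cap
split -b 2000m bigfile.img /mnt/usb/bigfile.img.part.
# Rejoin later with: cat bigfile.img.part.* > bigfile.img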


General :: How To Send A Large File Securely

Aug 28, 2011

I need to send large files from one Linux machine to another using cryptography. The sender machine knows the recipient's IP but not vice versa. I don't need strong cryptography and prefer higher-speed, less-secure solutions.

There is no problem with pre-sharing crypto keys, but I'd prefer not to deal with SSH user creation.

I'm thinking of HTTP PUT over TLS, but I have no experience with it and would prefer to hear what the possible solutions are. I know that it can listen as a daemon, but I don't know anything about the cryptography side, so piping through OpenSSL may be a solution.
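A minimal sketch of the OpenSSL-piping idea, using netcat and a pre-shared key file so no SSH accounts are involved (the host, port, and key path are hypothetical):

Code:

# On the receiver (whose IP the sender knows):
nc -l -p 9000 | openssl enc -d -aes-128-cbc -pass file:shared.key > file.tar.gz

# On the sender:
openssl enc -aes-128-cbc -salt -pass file:shared.key < file.tar.gz | nc 192.0.2.10 9000

# (BSD netcat drops the -p: nc -l 9000)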


Networking :: Multiple DHCP In One Large LAN?

Dec 6, 2010

I've searched on issues of multiple DHCP servers in one LAN. Just about everything is Windows- or Novell-based stuff. Also, the typically asked scenario is 2 or more DHCP servers for failover purposes (one goes down, the other takes over).

What I want to do with DHCP is different. My purpose would best be described as "administrative separation". Basically, if a given MAC address is configured on a specific DHCP server, that server should be the one to answer, and not the other. The problem is that we also need a default to handle unknown MACs, so the DHCP server without the MAC configured would answer anyway, even if it is the only one configured to do global leasing. Timing would then be the determining factor.

The purpose is to set up a bunch of PXE network booting using program-generated DHCP configuration. This server won't always be up, so it can't be used for general-purpose service. The DHCP server for general use is part of a wireless system; it is configured by GUI and is impractical for the programmed PXE booting.

How can I make these work together with everything on the same LAN segment?
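With ISC dhcpd, the PXE side of this can be expressed directly: deny unknown-clients makes the server silent for any MAC it has no host entry for, leaving everything else to the wireless system's server. A sketch of the PXE server's dhcpd.conf with hypothetical addresses:

Code:

# PXE dhcpd.conf: answer only the MACs listed below
deny unknown-clients;

subnet 192.168.1.0 netmask 255.255.255.0 { }

host lab-pc-01 {
    hardware ethernet 00:11:22:33:44:55;
    fixed-address 192.168.1.101;
    next-server 192.168.1.5;
    filename "pxelinux.0";
}

The remaining race runs the other direction: the general-purpose server will still answer for these known MACs, so clients may take whichever offer arrives first.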


General :: Using Sed To Replace A *large* Number Of Variables In A File?

Jul 28, 2011

I have a large number of log files on a Linux box that I need to cleanse of sensitive data before sending to a third party. I have used the script below on previous occasions to perform this task, and it has worked brilliantly (the script was built with some help from here :-)

#!/bin/bash
help_text () {
cat <<EOF
Usage: $0 [log_directory] [client_name(s)]
EOF

[Code]...

However, now one of our departments has sent me a CLIENT_FILE.txt with 425,000+ variables! I think I may have hit some internal limit. I have tried splitting the client file into 4 with around 100,000 variables in each; this still doesn't work. I'm loath to keep splitting, though, as I have 20 directories with up to 190 files in each directory to run through. The more client files I make, the more passes I have to do.
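The limit being hit is most likely the kernel's cap on the length of a single command line, not sed itself. Feeding the substitutions to sed as a script file with -f sidesteps the command line entirely (CLIENT_FILE.txt is from the post; the replacement text and script name are assumptions):

Code:

# Build one sed script with a substitution per client name
while read -r client; do
    printf 's/%s/REDACTED/g\n' "$client"
done < CLIENT_FILE.txt > scrub.sed

# Apply all 425,000+ substitutions in a single pass per log file
sed -f scrub.sed input.log > output.log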


General :: Compress A Large File Into Smaller Parts?

Aug 18, 2011

I'm looking for a way to compress a large file (~10GB) into several files that won't exceed 150MB each.
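If p7zip is installed, 7z can compress and slice into fixed-size volumes in one step (file names hypothetical):

Code:

# Compress into 150MB volumes: big.7z.001, big.7z.002, ...
7z a -v150m big.7z largefile.dat
# Extract by pointing 7z at the first volume
7z x big.7z.001

The same effect is possible with gzip and split: compress first, run split -b 150m on the result, and cat the parts back together before decompressing.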

Any thoughts?


General :: Monitoring Copy Progress Of A Large File?

Sep 15, 2010

Is there a clever way to monitor the progress (as a percentage or a hash display) of copying a large file (using pv could be an option)? For example, monitoring the progress of a copy command such as this:

Code:

cp linux.iso /tmp/
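pv does exactly this when substituted for cp, as long as the destination is given as a file rather than a directory:

Code:

# Progress bar, throughput, and ETA for the copy
pv linux.iso > /tmp/linux.iso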


General :: Cannot Find Large Untared MySQL File

Aug 6, 2009

After untarring a MySQL file (very large), I'm trying to find where the file listed below has gone. I did a search on the file name:
find / -name 'mysql-qui-tools-5.0' -print
but can't find the file.
-rwxr-xr-x root/root 9651820 2007-05-02 11:46:01 mysql-gui-tools-5.0/mysql-query-browser-bin
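Two details are worth checking here: the pattern says 'qui' where the listing says 'gui', and -name matches a single path component exactly, while the listing shows a directory prefix. A hedged correction:

Code:

# Note 'gui', not 'qui'; the wildcard also matches anything under the name
find / -name 'mysql-gui-tools-5.0*' -print 2>/dev/null
# Or search for the binary directly
find / -name 'mysql-query-browser-bin' -print 2>/dev/null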


General :: Move Large Amounts Of Music Within A File Structure?

Dec 20, 2009

I have a car stereo that reads a USB drive with all my music on it; however, to sort through the music it finds folders containing music and then displays them all in a list. I find this interface annoying, because in order to sort the music by artist I have to manually move it out of the album folders by hand. This takes a long time for 11+ GB of music, so I was trying to use the Linux CLI to quicken the process, with a command like this:

Code:

mv /media/usb/music/*/*/* /media/usb/music/*/

But for some reason this moves all my music into the last folder alphabetically on my drive. The music is all pre-arranged like this: /media/usb/music/artist/album/song
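The flattening happens because the shell expands the destination wildcard too: mv receives one long list of sources and treats the last directory on the line as the target. Looping over the artist directories keeps each file under its own artist (layout taken from the post):

Code:

# Move each song up one level: artist/album/song -> artist/song
for artist in /media/usb/music/*/; do
    mv "$artist"*/* "$artist"
    rmdir "$artist"*/    # clear out the now-empty album folders
done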


General :: Split A Large File To Download On A Windows Machine?

Apr 16, 2011

I am removing some old graphics from my server, and one of the gallery programs has created two enormous directories that I cannot even open with FTP.

I tried to tar each directory; the first came out to about 37GB and the second keeps failing (it's bigger, one would assume).

How can I archive and split these into smaller files?
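Piping tar into split avoids ever holding a single 37GB archive on disk, and the pieces can be rejoined on either platform (directory name and piece size hypothetical):

Code:

# Archive and slice in one pass, 1GB pieces
tar czf - gallery_dir | split -b 1024m - gallery.tar.gz.
# Rejoin on Linux:   cat gallery.tar.gz.* | tar xzf -
# Rejoin on Windows: copy /b gallery.tar.gz.aa + gallery.tar.gz.ab gallery.tar.gz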


General :: Accidentally Type Concatenate Large File On Remote System

May 6, 2010

Every once in a while, on a computer I'm ssh'd into, I will accidentally type "cat largefile.txt" and my screen will start rushing with text for the next 10 minutes. I'm always working in a screen session, so my current solution is to just log out and then log back in; since the output can go 100x faster when I'm logged out, it'll finish in the short time it takes me to type my password again. Is there a better way, either involving the fact that I'm in a screen session, or a way to do this within SSH? What doesn't work: detaching from the screen session (doesn't respond until the file is done outputting); trying a command to move to a different window in the screen session (also doesn't respond); typing Ctrl+C to kill the cat command (also doesn't respond, probably because the command is already done and the buffers just have to catch up).


General :: Cups Spooler Quits Printing Large Print File?

Mar 22, 2010

I have installed CentOS 5 and can print small to medium lpr files using CUPS fine (1 to 20 pages), but when I tried to print a file of 95 pages the printer just stops. I have to power the printer off and turn it back on, and it will start printing again. It looks like some data is lost in this process: it may print 20 pages and stop, and when restarted it may print 20 or 40 pages or complete the report. I can print to the device directly and it works fine; it is only the large print jobs run through the spooler. I have tried different printers with the same results, which is what makes me think it is a spooler problem.


General :: Convert A Large Number Of Files From Non-Standard Types To Text?

Sep 28, 2009

I have on my Windows machine several hundred files in the .nc/.ncs format used by a CNC machine. I need to convert them to .txt, which is as easy as opening one in Notepad and saving as .txt, but there are so many that doing this by hand would take way too long.

The reason I am writing to LinuxQuestions is that I would feel more comfortable loading a live CD and using some sort of terminal command to do this than downloading one of the many "freeware"-type programs I have found for Windows (even more so since I have had a rootkit before and had to start all the way over to get rid of it).

I need to know:

1. Is this possible to do with the terminal without super-advanced knowledge?

2. Can someone please point me in the right direction: something to read, or an example?
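Assuming the .nc files are already plain text (CNC G-code usually is), no real conversion is needed, just a copy under a new extension, which a one-line loop in any live CD's shell can do (the Windows mount point is hypothetical):

Code:

# Copy every .nc file alongside itself with a .txt extension
for f in /mnt/windows/cnc/*.nc; do
    cp "$f" "${f%.nc}.txt"
done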


General :: Grep - Manipulating Large Text File Full Of Records

Nov 26, 2010

I'm trying to manipulate a large text file full of records (metadata - one complete record per line). I need to delete every line on which certain words appear - there are five different words, all pretty simple all-caps strings with occasional whitespace. I tried using grep -v, which worked a treat, but only string-by-string. Ideally I'd like to run this as grep -v -f, where the file targeted by the -f contains the strings I need to match in order to delete the lines they're in.

i.e. grep -v -f filecontainingSTRINGS.txt targetfile.txt > outputfile.txt

When I try this, however, I don't get any matches - or more specifically, no changes are made in the output file. It works fine if there's only one string in filecontainingSTRINGS, but it doesn't work if there's more than one (I'm using newline as the delimiter). (Also my machine doesn't recognise /usr/xpg4/bin/grep - no idea what that's all about!)
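One very common cause of this exact symptom is Windows CRLF line endings in the strings file: each pattern then ends in an invisible carriage return and never matches. A check and fix, using the file names from the post:

Code:

# Carriage returns show up as ^M here, if present
cat -v filecontainingSTRINGS.txt | head
# Strip them and retry
tr -d '\r' < filecontainingSTRINGS.txt > strings.unix.txt
grep -v -f strings.unix.txt targetfile.txt > outputfile.txt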


General :: Shell Script Or Command To Remove PDF File From Large Logs?

Jul 13, 2011

I need to remove a large binary blob (a PDF file) from a large log file which is generated daily. This is seriously hogging space on our servers. I need to remove the large PDF from the logs to make them smaller and manageable.

I need to take out the text (or binary data) between these pairs of strings:

<my:PDF> and </my:PDF>
<applicationForm> and </applicationForm>
<image> and </image>
<extractedSignature> and </extractedSignature>

I am not sure whether the sed utility can do this; these are large files and need to be pruned. I am not seeking log-rotation advice, just a script or command that can strip these large logs of the text between the tags above. I am not sure how to achieve this with sed, tail, head, tr, or any other facility.
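sed range addresses can do this: everything from a line matching the opening tag through the next line matching the closing tag is deleted. A sketch covering all four pairs, assuming each tag sits on its own line (log names hypothetical):

Code:

sed -e '/<my:PDF>/,/<\/my:PDF>/d' \
    -e '/<applicationForm>/,/<\/applicationForm>/d' \
    -e '/<image>/,/<\/image>/d' \
    -e '/<extractedSignature>/,/<\/extractedSignature>/d' \
    app.log > app.stripped.log

If an opening and closing tag can share one line, a substitution such as s/<image>.*<\/image>//g is needed for that case instead.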


General :: Split Large File And Transfer To Windows External Drive?

May 17, 2011

How does one split a large Linux file and transfer it to a Windows external drive?


Fedora :: Printing Large Image Over Multiple Pages In GIMP

Dec 8, 2009

I have a large image that I want to print over 4 pages, each page showing 1/4 of the overall image, which I will paste together. I'm doing this from GIMP on an up-to-date FC12 system. Searching around, I found that there is a "scale" field in the print dialog and in the lp command that CUPS supports, and according to the documentation, if I set scale to 200% it should do what I want. However, when I set scale to 200%, I get only one page with the upper-left 1/4 of the image and then nothing. How do I get it to print the remaining 3 pages?
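As a workaround, ImageMagick can pre-cut the image into four equal tiles that then print one per page at normal scale (file names hypothetical):

Code:

# Cut the image into a 2x2 grid of equal tiles
convert big_image.png -crop 2x2@ +repage tile_%d.png
# Print each tile as its own page
for t in tile_*.png; do lp "$t"; done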







