Slackware :: Burning DVD - Files Which Are Spread Through Several Directories?
May 12, 2010
I've got the following problem: I need to burn files which are spread across several directories, have names longer than 32 characters, and may contain spaces and other unusual characters in the file names. The file sizes are about 300 MB up to 1.5 GB (perhaps more). I want to burn from .iso images (not on the fly). I'm not using any graphical interface, so programs like k3b are not an option.
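A command-line sketch of one approach: build the image with genisoimage using Rock Ridge and Joliet extensions (which preserve long names and spaces), then burn the finished .iso with growisofs. The paths, volume label, and device name below are examples, not anything from the original post.

```shell
# Build an ISO that preserves long names and spaces (paths are examples):
# -r            Rock Ridge extensions (long Unix names, permissions)
# -J            Joliet extensions (Windows also sees the long names)
# -joliet-long  allow Joliet names up to 103 characters
genisoimage -r -J -joliet-long -V "BACKUP_2010" \
    -o /tmp/backup.iso \
    /home/user/videos /home/user/music

# Burn the finished image to DVD (device name varies per system):
growisofs -dvd-compat -Z /dev/dvd=/tmp/backup.iso
```

On older systems the same tool may be installed as mkisofs; the options are the same.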
I have about 50GB of .MOD video files that I need to compress to a smaller format. The only problem is that there are many files (30 second to 5 minute clips) spread out across several directories. I was wondering if anyone knew of a tool that will recursively search the directories and batch convert all of the files? I'm open to anything including a good script for mencoder or ffmpeg.
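A sketch of a recursive batch convert using find and ffmpeg. The output format (H.264/AAC in MP4) and quality settings are just one reasonable choice, not something from the post; adjust to taste.

```shell
#!/bin/sh
# Recursively find .MOD clips under SRC and re-encode each one to an
# .mp4 next to the original. Skips clips already converted.
SRC="${1:-.}"

find "$SRC" -type f -iname '*.mod' | while IFS= read -r f; do
    out="${f%.*}.mp4"                       # clip.MOD -> clip.mp4
    [ -e "$out" ] && continue               # skip if already done
    ffmpeg -nostdin -i "$f" -c:v libx264 -crf 23 -c:a aac "$out" \
        || echo "failed: $f" >&2
done
```

The `-nostdin` flag matters inside the `while read` loop; without it ffmpeg eats the file list from stdin.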
I am in need of Linux help. I am at college and I need this backup/restore script to pass the final part of an assessment. I require a backup script that will not only back up but also restore files to the relevant directories. E.g. users are instructed to store all word-processor files in a directory named wp. So I need to create a backup directory and 3 directories within that, and some files within the 3 directories, and then back them up and restore them. I know I should/have to do this myself, but I've been trying to get/understand info for the last few days and came up with zero.
I am writing a script. My requirement is: all the file types are stored in one directory, and from that we need to separate them into different directories based on the file types.
For example, in a directory (anish) there are 5 items of different types: 1 directory, 2 .txt files, and 2 .sh files.
My requirement is that the directory is moved to one new directory (dir) which we give in the script, the 2 .txt files are moved to another new directory (test) which we give in the script, and the 2 .sh files are moved to another new directory (bash) which we give in the script. Finally the directory anish should be empty. How is this possible using a bash script?
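A sketch of that sorter, using the names from the question (anish, dir, test, bash). It assumes the script runs from the parent of anish.

```shell
#!/bin/sh
# Empty the directory "anish" by sorting its contents:
#   subdirectories -> dir/, *.txt -> test/, *.sh -> bash/
SRC=anish
mkdir -p dir test bash

for item in "$SRC"/*; do
    if [ -d "$item" ]; then
        mv "$item" dir/
    else
        case "$item" in
            *.txt) mv "$item" test/ ;;
            *.sh)  mv "$item" bash/ ;;
        esac
    fi
done
```

Anything that is neither a directory, .txt, nor .sh is left behind, so anish only ends up empty if those three types cover everything, as the example states.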
I'm confronted with a problem to which I cannot seem to find an answer. I'm trying to burn a DVD+RW and it continuously fails, whatever options I use. I've tried to burn an iso image, extracted the files from the image and tried burning to a 'new' k3b project, but all with the same result. Regular CD and DVD (not RW) works like a charm. This is the first try using DVD+RW. Below is the log from k3b session.
K3B burning data and ISO files. Disc Media: Datawrite DVD & CD Discs
Old Box - OS: WinXP Pro SP3 32bit, Burning Software: Nero, Mobo: Asrock P4V88+, Processor: AMD AthlonXP 2800+ Single Core CPU, Memory: 2GB DDR, Optical Drive: NEC DVD RW ND-4550A (2x, 4x & 8x write speeds) [Code].....
My current/old box (WinXP) can burn 2.9GB of data over a LAN from a mapped drive using Nero in 7 minutes and maintain file and directory names as they are, complete with upper and lower case characters, spaces between words and special characters, without shortening file/directory names out of the box. My new box (SuSE 11.2) was initially incapable of burning anything: "Access Denied". I added the user to the cdrom group, which allowed the burning of data, but the process changed file and directory names. It changed all letters to capitals and replaced blank spaces and special characters with underscores. I still can't burn ISO files: "Access Denied". The time it took to burn the 2.9GB of data on the hard drive of the SuSE install was sort of acceptable, but a full 5 to 6 minutes longer, and I was forced to shorten file names too.
I changed a setting to allow the full ASCII character set, hoping this would preserve file and directory names as they are during the burn process. I changed a setting to allow long file names too. Now I have another problem: on starting the burn process, after a minute or so K3B displayed the fact that it was going to take over 1½ hours to finish! Yikes. After an hour it said it would take a further 2hrs and 10mins. This didn't include read verify. I aborted the burn process; this can't be correct. I placed the LG drive in my old box, and the burn process took a second or two less than 7mins to complete using the same data over the same LAN. I've tested the drive for hardware errors; the drive is perfect. Am I missing something, failing to do something, or doing things wrong? Is the optical drive or some other component incompatible in some way?
I'll be burning many gigabytes of data on a daily basis, and I'm hoping to be able to burn a full DVD of mixed data types in less than 10mins whilst preserving the full ASCII character set and file & directory names, including special characters and blank spaces. I still need to resolve my "Access Denied" problem when trying to burn ISO files too.
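The all-capitals, underscores-for-spaces mangling is the classic symptom of a plain ISO9660 image without Rock Ridge/Joliet extensions enabled. A command-line sketch for comparison (the device path is an assumption, and this bypasses K3b entirely):

```shell
# Master and burn in one step; -r/-J keep mixed-case names, spaces
# and special characters intact (device path will vary):
growisofs -Z /dev/sr0 -r -J -joliet-long /path/to/data

# For the "Access Denied" on ISO burning, check who may write to the
# burner device and whether your user really is in its group:
ls -l /dev/sr0      # note the group owning the device (often cdrom)
groups              # confirm membership; log out/in after changes
```

If the command-line burn keeps the names intact, the fix in K3b is its ISO9660 filesystem settings (Rock Ridge / Joliet), not the drive.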
I am using rsync to create rotating snapshot-style backups of my web files and sending them via SSH to a remote location in order to burn them for offsite storage. This is all working perfectly. The remote machine is a Windows Server 2003 box which has data that I combine with my web files before burning. I have cygwin installed on the remote server in order to archive and compress the entire backup using tar. (This is not a post about cygwin; I just thought I would mention it in case anyone was wondering how I was running Linux commands after transferring it to the Windows box.) After compression, the backup is over 12GB. The next step in my process is to split this tar.gz file into smaller chunks in order to burn them to DVDs. I use dual-layer DVDs which have 8.5GB of storage.
I also use cygwin to split the tar.gz into multiple 2GB files using the split command. When I burn them, I only put 3 files on each disk, totaling 6GB, to leave some padding in case that was a problem. The burn completes and says successful, although it errors out during verification. I have tried this multiple times and it seems to fail verification at the same point every time, which leads me to believe it has something to do with the data. I have also done tests such as creating smaller backups with completely different data and burning that to a CD-R, which worked fine, so I'm convinced this process can work; I just can't get it to work in the right situation. I have also tried burning one of the 7 split files to a dual-layer DVD, which also worked fine. I'm wondering if there is a chunk of data causing this problem in one of the other split files?
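One way to pin down whether a particular chunk is bad: checksum every split part before burning, then verify the copies read back from the disc against the same list. The filenames below are examples.

```shell
# Split the archive and record a checksum for each chunk:
split -b 2G backup.tar.gz backup.part.
md5sum backup.part.* > backup.md5

# ... burn the parts, then mount the DVD and compare what came back:
cd /mnt/dvd && md5sum -c /path/to/backup.md5

# Reassembly later is just concatenation in lexical order:
cat backup.part.* > backup.tar.gz
```

If `md5sum -c` flags the same part every time, the problem follows that chunk of data rather than the media or the drive.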
I need a script that will take all the files in a given directory, create new monthly sub-directories, and sort all the files based on the creation date into the appropriate directory. For example, all files created between 01/01/09 and 01/31/09 will be placed in 'JAN-2009'.
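A sketch of that sorter. One caveat: most Linux filesystems don't record a true creation time, so the usual stand-in is the modification time, which is what this uses.

```shell
#!/bin/sh
# Sort the files in DIR into MON-YEAR subdirectories (e.g. JAN-2009)
# based on each file's modification time.
DIR="${1:-.}"

for f in "$DIR"/*; do
    [ -f "$f" ] || continue
    # date -r prints the file's mtime; %b-%Y -> "Jan-2009",
    # then tr uppercases it. LC_ALL=C pins the month names to English.
    sub=$(LC_ALL=C date -r "$f" +%b-%Y | tr 'a-z' 'A-Z')
    mkdir -p "$DIR/$sub"
    mv "$f" "$DIR/$sub/"
done
```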
I am running Ubuntu 8.04 (installed from CD) and have an .iso file for Ubuntu 10.04. Normally we can upgrade the system via the internet, but it always takes a lot of time to get it; for me it's around 2 hours @_@. What I want is to save time by upgrading the system from what I have (the iso file), not via the internet.
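If the image is the *alternate* install CD (the desktop live CD cannot drive an upgrade), one approach is to loop-mount it and run the upgrade script it ships with. The image filename below is an assumption, and some packages may still be fetched online if they're not on the disc.

```shell
# Loop-mount the alternate CD image and run its built-in upgrader
# (image name is an example; works only with the alternate ISO):
sudo mkdir -p /media/cdrom
sudo mount -o loop ubuntu-10.04-alternate-i386.iso /media/cdrom
sudo sh /media/cdrom/cdromupgrade
```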
I want to copy all directories, files, and hidden files and hidden directories with one command. I want these items to replace any same items in the target directory.
I have tried several things, such as:
cp -r *
cp -aR *
but I only seem to get visible files and directories. Obviously, I am missing something. (A brain, probably....)
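The missing piece is that `*` is expanded by the shell, and by default the shell's glob skips dotfiles. Copying the directory's contents via a trailing `/.` sidesteps globbing entirely. Directory names below are examples.

```shell
# Copy everything in src/ -- hidden files and directories included --
# into dst/, overwriting items of the same name. -a preserves
# permissions, timestamps, and symlinks.
cp -a src/. dst/
```

An alternative in bash is `shopt -s dotglob`, which makes `*` match dotfiles too.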
I am trying to burn a few video files to DVD using brasero, but when the burning process is about halfway through it simply disappears with no error at all. Only the terminal says
++ WARN: [brasero] Discarding incomplete final frame LPCM stream 0
++ WARN: [brasero] Discarding incomplete final frame LPCM stream 0
**ERROR: [brasero] Horizontal size is greater than permitted in specified Level.
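That error means the source video's resolution exceeds what the DVD MPEG-2 profile permits, so the authoring step aborts. One workaround is to re-encode to a DVD-legal file first and feed that to brasero; the input filename is an example.

```shell
# Re-encode to a DVD-compliant MPEG-2 file before authoring; use
# -target ntsc-dvd instead for NTSC regions:
ffmpeg -i input.avi -target pal-dvd output.mpg
```

The `-target` preset sets the resolution, frame rate, and bitrates to the DVD spec in one go, which avoids exactly this "size greater than permitted" failure.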
I have always used the same programs to burn media in Ubuntu -- Brasero and DVD/CD creator. But after Natty install I get all the actions of burning media and then things hang at the last part with "Creating image Checksum" and the computer just goes into a loop forever increasing the time remaining. If I cancel and eject, the media is invalid. It won't mount and show files.
I have a folder called VIDEO-TS filled with .vob files and other things. I did it before, but don't remember what I used to make it a regular dvd video. I tried Devede, but when I drag the folder to the app, it says "folder contains non-vob files and can't continue" or something like that.
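If the folder already holds a complete set of DVD files, one option is to skip Devede and build a DVD-Video image directly. Note the DVD spec expects the folder to be named `VIDEO_TS` (underscore, not hyphen) inside a parent directory; the names below are examples.

```shell
# Lay out the standard DVD structure, then build a DVD-Video ISO:
mkdir -p dvd_root
mv VIDEO-TS dvd_root/VIDEO_TS       # the spec requires this exact name
genisoimage -dvd-video -udf -o my_dvd.iso dvd_root

# Burn the image (device name varies):
growisofs -dvd-compat -Z /dev/dvd=my_dvd.iso
```

The Devede complaint about "non-vob files" suggests stray files in the folder; `-dvd-video` will likewise insist on a valid VIDEO_TS layout, so any leftovers may need moving out first.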
The NFS server is on Slackware-current64 and the client is on 13.1. The directory being exported is vfat-formatted, on a firewire drive. I've been accessing this share on the client with Amarok and Juk without incident for a couple of months. Yesterday I decided to try out moc on the client with this setup. After many revisions to the fstab options on both the server and client I still get the same result: after a couple of hours of running moc (following a remount and re-export), the directories on the NFS share on the client start disappearing or corrupting.
I am wondering if there is a way to use src2pkg on source directories I have checked out from git? I can't see any such option in the manpage. At the moment it works if I tar it up, but I can't find a way to just use the directory directly.
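Since src2pkg is happy with a tarball, one workaround is to let git produce the tarball itself rather than tarring by hand; the project name and version are examples.

```shell
# Produce a clean tarball of the checked-out tree (no .git metadata),
# then feed it to src2pkg as usual:
git archive --format=tar.gz --prefix=myproj-1.0/ -o myproj-1.0.tar.gz HEAD
src2pkg myproj-1.0.tar.gz
```

`git archive` only packs tracked files, so build litter in the working tree stays out of the package source.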
I've upgraded to 13.1 from 13.0 but can't startx. I get this error: 'call to lnusertemp failed (temporary directories full?)'
When I ctrl-alt-backspace there are numerous messages, including these two: '/usr/bin/kde4-config: symbol lookup error: /usr/lib/libkdecore.so.5: undefined symbol: _zn9qlistdatadetach3ev' and 'startkde: error: could not locate lnusertemp in /usr/bin/startkde: line 302: command not found'.
Linux 13.0 worked OK right out of the "box" so I don't understand why this version has this problem. Was something changed from 13.0 to 13.1?
I'm running slackware 13.37 and when looking through the x.org log I see that two directories are missing:
[ 68.572] (WW) The directory "/usr/share/fonts/local" does not exist.
[ 68.572] Entry deleted from font path.
[ 68.572] (WW) The directory "/usr/share/fonts/CID" does not exist.
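Those are only warnings: X drops the missing entries from the font path and carries on. To silence them, one option is simply to create the directories and index them (the alternative is removing the entries from the FontPath in xorg.conf):

```shell
# Create the missing font directories and generate their fonts.dir
# index files so X stops complaining:
sudo mkdir -p /usr/share/fonts/local /usr/share/fonts/CID
sudo mkfontdir /usr/share/fonts/local /usr/share/fonts/CID
```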
In OpenOffice I could open my Excel spreadsheets. Now that I have GNOME and AbiWord, they won't open. Any suggestions? I can't find OpenOffice in the repository.
I'm looking for a means to automate filling out web forms with data (names, addresses, etc...) from Excel spreadsheets. I'm not sure how to do it with a macro, or if there are programs out there, or if a script could do the same work.
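One scriptable route is to export the spreadsheet to CSV and replay each row as an HTTP POST with curl. The URL and field names below are invented placeholders; the real field names have to be read from the form's HTML, and this only works for plain forms (no JavaScript-driven submission).

```shell
#!/bin/sh
# Read name,address pairs from a CSV export and POST each one to a
# web form. URL and field names are placeholders.
while IFS=, read -r name address; do
    curl -s -o /dev/null \
        --data-urlencode "name=$name" \
        --data-urlencode "address=$address" \
        "http://example.com/form-handler"
done < contacts.csv
```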
How do I delete just directories and not files when performing an "rm -r foo*" command? E.g. I have foobar.txt, foofoo.o, foorebar.jpg and foo/, foonuggets/ and footemp/ in a directory. In one fell swoop, how do I delete just the directories and preserve the files?
Seeing as how I only use the -r switch when removing directories, I accidentally ran this command and removed files that I wanted (luckily nothing vital). Lesson learned; now I want to prevent ever doing that on files that *are* vital.
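find can restrict the match to directories, so the glob never touches the plain files:

```shell
# Remove only the directories matching foo* in the current directory,
# leaving foobar.txt, foofoo.o, etc. untouched:
find . -maxdepth 1 -type d -name 'foo*' -exec rm -r {} +
```

To preview before deleting anything, swap `rm -r` for `echo` and check the list it prints.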
I had a fresh install of Slackware 13.1 up and running. I was looking to use an app which could only run on a GNOME distro, so I installed gnomeslackbuild. I ran into dependency hell, so I gave up the ghost and decided to reinstall. The re-install resulted in the message "Call to lnusertemp failed (temporary directories full)". I had this before on a different machine and sorted it with another install at the time. I tried that, but it keeps coming up with the same message (4 installs later). Quote:
xset: bad font path element (#23), possible causes are:
    Directory does not exist or has wrong permissions
    Directory missing fonts.dir
I just want to know whether we can recover deleted OpenOffice spreadsheets. I am using Ubuntu 9.10. I don't know how to recover OpenOffice files after deleting them from the trash.
I also wanted to know whether we can recover the old content of an OpenOffice spreadsheet file after it has been saved with new content.
I recently deleted some files. I would like to know: are the files kept in a directory, like the Windows Recycle Bin? Where are these files?
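On desktop Linux there is an equivalent of the Recycle Bin: files deleted through the file manager land in the trash folder defined by the freedesktop.org spec. Files removed with plain `rm` at the command line, however, are not kept anywhere.

```shell
# Files deleted via the desktop go here (the folder only exists once
# something has been trashed):
ls ~/.local/share/Trash/files
```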
I downloaded some backgrounds from GNOME Art and am having trouble moving them to the backgrounds folder. I've been trying this: sudo mv desktop <filename> usr/share/backgrounds. I moved them to the desktop to make it easier.
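There are two problems in that attempt: `mv` takes the source file path as one argument (not "desktop" and the filename separately), and the destination needs a leading slash, otherwise it's treated as a relative path. A corrected form, keeping `<filename>` as the placeholder from the question:

```shell
# Move a wallpaper from the desktop into the system backgrounds
# folder (<filename> stands for the actual file name):
sudo mv ~/Desktop/<filename> /usr/share/backgrounds/
```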
I'm totally new to Linux and this website. I was wondering if anyone had, or could help me create, a shell script that would merge two files from two different directories and then put that new merged file in a third, different directory. The merged file would need to eliminate duplicates and sort the contents.
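`sort -u` does the merge, dedupe, and sort in one step, so the whole script is a single line; all the paths below are examples.

```shell
# Merge two files, drop duplicate lines, sort the result, and write
# it into a third directory (paths are examples):
sort -u /data/dir1/list_a.txt /data/dir2/list_b.txt > /data/merged/combined.txt
```

This deduplicates whole lines; if "duplicates" means something looser (e.g. same key, different fields), a different tool such as awk would be needed.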
I'd like to remove all directories of a certain depth that don't contain .txt or .log files -- is this possible? So far I have: find ~ -mindepth 3 -maxdepth 4 -type d -exec rm -r '{}' \; Is it possible to add in "only if the directory doesn't contain .txt and/or .log files"? Or do I have to start learning perl to do that?
For example:
dir 1: hello.txt runme.sh
dir 2: runme.sh oct12.log
[Code]....
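No perl needed: the outer find can hand each candidate directory to an inner find that checks for a matching file. A sketch that only prints what it would remove (drop the `echo` once the list looks right); the depth of 3 is taken from the question.

```shell
#!/bin/sh
# List directories at depth 3 under $HOME that contain no .txt or
# .log files anywhere below them; echo keeps this a dry run.
find "$HOME" -mindepth 3 -maxdepth 3 -type d | while IFS= read -r d; do
    # -print -quit stops the inner find at the first match, so the
    # test is cheap even in large trees
    if [ -z "$(find "$d" -type f \( -name '*.txt' -o -name '*.log' \) -print -quit)" ]; then
        echo rm -r "$d"
    fi
done
```

In the example above, dir 1 (hello.txt) and dir 2 (oct12.log) would both be kept; a directory holding only runme.sh would be removed.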