General :: Split A 7 GB .VOB File For DVD Authoring?
Jan 13, 2011
I have a 7 GB VOB file which I created from a DVD using ffmpeg dump to remove CSS protection (it is legal where I live to do so). Now, I want to create a DVD/.iso that will be understood by regular DVD players/appliances. How do I do it?
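Assuming the VOB is already DVD-compliant MPEG-2 (which a straight decrypted dump usually is), a minimal sketch with dvdauthor and genisoimage could look like the following. File and directory names are placeholders, and a 7 GB stream may need splitting or requantizing to fit a single-layer disc:

```shell
# Sketch, assuming movie.vob is DVD-compliant MPEG-2 (names are placeholders).
dvdauthor -o dvd/ -t movie.vob           # build the title set (VIDEO_TS files)
dvdauthor -o dvd/ -T                     # write the table of contents
genisoimage -dvd-video -o movie.iso dvd/ # wrap it into a playable .iso
```

The resulting movie.iso can be burned with any burner (e.g. growisofs or k3b) and should be accepted by standalone players.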
I've been trying to author a DVD using an Xvid AVI. I've tried several authoring programs (the last I tried was 2manDVD). Every time I add the file to the project, it comes up as being 102 seconds long, when it is really something like 42 minutes.
I do have things set up enough that the file plays well in a media player on the computer.
Also, what is the best program to use for this sort of thing (taking an Xvid file and converting/burning it so I can pop it in the DVD player)? On Windows I used to use the Windows DVD Maker application, which was clunky and terrible but got the job done!
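One hedged approach, before handing the file to any authoring tool, is to convert the Xvid AVI to DVD-compliant MPEG-2 with ffmpeg's built-in DVD target preset; that often also fixes bogus duration readings caused by container quirks. Filenames here are placeholders, and you would pick `pal-dvd` or `ntsc-dvd` to match your player:

```shell
# Sketch: re-encode an Xvid AVI to DVD-compliant MPEG-2 (placeholder names).
ffmpeg -i input.avi -target pal-dvd dvd_ready.mpg
```

The resulting dvd_ready.mpg can then be fed to dvdauthor or a GUI authoring tool.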
I have a file with 5 columns. Column 4 contains numbers. Is it possible to split the file into multiple files using a condition on the contents of column 4, i.e. if column 4 contains a value between 0-10, then print the lines to a new file called less_than_10.txt?
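A minimal awk sketch for this, assuming whitespace-separated columns (the sample data below is fabricated for illustration):

```shell
# Create a small placeholder input file.
printf 'a b c 5 x\na b c 42 x\na b c 9 x\n' > input.txt

# Lines whose 4th column is between 0 and 10 go to less_than_10.txt.
awk '$4 >= 0 && $4 <= 10 { print > "less_than_10.txt" }' input.txt
```

The same pattern extends to several buckets by choosing the output filename from the value of `$4`.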
I know that one can use ffmpeg to extract a smallfile.avi from a largefile.avi. But what I am looking for is a tool/command to split a large file into several files of a given size.
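For byte-level splitting (no re-encoding, pieces rejoin with cat), coreutils `split -b` is the usual tool. A sketch on a fabricated 1 MB stand-in file:

```shell
# Fabricate a placeholder 1 MB file, then cut it into 256 KB pieces.
dd if=/dev/zero of=largefile.bin bs=1024 count=1024 2>/dev/null
split -b 256k largefile.bin largefile.part.
# Pieces: largefile.part.aa ... largefile.part.ad
# Reassemble with: cat largefile.part.* > largefile.bin
```

Note that the pieces are not individually playable AVI files; this is purely a byte split.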
I have a file with a size of 6 GB. I would like to compress this file and split it into smaller files. I was also thinking of using bzip2 to compress it, because it offers a good compression rate. How can I split this file into smaller ones to compress it?
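One common order of operations is to compress first, then split the archive into fixed-size pieces. A sketch on a small fabricated stand-in (1 GB pieces shown; only the concatenation of the pieces is a valid bzip2 stream, not each piece alone):

```shell
# Fabricate a small stand-in for the 6 GB file.
dd if=/dev/urandom of=bigfile bs=1024 count=64 2>/dev/null
bzip2 -k bigfile                       # -k keeps the original; writes bigfile.bz2
split -b 1G bigfile.bz2 bigfile.bz2.   # pieces: bigfile.bz2.aa, .ab, ...
# Restore with: cat bigfile.bz2.* | bunzip2 > bigfile
```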
Why do standard Linux installation utilities split the root file-system and the home file-system onto two separate but relatively equal-sized partitions? For example, when I put Fedora on an 80 GB disk, it automatically gave the root file-system 32 GB, home 30 GB, and swap 8 GB of space. However, since my home file-system has a directory with 28 GB of files in it, why is my root file-system reading 100% usage? Is the home FS overlaid on top of the root FS? Is there an advantage to doing this? I just made a boot partition (50 MB or so), a root partition (90% of the disk space) and a swap (4-5% of the disk space).
I have gzip files ABC_000023232.gzip and BCD_023232032.gzip. I want to split these files into smaller files but keep the extension the same, because I am using it as a variable in a script.
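A sketch using GNU split's `--additional-suffix` option (coreutils 8.16 or newer), which keeps the `.gzip` extension on every piece. The input here is a fabricated stand-in, and remember that only the concatenation of the pieces decompresses, not each piece alone:

```shell
# Fabricate a placeholder file with the same naming scheme.
dd if=/dev/zero of=ABC_000023232.gzip bs=1024 count=512 2>/dev/null

# Split into 256 KB pieces, each keeping the .gzip extension.
split -b 256k --additional-suffix=.gzip ABC_000023232.gzip ABC_000023232_part_
# Pieces: ABC_000023232_part_aa.gzip, ABC_000023232_part_ab.gzip
```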
I want to split file foo into two parts, and I am only interested in the first one. But I want the N bytes that make up the first part to be an exact replica of the corresponding bytes of foo. More exactly:
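`head -c` copies exactly the first N bytes, byte for byte, so the first part is guaranteed to be an exact replica. A sketch on placeholder content:

```shell
# Placeholder content for foo.
printf 'abcdefghij' > foo

N=4
head -c "$N" foo > foo.part1            # first N bytes, an exact replica
tail -c +"$((N + 1))" foo > foo.part2   # the remainder, if ever needed
```

`dd if=foo of=foo.part1 bs=1 count=$N` achieves the same result.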
I have a file which contains one line with a lot of floating-point numbers. At the very beginning, and sometimes further along, there are a few integers, surrounded by blank spaces: 1 1.02-4 1.03-5 544 1.04-1 65 2.98-1 5.78-10 3.45-2 etc. I aim to split the file into several files, each of them containing an integer and the following floats until the next integer.
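One hedged awk sketch: walk the fields of the line and open a new output file at every bare integer (the `piece_N.txt` names are placeholders, and "integer" is taken to mean digits only):

```shell
# Placeholder input reproducing the sample data.
printf '1 1.02-4 1.03-5 544 1.04-1 65 2.98-1 5.78-10 3.45-2\n' > data.txt

awk '{
  n = 0
  for (i = 1; i <= NF; i++) {
    # A field of digits only starts a new piece file.
    if ($i ~ /^[0-9]+$/) { n++; out = "piece_" n ".txt" }
    if (n > 0) printf "%s ", $i > out
  }
}' data.txt
# piece_1.txt: 1 1.02-4 1.03-5   piece_2.txt: 544 1.04-1   piece_3.txt: 65 ...
```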
I have a huge split RAR archive on my webspace, but not enough space to decompress the file.
Is it possible to decompress the split archive and delete the already-decompressed parts on the fly? Currently I'm using WinRAR 3.70 beta 2, but it seems that it doesn't have such an option.
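As far as I know, unrar has no delete-parts-as-you-go option, but its `p` command can stream the archived file to stdout, so the decompressed data can be piped straight to another destination without ever occupying extra space on the webspace. A hedged sketch (archive, file, and host names are all placeholders):

```shell
# Stream one file out of a split RAR set instead of extracting to disk;
# -inul suppresses unrar's banner so only the file data reaches stdout.
unrar p -inul archive.part1.rar bigfile.iso | ssh user@otherhost 'cat > bigfile.iso'
```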
I wonder if there's a way to improve a script that I have created, since I'm not an expert! It works, and it did what I wanted, but it took a while to do it... maybe it can be improved. Here's the background. I have one file with 244000 lines; let's call it X. I needed to split it into 1000 files, each one of 244 lines. I also needed the files to have the .arp extension. So here is what I did:
Code:
for i in {1..1000}
do
    sed -n '1,244p;244q' X > $i.arp
    sed -i '1,244d' X    # in this way I deleted the copied lines each time
done
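The loop above rewrites X once per output file, which is why it is slow. A one-pass alternative with coreutils `split`, sketched here on a small stand-in file (the rename loop reproduces the 1.arp ... 1000.arp naming; glob order matches split's output order):

```shell
seq 1 2440 > X        # small stand-in for X: 2440 lines -> ten 244-line files
split -l 244 X part_  # part_aa, part_ab, ... written in a single pass

i=1
for f in part_*; do
  mv "$f" "$i.arp"    # rename each piece to N.arp
  i=$((i + 1))
done
```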
I have a log file on Ubuntu 10.04 that has 500 lines of log data in it. What command could I use in a terminal to split the single 500-line file into ten files of 50 lines each?
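This is exactly what `split -l` does; a sketch on a fabricated 500-line stand-in (`-d` asks GNU split for numeric suffixes):

```shell
seq 1 500 > mylog.log                 # placeholder stand-in for the real log
split -l 50 -d mylog.log mylog.part.  # mylog.part.00 ... mylog.part.09
```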
I am trying to split an MPEG file using ffmpeg. The splitting itself works OK, but the quality is lower. What should I do in order to keep the same video quality?
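The quality drop suggests the streams are being re-encoded. With stream copy, ffmpeg cuts without touching the encoded data, so quality is unchanged; the trade-off is that cuts can only land near keyframes. A hedged sketch (times and filenames are placeholders):

```shell
# Cut without re-encoding: -c copy passes the streams through untouched.
ffmpeg -ss 00:00:00 -t 00:10:00 -i input.mpg -c copy part1.mpg
ffmpeg -ss 00:10:00 -i input.mpg -c copy part2.mpg
```

On older ffmpeg builds the equivalent flags are `-acodec copy -vcodec copy`.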
I have a large text file with three columns. I'm trying to write a Perl script that splits the file up based on the value of the 3rd column. So every time the third column reads 0, a new file is created, and all the data up until the next 0 is found is written to that new file. This should happen over and over until the initial file has been entirely split up.
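As an alternative to Perl, the same logic fits in an awk one-liner: bump a counter whenever column 3 is 0, and write every line to the file named after the current counter (the `chunk_N.txt` names and the sample data are placeholders):

```shell
# Placeholder input: column 3 reads 0 twice, so two chunks result.
printf 'a b 0\na b 1\na b 0\na b 2\na b 3\n' > big.txt

# New output file every time the third column is 0.
awk '$3 == 0 { n++ } { print > ("chunk_" n ".txt") }' big.txt
```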
Mandriva 2010, KDE 4.3.5, Dolphin 1.3. I like to use split screen with the file manager. I would like duplicate up and back buttons that always correspond to each pane of the window, so I need not worry about which pane has the focus when I go to use the buttons. I am hoping to find this functionality anywhere, anyhow. I have been using the default Dolphin file manager but don't care if I need to change to a different one.
I want to burn a complete normal DVD that can be played in a normal DVD player. My source file is encapsulated in .mp4 and is about 1.3 GB. I have tried to use mandvd1, mandvd2, Bombono DVD and KMediaFactory.
mandvd doesn't produce the correct output: every time I get an error message that says there is a problem with the menu creation. I have installed the latest package of dvdauthor. mandvd2 doesn't work at all: the rendering process completes after 6 seconds.
I can't start KMediaFactory: "KMediaFactory uses "MJPEG Tools (mpeg2enc)", and this has not been found on your system. You must install this before KMediaFactory can continue." However, MJPEG Tools is installed. Bombono DVD doesn't create an ISO but a corrupt MPEG file. I really need good advice here; I can't fix it alone. I have never solved a single problem with the community's help; I think now it's time to start.