I have a large text file with three columns. I'm trying to write a Perl script that splits the file up based on the value of the third column: every time the third column reads 0, a new file is created and all the data up until the next 0 is written to that new file. This should repeat until the initial file has been entirely split up.
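Something along these lines is the kind of sketch I have in mind (the part_NNN.txt naming is my own; it assumes whitespace-separated columns with a numeric third column, and that the very first line's third column is 0):

    # start a new output file every time column 3 is 0, then keep writing to it
    perl -ane 'if ($F[2] == 0) { open(OUT, ">", sprintf("part_%03d.txt", ++$n)) or die $! }
               print OUT $_' input.txt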
I've started having problems with large file downloads. A download will start and after a while freeze. The downloads window reports the correct connection speed and gives an estimated time to complete, but it stays frozen. Small downloads, torrents and surfing are not affected. I can do everything else normally even when the download is frozen. I've checked with my ISP and everything with my equipment checks out.
I have a Linux box (Ubuntu Server 8) that is busy collecting data files for me, but I need to see them on a Windows machine. The Windows XP computer is in an AD domain. The Ubuntu server is running Samba and I believe I have set up sharing correctly: I can see/list the files from Windows XP. However, when I try to open the files for reading (in this case with Wireshark) I get a permission denied error. Where and how can I set those permissions?
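I'm guessing the relevant pieces look something like the following, but I'd like confirmation (the share name and path are my placeholders, not my actual config):

    # in /etc/samba/smb.conf on the Ubuntu server:
    [captures]
        path = /srv/captures
        read only = yes
        guest ok = yes

    # and on the Linux side, make the capture files world-readable:
    chmod -R o+rX /srv/captures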
I would like to use an extra physical hard drive in my Linux server to give my wife a place to back up her Windows XP desktop. I am willing to format this drive as NTFS (or anything else) and dedicate it to this purpose. What is the easiest way to proceed?
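My rough plan is below, but I'd like a sanity check (the device name and paths are placeholders; this assumes exposing the drive to XP over Samba rather than moving it to the Windows box):

    # format and mount the spare drive, assuming it shows up as /dev/sdb1
    sudo mkfs.ext3 /dev/sdb1
    sudo mkdir -p /srv/xp-backup
    sudo mount /dev/sdb1 /srv/xp-backup

    # then export /srv/xp-backup as a writable Samba share for the XP machine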
I need to split up a large file on Windows so I can upload it in parts to a Linux machine. I'm looking to do the opposite of this, hopefully with some native utilities to keep it simple.
I understand the Linux side of the equation to be cat filea fileb > file
What is the simplest way to split files on a Windows machine so they can then be joined together via cat on a Linux machine?
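To be concrete, this is the pattern I'm after (the 100 MB chunk size is arbitrary, and the Windows half assumes something like Cygwin or Git Bash provides split, since I don't know of a truly native splitter):

    # on Windows (via Cygwin/Git Bash or similar):
    split -b 100m bigfile.bin part_

    # on Linux, rejoin the pieces in order:
    cat part_* > bigfile.bin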
I have a 7 GB VOB file which I created from a DVD using an ffmpeg dump to remove the CSS protection (doing so is legal where I live). Now I want to create a DVD/.iso that will be understood by regular DVD players/appliances. How do I do it?
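From what I've read, something like dvdauthor plus genisoimage might be the route, but I'm not sure this is right (filenames are placeholders):

    # build a VIDEO_TS structure from the VOB, then master a DVD-Video ISO
    dvdauthor -o dvd/ -t movie.vob     # add the title
    dvdauthor -o dvd/ -T               # write the table of contents (IFO files)
    genisoimage -dvd-video -o movie.iso dvd/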
I have a file with 5 columns. Column 4 contains numbers. Is it possible to split the file into multiple files using a condition on the contents of column 4? I.e., if column 4 contains a value between 0 and 10, print those lines to a new file called less_than_10.txt.
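Something like this awk sketch is what I'm imagining (it assumes whitespace-separated columns and reads "between 0 and 10" as inclusive; input.txt is a placeholder):

    awk '$4 >= 0 && $4 <= 10 { print > "less_than_10.txt" }' input.txt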
I am using RHEL 5. I have a very large text file which cannot be opened in vi. The file has some 8000 lines, and I need to view the lines from 5680 to 5690. How can I view these particular lines in a large file? What command and options do I need to use?
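I have been experimenting with combinations like the following, but I'm not sure which is right (bigfile.txt is a placeholder):

    # print only lines 5680 through 5690
    sed -n '5680,5690p' bigfile.txt

    # or equivalently:
    head -n 5690 bigfile.txt | tail -n 11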
I have NDISWRAPPER installed on my laptop, but I cannot install the downloaded driver, which is an 8 MB Windows XP DOS executable; I have tried everything without success. I can see my Iomega 250 Zip drive when I go into System > Administration > Disk Utilities and access its properties, but I cannot make it run.
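For what it's worth, my understanding is that ndiswrapper wants an .inf file rather than an .exe, so the download presumably has to be unpacked first; this is the flow I've been attempting (the filenames are guesses):

    # unpack the Windows driver package, then register its .inf
    cabextract driver.exe              # or unzip, depending on the packaging
    sudo ndiswrapper -i netdriver.inf
    sudo modprobe ndiswrapper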
I know that one can use ffmpeg to extract a smallfile.avi from a largefile.avi. But what I am looking for is a tool/command to split a large file into several files of a given size.
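The closest I've found is ffmpeg's segment muxer, though it splits by duration rather than by bytes (the 600 seconds here is an arbitrary chunk length; filenames are placeholders):

    ffmpeg -i largefile.avi -c copy -map 0 -f segment -segment_time 600 out%03d.avi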
I have a file with a size of 6 GB. I would like to compress this file and split it into smaller files. I was thinking of using bzip2 to compress it, because it offers a good compression ratio. How can I split this file into small ones to compress it?
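My current idea is to pipe the compressor straight into split, so the compressed copy never has to exist whole on disk (the 500 MB piece size and filenames are arbitrary):

    # compress and split in one pass
    bzip2 -c bigfile | split -b 500m - bigfile.bz2.part_

    # later, reassemble and decompress:
    cat bigfile.bz2.part_* | bunzip2 > bigfile

The catch, as I understand it, is that the pieces are not individually decompressible; they only decompress after being concatenated back together in order.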
Why do standard Linux installation utilities split the root file-system and the home file-system onto two separate but relatively equal-sized partitions? For example, when I put Fedora on an 80 GB disk, it automatically gave the root file-system 32 GB, home 30 GB and swap 8 GB of space. However, since my home file-system has a directory with 28 GB of files in it, why is my root file-system reading 100% usage? Is the home FS overlaid on top of the root FS? Is there an advantage to doing this? In the past I just made a boot partition (50 MB or so), a root partition (90% of the disk space) and a swap partition (4-5% of the disk space).
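As a sanity check, this is how I've been trying to verify whether /home really is a separate file-system from the root:

    df -h / /home    # different devices in the output mean separate file-systems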
I administer the computers in my office and want to monitor the users' activity. How can I log in remotely without disturbing the user's activity on his computer? Does any software need to be installed? (I don't want to use Terminal Server Client.)
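I was wondering whether a view-only VNC session is the usual answer here; something like this, assuming a VNC server is running on the user's machine (the hostname is a placeholder):

    vncviewer -ViewOnly officepc42:0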
I am facing a problem copying a large number of files, 18 lakh (1,800,000), from my personal hard disk to another hard disk. Each file is very small and the size of the folder is around 3.95 GB. Copying the files using the copy provided by Windows is frustrating, and I am not even able to compress them; it gives me an error that they are not readable. The other problem is that I am not able to open this drive in Linux: it shows an error there saying to run a disk check in Windows, and the Windows disk check is also not able to repair this drive and goes into some unsolvable mode. Is there any way to open a disk with errors in Windows, and if not, is there any way I can copy the data faster? The error reads roughly: "Disk labeled EDU is corrupt, go to Windows and run chkdsk /f there and reboot into Windows 2 times."
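For the faster-copy part, I've seen robocopy suggested as much quicker than Explorer for huge numbers of small files; something like this is what I would try, assuming E: is the corrupt source drive and F: the destination:

    robocopy E:\ F:\backup /E /R:1 /W:1    # /E copies subfolders, /R and /W cut retry delays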
I have a few hundred images of 30000 x 10000 pixels in size. Each image has lots of text (rendered as pixels) on it. I'd like to translate all text to another language. I speak both languages, and it's fine for me to translate each phrase manually. I need an image editor which can open these images quickly (faster than Inkscape, which needs about 60 seconds to open such an image), lets me zoom and rotate by 90 degrees, lets me erase (i.e. change the color of a selected rectangle to solid white), lets me add text, and lets me save the file as quickly as possible. I'd like to minimize the time I have to wait for the software to load, render and save images. Which is the best program for that on Windows? On Linux?
I have some file tools on a Mint machine that I would rather not install on my Mac laptop, mainly because of the vastness of apt-get and the low risk of installation failure. Anyway, every so often I have a file that I want to process in place using some remote tool. Both machines can ssh right into each other, so I was figuring there must be some script or tool out there that would allow me to type something like remote [file] [tool & args] to send my file to the other machine, get it processed, then get it back.
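If nothing exists, I imagine the wrapper would look roughly like this ("remote" and the hostname mintbox are my own inventions; it assumes the tool modifies its file argument in place):

    #!/bin/bash
    # usage: remote FILE TOOL [ARGS...]
    file=$1; shift
    base=$(basename "$file")
    scp "$file" "mintbox:/tmp/$base" &&     # ship the file over
    ssh mintbox "cd /tmp && $* \"$base\"" &&  # run the tool on it there
    scp "mintbox:/tmp/$base" "$file"        # bring the result back

So, for example, remote photo.jpg mogrify -resize 50% would round-trip the file through ImageMagick on the Mint box.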
I'm trying to copy a 6 GB file from my laptop to an external USB drive, but it quits at about 4.2 GB every time with a "file size limit exceeded" error. I have checked the output of ulimit -a and there is no limit there on the file size. I'm using the Slax live CD for this as it always gets the job done.
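The ~4.2 GB failure point makes me suspect the drive is FAT32, which caps individual files at 4 GiB; this is how I've been checking (the mount point is a placeholder):

    df -T /mnt/usbdrive    # look for "vfat" in the Type column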
I have two gzip files, ABC_000023232.gzip and BCD_023232032.gzip. I want to split these files into smaller files but keep the extension the same, because I am using the extension as a variable in a script.
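Something like this is what I'm after, assuming a GNU split recent enough to support --additional-suffix (the chunk size and prefix are arbitrary):

    split -b 100m --additional-suffix=.gzip ABC_000023232.gzip ABC_000023232_part_
    # produces ABC_000023232_part_aa.gzip, ABC_000023232_part_ab.gzip, ...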
I need to send large files from one Linux machine to another using cryptography. The sender machine knows the recipient's IP but not vice versa. I don't need strong cryptography and would prefer a higher-speed, less-secure solution.
There is no problem with presharing crypto keys, but I'd prefer not to deal with SSH user creation.
I'm thinking of HTTP PUT over TLS, but I have never had experience with it and would prefer to hear what the possible solutions are. I know that OpenSSL can listen as a daemon, but I don't know much about cryptography, so piping through OpenSSL may be a solution.
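The OpenSSL-pipe idea I have in mind is roughly this, with a pre-shared passphrase standing in for real key management (the port and passphrase are placeholders, and some netcat variants want nc -l -p 9000 instead):

    # on the receiving machine:
    nc -l 9000 | openssl enc -d -aes-128-cbc -pass pass:sharedsecret > big.bin

    # on the sending machine (which knows the receiver's IP):
    openssl enc -aes-128-cbc -pass pass:sharedsecret < big.bin | nc receiver_ip 9000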
I have a large number of log files on a Linux box that I need to cleanse of sensitive data before sending them to a third party. I have used the below script on previous occasions to perform this task, and it has worked brilliantly (the script was built with some help from here :-)
However, now one of our departments has sent me a CLIENT_FILE.txt with 425,000+ variables! I think I may have hit some internal limit. I have tried splitting the client file into 4, with around 100,000 variables in each; this still doesn't work. I'm loath to keep splitting, though, as I have 20 directories with up to 190 files in each directory to run through. The more client files I make, the more passes I have to do.
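One idea I'm considering, in case the limit comes from handing every variable to the command line: generate a single sed script from CLIENT_FILE.txt and apply it in one pass per log file (this assumes one variable per line in CLIENT_FILE.txt, and REDACTED is my placeholder replacement text):

    # turn each variable into an s||| rule, then run all rules in one pass
    sed 's/.*/s|&|REDACTED|g/' CLIENT_FILE.txt > cleanse.sed
    sed -f cleanse.sed logfile > logfile.cleansed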