Software :: VIM Runs Into An Indefinite Loop While Editing A Large File?
Apr 12, 2011
May I ask whether VIM supports editing large files? I'm editing a 3GB (good, less than 4GB) text file with several million lines. I deleted the first ~110,000 lines and ran :x (save and exit). OMG: this operation took more than 3 days. I think it had hung, so I terminated it. So my question is: can VIM do this???
Mine is:
VIM - Vi IMproved 7.0
Included patches: 1, 3-4, 7-9, 11, 13-17, 19-26, 29-31, 34-44, 47, 50-56, 58-64, 66-73,
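A hedged note: deleting a leading block of lines from a multi-gigabyte file is usually much faster with a stream tool than with an interactive editor. A sketch, assuming GNU sed and a placeholder file name:
Code:
# A sketch, assuming the first 110,000 lines should go; bigfile.txt is a
# placeholder for the real file name. tail -n +110001 would work as well.
sed '1,110000d' bigfile.txt > bigfile.trimmed && mv bigfile.trimmed bigfile.txt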
My .jar file needs and uses some files in the same directory it's in (everything, including the jar, was unzipped into that directory). It runs perfectly when I do java -jar file.jar on the command line, but there's trouble when I double-click the file from the file manager. I've tried a custom command under Properties, i.e. java -jar, but the problem is that the .jar file doesn't seem to be able to use any of the files in its directory. When running, the jar can't find any of the files that it needs.
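A possible workaround, sketched under the assumption that the file manager can be pointed at a small wrapper script kept next to the jar: change into the jar's directory before launching it, so relative paths resolve.
Code:
#!/bin/bash
# Sketch of a launcher stored next to file.jar: cd into the script's own
# directory first so the jar finds the files it needs via relative paths.
cd "$(dirname "$0")" || exit 1
exec java -jar file.jar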
I am using RHEL 5. I have a very large test file which cannot be opened in vi. The file has some 8000 lines. I need to view about ten lines, from 5680 to 5690. How can I view these particular lines in a large file? What command and options do I need to use?
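A sketch using sed, with a placeholder file name; -n suppresses the default output and the final q stops reading once line 5690 has been printed, which matters on a big file.
Code:
# Print only lines 5680 through 5690 of bigfile.log (placeholder name).
sed -n '5680,5690p;5690q' bigfile.log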
I'm using vim 7.2.330 on 64-bit Ubuntu 10.04. Sometime in the last week it started placing the cursor at the top of the file every time I reopen a file. I can see in ~/.viminfo where it's saving the last position used, but it doesn't seem to be honoring it.
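Hedged: the "jump to last position" behaviour comes from an autocommand in the vimrc rather than from .viminfo alone, so one thing worth checking (an assumption, not a confirmed diagnosis) is whether that autocommand is still present; it can be re-added from the shell.
Code:
# A sketch: append the stock "jump to last known position" autocommand from
# vim's documentation to ~/.vimrc (back the file up first if unsure).
cat >> ~/.vimrc <<'EOF'
au BufReadPost * if line("'\"") > 1 && line("'\"") <= line("$") | exe "normal! g`\"" | endif
EOF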
I want to loop through the records in the file below (homedir.temp):
/home/user1
/home/user2
/home/user3
I want to do the following with each record:
1. du -s - to get the total usage for that directory (my variable name is SIZE)
2. divide SIZE by the du -c total for /home to get the percentage of usage (my variable name is PER)
3. write the directory, SIZE, and PER to a file
PROBLEM: I am using the for loop below:
for record in homedir.temp
the mentioned activities
done
The above is not looping through the records. It does the first record perfectly and then exits the loop.
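One way to get all three steps per record, sketched with a while-read loop; usage_report.txt is a placeholder output name and the percentage is integer arithmetic.
Code:
#!/bin/bash
# A sketch, assuming homedir.temp holds one directory path per line.
TOTAL=$(du -s /home | awk '{print $1}')          # total /home usage in KB
while read -r record; do
    SIZE=$(du -s "$record" | awk '{print $1}')   # usage of this directory
    PER=$(( SIZE * 100 / TOTAL ))                # rough percentage of /home
    echo "$record $SIZE ${PER}%" >> usage_report.txt
done < homedir.temp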
I multi-boot several Linux distributions with an assortment of additional data partitions. I get frustrated whenever fsck is forced during boot. (It ONLY happens when I'm in a hurry, don't you know...) So I wrote a script to automate forced fscking when I do have the time (and/or while I'm doing something else in another workspace).
Because I multi-boot, I've learned that udev doesn't always assign the same device name to each drive in all distributions. I've had the same partition identified as hda5, sda5, and sdb5 by different distributions (without doing anything to affect the boot order). So my solution is to keep a list of partitions in a specific file on each distro, with valid device names according to that distribution's udev process. Actually I'd use LABEL= instead, but the labels don't show up in /etc/mtab, and I like to make sure a partition isn't mounted before I try to fsck it.
I can make this work in a for loop using cat. But I've seen so many things about NOT using cat that I wanted to rebuild my script. I can make this work with a redirect instead of cat via a while loop, but I "LIKE" old-style for loops, and I can't seem to find a way to make a redirect work with one. I thought this might make a good first "LinuxQuestions.org" question. I'm also open to any other suggestions on better/alternative methods... Is it possible to redirect from a file into an actual for loop?
My script is as follows:
Code:
#!/bin/bash
# FsckEm: a script to force file system checking on unmounted ext2/ext3/ext4
# partitions in a preselected list. FsckEm accepts no options. Partition
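Hedged: a redirect can't supply the word list of a classic for loop, but $(< file) command substitution can; a sketch with a placeholder list path:
Code:
#!/bin/bash
# A sketch, assuming one partition device name per line in partition.list
# (placeholder path). $(< file) expands the file's contents as the for loop's
# word list, so neither cat nor a while-read redirect is needed.
for part in $(< partition.list); do
    if ! grep -q "^$part " /etc/mtab; then    # skip anything currently mounted
        fsck -f "$part"
    fi
done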
I have 50 files, named NSSAVE0001.vtk to NSSAVE0050.vtk. Do I have to manually type an individual command to open each file, or can I use a loop to open them?
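A sketch, assuming the files live in the current directory, GNU seq is available, and "viewer" stands in for whatever program opens .vtk files:
Code:
#!/bin/bash
# Open NSSAVE0001.vtk ... NSSAVE0050.vtk one after another; "viewer" is a
# placeholder for the actual program. seq -f "%04g" zero-pads the numbers.
for i in $(seq -f "%04g" 1 50); do
    viewer "NSSAVE${i}.vtk"
done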
I have a list of millions of filenames of the form, say, "drau3DDFEA5E01205841DC1B277A". I need to convert each one into usr/local/data/d/dr/dra/drau/drau3DDFEA5E01205841DC1B277A. I believe that can be achieved with awk or sed.
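sed can build the nested prefix from the first four characters using nested capture groups; a sketch assuming GNU sed and a placeholder input file:
Code:
# A sketch, assuming one name per line in names.txt (placeholder) and GNU sed.
# The nested groups capture the first 4, 3, 2 and 1 characters of each name.
sed -E 's|^((((.).).).)(.*)$|usr/local/data/\4/\3/\2/\1/\1\5|' names.txt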
I am trying to run a PHP script over and over again. I would love to be able to just write a small bash script that will do this for me; however, I am having trouble writing a for loop that works. Here is what I have now:
When I run this (./bash_import) I get the following errors:
I have also tried just putting "php wps_product_import.php" on 20 different lines, and I get an error saying it can't open the script file, and it only runs once.
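A sketch of the kind of loop that should work, assuming the script lives in a directory given here only as a placeholder; php is run from that directory so it can open the file by name.
Code:
#!/bin/bash
# A sketch: run wps_product_import.php 20 times. /path/to/script is a
# placeholder for wherever the PHP file actually lives.
cd /path/to/script || exit 1
for (( i = 1; i <= 20; i++ )); do
    php wps_product_import.php
done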
I recently upgraded my file/media server to Fedora 11. After doing so, I can no longer copy large files to the server. The files begin to transfer, but typically after about 1gb of the file has transferred, the transfer stalls and ultimately fails with the message:
"Error writing to file: Input/output error"
I've run out of ideas as to what could cause this problem. I have tried the following:
1. Different NFS versions: NFS3 and NFS4
2. Tried copying the files to different physical drives on the server.
3. Tried copying the files from different physical drives on the client.
4. Tried different rsize and wsize block sizes when mounting the NFS share (example mount line below).
5. Tried copying the files via a different protocol, SSH in this case. The file transfers are always successful when I use SSH.
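For reference, the kind of mount line being varied in items 1 and 4 might look like this (server export, mount point, and sizes are placeholders, not taken from the post):
Code:
mount -t nfs -o vers=3,rsize=32768,wsize=32768 server:/export/media /mnt/media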
Regardless of what I do, the result is the same. The file transfers always fail after approximately 1gb.
Some other notes.
1. Both the client and the server are running Fedora 11 kernel 2.6.29.5-191.fc11.x86_64
I am out of ideas. Has anyone else experienced something similar?
I've got a large .tar.gz file that I am trying to extract. I have had a look around at the problem, and it seems other people have had it, but I've tried their solutions and they haven't worked. The command I am using is:
I'm trying to copy a 6GB file across from my laptop to an external USB drive, but it quits at about 4.2GB every time with a "file size limit exceeded" error. I have checked the output of ulimit -a and there is no limit on the file size there. I'm using the Slax live CD for this, as it always gets the job done.
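A hedged note: a hard stop at roughly 4GB usually points to a FAT32-formatted target, which caps single files at 4GB. That can be checked, and worked around by splitting the file, roughly like this (file name and mount point are placeholders):
Code:
df -T /media/usbdrive                      # show the drive's filesystem type
split -b 2000m bigfile.img bigfile.part_   # split into ~2GB pieces that fit
cat bigfile.part_* > bigfile.img           # rejoin on the destination machine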
I had/have an old program which works on XP called PaperPort. It allows me to edit and fill in blank forms. I have been searching for something like that for Linux and to date have found nothing. Chances are that I am looking in the wrong places.
I have a Samsung laptop, and there is a nasty program called Samsung Recovery Solution that changes the active partition to Windows each time you enter it. The trouble is: I have two OSes, openSUSE and Windows 7, and GRUB is installed in my extended partition, so each time I enter this recovery program the active flag is changed and Windows is loaded instead of GRUB. I have noticed some boot files in that program's folder, so the question is: how can I edit them in the most convenient manner? I have KDE 4.5.3; is there some sort of included program that would help?
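If the goal is simply to get GRUB booting again after the recovery tool moves the active flag, one hedged sketch is to set the flag back from a terminal; /dev/sda and partition number 4 are assumptions, not details from the post.
Code:
# Put the active/boot flag back on the extended partition that holds GRUB.
# Check the actual device and partition number with "fdisk -l" first.
sudo parted /dev/sda set 4 boot on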
I need to run ./pythonScript keyword once for each keyword in a text file. How can I do this from a GNOME terminal (without having to modify the pythonScript)?
pseudo code:
for each keyword in file:
    ./pythonScript keyword
    waitfor(pythonScript to finish)
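A sketch of that loop in bash, assuming the keywords sit one per line in keywords.txt (placeholder name); each invocation finishes before the next starts because the loop runs them in the foreground.
Code:
#!/bin/bash
# Run ./pythonScript once per keyword, one at a time.
while read -r keyword; do
    ./pythonScript "$keyword"
done < keywords.txt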
I am trying to store the results of my code in a separate text file. But the problem is that, as my results come from a loop, the text file shows only the last result, not all of them. If the loop runs 5 times, the text file shows the result for the 5th step, but I need to store all of them (1 to 5). Can I use awk to print the output field, store it in another file, and create a new line so that the next output field goes on a new line? (Just an idea, I don't know.)
#!/bin/bash
for (( i=1; i<=5; i++ ))
do
    ./file.exe > output.txt
done
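A hedged fix sketch: with >> each pass appends instead of overwriting, so all five results end up in the file (redirecting the whole loop once with "done > output.txt" would also work).
Code:
#!/bin/bash
for (( i=1; i<=5; i++ )); do
    ./file.exe >> output.txt   # append, so earlier results are kept
done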
I need to send large files from a Linux machine to another using cryptography. The sender machine knows the recipient IP but not vice-versa. I don't need strong cryptography and prefer higher-speed less-secure solutions.
There are no problems with presharing crypto keys, but I'd prefer not to deal with SSH user creation.
I'm thinking of HTTP PUT over TLS, but I've never had any experience with it and I'd prefer to hear what the possible solutions are. I know that it can listen as a daemon, but I don't know anything about the cryptography side, so piping through OpenSSL may be a solution.
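One pre-shared-key sketch along those lines pipes tar through openssl enc and netcat, with the recipient listening since only the sender knows the other side's IP; the key file, port, cipher, and paths are all examples, not details from the post.
Code:
# On the recipient ("nc -l -p 9000" is traditional netcat syntax; BSD netcat
# wants "nc -l 9000"):
nc -l -p 9000 | openssl enc -d -aes-128-cbc -pass file:secret.key | tar xz

# On the sender (recipient_ip and /data/to/send are placeholders):
tar cz /data/to/send | openssl enc -aes-128-cbc -salt -pass file:secret.key | nc recipient_ip 9000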