Software :: Differentiate Two Large Text Files Using Shell Script / Files Are Like Below?

Jan 20, 2009

I want to automate this comparison using a shell script. How can I automate it?

File1:
s.no# 1 name:aaaaaa
city:abcd

[code]...
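
A minimal sketch of such a script, assuming a plain line-by-line comparison is wanted (file1.txt and file2.txt stand in for the real files):

Code:

#!/bin/sh
# Report whether the two files differ; write the details to a report file.
if diff -q file1.txt file2.txt >/dev/null; then
    echo "Files are identical"
else
    diff file1.txt file2.txt > differences.txt
    echo "Files differ; see differences.txt"
fi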

View 1 Replies



General :: Searching Text Files For Large Numbers?

Dec 31, 2010

I am looking for a way to search for large numbers in text files and print the nearby lines.

For example if I had a text file like:

Event: 11
blah: 3
blah: 41 bleh: 19
Event: 2
blah: 31

[Code].....
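
One hedged approach, assuming "large" means two or more digits (events.txt is a placeholder; grep's -B/-A flags print lines of context before and after each match):

Code:

grep -E -B1 -A1 '\b[0-9]{2,}\b' events.txt

For a true numeric threshold (say, anything above 30), an awk script that tests each field numerically would be the next step.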

View 1 Replies View Related

General :: Cp Adds Exclamation Points When Copying Very Large Text Files?

Jul 13, 2009

For my research I have some very large files that are basically millions of lines of ten columns of numbers. These files can be up to 5 GB in size. Recently I noticed that when I made a copy of one of my files, some exclamation points appeared in it where there should not be any: in front of random numbers throughout the file. Making another copy of the file would result in exclamation points in front of different numbers in different parts of the file. Doing this many times has given me up to four exclamation points in different parts of the file. Sometimes the file copies just fine without producing any extraneous exclamation points.

Additionally, I have occasionally seen a "^K" where there should be a newline (the data that should have been on the next line was instead on the previous line with a ^K in front of it) in copies that I have made of my files. I don't know if this is related or not.
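
Not an answer to the corruption itself, but a quick way to confirm whether a given copy is corrupted (file names are placeholders):

Code:

# Differing checksums confirm the copy is corrupted:
md5sum original.dat copy.dat
# cmp compares byte-by-byte and reports the first difference:
cmp original.dat copy.dat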

View 7 Replies View Related

General :: Replacement On More Than One Line In Text Files Using Shell

Nov 25, 2010

How can I replace one instance of a word in a text file with a piece of text that spans several lines? I know sed or awk is the way to go, but I don't know how I can introduce new paragraphs using these tools.
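
A minimal sketch with GNU sed, where \n in the replacement text inserts a newline (PLACEHOLDER and input.txt are stand-ins):

Code:

# The 0,/regex/ address limits the substitution to the first occurrence.
sed '0,/PLACEHOLDER/s/PLACEHOLDER/line one\nline two\nline three/' input.txt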

View 1 Replies View Related

Software :: Join 2 Text Files Based On First Number Present In Every Line Of The 2 Text Files?

Jan 22, 2010

I have 2 text files : file1.txt and file2.txt

cat file1.txt

15 this is a sentence containing various words and spaces
34 this is a another sentence containing various words and spaces

cat file2.txt

2 this is sentence1file2
6 this is sentence2file2
54 this is sentence3file2

I would like to join these 2 files. The result should look as follows :

cat joinedfile.txt

2 this is sentence1file2
6 this is sentence2file2
15 this is a sentence containing various words and spaces
34 this is a another sentence containing various words and spaces
54 this is sentence3file2

==> so the joined file must be sorted on the first number. Any ideas how this can be achieved?
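
One way to achieve this, since sort accepts multiple input files:

Code:

# -n sorts numerically; -k1,1 restricts the sort key to the first field
sort -n -k1,1 file1.txt file2.txt > joinedfile.txt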

View 4 Replies View Related

Ubuntu :: Command With The -r Option To Compare A Large Number Of Files And Files In Subdirectories

Jun 16, 2011

I am using the diff command with the -r option, to compare a large number of files and files in subdirectories. My main interest is to find out which files have been changed, and not what the actual changes are, and since a lot of files have been changed, it would be a lot easier to view the file names only. Is there an option for diff that might do this, or does there exist a similar tool/command that could do the job?
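
diff itself has this: the -q/--brief flag reports only which files differ, not the changes themselves:

Code:

diff -rq dir1 dir2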

View 1 Replies View Related

OpenSUSE :: Dolphin Losing Files When Copying Many Files Or Large Folders?

Feb 14, 2010

I've discovered that Dolphin seems to lose random files when copying many large folders.

I first noticed this a few months ago when I tried to copy my music library from one folder to another on the same HDD. It consisted of around 600 folders and 6500 files. During the copy there were no errors but after the copy I found that some of the newly copied folders were missing files. I put it down to human error or a glitch.

Yesterday I tried to copy 13 folders containing rips of some of my DVDs. Each folder basically had one film of either 700MB or 1.4GB. Again no errors showed up during the copy but I found 3 of the newly copied folders were empty.

It's not so critical with music or films but I can't afford to lose work data like this.

Has anyone experienced or seen a similar problem with Dolphin? I'm going to have to do some more extensive testing but this is not good.

The first time I noticed the problem I was running KDE4.3.4 (I think) and now the latest was with KDE4.4.0.
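
One workaround while testing, since a command-line copy reports errors explicitly and the result can be verified afterwards (paths are examples):

Code:

# rsync lists each file and returns non-zero on any failure,
# making silent omissions easier to spot than with a GUI copy:
rsync -av --progress /path/to/music/ /path/to/destination/
# then compare the two trees:
diff -rq /path/to/music/ /path/to/destination/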

View 9 Replies View Related

General :: Write A Shell Script Which Will Simultaneously Collect OS User Information And Write In An Individual Text Files?

Feb 17, 2010

I want to write a shell script which will simultaneously collect OS user information and write it to individual text files. Can anyone tell me the syntax of the script? N.B. The user names will be listed in an array within the shell script.
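
A minimal sketch, with example user names in the array and id/getent standing in for whatever information is actually needed:

Code:

#!/bin/bash
# User names are held in an array, as described; these names are examples.
users=(alice bob carol)

for u in "${users[@]}"; do
    # Collect account details for each user in parallel,
    # one text file per user.
    { id "$u"; getent passwd "$u"; } > "${u}_info.txt" 2>&1 &
done
wait   # block until all background jobs have finished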

View 8 Replies View Related

General :: Shell Script For Identifying The File And Zip All Files, Move The Files To Target Dir?

May 7, 2011

1. Every Sunday
2. Find all files older than 1 day
3. Gzip these files
4. Tar up the gzipped files into one tar file
5. Name the tarball with a date stamp indicating the day it was created, so we know which week's files it contains
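
A sketch implementing those steps, with assumed paths; the Sunday schedule itself would come from a cron entry:

Code:

#!/bin/bash
# Cron entry for step 1 (every Sunday at 02:00): 0 2 * * 0 /path/to/weekly_archive.sh
# SRC and DEST are assumed locations; adjust to the real directories.
SRC=/path/to/files
DEST=/path/to/archives
mkdir -p "$DEST/processed"

# Steps 2-3: gzip every regular file older than one day
find "$SRC" -type f -mtime +1 ! -name '*.gz' -exec gzip {} +

# Steps 4-5: tar the gzipped files into a date-stamped tarball,
# then move them out of the source directory
tar -cf "$DEST/archive_$(date +%F).tar" "$SRC"/*.gz
mv "$SRC"/*.gz "$DEST/processed/"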

View 3 Replies View Related

Programming :: Command For Find/Replace In Text Files (inc. Files In Sub-folders)?

Oct 11, 2010

I found this command that works great finding and replacing a simple string to another in files located in that folder and all sub-folders.

Code: find . -name '*.php' | xargs perl -pi -e 's/OldText/NewText/g'

The problem I have is that I need to replace a more complex string, like this:

Old string: /mnt/stor6-wc2-dfw1/627896/982574/
New string: /mnt/stor8-wc2-dfw1/369587/302589/

There I don't know how to do it, since the / is what separates the old from the new strings, and the strings that I want to replace contain / themselves. Also, I would like to know how to specify under what folder to replace in files; for example, I want it to search and replace in all files under the /var/www/mysite/htdocs folder.
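
perl (like sed) accepts almost any character as the s/// delimiter, which sidesteps the slash problem; using | and the paths from the question:

Code:

find /var/www/mysite/htdocs -name '*.php' \
  | xargs perl -pi -e 's|/mnt/stor6-wc2-dfw1/627896/982574/|/mnt/stor8-wc2-dfw1/369587/302589/|g'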

View 1 Replies View Related

Fedora :: Compression Ratio For Text Files / Binary Files

Apr 15, 2011

Fedora provides several compression techniques, e.g. tar, tar.gz, zip, etc. I want to know which among them provides

1. the best compression ratio for text files

2. the best compression ratio for binary files

3. fastest compression
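
Note that tar alone only archives; the compression comes from gzip, bzip2, xz and friends. Ratios depend heavily on the data, so a small benchmark on your own files is the most reliable answer. A sketch, with sample.txt as a placeholder (repeat with a binary file for case 2):

Code:

f=sample.txt   # placeholder test file
for tool in gzip bzip2 xz; do
    # -k keeps the original (gzip needs version 1.6+ for -k); -9 = maximum compression
    /usr/bin/time -f "$tool: %e seconds" "$tool" -k -9 "$f"
done
ls -l "$f" "$f".gz "$f".bz2 "$f".xz   # compare the resulting sizes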

View 6 Replies View Related

General :: Convert Open Office (odt) Files To Text Files?

May 25, 2011

How do you convert OpenOffice (ODT) documents to text files?
I have made a report using LibreOffice. Now I wish to continue editing the document using LyX (a LaTeX front end), so the ODT file needs to be saved as some .tex file.

I don't see an option to do this in the File menu (export/save as). So is there another plugin to do this?
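
No plugin is needed for plain text: LibreOffice can convert from the command line (report.odt is a placeholder), and the small odt2txt utility is an alternative:

Code:

# Run with no other LibreOffice instance open:
soffice --headless --convert-to txt report.odt
# or:
odt2txt report.odt > report.txt

For .tex output specifically, stock LibreOffice has no built-in LaTeX export; the Writer2LaTeX extension (command-line tool w2l) is one option.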

View 1 Replies View Related

Software :: Use Sed To Include A Text File In The Beginning Of Other Text Files Inside A Folder And Its Subfolders?

Jun 1, 2010

Can I use sed to include a text file at the beginning of other text files inside a folder and its subfolders? So it should be recursive.
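
sed's r command reads a file in *after* a given line, so prepending is awkward with sed alone; a common recursive idiom uses cat through a temporary file instead (header.txt is a placeholder):

Code:

find . -type f -name '*.txt' ! -name 'header.txt' | while read -r f; do
    # build the new file with the header first, then swap it into place
    cat header.txt "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done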

View 4 Replies View Related

CentOS 5 :: Text User Interface To Configure Text Files

Nov 5, 2009

I would like to write a text user interface (TUI) to adjust some text config files etc. Is there a tool or application for creating TUIs like this? I'm talking about the type of config tool you see executed at first boot.
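
The dialog utility (or its lighter cousin whiptail) is the usual tool for this kind of screen. A minimal sketch; the menu entries and config path are examples:

Code:

#!/bin/bash
# Show a menu and append the chosen tag to a config file.
choice=$(dialog --stdout --menu "Select log level" 12 40 3 \
    1 "debug" 2 "info" 3 "error")
[ -n "$choice" ] && echo "loglevel=$choice" >> /tmp/myapp.conf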

View 19 Replies View Related

OpenSUSE :: Burn Large Files With K3b Udf

Nov 3, 2010

The problem is with files greater than 4 GB onto dual layer or BD disks: mkisofs crashes. I believe the problem arises when this requires a UDF filesystem, and the disk does not get written to. As of yesterday, the version is SUSE 11.3 patched to date. k3b writes standard DVD disks OK. I am seeing a lot of searches saying this is because of cdrkit rather than cdrtools; can I replace this easily? Is this the case? Tried setting the k3b options when burning to UDF, same error. Other searches show k3b has fixed problems with these issues, so this appears to point to the underlying mkisofs stuff. The full error log is in yesterday's post if it helps. I will also try on Ubuntu to see if it works there.
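
One workaround reported for this cdrkit limitation is to build the image manually with flags that permit >4 GB files, then burn the finished image (paths are examples, and this is untested advice):

Code:

# genisoimage is cdrkit's mkisofs:
genisoimage -allow-limited-size -udf -o image.iso /path/to/large/files
# burn the image:
growisofs -dvd-compat -Z /dev/dvd=image.iso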

View 1 Replies View Related

Software :: Cp To Have A Progress Bar For Large Files?

Mar 10, 2010

I always wanted cp to have a progress bar for large files. I came across this: [URL]... I just wonder: how could you install it as an Arch package? Is it possible?
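
Until then, rsync is a common stand-in for cp that shows progress:

Code:

rsync -ah --progress source_file /destination/dir/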

View 9 Replies View Related

Software :: RPM Not Supporting Large Files?

Jan 5, 2010

I am currently using RPM version 4.4.2.3. It fails to build with files larger than 2 GB and file sets larger than 3 GB. I have files larger than 2 GB and sets larger than 3 GB. Is there a workaround, possibly a switch or option for RPM, that will ignore this limitation?

View 8 Replies View Related

Slackware :: Get Ftp To Download Large Files?

May 29, 2010

I am using the Xfce desktop and a terminal.

1) How do I check what the default FTP client used by Slackware 12.2 is when I type:
$ftp

2) How do I get ftp to download large files (> 2 GB) from the FTP server elektroni.phys.tut.fi, or any FTP server?
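
A hedged sketch: which locates the default client binary, and wget handles files over 2 GB and can resume (the remote path is an example):

Code:

# 1) See which ftp binary is run by default:
which ftp
# 2) -c resumes a partial download if the connection drops:
wget -c ftp://elektroni.phys.tut.fi/pub/somefile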

View 14 Replies View Related

General :: Loop Append Text To Text Files?

Jan 15, 2010

This may be an advanced question but I need to know how to do this. Here at work I am in charge of recruiting and we have about 1,000 resumes in already. All of the resumes are in .pdf format. I need to rename every .pdf in the following format: {firstnameLastname}.pdf

The only way I know how to do this is to convert all the .pdf files to text, extract the name out of the first few lines of text, import into Excel, and then use VBA to rename the files en masse. Here is my logic so far: ~/Desktop/a houses all the .pdf resumes. Open a terminal:

Code:

cd ~/Desktop/a
for f in *.pdf; do pdftotext -raw $f; done

That will convert all of the preceding resumes into text files. Now I would like to append the name of the text file to the last line of the text file. So, for example, for Resume1.txt, I want to append "Resume1.txt" to the last line within Resume1.txt: after I run the command and open Resume1.txt, I want to see "Resume1.txt" on the last line, at the end of the resume. How can I do this? I would like to use a loop and have the terminal append the filename to the body of each text file until all of them have been appended.
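
A minimal sketch of that loop, assuming the converted .txt files sit in ~/Desktop/a:

Code:

cd ~/Desktop/a
# append each text file's own name as its final line
for f in *.txt; do
    echo "$f" >> "$f"
done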

View 1 Replies View Related

General :: Search Text In All Text Files Of All The Sub-directories?

Apr 21, 2010

Currently, when I'm searching text in files of my PHP project, I use this line :

Code:

grep -r 'myTextToFind' *

But now, I would like to search only in ".lang" files. How can I do that?
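
grep's --include flag restricts a recursive search by file name:

Code:

grep -r --include='*.lang' 'myTextToFind' .
# or equivalently with find:
find . -name '*.lang' -exec grep -H 'myTextToFind' {} +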

View 7 Replies View Related

General :: Downloading Very Large Files Via SFTP

Apr 1, 2011

I need to download some very large files (circa 75 GB) from a remote server via SFTP. I've been using SFTP via the command line on my Linux netbook. Around halfway through, the transfer stops and says "stalled." Can anybody recommend a reliable way to download these files?
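
rsync over SSH is one reliable option, since rerunning the same command resumes a stalled transfer instead of starting over (names are placeholders):

Code:

# -P = --partial --progress: keep partial files and show progress
rsync -avP user@remote:/path/to/bigfile .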

View 1 Replies View Related

Ubuntu :: Rsync Really Slow On Large Files?

Mar 1, 2010

I have Ubuntu on both my laptop and desktop machines, both are connected to the same network. I back up the laptop to the desktop by running the following on the laptop:

rsync -avv --stats /home/alisdt alisdt@xxx.xxx.xxx.xxx:/home/alisdt/laptop_backup (with the IP address of the desktop instead of the many x, obviously). Whenever rsync hits a large file (greater than a few MB), the network use rapidly drops to ~60KB/s (that's kilobytes not bits). When I copy the same file to the same place using scp, I get > 500KB/s throughout the transfer. Things I've tried:

* mounting the desktop home dir on the laptop using SSHFS -- a simple file copy is fast, rsync is still slow
* ditto with NFS
* rsync --whole-file option, in case the delta-transfer algorithm was choking on large files
* rsync --inplace option
* HPN-SSH (http://www.psc.edu/networking/projects/hpn-ssh/) to enable dynamic window and unencrypted bulk transfer, just in case it was some SSH bottleneck

I think it's either an rsync application problem, or a network problem that is only affecting rsync. Any ideas, or other ideas of what I can try to debug? In case it's relevant, I'm using 9.04 on both machines. (A standing bug prevents me from upgrading the laptop, and I haven't bothered to upgrade the desktop.)
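
One diagnostic not yet on the list: run rsync in daemon mode on the desktop, which takes SSH out of the path entirely and isolates whether rsync itself is slow (module name and config are assumptions):

Code:

# On the desktop, a minimal /etc/rsyncd.conf module:
#   [backup]
#   path = /home/alisdt/laptop_backup
#   read only = false
rsync --daemon                 # start the daemon on the desktop
# On the laptop, transfer with no SSH in the path:
rsync -avv --stats /home/alisdt/ rsync://xxx.xxx.xxx.xxx/backup/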

View 3 Replies View Related

Ubuntu :: Moving Large Amounts Of Files

Mar 6, 2010

I am trying to move a large number of files (over 30k, 86 GB) to another HDD but I get an "Argument list too long" error. I tried rsync, cp, and mv, and still get the same error.
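
The error comes from the shell expanding the glob, not from the tools themselves; find batches the names and avoids the limit (paths are placeholders, and mv -t is GNU coreutils):

Code:

find /path/to/source -maxdepth 1 -type f -exec mv -t /path/to/destination {} +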

View 1 Replies View Related

Ubuntu :: Cups: Large Files Do Not Print?

Nov 29, 2010

I have a problem with CUPS on a Lucid/64 machine:

"Unable to write print data: Broken pipe"

The PDF file to print has a size of 4.7 MB. After sending the file to the printer, the size of the file is more than 18 MB.

We use a Xerox WorkCentre 7232, which is connected to CUPS via

socket://ip_adress:9100

The same configuration had been working fine for several years with Hardy.

CUPS refuses to print large files; when splitting up the print file, everything works fine.
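
One guess worth testing, given that the spooled job balloons to over 18 MB: cupsd.conf has a MaxRequestSize directive that caps job size, and setting it to 0 removes the cap (this is an assumption about the cause, not a confirmed fix):

Code:

# in /etc/cups/cupsd.conf:
#   MaxRequestSize 0
# then restart CUPS:
sudo /etc/init.d/cups restart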

View 4 Replies View Related

General :: Creating Random Large Files?

Aug 27, 2010

How can I randomly write / create a 1 GB file in bash to test disk / network I/O? I was told I could use the 'dd' command, but I don't know if there are better ways, or what the 'dd' command looks like.
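
Two common dd invocations (testfile is a placeholder):

Code:

# 1 GB of random data (slower; also exercises the CPU):
dd if=/dev/urandom of=testfile bs=1M count=1024
# 1 GB of zeros (fast; good for raw throughput tests):
dd if=/dev/zero of=testfile bs=1M count=1024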

View 7 Replies View Related

General :: Tar Not Working With Large Number Of Files?

Dec 6, 2010

In the middle of my script I am using the tar command to tar some 1000 images; the size of each image is 5 MB.

All the images are provided as arguments, as tar -cvf file.tar <all images as argument>,

but my tar file file.tar does not contain all the images.
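
The likely culprit is the shell's argument-length limit silently truncating the list; GNU tar can read the file list from a file instead (paths and extension are examples):

Code:

find /path/to/images -name '*.jpg' -print > filelist.txt
tar -cvf file.tar -T filelist.txt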

View 4 Replies View Related

Debian Configuration :: Transfer Large Files Between Windows ?

Jul 17, 2011

I would like to transfer my music library and movie collection between my desktop computer running Windows Vista and my laptop running Debian Squeeze. The laptop is connected via wireless, but it's possible to connect the two either directly with a CAT5e cable or through the router. I'm just wondering what the best way to do this would be.
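
One common route, assuming the Vista folders are shared: mount them over CIFS from Debian (address, share, and user names are examples):

Code:

sudo mkdir -p /mnt/music
sudo mount -t cifs //192.168.1.10/Music /mnt/music -o username=yourname
cp -a /mnt/music/. ~/Music/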

View 8 Replies View Related

Fedora Networking :: 12 Hangs On Transfer Of Large Files?

Dec 9, 2009

I have Fedora 12 (with all the latest patches, including the 2.6.31.6-162 kernel) installed on a new Supermicro SYS-5015A-H 1U Server [Intel Atom 330 (1.6GHz) CPU, Intel 945GC NB, Intel ICH7R SB, 2x Realtek RTL8111C-GR Gigabit Ethernet, Onboard GMA950 video]. This all works great until I try to transfer a large file over the network, then the computer hard locks, forcing a power-off reset.

Some info about my setup:

[root@Epsilon ~]# uname -a
Linux Epsilon 2.6.31.6-162.fc12.i686.PAE #1 SMP Fri Dec 4 00:43:59 EST 2009 i686 i686 i386 GNU/Linux
[root@Epsilon ~]# dmesg | grep r8169
r8169 Gigabit Ethernet driver 2.3LK-NAPI loaded

[code]....

I'm pretty sure this is an issue with the r8169 driver (what I'm seeing is somewhat reminiscent of the bug reported here). The computer will operate fine for days as a (low volume) web server, and is reasonably stable transferring small files, but as soon as I try to transfer a large file (say during a backup to a NAS or an NFS share), the computer will hard lock (no keyboard, mouse, etc.) at some point during the transfer of the file. It doesn't seem to matter how the file is transferred (sftp, rsync to NFS share, etc.).
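
Not a fix, but a mitigation often suggested for r8169 stability problems is disabling hardware offloads before a large transfer (eth0 is an assumption, and this may only mask the underlying bug):

Code:

ethtool -K eth0 tso off gso off gro off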

View 10 Replies View Related

OpenSUSE :: Burn Large Files With K3b Mkisofs Crashed

Nov 2, 2010

I have a problem with burning some disks. I believe the problem occurs when the file sizes are large. Using SUSE 11.3 x64 with updates done; k3b version 2.0.1-1pm 2.7. Burned a normal DVD yesterday with 3.5 GB + 1.5 GB files. Tried burning to a BD-R disk with an 11.5 GB file. I have burned Blu-ray disks before with my machine but used Nero; I would prefer to use k3b, which should do this. I get an "mkisofs crashed" error and the following error log:

[code]...

View 3 Replies View Related

General :: Copying Large Number Of Files From One Directory To Another

Feb 10, 2010

I've a directory containing around 2.8 lakh (280,000) files. I want to move them to another directory. If I use cp or mv then I get an "argument list too long" error. If I write a script like

for file in $(ls); do
    # destination path is a placeholder
    cp "$file" /path/to/destination/
done

then because of the ls command, its performance degrades. How can I do this?
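
rsync avoids building one giant argument list, and --remove-source-files turns the copy into a move (it leaves the now-empty source directories behind):

Code:

rsync -a --remove-source-files /source/dir/ /destination/dir/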

View 7 Replies View Related






