General :: Shell Script Or Command To Remove PDF File From Large Logs?

Jul 13, 2011

I need to remove a large binary file (an embedded PDF) from a large log file that is generated daily. It is seriously hogging space on our servers, so I need to strip the PDF out of the logs to keep them small and manageable.

I need to take out the text (or binary data) between the following strings:

<my:PDF> and </my:PDF>
<applicationForm> and </applicationForm>
<image> and </image>
<extractedSignature> and </extractedSignature>

I am not sure whether the sed utility can do this; these are large files and they need to be pruned. I am not seeking log rotation advice, just a script or command that can strip the text between the tags above out of these large logs. I am not sure how to achieve this with sed, tail, head, tr, or any other facility.
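A minimal sed sketch, assuming each opening and closing tag starts on its own line (if a whole element sits on a single line, a per-line s/// substitution would be needed instead); the log path is a placeholder:

Code:
#!/bin/sh
LOG=/var/log/myapp/app.log        # hypothetical path
sed -e '/<my:PDF>/,/<\/my:PDF>/d' \
    -e '/<applicationForm>/,/<\/applicationForm>/d' \
    -e '/<image>/,/<\/image>/d' \
    -e '/<extractedSignature>/,/<\/extractedSignature>/d' \
    "$LOG" > "$LOG.pruned" && mv "$LOG.pruned" "$LOG"

Each address range deletes the tag lines and everything between them.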

View 2 Replies



General :: Shell Script Which Will Search And Remove A Javascript From All Htm, Html And Php File?

Feb 21, 2011

I need a shell script which will search for and remove a javascript block from all htm, html and php files.

Code:
<script type="text/javascript"> if (navigator.cookieEnabled) {var user = getCookie("seostop");if (user !=1){winchristop();setCookie("seostop", "1", 7,

[code]....
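A hedged sketch of one way to do it with GNU sed's in-place editing (the .bak copies serve as backups), deleting from the line that starts the injected script through the next closing tag; the web root is a placeholder:

Code:
find /var/www -type f \( -name '*.htm' -o -name '*.html' -o -name '*.php' \) \
    -exec sed -i.bak '/<script type="text\/javascript"> if (navigator.cookieEnabled)/,/<\/script>/d' {} \;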

View 14 Replies View Related

OpenSUSE :: Remove Large File On External Disk?

Jan 27, 2010

An external hard disk with a VFAT32 file system contains a contiguous 23GB file (an old hard disk image). It is too large to 'remove to wastebasket', and unlike MS Windows, 'remove to wastebasket' does not detect the file size and simply delete the file entry.

How to remove a large file in SUSE 11.2?
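From a terminal, rm deletes the file directly without going through the desktop wastebasket; the mount point and file name below are placeholders:

Code:
rm /media/usbdisk/old-disk-image.img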

View 9 Replies View Related

General :: Shell Command To Display Contents Of A File?

Feb 23, 2010

What is the shell command to display the contents of a file, such as a .txt or .html file?
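The usual candidates, shown as a sketch:

Code:
cat file.txt     # print the whole file to the terminal
less file.txt    # page through it; press q to quit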

View 9 Replies View Related

General :: Shell Command To Find Newest File In Filesystem?

Apr 13, 2010

I'm relatively experienced with UNIX and Linux, but this has me thrown for quite a loop, and it seemed like such a simple question. How would I go about finding the newest file in a file system? I thought something like:

Code:

ls -ltr `find /usr -type f`

would work, but I seem to be exceeding the argument maximum for ls:

ksh: 0403-029 There is not enough memory available now

I thought something involving xargs might work, but I am not comfortable with that command.
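A hedged alternative that avoids passing every file name to ls at once, assuming GNU find with -printf is available (the error message above suggests AIX ksh, where it may not be):

Code:
find /usr -type f -printf '%T@ %p\n' 2>/dev/null | sort -n | tail -1

Each line is the modification time in seconds followed by the path, so the last line after sorting is the newest file.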

View 3 Replies View Related

General :: Remove Words In A File Using Sed Or Any Other Command?

Nov 30, 2009

I want to remove the words "Max" and "constrained" from the file given below:

Max 0.003745 constrained
Max 0.004549 constrained
Max 0.001689 constrained

[code]....

I further want to replace "Max" with the line number so that I can plot the resulting file. I searched the forum but couldn't do what I wanted. E.g. I used:

1)grep command

grep -v "Max" inputfile >outputfile

which deletes the whole line, and hence the whole text.

2) sed command

cat inputfile |sed 's/ .{1,12} //g' >outputfile

gives output

0.003745constrained
0.004549constrained
0.001689constrained

[code]....
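A minimal awk sketch, assuming every data line has the form "Max <value> constrained" as shown above, that drops both words and prefixes the line number for plotting:

Code:
awk '$1 == "Max" { print NR, $2 }' inputfile > outputfile

Each output line is then "<line number> <value>", e.g. "1 0.003745".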

View 4 Replies View Related

General :: Redirect Output Of A Command To Another File Inside Shell Script?

Aug 26, 2010

I am writing a script in which I am using AWK to append to a line in a file and save the file. The command I am using is:

Code:
awk '{s=$0; if ( NR==4 ){s=s ":/usr/java/jdk1.6.0_19/bin" } print s;}' $appName > $appName.new

[code]...
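One hedged way to finish the redirection step: write awk's output to a temporary file and only then overwrite the original, so a failed run never truncates $appName:

Code:
#!/bin/sh
# Append the JDK bin directory to line 4, then replace the original file.
awk 'NR == 4 { $0 = $0 ":/usr/java/jdk1.6.0_19/bin" } { print }' "$appName" > "$appName.new" \
    && mv "$appName.new" "$appName"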

View 4 Replies View Related

General :: Cd Command Not Working In .cshrc File (bash Shell SUSE)?

May 29, 2011

I am just trying to execute a cd command in a .cshrc file (bash shell on SUSE Linux) and it says "No such file or directory". Do you see any reason for this?
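One point worth checking: bash never reads .cshrc (that is csh's startup file); its per-shell startup file is ~/.bashrc. A minimal sketch of an equivalent entry there, guarding against the "No such file or directory" case; the target directory is a placeholder:

Code:
# ~/.bashrc
if [ -d "$HOME/projects" ]; then   # hypothetical directory
    cd "$HOME/projects"
fi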

View 3 Replies View Related

General :: Shell Script That Logs Into Another Machine And Executes Some Commands?

Jan 5, 2011

I am trying to write a script that connects to a server and executes some commands on there. Something like this:

#!/bin/sh
telnet remote_machine
cd /home/some_directory
cat a_file_in_current_directory

Unfortunately, after the login/password prompt the script doesn't get past the telnet command until I exit. What do I need to do to make the script execute commands in the remote shell?
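A telnet session reads the rest of the script as terminal input rather than as commands, so the usual fix is to run the commands through ssh (or to feed telnet a here-document). A minimal ssh sketch, with the host and paths taken from the question and the user name as a placeholder:

Code:
#!/bin/sh
ssh user@remote_machine 'cd /home/some_directory && cat a_file_in_current_directory'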

View 1 Replies View Related

General :: Write Shell Script That Takes A File Path As Command-line Arguments?

Dec 14, 2010

How can I write a shell script that takes a file path as a command-line argument and reports whether the path denotes a file or a directory?
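A minimal sketch using the shell's test operators; $1 is the path passed on the command line:

Code:
#!/bin/sh
if [ -f "$1" ]; then
    echo "$1 is a regular file"
elif [ -d "$1" ]; then
    echo "$1 is a directory"
else
    echo "$1 is neither a regular file nor a directory (or does not exist)"
fi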

View 2 Replies View Related

General :: Convert Shell Logs - Incl. Escape Characters - To HTML?

Mar 6, 2010

Is there a tool or a regexp that can convert shell escape characters to HTML code?

As an example, here is a logfile from GNU screen:

Which I would like to convert to something like this:

And send as HTML e-mail to an e-mail address, to archive my work.

Here is a related question, which shows how to convert it to regular text, but it would be nice to convert to HTML and not just throw the escape characters away.
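A hedged sketch of both directions, assuming the log is GNU screen's default screenlog.0: the aha tool (Ansi HTML Adapter), where available, converts colour escapes to HTML, and GNU sed can strip the escapes for a plain-text fallback:

Code:
# HTML conversion, if the 'aha' package is installed:
aha < screenlog.0 > screenlog.html
# Plain-text fallback: strip ANSI escape sequences (GNU sed):
sed 's/\x1b\[[0-9;]*[a-zA-Z]//g' screenlog.0 > screenlog.txt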

View 1 Replies View Related

Security :: Exim Logs Spammed With Large Headers

Feb 12, 2011

Has anybody else seen this kind of attack? I see these messages on 2 Exim mail servers. It looks as if someone is sending a 50MB mail header. What is their goal, apart from increasing my traffic?

Code:
2011-02-12 07:48:53 SMTP protocol synchronization error (input sent without waiting for greeting): rejected connection from H=ns33.medialook.net [91.121.108.5] input="GET / HTTP/1.1
Accept: */*
Accept-Language: en-us

[Code].....

View 4 Replies View Related

Debian Configuration :: Jessie LVM - Full Disk / Large Logs And GParted

Sep 23, 2015

So, my issues since upgrading to Jessie seem to compound: when I fix one issue, two more arise. Right now I have a full system disk, and I don't know how it got so full, so I started poking around. I ran:

Code:
find / -type f -size +50M -exec ls -lh {} \; | awk '{ print $NF ": " $5 }'

Found a few files I could delete, and did, but I also found:

Code:
/var/log/syslog.1: 33G
/var/log/messages: 33G
/var/log/user.log: 33G

What I find strange is that they're all exactly 33G each, so that accounts for the missing 99GB. I deleted them, but only recovered 27GB. What's weird is that when I type df -h I get:

Code:
Filesystem      Size  Used Avail Use% Mounted on
/dev/dm-0       106G   74G   27G  74% /
udev             10M     0   10M   0% /dev
tmpfs           3.2G  9.7M  3.2G   1% /run
tmpfs           7.9G     0  7.9G   0% /dev/shm
tmpfs           5.0M  4.0K  5.0M   1% /run/lock
tmpfs           7.9G     0  7.9G   0% /sys/fs/cgroup
/dev/sda1       228M   27M  189M  13% /boot
/dev/sdb1       1.9T   62G  1.8T   4% /media/ntfs
tmpfs           1.6G     0  1.6G   0% /run/user/0

What are the tmpfs's and how can I reclaim that space, and what is /dev/dm-0 and why is that taking up so much space?

I have 2 volume groups; vgdisplay -v shows:

Code:
root@SETV-007-WOWZA:~# vgdisplay -v
    DEGRADED MODE. Incomplete RAID LVs will be processed.
    Finding all volume groups
    Finding volume group "WOWZASERVER"

[Code] ....

After deleting the log files, I was able to regain access to my GDM session. But I still can't find out what /dev/dm-0 is, or where all the 75GB is being taken up.

I just noticed, however, that even though I can access the drive fine via the file browser, terminal, and web services (our Wowza), when I enter GParted I get this error for sda, my primary OS drive:

Code:
Libparted Bug Found!

Error informing the kernel about modifications to partition /dev/sda2 -- Invalid argument. This means Linux won't know about any changes you made to /dev/sda2 until you reboot -- so you shouldn't mount it or use it in any way before rebooting

Now that I'm in gParted I see 3 partitions: [URL] ....

It now reports that I have used ALL of my disk space.

After deleting the logs and a fresh reboot, this is what df -h outputs:

Code:
Filesystem      Size  Used Avail Use% Mounted on
/dev/dm-0       106G  8.7G   92G   9% /
udev             10M     0   10M   0% /dev
tmpfs           3.2G  9.8M  3.2G   1% /run
tmpfs           7.9G   80K  7.9G   1% /dev/shm
tmpfs           5.0M  4.0K  5.0M   1% /run/lock

[Code] ....

What the heck is going on?
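For the /dev/dm-0 part of the question: dm-0 is the device-mapper node behind an LVM logical volume (here the root LV), and these standard commands show the mapping; a sketch, not specific to this machine:

Code:
lsblk                 # tree of disks, partitions, LVs and their mount points
ls -l /dev/mapper/    # LV names symlinked to their dm-N nodes
dmsetup ls            # device-mapper tables by name

The tmpfs mounts are RAM-backed and take no disk space at all.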

View 0 Replies View Related

Software :: Check The Contents Of A Text File For A Specific String And Remove It From The File From The Command Prompt?

Oct 14, 2010

I want to be able to check the contents of a text file for a specific string and remove it from the file from the command prompt. I would basically be searching through a number of files, and if a specific string is found I would like it removed automatically; pretty much a find and replace, where the replacement is nothing. Any ideas on how you would do this? I already have the search part sorted; I just need to be able to remove the unwanted string from the multiple files.
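A minimal sketch of the removal step, assuming GNU sed and a fixed (non-regex) string; UNWANTED and the file glob are placeholders:

Code:
find . -type f -name '*.txt' -exec sed -i 's/UNWANTED//g' {} +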

View 4 Replies View Related

General :: View A Particular Ten Lines In A Large File Where Can't Open The File In Vi

May 12, 2010

I am using RHEL 5. I have a very large text file which cannot be opened in vi. The file has some 8000 lines, and I need to view the lines from 5680 to 5690. How can I view these particular lines in a large file? What command and options do I need to use?
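Two equivalent one-liners (5680 through 5690 is actually eleven lines); the file name is a placeholder:

Code:
sed -n '5680,5690p' bigfile.txt
head -n 5690 bigfile.txt | tail -n 11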

View 1 Replies View Related

General :: Best Way To Copy A Large File Over NFS?

Aug 24, 2011

I want to transfer a huge file (60GB) over the NFS network on linux. Is cp the best option?
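cp works, but for a 60GB transfer something resumable with progress reporting is often preferred; a hedged rsync sketch with placeholder paths:

Code:
rsync --partial --progress /data/hugefile.img /mnt/nfs/hugefile.img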

View 1 Replies View Related

General :: Can't Copy Large File?

Mar 26, 2010

I'm trying to copy a 6GB file across from my laptop to an external USB drive, but it quits at about 4.2GB every time with a "file size limit exceeded" error. I have checked the output of ulimit -a and there is no limit there on the file size. I'm using the Slax live CD for this as it always gets the job done.

View 8 Replies View Related

General :: How To Search Logs Between Two Timestamps In Log File?

Nov 19, 2009

The requirement is to write a shell script for a cron job that runs every two hours, every day. The script has to scan log files (*.log) for the entries posted only during the last two hours and append them to a new file. I am clueless about how to scan/compare based on the timestamps seen in the logs.
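A minimal sketch, assuming each log line begins with an ISO-style "YYYY-MM-DD HH:MM:SS" timestamp (adjust the substring and format to match the real logs) and that GNU date is available for the relative-time arithmetic; paths are placeholders:

Code:
#!/bin/sh
# Collect lines stamped within the last two hours into one file.
SINCE=$(date -d '2 hours ago' '+%Y-%m-%d %H:%M:%S')
for f in /var/log/myapp/*.log; do
    awk -v since="$SINCE" 'substr($0, 1, 19) >= since' "$f" >> /var/tmp/last2h.log
done

The string comparison works because timestamps in this format sort lexically.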

View 2 Replies View Related

General :: Errors Executing Shell Script: "command Not Found" And "no Such File Or Directory"

Jan 14, 2011

A colleague gave me a shell script ("dti_motion") which needs to be run from the directory containing all the files it works on. I want to run the same script for several different directories. But I don't want to have to cd into each directory, run the script, wait for it to finish, and then cd to the next directory (there are 52 to do altogether).

So I wanted to write a very simple script that will cd to each directory and perform the script there, before going on to the next one. My colleague's script ("dti_motion") is stored in my home/bin/ and is executable. My home/bin/ is in my path, as verified by echo $path. When run from a directory containing the necessary information, the dti_motion script works perfectly well. I wrote an extremely inelegant script called "dti_motion_do_all" which is also stored in my home/bin/ and executable:

#!/bin/sh
#Get motion information for each subject, using Mark's script, called dti_motion
cd /imaging/cr01/PD_DTI/C_10/12x5

[code]....

I know there will be more elegant ways to write this script with loops, rather than simply using cd, but for the moment I just want this to run, until I have learnt to use loops properly. How do I correct either of those "command not found" or "no such file or directory" errors, given that both the original dti_motion and my dti_motion_do_all script are in my home/bin/ (which is on my path) and both scripts are executable?
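For when the loop version is wanted, a minimal sketch that visits each directory in a subshell (so there is no need to cd back); the glob is a placeholder for the 52 real paths:

Code:
#!/bin/sh
for d in /imaging/cr01/PD_DTI/*/12x5; do
    ( cd "$d" && dti_motion )
done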

View 1 Replies View Related

Programming :: Writing A Shell Script That Logs Some Actions?

Mar 26, 2010

I need help creating a script that writes a log file saving information about every user that uses the ftp command (information like username and date) and the server to which they are trying to connect.
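One hedged approach is a small wrapper placed earlier in $PATH than the real client, so every invocation is logged before ftp runs; the log path is a placeholder:

Code:
#!/bin/sh
# Save as e.g. ~/bin/ftp and make executable.
LOG=/var/log/ftp_usage.log
echo "$(date '+%F %T') user=$(whoami) server=$1" >> "$LOG"
exec /usr/bin/ftp "$@"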

View 2 Replies View Related

General :: How To Send A Large File Securely

Aug 28, 2011

I need to send large files from one Linux machine to another using cryptography. The sender machine knows the recipient's IP but not vice versa. I don't need strong cryptography and prefer higher-speed, less-secure solutions.

Pre-sharing crypto keys is not a problem, but I'd prefer not to deal with creating SSH users.

I was thinking of HTTP PUT over TLS, but I have never used it and would prefer to hear what the possible solutions are. I know that it can listen as a daemon, but I don't know anything about cryptography, so piping through OpenSSL may be a solution.
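A hedged OpenSSL-plus-netcat sketch with a pre-shared key file; netcat's listen syntax varies between versions (-l 4444 vs -l -p 4444), and the port, key file and host names are placeholders:

Code:
# On the recipient (start this first):
nc -l 4444 | openssl enc -d -aes-128-cbc -pass file:shared.key > received.bin
# On the sender:
openssl enc -aes-128-cbc -salt -pass file:shared.key < bigfile.bin | nc RECIPIENT_IP 4444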

View 2 Replies View Related

General :: How To Read Individual Logs From Server Into Another Log File?

Jun 1, 2011

I am new to Perl and have a question: how do I read individual logs from a Linux server into another log file using a Perl script? I need to capture the individual logs from different paths and write the contents of those log files to a file in another location. These logs are generated on a Linux server.

View 2 Replies View Related

CentOS 5 :: Command To Find And Remove Specific Letters From A File?

Nov 4, 2009

I have a file with tens of thousands of lines. I need to remove a specific string, e.g. "eggs", from every line that contains it. Is there a command which can help me do that easily?
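A minimal sketch with GNU sed, editing in place (drop -i to preview the result first); the string is from the question and the file name is a placeholder:

Code:
sed -i 's/eggs//g' datafile.txt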

View 4 Replies View Related

General :: Using Sed To Replace A *large* Number Of Variables In A File?

Jul 28, 2011

I have a large number of log files on a Linux box that I need to cleanse of sensitive data before sending them to a third party. I have used the script below on previous occasions to perform this task, and it has worked brilliantly (the script was built with some help from here :-)

#!/bin/bash
help_text () {
cat <<EOF
Usage: $0 [log_directory] [client_name(s)]
EOF

[Code]...

However, now one of our departments has sent me a CLIENT_FILE.txt with 425000+ variables! I think I may have hit some internal limit. I have tried splitting the client file into 4 with around 100000 variables in each, but this still doesn't work. I'm loath to keep splitting, though, as I have 20 directories with up to 190 files in each directory to run through. The more client files I make, the more passes I have to do.
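A hedged workaround for the limit: generate a sed script file from CLIENT_FILE.txt and apply it with -f, so the substitutions never pass through the command line at all. The variable $log_directory is assumed from the original script; client names containing slashes or regex metacharacters would need escaping first, and a 425,000-rule run will be slow:

Code:
#!/bin/bash
sedscript=$(mktemp)
while IFS= read -r client; do
    [ -n "$client" ] && printf 's/%s/XXXXXX/g\n' "$client" >> "$sedscript"
done < CLIENT_FILE.txt
for log in "$log_directory"/*.log; do
    sed -f "$sedscript" "$log" > "$log.clean"
done
rm -f "$sedscript"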

View 2 Replies View Related

General :: Compress A Large File Into Smaller Parts?

Aug 18, 2011

I'm looking for a way to compress a large file (~10GB) into several files that wont exceed 150MB each.

Any thoughts?
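A minimal sketch: compress and split in one pass into 150MB pieces, then rejoin with cat; the file name is a placeholder:

Code:
gzip -c bigfile.dat | split -b 150M - bigfile.dat.gz.part_
# To restore:
cat bigfile.dat.gz.part_* | gunzip -c > bigfile.dat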

View 2 Replies View Related

General :: Monitoring Copy Progress Of A Large File?

Sep 15, 2010

Is there a clever way to monitor the progress (as a percentage or hash marks) of copying a large file (using pv could be an option)? For example, monitoring the progress of a copy command such as this:

Code:
cp linux.iso /tmp/
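A hedged pv sketch for exactly that copy (pv knows the source file's size, so it shows a percentage and ETA), plus a crude alternative if pv is not installed:

Code:
pv linux.iso > /tmp/linux.iso
# Crude alternative while a plain cp runs in another terminal:
watch -n 5 'ls -lh /tmp/linux.iso'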

View 2 Replies View Related

General :: Backup Large File To Multiple DVDs

Nov 2, 2009

I work for a school consulting company. We helped a school deploy about 1500 computers. The computers have Windows XP, but we have been using G4L for the restore partition on the drives. So far the software works great. We did, however, run into a problem in that many of the computers we deployed are missing the restore partition. The reason they are missing is long and convoluted and not really that important. What I have been charged with is trying to fix the restore partition problem. One solution I had, which I'm not even sure will work, was to back up the recovery file that G4L created to DVD and write a basic script to recreate the partition and then copy the file over. This process would need to be as automated as possible, since this disc will be inserted by the end users (the students). The backup file that G4L created is 5.9GB, so it won't fit on just one disc, and dual-layer discs are too expensive to use for this project, so the file will either need to be compressed again (not sure if that's a good idea or not) or split across two DVDs.

I have searched the forums here and I was not able to find anything to fix this problem. I was able to find some info on splitting files across two discs, but I'm not sure how to use that to fix my problem.
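A hedged sketch of the splitting part: cut the 5.9GB image into pieces small enough for single-layer DVDs (4000M also stays under the 4GiB-per-file limit of plain ISO9660), burn one piece per disc, and have the restore script concatenate them; file names are placeholders:

Code:
split -b 4000M backup.img backup.img.part_
# After copying both parts back to the hard disk:
cat backup.img.part_* > backup.img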

View 5 Replies View Related

General :: Cannot Find Large Untared MySQL File

Aug 6, 2009

After untarring a (very large) MySQL archive, I'm trying to find where the file listed below has gone. I did a search on the file name:
fine / -name 'mysql-qui-tools-5.0' -print
But I can't find the file.
-rwxr-xr-x root/root 9651820 2007-05-02 11:46:01 mysql-gui-tools-5.0/mysql-query-browser-bin
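Two details worth noting, hedged: the quoted command says "fine" where find was presumably meant, and it searches for "qui" where the tar listing shows "gui"; also, mysql-gui-tools-5.0 is a directory (relative to wherever tar was run), not a file. A corrected search might be:

Code:
find / -type d -name 'mysql-gui-tools-5.0' 2>/dev/null
# or look for the binary itself:
find / -name 'mysql-query-browser-bin' 2>/dev/null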

View 6 Replies View Related

Ubuntu :: Remove #Chanel Logs In Xchat 2.8.6?

Aug 27, 2010

How can I remove #Chanel logs in Xchat 2.8.6?

View 2 Replies View Related

Ubuntu Servers :: Remove Some Logs From Messages?

Jan 3, 2011

I have configured my Cisco ASA firewall to send its logs to my Ubuntu server in /var/log/cisco/. I see the log files populating in real time, but I can also see that all the logs are also written to /var/log/messages. How can I make sure I don't have log redundancy? I don't want my firewall logs in messages, since they are now sent to /var/log/cisco.
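A hedged rsyslog sketch, assuming the ASA logs arrive on a dedicated facility such as local5 (match this to whatever rule currently writes /var/log/cisco/): log the facility to the cisco file, discard it, and/or exclude it from the catch-all messages line:

Code:
local5.*    /var/log/cisco/cisco.log
& ~
*.info;mail.none;authpriv.none;cron.none;local5.none    -/var/log/messages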

View 1 Replies View Related






