Programming :: Output IP Into A File?
May 6, 2010. I have a file (call it A) that contains:
hostname 192.168.23.65
hostname 10.18.13.253
hostname 10.18.16.253
[code]...
This is my script:
Code:
Problem: the output file does not exclude the values passed to grep -av.
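A minimal sketch of one way the exclusion could be done, since the original script is not shown above (the input and output file names here are assumptions):
Code:
# keep only the lines of input.txt that do not match any line of file A
# -F treats each line of A as a literal string rather than a regex
grep -v -F -f A input.txt > output.txt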
My employer issues PDF files with everyone's work schedules. I copy the content and save it as plain text in a file called unformatted (I hope to automate this step someday). I'm working on a sed script that reduces unformatted to only what I want to see and saves the result in a file I've named formatted. After that I have to manually copy formatted and save it with that day's date as a filename, e.g. 2011-02-25 or whatever day is scheduled in the PDF, for use on a mobile device (Nokia N900). I noticed that the date occurs on certain lines in the file, so I added a line like:
sed -n 's/^Date: \(201[1-9]\)\/\([0-1][0-9]\)\/\([0-3][0-9]\).*/\1-\2-\3/p' < unformatted > theDate
That creates a file theDate containing the date that I wish to use as the filename for this particular instance. So I would like to skip the file formatted altogether and have the sed script write to a new file using the content of theDate as the filename, but how do I make that happen? And of course it would be more elegant if I could skip the intermediate theDate file as well.
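One possible way to do both at once is to capture the date into a shell variable and use it directly as the output file name. This is only a sketch: it assumes the date line looks like "Date: 2011/02/25" and that the existing formatting rules live in a sed script called format.sed (a hypothetical name).
Code:
#!/bin/bash
# pull the first matching date out of the raw export
day=$(sed -n 's/^Date: \(201[1-9]\)\/\([0-1][0-9]\)\/\([0-3][0-9]\).*/\1-\2-\3/p' unformatted | head -n 1)
# run the normal formatting pass and write straight to a file named after that date
sed -f format.sed unformatted > "$day"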
I am again struggling to make a script work, but hey, it is fun, I am learning new things. I discovered the set -x option which was, for me, like the second coming. Still, what I am not able to do is redirect ALL output to a (log) file, including what is produced by the -x setting. Let's assume a very simple script:
Code:
#!/bin/bash
set -x
source="/home/atelier/Bureau/"
ls -la $source
I am running it as . test.sh >> /var/log/test.rmcb.log. The result of ls goes indeed into the log file, but the rest still shows on the console where I am running the script:
Code:
++ source=/home/atelier/Bureau/
++ ls --color=auto -la /home/atelier/Bureau/
Is there a way to redirect EVERYTHING to the log file?
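The trace produced by set -x goes to stderr, not stdout, so redirecting stdout alone misses it. Two ways to capture everything (sketches using the same paths as above):
Code:
# redirect both streams when invoking the script
. test.sh >> /var/log/test.rmcb.log 2>&1

# or redirect inside the script itself, so every later command
# (including the ++ trace lines) lands in the log
#!/bin/bash
exec >> /var/log/test.rmcb.log 2>&1
set -x
source="/home/atelier/Bureau/"
ls -la "$source"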
I want to pipe the output of ls in a folder to a file (let's call it test.txt), but when I do ls > test.txt, test.txt itself also shows up in the listing (logical, since the shell creates the file before ls runs).
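Two possible workarounds (sketches; the first assumes GNU ls):
Code:
# exclude the output file from the listing
ls --ignore=test.txt > test.txt

# or write the listing somewhere else first, then move it into place
ls > /tmp/listing && mv /tmp/listing test.txt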
I'm trying to make several files: each named after the display and containing resolutions. But for some reason I get null when trying to read lines.
I have written a long piece of code with a main() that calls openFile( &fout, filename ). filename contains the text file name in the form "data.txt". I want to read the data from the file and output it into fout for later use. The data in that file looks like a vector of integers. I have the following code:
#include <fstream>
using namespace std;

int openFile( ofstream * fout, const char * filename)
{
    ifstream iFile(filename);   // open the input file for reading
[code]...
I am learning Linux daemon programming and wrote a simple daemon program. The issue is that no data is written to the log file (/var/tmp/simpledeamon.log), though the file is successfully created (its size is zero in the ls -l output). Could someone kindly point out the error in my program? The code is based on Devin Watson's article. Here is the code:
// simplydaemon.cpp
// A simple Linux daemon.
#include <sys/types.h>
[code]...
I'm writing the output of the top command to a file. However, since top does not provide the time, I would like to append the output of the date command and then write all of this to a file,
so something like: top -d 1 -b; echo 'date' >> file
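One possible way to get timestamps into the same file is to run top in batch mode one iteration at a time and prepend the date to each batch. A sketch (the log path is an assumption):
Code:
while true; do
    { date; top -b -n 1; } >> /var/tmp/top.log   # timestamp followed by one top snapshot
    sleep 1
done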
I have got a script with an outer and inner loop. The inner loop issues loads of echoes which need to be redirected to a log file determined by the outer loop. The obvious solution is to redirect every echo to >$LOG and set LOG in the outer loop.
Code:
for f in $FILES ; do
    LOG=<logfile>
    for l in $LINES ; do
[code]....
Is it possible to map stdout to $LOG in the outer loop without having to redirect every individual command's output?
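Yes: the whole inner loop (or any { ...; } group) can be redirected once, so every echo inside it inherits the redirection. A minimal sketch (FILES, LINES and the log name are placeholders):
Code:
for f in $FILES ; do
    LOG="/var/log/myjob-$f.log"
    for l in $LINES ; do
        echo "processing $l"
        echo "another message about $l"
    done > "$LOG"        # one redirection covers every command in the inner loop
done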
I did a SELECT on my DB, and now, if the query returns a result, I need to save the column information in a file. How do I do this in shell?
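A possible sketch, assuming a MySQL database; the credentials, table and file names are placeholders:
Code:
#!/bin/bash
# -N suppresses the column headers, -e runs the query non-interactively
result=$(mysql -u dbuser -p'dbpass' mydb -N -e "SELECT col1, col2 FROM mytable WHERE id = 1")
if [ -n "$result" ]; then        # only write the file if the query returned rows
    echo "$result" > /tmp/columns.txt
fi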
Consider this PHP script, or just skip to the output:
Code:
-bash-2.05b# cat myDate.php
#!/usr/bin/php -q
<?php
[code]....
I have a requirement like this: cut the characters from each line of a file at the following positions: 21-24, 25-34, 111-120. These fields now need to be placed in a tab-delimited output file. Currently this is how I am achieving it:
#!/bin/sh
cat newsmaple.txt | cut -c 21-24 > out1.txt
cat newsmaple.txt | cut -c 25-34 >out2.txt
[code]....
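A possible single-pass alternative (just a sketch; the output file name is a placeholder): awk can pull all three ranges at once and write them tab-separated.
Code:
# substr($0, start, length): 21-24 is 4 chars, 25-34 and 111-120 are 10 chars each
awk '{ printf "%s\t%s\t%s\n", substr($0,21,4), substr($0,25,10), substr($0,111,10) }' newsmaple.txt > output.txt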
I have a problem when using awk:
e.g: awk '{processing text}' File1 > File2
But while processing File1, I want to print some messages to the screen (not into File2). How can I do that?
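One common approach (assuming GNU awk, or any awk that understands the /dev/stderr special file) is to send the progress messages to stderr, so they appear on the screen while the normal output still goes to File2:
Code:
awk '{ print "processing line " NR > "/dev/stderr"; print $0 }' File1 > File2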
The perl script I wrote works fine if I print the result to screen
x0_amber.pl 1 1000 0 5
But whenever I want to output to a file with
x0_amber.pl 1 1000 0 5 > x0_out
it never really prints out to the file.
It worked earlier, but I was playing with my PATH lately; I don't know if it's related to that.
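A quick check worth trying (just a sketch): if the script writes its result to stderr rather than stdout, a plain > redirect will miss it, so capturing both streams shows which stream the output is on.
Code:
x0_amber.pl 1 1000 0 5 > x0_out 2>&1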
Within PyGTK I'm using gobject.spawn_async to launch a bash script. I would like the output of that bash script to be displayed within my application. I have a textview set up to receive the text ...
Here are the commands that launch the program:
Code:
def run_command(command):
    global keep_pulsing
    keep_pulsing = True
    (cpid, cstdin, cstdout, cstderr) = gobject.spawn_async(command, flags=gobject.SPAWN_DO_NOT_REAP_CHILD|gobject.SPAWN_STDERR_TO_DEV_NULL, standard_code....
Here are the two callback functions. But like I say ... I have no idea how to get that data from the 'cstdout' file descriptor into a textbuffer.
Code:
### THE FOLLOWING ARE GLOBALS:
textview = wTree.get_widget('textview1')
textbuffer = textview.get_buffer()

def update_textview_callback(fd, condition):
    global keep_pulsing
    if keep_pulsing:
        progressbar.pulse()
code....
It's my first post here, so please be patient. I am trying to use a regex in a Perl script to detect allowed words from a file and then print the output to the screen.
As an example, I have a text file with orders and returns:
Item2-SKU-2-11.08.2010-online
Item3-SKU-3-11.09.2010-return
Item4-SKU-4-11.09.2010-store
My question: is it possible to make sure that I am only outputting to the screen orders that match a few conditions, like the item and the order form (e.g. online)? And is it possible to have multiple matches (Item2 only displayed if ordered online, etc.)?
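For the shell side of it, a sketch of that kind of filtering against the hyphen-separated format shown above (the file name is an assumption): only Item2 lines that were ordered online are printed.
Code:
grep -E '^Item2-.*-online$' orders.txt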
We make a daily MySQL DB backup on Linux Red Hat Enterprise. We are using a bash shell script (run from the crontab) to execute it automatically every day. We added a line to this script that, once the backup has completed, finds old backup files (stored on the hard disk after each backup) older than x days and removes them. We use the find command (searching by file type) with the mtime option in combination with the rm command. Everything runs fine, but we also want to add some new code to the same line: if the find command cannot find anything or fails, for example if it cannot delete a file, send the error message (standard error output) to an error file (like error000001 and increasing) and mail that errorxxxx file to an email address, for example admin@companyname.com. What would be the code for this, added to our find command in the same bash shell script?
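A possible sketch of that addition; the backup path, retention period and error-file location are assumptions, and the "increasing" error file name is approximated with mktemp rather than a counter:
Code:
ERRFILE=$(mktemp /var/log/backup/error.XXXXXX)
find /var/backups/mysql -type f -name '*.sql.gz' -mtime +7 -exec rm -f {} \; 2> "$ERRFILE"
if [ -s "$ERRFILE" ]; then        # non-empty means find or rm reported an error
    mail -s "MySQL backup cleanup error" admin@companyname.com < "$ERRFILE"
fi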
Here is the block of code (the part in red is the code that doesn't work; the file is not created, and see the output after the code). The i loop creates the environment structure and the k loop creates the standard procedure sub-structure.
for i in TRAX2 TRAX BENCH PROD
do
    eval mkdir $"acsayul02501_${i}"
    eval chmod 2770 $"acsayul02501_${i}"
[code]....
I am trying to grep multiple numbers from a file; grep does have the -f option for that.
Code: grep -f <`seq 500 520` /etc/passwd
I know this could be done with
Code: for i in `seq 500 520`; do grep "$i" /etc/passwd; done
But my question goes well beyond this example: is it possible to redirect one command's output so that it is treated as the content of a file for another command?
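Yes, with bash process substitution: the output of one command is presented to another command as if it were a file. A minimal sketch of the idea:
Code:
grep -f <(seq 500 520) /etc/passwd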
I am curious if perhaps I am doing something wrong extracting pages from a pdf doc using pdftk and creating a new file. I am only extracting the odd pages from the file and outputting them to a new file that is now only 20 pages instead of the input's 40 pages, yet the new output file is still 1.4Mb in size, the same as the original.
It seems strange to extract only half the pages of a large document and end up with a result that is the same size. Is there a way to streamline the resulting PDFs using pdftk?
BTW this is the command I am using, in case perhaps I am missing an option to optimize file size or something:
Code:
pdftk A=ch15.pdf cat A1-40odd output odd.pdf
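pdftk's cat operation largely copies the objects it is given, so shared fonts and images referenced by the removed pages can remain in the output. One possible follow-up step (an assumption about what might help, not something from the original post) is to rewrite the extracted file through Ghostscript:
Code:
gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/ebook \
   -dNOPAUSE -dBATCH -sOutputFile=odd-small.pdf odd.pdf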
When I try to copy PDF files from one folder to another folder, it gives me this error: "Error while copying "2004-SNUG-Europe-paper_...log_DPI_with_SystemC.pdf". There was an error copying the file into /media/CCDCE66BDCE64F70/Backup Master/Heterogeneous_cosimulation/Documentation" "Error splicing file: Input/output error". What is the reason for this error, and how can it be fixed?
I have a 7.2 GB file (a VMware virtual machine file) that I am trying to copy from its original location to another folder or to an external hard drive. Each time I try to do this, I always get the following error after the copying process reaches exactly 1.4 GB:
Error reading from file input/output error
And I have to either Cancel or Skip
I've tried to split the file into smaller pieces, but that didn't work, as I still get the same error whenever I try to compress, split, or do any other operation on this file. How can I copy this file?
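One thing worth trying (an assumption that the read error comes from bad sectors on the source disk): GNU ddrescue copies around unreadable blocks instead of aborting, and it also works on a single file. The file names here are placeholders.
Code:
ddrescue -v source.vmdk /media/external/source.vmdk rescue.log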
I am new to shell scripting. What I am trying to do is write a shell script that takes an input file and produces output like that described below. The output file should have the data up to SOK (marked in red) from every second line, and then the selected data (marked in green) from the fourth line. So the selected data from the 2nd and 4th lines goes into one line of the output file, and similarly the selected data from the 6th and 8th lines goes into the second line of the output file. Input file:
3c3
< c1111;11.11.11.11;pOK;SOK:abcde;Universe:aa
---
[code]...
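A rough sketch of the pairing logic only; since the full input and the exact fields are not shown, the field selection here is an assumption:
Code:
awk '
    NR % 4 == 2 { first = $0; sub(/SOK:.*/, "SOK", first) }   # keep everything up to SOK from every 2nd line
    NR % 4 == 0 { print first, $NF }                          # plus the selected data from the 4th line (assumed)
' inputfile > outputfile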
I am using the below script to FTP a file to a remote machine:
Code:
#!/bin/bash
ftp -nv <<EOF
open ${SERVER}
[code]....
When I execute the above file, it works fine and displays output on the screen. How can I log the output to a file?
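A minimal sketch of one way to capture it: the whole here-document session can be redirected once (SERVER, the credentials and the log path are placeholders):
Code:
#!/bin/bash
ftp -nv <<EOF > /var/tmp/ftp_transfer.log 2>&1
open ${SERVER}
user ${FTPUSER} ${FTPPASS}
put myfile.txt
bye
EOF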
As I am new to C++, I couldn't figure out how to read an input file, make some changes to the data, and produce an output file. The problem I have is this:
"Program that processes an input file and produces an output file. The input file will contain lines of data, each containing two floating point numbers. The lines of the output file should contain the two numbers read and their average (with a '$' sign and 2 places after the decimal point)."
I'm having the following problem in a Linux environment. I have a basic file named "BASEFILE", as shown in the example below:
sl.no pol.no name status loan
1 123 rama FORCE 500
2 234 jama LAPSE 800
[code]...
I have a file with something like:
** The total time for processing is 1245 seconds *
When I do
awk 'BEGIN{FS="The total time for processing"} {print $2 }' file
I correctly get "1245 seconds *" on screen, but when I try to redirect this to a file with
awk 'BEGIN{FS="The total time for processing"} {print $2 }' file > output
then output is empty, i.e. the "1245 seconds *" is not saved in it. Do you know why?
With the command "tail -300 /var/log/apache2/access.log | less" I can look at the 300 latest visitors in the log, and I wanted to ask if it's possible to get that command to run from a PHP file, and if so, how?
Lex's (actually Flex's) output contains this:
Code:
#ifdef __cplusplus
extern "C" int yywrap (void );
[code]...