Ubuntu Servers :: Log Stdout To File During Boot?
Nov 30, 2010
do you know of a way to redirect stdout during boot time to have a log of the entire process?
I have a remote server which is not booting, and I would like to know at which point it gets stuck. What could I do?
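One possible angle, sketched here rather than a definitive answer: on sysvinit-based Ubuntu releases of that era, bootlogd can capture console output to /var/log/boot, and for a box that hangs remotely a serial console kernel parameter lets you watch the whole sequence. The device and speed below are assumptions to adapt to the hardware.
Code:
# enable bootlogd so console messages end up in /var/log/boot
sudo sed -i 's/^BOOTLOGD_ENABLE=No/BOOTLOGD_ENABLE=Yes/' /etc/default/bootlogd

# or add a serial console to the kernel line in the GRUB config,
# e.g. (hypothetical port/speed):
#   console=ttyS0,115200n8 console=tty0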
View 3 Replies
Jun 7, 2010
I have a command line server that logs to stdout, which I start along the lines of ./server > log.txt
What I want to do is limit the size of log.txt, without modifying the server.
I am assuming there must be some kind of tool already that lets me do this, something like where I can pass in my server, the output file and a size limit? If so, can anyone enlighten me?
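Two generic possibilities, sketched with example sizes and file names rather than tested against this particular server: head gives a hard cap (the server will get SIGPIPE on its next write once the cap is hit, which may or may not be acceptable), while split rolls the output into fixed-size chunks.
Code:
# Option A: keep only the first 10 MB of output
./server | head -c 10M > log.txt

# Option B: roll output into 10 MB pieces (log.00, log.01, ...)
./server | split -d -b 10M - log.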
View 3 Replies
View Related
Apr 20, 2010
I have a script where I want to redirect stdout to the terminal and also to a log file, as well as redirecting stderr to the same log file but not the terminal. I have the following code, which I found on the net, which redirects both stderr and stdout to the terminal and the log file:
Code: if [ -p $PIPE1 ]
then
rm $PIPE1
[code]...
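If bash process substitution is acceptable, a shorter route than the named-pipe approach might be the following sketch (the log file name is just an example): stdout goes through tee to both the terminal and the log, while stderr is appended to the log only.
Code:
LOG=script.log
exec > >(tee -a "$LOG") 2>> "$LOG"

echo "this line reaches the terminal and $LOG"
echo "this line goes only to $LOG" >&2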
View 3 Replies
View Related
Dec 27, 2008
I cannot redirect output from commands such as iptables, iptables-save, and ifconfig. For example, none of the following works (as root):
Code:
iptables > tmp
iptables-save > tmp
ifconfig > tmp
The file tmp is ALWAYS blank, that is, 0 bytes in size. Wackier things DO work, such as:
Code:
echo "`iptables-save`" > tmp
iptables-save | tee tmp
Other commands like:
Code:
ls > tmp
DO work as expected.
Note that this problem happens regardless of whether I log in remotely via ssh or locally on the computer in question. I am clueless as to what is causing this. Any ideas? The box is running 2.6.25-14.fc9.i686 and boots to runlevel 3. The modifications I've made to the box since installing the OS are things like compiling/installing the latest OpenSSH, OpenSSL, httpd, BerkeleyDB, subversion, zlib, etc. -- nothing really out of the ordinary, I'd say.
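Hard to diagnose from here, but two quick checks might narrow it down: whether the shell is really running the binaries (and not an alias or function), and whether the text is actually arriving on stderr rather than stdout. A sketch:
Code:
# alias, shell function, or the real binary?
type iptables iptables-save ifconfig

# separate the streams and see where the output really goes
/sbin/iptables-save > out.txt 2> err.txt
ls -l out.txt err.txt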
View 5 Replies
View Related
Sep 26, 2010
I have a process which logs output to log.txt. If I want to see the process's status in real-time, is there a way to echo that output to stdout instead of opening the log in a text editor and constantly reloading?
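For watching a growing log live, tail is usually all that's needed; a minimal example using the file name from the post:
Code:
# -f follows the file and prints new lines as they are appended;
# -n 50 shows the last 50 lines first for context
tail -n 50 -f log.txt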
View 3 Replies
View Related
Aug 16, 2011
I want to keep a trace of the URL I visit, so I use a command line like this:
tcpdump -ien1 -v -X 'tcp port 80' | sed -nl
's/^.0x[0-9a-f]{4}:.{43}(.)$/1/p' |perl break.pl |perl -pe
's/(GET|POST).(.*?).HTTP/1....Host:.([a-zA-Z._0-9-]*)../"
BEGURL
[Code]....
I also tried redirecting stdout and stderr to /tmp/out; it's still empty. The file has write access. I have no idea what it can be. Is there anything else besides stdout and stderr?
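One thing worth ruling out: stdio buffering. sed and perl buffer their output when writing to a pipe or file, so nothing appears until a buffer fills. A hedged sketch of the line-buffered version; the grep here is only a stand-in for the real sed/perl extraction in the post, and en1 is the interface name taken from it:
Code:
# -l makes tcpdump line-buffer its stdout when piping;
# --line-buffered does the same for grep
sudo tcpdump -i en1 -l -v -X 'tcp port 80' 2>/dev/null \
  | grep --line-buffered -o 'Host: .*' \
  | tee -a /tmp/out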
View 2 Replies
View Related
Jan 22, 2010
In this example, why does blacklist end up in the file blacklist and $a end up in stdout?
[code]...
The desired result is a file containing, for each line of lsmod output whose first word begins with snd_, that word preceded by the word blacklist.
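If the end goal is just a "blacklist <module>" line for every snd_ module, a single awk pass may be simpler than shuffling shell variables; a sketch with an example output file name:
Code:
# print "blacklist <first word>" for each lsmod line starting with snd_
lsmod | awk '$1 ~ /^snd_/ { print "blacklist", $1 }' > blacklist-sound.conf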
View 4 Replies
View Related
May 12, 2009
I have a somewhat complex Makefile system. A parent Makefile calls dozens of Makefiles in subdirectories, and each subdirectory Makefile calls a shell script to do the real building. I want to capture all the output this Makefile system generates, so I employ "make 2>&1 > make.log", but not all output messages end up in make.log. The messages generated by the shell scripts that the sub-Makefiles call are not recorded in make.log. Another curious thing is that if I launch "make 2>&1 > make.log" from a Perl script, all output does get sent to make.log.
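One detail that may explain at least part of this: redirections are processed left to right, so "make 2>&1 > make.log" first points stderr at the terminal's stdout and only then points stdout at the file -- stderr never reaches make.log. The usual ordering is sketched below.
Code:
# stdout to the file first, then stderr to wherever stdout now points
make > make.log 2>&1

# bash shorthand for the same thing
make &> make.log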
View 2 Replies
View Related
Aug 5, 2011
I'm having issues getting the output from a script logged to a file. I need the script to send both stderr and stdout to the same text file.
At present I have the following script:
Code:
#!/bin/bash
echo TR3_1 > printers.txt
snmpget -v 1 -c public 10.168.**.* SNMPv2-SMI::mib-2.43.10.2.1.4.1.1 &>> printers.txt
[Code].....
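&>> is bash 4 syntax and quietly misbehaves if the script ends up running under sh or an older bash; if that is what is happening here, the portable spelling is worth trying -- a minimal sketch with the same command:
Code:
#!/bin/bash
echo TR3_1 > printers.txt
# portable "append both streams" form
snmpget -v 1 -c public 10.168.**.* SNMPv2-SMI::mib-2.43.10.2.1.4.1.1 >> printers.txt 2>&1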
View 4 Replies
View Related
Sep 7, 2010
I'm writing a script to execute bash commands in the PHP CLI. I would like to suppress errors from bash and write my own error message if an error occurs. So far I have this (assuming log.txt doesn't exist!):
Code:
tac log.txt 2>/dev/null
which works as expected: tac raises an error, but the error is suppressed. However, when I use this:
Code:
tac < log.txt 2>/dev/null
I get:
Code:
bash: log.txt: No such file or directory
The tac error is suppressed but bash still gives me a dirty error.
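The difference is who reports the error: in "tac < log.txt 2>/dev/null" the shell itself tries to open log.txt, and since redirections are processed left to right the failure is reported before 2>/dev/null takes effect. Two ways around it, sketched:
Code:
# let the outer redirection cover bash's own message too
{ tac < log.txt; } 2>/dev/null

# or simply put the stderr redirection first
tac 2>/dev/null < log.txt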
View 2 Replies
View Related
Jun 1, 2011
I have a Linux program which can write information to stdout and stderr.
I have a shell script which redirects that output to a file in /var/log. (Via >> and 2>&1.)
Is there a way to make that log file rotate? (max size, then switch to a different file, keep only a limited number of files)
I've seen a few answers which talk about the logrotate program, which sounds good, but they also seem to be focused on programs which are generating log files internally and handle HUP signals. Is there a way to make this work with a basic output redirection script?
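It can work: logrotate's copytruncate option copies the live file and truncates it in place, so a program writing through a plain >> redirection never needs to reopen anything or handle HUP. A hedged sketch of a snippet for /etc/logrotate.d/, with example paths and limits:
Code:
/var/log/myserver.log {
    size 10M          # rotate once the file passes 10 MB
    rotate 5          # keep at most 5 rotated files
    compress
    missingok
    notifempty
    copytruncate      # copy then truncate, so the >> redirection keeps working
}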
View 3 Replies
View Related
Jun 30, 2009
I want to parse my mail log file and reuse the results but I'm having a hard time structuring the syntax. Something like:
Code:
grep hostname /var/log/mail.log |
grep NOQUEUE: |
sed -e 's/hostname postfix\/smtpd\[[0-9]*\]: //g'
At this point I want to redirect what I have in hand to a file but also... fork? split? whatever the term is, continue onward so that I can pipe the results further into wc -l or sort or programX, without having to re-loop through that huge log file.
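tee together with bash process substitution can do exactly that "fork": save the filtered result to a file and feed one or more further pipelines in the same single pass over the log. A sketch with example output names:
Code:
grep hostname /var/log/mail.log \
  | grep NOQUEUE: \
  | sed -e 's/hostname postfix\/smtpd\[[0-9]*\]: //g' \
  | tee noqueue.txt >(wc -l > noqueue.count) \
  | sort > noqueue.sorted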
View 2 Replies
View Related
Mar 24, 2011
I want to have the output of a program go to two different files but not to standard out. Is there a way to do this in bash? I know that in Z shell it's really easy; something like: Code: echo "test" >> file1 >> file2 would work. But in Bash it doesn't seem that easy. I know that tee will send the output to two files, but it also sends it to STDOUT. Something like: Code: echo "test" | tee -a file1 file2 would put the word "test" in file1, file2, and STDOUT. Is there a way to just send the output to file1 and file2?
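If the only objection to tee is its copy to the terminal, discarding tee's own stdout gives the bash equivalent of the zsh behaviour; a minimal sketch:
Code:
# tee still writes to both files; its stdout is thrown away
echo "test" | tee -a file1 file2 > /dev/null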
View 2 Replies
View Related
Aug 12, 2010
Why doesn't this work?
Code:
cat | myvar=$(</dev/stdin) <<EOS
This is some content
EOS
echo "$myvar"
I have also tried several variants of cat in place of the redirection.
Code:
$(cat -)
$(cat)
$(cat /dev/stdin)
None of the variants print "This is some content"
I have a version that works:
Code:
myvar=$(cat <<EOS
This is some content
EOS
)
echo "$myvar"
but I don't like the syntax, and it doesn't play nicely in my editor with code folding.
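The first form fails because each side of a pipeline runs in its own subshell, so myvar is assigned in a child that exits immediately. If the $(cat <<EOS ...) spelling is the problem, read -d '' with a here-document is another bash-only way to avoid both the pipeline and the trailing parenthesis -- a sketch:
Code:
# -r keeps backslashes literal; -d '' reads until end of input
# (read returns non-zero at EOF here, but myvar is still populated)
read -r -d '' myvar <<'EOS'
This is some content
EOS

echo "$myvar"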
View 5 Replies
View Related
Apr 5, 2011
I want to send both stdout and stderr to a log file; moreover, I also want to log stderr to a separate log file, and print stderr to the screen. I searched around and tried:
Code:
$ command 2>&1 > log | tee -a log log.err
But then in log all of the stdout appears first, and then the stderr.
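That ordering happens because stderr is buffered through the pipe and tee while stdout goes straight to the file, so the two paths flush at different times. Splitting the streams with process substitution keeps each flowing as it is produced, though exact interleaving inside the shared log is still not guaranteed -- a hedged sketch, with "command" standing in for the real program:
Code:
# stdout -> log only; stderr -> log + log.err + the screen
command > >(tee -a log > /dev/null) \
        2> >(tee -a log log.err >&2)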
View 1 Replies
View Related
May 19, 2010
Code:
MY_STDOUT=`my-command`
MY_STDERR=`my-command >&2`
That is, I want to run my-command only once and get the same result. I've tried this:
Code:
YYY=$(XXX=`{ echo -n 111; echo 222 >&2; }` 2>&1); echo $XXX $YYY
where "{ echo -n 111; echo 222 >&2; }" is my-command. But it gives this output:
Code:
222
111
instead of "111 222". What's wrong in my script?
View 2 Replies
View Related
Mar 11, 2011
When iwconfig is redirected thus Code: iwconfig >> wireless.txt things like Code: eth0 - Has no wireless extensions are still output to the terminal (stdout)
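Those "no wireless extensions" messages come out on stderr, which is why redirecting stdout alone doesn't catch them. A minimal sketch, depending on whether they should be kept or dropped:
Code:
# capture them along with the normal output
iwconfig >> wireless.txt 2>&1

# or silence them entirely
iwconfig 2>/dev/null >> wireless.txt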
View 2 Replies
View Related
May 5, 2013
I'm piping stdout from mplayer to awk, but the output stutters.
Code:
mplayer audiofile.m4a 2>&1 | awk -vRS="
" '$1 ~ /A:/ {print $0; fflush();}'
Instead of a steady stream of lines to the terminal, output only appears every few seconds, between 6 and 12. This happens whether the input is from mplayer or avconv/ffmpeg. This never used to happen (a few years ago), so I wondered whether an awk update caused this behaviour.
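The fflush() only flushes awk's output side; the stutter usually comes from awk buffering its input when reading from a pipe (mawk, the default awk on Debian/Ubuntu, is known for this). Two hedged workarounds -- the A:-matching program here is a simplified stand-in for the one in the post:
Code:
# mawk's interactive mode turns off the input buffering
mplayer audiofile.m4a 2>&1 | mawk -W interactive -v RS='\r' '$1 ~ /A:/ { print }'

# or force unbuffered output on the producer side with coreutils' stdbuf
stdbuf -o0 mplayer audiofile.m4a 2>&1 | awk -v RS='\r' '$1 ~ /A:/ { print; fflush() }'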
View 6 Replies
View Related
Jan 17, 2014
I am writing a script that calls a program which writes a lot of lines to stdout continuously. If the last line in stdout matches some regex, THEN certain variables are updated. My problem is that I don't know how to do that.
A simplified example would be (it's not my exact case, but it I write it here to clarify): suppose I issue a ping command (which writes output to stdout continuously). Every time that the response time is t=0.025 ms, THEN, VARIABLE1=(column1 of that line) and VARIABLE2=(column2 of that line).
I think the following code would work in awk (however, I want the variables in bash and I don't know how to export them)
Code:
ping localhost | awk '{ if ( $8 == "time=0.025" ) var1=$1 var2=$2}'
In the previous code, awk analyzes each line of the output of the ping command as soon as it is created, so the variables $var1, $var2, ... are updated at the appropriate time. But I need the "real-time" updated values of $var1, $var2 in bash, for later use in the script.
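One way to keep the values inside the running bash process is to let bash itself read the stream through process substitution (a plain "ping | while read" loop would set the variables in a subshell and lose them). A sketch built around the example from the post:
Code:
while read -r line; do
    set -- $line                       # split the line into positional parameters
    if [ "$8" = "time=0.025" ]; then   # same field positions as the post's awk
        var1=$1
        var2=$2
        echo "updated: var1=$var1 var2=$var2"
    fi
done < <(ping localhost)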
View 7 Replies
View Related
Mar 26, 2011
I tried this command to print the buffer of an existing screen session to stdout but I don't know why it doesn't print anything.
screen -x lftp -X hardcopy /dev/fd/1
screen -x lftp -X hardcopy /dev/stdout
It works if I use a regular file instead, so why doesn't it print to stdout when I use /dev/fd/1? I do this with other applications that don't have an option to write to stdout and it works, so what does GNU Screen do that makes it not work?
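A guess as to why: the hardcopy is written by the screen session's own process, whose descriptor 1 is not the terminal the screen -x command runs in, so /dev/fd/1 and /dev/stdout resolve to somewhere unexpected. A workaround sketch that keeps using a regular file but still ends up on stdout:
Code:
tmp=$(mktemp)
screen -x lftp -X hardcopy "$tmp"
sleep 1        # give the session a moment to write the file
cat "$tmp"
rm -f "$tmp"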
View 1 Replies
View Related
Feb 22, 2011
I have a startServer.sh command in a shell script along with a bunch of other commands. The startServer.sh command prints to stdout and keeps printing while the server is up. However, even though I want to start the server, I also want the script to continue executing the commands after ./startServer.sh in the same flow.
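If the server only needs to be started, not waited for, running it in the background (and pointing its output at a log so it doesn't mix with the rest of the script) may be enough -- a sketch with example file names:
Code:
#!/bin/bash
./startServer.sh > server.log 2>&1 &   # start the server detached from the script's flow
server_pid=$!
echo "server started with PID $server_pid"

# ...the remaining commands run immediately...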
View 4 Replies
View Related
Jun 3, 2011
I have several commands in a bash script, and in the middle of the script there are several commands whose output and error streams I want to redirect to a file. I think I could simply add '>> myfile.txt' to the end of every command, but is there a way to set it before that block of commands, then reset the streams to their original state at the end of that block?
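Two common ways, sketched with command1/command2 standing in for the block and an example file name: redirect a brace group as a unit, or save the original descriptors with exec and restore them afterwards.
Code:
# Option A: redirect the whole group
{
    command1
    command2
} >> myfile.txt 2>&1

# Option B: save, redirect, restore
exec 3>&1 4>&2             # keep copies of the original stdout/stderr
exec >> myfile.txt 2>&1
command1
command2
exec 1>&3 2>&4 3>&- 4>&-   # put everything back and close the spares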
View 1 Replies
View Related
Jul 3, 2009
I have this expect process:
Code:
spawn -noecho telnet my.host.com
expect {
[code]...
View 3 Replies
View Related
Nov 24, 2010
I'm trying to write a program that will fork a series of FTP sessions. For each session, there should be separate input and output files associated with stdin and stdout/stderr.
I keep reading how I should be able to do that with dup2() in the child process before the execl(), but it's not working for me. Could someone please explain what I've done wrong? The program also has a 30-second sniper alarm for testing and killing of FTPs that go dormant for too long.
The code: (ftpmon.c)
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
[code]....
The output:
$ ftpmon
Connected to gila-crstest.gilacorp.com (172.16.20.8).
220 (vsFTPd 2.0.1)
ftp> waitpid(): Interrupted system call
Why am I getting the ftp> prompt? If the dup2() works, shouldn't it be taking input from my script and not my terminal? Instead, it does nothing, and winds up getting killed after 30 seconds. The log file is created, but it's empty after the run.
View 3 Replies
View Related
Aug 8, 2010
I'm working on an application used for backup/archiving. That can be archiving contents on block devices, tapes, as well as regular files. The application stores data in hard packed low redundancy heaps with multiple indexes pointing out uniquely stored, (shared), fractions in the heap.
And the application supports taking and reverting to snapshots of the total storage on several computers running different OSes, as well as simply taking on archiving of single files. It uses Hamming-code diversity to defeat disk rot, instead of using RAID arrays, which have proven to become pretty much useless when the arrays climb over some terabytes in size. It is intended to be a distributed CMS (content management system) for a diversity of platforms, with a focus on secure storage/archiving. I have a unix shell tool that acts like gzip, cat, dd etc. in being able to pipe data between applications.
Example:
dd if=/dev/sda bs=1b | gzip -cq > my.sda.raw.gz
the tool can handle different files in a struct array, like:
Code:
enum FilesOpenStatusValue {
FileIsClosed = 0,
FileIsOpen,
[code]....
Is there a better way of getting the file name of the redirected file (respecting the fact that there may not always exist such a thing as a file name for a redirection pipe)?
Should I work with inodes instead, and then take a completely different approach when porting to non-unix platforms? Why isn't there a system call like get_filename(stdin); ?
If you have any input on this, or some questions, then please don't hesitate to post in this thread. To add some off-topic to the thread -- here is a performance tip: when doing data shuffling on streams, one should avoid just using some arbitrary record length (like 512 bytes). Use stat() to get the recommended block size in stat.st_blksize and use copy buffers of that size to get optimal throughput in your programs.
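On Linux specifically, something close to get_filename(stdin) does exist through /proc: every open descriptor is a symlink to whatever it refers to -- a real path for a file redirection, or a marker like pipe:[12345] when there is no name. A shell-level sketch; the same idea works from C by calling readlink(2) on the same path:
Code:
#!/bin/bash
# show what this process's stdin is actually connected to
target=$(readlink /proc/$$/fd/0)
echo "stdin is: ${target:-unknown}" >&2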
View 4 Replies
View Related
Nov 12, 2010
I'm getting (1) various errors spat out on STDOUT while running X and (2) a hard fault in OpenOffice.
(1)
> Warning: Duplicate shape name ""
> Using last definition
[code]....
View 3 Replies
View Related
Apr 1, 2011
I'm curious if anybody can shed some light for me in this department. We're in a large environment with a Windows DHCP Server. We have been tinkering with LTSP on Edubuntu as thin and fat clients. It works great, but right now we just have 1 server handling the lab, which works fine unless we want to expand, which may be very possible.
These are the instructions I received:
Login to your windows server and load the DHCP configuration screen
Create a DHCP reservation for the MAC address you obtained
Add the configuration options below to enable the machine to boot from the LTSP server
017 Root Path: /opt/ltsp/i386
066 Boot Server Host Name: <ip address>
067 Bootfile Name: ltsp/arch/pxelinux.0 # Specify CPU architecture in place of 'arch', for instance 'i386'
From: [url]
I'm curious, what if I want to have multiple Ubuntu servers on the network that I want to have bootable? For example, let's say I have 3 labs, and 3 servers. Server A to Lab A, Server B to Lab B, and Server C to Lab C. I want all C's computers to boot to C, and B to B, A to A, etc.
1 - How would I add multiple entries on the Windows DHCP Server to allow all 3 (A B C) servers to boot?
2 - How would I be able to isolate the clients so ONLY Lab A clients boot to Server A, etc?
View 7 Replies
View Related
Jul 26, 2010
I'm using libxml2 to handle/manipulate some XML files. In order to check the consistency of a XML file, I have a DTD and I'm using the xmlValidateDtd method to compute the check.
However, when an error occurs during the check (for example, an attribute is missing in an XML tag), libxml2 writes the error to stdout/stderr. For example:
Code:
/home/XML/FreeFour.xml:18: element CA: validity error : Element CA does not carry attribute maxlength
The method returns the right result (true or false depending on the check), but the errors that occur are written to stdout/stderr, and I actually don't want that.
View 4 Replies
View Related
Jan 7, 2011
I'm planning to add a 1 TB SATA disk to my lovely file server under Ubuntu 10.10. What I want is to use this disk as additional storage for network users. Can it be readable from both Windows and Ubuntu? I mean, when my Ubuntu server goes down (worst case), I can easily take the disk out of the Ubuntu machine and plug it into a Windows machine.
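If moving the disk straight to a Windows box is the priority, formatting it as NTFS (which Ubuntu 10.10 reads and writes through ntfs-3g) is one option; a hedged sketch, assuming the new disk appears as /dev/sdb and already carries a single partition:
Code:
# quick-format the partition as NTFS with a label (device name is an assumption)
sudo mkfs.ntfs -Q -L storage /dev/sdb1

# mount it; if it looks right, add a matching ntfs-3g line to /etc/fstab
sudo mkdir -p /srv/storage
sudo mount -t ntfs-3g /dev/sdb1 /srv/storage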
View 2 Replies
View Related
Jan 14, 2011
I am using Python as a CGI for a simple game that I'm planning to run on a website. It requires the user to enter his name and age. This is saved in a newly created file named after the user. However, I'm getting this error: The above is a description of an error in a Python program, formatted
63 for a Web browser because the 'cgitb' module was enabled. In case you
64 are not reading this in a Web browser, here is the original traceback:
65
66 Traceback (most recent call last):
67 File "/var/www/webprog.cgi", line 51, in <module>
68 main()
69 File "/var/www/webprog.cgi", line 44, in main
[Code]...
View 4 Replies
View Related