Ubuntu :: Writing Log Files With A Shell Script?
Feb 6, 2011. I want to perform an action with a shell script and then log the event to a file in /var/log. However, I keep getting "permission denied" errors.
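A common workaround, since /var/log is writable only by root: hand the message to syslog with logger(1), or create a dedicated log file once as root and make it writable by the script's user. A minimal sketch (the script name and log file are placeholders):
Code:
# option 1: let syslog do the writing (ends up in /var/log/syslog or /var/log/messages)
logger -t myscript "backup finished"

# option 2: create a user-writable log file once, as root
sudo touch /var/log/myscript.log
sudo chown "$USER" /var/log/myscript.log
echo "$(date): event happened" >> /var/log/myscript.log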
I am trying to write input to a shell and get the shell to parse that input as if a user were typing commands.
So far I have tried echoing text into the shell's stdin file descriptor under /proc/<pid>/fd. While this displays the text that I echo, the shell never tries to execute the command I pass to stdin. What is the difference between a shell taking stdin from the user and data written to stdin by another process, e.g. echo? It appears I am missing something fundamental.
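What is likely happening: for an interactive shell, /proc/<pid>/fd/0 points at the terminal device, so writing there only paints characters on the screen; the shell never sees them as input (only the terminal driver, or an ioctl such as TIOCSTI, can inject real keystrokes). A hedged sketch of the usual alternative, feeding a shell through a pipe it actually reads from:
Code:
mkfifo /tmp/cmdpipe
bash < /tmp/cmdpipe &      # a shell whose stdin IS the pipe, not a terminal
exec 3>/tmp/cmdpipe        # hold a writer open so the shell doesn't see EOF
echo 'date' >&3            # this IS executed by the background shell
echo 'exit' >&3
exec 3>&-                  # close the writer; the background shell exits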
I need help creating a script that writes a log file recording every user who runs the ftp command (information like username and date) and the server he is trying to connect to.
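One hedged approach: put a small wrapper ahead of the real ftp in $PATH; the wrapper appends the user, date, and target server to a log file and then hands off. The log path is a placeholder and must be created writable for all ftp users beforehand:
Code:
#!/bin/sh
# hypothetical wrapper, installed e.g. as /usr/local/bin/ftp
LOG=/var/log/ftp_usage.log   # create once, group-writable, before first use
echo "$(date '+%Y-%m-%d %H:%M:%S') user=$(id -un) server=$1" >> "$LOG"
exec /usr/bin/ftp "$@"       # hand off to the real client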
I am about to write my first shell script for backing up my server.
These are the steps I have identified so far.
1. Copy the files to be backed up to /srv/backup/
2. Run mysqldump and copy the dump file to /srv/backup/databases
3. Run duplicity to back up /srv/backup/* to another folder on my machine
I am writing a bash shell script that will run every day and carry out the three tasks above.
Note: point 3 (backing up to a local folder) is only a temporary measure, to allow me to understand what I'm doing, since all the tools I am using are new to me. Once I can back up and restore correctly, I will use duplicity to compress and encrypt the files and upload them offsite.
If my understanding of duplicity is correct (according to the documentation), the first run of the script will do a FULL backup, and every subsequent backup will be incremental. I will then force a FULL backup on, say, weekends.
First things first though - I have a few questions:
I would like to use backup rotation for the scheme described above; what kind/type of rotation would you recommend? And once rotation is implemented, how can I restore from a particular day back in time (assuming the backup exists, of course)?
I am running Ubuntu 10.04.
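A minimal, untested sketch of the nightly job described above, with placeholder source paths; the duplicity commands for the forced full backup and the point-in-time restore follow standard duplicity command-line usage:
Code:
#!/bin/bash
# step 1: copy the files to be backed up
rsync -a /var/www/ /srv/backup/files/
# step 2: dump the databases into the staging area
mysqldump --all-databases > /srv/backup/databases/db-$(date +%F).sql
# step 3: duplicity does a full backup first, then incrementals
duplicity /srv/backup file:///srv/backup-archive

# force a weekly full backup (e.g. from a Sunday cron entry) with:
#   duplicity full /srv/backup file:///srv/backup-archive
# restore the state of a particular day with:
#   duplicity restore --restore-time 2011-01-30 file:///srv/backup-archive /srv/restore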
I want to run the gallist package from IRAF several times. The steps for running it are:
> xgterm &
> ecl &
ecl> noao
noao> artdata
artdata> epar gallist    (I set some parameters)
Then typing gallist at the IRAF prompt runs the package:
artdata> gallist
I want this package to run 1000 times, with all the output collected in a single file.
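A hedged sketch, assuming ecl accepts commands on stdin the same way the interactive prompt does (worth verifying); the output-file argument to gallist and the exact parameters should be checked against your epar settings:
Code:
#!/bin/bash
for i in $(seq 1 1000); do
    ecl <<EOF
noao
artdata
gallist gal_$i.lst
logout
EOF
done
cat gal_*.lst > all_runs.lst   # collect all 1000 outputs in one file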
I want to write a single shell script that lets me, once executed from a panel launcher, toggle the image preview setting between "local files only" and "never". Right now I have two tiny scripts, one for local files only and another one for never; one is:
[Code]...
and the other says string "local_only". But that means I need two launchers, because I don't know how to write the condition "when set to never, change it to local only" (and vice versa). And is it possible to make the script also change the launcher's icon depending on which value the preview setting has? That way I'd know what it is set to just by looking at it. It would act as a diagnostic and therapeutic tool XD
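A hedged sketch of a single toggling script, using the GNOME 2 era gconf key for Nautilus previews (the value names match the two existing scripts); the .desktop path for the icon swap is hypothetical:
Code:
#!/bin/bash
KEY=/apps/nautilus/preferences/show_image_thumbnails
if [ "$(gconftool-2 --get $KEY)" = "never" ]; then
    new=local_only
else
    new=never
fi
gconftool-2 --type string --set $KEY "$new"
# optional: swap the launcher icon by rewriting its .desktop entry
# sed -i "s/^Icon=.*/Icon=preview-$new/" ~/.local/share/applications/preview-toggle.desktop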
I have been trying to write a simple snippet of bash shell code to import from 1 to 100 records into bash arrays.
I have a CSV file that is structured like:
record1,item1,item2,item3,item4
record2,item1,item2,item3,item4
record3,item1,item2,item3,item4
record4,item1,item2,item3,item4
And would like to get this data into corresponding arrays as such:
$record1[item1-4]
$record2[item1-4]
$record3[item1-4]
$record4[item1-4]
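A minimal sketch, assuming the first CSV field should become the array's name (record1..recordN as in the sample); eval is used because plain bash 3 lacks namerefs:
Code:
#!/bin/bash
while IFS=, read -r name f1 f2 f3 f4; do
    # creates e.g. record1=("item1" "item2" "item3" "item4")
    eval "$name=(\"\$f1\" \"\$f2\" \"\$f3\" \"\$f4\")"
done < records.csv
echo "${record2[1]}"   # prints item2 (arrays are zero-indexed)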
I'm new to UNIX scripting, and I'm stuck with the following. I have an Oracle SQL script that takes three parameters:
1- File Name
2- File Path
3- File creation date
Under UNIX I have a folder where files are placed frequently, and I need to upload those files to Oracle. What I need is a UNIX script that can do the following:
Loop through the directory "/home/applmgr/snktmp"
Pick only files
Pass the file name to parameter &1
[code]....
Is the above possible? I already know how to call the Oracle script from UNIX; I'm only stuck on writing the UNIX part that lists the file attributes (name, path, date), stores them in the parameters, and loops until the last file in the directory. If the above is not possible, then how can I create the below from the command line? (A sketch covering both variants follows the format lines.)
Filename{concatenation Mark}filePath{concatenation Mark}creationdate
Filename{concatenation Mark}filePath{concatenation Mark}creationdate
Filename{concatenation Mark}filePath{concatenation Mark}creationdate
Filename{concatenation Mark}filePath{concatenation Mark}creationdate
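A hedged sketch covering both variants: loop over the directory, pick only regular files, and either call the SQL script with the three parameters or print the concatenated line. Unix keeps no true creation date, so the modification time stands in; the sqlplus connection string and script name are placeholders:
Code:
#!/bin/sh
for f in /home/applmgr/snktmp/*; do
    [ -f "$f" ] || continue                 # picks only files, skips directories
    name=$(basename "$f")
    path=$(dirname "$f")
    fdate=$(date -r "$f" '+%Y-%m-%d')       # the file's modification date
    # variant 1: hand the three parameters to the Oracle script
    sqlplus -s user/pass @upload.sql "$name" "$path" "$fdate"
    # variant 2: print the concatenated listing instead
    printf '%s|%s|%s\n' "$name" "$path" "$fdate"
done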
Currently when I create a folder, it comes down as 755 permissions.
I want it to come down as 775 permissions by default.
How can I change this?
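New directories get 777 minus the umask, so 755 means the umask is 022. Setting it to 002 in a startup file gives 775 directories (and 664 files) by default; a quick check:
Code:
umask 002        # e.g. appended to ~/.bashrc or /etc/profile
mkdir demo
ls -ld demo      # drwxrwxr-x ... demo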
I am trying to write .pgm images using the O_DIRECT flag in open(). I have a char* buffer which holds the image data. I know that I have to align the buffers and have done so using posix_memalign(), yet only part of the image gets written. Has someone used O_DIRECT for writing files successfully?
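A likely cause worth checking: O_DIRECT requires not only an aligned buffer but also a transfer length (and file offset) that is a multiple of the device's block size, so a final partial chunk can fail with EINVAL and the tail of the image never lands. The same constraint can be seen from the shell with dd:
Code:
# oflag=direct uses O_DIRECT; bs should match the device block size (often 512 or 4096)
dd if=image.pgm of=/mnt/disk/image.pgm bs=4096 oflag=direct
# if the file size is not a multiple of bs, write the last chunk without
# O_DIRECT (or pad it to the block size); some dd versions fail there too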
How can I find matches between two different files when comparing timestamps? The fields I want to match up are in the format:
Jul 26 09:33:02
I have tried reading the files line by line and using awk '{print $1,$2,$3}', which only gets and stores the timestamp from one of the files. I've been looking around and saw this example:
awk 'FNR==NR { a[$3]++; next }   # first file: count each time-of-day field ($3)
     { b[$3]++ }                 # second file: likewise
     END {
         for (i in a)
             for (k in b)
                 # print times occurring exactly once in file 1
                 # that also appear in file 2
                 if (a[i] == 1 && i ~ k) print i
     }' "$FILE" "$FILE2"
Which sorta works, but it's way over my head at the moment. The two files can be found at /var/log/syslog and /var/log/auth.log (using Ubuntu 11.04).
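A simpler hedged alternative to the example above: pull the three timestamp fields out of both logs and let comm report the ones they share:
Code:
awk '{print $1, $2, $3}' /var/log/syslog   | sort -u > ts1
awk '{print $1, $2, $3}' /var/log/auth.log | sort -u > ts2
comm -12 ts1 ts2     # timestamps that appear in both files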
Reading and writing text files in C?
Samba seems to crash and come back after a few seconds if I copy a lot of small files over the network in a short period of time. How do I fix it?
I have Ubuntu 9.10 Server 64bit running on a D945GCLF2 board sharing two 1TB ext4 formatted HDDs to my Windows PCs using samba. I've been having an issue with reading or writing files through samba. It happens during copying operations or checksumming, anything that reads or writes MANY small files in a small amount of time. I am pretty sure the problem has to do with my server because the server has run on two different LANs in different homes and will crash from activity with any of several other PCs. There is no crashing if I access the files through SSH, although when I do that the max transfer speed is less than 1MB/s.
When I induce the crashing, there is absolutely no output to the server terminal.
As an easy access example of something that will crash samba, extracting Cinebench R11.5 to the server will do the job. It always fails.
Whenever I am busy reading or writing large files, or a large number of files, my computer is unresponsive. Screens get greyed out and I can just sit there and wait until the reading/writing is done.
This is not caused by an overstressed CPU, because it is not overstressed. Look at the attachments and you will see the CPU is at about 20%. When these pictures were captured the computer was using hellanzb to unrar a long list of rar files.
When you look at my signature you'll see the computer is not bad at all; just the disk access is slow. I can transfer files at a maximum of 30 MB/s. Is that normal, or is it very slow? I don't know the numbers. I have 2 SATA disks.
O.S. is Mint 9-Isadora, based on Ubuntu 10.04 and I use the 64-bits version.
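One hedged mitigation while the root cause is investigated: run the heavy jobs at idle I/O priority so the desktop's reads win, e.g.:
Code:
# -c3 is the CFQ 'idle' class: the job only gets disk time nobody else wants
ionice -c3 nice -n 19 unrar x archive.rar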
I have tried to configure an Enemy Territory server so that a common user can run it just by executing one command. The first thing I did was write a script like this:
/usr/local/games/enemy-territory/etded +set dedicated 1 +set net_port 27960 +set fs_game etpub +set fs_homepath /usr/local/games/enemy-territory/27960 +set sv_punkbuster 1 +set +exec server.cfg +set +exec punkbuster.cfg +set +exec bots.cfg
and then put it in the /usr/local/bin directory. OK, things seemed fine, but then I realized that the program tries to write some config and log files. I noticed because warnings like "Couldn't write etconfig.cfg" appear on the command line whenever I run the command as a normal user. On the other hand, if I give those files write permission, all the warnings disappear.
But I don't think that is a good approach, because someone could then change those files by hand, which would not be good.
My last try was to set the suid bit on the script, with the command chmod u+s /usr/local/bin/etded-server
But as I already knew, suid does not work on shell scripts, so I wrote a C source like this:
[Code]...
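A hedged alternative to the suid route (suid wrappers around a game server are easy to get wrong): run the server under a dedicated account and grant users that one command via sudo. The group and account names here are hypothetical:
Code:
# /etc/sudoers.d/etserver (edit with visudo):
#   %etusers ALL=(etadmin) NOPASSWD: /usr/local/bin/etded-server
# then any member of etusers starts the server with:
sudo -u etadmin /usr/local/bin/etded-server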
For some time now I have had problems configuring an NFSv4 server to work with a firewall. I've already searched the web, but I was unable to find a solution that works for me.
The situation is as follows:
I'm trying to connect an NFS client to an NFS server that is behind a firewall. I don't have access to this firewall, but I can contact the administrator to open some ports for me. I already did this for opening port 2049.
The result is that the client can read files from the server, but is unable to write files to the server. I believe an extra RPC connection needs to be set up for writing. However, the ports on which the RPC connection is set up seem to differ for every connection (I verified this using 'netstat -tn').
Clearly, this is a problem since the server is protected by the firewall.
Thus, what I want to do is configure the server in such a way, that it always uses the same server-side port(s) to connect with the writing clients (just like 2049 for reading). I've already tried to configure the /etc/default/nfs-kernel-server and /etc/default/nfs-common files, but that hasn't really worked out yet.
Note: Because I don't like to contact the system admin every day, I hooked up 2 computers (client/server) on which I set up the same configuration (without the firewall). I'd like to see it working on those machines first (that is, 'netstat -tn' showing the correct port), before I contact the admin to open some extra ports.
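A hedged sketch of the usual Debian/Ubuntu way to pin the auxiliary daemons to fixed ports (the port numbers are arbitrary choices to give the firewall admin); note that with pure NFSv4 only port 2049 should be needed, so fixed auxiliary ports mostly matter if the client falls back to NFSv3:
Code:
# /etc/default/nfs-kernel-server
RPCMOUNTDOPTS="--manage-gids --port 32767"

# /etc/default/nfs-common
STATDOPTS="--port 32765 --outgoing-port 32766"

# lock manager ports, e.g. in /etc/modprobe.d/nfs-lockd.conf:
# options lockd nlm_udpport=32768 nlm_tcpport=32768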
How do you send files, save, or otherwise write to CD using Mandriva Linux? On Windows you get a helper menu. Linux does not offer this option in its help file, and you can't click and drag a file into the CD folder. The dialog box reads "you do not have permission to write to this folder" when I try to drag one in, and I can't change the permissions even signed in as root. I don't have a clue. I wish LinuxQuestions would add an emoticon with the expression "what the hell buddy? are you on ten hits of acid?"
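For what it's worth, burning works from the command line even when the desktop helpers refuse; a hedged sketch with the cdrkit tools (paths are placeholders):
Code:
genisoimage -R -J -o backup.iso /path/to/files   # build an ISO image
wodim dev=/dev/cdrom backup.iso                  # burn it to the disc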
I am working on a little project in Python. I have produced this prototype:
Code:
#!/usr/bin/python
# -*- coding: iso-8859-1 -*-
#DocC documentation prototype
[Code].....
1. Every Sunday
2. Find all files older than 1 day
3. Gzip these files
4. Tar up the gzipped files into one tar file
5. Name the tarball with a date stamp indicating the day it was created, so we know which week's files are in it
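A hedged sketch of that job, with a placeholder directory; "every Sunday" is cron's part:
Code:
#!/bin/bash
# cron entry (Sundays, 02:00):  0 2 * * 0 /usr/local/bin/weekly_archive.sh
cd /path/to/files || exit 1
find . -maxdepth 1 -type f -mtime +1 ! -name '*.gz' -exec gzip {} +
tar -cf "archive-$(date +%Y%m%d).tar" ./*.gz && rm -f ./*.gz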
I want to automate this using a script. How do I automate it?
File1:
s.no# 1 name:aaaaaa
city:abcd
[code]...
I synthesized a seismogram using Fortran code, and I need to plot the synthesized seismogram and the data together so I can verify the accuracy of the code. Now I've run into a question: how do I read the SAC data with Fortran code? I have found some code on the Internet; the details follow (velr12a.sac is my data file):
Code:
c read sac file
PROGRAM RSAC
PARAMETER (MAX=1000)
DIMENSION YARRAY(MAX)
CHARACTER*10 KNAME
[Code]....
Well, my drive has errors such that I can't mount it using a GUI, but it will mount and let me see my files using the shell (or whatever it is I'm using while in recovery mode). I have managed to change to the directory that has ALL of the files I want and need to copy. What's the easiest way to copy ALL of the files, which total about 20 GB, onto my 500 GB external HDD? And how will I know everything is done copying?
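A minimal sketch, assuming the external drive shows up as /dev/sdb1 (check with fdisk -l) and the current directory is the one with the files:
Code:
mkdir -p /media/external
mount /dev/sdb1 /media/external
cp -av . /media/external/rescue/     # -v prints every file as it is copied
# cp returns to the prompt when done; as a sanity check, compare totals:
du -sh . /media/external/rescue/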
I've got an interesting challenge for the shell-scripting wizards here. I've got a mySQL dump of three files from my Amarok database, with the intention of copying some files (cover art) to my media server, so that I can keep the server the server and not rely on my local machine.
Step 1: Identify any cover art files on my local machine.
I did this with:
Code:
mysql -u amarok -p amarok -e "SELECT * FROM images WHERE path like '%.kde%'" > cover_art.txt
Output looks like this:
[Code]....
What I have here now is the ENTIRE album list in my collection -- and something to compare the IDs in Step 1 against. I'm going to stop here and will update the thread as I get past this stumbling block. "ID" in cover_art.txt = "image" in albums.txt... straightforward enough, right?
So the question is this: how do I create a simple shell script that loops through the IDs in cover_art.txt (i.e. characters 0 through 4; it will always be a 4-digit ID) and then searches for each ID in albums.txt?
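A minimal sketch under those assumptions (a 4-digit ID at the start of each cover_art.txt line, matched against the image column of albums.txt):
Code:
#!/bin/bash
while read -r line; do
    id=${line:0:4}              # the first four characters: the ID
    grep -w "$id" albums.txt    # -w avoids matching the ID inside longer numbers
done < cover_art.txt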
I cannot find a way to run a command for only a subset of the files in a directory. How can I do it?
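Two common patterns, as a hedged sketch (the command and the patterns are placeholders):
Code:
for f in *.log; do wc -l "$f"; done                 # subset by shell glob
find . -maxdepth 1 -name '*.conf' -exec cat {} +    # subset by find expression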
I have n files, and each file has 5-6 .sql files. Now I need to write a shell script that executes all n files in parallel and, for each file, runs its .sql files sequentially. E.g.:
CRM_File CM_file AP_file
crm_file_1425.sql
cm_file_5789.sql
ap_file_4124.sql
[code]....
i.e. run CRM_File, CM_file, and AP_file in parallel, and the files under each of CRM_File, CM_file, and AP_file sequentially.
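A hedged sketch of that shape (the sqlplus call and credentials are placeholders; glob order decides the sequence within each group):
Code:
#!/bin/bash
run_group() {
    for sql in "$1"/*.sql; do          # sequential within one group
        sqlplus -s user/pass @"$sql"
    done
}
run_group CRM_File &                   # the three groups run in parallel
run_group CM_file &
run_group AP_file &
wait                                   # block until all groups have finished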
I have an issue with a shell script:
#!/usr/bin/ksh
HOST=myhost
USER=myuser
PASSWD=myuser
# -n disables auto-login so the credentials can be supplied explicitly;
# -v prints the server's responses
ftp -nv <<EOF
open $HOST
user $USER $PASSWD
[Code]...
I am trying to read the fields of a file and manipulate them, record by record. Let's say using awk:
awk -F":" '{print $1, $2, $3, $4, $5}' TrackMsgFile.0806
This prints my fields on screen. But I don't want to print the fields while reading the records; instead I want to store them in variables and manipulate them according to my logic. Does awk or some other shell command provide something for this?
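awk can assign fields to variables inside its own script, but if the goal is shell-side logic, the usual idiom is read with a custom field separator; a minimal sketch:
Code:
while IFS=: read -r f1 f2 f3 f4 f5; do
    # each field is now an ordinary shell variable for this record
    echo "first=$f1 fifth=$f5"
done < TrackMsgFile.0806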
I have a huge database of students. I would like to extract this data and write an individual file for each student.
I am running a loop in a shell program (.sh file); the output of each pass of the loop needs to be redirected to a file with a variable name.
I tried the following line, but it did not work ($BodyMsg is the data and $RolNo is the student's roll number):
echo $BodyMsg > $RolNo".html"
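A hedged guess at the fix: quote both expansions so whitespace in the message (or an empty roll number) doesn't break the redirection:
Code:
echo "$BodyMsg" > "${RolNo}.html"   # quoted; also confirm $RolNo is really set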
I continue to work on automating the update and deployment of a vendor's WAR files, and have bumped into my next challenge... The vendor-provided web.xml files have entries that look like this:
Code:
<context-param>
<param-name>siteminder.enabled</param-name>
<param-value>false</param-value>
</context-param>
I need to search the file for a param-name and replace the param-value on the line below it with the correct value. I expect sed or awk is the trick for this, but I am not sure how to search for one line and then update the line below it.
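A hedged sed sketch: match the param-name line, pull in the next line with n, and substitute there (GNU sed; the replacement value "true" is an example):
Code:
sed -i '/<param-name>siteminder.enabled<\/param-name>/ { n; s|<param-value>.*</param-value>|<param-value>true</param-value>| }' web.xml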
I need to add a line to my login startup file (one of ~/.bash_profile, ~/.bash_login, or ~/.profile) and to the startup file run by my shell when started as a non-login shell (~/.bashrc), so as to set up my account environment for one of my courses. I don't know how to proceed with this. I tried doing it in my Ubuntu environment and my system got locked afterwards.
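A minimal sketch, with a hypothetical course setup script; the syntax check at the end is the safeguard against the lock-out, since a broken startup file can make every new shell fail:
Code:
echo 'source /opt/course/setup.sh' >> ~/.bash_profile   # login shells
echo 'source /opt/course/setup.sh' >> ~/.bashrc         # non-login interactive shells
bash -n ~/.bashrc && bash -n ~/.bash_profile            # syntax-check before logging out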