Programming :: Sed Running Under Bash - Getting Error
Jul 18, 2010

~$ sed s/^bb/bbbbbb/ foo1.txt
sed: -e expression #1, char 3: unterminated s command
~$
Where 'b' stands for a space. What is the error here? sed is running under bash.
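The usual cause here is quoting rather than sed itself: without quotes, bash splits the expression at the spaces, so sed only receives s/^ and reports an unterminated s command. A minimal sketch of the quoted form, assuming the intent is to replace two leading spaces with six:
Code:
sed 's/^  /      /' foo1.txt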
I'm having trouble trying to make a script. What I want to do is check if xScreenSaver is running in my user account. If not, run it. If it's running, kill it.
So this is the script I've made:
Code:
The problem is that I've echoed the output of $(pgrep -u $(whoami) xscreensaver) and it always seems to return a PID that goes up by 4 each run, even when the PID shouldn't exist. What do I mean by "doesn't exist"? That no xscreensaver is running under my user: if I run pgrep -u $(whoami) xscreensaver in bash, I get no output, but if I run the command through the script, I get (for instance) 4050. If I run it again, I get 4054, and again 4058, etc. What the hell is going on with that?
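For reference, a minimal toggle sketch along these lines. One common cause of a "phantom" PID that changes on every run is the pattern matching the script's own process (for example if the script file's name contains "xscreensaver"), which is worth checking; pgrep -x requires an exact process-name match and avoids that:
Code:
#!/bin/bash
# Toggle xscreensaver for the current user: kill it if present, start it if absent.
pid=$(pgrep -x -u "$(whoami)" xscreensaver)
if [ -n "$pid" ]; then
    kill "$pid"
else
    xscreensaver -no-splash &
fi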
We have a custom app that runs on boot on some older hardware running DSL Linux, and its startup manager was quite simple. We purchased some newer Asus EeeBoxes which run Xandros, and things are quite stable and run nicely, with one exception: the application only runs from the root (/) location. This box auto-logs-in as 'user', and there is a /home/user/.kde/Autostart folder where you can stick scripts to run at boot. So I have a start.sh script, and with a little bash programming tried things such as:
sudo cd /
sudo /startapp.pl
but the errors start spewing with the basic "can't find data/xyz", as it's looking in the local directory. I thought there was a basic cwd (change working directory) command, but everything I try just forces the run from that location. Any ideas or suggestions are appreciated, but things like "can you change the code" can't be done, so it must be a scripting thing. The only other thought I had, though I'm not sure: can you do a cron job with @boot or something, so that when the box starts it runs this job as root and fires it off?
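One thing worth noting: sudo cd / cannot work, because cd is a shell builtin and the directory change dies with the sudo child process. A hedged sketch of a wrapper for the Autostart folder, assuming /startapp.pl exists and sudo is allowed without a password (both may need adjusting):
Code:
#!/bin/bash
# start.sh: change the working directory first, then launch the app from there.
cd / || exit 1
exec sudo /startapp.pl
As for the cron idea, the usual spelling is an @reboot line in root's crontab, e.g. "@reboot /path/to/start.sh".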
Is this a good way of doing it? Shall I use & when starting the new process? I tested in bash and dash.
testbash=$(ps -e | grep $$ | grep bash)
if [ 0 = ${#testbash} ]; then
echo "new process"
[code]...
I have a bash script that I want to run on a "clean" screen, but when the script finishes/exits I want to see what was previously on the screen. Any thoughts? The clear command does not let me get the information back, and the screen command runs the program in the original window, so you see nothing until the new screen is exited.
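One approach that usually fits this is the terminal's alternate screen buffer, which is what programs like less use: switch to it at the start, and the old contents come back when you switch out. A small sketch, assuming a terminfo-aware terminal:
Code:
#!/bin/bash
tput smcup      # switch to the alternate screen buffer
clear
echo "script output on a clean screen..."
sleep 3
tput rmcup      # switch back, restoring whatever was on screen before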
How can I run a bash script from Java on RHEL 5? I have a very good project in mind.
I am trying to replicate what is happening on this page under the tcsh shell, but using the bash shell found in Wheezy. Here is the page I am referring to: [URL] The command I am trying to replicate is on page 6 under figure 2.4. The command is "prompt> ./mem &; ./mem &".
I would like to run the same program twice, concurrently, but do not know how. Note that I am not trying to use a bash script, but rather by simply using syntax on the command line.
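The book's shell apparently accepts the "&;" form, but in bash the & already terminates the command, so the trailing semicolon is a syntax error and can simply be dropped:
Code:
./mem & ./mem &
Both instances then run in the background; jobs lists them, and wait blocks until they both finish.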
I have around 600 empty text files, and I need to add each file's name as part of its data. I mean files from "file1.txt" to "file599.txt", all of them empty, and I need to get the name inside the file, so when I open the file it shows the name as part of the data ("file1"). These files were created on my web site; I am thinking of a small script in bash.
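A small sketch of the loop, assuming the files all live in one directory and the name wanted inside each file is the base name without the .txt extension (the path is a placeholder):
Code:
#!/bin/bash
cd /path/to/the/files || exit 1       # hypothetical path, adjust as needed
for f in file*.txt; do
    printf '%s\n' "${f%.txt}" > "$f"  # e.g. writes "file1" into file1.txt
done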
I am writing a script-based image manipulator, but I need to know if X is running so I can tell whether to use cacaview or ImageMagick's display command.
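A common way to test for a usable X display is to check $DISPLAY and try a harmless X client such as xset; a hedged sketch (the image name is a stand-in):
Code:
#!/bin/bash
if [ -n "$DISPLAY" ] && xset q >/dev/null 2>&1; then
    viewer="display"      # ImageMagick's X viewer
else
    viewer="cacaview"     # text-mode fallback when X is not available
fi
"$viewer" image.jpg       # hypothetical image file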
I am running a simple script that I copied from slug.ceca.utc.edu/docs/2009-3-26-linux-server-health.pdf and edited with the names and paths of my own servers. I don't know much about scripting (read: nothing), but I wanted to try to be efficient in my new role as a Linux sysadmin. The script was saved to root's home directory and runs as part of root's crontab once a week. The script runs with no problem, but it doesn't actually seem to run all of the commands contained within; it skips some in the middle and at the end, and I don't know why. The script itself is this:
Code:
#!/bin/bash
uname -a > /tmp/server.txt
[code]...
How do you catch user input while the script is running? Or, how would you make two scripts run at the same time but pass input from one script to the other? The program I'm trying to make echoes text on the screen continuously, but while that's happening I want the user to be able to type something, so the program can detect the input and display something else. So I thought maybe I could make two scripts run to do each task.
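One process can usually do both jobs with a timed read: keep printing, but give the user a short window each pass to type something. A sketch of the idea:
Code:
#!/bin/bash
while true; do
    echo "still working..."
    # wait up to 1 second for a line of input, then carry on either way
    if read -t 1 -r answer; then
        echo "you typed: $answer"
    fi
done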
I'm trying to get a daemon to start automatically using an init bash script (I suppose this is what it is called). This is what I did to 'install' the script:
sudo cp inittestdaemon /etc/init.d/
sudo update-rc.d inittestdaemon defaults 91
sudo chmod 777 /etc/init.d/inittestdaemon
The script didn't work, so I tried running it directly on the terminal and this is the error which i got:
[Code]...
I found, in bash, something similar to 'try/except' in python. I've been using something like this:
Code:
if ! 'command';then
echo 'damn, there was an error'
[code]....
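This pattern is about as close as bash gets to try/except; the other common spelling is an ERR trap. A sketch of both, with some_command standing in for whatever you are running:
Code:
#!/bin/bash
# option 1: test the command's exit status directly
if ! some_command; then
    echo 'damn, there was an error' >&2
fi

# option 2: run a handler whenever a simple command fails
trap 'echo "command failed near line $LINENO" >&2' ERR
some_command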
I have a script that connects to a windows server, downloads a file, appends to it and then re-uploads the updated file. I want to implement error handling. An email is to be generated indicating whether there was an error or not. This email should include all standard and error output as a body. The current script looks something like this:
Code:
function Email_ServerSupport {
for time in once; do
echo "Subject: Billing - smb copy to accounting" $1
cat /tmp/smbx
cat /tmp/smbxerr
done | mail $EMAILADDR
}
/usr/bin/cp /dev/null /tmp/smbx
/usr/bin/cp /dev/null /tmp/smbxerr
cd /tmp
/usr/sfw/bin/smbclient $LOCATION -A $AUTHFILE >>/tmp/smbx <<EOF
get $OUTFILE
exit
EOF
cat $INFILE >> $OUTFILE
/usr/sfw/bin/smbclient $LOCATION -A $AUTHFILE >>/tmp/smbx <<EOF
put $OUTFILE
exit
EOF
cat /tmp/smbx | grep -v "Domain" | grep -v "putting file" | grep -v "getting file" >> /tmp/smbxerr
if [ -s /tmp/smbxerr ]; then
Email_ServerSupport " ERROR"
exit
else
echo "Transfer successful."
Email_ServerSupport " SUCCESS"
fi
The reason for the grep -v's is because, from my understanding, when using smbclient, ALL output goes to stdout, even errors. For this reason, I need to filter out lines including "domain" "putting file" or "getting file", all of which aren't errors. The problem is that even though the script seems to catch errors successfully now and then, the success email ends up blank (/tmp/smbx is somehow empty). I'm also worried it could miss possible errors I haven't tested. I'm thinking it has to do with the way "EOF" functions. Is there any way to capture output from the "put" and "get" commands? I can't simply redirect the output, can I?
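One alternative worth trying, instead of the here-document, is smbclient's -c option, which takes the commands as an argument; that makes it easier to redirect each transfer's stdout and stderr separately and to check the exit status. A hedged sketch using the same variables as above (smbclient's status reporting can be coarse, so the greps may still be needed):
Code:
# run the get as a single smbclient invocation, capturing output and errors separately
/usr/sfw/bin/smbclient "$LOCATION" -A "$AUTHFILE" -c "get $OUTFILE" >>/tmp/smbx 2>>/tmp/smbxerr
if [ $? -ne 0 ]; then
    Email_ServerSupport " ERROR"
    exit 1
fi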
Is there a way to use exec, but if exec fails to go on with the script?
Example:
Code:
#!/usr/bin/env bash
exec startx
echo "Starting of X failed"
If startx fails, the echo should be seen on the screen. I tried all kinds of stuff, but I guess it isn't of much use to post it here. I searched the web, but searching for "exec and bash" in one sentence gives results which are not what I am looking for.
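By default a non-interactive bash exits when exec cannot run the target, which is why the echo never appears. One hedged option is the execfail shell option:
Code:
#!/usr/bin/env bash
shopt -s execfail          # don't exit the script if exec fails to run the file
exec startx
echo "Starting of X failed"
Alternatively, plain `startx || echo "Starting of X failed"` avoids exec entirely, at the cost of keeping the extra shell process around while X runs.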
I have a daemon script which wakes up every 5 minutes and checks the health of started processes. It works fine during the day but throws a syntax error just after midnight. Here is the log:
(02/22-23:49) Check all started processes
(02/22-23:54) Check all started processes
(02/22-23:59) Check all started processes
[code]....
I'm not sure how to title this, but here is what I am trying to accomplish (please keep in mind, this is only my first month playing with ANY Linux programming): my shared web host limits me to running 2 cron jobs or 2 SSH sessions at one time. I need to run more than that, so my solution is to run what I need on my home computer and then push all the results via SSH to my web server.
To keep things timed, I am trying to call 4 bash scripts from inside of 1 bash script... Each bash script has variables I need to export out to the remote (web) server. Being that I can only run 2 SSH or 2 CRON sessions on the remote, it wouldn't do me any good to open up CRON or SSH remotely or locally - either way I'm maxing out. That is why I would like to call 1 final script that takes the output of the 4 bash scripts and does the job.
Main bash calls via CRON every 30 minutes:
Code: ./script_1 &
./script_2 &
./script_3 &
./script_4 &
[code]....
I need to scp the file saved by wget to the remote server. I also need to pass the SQL statement generated in each script as a command in SSH. I'm lost how to get the info from "script_x" into a string that can be used to SSH - and doing this all inside of ONE SSH command. Would I store the SQL strings in a file and call that in the SSH command line?
If so, what is the command to make sure the variable output in the "script_#" file is sent to a file? Can I call the variable from the main bash script? Now, the good news is, I can SSH from my local machine to the remote one. That is about as far as I got. Again, I am so new to this that my ears are still wet. This has been something I have been working on for a while, and I'm just lost at this point.
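One way to stay under the two-session limit is to have each of the four scripts write its piece (the wget file and its SQL) to a local spool directory, and let the master script push everything with a single scp plus a single ssh at the end. A very hedged sketch; the host name, paths, and the remote command are all stand-ins:
Code:
# tail end of the master script, after ./script_1 & ... ./script_4 &
wait                                   # block until all four background scripts finish

# one scp for the downloaded files, one ssh that feeds all the SQL to the remote database
scp /tmp/spool/*.dat user@webhost:/home/user/incoming/
cat /tmp/spool/*.sql | ssh user@webhost "mysql -u dbuser dbname"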
Below are the details of my system. I have bash as my current shell, but some really common commands aren't working.
Do I need to reinstall bash? Or how do I install a selection of the bash commands that I need (for example, a subset of [URL])?
Code:
root@sdptfw:~ # uname -a
Linux sdptfw.sdpt.co.za 2.4.36 #1 Tue Jul 22 13:13:24 GMT 2008 i686 i686 i386 GNU/Linux
root@sdptfw:~ # echo $SHELL$
/bin/bash$
[Code]....
I have a bash script giving me the following error:
[Code]...
When I run it I am getting: "./svnup: line 61: syntax error: unexpected end of file". I can't for the life of me figure out what is wrong. It's a script to export the latest revision from SVN to the web root folder and archive the previous version, basically.
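"Unexpected end of file" at the last line usually means something opened earlier was never closed: an if without fi, a do without done, an unterminated quote or here-document. bash -n parses the script without executing it and can point at the spot; a quick sketch:
Code:
bash -n ./svnup     # syntax check only, nothing is run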
I have the following working script. It checks a directory for txt files; if files are there, it copies them to another directory, otherwise it gives an error. I would like to exclude "file not found" errors and send them to /dev/null. All other errors should go to the email address as usual.
Code:
#!/bin/bash
function err
{
if [[ $? -ne 0 ]]
[Code]....
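A hedged sketch of one way to do the filtering: capture stderr, drop the "No such file or directory" lines, and only mail whatever is left. The paths and address are stand-ins:
Code:
#!/bin/bash
errors=$(cp /srcdir/*.txt /destdir/ 2>&1 >/dev/null | grep -v 'No such file or directory')
if [ -n "$errors" ]; then
    echo "$errors" | mail -s "copy errors" admin@example.com
fi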
I was trying to run a small shell script, but it would not run. I got the error mentioned in the subject.
This is the exact way I was trying to do it.
I presume there is something in the JWM window manager - or Puppy Linux Lupu 5.01 itself that is conflicting with the normal loading of GMPLAYER....
I would like to know how to troubleshoot, and fix this problem if possible (even a workaround would be great).
I have an Ubuntu 10.10 server and want to run a script on it to check if a process is running. If it is not running, the script should start the process and also write to a log file.
When running the script i get the following error message:
syntax error near unexpected token `else'
Here is my script.
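Without seeing which line trips the parser, a missing then (or fi) before the else is the usual suspect. A known-good skeleton for this kind of check looks like the following; the process name and paths are placeholders:
Code:
#!/bin/bash
if pgrep -x myservice > /dev/null; then
    echo "myservice is already running"
else
    /usr/local/bin/myservice &
    echo "$(date): started myservice" >> /var/log/myservice.log
fi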
I would like to know how to print the line number in a script. My requirement: I have a script which is about 5000 lines long. If any errors happen, I just exit, and I would like to include the line number of the script where the error happened.
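bash keeps the current line number in $LINENO, and an ERR trap can report it at the point of failure; a small sketch:
Code:
#!/bin/bash
trap 'echo "error on or near line $LINENO" >&2; exit 1' ERR
echo "this is line 3 ($LINENO)"
false    # triggers the trap and reports this line's number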
Simple bash code:
Code:
#!/bin/bash
trap "echo 'you got me'" SIGINT SIGTERM # to trap ctrl+c
echo "Press ctrl+c during 5 sec loop"
for ((i=0;i<5;i++)); do
[Code]...
How come the code behaves normally (it stops when the Ctrl+C signal is caught and then resumes), but after I use at least one timeout read in the code, it looks like a second caught signal doesn't pause execution but skips the loop? If you remove the -t (timeout) option from the read, both loops behave the same!
I'm trying to use the ${VAR:0:4} substring extraction described here: tldp.org/LDP/abs/html/string-manipulation.html, and it works perfectly if I issue the command in bash. But when I put it in a script file and run it, bash gives me a "bad substitution" error. Does anyone know how to fix it?
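The usual cause of "bad substitution" is that the script is being parsed by a shell other than bash (for example sh or dash via the shebang, or by running it as sh script.sh), since ${VAR:0:4} is a bashism. A quick sketch that should work when run with bash itself:
Code:
#!/bin/bash
VAR="substring test"
echo "${VAR:0:4}"    # prints "subs"
Run it as bash script.sh, or make it executable and call ./script.sh; invoking it as sh script.sh ignores the shebang and uses sh.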
I'm trying to write a bash script to upload an image to [URL], but I can't get it working properly. Every time I try, the HTML returns an error saying "Upload is disabled during short maintenance work (ETA 10 minutes). Brb!", while from the browser everything works fine. This is my current command line:
Code:
curl -L -b cookie-pix.txt -c newcookie.txt www.pixhost.org/cover-upload -A "Mozilla/4.0 (compatible; MSIE 5.01; Windows NT 5.0)" -F 0=@/home/admin/Desktop/karm.jpg -F content_type=0 -F press=Upload
And this is the HTML of the form:
[Code]....
I have a config file that contains:
my.config:
Code:
Now in my bash script, I want to get the output /home/user instead of the literal $HOME once the value is read. So far, I have managed to read the $HOME string, but I can't get it to expand when I echo the variable; all I get is the output $HOME.
Here is my parse_cmd script:
Code:
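One way to turn the literal $HOME read from the file into /home/user is envsubst (from gettext), which expands environment-variable references in its input without the risks of eval. A hedged sketch, with the config key name made up for illustration:
Code:
#!/bin/bash
# my.config is assumed to contain a line like: workdir=$HOME/projects
raw=$(grep '^workdir=' my.config | cut -d= -f2-)   # -> '$HOME/projects'
expanded=$(printf '%s' "$raw" | envsubst)          # -> '/home/user/projects'
echo "$expanded"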
I have written quite a few separate bash scripts and PHP scripts that up to now I have run from cron jobs. However, I have to estimate how long each takes to run before scheduling the next, so it probably takes much longer than necessary to run them all. They have to run in order.
Now there are so many that I am thinking it would be better to have a master bash script that runs one after the other, but I am not sure how to get the master script to wait before starting the next script. Is this possible, and is there a command that will make the script wait between bash and PHP scripts for them to finish before running the next?
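When the scripts are started in the foreground (no trailing &), bash already waits for each one to exit before starting the next, so the master script can be as plain as this; the paths are placeholders:
Code:
#!/bin/bash
/home/user/bin/step1.sh
php /home/user/bin/step2.php
/home/user/bin/step3.sh
Joining them with && instead of newlines additionally stops the chain if one of them fails; wait is only needed if something is deliberately backgrounded.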
I would like to get the filename (without extension) and the extension separately. The best solution I found so far is:
Let FILE="thefilenameofsomefilesfor_instance.txt"
Code:
NAME=`echo "$FILE" | cut -d'.' -f1`
EXTENSION=`echo "$FILE" | cut -d'.' -f2`
I think it would be better to count the length and remove the last 3 characters to get the extension, but some Macintosh filenames have 4-character extensions.
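bash's parameter expansion handles this without counting characters, and it copes with both 3- and 4-character extensions; a sketch using the same variable names:
Code:
FILE="thefilenameofsomefilesfor_instance.txt"
NAME="${FILE%.*}"        # everything before the last dot
EXTENSION="${FILE##*.}"  # everything after the last dot
echo "$NAME / $EXTENSION"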