Here is a fragment of a bash script I have that sends an email when a backup is made. Basically a user runs the script - it asks for the job number and I read that from standard input. I do a tar of the drive to tape and I want to send an email saying that job such-and-such has happened. My problem is the subject of the email. I can set it - but if I try to use a variable (such as $jobNum) that does work in the mail body, the subject comes out blank. I have also tried to build a text file and shove that into the subject. My question is: how can I use a variable in the subject?
# email subject ### so I have tried building and reading a tmp file - doesn't work
EMAILSUBJECT="/tmp/emailsubject.txt"
echo "backup job" $jobNum > $EMAILSUBJECT
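For the subject itself, a minimal sketch of what usually works (assuming the mail/mailx on the box accepts -s for the subject; the address is a placeholder). The key point is that the subject has to be built with double quotes so the shell expands $jobNum:

Code:
#!/bin/bash
# read the job number from standard input
read -r -p "Job number: " jobNum

# ... tar of the drive to tape would happen here ...

# double quotes let the shell expand $jobNum in the subject;
# single quotes would leave it literal and the subject would look wrong
echo "Backup job $jobNum finished on $(date)" \
    | mail -s "backup job $jobNum" admin@example.com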
I'm writing a bash script that executes a few perl scripts. One of the perl scripts that I need to execute requires two arguments. The arguments are stored in a txt file; each line contains a hostname and its corresponding IP address separated by a ":" (colon). The txt file looks like this:
[Code]...
I'm not sure if it's the best way to accomplish this, but here goes. In the bash file, let's call it getHosts.sh, I create an array and assign each line of the file to an element of that array. I then think I need to create a new array where I take the hostname (the part before the ":"), separate it from its IP address, and place the IP address on a new line just below the hostname (this way I can reference it like $hostNames[$x] for the hostname and $hostNames[$x+1] for its IP address). So the new array would now look like this below:
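The interleaved array the post describes isn't shown here; as an alternative, a small sketch of reading the hostname:IP pairs straight from the file and calling the perl script with its two arguments (the file name and script name are assumptions):

Code:
#!/bin/bash
# getHosts.sh - feed each hostname/IP pair to the perl script
while IFS=: read -r host ip; do
    [ -z "$host" ] && continue          # skip blank lines
    perl someScript.pl "$host" "$ip"
done < hosts.txt

Reading the pairs this way avoids the extra bookkeeping of keeping hostnames and addresses at alternating array indexes.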
I'm trying to write a script that will continuously ping a server and send out an email when the server goes down, and another when it comes back up, then continue monitoring. I would prefer not to run this from cron, because I don't want the script running as multiple instances.
For example: ping a server every minute.
- If successful, do nothing.
- If the ping fails, send out an email stating that the server is down.
- Once the ping succeeds again, send out an email stating that the server is up.
I only want it to send one email after a failure, so the end user isn't getting an email every minute while it stays down. Once the ping is successful again, send one email stating that the server is up. Then continue to ping, and if it fails again, repeat the process.
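A minimal sketch of such a loop (server name, address, timeout and interval are placeholders; assumes a working mail command and Linux iputils ping options):

Code:
#!/bin/bash
# remember the last known state so each transition sends exactly one email
server=server.example.com
admin=admin@example.com
state=up

while true; do
    if ping -c 1 -W 5 "$server" >/dev/null 2>&1; then
        if [ "$state" = down ]; then
            echo "$server is back up at $(date)" | mail -s "$server UP" "$admin"
            state=up
        fi
    else
        if [ "$state" = up ]; then
            echo "$server is down at $(date)" | mail -s "$server DOWN" "$admin"
            state=down
        fi
    fi
    sleep 60
done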
I wrote this script for bash & perl. If you run it in bash it should work. It changes title - (uuid) kernel - initrd ... to title - uuid UUID=the_uuid... kernel - initrd .... When I wrote it I replaced the ends of lines by . It's the second $block definition. But now I need to repair it, because I will work with the first $block definition - that is, not stripping the ends of lines but leaving them untouched. Now, when you comment out the second $block definition, the code does not work. What do I have to do to make it work with multiline input data?
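Since the $block definitions themselves aren't shown here, only a generic sketch of how Perl usually copes with multiline input: slurp the whole text into one scalar and use the /s (and /m) regex modifiers so a match can cross line boundaries. The pattern below is purely illustrative, not the original one.

Code:
#!/usr/bin/perl
use strict;
use warnings;

local $/;                 # slurp mode: read all of STDIN as one string
my $block = <STDIN>;

# /s lets "." match newlines, so one substitution can span several lines
$block =~ s/title\s+(.*?)\s+kernel/title $1\nkernel/s;   # placeholder pattern

print $block;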
I wanted to run bash and perl scripts that require SU privileges by clicking on the desktop. The terminal window opens and closes so fast that I can't see what happened.
The scripts work from a terminal window when I run:

sudo perl file.pl
sudo bash file.sh
The Perl scripts have the header #!/usr/bin/env perl or #!/usr/bin/perl -w, and the bash scripts have the header #!/bin/bash.
How can I run them from desktop shortcuts with SU privileges so that the terminal does not close after execution? Shouldn't the scripts work without naming perl or bash, since they have the header?
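One common arrangement (a sketch; the paths are placeholders and the exact launcher option differs between desktops) is to point the shortcut at a small wrapper that is marked "run in terminal" (Terminal=true in a .desktop launcher) and that waits for a keypress before the window closes:

Code:
#!/bin/bash
# wrapper the desktop shortcut runs in a terminal; keeps the window open
sudo /home/user/file.sh        # path is a placeholder
read -rp "Press Enter to close this window"

As for the header: the shebang is only honoured when the script file is executable (chmod +x file.pl) and is invoked by path, e.g. sudo ./file.pl - then there is no need to name perl or bash on the command line.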
I've been trying to figure out a way to color text in Perl as easily as I do in bash on a Linux box. In bash, I set up color variables equal to the escape sequences, then echo them out to print the text exactly how I want it. Typically I want a character or a word in a different color, not the whole line. For example:
echo -n -e "My face is turning ${RED}red${UNCOLOR} like a lobster."

In Perl with the Term::ANSIColor module, it seems to only do a whole line. Am I being dense? Is there a way I can do it like I do in bash that's fairly easy to read after the fact?
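Term::ANSIColor can colour arbitrary substrings, not just whole lines; a short sketch of two ways that stay close to the bash style (the module ships with Perl, so nothing extra should be needed):

Code:
#!/usr/bin/perl
use strict;
use warnings;
use Term::ANSIColor qw(color colored);

# colour one word inline
print "My face is turning ", colored("red", 'red'), " like a lobster.\n";

# or build the same kind of variables used in the bash version
my $RED     = color('red');
my $UNCOLOR = color('reset');
print "My face is turning ${RED}red${UNCOLOR} like a lobster.\n";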
I've recently inherited a bunch of files at a new job and am trying to figure out some of the problems that constantly pop up. The one giving me a huge headache comes from a bash script that is supposed to change a date format from a client-populated txt field into the format we want. Everything in the script works fine except that one function. Below is the line I'm trying to manipulate, with date examples.
The one caveat is that the first date is non-static and changes daily. It is, however, always the current date. If it helps, the second date will always be a year away from the first date. My idea was to pull the current date via perl's DATE function, but...how do I do it, and calculate a year ahead, without throwing the rest of the bash script off? Any help would be appreciated. I'm sure it's a simple solution, but I know absolutely nothing about these scripts and how they were written.
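Since the original line isn't shown, just a sketch of getting "today" and "today plus one year" without leaving bash (assumes GNU date, which Linux distributions normally ship; the output format is a guess):

Code:
#!/bin/bash
# current date and the same date one year out, formatted as YYYY-MM-DD
today=$(date +%Y-%m-%d)
next_year=$(date -d "+1 year" +%Y-%m-%d)
echo "$today  $next_year"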
This is meant to be a script for renaming a lot of files automatically. So I put the list of files in an array named @lista. But, as you can see, at the end of the command I use a sed filter to print a backslash before the spaces in file names that contain them, so those paths can be interpreted correctly.
But there's no way I can get it to print a backslash. It works well when I use Perl's own substitution s///, but I need every path in the array to be fixed.
I'd like to add that the bash command works perfectly well alone. I mean outside the Perl script.
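A sketch of doing the escaping inside Perl itself, so no external sed call is needed (the file names are placeholders; quotemeta escapes every shell-special character, not just spaces):

Code:
#!/usr/bin/perl
use strict;
use warnings;

my @lista = ('plain.txt', 'file with spaces.txt');   # placeholder file list

# escape just the spaces, modifying @lista in place ...
s/ /\\ /g for @lista;

print "$_\n" for @lista;

# ... or, instead, escape everything the shell could misinterpret:
# my @escaped = map { quotemeta } @lista;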
I've been asked to do certain jobs with scripts in bash and perl. This time they asked me to check which users haven't been able to log in. DONE, with Code: lastb. Now they have asked me to show how many times all the users have entered certain bash commands like
Code:
ls
cd
pwd

I was wondering if there was a command or something that could show me how many times those commands have been used. I already know I can see all input with Code: history. Sorry, I'm really new to this - I've only been working with it for a couple of weeks, and it's really interesting.
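A rough sketch that counts those commands across the users' bash history files (assumes the histories live in /home/*/.bash_history and are readable by whoever runs this, which may not hold everywhere):

Code:
#!/bin/bash
# count how often ls, cd and pwd start a line in any user's history
for cmd in ls cd pwd; do
    count=$(cat /home/*/.bash_history 2>/dev/null | grep -cE "^[[:space:]]*$cmd( |$)")
    echo "$cmd: $count"
done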
I need to find a way to download the attachment from a daily report e-mail sent to me. The kicker is that it will need to be done with a crontabbed bash script. For example, which Linux-based CLI client is best suited to being scripted?
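One possible combination (an assumption, not the only option): fetch the message with fetchmail or similar, then pull the attachment out with munpack from the mpack package; the paths below are placeholders:

Code:
#!/bin/bash
# sketch for a cron job: extract MIME attachments from an already-fetched message;
# how the mail ends up in $MSG (fetchmail, procmail, ...) is a separate step
MSG=/var/mail/reports/latest.eml   # hypothetical message file
DEST=/srv/reports                  # hypothetical destination directory
munpack -q -C "$DEST" < "$MSG"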
I want to find and replace a string in a perl file. I have written a bash script which runs the following command:
perl -pi -e "s/$findstring/$replacestring/" testfile

where

$findstring    = print F_WC_TMP"$line ";
$replacestring = $line = join ' ', split ' ', $line; print F_WC_TMP"$line ";
But when I run the above command, I think the shell expands $findstring into the string shown above, and since that contains $line, Perl then looks for the variable $line instead of the literal text. I am confused about how to search for a string that contains a $ and replace it with another string that also contains a $.
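One approach that sidesteps both layers of interpolation (a sketch using the strings from the post; the quoting of REPL is the fiddly part): pass the text through environment variables and wrap the pattern in \Q...\E so Perl treats $ and the other metacharacters literally:

Code:
#!/bin/bash
# the shell never expands these inside the perl one-liner, and \Q...\E
# stops perl from treating "$line" as a variable in the pattern
export FIND='print F_WC_TMP"$line ";'
export REPL="\$line = join ' ', split ' ', \$line; print F_WC_TMP\"\$line \";"
perl -pi -e 's/\Q$ENV{FIND}\E/$ENV{REPL}/' testfile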
I am trying to fix a perl script, and I really suck at perl. But I think this problem will be easy for people who know it.
The problem is, I have an old setup script someone wrote many years ago. It fails if the standard shell is dash rather than bash. The only way I've gotten it to work is to point /bin/sh at bash. I looked through the script and it uses "system" in many places, and I think that's the problem.
I searched for it and found this link:url
My plan is to include this function:
Code:
sub system_bash {
    my @args = ( "bash", "-c", shift );
    system(@args);
}

Then I could simply change all calls to system into system_bash and it should work?
The parameter to the system calls is usually some variable. What if the parameter is a list already? Do I need to test for it somehow, and if it's a list, prepend "bash" and "-c" to the list? How do I do that?
In the script there are lots of places like this:
my $error = system($cmd);
if ($error) { die/warn "some error message"; }
Shouldn't there be a return in the system_bash function?
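On the return question: the wrapper as posted does return the value of system() already, because a Perl sub returns its last evaluated expression, but it reads better made explicit. Handling the "already a list" case is also short; a sketch:

Code:
sub system_bash {
    my @args = @_;

    # a single string is run through bash -c, so bash-only syntax keeps
    # working even when /bin/sh is dash; a list is assumed to already be
    # program-plus-arguments and is passed straight through
    return system('bash', '-c', $args[0]) if @args == 1;
    return system(@args);
}

Calls like my $error = system_bash($cmd); then behave exactly like the original system() calls.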
Just recently I started using a cluster at my school to run some heavy tasks, which can take long periods to get done. The thing is that I can only submit one task after the previous one has completed (cluster use policies), and since I don't know in advance how much time each task will take, I'd like a way of knowing when a given task is done (other than checking it repeatedly). I am allowed to submit the jobs as bash scripts, so that's what I have at my disposal.
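Since the jobs are submitted as bash scripts, one simple option (a sketch; the task, the address and the availability of a mail command on the cluster are assumptions) is to let the script notify you itself when it finishes:

Code:
#!/bin/bash
# run the heavy task, then mail yourself the result when it ends
./heavy_task input.dat                       # placeholder for the real job
status=$?
echo "Job finished with exit status $status on $(hostname) at $(date)" \
    | mail -s "cluster job done" you@example.com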
I'm not sure if this is best done in Perl or bash. I'm thinking surely someone else has created something close to what I'm looking for. The result would be that someone kicks off "linux_hosts.sh", or whatever you want to call it, then sees a top "folder" of options (with hosts contained within each of these top menu choices); then, based on the number they pick at that top level, they're presented with the set of Linux hosts relevant to that top-level name. Example:
$ linux_hosts.sh
1. VMware hosts    4. Private Domain
2. ESX servers     5. Red Hat boxes
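Bash's built-in select produces exactly this kind of numbered menu; a sketch with placeholder categories and hosts:

Code:
#!/bin/bash
PS3="Pick a category: "
select category in "VMware hosts" "ESX servers" "Private Domain" "Red Hat boxes"; do
    [ -n "$category" ] || continue                     # ignore invalid numbers
    case $category in
        "VMware hosts") hosts=(vmhost01 vmhost02) ;;   # placeholder hosts
        "ESX servers")  hosts=(esx01 esx02) ;;
        *)              hosts=(box01 box02) ;;
    esac
    PS3="Pick a host: "
    select host in "${hosts[@]}"; do
        [ -n "$host" ] && { echo "You chose $host"; break 2; }
    done
done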
I'm trying to send an email using mailx, in a bash script, but I can't get it to work. In the terminal I can, and this is how I do it:
Code:
$ mailx johndoe@gmail.com
Message
^D
Within seconds of doing this, I get sent an email. The problem is with the bash script I'm trying to make. Among other things, I tried this:
Code:
#!/bin/bash
mailx johndoe@gmail.com < "Message"

I honestly don't know what I'm supposed to do, and I've Googled a bunch of things too without much luck. Is there anyone who could help me out?
Edit: Figured it out. This is what I did, and it works for me:
Code:
#!/bin/bash
echo "Message" | mail -s "Subject" "johndoe@gmail.com"
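The earlier attempt failed because < "Message" redirects input from a file literally named Message rather than sending the word as the body. For a multi-line body the working pattern extends naturally with a here-document (same placeholder subject and address):

Code:
#!/bin/bash
mail -s "Subject" "johndoe@gmail.com" <<EOF
First line of the message
Second line of the message
EOF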
In my Windows environment, I use an email client such as Microsoft Outlook to connect to our email server and send email with the following configuration:
Incoming server (POP3): 995 (requires SSL)
Outgoing server (SMTP): 465 (uses an encrypted SSL connection)
[code]....
And the mail server requires a user ID and password to log in. How do I set up a text/command-based email client on my Linux (CentOS 5.1) machine to send email through the existing email server above, which is on another machine? The client has to be command based because I need to send notification email from the command line from another application installed on the same Linux (CentOS 5.1) box. Since it will only be used to send notification email, I don't need to set up an email server on my Linux machine.
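One lightweight possibility (an assumption - other clients would also do) is msmtp, which speaks authenticated SMTP over SSL on port 465 and is driven entirely from the command line; on CentOS 5.1 it may have to come from a third-party repository. Host, user and password below are placeholders:

Code:
# ~/.msmtprc - minimal sketch for authenticated SMTP over SSL on port 465
account default
host smtp.example.com
port 465
auth on
user someuser@example.com
password secret
tls on
tls_starttls off
from someuser@example.com

A notification could then be sent from another application with something like printf 'Subject: backup failed\n\ndetails here\n' | msmtp recipient@example.com.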
Using CentOS. I have a cron job set up to run this command:

Code:
/var/test.sh | mail -s "Test Cron" mr182@somewhere.com

The email is sent but the output of the script is not in the email body - it's just blank. I know there is some output because there are some echo statements in the script. I don't want to get an email for all cron jobs, just this one.
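Two things worth checking (a sketch, not a certain diagnosis): output the script writes to stderr never enters the pipe, and cron's own mail for the other jobs can be silenced with an empty MAILTO:

Code:
# crontab sketch (schedule is a placeholder)
MAILTO=""
# 2>&1 sends stderr into the pipe too, so everything the script prints lands in the body
0 6 * * * /var/test.sh 2>&1 | mail -s "Test Cron" mr182@somewhere.com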
I have an email setup where all of the emails from our email host are downloaded to our Linux server using Fetchmail. Then some of the incoming emails are sent on to an MS Exchange server (server1.domain.com) using Postfix. What I want to do is send a copy of all emails to another server (server2.domain.com) for redundancy. Can Postfix be configured to send copies out to both?
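Postfix's always_bcc parameter copies every message passing through the instance to one extra address; a sketch, with a placeholder address that would be delivered on the second server:

Code:
# /etc/postfix/main.cf
always_bcc = archive@server2.domain.com

For finer control, sender_bcc_maps and recipient_bcc_maps provide per-address versions of the same idea.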
If I use "#!/usr/bin/perl" at the beginning of a perl script, the script won't work if it is at all complicated. Simple scripts like "Hello World" work.
But if I use "#!/usr/bin/perl -w" at the beginning, all scripts work?
If I don't use the -w, this is what's in the logs:
(2)No such file or directory: exec of '/home/test.net/html/cgi-bin/uh/meny.pl' failed
Premature end of script headers: meny.pl
When I use the -w in the script, the error log shows me this:
meny.pl: Name "main::http_path_cgi" used only once: possible typo at /home/test.net/html/cgi-bin/uh/meny.pl line 24.
I am new here and want to learn CentOS. Currently I have CentOS 5.5 x64 and Perl 5.8.8 installed. Now I have installed Perl 5.12.1, which is located at /usr/local/bin/perl. But how can I make /usr/bin/perl point to it, so that root is based on Perl 5.12.1?
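A cautious sketch (and an assumption about intent): CentOS's own tools depend on the system perl in /usr/bin, so overwriting that binary is risky; putting the new interpreter first in PATH, or naming it in the shebang, usually achieves the same effect:

Code:
# put the newer perl ahead of the system one, e.g. in ~/.bash_profile
# or /etc/profile.d/perl512.sh (file name is a placeholder)
export PATH=/usr/local/bin:$PATH

# individual scripts can also ask for it explicitly:
#   #!/usr/local/bin/perl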