Quick guidance on running a bash script using Perl: the script is called archive_web.sh. The regular script I have and use now (archive.sh) creates an archive directory (using the current date as the directory name) and moves all existing files there. The script is cronned to run every day before midnight. I'd like to be able to run this "archiving script" manually, from the web, at my own discretion, rather than necessarily wait till midnight (but I want it to run then, too, in case any new files present themselves in the directory). These shell scripts sit in an .htaccessed directory. Talk about making up verbs all over the place.
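A minimal sketch of one way to do this, assuming the .htaccess-protected directory allows CGI execution and that the wrapper name archive_web.pl and the path to archive.sh are placeholders: a tiny Perl CGI that simply invokes the existing shell script and reports the result. Cron can keep calling archive.sh at midnight unchanged.
Code:
#!/usr/bin/perl
# archive_web.pl - hypothetical CGI wrapper around the existing archive.sh
use strict;
use warnings;

print "Content-type: text/plain\n\n";

# Use an absolute path: CGI processes get a very minimal environment.
my $rc = system('/bin/bash', '/path/to/archive.sh');

if ($rc == 0) {
    print "Archive run completed.\n";
} else {
    print "Archive run failed (exit status ", $rc >> 8, ").\n";
}
Note that the web-server user (apache, www-data, or similar) needs write permission on the directory being archived, otherwise the manual run will fail where the cron run succeeds.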
How do I obtain information about the running X server from a remote shell session? I want to know things like resolution, color depth, etc. My xorg.conf is basically empty. The only thing I can think of doing is to read the Xorg.0.log file, which seems inefficient.
I thought that 'xrandr' displayed some text output, but that behavior seems to have changed (?). It appears to require an X display. Is there another way? Something I could incorporate into a shell script? (This is Fedora, but that shouldn't really matter.)
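One possibility, sketched here in Perl and resting on two assumptions (the X server runs on display :0, and the shell account can read the display owner's .Xauthority file): point DISPLAY at the running server and parse xdpyinfo's output for the fields you want. The same environment trick lets xrandr print its usual text output from a remote session.
Code:
#!/usr/bin/perl
# Query a running X server from a non-X shell session (assumes display :0).
use strict;
use warnings;

$ENV{DISPLAY}    = ':0';
$ENV{XAUTHORITY} = "$ENV{HOME}/.Xauthority";   # adjust if X runs as another user

my $info = `xdpyinfo 2>&1`;
die "xdpyinfo failed: $info" if $?;

my ($res)   = $info =~ /dimensions:\s+(\d+x\d+)/;
my ($depth) = $info =~ /depth of root window:\s+(\d+)/;

print "resolution:  ", defined $res   ? $res   : 'unknown', "\n";
print "color depth: ", defined $depth ? $depth : 'unknown', "\n";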
From the output of the nmap localhost | grep mysql command I can tell whether the MySQL server is running or not. My problem is that I want to test whether the MySQL server is running from within a shell script, and if it is running, I need to tell the user to stop the server before running the shell script.
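A small sketch of that check, written in Perl here and assuming MySQL listens on the default port 3306 on localhost; the same logic translates directly to a shell test around nmap or mysqladmin ping.
Code:
#!/usr/bin/perl
# Refuse to continue if a MySQL server is reachable on localhost:3306.
use strict;
use warnings;
use IO::Socket::INET;

my $sock = IO::Socket::INET->new(
    PeerAddr => '127.0.0.1',
    PeerPort => 3306,
    Timeout  => 3,
);

if ($sock) {
    close $sock;
    die "MySQL server appears to be running. Please stop it before running this script.\n";
}
print "MySQL is not running, continuing...\n";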
I am new here and want to learn CentOS. Currently I have CentOS 5.5 x64 with Perl 5.8.8 installed. Now I have installed Perl 5.12.1, which is located at /usr/local/bin/perl. But how can I make /usr/bin/perl point to it, so that root is based on Perl 5.12.1?
I want to run/call a shell script from Perl, and I used this: system( "sh", "script.sh", "--help" ); The script.sh contains an iptables firewall script, but it does not run. I tested script.sh with something simple, such as only creating a chain, but even that does nothing. How do I run a shell script from a Perl program?
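A minimal sketch (the path is a placeholder): call the script by its absolute path and inspect what system() actually returned, since a silent failure usually means the script was not found or not permitted rather than that iptables itself misbehaved.
Code:
#!/usr/bin/perl
use strict;
use warnings;

# Use an absolute path; the current directory of a Perl program is often not
# where script.sh lives.
my @cmd = ('/bin/sh', '/full/path/to/script.sh', '--help');
my $rc  = system(@cmd);

if ($rc == -1) {
    die "Could not launch @cmd: $!\n";
} elsif ($rc != 0) {
    die "@cmd exited with status ", $rc >> 8, "\n";
}
If the script manipulates iptables, it also has to run as root; a non-root Perl process will happily launch it, but every iptables command inside it will then fail.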
I use LDAP for authentication. I have a Perl script that users use to log in (contact.pl), and this script then calls a shell script (ip.sh) to create an iptables chain.
When I call it from contact.pl, it just prints "usr/sbin/iptables" but the iptables chains don't change. What is the problem? Whenever I run ip.sh from bash directly, it works correctly.
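In this kind of setup the usual culprits are privileges and PATH: contact.pl runs as the web-server user, which may not find iptables on its PATH and certainly cannot modify chains without root. A hedged sketch, assuming ip.sh lives at /usr/local/bin/ip.sh and that a sudoers rule has been added letting the web-server user run exactly that script without a password:
Code:
# Inside contact.pl: call ip.sh through sudo and report failures instead of
# silently ignoring them. Paths and the sudoers rule are assumptions.
my $rc = system('/usr/bin/sudo', '/usr/local/bin/ip.sh');

if ($rc != 0) {
    # This ends up in the web server's error log.
    warn "ip.sh failed (exit status ", $rc >> 8, "): check the sudoers entry ",
         "and make sure the script calls /sbin/iptables by its full path\n";
}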
I recently set up an automated shell script (bash on Ubuntu 8.10) to download new files from a server using ftp. Unfortunately the other end of the link is not terribly stable (and there is nothing I can do about this) which has resulted in the script hanging sometimes and then being kicked off again at the time set in the crontab.
This has resulted in multiple hung sessions taking up all the system resources.
The offending section of code is given below.
Code:
I'd like to know if there is a way I can force an exit if the connection hangs, or alternatively whether something like Perl's Net::FTP can handle these sorts of errors internally.
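Net::FTP can indeed put an upper bound on a stalled connection through its Timeout option, and wrapping the transfer in eval/alarm catches hangs that slip past it. A minimal sketch (host, credentials and file name are placeholders):
Code:
#!/usr/bin/perl
use strict;
use warnings;
use Net::FTP;

my $host = 'ftp.example.com';   # placeholder

my $ftp = Net::FTP->new($host, Timeout => 120, Passive => 1)
    or die "Cannot connect to $host: $@\n";

$ftp->login('user', 'password') or die "Login failed: ", $ftp->message;
$ftp->binary;

# Belt and braces: abort the whole transfer if it takes longer than 10 minutes.
eval {
    local $SIG{ALRM} = sub { die "transfer timed out\n" };
    alarm 600;
    $ftp->get('newfile.dat') or die "get failed: ", $ftp->message;
    alarm 0;
};
die $@ if $@;

$ftp->quit;
A simple lock file around the whole job would also stop overlapping cron runs from piling up when one of them does hang.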
I want to compare the following two tab-delimited .txt files (both are subsets of the original files) by comparing columns 3 and 4 simultaneously. It is easy to compare C3 because both C3s are just numbers, but how do I compare the C4s? Basically, in File1, "G,G" = G in File2, "C,C" = C in File2, "A,A" = A in File2, and "T,T" = T in File2. An A/T in column 4 of File2 just equals "A,T" or "T,A" in column 4 of File1, C/T in column 4 of File2 equals "C,T" or "T,C" in column 4 of File1, and so on.
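One way to do it, assuming both files are tab-delimited, that column 3 is a position shared between the files, and that File1's column 4 holds two comma-separated alleles while File2's column 4 holds either a single allele or two slash-separated ones. Normalising both into a sorted allele set makes "G,G" match "G" and "T,A" match "A/T".
Code:
#!/usr/bin/perl
# Usage: perl compare.pl File1.txt File2.txt
use strict;
use warnings;

my ($file1, $file2) = @ARGV;

# Turn "G,G" / "A,T" / "A/T" / "G" into a canonical sorted allele string.
sub norm {
    my %seen = map { $_ => 1 } split /[,\/]/, shift;
    return join '/', sort keys %seen;
}

my %f2;
open my $fh2, '<', $file2 or die "Cannot open $file2: $!";
while (<$fh2>) {
    chomp;
    my @c = split /\t/;
    $f2{ $c[2] } = norm($c[3]);      # key on column 3, store normalised column 4
}
close $fh2;

open my $fh1, '<', $file1 or die "Cannot open $file1: $!";
while (<$fh1>) {
    chomp;
    my @c = split /\t/;
    next unless exists $f2{ $c[2] };
    my $match = norm($c[3]) eq $f2{ $c[2] } ? 'MATCH' : 'DIFFER';
    print join("\t", $c[2], $c[3], $f2{ $c[2] }, $match), "\n";
}
close $fh1;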
When I execute the command from the shell command line, it works with no error code. If I run the exact same command from a Perl file, it fails with code 32512. The file is created by the same Perl script that runs the command that fails; the file permission is 0664.
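32512 is 127 shifted left by 8 bits, i.e. the shell's "command not found" status, so the command that works interactively is probably not on the PATH that the Perl process sees (cron jobs and daemons often get a much shorter PATH). A small sketch of how to decode the status and sidestep the PATH problem; the command path is a placeholder:
Code:
#!/usr/bin/perl
use strict;
use warnings;

# Use the command's absolute path instead of relying on PATH.
my $rc = system('/usr/local/bin/somecommand', 'arg1');   # placeholder command

if ($rc == -1) {
    die "Failed to start command: $!\n";
} else {
    my $exit   = $rc >> 8;        # 127 here means "command not found"
    my $signal = $rc & 127;
    print "exit status $exit, signal $signal\n";
}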
I am trying to fix a perl script, and I really suck at perl. But I think this problem will be easy for people who know it.
The problem is, I have an old setup script someone wrote many years ago. It fails if the standard shell is dash and not bash. The only way I've gotten it to work is to point /bin/sh to bash. I looked through the script and it uses "system" in many places, and I think that's the problem.
I searched for it and found this link: url
My plan is to include this function:
Code:
sub system_bash {
    my @args = ( "bash", "-c", shift );
    system(@args);
}
Then I could simply change all calls to system into system_bash and it should work?
The parameter to the system calls is usually some variable. What if the parameter is a list already? Do I need to test for it somehow, and if it's a list, prepend "bash" and "-c" to the list? How do I do that?
In the script there are lots of places like this:
Code:
my $error = system($cmd);
if ($error) { die/warn "some error message"; }
Shouldn't there be a return in the system_bash function?
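Perl subroutines return the value of their last expression, so the wrapper already hands system()'s status back implicitly; an explicit return just makes that intent visible. Handling the case where the caller passes a list (per the earlier question) could look roughly like this sketch:
Code:
sub system_bash {
    my @args = @_;

    # bash -c expects a single command string; join list arguments with spaces.
    my $cmd = @args > 1 ? join(' ', @args) : $args[0];

    return system('bash', '-c', $cmd);
}

# Existing call sites keep working unchanged, e.g.:
my $error = system_bash('echo $BASH_VERSION');
warn "command failed with exit status " . ($error >> 8) if $error;
Joining with spaces loses any quoting the caller relied on, so it is only safe when the list elements are plain words; otherwise the call sites need to be adjusted to pass a single string.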
Is there a recursive shell or Perl script to delete files that have the same name as their parent folder? I wish to pass the starting folder name as an argument to the script.
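A sketch in Perl using File::Find, taking the starting folder as its only argument and assuming the match is on the exact file name (no extension handling); swap the unlink for the print alone first to dry-run it.
Code:
#!/usr/bin/perl
# Usage: perl delete_matching.pl /path/to/start/folder
use strict;
use warnings;
use File::Find;
use File::Basename;

my $start = shift @ARGV or die "Usage: $0 <starting folder>\n";

find(sub {
    return unless -f $_;                       # files only
    my $parent = basename($File::Find::dir);   # name of the containing folder
    if ($_ eq $parent) {
        print "deleting $File::Find::name\n";
        unlink $_ or warn "could not delete $File::Find::name: $!\n";
    }
}, $start);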
I'm using Ubuntu 10.10, and I just installed findimagedupes (plus dependencies) from the repositories. I get the same error message no matter how I invoke it:
Code:
$ findimagedupes -R -- images/
Can't locate findimagedupes/C.pm in @INC (@INC contains: /usr/lib/findimagedupes/lib /etc/perl /usr/local/lib/perl/5.10.1 /usr/local/share/perl/5.10.1 /usr/lib/perl5 /usr/share/perl5 /usr/lib/perl/5.10 /usr/share/perl/5.10 /usr/local/lib/site_perl .) at /usr/bin/findimagedupes line 41.
I have a shell script which runs Perl jobs. The script starts the Perl jobs when it is executed manually from the command line, but when the same script runs from crontab it does not start the Perl jobs. I have these things at the beginning of the script.
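Whatever those header lines set, cron gives scripts a very small environment (typically PATH=/usr/bin:/bin and none of your custom variables), which is the usual reason something starts fine by hand but not from crontab. A quick way to see the difference is a throwaway Perl diagnostic run both ways; the log path is a placeholder.
Code:
#!/usr/bin/perl
# envdump.pl - run this once from your shell and once from cron, then diff the
# logged entries to see which variables the Perl jobs are missing.
use strict;
use warnings;

open my $log, '>>', '/tmp/envdump.log' or die "cannot write log: $!";
print {$log} scalar(localtime), " invoked by uid $<\n";
print {$log} "$_=$ENV{$_}\n" for sort keys %ENV;
print {$log} "----\n";
close $log;
Once you know what is missing, either export it at the top of the shell script or call the Perl jobs with absolute paths.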
I want to run bash and Perl scripts which require SU privileges by clicking on the desktop. A terminal window opens and closes so fast that I can't see what happened.
The scripts work in a terminal window when I run them with sudo perl file.pl or sudo bash file.sh.
The Perl script has this header: #!/usr/bin/env perl or #!/usr/bin/perl -w
The bash script has the header #!/bin/bash
How can I run them from desktop shortcuts with SU privilege so that the terminal does not close after execution? Shouldn't the scripts work without my having to type perl or bash, since they have the header?
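The shebang line only takes effect once the file is executable (chmod +x), and even then sudo is still needed for the privileged parts, so the shortcut has to open a terminal, ask for the password, and then pause instead of closing. One hedged way to do that is a small Perl wrapper launched inside a terminal by the shortcut; the Exec line and file paths below are assumptions.
Code:
#!/usr/bin/perl
# run_as_root.pl - launch from a desktop shortcut with something like
#   Exec=x-terminal-emulator -e perl /home/user/run_as_root.pl
# so the terminal stays open until you press Enter.
use strict;
use warnings;

my $rc = system('sudo', 'bash', '/home/user/file.sh');   # or: sudo perl file.pl
warn "script exited with status ", $rc >> 8, "\n" if $rc;

print "Press Enter to close this window... ";
<STDIN>;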
I'm getting something wrong trying to run commands (both simple and piped) in shells from Perl programs. The ultimate objective is to set up "copy X selection to clipboard" from urxvt, but apparently even simple debugging statements are not working. Here's the Perl, taken from here and modified to use xclip instead of xsel, with debugging added.
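Since the original snippet is not reproduced here, a standalone test of the piped half may help isolate the problem: open a pipe to xclip from Perl and make sure DISPLAY is set, because both urxvt extensions and debugging prints can fail silently when the command at the far end of the pipe never starts. A minimal sketch, assuming xclip is installed:
Code:
#!/usr/bin/perl
# Pipe a string into xclip from Perl.
use strict;
use warnings;

$ENV{DISPLAY} ||= ':0';

my $text = "hello from perl\n";

open my $clip, '|-', 'xclip', '-selection', 'clipboard'
    or die "could not start xclip: $!\n";
print {$clip} $text;
close $clip or die "xclip failed: exit status ", $? >> 8, "\n";

print STDERR "copied ", length($text), " bytes to the clipboard\n";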
I am using Ubuntu 10.04 server, 64-bit AMD, with Fluxbox. After I ran MATLAB in a shell (without the GUI), the shell no longer displays characters, but it will still execute any command; I just can't see the characters that I'm typing. I use aterm and xterm. Does anybody know why that is? Am I missing a package?
I have a shell script that I want to be able to run from the terminal just by typing "topcat", regardless of where I am or what user I'm under. How would I go about changing the bash configuration files (if I have to) in order for that to work, for me and for the other users on the computer (I have root access)? As it stands right now, I have to type "/bin/topcat/./topcat" in order for it to execute.
I am trying to access an aspx page which, when accessed with certain parameters in the URL, downloads a file. I need to do this with a shell script, rather than interactively. I tried using wget, but I get a response of 302 from the server, which redirects me to the default page and then downloads the default page html itself. I quickly tried curl which seemed to be doing the same thing. It works perfectly from browsers on either Linux or Windows. Originally, I had a problem with interpretation of ampersands in the URL, but I put quotation marks around the URL, so that isn't the problem now.
I cannot fix this on the server side, because the aspx page will be accessed on a variety of servers which are probably all set differently and which aren't under our control.
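A 302 followed by the default page usually means the server issues a session cookie along with the redirect, and a client that neither stores the cookie nor re-sends it ends up on the default page. One way around this from a script, sketched here in Perl with LWP (the URL and output name are placeholders), is to keep a cookie jar and let the client follow the redirect itself:
Code:
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

# Hypothetical URL; quoting it matters if you ever pass it on a command line.
my $url = 'https://example.com/download.aspx?id=123&type=csv';

my $ua = LWP::UserAgent->new(
    cookie_jar            => {},                       # keep session cookies across the redirect
    requests_redirectable => ['GET', 'HEAD', 'POST'],  # follow redirects
);

my $res = $ua->get($url, ':content_file' => 'output.dat');
die "Download failed: ", $res->status_line, "\n" unless $res->is_success;
print "Saved response to output.dat\n";
With wget, the rough equivalent is saving and re-loading cookies between requests; the Perl route just keeps it in one process.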
I've been having a bit of trouble running a shell script with cron. A friend of mine does a community radio show and the station has a live stream but no podcasts, so I've set up a script to record the stream and encode it as an MP3 while I'm away, using mplayer and lame - that's what I'm trying to do, anyway.
Here's the script, but it doesn't seem to run - at least, I don't see any of the files it should be outputting. Would they be in the cron.weekly directory (where I have the script) or in my home directory?
#This is a script to record 'The Unnamed Show'
#it will record the show from the live stream, then convert the output
#to an MP3
#Finally, it will delete any files no longer required
HOME=/home/byron/
I have been playing around with shell scripting, nothing too complex, just learning the basics. If I try to run a script as root (by entering "sudo" then the command) it says command not found. I can only do it as root if I specify the full path (/home/username/bin/command). I'm pretty sure the directory that my scripts are in is part of the superuser's path.
I know that shell scripts can be used to execute more than one Linux command at the same time. But can the same be done from an e-mail, i.e., whenever I open that specific e-mail, the Linux command should execute?
I'm having some difficulties with running a shell script from cron, which I have been unable to resolve for almost 2 days now.
For testing purposes, I'm trying this with a simple shell script, test.sh, with chmod 777.
The script is located at /var. When I type /var/test.sh it runs perfectly and prints asdasdasd. When I type /var/test.sh > /home/log it writes asdasdasd to /home/log - that works.
The problem occurs whenever I add it as a cron job to /var/spool/cron/crontab/root.
The entry is: 11 10 * * * /var/test.sh > /home/log - however, at 10:11 there is no file at /home/log.
Cron as a service works; for example, every day at 4 am it makes this backup: sshpass -p xxxx rsync -avz -e ssh root@x.x.10.7:/data/backup/ /home/backups/isp_admin
I am using a shell script to run my Java program. I have written a cron job to invoke the shell script every day at 7pm. The cron job runs every day at 7pm, but it does not invoke my shell script, and I am not getting any error message. I am able to run the same shell script from the command prompt using bash, and I can also invoke the same script from a mainframe Universal Command job. The same shell script and the same cron job work fine on my Dev server, but on my QA server it is not working.
But there is a problem here. I have to close and reopen Internet Explorer to see the effect of my iptables changes! Is there a way to avoid having to restart IE?