I will be running SQL commands (UPDATE/SELECT) from within my bash script. I am completely new to this subject. Is MySQL used for this purpose? Alternatively, what is sqlplus?
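For reference, MySQL's command-line client can run statements non-interactively with -e, and sqlplus is the equivalent client for Oracle databases. A minimal sketch, with the host, credentials, database, and table all placeholders:

# Run a SELECT and an UPDATE through the mysql client; -N drops column headers,
# -B gives tab-separated (batch) output that is easy to parse in bash.
mysql -h dbhost -u appuser -p'secret' mydb -N -B \
    -e "SELECT id, status FROM jobs WHERE status = 'pending';"
mysql -h dbhost -u appuser -p'secret' mydb \
    -e "UPDATE jobs SET status = 'done' WHERE id = 42;"

# sqlplus (Oracle's client) is usually driven with a here-document:
sqlplus -s appuser/secret@ORCL <<'EOF'
SELECT sysdate FROM dual;
EOF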
I am running a simple script that I copied from slug.ceca.utc.edu/docs/2009-3-26-linux-server-health.pdf and edited with the names and paths of my own servers. I don't know much about scripting (read: nothing), but I wanted to try to be efficient in my new role as a Linux sysadmin. The script is saved in root's home directory and runs from root's crontab once a week. The script runs with no problem, but it doesn't actually seem to run all of the commands contained within it; it skips some in the middle and at the end, and I don't know why. The script itself is this:
What happens when the script executes is that the ssh connection works and parks me at the remote host's shell login, so the "firefox" command never executes. I need to know how to make the ssh connection open, stay open, and go into the background so that the rest of the script can run. If I could also do this with the "firefox" line, so that the entire terminal window could be closed, that would be helpful too.
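A rough sketch of two common ways around this, with the host names as placeholders: either tell ssh not to start a remote shell at all, or give it a single remote command and background the call locally.

# -f backgrounds ssh after authentication and -N asks for no remote command,
# so the connection stays open without dropping you into a login shell.
ssh -f -N user@remotehost

# Or run one remote command and background the whole ssh call locally:
ssh user@remotehost 'some-remote-command' &

# The rest of the script can then continue, e.g.:
firefox http://remotehost:8080/ &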
This is a really odd bug and I can't seem to figure it out. Basically, commands like ls can see all the files in the current directory, yet when I go to execute a file I get errors like "file not found", even though it most obviously exists. If you look at my command history in the screenshot, you can see that I can ls a directory and see its contents. When I try to run the file, I get the "no such file or directory" error.
However, if I simply type 'vm', I can't use tab completion to complete the directory name; my third command is me typing 'vm' and hitting Tab twice, and it lists a bunch of VMware-specific tools instead of the subdirectory name. I can then ls and see my current directory's contents, and it lists only the single subdirectory. I also tried using the full file path from root to run the file, still to no avail. Does anyone have any insight?
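One common cause of "No such file or directory" for a file that plainly exists is a binary whose interpreter or loader is missing (for example, a 32-bit executable on a 64-bit system without the 32-bit loader installed). A quick diagnostic sketch, with the file name a placeholder:

# Show the file type; for a dynamically linked ELF binary this names the expected interpreter.
file ./the-binary
# List loader/library dependencies; any "not found" entries point at the problem.
ldd ./the-binary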
Is there any way I can pass commands to the CLI of a tool directly?
I would like to script some actions, for example:
./OpenBTS < "tmsis"
I do not need to retrieve the results (I watch them in the log file). How could I accomplish this? There is no way to do it using command-line parameters, at least none that I found, so it looks like I have to figure something out myself. Maybe I could automate screen in a way that detects the prompt and "pastes" my command there. Are there tools for this on Linux?
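A hedged sketch of two possibilities, assuming the OpenBTS prompt looks like "OpenBTS>" (the prompt string and command are guesses):

# Simplest case: feed the commands on stdin; whether OpenBTS accepts piped
# input depends on how it reads its prompt, so this may or may not work.
printf 'tmsis\nexit\n' | ./OpenBTS

# If the tool insists on a terminal, expect can drive it interactively:
expect -c 'spawn ./OpenBTS; expect "OpenBTS>"; send "tmsis\r"; expect "OpenBTS>"; send "exit\r"; expect eof'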
I made a script that contains repetitious commands (snmpget and awk are the only ones at the moment). Running these commands from a standard terminal works, but when they run inside the script, I get:
./reg_sm_count: line 10: snmpget: command not found
./reg_sm_count: line 10: awk: command not found
./reg_sm_count: line 10: snmpget: command not found
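That error usually means the script runs with a different, shorter PATH than an interactive shell (cron in particular sets a very minimal one). A sketch of the two usual fixes; the host, community string, and OID are placeholders:

#!/bin/bash
# Fix 1: set PATH explicitly near the top of the script.
PATH=/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin
export PATH
snmpget -v2c -c public somehost 1.3.6.1.2.1.1.3.0 | awk '{print $NF}'

# Fix 2: call the commands by full path (locations vary by distribution).
/usr/bin/snmpget -v2c -c public somehost 1.3.6.1.2.1.1.3.0 | /usr/bin/awk '{print $NF}'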
I'm creating a bash script that contains the following line: "ssh user@$server1 cd /tmp; pwd". What I want is to print /tmp of server1, but the script isn't printing that.
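A sketch of the quoting that makes both commands run on the remote side; without it, only "cd /tmp" is sent to the remote host and pwd runs locally:

# The quoted string is executed as a whole by the remote shell on server1.
ssh user@"$server1" 'cd /tmp; pwd'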
I'd like to add custom startup commands (for example, starting a process, registering with a registration server, downloading a configuration file) to the Linux startup process. These commands should be triggered on startup only. What is the standard/appropriate way to do this?
EDIT: Is /etc/profile the right place to trigger such things?
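/etc/profile runs at every interactive login, not once at boot, so it is usually the wrong place. The classic approach on SysV-style systems is an init script or /etc/rc.local; on systemd-based systems a unit file is the more formal route. A minimal rc.local sketch, with the commands and URLs purely hypothetical:

#!/bin/sh
# /etc/rc.local (or /etc/rc.d/rc.local, depending on distribution) is run once,
# as root, at the end of the boot sequence on many systems.
/usr/local/bin/register-with-server &
wget -q -O /etc/myapp/config.conf http://config.example.com/myapp.conf
exit 0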
I am running a shell script from an rc file in Linux. The shell script goes into a loop that runs for 8 hours. Now I want to prevent the shell script from running when Linux boots, or I need to find a way to kill it while it is running. I tried killall, kill $!, Ctrl+C, etc., and nothing seems to work. Can you suggest a way out? I am new to Linux.
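A sketch, assuming the script is called longloop.sh (the name is a placeholder), of stopping the running copy and keeping it from starting again:

# pkill -f matches against the full command line, so it finds scripts by name.
pkill -f longloop.sh

# If that fails, locate the PID by hand and kill it; the bracket trick stops
# grep from matching its own process.
ps ax | grep '[l]ongloop.sh'
kill <PID>

# To stop it launching at boot, remove or comment out the line that starts it
# in the rc file it was added to.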
Recently I gained interest in running commands from the terminal, like rhythmbox-client --play-pause and vlc --open, but I could not find VLC's pause command in vlc's options. Is there a way I can have a terminal display the commands that run when I perform some action? For example, when I click on pause in VLC, the terminal should show me what command it used to pause VLC. What's the closest thing I can get to this?
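GUI clicks generally don't run shell commands at all, so there is no literal command to display; the closest equivalent is usually the player's D-Bus (MPRIS) interface. A sketch, assuming VLC is running with its D-Bus/MPRIS control available:

# Watch session-bus traffic (MPRIS players emit signals when playback state changes):
dbus-monitor --session

# Toggle play/pause from the terminal over the same interface:
dbus-send --type=method_call --dest=org.mpris.MediaPlayer2.vlc \
    /org/mpris/MediaPlayer2 org.mpris.MediaPlayer2.Player.PlayPause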
I wrote a simple bash script to let me treat any set of programs like a daemon. For example, if I configure the script a certain way, I can start/stop/get the status of apache, mysql and php all from one command. I am having a bit of a problem, though. I am passing commands as strings to a function and, depending on the arguments to the script, it might run one of those commands or another. Some of these commands need to be run in the background, such as deluge-web, but when I send "deluge-web &" to the function and it executes it, deluge-web does not start in the background. I can't figure out why. I have tried escaping the & with quotes and with a backslash, but nothing seems to work. I know this is some idiotic thing I am overlooking, but I am a bit stumped. Here is the script, configured to start/stop/get the status of deluged and deluge-web.
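A hedged sketch of why the & disappears and one workaround: when the string is expanded into plain arguments, & is just another word passed to the program, so the function has to re-parse the string itself (with eval) and apply the & there.

#!/bin/bash
# Hypothetical helper: run a command string, optionally in the background.
run_cmd() {
    local cmd="$1"
    local bg="$2"
    if [ "$bg" = "bg" ]; then
        eval "$cmd" &    # eval re-parses the string; the & is applied here
    else
        eval "$cmd"
    fi
}

run_cmd "deluged"
run_cmd "deluge-web" bg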
$ execute_some_long_command
<command is executing>
<Accidentally press the middle mouse button, which pastes a bunch of garbage (including, for example, `rm -Rf ~/*`) into the console>
How can I let execute_some_long_command finish, but not execute the pasted text?
I've found myself using the -v flag for lots of applications less and less (especially for trivial stuff like tar and cp). However, when I did use it and was, say, unpacking a large archive, it would take longer than when I didn't use the -v flag. I assume this is because the terminal has to process the text and I'm filling up whatever buffer it might have. But my question is: does this make the application actually run slower, or does it complete in the same amount of time, with what I'm seeing just being the terminal trying to catch up?
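One way to separate the two effects is to time the same job with the verbose output rendered, discarded, and disabled (the archive name is a placeholder):

# Verbose listing drawn in the terminal:
time tar xvf big-archive.tar
# Same verbose run, but the terminal never has to render the text:
time tar xvf big-archive.tar > /dev/null
# No verbose output at all:
time tar xf big-archive.tar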
In a script I am writing, I am trying to add logic so that the script can figure out whether a remote server uses rpm or dpkg and then run the appropriate command to print a list of installed packages. This works locally, but I need to get it to work over SSH and I have no idea how to do that. The relevant portion of the script is below. It would also be nice to find a way to avoid needing the full path to the executables, but I'm not really concerned about that. So, does anyone know how to make this code work via SSH?
Code:
if [ -x /usr/bin/dpkg ]; then
    dpkg --get-selections
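A hedged sketch of pushing that logic over SSH: the whole branch is sent as one quoted string, so the tests and the package commands all run on the remote host (the host name is a placeholder, and command -v avoids hard-coding full paths):

ssh user@remotehost '
    if [ -x /usr/bin/dpkg ]; then
        dpkg --get-selections
    elif command -v rpm >/dev/null 2>&1; then
        rpm -qa
    fi
'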
I need to process billions of small files using bash shell commands with a limited amount of memory (256 MB). If any of those files contain certain "keywords", the file should be removed. I tried with this command:
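The original attempt aside, a minimal sketch of one low-memory approach, assuming GNU grep/xargs and a hypothetical directory and keyword list; it streams file names instead of holding them all in memory:

# Delete every file under /data that matches any pattern listed in keywords.txt.
# -r recurses, -l prints only matching file names, -Z/-0 keep odd names safe.
grep -rlZ -f keywords.txt /data | xargs -0 rm --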
#!/bin/sh
su et
cd "media/ET"
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:.
....
I want ET to be run as the user "et", and for some reason I can't simply su/sudo to run the file directly; it only works when I am the user "et" and in the "/media/ET" directory.
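A hedged sketch of doing both in one non-interactive step with su -c (the binary name et.x86 is a guess):

#!/bin/sh
# Run everything as user "et"; the -c string is executed by et's shell, so the
# cd and the library path only affect that one command.
su - et -c 'cd /media/ET && LD_LIBRARY_PATH=$LD_LIBRARY_PATH:. ./et.x86'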
I have a hardware audit script that I want to run on several remote machines around my office. Is there a way to run the script that resides on my machine via ssh, or do I need to copy the script to each remote machine and then run it?
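A sketch of running the local script remotely without copying it first; the host names and the script name hardware_audit.sh are placeholders:

# Feed the local script to a remote shell over SSH; nothing is left on the remote machine.
ssh user@remotehost 'bash -s' < ./hardware_audit.sh

# Or loop over several hosts and keep each report locally:
for host in host1 host2 host3; do
    ssh user@"$host" 'bash -s' < ./hardware_audit.sh > "audit_$host.txt"
done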
I know I have to count how many instances are running:
ps x | grep apache2 | wc -l
The result is 2 if it's running, otherwise 1. I also know there is a command called test that I could use to perform the verification, but I don't know how to use test with wc.
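A sketch of wiring that check into an if test; the bracketed pattern keeps grep from counting its own process, so the off-by-one ambiguity goes away:

#!/bin/bash
# grep '[a]pache2' matches apache2 processes but never the grep command line itself.
count=$(ps x | grep -c '[a]pache2')
if [ "$count" -gt 0 ]; then
    echo "apache2 is running ($count processes)"
else
    echo "apache2 is not running"
fi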
So I wrote a small script that pretty much just takes in two numbers and counts from the first to the second, e.g.
unknown-hacker|544> count.sh 1 3
1 2 3
My problem is that I want to make it so that if you input invalid parameters, such as non-numerical characters or more than two numbers, you get an error message.
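A hedged sketch of those checks, assuming the script should take exactly two whole-number arguments:

#!/bin/bash
# Validate the argument count first, then that each argument is numeric.
if [ "$#" -ne 2 ]; then
    echo "usage: $0 <start> <end>" >&2
    exit 1
fi
for arg in "$1" "$2"; do
    case "$arg" in
        ''|*[!0-9]*) echo "error: '$arg' is not a number" >&2; exit 1 ;;
    esac
done
for (( i = $1; i <= $2; i++ )); do
    echo "$i"
done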
I've been trying to write a bash script called runSorter.sh that runs an executable which takes some parameters and outputs the results to a text file. The executable, sorter, takes a number parameter. I want to make it so that you can pass as many number parameters to runSorter.sh as you want and it will run the sorter executable for each one. So far, what I have looks like this:
#!/bin/bash
args=("$@")
INDEX=0
if [ -z args ]; then
    echo "Error"
else
    while [ $# -gt $INDEX ]; do
        NUM=${args[$INDEX]}
        echo $NUM
        echo ./sorter $NUM
        let INDEX=INDEX+1
    done
fi
My problem is that when I run ./runSorter.sh 100 in my terminal, it just prints this to the screen: ./sorter 100. How can I make it properly execute sorter and send everything to a text file?
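A hedged sketch of the usual fix: the script only echoes the text "./sorter $NUM" instead of running it, so the echo has to go and the output be redirected (the results file name is a guess):

#!/bin/bash
# Run ./sorter once per numeric argument, appending each run's output to results.txt.
if [ "$#" -eq 0 ]; then
    echo "Error: no arguments given" >&2
    exit 1
fi
for NUM in "$@"; do
    echo "running sorter with $NUM"
    ./sorter "$NUM" >> results.txt
done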
I am a newbie in Linux. I tried to write an autorun bash script in /root and selected System > Preferences > More Preferences > Sessions > Startup Programs > Add, but the bash script doesn't seem to be working. The following is my bash script: