Programming :: Awk Commands Inside Of A Bash Script ?
May 4, 2011
I'm pretty sure this is a remedial task for many of you, but I'm having an issue with arrays from a shell script being accessed in an awk command. I'm pretty good with shell scripting, but I am embarrassingly unfamiliar with awk. So here's the meat of the script...
Code:
I am trying to take an input file of IP addresses and corresponding netmasks and put it into a format to be loaded onto a Juniper switch. The result should look something like x.x.x.x/netmask, using CIDR notation. No matter what subnet is provided, though, /32 always gets appended to the end of the IP, even when it should be /16, /24, etc. Also, the Cisco part works fine, so that doesn't need any attention.
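A minimal sketch of one way to do the conversion, assuming an input file of "ip netmask" pairs such as "192.168.1.10 255.255.255.0"; the file name, the Junos prefix string, and the field layout are assumptions, not taken from the original script. Shell values are passed into awk with -v rather than by splicing them into the awk program:
Code:
#!/bin/bash
infile=${1:-addresses.txt}
prefix="set interfaces ge-0/0/0 unit 0 family inet address"   # placeholder prefix

awk -v pfx="$prefix" '{
    ip = $1
    n = split($2, octet, ".")              # break the netmask into its octets
    bits = 0
    for (i = 1; i <= n; i++)               # count the set bits in each octet
        for (v = octet[i]; v > 0; v = int(v / 2))
            bits += v % 2
    printf "%s %s/%d\n", pfx, ip, bits     # e.g. "... 192.168.1.10/24"
}' "$infile"
If the prefix always comes out as /32, one common culprit is that the netmask field never reaches awk as expected, so checking what $2 actually contains is a reasonable first step.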
I'm trying to create a script to manage one MySQL database, including new database and user creation. But I'm not able to get it working when I put the SQL commands into a function. So I created a simple script for testing, which is still not working.
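A minimal sketch of putting the SQL into a bash function via a here-document, assuming the mysql client can authenticate (for example through ~/.my.cnf); the database, user, and password names are placeholders:
Code:
#!/bin/bash
create_db_and_user() {
    local db=$1 user=$2 pass=$3
    mysql <<SQL
CREATE DATABASE IF NOT EXISTS $db;
CREATE USER '$user'@'localhost' IDENTIFIED BY '$pass';
GRANT ALL PRIVILEGES ON $db.* TO '$user'@'localhost';
FLUSH PRIVILEGES;
SQL
}

create_db_and_user mydb myuser mypassword
The here-document keeps the SQL readable while still letting the shell expand the function's arguments.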
I'm writing a script to create a backup of a file by adding a datetime stamp to the file name. Basically: test for the file's presence; if it's there, cp it with the datetime suffix, then rm the original. cp works fine from the command line, but in the script I get: cannot stat `full path to file': No such file or directory
Code:
Here are the errors:
cp: cannot stat `~/html/CVP_dadamail/.dada_files/.logs/errors.txt': No such file or directory
rm: cannot remove `...': No such file or directory
The for statement is a placeholder, as I have the same file to back up in several directories. Using "bash -x scriptname" or inserting echoes, I can see that I've constructed the strings properly. Believing it might be related to the hidden directories, I tried setting the glob-related shopt options, to no avail.
Ultimately I'll add the other directories to the for loop and then run this from a cron job, so point out any potential pitfalls knowing I'm headed in that direction... I believe the construct would be roughly as follows.
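A minimal sketch of that idea, assuming the real goal is a timestamped copy before removal. A quoted "~" is not expanded by bash, which is a common cause of the "cannot stat" error above, so $HOME is used instead; the path and date format are placeholders:
Code:
#!/bin/bash
stamp=$(date +%Y%m%d-%H%M%S)

for f in "$HOME/html/CVP_dadamail/.dada_files/.logs/errors.txt"; do
    if [ -f "$f" ]; then
        cp "$f" "${f}.${stamp}" && rm "$f"   # only remove if the copy succeeded
    fi
done
Additional directories can simply be added to the for list later.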
I'm trying to create an SSL certificate and answer the questions inside a bash script. The command used to create the SSL certificate is:
Code:
openssl req $@ -new -x509 -days 365 -nodes -out /etc/apache2/apache.pem -keyout /etc/apache2/apache.pem
The first question asked is: Country Name (2 letter code) [AU]:
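One hedged way to avoid the prompts entirely is the -subj option of openssl req, which supplies all the certificate fields on the command line; the field values below are placeholders:
Code:
#!/bin/bash
openssl req "$@" -new -x509 -days 365 -nodes \
    -out /etc/apache2/apache.pem -keyout /etc/apache2/apache.pem \
    -subj "/C=AU/ST=Some-State/L=City/O=Example Org/OU=IT/CN=www.example.com"
With -subj present, openssl does not ask the Country Name question or any of the others.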
I want to normalize several text files. They handle "tabs" differently (3, 4, 6 or 5 space characters). I have to use sed or awk to set them all to the same tab width (for example, five space characters).
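A minimal sketch, assuming the goal is to end up with exactly five spaces wherever a tab (or a run of 3 to 6 spaces) appears; the file names are placeholders, and \t in the sed expression relies on GNU sed:
Code:
# replace literal tab characters with five spaces, in place
sed -i 's/\t/     /g' file1.txt file2.txt

# or collapse runs of 3 to 6 spaces down to five spaces with awk
awk '{ gsub(/ {3,6}/, "     "); print }' input.txt > output.txt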
I have a command which on the command line needs to look like this:
rlam -if3 '!pvalue -H image1.jpg' > image2.jpg
Never mind what rlam or pvalue do... they are part of a program package I am using. The above command works on the command line, and also when written verbatim in a bash shell script.
My problem is: in the script I wish to replace image1.jpg with the content of a variable, e.g.
IM1=image1.jpg
How do I get the script to insert the value of $IM1 into the command when the pvalue part of it needs to be quoted?
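A minimal sketch: the single quotes in the original command only protect the argument from the shell, so double quotes can be used instead, which still expand the variable while keeping the whole thing a single argument (history expansion of "!" is not active in non-interactive scripts):
Code:
#!/bin/bash
IM1=image1.jpg
rlam -if3 "!pvalue -H $IM1" > image2.jpg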
I need to modify some scripts to repeat the commands in them until a variable returns a proper value. I need this to add some redundancy to some scripts I use to upload files to a remote server. This is an example of a portion of those scripts:
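A minimal sketch of a retry loop, where "upload_command" and the "OK" success value are placeholders standing in for whatever the original scripts actually use:
Code:
result=""
attempts=0
until [ "$result" = "OK" ]; do            # repeat until the variable holds the expected value
    result=$(upload_command)              # upload_command is a placeholder
    attempts=$((attempts + 1))
    if [ "$attempts" -ge 5 ]; then
        echo "giving up after $attempts attempts" >&2
        break
    fi
    sleep 10                              # wait before retrying
done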
I am running a simple script that I copied from slug.ceca.utc.edu/docs/2009-3-26-linux-server-health.pdf and edited with the names and paths of my own servers. I don't know much about scripting (re: nothing), but I wanted to try to be efficient in my new role as a Linux sysadmin. The script was saved to root's home directory and runs as part of root's crontab once a week. The script runs with no problem, but it doesn't actually seem to run all of the commands contained within it. It skips some in the middle and at the end, and I don't know why. The script itself is this:
I have read that C is first converted to assembly before its final compilation to binary. Is there a way to do something similar with Bash commands? I would like to gain the kind of understanding that assembly provides, but for Bash, somehow.
#!/bin/sh
su et
cd "media/ET"
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:.
[code]....
I want ET to be run as the user "et", and for some reason I can't directly su/sudo to run the file without being the user "et" and being in the "/media/ET" directory.
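A minimal sketch, assuming the point is to run the game as the user "et" from inside /media/ET; the binary name "./et" is a placeholder. Note that a bare "su et" in a script starts a new interactive shell and does not run the following lines as that user, which is why -c is used here:
Code:
#!/bin/sh
su et -c 'cd /media/ET && export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:. && ./et'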
I am working on some kind of PBX, and I have a list of telephone numbers inside a file. I have to insert these numbers into the correct command, then telnet to a remote server and execute these commands. I can read the telephone numbers and insert them into the command with no problem, but when I try to insert these commands into the send I face a problem. Here is the basic code:
I can make an external loop in bash which reads the input file, builds the command, and then telnets and executes it, but that makes the script connect and disconnect again for each line, which causes high load on that server and hardware problems. I am wondering if there is an option inside the expect interpreter which makes send read directly from a file... something like this:
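A hedged sketch of reading the number file inside a single expect session, so the script logs in once and sends every command over the same telnet connection; the host, login prompts, command template ("add-extension"), and prompt string are all placeholders:
Code:
#!/bin/bash
expect <<'EOF'
spawn telnet pbx.example.com 23
expect "Username:" { send "admin\r" }
expect "Password:" { send "secret\r" }
expect ">"

set fh [open "numbers.txt" r]
while {[gets $fh number] >= 0} {
    ;# one telephone number per line; "add-extension" is a placeholder command
    send "add-extension $number\r"
    expect ">"
}
close $fh
send "quit\r"
expect eof
EOF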
I've noticed something, and hoped there was a workaround. When I write a simple bash script and run it, if I close the terminal I ran the bash script in, the bash script stops. What are the solutions for this? Basically I want to run my bash script, close the terminal, and keep the bash script running.
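A few common ways to keep a script alive after its terminal closes; the script name is a placeholder:
Code:
nohup ./myscript.sh > myscript.log 2>&1 &   # immune to the hangup signal sent when the terminal closes

# or detach a job that is already running in the background:
./myscript.sh &
disown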
I have trouble using an alias inside a bash function. I would like to ssh into multiple machines by executing: ssh machine. To achieve this, I put something like the following into my ~/.bashrc:
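A minimal sketch, assuming the aim is for a short host name to expand into a full ssh command. A function is used instead of an alias because aliases are not expanded inside functions unless "shopt -s expand_aliases" is set before the function is parsed; the user name and domain are placeholders:
Code:
# in ~/.bashrc
sshm() {
    ssh "admin@$1.example.com"
}

# usage: sshm machine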
I know that using an alias I can run a whole command with a shortcut. But my requirement is to use parts of a long command and pass some user-defined values in between. For example, suppose I have to routinely copy a directory to another remote directory on a remote machine. The remote machine name is quite long, as is the directory path I want to copy the files into, so the scp command would look like this: [URL]. Now I want to do some sort of aliasing (say "ecp") so that I just need to pass the source directory name to the ecp command to do my job.
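A hedged sketch using a function rather than an alias, since an alias cannot take an argument in the middle of the command; the host and destination path are placeholders standing in for the long names behind [URL]:
Code:
ecp() {
    scp -r "$1" admin@very-long-hostname.example.com:/very/long/remote/path/
}

# usage: ecp source_directory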
What happens when the script executes is that the ssh connection works and parks me at the remote host's shell login. Therefore, the "firefox" command refuses to execute. I need to know how to make the "ssh" connection occur, stay open, and go into the background so that the rest of the script can execute. If I could also do this with the "firefox" line, so that the entire terminal window could be closed, that would also be helpful.
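A minimal sketch, assuming the ssh connection is only needed as a background tunnel rather than an interactive login; the host and port numbers are placeholders:
Code:
#!/bin/bash
# -f: drop to the background after authenticating, -N: do not run a remote command
ssh -f -N -L 8080:localhost:80 user@remote.example.com

# start firefox detached from the terminal so the window can be closed
nohup firefox http://localhost:8080 >/dev/null 2>&1 &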
I'm having problems with bash quoting. Maybe someone can tell me what's going on. Basically, I need to build a command line inside a bash script that contains arguments which contain spaces, as well as bash variables that need to be expanded.
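A minimal sketch: storing the command and its arguments in a bash array keeps every element a single word even when it contains spaces, while still expanding variables; the command and paths are placeholders:
Code:
#!/bin/bash
outdir="/tmp/output with spaces"
mkdir -p "$outdir"

cmd=(cp -v "input file.txt" "$outdir/copy of input.txt")
"${cmd[@]}"                     # each array element stays one argument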
I am attempting to write a backup script that will do the following:
1) lock and flush tables on a MySQL db
2) dump the db to a file
3) unlock the tables
4) rsync the file to offsite storage
It all seems to be going well. However, I obviously don't want to set up SSH to the storage server on another network as the root user without a password, so I am attempting to su as the backup user inside the script. But when I try to run the script, everything happens as it should until I try to su... then it jumps out of the script, asks me to log in as the backup user, proceeds to rsync to the offsite storage, does all this, and then resumes executing the script. It is not going to be set up as a cron job; it will be executed manually. Assuming that is the case, how can I get the script to run without prompting for a password?
Here is what I've come up with so far... assuming that the script is run as root and that the identity of the backup user needs to be assumed inside the script without pestering the user for the backup user's password.
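A minimal sketch of the whole flow, assuming the script runs as root (so "su - backupuser" does not prompt for a password), that MySQL credentials are available (e.g. via ~/.my.cnf), and that the backup user's SSH key is already authorized on the storage host; database, user, and host names are placeholders:
Code:
#!/bin/bash
DB=mydb
DUMP=/var/backups/${DB}-$(date +%Y%m%d).sql

# --lock-tables flushes and locks the tables for the duration of the dump,
# then releases them, which covers steps 1-3 in one command
mysqldump --lock-tables "$DB" > "$DUMP"

chown backupuser "$DUMP"

# step 4: ship the dump offsite as the unprivileged backup user
su - backupuser -c "rsync -az '$DUMP' backupuser@storage.example.com:/backups/"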
I'm setting up a scheduler to run some bash script commands, but they won't run when I point it to a script file. If I change the cron entry to call
[code]...
If I run ./writeTimeToLog from the terminal, it... well, it writes the time to the log file! I then use
[code]...
to test it. I schedule this to run every minute just so I can see it working; the entry is as basic as I could make it. It adds the cron job successfully, but the file never seems to update. Where would an error be put if one occurred?
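A hedged guess at the usual culprit: cron runs with a minimal PATH and a different working directory, so the entry needs absolute paths, and any error is mailed to the crontab owner unless it is redirected to a file. The paths below are placeholders:
Code:
# crontab entry: run every minute, log stdout and stderr to a file
* * * * * /home/user/scripts/writeTimeToLog >> /home/user/cron.log 2>&1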
I would like to be able to connect to a machine, list a directory, wait long enough for me to see the results, then move on to the next machine. This is failing:
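A minimal sketch of one way this could look, assuming key-based ssh is already set up; the host names and directory are placeholders:
Code:
#!/bin/bash
for host in host1 host2 host3; do
    echo "=== $host ==="
    ssh "$host" "ls -l /var/log"
    sleep 10                    # pause long enough to read the output
done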
I would be running SQL commands (UPDATE/SELECT) from within my bash script. I am completely new to this subject. Is MySQL used for this purpose? Alternatively, what is sqlplus?
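For what it's worth, the mysql command-line client is what runs SQL against a MySQL server from a script, while sqlplus is the equivalent client for Oracle databases. A minimal sketch, assuming the client can authenticate (e.g. via ~/.my.cnf); database, table, and column names are placeholders:
Code:
#!/bin/bash
mysql appdb -e "UPDATE users SET active = 1 WHERE id = 42;"

# -B -s -e: batch, silent, execute -- tab-separated rows that are easy to parse
mysql appdb -Bse "SELECT id, name FROM users;" |
while read -r id name; do
    echo "user $id is $name"
done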
Our CentOS 5 server had a weird issue last Friday. We couldn't run bash commands such as ls or vi; it said it could not find /bin/ls. The only commands we could run were internal commands, such as ps and cd. After we rebooted the server, everything was back to normal.
I don't know what was wrong with it. Can anyone give me an explanation?
I need to launch a bash file in Linux from an unprivileged user session, a file that will run bash commands as root. But I do not want to create a user with root privileges to do that.
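A hedged sketch assuming sudo is acceptable: a sudoers rule (edited with visudo) can let one unprivileged user run one specific script as root without creating any new privileged account; the user name and script path are placeholders:
Code:
# /etc/sudoers.d/maintenance (edit with visudo)
appuser ALL=(root) NOPASSWD: /usr/local/bin/maintenance.sh

# then, from the unprivileged session:
sudo /usr/local/bin/maintenance.sh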
This is a really odd bug I can't seem to figure out. Basically, commands like ls can see all the files in the current directory, but when I go to execute a file it gives errors like "file not found", even when it most obviously exists. If you look at my command history in the screenshot, you can see I can ls into a directory and see its contents. When I try to run the file, I get the "no such file or directory" error.
However, if I type simply 'vm', I can't use tab completion to complete the directory name; my third command is me typing 'vm' and hitting tab twice, and it lists a bunch of VMware-specific tools instead of the subdirectory name. I can then ls and see my current directory's contents, and it lists only the single subdirectory. I then tried to use the full file path from root to run the file, still to no avail. If anyone has any insight...
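A couple of quick checks that often explain "No such file or directory" for a file that plainly exists (the binary name here is a placeholder): a 32-bit binary on a 64-bit host without 32-bit libraries, or a script whose #! interpreter is missing, both produce exactly this error:
Code:
file ./vmware-tool        # shows whether it is a 32-bit or 64-bit binary, or a script
head -n 1 ./vmware-tool   # shows the #! interpreter line if it is a script
ldd ./vmware-tool         # lists any shared libraries it cannot find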
Is there any way I can pass commands to the CLI of a tool directly?
I would like to script some actions, for example:
./OpenBTS < "tmsis"
I do not need to retrieve the results (I watch them in the log file). How could I accomplish that? There is no way to do this using command-line parameters, at least not that I found. So it looks like I have to figure out something myself. Maybe I could automate screen in a way that detects the prompt and "pastes" my command there. Are there tools for this on Linux?
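A hedged sketch of two options, assuming OpenBTS reads CLI commands from standard input; only the command "tmsis" comes from the example above, and the screen session name is a placeholder:
Code:
#!/bin/bash
# feed commands on stdin via a here-document
./OpenBTS <<'EOF'
tmsis
EOF

# or type into an OpenBTS instance already running inside a named screen session
screen -S openbts -X stuff $'tmsis\n'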