I want an image viewer to show the images in a directory in a random order. For that, I made the following script:
cd /home/DIRECTORY
ls > files
sort -R files
cat files | gpicview
Everything looks fine, but gpicview doesn't open the images! I also tried it with pqiv but instead of just opening the images, it opens a dialog asking me which of the files I want to open.
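For what it's worth, a minimal sketch of one approach, assuming the viewer accepts the shuffled file list as command-line arguments rather than reading it from stdin (pqiv does; gpicview may only honour a single file argument), and assuming GNU shuf is available:

Code:
cd /home/DIRECTORY
pqiv $(ls | sort -R)
# safer with unusual filenames: shuffle a null-separated list and pass it as arguments
find . -maxdepth 1 -type f -print0 | shuf -z | xargs -0 pqiv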
I've noticed that since that time I have lost the subject heading at the top of the page and also the arrows that allow me to visit a previous screen or a subsequent screen. Any suggestions for how I can retrieve the menu box at the top of the screen and the directional arrows?
I am using openSUSE 10.3. When I install software from a tarball, I record the time required by sending the output of date to beg.txt (when the installation begins) and end.txt (when the installation finishes). How can I append the output of date to a file so I don't need two files?
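A minimal sketch using the shell's append redirection (the file name is just a placeholder):

Code:
date >> build-times.txt   # >> appends to the file; > would overwrite it each time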
I want to run gsettings list-schemas (which returns a list of about 100 names separated by whitespace) and somehow feed each name, one at a time, as the input to this command: gsettings list-recursively. I've tried it with awk and standard | piping, and also as a string variable strvar=$(gsettings list-schemas), using $strvar as the input, but I'm missing something in between, I'm sure, like a for or while loop or the proper awk syntax.
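A minimal sketch of two common ways to hand each schema name to the second command, assuming each name is a single whitespace-separated word:

Code:
# loop over each schema name
for schema in $(gsettings list-schemas); do
    gsettings list-recursively "$schema"
done

# or let xargs run the command once per name
gsettings list-schemas | xargs -n1 gsettings list-recursively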
I want to use the output of a previous command as a parameter to another command. For example, to find out where "nice" is stored, I typed: which nice, which output /usr/bin/nice. The second command I then typed was: ls -l /usr/bin/nice. Is there a way to have a single command, like: ls -l which nice?
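A minimal sketch using command substitution, which runs the inner command and splices its output into the outer command line:

Code:
ls -l $(which nice)
# the older backtick form does the same thing
ls -l `which nice`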
I am trying to grep multiple numbers from a file; grep does have the -f option for that.
Code:
grep -f <`seq 500 520` /etc/passwd
I know this could be done with
Code:
for i in `seq 500 520`; do grep "$i" /etc/passwd; done
But my question goes far beyond this example: is it possible to redirect one command's output so that it is treated as the content of a file for another command?
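A minimal sketch using bash process substitution, which exposes a command's output to another command as if it were a file (this is likely what the <` ` attempt above was aiming for):

Code:
# <(...) gives grep a file-like handle fed by seq's output
grep -f <(seq 500 520) /etc/passwd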
I want to scan a particular directory recursively and run a particular command with each file as input. For this I am using "find /dir/path". I don't want to write a long script containing a loop over the output of "find"; I want a single command which will run a command on each file in the "find" output.
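A minimal sketch; "mycommand" is a placeholder for whatever should be run on each file:

Code:
find /dir/path -type f -exec mycommand {} \;
# or, handling filenames with spaces and starting fewer processes:
find /dir/path -type f -print0 | xargs -0 mycommand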
Does anybody know whether something changed in OpenSUSE 11.2 in the way users are logged in to X? I am running an application which uses the notify-send command to send pop-ups, and it is not working properly, although it works in OpenSUSE 11.1, 11.0, and 10.x. The same goes for SLED and SLES, all versions. This is what I have found so far. Before, the 'who' output was:
Code:
$ foo@bar:~/Desktop> who
foo      :0           2010-01-26 14:40
For example, if I type ':pwd' to get the current working directory, I can select the output text in gvim, but I can't figure out how to copy it to the clipboard. If I try the same in console vim, I can't even select it with the mouse. I would like this to work with all vim commands, for example copying the guifont=Consolas:h10:cANSI output of :set guifont.
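A minimal sketch using vim's built-in command-output redirection into a register; @+ is the clipboard register and assumes a vim build with clipboard support:

Code:
:redir @+
:pwd
:redir END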
I am running ps xo "pid,command" but I can't find my process in the results. I know that the process is running because I can see it with ps ax | grep command-name.
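A minimal sketch; adding the a flag lists processes for all users rather than only your own, and 'command-name' is a placeholder:

Code:
ps axo pid,command | grep '[c]ommand-name'   # the [c] keeps grep from matching its own process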
What does the following shell program do? :() { :| : &} ; : Warning: my computer hung when I tried to execute this. Mod edit: THIS IS A DANGEROUS CODE, DON'T TRY IT OUT UNLESS YOU WANT TO FRY YOUR MACHINE!
I have a variable called hostname which contains the hostname of my machine. How would I add the hostname to the output of another command? For example, if the output of the command xm list is:
Quote:
abc 123 334
bcd 223 333
ddd 333 333
How would I add a hostname column to it? My output should look like:
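A minimal sketch of prepending a hostname column with awk, assuming the hostname is already in the $hostname variable mentioned above:

Code:
xm list | awk -v h="$hostname" '{print h, $0}'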
I wrote a little script that runs the top command and trims the output, leaving only the CPU, RAM, and swap values. If I run the script manually everything works fine, but when I schedule the script to run every 5 minutes from /etc/crontab, everything else runs fine yet the output of the top command doesn't appear in the log:
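A likely fix, sketched: under cron there is no terminal, so top usually has to be run in batch mode with -b (the grep pattern and log path are placeholders; the exact labels depend on your top version):

Code:
top -b -n 1 | grep -E 'Cpu|Mem|Swap' >> /var/log/top-stats.log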
I have opened PuTTY sessions to a server from two separate machines, namely HOST1 (3 sessions) and HOST2 (1 session). However, the w command says there are 5 users:
Code:
# w
 09:29:36 up 34 days, 15:48,  5 users,  load average: 0.62, 4.33, 8.16
USER     TTY      FROM     LOGIN@   IDLE   JCPU   PCPU  WHAT
root     pts/17   HOST1    09:18    4:26   0.01s  0.01s  -bash
root     pts/18   HOST1    09:27    1:21   0.00s  0.00s  -bash
root     pts/21   HOST2    09:29    0.00s  0.00s  0.00s  w
root     pts/20   HOST1    09:29    1:39   0.00s  0.00s  -bash
I need a tool to analyse the output of the sar command, just like sarg analyses the log files for http, squid, etc. I need a similar tool for sar output analysis.
The output of a command changed and I need to extract the data and print it out in a different fashion:
Code:
abcd1=aaaa xx abcd 2 aaa xx bbb
abcd2=aaaa xy ab 2 xx aaa bbb ccc xxx
should be transformed to:
[Code]...
Currently I use sed "search1|search2|search3" to get the lines that need to be transformed. But I also need to search for substrings in those lines, and I need to print those substrings in a specific order together with other characters. How is this done with sed?
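A minimal sketch of the usual sed technique for this: capture the substrings with \( ... \) groups and re-emit them in whatever order you need with \1, \2, and so on. The pattern below is only illustrative, not matched to the real data above:

Code:
# print "<value> <name>" for lines that look like name=value ...
sed -n 's/^\(abcd[0-9]\)=\([a-z]*\).*/\2 \1/p' inputfile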
I have a requirement to find the files whose names contain ack_reply. However, there are many other files in the same directory where these reside. I have to remove these files from the folder after 7 days and retain the others, so I tried to write the below script with the grep command.
find $directory -type f -mtime +7 | grep ack_reply
How can I pass this output to the -exec command?
If I were not using the grep command, my script would be:
find $directory -type f -mtime +7 -exec remove.sh {} \;
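A minimal sketch of two ways to combine the name filter with -exec, keeping the remove.sh from the script above (the '*ack_reply*' pattern assumes the string can appear anywhere in the filename):

Code:
# let find do the name matching itself
find "$directory" -type f -name '*ack_reply*' -mtime +7 -exec remove.sh {} \;

# or keep the grep and hand the matches to xargs
find "$directory" -type f -mtime +7 | grep ack_reply | xargs -r remove.sh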
I am creating a script to sync my important documents between two systems. I want my script to generate a log file of the last action; can you suggest a way to achieve this? Question: if I execute the rsync command with the -v flag, it prints a lot of messages on the console. Is there any way I can redirect these messages to a file?
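A minimal sketch of two options; the paths and log file names are placeholders:

Code:
# redirect rsync's console output (stdout and stderr) into a log file
rsync -av /path/to/docs/ user@otherhost:/path/to/docs/ > ~/sync-last-run.log 2>&1

# or let rsync write its own log file
rsync -av --log-file="$HOME/sync-last-run.log" /path/to/docs/ user@otherhost:/path/to/docs/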
How can I split the output of a command between two terminals, so that one gets stdout and the other gets stderr? The best I could do is: On the first terminal code...
This works OK, but it prints the errors over and over again every time; is there a better way to redirect the errors to another terminal?
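A minimal sketch: every terminal has a device file you can write to directly, so stderr can be pointed at the other terminal's tty. Run tty in the second terminal first to find its name; /dev/pts/3 below is a placeholder:

Code:
# in terminal B: find its device name
tty                       # prints e.g. /dev/pts/3

# in terminal A: send stderr to terminal B, keep stdout here
some_command 2> /dev/pts/3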
If I grep -nr sumthin * in my source code directory, it also spews out very long lines from minified JavaScript or CSS files. I want to get just the first 80 characters per line. For example, a regular grep gives me this:
css/style.css:21:behavior: url("css/iepngfix.htc") css/style-min.css:4:.arrow1{cursor:pointer;position:absolute;left:5px;bottom:10px;z-index:13;}.arrow2{cursor:pointer;position:absolute;right:5px;bottom:10px;z-index:13;}.calendarModule{z-index:100;}.calendarFooterContainer{height:25px;text-align:center;width:100%!important;z-index:15;position:relative;font-size:15px!important;padding:-2px 0 3px 0;clear:both!important;border-left:1px solid #CCC;border-right:1px ... etc.
But I'd like to get just this instead:
css/style.css:21: behavior: url("css/iepngfix.htc")
css/style-min.css:4:.arrow1{cursor:pointer;position:absolute;left:5px;bottom:
What Linux command can do this?
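A minimal sketch: pipe the grep output through cut to keep only the first 80 characters of each line:

Code:
grep -nr sumthin * | cut -c1-80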
This seems so simple when doing it from the command line, but I'm not able to accomplish it inside a script. I am trying to put the output of the following command into a text file:
CMD= mysql -uroot -psecret -e 'SHOW SLAVE STATUS G;'
FIL=~/replication-`date +%F`.txt
MAILTEXT=~/mailtext.txt
touch $FIL
$CMD > $FIL
Here FIL is a variable that contains the path of the file the command output should go to. I am running this in a shell script from which I want to email the contents of $FIL as an attachment using mutt. But I always get a 0-byte file; if I examine the directory, the file is indeed 0 bytes long.
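A minimal sketch of an approach that avoids storing the whole command in a variable (quoting of the -e argument tends to break when the command is expanded from a variable); the credentials, paths, and recipient address are placeholders:

Code:
FIL=~/replication-$(date +%F).txt
MAILTEXT=~/mailtext.txt

# run the query directly and capture both stdout and stderr
mysql -uroot -psecret -e 'SHOW SLAVE STATUS\G' > "$FIL" 2>&1

# mail the file as an attachment, using the mailtext file as the message body
mutt -s "Replication status" -a "$FIL" -- admin@example.com < "$MAILTEXT"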
My server is currently copying a large amount of data. I periodically check how much has already been copied by using the "ls" command in the destination folder. Is there a way to make ls kind of update itself, like a log, or like when using cp -v?
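A minimal sketch using watch, which re-runs a command at a fixed interval and redraws its output (the path and interval are placeholders):

Code:
watch -n 5 ls -lh /path/to/destination
# or track the total size instead of the file listing
watch -n 5 du -sh /path/to/destination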