What am I doing wrong here? Shredding a directory full of files is incredibly slow, yet shredding a single file of similar total size is around 1000 times faster. The filesystem is ext4, and the OS is Slackware current. Looking at top, it shows shred's status as 'D', which, according to the man page, means uninterruptible sleep! Also, /usr/bin/time reports around 10 minutes, but that time was printed on stdout approximately 7 minutes before the command prompt reappeared!
I'm testing some multi-platform Java code and I'm getting a bit frustrated with the Linux tests. I need to run the command: Code: $ java -jar /home/developer/TCO/TabletComicOptimizer.jar <file> <args[]> against all the files that match specific criteria. I've tried various find syntaxes and I can't seem to get it right.
Normally I would just create a bash script, populate an array with the results of find, and then enumerate the collection, but in this specific case I want to demonstrate the operation at the bash prompt.
I've tried things like: Code: ~/TCO $ find . -type f -iname "*.cb[rz]" | xargs java -jar TabletComicOptimizer.jar {} 1200x1800 ; thinking that the {} is substituted with each file returned by find, but it's not working. How do I execute my Java program against each result of the find operation?
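For what it's worth, plain xargs does not substitute {} at all unless asked to with -I; it just appends the input at the end of the command line. A sketch of two working forms, using a throwaway /tmp directory and echo standing in for the java invocation so it runs anywhere:

```shell
# Demo tree; in the real case the .cbz/.cbr files already exist under ~/TCO
mkdir -p /tmp/tco_demo
touch /tmp/tco_demo/a.cbz /tmp/tco_demo/b.cbr

# Option 1: let find itself spawn the command once per file (echo stands
# in for: java -jar TabletComicOptimizer.jar {} 1200x1800)
find /tmp/tco_demo -type f -iname "*.cb[rz]" \
    -exec echo java -jar TabletComicOptimizer.jar {} 1200x1800 \;

# Option 2: the xargs equivalent, with -I{} marking the insertion point
find /tmp/tco_demo -type f -iname "*.cb[rz]" \
    | xargs -I{} echo java -jar TabletComicOptimizer.jar {} 1200x1800
```

With -I, xargs also runs one command per input line, which is what a two-argument invocation like `<file> 1200x1800` needs.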
I will eventually script this, but I wanted to know where I went wrong when trying to get this one-liner to work. In a nutshell, I want to create a backup directory, find any files that have changed in the last 48 hours, and then copy them to the newly made backup directory.
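One shape that one-liner can take (a sketch with hypothetical /tmp paths; -mtime -2 means "modified less than 48 hours ago"):

```shell
# Hypothetical source tree and a fresh backup directory
src=/tmp/backup_src_demo
backup=/tmp/backup_dest_demo
mkdir -p "$src" "$backup"
touch "$src/changed_recently.txt"

# Copy everything modified within the last 48 hours into the backup dir
find "$src" -type f -mtime -2 -exec cp {} "$backup/" \;
```

This flattens the tree into one directory; with GNU cp, `cp --parents` preserves the original directory layout instead.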
This hard disk is an old 60 GB Maxtor extracted from an old computer, working fine. I placed it in an enclosure and connected it to my Linux box (Slackware 12.1). It was recognized immediately and mounted automatically. However, before giving away the hard disk, I tried to wipe the data by running shred /dev/sdf. At some point I realized it was going to take a long time, so I aborted the process and disconnected the enclosure. Now none of my Linux boxes recognize the disk inside the enclosure. Here is the output when I plug in the USB cable while using SystemRescueCD on my desktop computer:
Quote:
Aug 20 00:58:17 sysresccd kernel: [ 3582.116029] usb 2-6: new full speed USB device using ohci_hcd and address 12
Aug 20 00:58:17 sysresccd kernel: [ 3582.316054] usb 2-6: not running at top speed; connect to a high speed hub
Aug 20 00:58:17 sysresccd kernel: [ 3582.322051] usb 2-6: New USB device found, idVendor=04b4, idProduct=8613
Aug 20 00:58:17 sysresccd kernel: [ 3582.322054] usb 2-6: New USB device strings: Mfr=0, Product=0, [Code]....
Is there a way to use this hard disk at this point?
Consider the following:
Code:
mount | grep home
# ... /home ... type reiserfs ...
rm -Rf /home/user/over_9000_little_and_big_super_secret_files/
# oops, I should have shredded it instead.
How can I properly and securely "initialize" the free space, to ensure that no additional information can be restored by digging through it (preferably without stopping or disturbing the filesystem much)? Is dd if=/dev/frandom of=/home/qqqqq really secure for this (tails, journal, etc.)?
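For reference, the usual brute-force approach is to fill the free space with random data and then delete the fill file; sfill from the secure-delete package does the same job more carefully (including inode and slack space). A minimal sketch, capped at 10 MB here so it doesn't actually fill the disk:

```shell
# Write random data into a fill file on the target filesystem
# (drop 'count' in real use so dd runs until the disk is full), then
# delete the fill file to release the now-overwritten space.
dd if=/dev/urandom of=/tmp/fillfile bs=1M count=10 2>/dev/null
sync
rm /tmp/fillfile
```

As the question suspects, this does not touch the journal or the tails of existing files, which is why sfill, or wiping the whole device, is the safer answer on reiserfs.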
I considered making my system run the following if an incorrect password is entered 10 times in a row, or if a specific dead-man password is entered: Code: shred /home/.ecryptfs/$USER/.ecryptfs/wrapped-passphrase. Because ext4 journals only the metadata, not the contents of the file, the file would be shredded and it would be impossible to recover the encrypted home folder even with the password. Is there a simple way I could make GDM check this, or would I have to patch and recompile GDM for something like this to work?
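One route that might avoid patching GDM: GDM authenticates through PAM, and pam_exec can run an external program on authentication events. The fragment below is a rough, untested sketch — the module line placement, the faillog column parsing, and the threshold handling are all assumptions, not a verified recipe:

```shell
# In /etc/pam.d/gdm, after the existing auth lines (assumption, untested):
#   auth [default=ignore] pam_exec.so /usr/local/sbin/check-dead-password.sh
#
# /usr/local/sbin/check-dead-password.sh (sketch):
#!/bin/sh
# pam_exec exports PAM_USER; faillog tracks consecutive login failures.
fails=$(faillog -u "$PAM_USER" | awk 'NR==2 {print $2}')
if [ "${fails:-0}" -ge 10 ]; then
    shred -u "/home/.ecryptfs/$PAM_USER/.ecryptfs/wrapped-passphrase"
fi
```

Test this with a throwaway account first — a bug here locks you out of your own data by design.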
I don't use the Trash bin because, from a security point of view, it does not really delete things. Instead, I've gotten used to shred and secure-delete. But for moving files around, cut-and-paste is very handy, and I was wondering whether items from the clipboard get stored somewhere. I realize they get overwritten again and again in the clipboard, but do they also get stored somewhere else?
I use nautilus-actions to create a right-click shred command with these parameters: -f -u -v -z %M
I thought the -v would give me some feedback, as well as an "are you sure" dialog before deleting. But when I run shred it just deletes the file without any feedback and no confirmation beforehand.
How can I get a confirmation prompt before shredding occurs (to prevent me from accidentally shredding something; sometimes I click the wrong item because the mouse shifts at the last second)?
And why don't any icons ever appear in my context menu, even though I'm assigning icons?
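On the confirmation question: shred has no interactive mode, and -v only prints progress, which a Nautilus action swallows anyway. One hedged workaround is to point the action at a small wrapper script that asks first via zenity (an assumption: zenity is installed, which it normally is on GNOME). The script name and path below are made up:

```shell
#!/bin/sh
# Hypothetical wrapper: save as ~/bin/shred-confirm, mark it executable,
# and use it in the nautilus-actions command in place of shred itself,
# still passing %M as the argument.
if zenity --question --text="Really shred: $*?"; then
    # -f force, -u unlink, -v verbose, -z final zero pass
    shred -f -u -v -z "$@" 2>&1 | zenity --text-info --title="shred output"
fi
```

The second zenity call also solves the "no feedback" problem by showing shred's -v output in a window when it finishes.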
I am really new to Linux and probably getting in over my head, but you have to start somewhere. I'm trying to make a shell script to run on Puppy Linux that shreds all drives connected to it. I have been able to use the command successfully to wipe all connected drives, but I can't figure out how to write a script for it. I have several hard drives and want to shred-wipe them before using them.
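A minimal sketch of such a script; the device names below are placeholders, and echo is left in front of shred as a safety catch so nothing is wiped until you remove it:

```shell
#!/bin/sh
# List the disks to wipe explicitly (hypothetical names; on Puppy you can
# check what's attached with: fdisk -l)
DISKS="/dev/sda /dev/sdb"

for dev in $DISKS; do
    # Remove 'echo' once you have verified the list is correct.
    echo shred -v -z "$dev"
done
```

Triple-check DISKS before arming it — with the echo removed, this destroys every listed drive, including the one Puppy booted from if you list it.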
Is it possible to see not only the progress of the shredding, but the time elapsed/remaining as well? It would be great to see the elapsed and remaining time along with the percentage complete.
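shred itself only reports per-pass percentages with -v. One workaround is to do a random overwrite pass with dd instead, whose status=progress option (an assumption about your setup: it needs GNU coreutils 8.24 or newer) reports bytes written, elapsed seconds, and throughput; pv adds a proper ETA if you have it installed. A 10 MB demo file stands in for the real target:

```shell
# Single random pass over a demo file, with live statistics on stderr
dd if=/dev/urandom of=/tmp/wipe_demo bs=1M count=10 status=progress conv=fsync
rm /tmp/wipe_demo

# With pv installed (separate package, not coreutils), you also get ETA:
#   pv -petr /dev/urandom | dd of=/dev/sdX bs=1M
```

Note this is a single pass, not shred's multi-pass pattern, so it trades thoroughness for visibility.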
I often want to extract some info using awk from a variable/filename while running other things using xargs and sh. Below is an example: Code: ls -1 *.txt | xargs -i sh -c 'NEW=`echo $0 | awk -F'_' '{print $1}'`; echo $NEW' {}
In the above case I would like to grab just the first field from a filename (delimited by '_') from within an sh command. This is a simplified example, where normally I would be doing some further data processing with the sh command(s).
The error message that I get is: Code: }`; echo $NEW: -c: line 0: unexpected EOF while looking for matching ``' }`; echo $NEW: -c: line 1: syntax error: unexpected end of file. I haven't been able to figure out how to escape the awk command properly.
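The inner single quotes around _ are the problem: they terminate the outer single-quoted string early, which is what leaves the shell hunting for a matching backquote. A sketch of two working variants (the demo file name is made up):

```shell
mkdir -p /tmp/awk_quote_demo
touch /tmp/awk_quote_demo/first_second.txt
cd /tmp/awk_quote_demo

# Variant 1: double-quote the awk delimiter and escape $1 so the inner
# shell, not the outer one, hands it to awk
ls -1 *.txt | xargs -i sh -c 'NEW=$(echo "$0" | awk -F"_" "{print \$1}"); echo "$NEW"' {}

# Variant 2: skip awk entirely with shell parameter expansion
ls -1 *.txt | xargs -i sh -c 'echo "${0%%_*}"' {}
```

Both print `first`. Variant 2 scales better when the "further processing" is also plain shell, since it avoids a pipeline per file.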
I'm trying to copy a list of files, excluding any file with ".log" in the name, to another folder. It runs correctly when I am in the Source folder, but not from any other location. Code: cd /home/me/Source; ls /home/me/Source -1 | grep -v "^.*log$" | xargs -n 1 -iHERE cp -r HERE /home/me/Destination How can I specify both the Source and Destination folders?
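The catch is that ls prints bare filenames, so the cp only resolves when the current directory happens to be Source. find prints full paths, which makes the pipeline work from anywhere; a sketch with hypothetical /tmp stand-ins for Source and Destination:

```shell
src=/tmp/copy_src_demo
dst=/tmp/copy_dst_demo
mkdir -p "$src" "$dst"
touch "$src/keep.txt" "$src/skip.log"

# find emits full paths, so this runs correctly from any directory;
# ! -name "*.log" replaces the grep -v filter
find "$src" -maxdepth 1 -type f ! -name "*.log" -exec cp {} "$dst/" \;
```

-maxdepth 1 keeps the behavior of the original ls (top level only); drop it to recurse.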
I used the command shown below to remove a list of software using yum. It worked, but is there a way of doing this without using the -y option? I would like to review the results before the transaction takes place. I would also like to use the same method for installing additional software after a clean install. Code: cat filename | xargs yum -y remove
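Dropping -y alone won't work here: yum's confirmation prompt reads from stdin, and the pipe from cat has already taken stdin over, so yum sees end-of-file and aborts. Passing the list as arguments avoids the pipe entirely (echo stands in for yum below so the sketch is runnable anywhere):

```shell
# Hypothetical package list
printf 'pkg-one\npkg-two\n' > /tmp/pkglist

# The real command would be:  yum remove $(cat /tmp/pkglist)
echo yum remove $(cat /tmp/pkglist)
```

The same shape works for installs: `yum install $(cat filename)`. GNU xargs also offers `-a filename`, which reads the list from a file instead of stdin and so leaves stdin free for yum's prompt: `xargs -a filename yum remove`.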
I want to insert a picture into a group of pictures (I saved their names in a text file), producing a new group of pictures (whose names I also saved in another text file), but I am having trouble doing that. I want to write something like this:
I get this behavior on Slackware 13.37, which includes BASH 4.1.010. Yes, BASH is my shell. I have a file called a.flac and I'm in the directory that contains it.
The output of the ls command is expected: Code: ls *.flac gives: Code: a.flac
Removing the extension with basename works as expected: Code: basename a.flac .flac gives: Code: a
Putting the above command in a variable substitution works as expected: Code: echo `basename a.flac .flac` gives: Code: a
Using xargs with ls and a variable substitution works as expected: Code: ls *.flac | xargs -i echo `echo {}` gives: Code: a.flac
However, when I try to add the basename command to the above command, it stops working. Code: ls *.flac | xargs -i echo `basename {} .flac` gives: Code: a.flac
Whereas the result I expect is: Code: a Why is it not working, and how do I make it work?
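The backquotes are the culprit: the shell expands `basename {} .flac` once, before xargs ever runs, and basename of the literal string {} is just {}, so xargs ends up running a plain echo of each filename. Deferring the basename call into a per-file shell makes it work (the demo directory is made up):

```shell
mkdir -p /tmp/flac_demo
touch /tmp/flac_demo/a.flac
cd /tmp/flac_demo

# basename now runs *after* xargs substitutes the filename into $0
ls *.flac | xargs -i sh -c 'basename "$0" .flac' {}
```

This prints `a`. In bash, a loop with parameter expansion avoids the extra processes entirely: `for f in *.flac; do echo "${f%.flac}"; done`.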
I'm trying to find all zip files timestamped within the past 7 days, then unzip them into a different directory. I tried the following, but it only unzipped one of the three files that meet the 7-day criterion. What am I missing? Code: find /home/user/public_html/zip_files/ -iname "*.zip" -mtime -7 -print0 | xargs -n10 unzip -LL -o -d /home/user/public_html/another_directory/
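Two things stand out: -print0 output needs xargs -0 to parse it, and unzip treats its second and later arguments as member names to extract from the *first* archive, so batching up to ten zips per call (-n10) means only the first in each batch is really unzipped. Forcing one archive per invocation fixes it; the sketch below uses echo in place of unzip so it runs without the real files:

```shell
mkdir -p /tmp/zip_demo
touch /tmp/zip_demo/a.zip /tmp/zip_demo/b.zip /tmp/zip_demo/c.zip

# -0 pairs with -print0; -n1 spawns one unzip per archive (echo stands
# in for: unzip -LL -o -d /home/user/public_html/another_directory/)
find /tmp/zip_demo -iname "*.zip" -mtime -7 -print0 \
    | xargs -0 -n1 echo unzip -LL -o -d /tmp/zip_out
```

`find ... -exec unzip -LL -o -d DIR {} \;` is an equivalent one-command form.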
I would like to ask the following: 1) ls -l | grep test -> this greps every "ls -l" output line 2) ls -1 | xargs grep test -> this greps every single file for test 3) ls -1 | xargs echo -> this echoes the directory listing 4) ls -1 | echo -> this does nothing!
My question is: how can a command receive input from "both sides"? (grep can grep the whole output stream, or, via xargs, every single file; the same holds for e.g. the wc command.) And 4): why does echo do nothing there (it's a single echo command)?
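The distinction is stdin versus arguments: filters like grep and wc read their standard input, while echo ignores stdin entirely and only prints its command-line arguments; xargs is the bridge that turns a stream on stdin into arguments. A small demonstration:

```shell
# grep/wc consume the pipe directly:
printf 'one\ntwo\n' | wc -l          # prints 2

# echo never reads the pipe, so the piped data simply vanishes:
printf 'one\ntwo\n' | echo           # prints just an empty line

# xargs converts the stream into arguments for echo:
printf 'one\ntwo\n' | xargs echo     # prints: one two
```

So case 4 "does nothing" because echo ran with no arguments; the ls output was written into a pipe nobody read.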
I have an awk program that finds all files of a specific filename and deletes them from selected subdirectories. There is logic in the awk to avoid certain subdirectories, and this is initialized via a parameter in the beginning statement of the awk. The parameter should have all of the subdirectory names at the top level. This varies from time to time, so I cannot hard-code the value.I'm having a problem initializing the awk parameter using sed. I'm setting a variable (named subdir) using an "ls" command piped to "xargs". I'm then trying to substitute that value into the awk using the sed command.
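It may be simpler to drop the sed step entirely: awk can take the value straight from the shell with -v, which sidesteps all the quoting that a sed substitution into the awk source needs. A sketch with a hypothetical directory list standing in for the real top-level names:

```shell
# Hypothetical top-level directories, gathered the same way as in the post
mkdir -p /tmp/awkv_demo/dir1 /tmp/awkv_demo/dir2
subdir=$(ls /tmp/awkv_demo | xargs)

# -v hands the shell variable to awk as an ordinary awk variable,
# available from the BEGIN block onward
awk -v skip="$subdir" 'BEGIN { print "excluding: " skip }'
```

Inside the real program, `split(skip, names, " ")` turns the value back into the per-directory list the avoidance logic needs.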
I have the following spec: - Intel Core 2 Duo 1.86Ghz - 4GB Ram - 1TB Hard Disk - 256MB Nvidia GeForce Graphics Card
I believe my system is quite fast, yet programs like JDownloader and Vuze (aka Azureus) are very slow, more so on the GUI side than in program functionality (although I have not tested the latter either). If I scroll down a list of items, it takes ages; it lags and response time is quite bad. Also, I have two Java processes running, each using 200 MB of memory; I assume one is Vuze and the other JDownloader. It seems to me that Windows XP/Windows 7 ran Java applications much faster and with less RAM. I definitely saw that with Vuze, as it never went into the 200 MB range under Windows.
I'm an Ubuntu user since Jaunty, and I've always upgraded my system (NOT fresh installs). Everything went fine, but yesterday I upgraded to Lucid. My only concern, for now, is startup time. I'm a desktop user (Core2@3GHz), so I think I should boot in less than 10 seconds. Anyway, boot time is 30 seconds: not too much, but there is definitely something wrong with tools I don't know (ureadahead, plymouth, etc.). Attached is my bootchart: can anyone explain to me what's wrong?
Also, I don't even see a Plymouth Ubuntu-themed bootsplash: I only see a blank (black) screen for seconds, then the bootsplash for less than half a second, then GDM appears. Not crucial, I know, but how can I fix it? (I don't know if it's related, but I can see the animation at shutdown.) Finally, the GNOME desktop takes too long to load. I don't know why, but there are 15-30 seconds between the login sound and a usable desktop (with panels and icons, I mean). Please help me, I don't want to do a fresh install. Boot speed is not a big deal for desktops, I know, but it can be a symptom that my system is a bit messy (and I don't like that, since I installed Jaunty less than a year ago). (I forgot: I also installed GRUB 2 by hand.)
Whenever I transfer a movie onto my 16GB USB flash disk, my whole system becomes Windows-like and unusable!
When I drag the file(s) into the USB disk folder, it starts out fine and pretty darn fast (25 MB/sec), then slowly decreases until it's unbearably slow (3 MB/sec), and as a side effect my whole system starts deteriorating. I basically have to wait for the file to finish transferring before I can use my desktop again!
This has been happening with every version since Karmic (all 64-bit). I put up with it because I didn't use the USB stick that much, but lately it's been my go-to source for transferring large files to and from work.
I have good experience in the Microsoft environment and am now trying to use Linux. I tried Ubuntu 9.10 and openSUSE on different computers, but there is the same big problem: very slow download speeds compared to Microsoft. The same file, downloaded at the same time under Windows XP, took an incomparably short time. For example, a 5.5 MB file attached to an e-mail on Yahoo took about 1 minute to download on the Windows XP computer; the same file on the same computer, but running Ubuntu, takes more than 30 minutes!
My wireless seems to be fast for a good 30 seconds, then bang, it takes a good while to load the next page, almost as if it's disconnecting and then reconnecting/scanning/reconnecting. Why can't it stay connected? I have WPA-PSK security. Here are my network settings; please let me know if I should change any of them. (Side note: is there a way to keep this problem from occurring so frequently? The wiki says it should only occur once in a while.) https:[url].....
I have 4 Linux machines in a cluster. My goal is to find every IP address (xxx.xxx.xxx.xxx) in every file on the Linux system. Remark: I need to scan each file on the system, check whether it includes an IP address, and if so print the IP as the following
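A starting-point sketch using grep's extended regexes; note the pattern is deliberately loose (it also accepts octets above 255), and the demo file is invented:

```shell
mkdir -p /tmp/ip_demo
echo "server 192.168.1.10 responded" > /tmp/ip_demo/messages

# -r: recurse into the tree, -E: extended regex, -o: print only the
# matched IPs (recursive grep prefixes each match with its filename)
grep -rEo '([0-9]{1,3}\.){3}[0-9]{1,3}' /tmp/ip_demo
```

Pointed at / on a whole machine, add -I to skip binaries and expect it to take a long time; run the same command on each of the 4 nodes (e.g. over ssh) to cover the cluster.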
I am trying to do a find/grep/wc command to find matching files, print the filename and then the word count of a specific pattern per file. Here is my best (non-working) attempt so far:
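For the filename-plus-count part, grep can do both jobs at once: -c counts matching lines per file, and -H forces the filename prefix even when only one file is passed. (Caveat: -c counts lines that match, not total occurrences of the pattern.) A sketch on an invented file:

```shell
mkdir -p /tmp/count_demo
printf 'foo\nbar\nfoo\n' > /tmp/count_demo/a.txt

# Prints filename:count for every file find hands over, no wc needed
find /tmp/count_demo -type f -name "*.txt" -exec grep -cH foo {} +
```

If you really need total occurrences (multiple hits per line), `grep -o pattern file | wc -l` per file is the usual substitute.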
Is there a way to specify to find that I only want text files (and not binary files)? Grep has an option to exclude binary files, so I thought find probably has a similar feature, but I've been unable to find it.
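find indeed has no text/binary test of its own, but you can lean on the grep option you already know from inside find: with -I (skip binary) and -l (list matching files), an empty pattern makes grep print exactly the files it classifies as text. A sketch:

```shell
mkdir -p /tmp/text_demo
printf 'hello\n' > /tmp/text_demo/notes.txt
head -c 16 /dev/zero > /tmp/text_demo/blob.bin   # NUL bytes => binary

# Lists only the files grep considers text; the empty pattern matches
# every line, and -I suppresses anything detected as binary
find /tmp/text_demo -type f -exec grep -Il '' {} +
```

The file(1) utility is the more thorough classifier if grep's heuristic (a NUL in the first block) isn't enough, at the cost of one process per file.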