Programming :: Formatted Output Of All The Files Under A Particular Directory?
Jul 7, 2010
I wanted formatted output of all the files under a particular directory, and I am trying to use find. Something like: find -P ./ -type f -name '*.cpp' -printf "%p ". I want all the files with specific extensions such as .c, .cpp and .h to be printed out, separated by spaces. One more thing I want is absolute path names instead of relative ones.
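A hedged sketch of one way to do it with GNU find; starting the search from an absolute path makes %p print absolute names, and -o groups the extensions:
Code:
# absolute starting point => absolute paths; extensions grouped with -o
find "$(pwd)" -type f \( -name '*.c' -o -name '*.cpp' -o -name '*.h' \) -printf '%p '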
View 5 Replies
Nov 14, 2010
There are millions of files in many directories. Whenever I try rm *, find, or xargs, they say 'argument list too long' and exit. How can I delete the files in a directory with so many files without deleting the directory itself?
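A common approach that never builds a long argument list, assuming GNU find (the -delete action handles each file internally):
Code:
# remove only the files, leaving the directory tree in place
find /path/to/dir -type f -delete
# alternative: hand names to rm in manageable batches
find /path/to/dir -type f -print0 | xargs -0 rm -f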
View 3 Replies
View Related
Jan 3, 2009
I am trying to find log files by date (day/month/year), and then I want to copy those files to another directory whose name is the date I searched for (day/month/year).
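A rough sketch, assuming GNU find and that the log files for a given day are those last modified on that day; the date, log path and name pattern are placeholders:
Code:
day="2009-01-03"
mkdir -p "/backup/$day"
# modified on that day: newer than its midnight, not newer than the next midnight
find /var/log -type f -name '*.log' -newermt "$day" ! -newermt "$day +1 day" \
    -exec cp -p {} "/backup/$day/" \;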
View 4 Replies
View Related
Mar 24, 2011
I want the output of a program to go to 2 different files but not to standard out. Is there a way to do this in bash? I know that in Z shell it's really easy; something like: Code: echo "test" >> file1 >> file2 would work. But in Bash it doesn't seem that easy. I know that tee will send the output to 2 files, but it also sends it to STDOUT. Something like: Code: echo "test" | tee -a file1 file2 would put the word "test" in file1, file2, and STDOUT. Is there a way to just send the output to file1 and file2?
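One simple workaround, assuming it is acceptable to throw away the copy tee writes to stdout:
Code:
# both files receive the text; nothing reaches the terminal
echo "test" | tee -a file1 file2 > /dev/null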
View 2 Replies
View Related
May 21, 2011
I have a huge database of students, and I would like to extract these data and write an individual file for each student.
I am running a loop in a shell program (.sh file); the output of each run in the loop needs to be redirected to a file with a variable name.
I tried the following line, but it did not work, where BodyMsg is the data and RollNo is the student's roll number.
echo $BodyMsg > $RolNo".html"
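A hedged sketch of the usual form, assuming the RolNo/RollNo mismatch is just a typo and both variables are set inside the loop; quoting and braces keep the expansion unambiguous:
Code:
# one output file per student, named after the roll number
echo "$BodyMsg" > "${RollNo}.html"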
View 5 Replies
View Related
Jan 26, 2011
I have written a one-line command that parses a file, locates the IP addresses in it, trims the output the way I want, sorts numerically and by uniqueness, and then >> appends to output.txt.
I can get all the IPs into one file, "output.txt", but what I am really looking for is some way to create a text file for each IP it finds, named xxx.xxx.xxx.xxx.txt, and also to put that IP address into that file.
xxx.xxx.xxx.xxx = the ip address it finds
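A minimal sketch, assuming the existing pipeline already leaves one unique IP per line in output.txt:
Code:
# create xxx.xxx.xxx.xxx.txt for every address, with that address as its content
while read -r ip; do
    echo "$ip" > "${ip}.txt"
done < output.txt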
View 14 Replies
View Related
Oct 26, 2010
To redirect standard output to multiple files:
Code:
echo Test | tee file1 file2
My problem is that the word "Test" still displays on the screen. I want the same effect as:
Code:
echo Test > file1
but with multiple file redirection.
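The usual fix, sketched on the assumption that discarding tee's copy on stdout is fine:
Code:
# file1 and file2 both get "Test"; nothing is printed to the terminal
echo Test | tee file1 file2 > /dev/null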
View 3 Replies
View Related
Aug 23, 2010
I am trying to develop a method of reading files generated by other programs, and I am trying to find the most versatile approach. I have been trying bash and have been making good progress with sed; however, I was wondering whether there is a "standard" approach to this sort of thing. The main features I would like to implement concern finding strings based on various forms of context and storing them in variables and/or arrays. Here are the most general tasks:
a) Read the first word(or floating point) that comes after a given string (solved in another thread)
b) Read the nth line after a given string
c) Read all text between two given strings
d) Save the output of task a), task b) or task c) (above) into an array if the "given string(s)" is/are not unique.
e) Read text between two non-unique strings, i.e. text between the nth occurrence of string1 and the mth occurrence of string2
As far as I can tell, those five tasks should cover just about any text pattern. I am by no means fluent in these languages, but I could use a starting point. My main concern is speed. I intend to use these scripts in a program that reads and writes hundreds of input and output files, each with a different value of some parameter(s).
The files will most likely be no more than a few dozen lines, but I can think of some applications that could generate a few hundred lines. I have the input file generator down pretty well. Parsing the output is quite a bit trickier. And, of course, the option for parallelization will be very desirable for many practical applications.
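A few hedged awk/sed sketches for tasks a) through c), assuming GNU tools; MARKER, START and END stand for whatever context strings the real files use:
Code:
# a) first word after a given string (MARKER assumed to appear as a whole word)
awk '{for(i=1;i<NF;i++) if($i=="MARKER"){print $(i+1); exit}}' file
# b) the nth line after the line containing MARKER (here n=3)
awk '/MARKER/{target=NR+3} NR==target{print; exit}' file
# c) all text between START and END (exclusive)
sed -n '/START/,/END/{/START/d;/END/d;p}' file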
View 14 Replies
View Related
Nov 2, 2010
I have encountered a problem: I have a "while" loop; at each run a set of outputs is produced, but then I need to move them into a corresponding folder, otherwise on the next run the new outputs will be overwritten. Furthermore, I need to pipe what I see on the screen into a file as well. I have put my code in the following:
Code:
# !/bin/bash
jf="GeoQuery.jar"
[code]...
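Independent of the elided code, a rough sketch of both ideas; the loop condition, the jar invocation and the output names are placeholders:
Code:
jf="GeoQuery.jar"
run=1
while [ "$run" -le 10 ]; do               # placeholder loop condition
    outdir="run_$run"
    mkdir -p "$outdir"
    java -jar "$jf"                       # hypothetical invocation producing output files
    mv -f output*.txt "$outdir"/          # shift this run's outputs so they are not overwritten
    run=$((run + 1))
done 2>&1 | tee session.log               # everything shown on screen also lands in session.log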
View 3 Replies
View Related
Dec 7, 2010
I have a program which manages my PDFs and references. I wish to put some of the information on my website, but that program (Mendeley) only exports to XML (or BibTeX). I'd like to simply convert the XML output files to SQL in order to create or update an SQL database. I'm not an expert in either XML or SQL (I only use phpMyAdmin). Can someone help me figure this out?
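One possible direction, sketched with xmlstarlet; the element names (document, title, year) and the table layout are pure guesses, since the actual Mendeley export structure isn't shown:
Code:
# one pipe-separated record per entry
xmlstarlet sel -t -m '//document' -v 'title' -o '|' -v 'year' -n export.xml > refs.txt
# then, from the mysql client, load it into an existing table, e.g.:
#   LOAD DATA LOCAL INFILE 'refs.txt' INTO TABLE refs
#   FIELDS TERMINATED BY '|' (title, year);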
View 2 Replies
View Related
Nov 8, 2010
I am going through a multi-step process to produce output files, which involves 25,000 greps at one stage. While I do achieve the desired result, I am wondering whether the process could be improved (sped up and/or decluttered). Input 1: a set of dated files called ids<yyyy><mm> containing numeric IDs, one per line, 280,000 lines in total:
Code:
123456
999996
[code]....
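Without seeing the rest of the pipeline it's hard to be specific, but the usual way to replace thousands of individual greps is a single pass with a pattern file or an awk lookup; a hedged sketch (datafile is a placeholder):
Code:
# collect every id once, then scan the data in one grep instead of 25,000
cat ids?????? | sort -u > all_ids.txt
grep -F -f all_ids.txt datafile > matches.txt
# or, for exact matches on the first field, load the ids into an awk hash
awk 'NR==FNR {want[$1]=1; next} ($1 in want)' all_ids.txt datafile > matches.txt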
View 14 Replies
View Related
Apr 8, 2010
I was preparing a script which will remove all files from a directory that are more than 24 hours old. I tried something like this: find . ( -name 'log.*' -mtime +1 ) -exec rm {}; but it is throwing an error like: missing argument to -exec.
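The parentheses and the terminating semicolon are special to the shell, so find never receives them unless they are escaped; a sketch of the corrected call (note that -mtime +1 actually means "more than two full days old", while -mtime +0 is "older than 24 hours"):
Code:
# escape ( ) and ; so they reach find instead of the shell
find . \( -name 'log.*' -mtime +0 \) -exec rm {} \;
# or let find batch several names per rm
find . -name 'log.*' -mtime +0 -exec rm {} +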
View 8 Replies
View Related
Mar 26, 2010
I am trying to get the total file size for certain files per directory. I am using: find `pwd` /DirectoryPath -name '*.dta' -exec ls -l {} ; | awk '{ print $NF ": " $5 }' > /users/cergun/My Documents/dtafiles.txt but this lists all the files in the directories. I need the total per directory for all .dta files.
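A hedged sketch that sums the .dta sizes one directory at a time, assuming GNU find/awk and that the directory names contain no spaces; the starting path is a placeholder:
Code:
# print "<directory>: <total bytes of *.dta files in it>"
find /DirectoryPath -type f -name '*.dta' -printf '%h %s\n' |
awk '{ total[$1] += $2 } END { for (d in total) print d ": " total[d] }' | sort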
View 1 Replies
View Related
Jan 3, 2011
find . -type f -name "*" -print0 | xargs -0 --replace=% mv % `pwd`
works fine at moving all files anywhere below the current directory to the current directory.
My question is: can it be modified so it only moves all the files up one directory level? Otherwise I shall have to cd 100 or so times and run it in each directory I want to compress...
I imagine the directory below which the file is stored is in the % somewhere; it is just a case of extracting it and applying it to the mv command, yes?
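A hedged way to move each file up exactly one level, using dirname twice to find the grandparent; run it from the top of the tree being flattened by a single layer:
Code:
# move every file into the parent of the directory it currently sits in
find . -mindepth 2 -type f -print0 |
while IFS= read -r -d '' f; do
    mv -n "$f" "$(dirname "$(dirname "$f")")/"
done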
View 8 Replies
View Related
Apr 7, 2011
Word Count for all files in a directory
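Assuming "word count" means wc's notion of words, a couple of hedged one-liners:
Code:
# per-file counts plus a grand total for the current directory
wc -w *
# recurse through subdirectories as well
find . -type f -exec wc -w {} +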
View 1 Replies
View Related
Nov 19, 2008
How can I write a script that will go through the files in a directory and print on the screen the name of each entry, stating whether it is a file or a directory? The directory is already provided by the user.
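A minimal sketch, assuming the directory name arrives as the script's first argument:
Code:
#!/bin/bash
dir="$1"
for entry in "$dir"/*; do
    if [ -d "$entry" ]; then
        echo "$entry is a directory"
    elif [ -f "$entry" ]; then
        echo "$entry is a file"
    fi
done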
View 12 Replies
View Related
Feb 14, 2011
When I run: make -f mymakefile clean I get: rm -f mybinary *.so.* *.dep *.o mybinary.sym and all the above files are removed from the current directory. But I have a directory /src/ where all my source files are located. The *.o files in this directory are not removed.
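The clean rule only names patterns relative to the directory make runs in, so objects under src/ are never matched; a hedged sketch of an extended rule (the real makefile's names may differ, and the rm line must be indented with a tab):
Code:
clean:
	rm -f mybinary *.so.* *.dep *.o mybinary.sym src/*.o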
View 4 Replies
View Related
Nov 6, 2010
I am trying to write a simple script to list all the files in a directory. The script I wrote is below, where pdb_files is the directory and all the files which I want to list are in that folder.
Code:
files=`ls -F pdb_files/*THERMO*`
for inFiles in $files
do
echo $inFiles
[Code]....
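A hedged rewrite that avoids parsing ls (the -F flag appends marker characters such as * and / to the names, which then leak into the loop); letting the shell glob also copes with names containing spaces:
Code:
#!/bin/bash
for inFile in pdb_files/*THERMO*; do
    [ -e "$inFile" ] || continue   # skip the literal pattern when nothing matches
    echo "$inFile"
done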
View 5 Replies
View Related
Jun 23, 2010
I once had a script that, when run, would find the first 800GB of files in a directory (including subdirectories) and write them to a file (i.e. ./800gb.sh > manifest.txt). I used this to create manifests of 800GB worth of data from large directories in order to dump them to tape (LTO4). I'm sure it has got to be a pretty simple script, but I am not very skilled at writing bash scripts.
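A hedged sketch that walks the tree, keeps a running byte total and stops listing once roughly 800GB has been reached; it assumes GNU find and that "first" simply means the order find visits the files:
Code:
#!/bin/bash
# usage: ./800gb.sh /path/to/dir > manifest.txt
limit=$((800 * 1000 * 1000 * 1000))   # 800 GB in bytes
find "$1" -type f -printf '%s %p\n' |
awk -v limit="$limit" '{
    total += $1
    if (total > limit) exit
    sub(/^[0-9]+ /, "")               # strip the size field, keep the path
    print
}'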
View 4 Replies
View Related
Aug 20, 2010
I have a directory with files in it like: 2006-07-01.foo and 2007-08-04.foo. I need to update the timestamps on these files using "touch -t 200607010000 2006-07-01.foo" on each file in the directory, so I came up with the following one-liner:
for i in `ls -1`; do touch -t `ls -1 | sed -n 's%([0-9]{4})-([0-9]{2})-([0-9]{2})(.*)%1230000%p'` $i; done
My goal was to use sed to get the timestamp for touch and then loop through each file and touch it with that timestamp. However, the script is not giving me the results I intended. Can anyone chime in on what I am doing wrong? I have been banging away at this for a couple of hours now and am clueless about what it could be. I also tried another variant, such as:
for z in $(ls -1 *.foo); do echo $z $(for i in `ls -1 *.foo | sed 's%([0-9]{4})-([0-9]{2})-([0-9]{2})(.*)%1230000%p'`; do echo "$i"; done); done
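Two hedged alternatives: the first uses plain parameter expansion (the filename already contains the digits touch needs), the second keeps sed but feeds it only the current file's name and uses -E so the capture groups actually work, which the original pattern was missing:
Code:
# pure bash: drop the dashes and the extension, append 0000 for midnight
for f in *.foo; do
    stamp="${f%.foo}"        # 2006-07-01
    stamp="${stamp//-/}"     # 20060701
    touch -t "${stamp}0000" "$f"
done
# sed variant
for f in *.foo; do
    stamp=$(echo "$f" | sed -E 's%([0-9]{4})-([0-9]{2})-([0-9]{2}).*%\1\2\30000%')
    touch -t "$stamp" "$f"
done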
View 5 Replies
View Related
Jul 22, 2011
I have about 50 files that the script will operate on; they are all located in the same directory. I need a bash script that will operate on all files in a directory. The script needs to add two lines to the beginning of each file, based on the file name, and one line to the end of the file. For a file named myfile.h it should add these two lines to the beginning of the file:
Code:
#ifndef MYFILE_H
#define MYFILE_H
[code]....
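A hedged sketch of the include-guard wrapper, assuming the closing line to append is a plain #endif and that rewriting the files in place is acceptable:
Code:
#!/bin/bash
for f in *.h; do
    # myfile.h -> MYFILE_H
    guard="$(basename "$f" .h | tr '[:lower:]' '[:upper:]')_H"
    { printf '#ifndef %s\n#define %s\n' "$guard" "$guard"
      cat "$f"
      printf '#endif\n'
    } > "$f.tmp" && mv "$f.tmp" "$f"
done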
View 5 Replies
View Related
Jan 31, 2009
I was wondering if there's a bash script to check the number of files in a directory with an IF statement...
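A small hedged sketch that counts regular files (without recursing) and branches on the count; the directory and the threshold are placeholders:
Code:
count=$(find /some/dir -maxdepth 1 -type f | wc -l)
if [ "$count" -gt 100 ]; then
    echo "directory holds $count files"
fi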
View 8 Replies
View Related
Feb 15, 2010
How can I list only the hidden files in the current directory?
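A couple of hedged one-liners; the first uses globs that skip . and .., the second leans on find:
Code:
# dot-files whose second character is not another dot, plus ..name oddities
ls -d .[!.]* ..?* 2>/dev/null
# anything in the current directory whose name starts with a dot
find . -maxdepth 1 -name '.*' ! -name '.'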
View 2 Replies
View Related
Aug 19, 2010
I have a bunch of files that contain a date, then data. When I use join, I get exactly the output I need. But manually joining the first file with the second, then that output with the third, and so on would take days; I have thousands of files. Can any of you folks help me write a script that would do this? To put it another way, for clarity's sake, here is what I am currently doing:
If I were to repeat that 3000 times the final output would be what I need. I know a simple script would do this for me, but I can't figure out how to write it.
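A hedged sketch that folds join over every file, carrying a running result forward; it assumes the files are already sorted on the join field (the date in column 1) and that data* is the right filename pattern:
Code:
#!/bin/bash
files=(data*)                 # placeholder pattern for the input files
cp "${files[0]}" joined.tmp
for f in "${files[@]:1}"; do
    join joined.tmp "$f" > joined.next
    mv joined.next joined.tmp
done
mv joined.tmp all_joined.txt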
View 12 Replies
View Related
Oct 1, 2010
Code:
<html>
<head>
</head>
<body>
[Code]....
Alright, this works fine to pull the directories/files in the /var/Store/2010/ directory. But when you click on one of the links it tries to open http://'serveraddress'/$filename.
Note that $filename in the URL is the filename clicked on, so the PHP script is working. But I need it to change to that directory, so that you can see the folders/files there and work your way up, down and sideways through the folder tree to where you need to go, rather than trying to open it as a direct URL, which doesn't work.
View 3 Replies
View Related
Nov 25, 2010
I have managed to get it to install the files into the BUILDROOT directory correctly, but when I add the directories under the files list it seems to try to install the files from root. My rpmmacros is as follows:
Code:
%_topdir /home/rpmbuild
# The directory where buildroots will be created.
%_buildrootdir %{_topdir}/BUILDROOT
[code]...
View 1 Replies
View Related
May 16, 2011
I have to create a script to identify those users who have un-sanctioned (forbidden) files in their home directories. I tried something like this (this is a first attempt and I need some opinions):
Code: #!/bin/bash
user_belongs() {
    if groups "$var1" | grep -qw "$var2"    # test the pipeline's exit status directly; no backticks
    then
        return 0
    else
        return 1
    fi
} .....
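For the actual file check, a hedged sketch that scans each home directory for a list of forbidden name patterns; the patterns, the UID cut-off and reading accounts from /etc/passwd are all assumptions:
Code:
forbidden=('*.mp3' '*.avi' '*.torrent')   # placeholder patterns
while IFS=: read -r user _ uid _ _ home _; do
    [ "$uid" -ge 1000 ] || continue        # skip system accounts (assumption)
    for pat in "${forbidden[@]}"; do
        found=$(find "$home" -type f -name "$pat" 2>/dev/null)
        [ -n "$found" ] && echo "$user has forbidden files:" && echo "$found"
    done
done < /etc/passwd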
View 1 Replies
View Related
Dec 29, 2010
I am trying to copy all the files in the directory based on the modification date (i.e. created on Dec 29). I am not able to find the proper command for this. This is what I have tried:
(none) login: root
#
# cd /mnt/hd/
[code]...
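A hedged sketch using GNU find's -newermt to bracket a single day; the date and the source/destination paths are placeholders:
Code:
# copy files last modified on Dec 29, 2010 into /mnt/hd/backup
find /mnt/hd -maxdepth 1 -type f -newermt "2010-12-29" ! -newermt "2010-12-30" \
    -exec cp -p {} /mnt/hd/backup/ \;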
View 8 Replies
View Related
Apr 1, 2011
I want to write a script that will detect all the unwanted files in a particular directory and delete them in one go...
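A hedged sketch, assuming "unwanted" can be expressed as a set of name patterns (the patterns and the directory are placeholders); preview with -print before switching to -delete:
Code:
# preview first
find /some/dir -type f \( -name '*.tmp' -o -name '*~' -o -name 'core' \) -print
# then remove them in one go
find /some/dir -type f \( -name '*.tmp' -o -name '*~' -o -name 'core' \) -delete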
View 11 Replies
View Related
Mar 18, 2010
I am facing a problem in Windows due to a virus called Newfolder.exe, which creates files with the same name as their parent directory plus an .exe extension, and it does this for every directory in the entire hierarchy of the infected pen drive. The antivirus detects them, but it is painfully slow. So I thought this was a good opportunity to use the concepts of the all-mighty shell script to remove them, as they follow the same pattern. Say my complete path is
Code:
/home/pkd/fol1/
The virus would have created a file with the complete path
Quote:
/home/pkd/fol1.exe
If fol1 has two more directories, fol11 and fol12, then there would be two more virus-created .exe files at the following paths:
Quote:
/home/pkd/fol11/fol11.exe
/home/pkd/fol12/fol12.exe
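A hedged sketch of the cleanup: it matches .exe files whose base name equals the name of the directory that contains them, as in the fol11/fol11.exe example (if the virus instead drops the file next to the directory, the comparison would need to use the sibling directory name); the mount point is a placeholder, and the leading echo keeps it a dry run:
Code:
find /media/pendrive -type f -name '*.exe' -print0 |
while IFS= read -r -d '' f; do
    dir=$(dirname "$f")
    if [ "$(basename "$f")" = "$(basename "$dir").exe" ]; then
        echo rm -v "$f"        # drop the echo once the listed files look right
    fi
done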
View 1 Replies
View Related