I'm attempting to find a particular word wherever it appears within a huge directory tree of tiny text files -- and then copy every file containing this word into a "staging" directory.
Whenever I use the command below (on a smaller test folder), it never preserves the directory structure; it just tries to dump all the files directly into the "output" folder.
Code:
cp `grep -ir 'word' *` output
How can I copy all these text files so that they retain some sort of directory structure?
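One way to fix this, as a minimal sketch: grep's -l option prints only the names of matching files (the original -ir printed matching lines, which is why cp received garbage arguments), and GNU cp's --parents flag recreates each file's relative path under the destination.
Code:
mkdir -p staging
# print the names of files containing the word, then copy each with its path intact
grep -rl 'word' . | while read -r f; do
    cp --parents "$f" staging/
done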
I just installed Ubuntu a couple of days back on my netbook. I am still a beginner, enjoying my adventure exploring Ubuntu. I have another desktop which runs XP. I am able to access the XP shared folders from my netbook (Linux). However, I want to copy files, in fact whole folders, from XP using the TERMINAL on my netbook, not copy and paste using my mouse. Are there any commands for it?
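A minimal sketch, assuming the XP machine's address and share name (//192.168.1.5/shared is a placeholder) and that the CIFS mount helper is installed: mount the share once, and from then on ordinary cp works on it like any local folder.
Code:
sudo mkdir -p /mnt/xp
sudo mount -t cifs //192.168.1.5/shared /mnt/xp -o username=xpuser
cp -r /mnt/xp/somefolder ~/Documents/
smbclient offers an FTP-like alternative if you would rather not mount anything.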
I have many files and folders in my source folder. I want to copy some files and folders from that source folder to a destination folder. What do I need to give the "cp" command?
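A minimal example, with source/, file1.txt, dir1, dir2 and dest/ standing in for the real names: list the items to copy and put the destination last; -r is needed as soon as a directory is among them.
Code:
cp -r source/file1.txt source/dir1 source/dir2 dest/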
At the moment I'm using "cp -ap . /destination-folder" to copy everything from the folder I'm in to another folder. That works. Is it possible to cp everything except folder1/ and folder2/ in the current folder I'm in?
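cp itself has no exclude option, but rsync does, and its -a flag preserves much the same attributes as cp -ap. A sketch using the folder names from the question:
Code:
rsync -a --exclude='folder1/' --exclude='folder2/' . /destination-folder/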
I want to copy all folders and files created from 01.01.2011 until today to a new place, i.e.: cp -r /home/moviecar/public_html/wp-content/uploads/ /home/teaser/public_html/wp-content/uploads
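cp cannot filter by date, but find can select by modification time (Linux generally does not record creation time) and hand each hit to cp. A sketch assuming GNU find for -newermt and GNU cp for --parents:
Code:
cd /home/moviecar/public_html/wp-content/uploads/
find . -type f -newermt '2011-01-01' \
    -exec cp --parents {} /home/teaser/public_html/wp-content/uploads/ \;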
I have encountered a problem: I have a "while" loop; at each run a set of outputs is produced, but then I need to shift them into a corresponding folder, otherwise on the next run they will be overwritten by the new outputs. Furthermore, I need to pipe what I have on the screen into a file. I have put my code in the following:
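A minimal sketch of the usual pattern, where produce_outputs and output*.dat are placeholders for whatever the real loop runs and generates: create one folder per iteration, and use tee so the output still appears on screen while also landing in a file.
Code:
run=1
while [ "$run" -le 5 ]; do
    outdir="run_$run"                                  # one folder per iteration
    mkdir -p "$outdir"
    ./produce_outputs 2>&1 | tee "$outdir/screen.log"  # shown on screen AND saved
    mv output*.dat "$outdir"/                          # park this run's files
    run=$((run + 1))
done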
I am developing a Web-based application and have some folders that will generally reside outside of the Web accessible area of the server. However, since some people will not be able to store those folders outside of the "public_html" folder, I am looking to put a blank "index.php" inside of every folder within that section of the application. To make things easier, I would like to know if there might be a way to recursively copy one file into every folder in a certain location.
In other words, is there a command that might do something like:
Code:
cp -R index.php /home/user/public_html/source-files/*
Basically, I want every directory inside of "source-files" to get a copy of "index.php". The directory hierarchy within "source-files" can go at least three or four levels deep, so the command would need to be recursive. I am looking for a command-line statement I can type to perform this action.
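find can visit every directory under the tree, however deep it goes, and run cp once for each. A sketch using the path from the question:
Code:
find /home/user/public_html/source-files/ -type d -exec cp index.php {} \;
Every directory, including source-files itself, ends up with its own copy of index.php.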
I have a 160GB hard drive on which I installed F12; I would like to upgrade to a bigger drive, but I hate having to re-install everything.
Can anyone recommend a good disk copy utility? The utility should be able to copy not only files, but also the boot sector and everything. So I just need to make a copy, change my BIOS to boot from the new drive, and run everything as before.
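dd can make a raw whole-disk clone, boot sector included (Clonezilla is a friendlier alternative). A sketch assuming the old drive is /dev/sda and the new one /dev/sdb; verify both with fdisk -l first, since swapping if= and of= would wipe the source.
Code:
sudo dd if=/dev/sda of=/dev/sdb bs=4M conv=noerror,sync
Afterwards the new drive's extra space can be claimed by growing the last partition with a tool such as gparted.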
I'm using Ubuntu 10.10. I want to copy all the pictures from various folders in my Documents to a single folder. So in Nautilus, I clicked on Places > Search for Files, then chose the Documents folder and typed *.jpg in the search criteria. It found all of my pictures just fine. However, it would not let me copy and paste them into a folder I put on my desktop. Copy is not on the right-click menu and Ctrl+C did not work for the highlighted search results. This is so simple in Windows but it does not seem to work in Ubuntu.
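As a workaround, the same job is quick from a terminal. A sketch assuming the destination folder on the desktop is called Pictures; -iname matches .jpg and .JPG alike.
Code:
mkdir -p ~/Desktop/Pictures
find ~/Documents -iname '*.jpg' -exec cp {} ~/Desktop/Pictures/ \;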
1. I need a script (like a scheduled cron job) which performs an automatic copy (backup) of a directory from an Ubuntu server to Windows, and vice versa....
I tried everything: Samba mounting, FTP, etc. With Samba it is possible to map an Ubuntu directory as a network drive in Windows, but if the Ubuntu server is not available then I can't access the network share.
Exactly this situation is happening to me while backing up the tmp directory of the Ubuntu server...
So in this case I need a permanent backup of the Ubuntu server's tmp directory on Windows, rather than just a network share.
2. Also, any ideas or scripts for automatic FTP (as part of a backup) from an Ubuntu server to Windows that could solve the above issue...
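One common pattern, pushing from the Ubuntu side so the data lands physically on the Windows disk: share a folder on the Windows machine, then let cron mount it and rsync into it. A sketch in which //winbox/backup, the credentials and the paths are all placeholders.
Code:
#!/bin/sh
# /usr/local/bin/backup-tmp.sh -- copy /tmp to a Windows share
# better: use -o credentials=/root/.smbcreds instead of an inline password
mount -t cifs //winbox/backup /mnt/winbackup -o username=backupuser,password=secret
rsync -a /tmp/ /mnt/winbackup/tmp-backup/
umount /mnt/winbackup
A crontab entry such as "0 2 * * * /usr/local/bin/backup-tmp.sh" would run it nightly; because the files end up on the Windows disk itself, they remain available even when the Ubuntu server is down.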
marek@marek$ ls -al /usr/share/solr/
total 36
drwxr-xr-x 5 root root 4096 2010-11-30 08:25 .
....
I want to copy it to ~/solrTest, but I want the files behind the symlinks to be copied as real files as well. When I try cp -r /usr/share/solr/ ~/solrTest, I just get the symlink there.
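cp has a -L (--dereference) flag that follows symlinks and copies the files they point to, rather than the links themselves:
Code:
cp -rL /usr/share/solr/ ~/solrTest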
I am trying to create a simple bash script to rsync some folders within a directory structure. I am using wildcards in the rsync source path, but my command always fails. I believe the problem is the way I am using wildcards within my for loop. Here is my command:
Code:
for seq in `cat test.txt` ; do rsync -nvP /folder/folder/folder/folder/folder/**/$seq /folder/folder/folder/ ; done

This always fails, whereas if I do an ls against the same path, to test it, it always works.
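A likely culprit: in bash, ** only recurses across directory levels when the globstar shell option is on; otherwise it behaves exactly like a single *, so the glob in the script looks just one level deep even though the same path works when typed by hand. A sketch keeping the placeholder paths from the question:
Code:
#!/bin/bash
shopt -s globstar                       # make ** match across directory levels
while read -r seq; do
    rsync -nvP /folder/folder/folder/folder/folder/**/"$seq" /folder/folder/folder/
done < test.txt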
I am attempting to copy a set of sub folders from their multiple parent directories to a new location.
For example, I have three folders to copy:
I would like them to be copied to:
In actuality there are many folders besides folder1, folder2, folder3, and no numerical order exists. So, the folder named 'photos' would be copied under its parent folder's name in a new location. I would need this to happen for every folder in the '/home/user' directory.
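A sketch of a loop that does this, assuming the subfolder is always named photos and /new/location stands in for the real destination:
Code:
for d in /home/user/*/photos; do
    parent=$(basename "$(dirname "$d")")   # e.g. folder1, folder2, ...
    mkdir -p "/new/location/$parent"
    cp -r "$d" "/new/location/$parent/"
done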
I want to make a script to copy all folders older than 7 days from a Linux server to a Windows server via Samba, and make it run automatically using cron.
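A sketch, assuming the Windows share is already mounted at /mnt/winshare and the folders live directly under /data (both placeholders); find's -mtime +7 selects anything not modified in the last 7 days.
Code:
#!/bin/sh
# /usr/local/bin/copy-old-folders.sh
find /data -mindepth 1 -maxdepth 1 -type d -mtime +7 -exec cp -r {} /mnt/winshare/ \;
A crontab line like "30 1 * * * /usr/local/bin/copy-old-folders.sh" would then fire it every night.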
I have noticed that the files and folders search in Unity only shows those files which have been amended (or possibly just opened) since the install.
I was wondering if there is a way I could have the search index (or something vaguely equivalent) all the files on my machine. This is especially important given that I reinstall the OS every six months on a new distribution cycle, copying all my old files across.
Without being able to see my old files the search is pretty much reduced to a recent history search.
I have an account in university on a Linux machine with 10TB of free space, accessible via SFTP. I would like to back up my Windows 7 x64 laptop to the university machine. Currently I am using rsync+cygwin, but the backup is pretty slow (without shadow copy) and I hate the console window appearing on my screen every day when I log in.
So I am looking for something like Windows Backup but with support for SFTP. Combination of tools will work too.
How can I search for the files which contain the word "AM_COLLECTION=22"? I need to know all the files with this string. (I know the grep command can do it but either
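A sketch, with . standing in for whatever directory should be searched: -r recurses, -l prints only the filenames, and -F treats the pattern as a fixed string so the = needs no escaping.
Code:
grep -rlF 'AM_COLLECTION=22' .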
I use grep <pattern> abc.* to read the files. But grep reads the files in the order 1,2,3,4. I want grep to read the files in reverse order, so that the abc.4 file is read first and abc.1 last.
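The shell expands abc.* into a sorted list before grep ever sees it, so the list has to be reversed first; ls -r does that (assuming the filenames contain no spaces).
Code:
grep '<pattern>' $(ls -r abc.*)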
I've discovered that Dolphin seems to lose random files when copying many large folders.
I first noticed this a few months ago when I tried to copy my music library from one folder to another on the same HDD. It consisted of around 600 folders and 6500 files. During the copy there were no errors but after the copy I found that some of the newly copied folders were missing files. I put it down to human error or a glitch.
Yesterday I tried to copy 13 folders containing rips of some of my DVDs. Each folder basically had one film of either 700MB or 1.4GB. Again no errors showed up during the copy but I found 3 of the newly copied folders were empty.
It's not so critical with music or films but I can't afford to lose work data like this.
Has anyone experienced or seen a similar problem with Dolphin? I'm going to have to do some more extensive testing but this is not good.
The first time I noticed the problem I was running KDE 4.3.4 (I think), and the latest occurrence was with KDE 4.4.0.
The problem I have is that I need to replace a more complex string, like this:
Old string: /mnt/stor6-wc2-dfw1/627896/982574/
New string: /mnt/stor8-wc2-dfw1/369587/302589/
There I don't know how to do it, since the / is what separates the old string from the new one, and the strings that I want to replace have / in them. Also, I would like to know how to specify under what folder to replace in files; for example, I want it to search/replace all files under the /var/www/mysite/htdocs folder.
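sed accepts almost any character as the s/// delimiter, so picking one that never occurs in the paths (| here) sidesteps the clash, and find limits the sweep to the chosen folder. A sketch assuming GNU sed for the in-place -i flag:
Code:
find /var/www/mysite/htdocs -type f -exec sed -i \
    's|/mnt/stor6-wc2-dfw1/627896/982574/|/mnt/stor8-wc2-dfw1/369587/302589/|g' {} \;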
I want to traverse a directory and get a list of files that contain a set of patterns. I assumed I could use grep for this, but I'm having trouble getting grep to only return files that match ALL patterns. Here's what I've come up with so far:
However, this gives me a list of files that match ANY of the patterns in the searchpatterns.txt file. I want to match ALL of the patterns. I've looked through the man page, but can't find anything that allows me to change the "OR" to "AND" for multiple patterns.
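grep -f really does OR its patterns together; AND has to be built by hand, for example by testing each file against every pattern in turn. A sketch assuming the patterns sit one per line in searchpatterns.txt:
Code:
find . -type f | while read -r f; do
    ok=1
    while read -r pat; do
        grep -q "$pat" "$f" || { ok=0; break; }   # one miss disqualifies the file
    done < searchpatterns.txt
    [ "$ok" -eq 1 ] && echo "$f"                  # printed only if ALL patterns hit
done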
I am looking for all the files that contain the text string 'moo.sql'. I ran the following:
find . -name '*.php' | grep -lir 'moo.sql' *
Unfortunately it seems to return non-PHP files in addition to PHP files. I thought the find portion would filter the file names so grep would only search PHP files.
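The trailing * is the problem: once grep is given file arguments it ignores stdin entirely, so the list piped in from find was never used and -r made grep search everything in the current directory. Letting find invoke grep fixes it (adding -F would also stop the . in moo.sql from matching any character):
Code:
find . -name '*.php' -exec grep -li 'moo.sql' {} +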
I want to find files containing the "$" char (ASCII 0x24). 'grep -irl $ *' would output the names of every file in path *, of course, because $ means end of line (EOL) to the regex engine. So giving grep the bare string "$" won't do. So I tried 'grep -irl \$ *'. But this doesn't work either and I do not understand why. Am I not escaping the dollar sign? grep should interpret it literally. Neither 'grep -irl "\$" *' will work. Fortunately, there's LQ, besides grep's man page.
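A guess at the double-quoted failure: inside double quotes the shell consumes the backslash in "\$", handing grep a bare $ (the end-of-line anchor) again. Two forms that avoid the shell/regex tangle altogether:
Code:
grep -rilF '$' .      # -F: pattern is a fixed string, no regex meaning at all
grep -ril '[$]' .     # bracket expression: the regex engine takes $ literally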
If I type 'grep alias .bashrc' a whole load of stuff comes up. However, if I type 'grep alias *' nothing comes up. Is there some switch for including 'hidden' files - like the -a switch for ls?
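It isn't grep skipping them: the shell's * glob simply never matches names that start with a dot, so grep never receives the hidden files. Adding .* to the glob covers them; -d skip stops grep complaining about the directories . and .., which .* also matches.
Code:
grep -d skip alias .* *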
I am new to Linux as well as to awk, grep and sed. I need a find-and-replace one-liner or script that loops through an input file (file1), finds each entry in file2, and adds "!" in front of the found string.
Example:
input file (file1):
g+h=o+p
a+b=c+d

file2 (the file to search in):
a+b=c+d1e105
x+y=z+s5e105
g+h=o+pabcdefg
t+r=w+qxvyderf

Output file (file3 should look like this):
!a+b=c+d1e105
x+y=z+s5e105
!g+h=o+pabcdefg
t+r=w+qxvyderf
I have tried many awk and sed methods of find and replace, but they did not work the way I wanted, mainly due to my lack of experience with awk and sed. The program should loop through file1, find each entry in file2, and write the result to file3: first for the set g+h=o+p, then repeating the same process for the set a+b=c+d.
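awk can do the whole thing in one pass: read file1's entries into an array, then prefix any line of file2 that starts with one of them. A sketch that reproduces the file3 shown above:
Code:
awk 'NR==FNR { pat[$0]; next }            # first file: remember every search string
     { for (p in pat)
         if (index($0, p) == 1) { $0 = "!" $0; break }
     } 1' file1 file2 > file3             # the final 1 prints every line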
I've got a quick grep question. I'm trying to work out a command I can use to locate all of the files in a directory that have SQL database connection details. I want to do it by looking for both the string "localhost" and the name of the database. So far I have:
Code:
find . -type f -exec grep -l -E '^(localhost|DATABASE_NAME)' {} \;
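Note that the alternation (localhost|DATABASE_NAME) means OR, so that command lists files containing either string. To require both, chain two greps; DATABASE_NAME stays a placeholder for the real name.
Code:
# -Z and -0 keep filenames with spaces intact between the two greps
grep -rlZ 'localhost' . | xargs -0 grep -l 'DATABASE_NAME'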
Is there a method at the command line to copy files from one location to another and retain the source files' user and group? I'm migrating some MySQL files from one machine to another. I want to back up the original files in the directory first. They have owner:group mysql:mysql, some have root:mysql, and so on. When I copy them at the CLI or in Nautilus, everything changes to root, because I execute sudo cp, or gksudo nautilus and copy via the GUI.
Since it is MySQL data I could simply do a dump of the database and restore it on the other machine. But there are about 20 DBs, and I want to do this via a copy because it will be faster - at least that is what I think.
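When run as root, cp preserves ownership if told to: -a (short for -dR --preserve=all) keeps owner, group, permissions and timestamps, and rsync -a does the same while also working across machines. A sketch with placeholder paths:
Code:
sudo cp -a /var/lib/mysql/ /backup/mysql/                  # -a keeps owner:group intact
sudo rsync -a /var/lib/mysql/ otherhost:/var/lib/mysql/    # needs root on the far end too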