I am attempting to use the zip command with the '-x' option to exclude a folder, e.g. 'zip upload.zip public_html -x public_html/jquery/*'. However, parts of this folder are still being added to the archive. I made a shell script (saved as 'compress.sh' and run as '. compress.sh') to do the archiving so I could test adding nested wildcards for multiple subfolder levels.
Code:
#!/bin/bash
rm -f upload.zip
zip -r upload.zip public_html -x public_html/jquery
...
Each new exclusion line I added with deeper nested wildcards made the archive file size a bit smaller. Adding more /*'s beyond that didn't affect the file size. Even after all this, though, a couple of megabytes of files and folders from the 'jquery' directory were still added to the archive.
Here are some examples of files and folders that were created after I unzipped the archive:

public_html/jquery/js/tablesorter/addons/pager/icons [folder]
public_html/jquery/js/tablesorter/addons/pager/.svn/entries [file]
public_html/jquery/js/tablesorter/build/.svn/text-base/js.jar.svn-base [file]
Why is it that despite all the -x lines, the files and folders like these were still being added to the archive? How can I simply recursively exclude the entire public_html/jquery folder from the archive?
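For what it's worth, a minimal sketch of an exclusion that does recurse. The key detail is quoting the pattern so the shell doesn't expand it first; zip's own * matches across directory levels, so hidden .svn entries and empty folders get caught too:

Code:
zip -r upload.zip public_html -x "public_html/jquery/*"

With the pattern unquoted, the shell expands it to only the visible top-level entries inside jquery before zip ever sees it, which is exactly why dotfiles and empty directories kept slipping into the archive.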
Say, I have a header file containing all required includes:

Code:
/* global.h */
#include <stdio.h>
#include <unistd.h>
...
#include <dirent.h>
#include <signal.h>
...
/* and so on */
I have several modules in a program (*.h and *.c files); each *.h includes global.h, and each *.h is in turn included by its corresponding *.c file. I get strange messages from the compiler, like "warning: implicit declaration of function fdopendir", "error: expected declaration specifiers or '...' before 'siginfo_t'", and "error: 'DT_DIR' undeclared...", even though these types, functions, and constants are all declared in the system headers. What does it mean? The compiler is GCC 4.4.4-2, the system is Fedora 13 x86_64.
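Symptoms like these usually point at feature-test macros rather than missing headers: fdopendir, siginfo_t, and DT_DIR are guarded by macros such as _GNU_SOURCE or _POSIX_C_SOURCE, so glibc hides them unless the right macro is defined before the first system header is included. A minimal sketch, assuming GNU extensions are acceptable (the module name is hypothetical):

Code:
# define _GNU_SOURCE for every translation unit, ahead of all includes
gcc -D_GNU_SOURCE -Wall -c module.c

Equivalently, a '#define _GNU_SOURCE' as the very first line of global.h works, but only if global.h really is included before any other system header; once glibc's features.h has been processed with the default feature set, defining the macro later has no effect.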
How should a bash script look that copies a huge directory with multiple sub-folders to a new place while checking the load, pausing for several seconds if the load reaches, let's say, 3 or 4? I only know the simple command cp -r /dir/allfiles /dir/newplace. However, I would like to copy over 30,000 files, which will cause a high load.
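A minimal sketch, using the paths from the question: copy files one at a time and sleep whenever the 1-minute load average is at or above the limit.

Code:
#!/bin/bash
SRC=/dir/allfiles
DST=/dir/newplace
MAX_LOAD=3

mkdir -p "$DST"
cd "$SRC" || exit 1
find . -type f | while read -r f; do
    # first field of /proc/loadavg is the 1-minute average; compare its integer part
    while [ "$(cut -d. -f1 /proc/loadavg)" -ge "$MAX_LOAD" ]; do
        sleep 10
    done
    cp --parents -p "$f" "$DST/"   # --parents recreates the subfolder structure
done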
If I type 'grep alias .bashrc' a whole load of stuff comes up. However, if I type 'grep alias *' nothing comes up. Is there some switch for including 'hidden' files - like the -a switch for ls?
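It's the shell, not grep: * doesn't expand to dotfiles, so .bashrc is never passed to grep at all. A couple of sketches, assuming bash:

Code:
grep alias .* *        # crude: .* also matches . and .. (harmless noise without -r)
shopt -s dotglob       # or: make * include hidden files for this shell session
grep alias *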
What I would like to do is to print the contents of all text files in a particular directory, recursively. Problem being that there are directories and possibly binaries scattered around in the filesystem as well.
Trying cat * works as long as there are no directories in there, but when there are it gives an error instead and prints nothing.
I'm sure it's easy using file -f or something, but I can't figure it out!
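One hedged sketch uses find with grep as the text detector: grep -I treats binary files as non-matching, so only text files get catted (completely empty files are skipped too, since they contain no matching line):

Code:
find . -type f -exec grep -Iq . {} \; -exec cat {} \;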
I have a directory that has a another directory inside it. The top directory is rather redundant since it only contains the one other one. Is there a way to delete the top level directory and have the contents simply "move up a level"?
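A minimal sketch, with hypothetical names where 'outer' contains only 'inner':

Code:
mv outer/inner .
rmdir outer    # rmdir only removes empty directories, so this fails safely if outer held anything else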
I would like to move a user's home directory to a different disk. Is there a "clean" way to do this? Specifically, is it safe to just copy all the .* files to the new destination and then change the home in the user config? Or are there maybe environment entries with absolute paths which will cause problems with this strategy?
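One common sketch (user name and mount point hypothetical): copy with rsync so permissions, symlinks, and hard links survive, then point the account at the new location while the user is logged out:

Code:
sudo rsync -aH /home/alice/ /mnt/newdisk/alice/
sudo usermod -d /mnt/newdisk/alice alice

Dotfiles generally store paths relative to $HOME, but a few programs do hard-code absolute paths, so it's worth grepping the new directory for the old path before deleting the original.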
I am hoping someone already has a script or knows of an app that will let me do this fairly easily - I have a fairly large folder structure that goes several levels deep, etc. In many cases there are duplicate file names that are not really different, e.g.:

/home/chris/folder/folder1/doc1.doc
/home/chris/folder/folder2/folder3/doc1.doc
I want to recursively go through /home/chris/folder and move everything to /home/chris/another_location/ without subfolders, renaming duplicates as appropriate, e.g.:

/home/chris/another_location/doc1.doc
/home/chris/another_location/doc1_1.doc
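A rough sketch of a flattening script, using the paths from the question. Note the assumptions: file names contain an extension, and names with embedded newlines aren't handled.

Code:
#!/bin/bash
SRC=/home/chris/folder
DST=/home/chris/another_location
mkdir -p "$DST"

find "$SRC" -type f | while read -r f; do
    base=$(basename "$f")
    name=${base%.*}       # file name without extension
    ext=${base##*.}       # extension
    target="$DST/$base"
    i=1
    while [ -e "$target" ]; do    # doc1.doc -> doc1_1.doc, doc1_2.doc, ...
        target="$DST/${name}_$i.$ext"
        i=$((i+1))
    done
    mv "$f" "$target"
done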
I'm facing a little trouble copying .txt files (only) from a directory and its subdirectories to another directory. The -R option won't work here, I think, since I don't want to copy the subdirectories themselves.
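A one-line sketch with hypothetical source and destination paths, using find so the subdirectory tree is walked but not reproduced:

Code:
find /path/to/src -type f -name '*.txt' -exec cp {} /path/to/dst/ \;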
I've done a low-level format on them so they're completely empty. When I use them with my Windows machines, they're absolutely fine. When I plug them into my Ubuntu machine, a hidden directory is created called 'RECYCLER', which I'm assuming is for deleted files? However, it also creates a .exe file in this directory called 0x2D9FA278, which has an icon with an H in it and a comment of 'Facebook Photo'. This has the effect of turning all the directories on the stick into shortcuts! I googled the file name and it seems to be some sort of Trojan, but I don't understand how it got onto my Ubuntu machine; I've scanned with ClamAV and it finds nothing.
I want to copy all directories, files, and hidden files and hidden directories with one command. I want these items to replace any same items in the target directory.
I have tried several things, such as:
cp -r *
cp -aR *
but I only seem to get visible files and directories. Obviously, I am missing something. (A brain, probably....)
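The catch is again shell globbing: * skips dotfiles. A sketch that sidesteps globbing entirely (paths hypothetical); the trailing '/.' makes the source directory's entire contents, hidden entries included, the thing being copied, and -a preserves attributes while overwriting matching names in the target:

Code:
cp -a /path/to/src/. /path/to/dst/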
When I run "ls -al somedir*" (I use the "ll" shortcut, actually), Linux not only lists files that match, but also the contents of directories whose names happen to match. Is there a way to limit "ls" so that it will only show names (files and directories) and ignore the contents of the directories?
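The -d flag does exactly that: it lists matching directories themselves instead of their contents:

Code:
ls -ald somedir*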
How do I move a hidden folder from /home to another location, on another partition? Is it possible? I'd like to move some folders, for example ~/.thunderbird, so that I wouldn't need to make a backup. Or at least, is it possible for a program to write files to two folders, or for everything from ~/.thunderbird to be copied automatically to .thunderbird on another partition every time there is a change? Is it possible to write a script or something? I use luckyBackup, but I would like to be able to forget about backups and have a script or program do it for me.
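Two hedged sketches, with a hypothetical mount point. Either physically move the folder and leave a symlink in its old place, so programs never notice:

Code:
mv ~/.thunderbird /mnt/data/thunderbird
ln -s /mnt/data/thunderbird ~/.thunderbird

or mirror it on a schedule with rsync, e.g. from a cron job:

Code:
rsync -a --delete ~/.thunderbird/ /mnt/data/thunderbird-backup/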
I need to clone an Ubuntu HDD from an 80GB drive to a 20GB one, and the content and desktop should stay the same as they are now. Or is it possible to re-size the Ubuntu partitions after installation?
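Since a raw disk image of an 80GB drive won't fit on 20GB, a file-level copy is the usual route. A rough sketch, assuming the used data actually fits and the 20GB disk is already partitioned, formatted, and mounted at a hypothetical /mnt/clone:

Code:
df -h /    # check that the used space fits on the smaller disk first
sudo rsync -aAXH --exclude={"/proc/*","/sys/*","/dev/*","/run/*","/tmp/*","/mnt/*"} / /mnt/clone/

The bootloader still has to be reinstalled on the new disk afterwards, and /etc/fstab updated if the UUIDs change. Resizing the partitions after installation is also possible, e.g. with GParted from a live CD, as long as the partitions being resized are unmounted.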
1) Any process (including boot, shutdown, downloads, opening programs, etc.) freezes (makes no progress) until I move the mouse or hit a key, so to install and use my Ubuntu I have to keep moving the mouse, which as you can imagine is VERY TIRING.
2) I get a general freeze at random times; neither the mouse, keyboard, nor any process moves. I noticed that if I unplug the power it unfreezes, and I can work for some time.
These 2 problems annoy me a lot, and I suspect the problem is related to my netbook model; I read about someone who has the same problem, but no solution there.
This is what I did: I tried Ubuntu loading from my USB memory (1GB); it ran very well, though with the 2 problems, but I thought they could be resolved by installing. I installed alongside W7 but couldn't boot Ubuntu, so I configured GRUB2 and it could boot then, but if I don't press **** after selecting Ubuntu, it gives me the BusyBox prompt, and then the same 2 problems. I got rid of my W7 and did a full install on my 160GB HDD, hoping it would work fine, but no. I installed all updates, with no effect.
I moved to Mac OS X recently and bumped into the "feature" of Mac where copying files from an external drive resets the file modification date/timestamp to the current date (which Windows does not do), causing a disaster for my 10+ years of backed-up work files, where the date is important. So, before I learned how to avoid that (e.g. using the -p "preserve" flag of the "cp" copy command), I had in the meantime added many more files to my new Mac hard drive, as well as updating existing old files.
I have a backup external hard drive with all my old data and the proper modification dates. I have a Mac hard drive with reset file modification dates (a single day or two in particular). The Mac hard drive has all the "true" and "current" file contents, with files modified and added. I need to copy all the original files from the external hard drive, preserving file metadata (really only the modified date), but ONLY overwriting a file on the new internal Mac hard drive IF:

1) the file contents (md5 or whatever) are the same, or
2) the file was updated after the day (which of course I can see on all files) on which the original disastrous copy was performed (implying the file is new or modified).

Ensure the copy leaves all the new and modified files completely intact on the Mac internal hard drive. No prompting/stopping of the copy of any kind (i.e., not verbose) is required, but it's o.k. Recursive copy: obviously I would like to copy all files, folders and subfolders found in export.
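A conservative sketch, with hypothetical paths: rather than a blind copy, restore the old timestamp only onto files whose content is byte-identical to the backup, and touch nothing else. New and modified files are left intact by construction, since cmp fails for them:

Code:
#!/bin/bash
BACKUP=/Volumes/Backup
LOCAL="$HOME/work"

find "$BACKUP" -type f | while read -r f; do
    rel=${f#"$BACKUP"/}           # path relative to the backup root
    dest="$LOCAL/$rel"
    if [ -f "$dest" ] && cmp -s "$f" "$dest"; then
        touch -r "$f" "$dest"     # copy the backup's mtime onto the local file
    fi
done

Files that exist only on the backup could then be brought over separately, e.g. with rsync -a --ignore-existing, which never touches files already present locally.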
I'm trying to tar a collection of files in a directory called 'my_directory' and remove the originals by using the command:
tar -cvf files.tar my_directory --remove-files
However it is only removing the individual files inside the directory and not the directory itself (which is what I specified in the command). What am I missing here?
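One hedged explanation: with some versions of GNU tar, --remove-files removes each file as it is archived but leaves the directory skeleton behind (newer versions remove the directories too). A simple workaround sketch that removes the tree only if the archive was created successfully:

Code:
tar -cvf files.tar my_directory && rm -rf my_directory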
If I pass in /home, I would like it to return 4 files. Or, bonus points if it returns 4 files, 2 directories. Basically, I want the equivalent of right-clicking a folder in Windows, selecting Properties, and seeing how many files/folders the folder contains.
How can I most easily do this? I have a solution involving a Python script I wrote, but why isn't this as easy as running ls | wc or similar?
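The reason ls | wc doesn't cut it is that plain ls isn't recursive, and counting ls -R output lines miscounts headers and blank lines. A sketch with find:

Code:
find /home -type f | wc -l               # files
find /home -mindepth 1 -type d | wc -l   # directories (excluding /home itself)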
I have a system where the permissions of many files are messed up. I have another system that has the same files. If I put that hard drive in, is there a way, without simply overwriting the files, to recursively set the permissions of each file to those of its counterpart on the other drive?
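A sketch using GNU coreutils' --reference option, with a hypothetical mount point: walk the good tree as root and apply each entry's mode and ownership to the matching path on the broken system, leaving file contents alone:

Code:
#!/bin/bash
REF=/mnt/gooddisk    # the drive with correct permissions
TARGET=/             # the system to fix; run as root

cd "$REF" || exit 1
find . | while read -r p; do
    if [ -e "$TARGET/$p" ]; then
        chmod --reference="$REF/$p" "$TARGET/$p"
        chown --reference="$REF/$p" "$TARGET/$p"
    fi
done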
How can I get the last time any of the files in a directory or its subdirectories has changed? E.g.:

Dir - changed 1/1/1
Sub Dir 1 - changed 2/1/1
Sub Dir 2 - changed 3/1/1
File 1 - changed 10/1/1
File 2 - changed 5/1/1
The output for this for Dir should be 10/1/1 (File 1 was the last modified one). Getting the last file name to be modified is a bonus but isn't necessary.
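A sketch using GNU find's -printf (the %T@ format prints the modification time as a sortable epoch number):

Code:
find Dir -type f -printf '%T@ %p\n' | sort -n | tail -1

The last line of output is the newest timestamp together with the file name, which covers the bonus too.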
I need to invert the colors of a lot of images that are in different folders in the same directory. Is there a way to use ImageMagick or something to do this in only a few commands?
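A sketch with ImageMagick's mogrify, assuming the images are JPEGs (-negate inverts the colors, and mogrify edits files in place, so test on a copy first):

Code:
find . -type f -name '*.jpg' -exec mogrify -negate {} +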
I am new to Linux and I need to extract a lot of compressed files (of different formats, e.g. tar.gz, tar.bz2) which are in subdirs, and I do not want to go into each subdir and extract each file because it will take a lot of time. Is there a way to extract all the files that exist in the dirs and subdirs with a "for loop", or is there a script that can do the job automatically?
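A sketch using find together with modern GNU tar's compression auto-detection on extraction, unpacking each archive into the directory where it was found:

Code:
#!/bin/bash
find . -type f \( -name '*.tar.gz' -o -name '*.tar.bz2' \) | while read -r a; do
    tar -xf "$a" -C "$(dirname "$a")"    # -C: extract next to the archive itself
done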
GParted shows that my dual-boot laptop has the following partitions: [URL]. I want to create a partition and move the contents of my Home folder onto it.
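The usual outline, sketched with a hypothetical device name (do the copy from a live session, or otherwise make sure the files aren't in use):

Code:
sudo mkdir /mnt/newhome
sudo mount /dev/sda5 /mnt/newhome     # the freshly created partition
sudo rsync -aAX /home/ /mnt/newhome/
# then add an /etc/fstab entry so it mounts over /home at boot, e.g.:
# /dev/sda5  /home  ext4  defaults  0  2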
I'm under Linux. By default, other users can't read anything under my home directory. Let's say my home directory is /home/superman, and I tried to use
chmod +r /home/superman
to let others access files under my home directory, but it does not work.
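Two things are likely missing: recursion, and the execute bit that directories need before anyone can enter them. A sketch (the capital X grants execute only on directories and on files that are already executable):

Code:
chmod -R o+rX /home/superman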
I did a clean install from Ubuntu 09.04 to 10.04 and restored my files from tar. Everything worked fine until I tried my weekly rsync backup. The permissions seemed to be causing problems, so I recursively changed all the permissions in my home directory:
Code:
~/Documents$ sudo chmod -R 644 /home/wolf/
[sudo] password for wolf:
chmod: cannot access '/home/wolf/.gvfs': Permission denied

So now all the directories and files have read permission for everyone:
Code:
~/Documents$ ls -A
ls: cannot open directory .: Permission denied
~/Documents$ sudo ls -lA
[sudo] password for wolf:
total 80
drw-r--r--  2 wolf wolf  4096 2010-05-22 20:45 career
drw-r--r-- 23 wolf wolf  4096 2010-05-02 17:17 computer_languages
drw-r--r--  2 wolf wolf  4096 2009-08-09 23:29 .ecryptfs
drw-r--r-- 21 wolf wolf  4096 2010-05-02 17:23 misc
-rw-r--r--  1 wolf wolf 27298 2010-05-23 13:01 next.odt
drw-r--r--  3 wolf wolf  4096 2010-05-23 15:46 PC_maintenance
drw-r--r--  5 wolf wolf  4096 2010-05-08 01:43 software_projects

Now I can't even look at my own directory:
Code:
/home$ cd /home/
/home$ ls -lA
total 20
drwx------  2 root root 16384 2010-05-07 01:01 lost+found
drw-r--r-- 42 wolf wolf  4096 2010-05-23 15:35 wolf
/home$ cd /home/wolf
bash: cd: /home/wolf: Permission denied
/home$ sudo cd /home/wolf
[sudo] password for wolf:
sudo: cd: command not found
/home$
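What happened is that 644 stripped the execute bit from every directory, and a directory needs x before it can be entered. A recovery sketch: give directories 755 again while leaving regular files at 644 (alternatively, chmod -R u+rwX,go+rX,go-w /home/wolf fixes both in one pass):

Code:
sudo find /home/wolf -type d -exec chmod 755 {} \;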
Somewhere lurking is a file containing the default print resolution, which is not being overridden by the printer settings or CUPS management. I've asked on the CUPS forum with no success.
So here's the question:
How can I make grep search recursively through all files in a directory, or if need be starting from root, to find the pattern "2880"? I've looked in the man page for grep and I can't see how to do it. Is grep the right tool to use for this?
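grep can do it with -r; a sketch that starts with the likely suspect before falling back to the whole filesystem:

Code:
grep -r 2880 /etc/cups/        # recurse through the CUPS configuration first
grep -rl 2880 / 2>/dev/null    # last resort: list every matching file from root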