How would I go about copying all .jpg or .JPG files from a folder and all of its subfolders to my /usr/name/pictures folder? I'm guessing I'd have to use some sort of .[jJ][pP][gG] pattern to match all the pictures, based on other examples I've seen, but I'm really not sure how to use that in a recursive cp.
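cp itself won't filter by extension while recursing, but find can do the matching and hand each hit to cp. A minimal sketch, assuming the source folder is the current directory and /usr/name/pictures already exists:

Code:
# -iname matches case-insensitively, so it covers .jpg, .JPG, .Jpg, etc.
find . -type f -iname '*.jpg' -exec cp {} /usr/name/pictures \;

One caveat: files with the same name in different subfolders will overwrite each other at the destination.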
I want to copy everything in templates/blue to the folder code/. However, cp -r 'templates/blue' 'code' creates a folder called blue inside code. I tried cp -r 'templates/blue/*' 'code', but that didn't do anything.
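The quotes are the culprit: a quoted * is passed to cp literally instead of being expanded by the shell, and no file is literally named *. Either drop the quotes or copy the directory's contents with the trailing /. idiom:

Code:
# let the shell expand the glob ...
cp -r templates/blue/* code/
# ... or copy the contents of blue (including hidden files) into code
cp -r templates/blue/. code/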
Is there a way to recreate all the folders from one directory in another directory, without copying over the contents of the folders? I've been trying something like this,
Code:
for i in `ls $X`; do mkdir $PATH/$i; done

Unfortunately $i gets split on the whitespace inside the folder names, not on the actual folder boundaries.
$X contains only other folders, so I don't have to worry about regular files, but any kind of more "advanced" solution would work.
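Parsing ls output breaks on names with spaces, and $PATH is an unfortunate variable choice since it shadows the shell's command search path. find with -type d sidesteps both problems. A minimal sketch, assuming the placeholder variables SRC and DEST (DEST must be an absolute path, because of the cd):

Code:
SRC=/path/to/source     # hypothetical paths; adjust to taste
DEST=/path/to/target
# recreate every directory under SRC inside DEST, contents excluded
(cd "$SRC" && find . -type d -exec mkdir -p "$DEST/{}" \;)

rsync can do the same in one line: rsync -a --include '*/' --exclude '*' "$SRC"/ "$DEST"/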
I'm looking for some way to search a given directory, copy its folder structure to another given directory, and copy only certain files from the original into the newly created directories. Vague, I'm sure, but here's what I'm actually looking for: my music player won't display album art without a "cover.jpg" image in each folder where the files are, but I have Rhythmbox set to only copy the top-rated songs. Each file does have its cover embedded in it, but there's also a cover.jpg in each folder.
So I'd like to let Rhythmbox handle the actual song copying, and then have a script or something that I can run that will scan my onboard music directory for cover.jpg files and place them in the appropriate folders within my music player's folders. Obviously I can't just copy my onboard music directory, or I'd get *every* song, not just the top-rated ones, which defeats the purpose of syncing at all.
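One approach is to walk the player's folder tree (which Rhythmbox has already pruned to the top-rated songs) and pull in the matching cover.jpg from the onboard library only where a corresponding folder exists. A minimal sketch, assuming the hypothetical paths LIBRARY and PLAYER share the same Artist/Album layout:

Code:
LIBRARY=~/Music              # onboard library (assumed layout: Artist/Album)
PLAYER=/media/player/MUSIC   # player's music root (hypothetical mount point)
# for every album folder the player actually has, copy the matching cover
(cd "$PLAYER" && find . -type d | while IFS= read -r dir; do
    [ -f "$LIBRARY/$dir/cover.jpg" ] && cp "$LIBRARY/$dir/cover.jpg" "$PLAYER/$dir/"
done)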
I have to work in the terminal, but I don't know the command for copying files from one folder to another, e.g. from /share/ParaDiS.v2.3.5.1 to /home/newfolder. What should I type?
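cp is the command for this; -r makes it recurse into subdirectories. For the paths given:

Code:
# copy the whole folder (it appears as /home/newfolder/ParaDiS.v2.3.5.1)
cp -r /share/ParaDiS.v2.3.5.1 /home/newfolder
# or copy only its contents directly into /home/newfolder
cp -r /share/ParaDiS.v2.3.5.1/* /home/newfolder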
I have an Ubuntu headless server which is running Samba. My mp3 file collection resides on that server and is being shared.
So far, no problems connecting to that share and writing to it from my Windows box. But if I use my main laptop, which runs Ubuntu Lucid, to download an mp3 song from Amazon, the moment I move it to the share I get permission problems from the Windows machine. This is clearly a permission issue with group and others: the song is created on the share without read and write permissions for others or for the samba group I created.
My question is: how can I make this process simple or automatic when moving songs to the share? I don't want to go there every time and run ...
Code:
Which was basically how I reset or fixed the problem.
I've read about umask, but I'm not sure if that applies here or not, because I'm not creating the file but moving it.
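Samba can force the permissions itself, regardless of what a file arrives with, via per-share mask settings. A hedged sketch of a share definition in /etc/samba/smb.conf; the share name, path, and group are placeholders:

Code:
# /etc/samba/smb.conf (share name, path and group are hypothetical)
[music]
   path = /srv/music
   writable = yes
   force group = sambausers
   create mask = 0664
   force create mode = 0664
   directory mask = 0775
   force directory mode = 0775

The caveat matches the umask intuition: these masks only govern files created through Samba. A file moved onto the share locally keeps its old mode, so for local moves a periodic chmod -R via cron, or copying with rsync --chmod, is the usual workaround.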
The problem is I have to copy folders from the network onto a RHEL server. I can copy the folders and documents that do not have a subfolder, but the folders with subfolders are not getting copied. How should I do it?
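Plain cp (and scp) skips directories unless told to recurse. A sketch, assuming the network source is mounted at a hypothetical /mnt/network:

Code:
# -r recurses into subfolders; -p keeps timestamps and modes
cp -rp /mnt/network/source_folder /destination/
# copying over the network directly, scp has the same flag:
scp -rp user@host:/path/to/source_folder /destination/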
I want to get the latest file name from a directory. How can I do that with a Perl script? In other words, how do I sort the contents of a folder by time and capture the latest file using a Perl script?
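Perl's -M operator returns a file's age in days relative to script start, so sorting on it in ascending order puts the newest file first. A minimal one-liner sketch, with the directory path as a placeholder:

Code:
# newest file in /path/to/dir (placeholder); -M is age, so smallest = newest
perl -e '($newest) = sort { -M $a <=> -M $b } glob("/path/to/dir/*"); print "$newest\n";'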
I'm not positive if this is in the correct section, but I am hoping so. I am running dual-boot with Windows 7 and Ubuntu 10.10. I hunted down my files from Windows that I need for school (old papers, research, etc.) and found them under "file system" --> "host" --> "users" --> "zbollman". I can access all of my files, and I'm happy now that I don't have to boot between the two constantly to get what I need. However, when I tried to copy the folder to my home folder, it said I do not have enough room; I'm about 5GB short. How do I go about allocating more space so that I can copy this folder and have all of my information easily accessible?
I have a new network attached storage unit that I'm trying to transfer my data to. On this NAS, it has a very basic Linux installed with SSH enabled. Browsing through the programs installed on it, I found smbclient. Am I able to copy files directly from my old NAS to my new one using smbclient? It would sure beat transferring 950GB from my old NAS through a computer and then onto my new NAS.
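Yes: smbclient can pull a whole tree directly, so the data never has to touch a third machine. A sketch to run over SSH on the new NAS, with placeholder share name, user, and paths:

Code:
# run on the new NAS; //oldnas/share, user and paths are placeholders
smbclient //oldnas/share -U user -c 'prompt OFF; recurse ON; lcd /path/on/newnas; mget *'

prompt OFF stops the per-file confirmation, and recurse ON makes mget descend into subdirectories.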
I am running openSUSE 11.2 with KDE 4.3.4 on my ThinkPad R51 laptop. I would like to copy 'File A, File B, File C' (for example), which are photographs of a holiday, to a blank CD inserted into my drive, and I cannot find a way to 'copy and paste' them onto the volume.
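A blank CD isn't a mounted filesystem, so plain copy-and-paste won't work; the files have to be written as a burn session. KDE's K3b handles this graphically, or it can be done from the shell with mkisofs and wodim, assuming they are installed and the burner is /dev/sr0 (both assumptions):

Code:
# build an ISO image of the files, then burn it; /dev/sr0 is an assumption
mkisofs -R -J -o photos.iso "File A" "File B" "File C"
wodim dev=/dev/sr0 -v photos.iso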
In the Linux bash shell, for a given directory, how can I list: the creation date of that directory, the number of files in that directory, and the number of subdirectories in that directory?
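One caveat up front: most Linux filesystems historically don't store a creation time, so stat's ctime is the last status change, not creation. With that in mind, a sketch for a placeholder directory DIR:

Code:
DIR=/some/dir   # placeholder
stat "$DIR"                                          # times (ctime = status change, not creation)
find "$DIR" -maxdepth 1 -type f | wc -l              # files directly inside
find "$DIR" -mindepth 1 -maxdepth 1 -type d | wc -l  # immediate subdirectories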
I just read the Linux scp command issue question, and it reminded me that I regularly forget to specify the colon in the host part of an scp command, and thus copy a file locally instead of copying to a remote host; e.g. I do scp foo host instead of scp foo host:. But I never use scp to copy a file locally. So I wonder if there is a way to make scp fail if both the source and the destination arguments refer to local files.
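scp itself has no such flag, but a small shell wrapper can refuse to run when no argument contains a colon. A minimal sketch for ~/.bashrc:

Code:
# refuse to run scp when every argument looks local (no ':' anywhere)
scp() {
    for arg in "$@"; do
        case $arg in *:*) command scp "$@"; return;; esac
    done
    echo "scp: no remote argument (missing colon?)" >&2
    return 1
}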
I am accessing Linux through PuTTY, and I wrote many programs in Unix using PuTTY and gedit, but now I need to copy all the files to Windows. How do I copy a directory (Linux) to a folder (Windows) without installing any software? If it is necessary to install software to copy the files, then tell me the process of using that software.
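The PuTTY suite includes pscp.exe, a command-line scp client for Windows, available from the same download page as PuTTY itself; it is a single .exe that needs no installer. A sketch from the Windows command prompt, with placeholder user, host, and paths:

Code:
REM run in cmd.exe; user, host and paths are placeholders
pscp -r user@linuxhost:/home/user/programs C:\Users\me\programs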
I'm trying to copy files from my current server to a new server. Both servers have SSH installed. These are the commands I'm using; however, I'm getting "connection refused". A google search suggested the reason for this error could be that my current server doesn't have SSH, but I use SSH on my current server often, so I can say for sure that it has SSH.
Code:
OpenSSH_4.3p2, OpenSSL 0.9.8e-fips-rhel5 01 Jul 2008
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Applying options for *
debug1: Connecting to IP_OF_CURSERVER [IP_OF_CURSERVER] port 22.
debug1: connect to address IP_OF_CURSERVER port 22: Connection refused
ssh: connect to host IP_OF_CURSERVER port 22: Connection refused
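"Connection refused" means nothing answered on port 22 at that address: having the ssh client installed is not the same as running the sshd server. Some hedged checks to run on the machine being connected to:

Code:
# is sshd actually listening on port 22?
netstat -tlnp | grep :22
# is the service running? (RHEL-style init)
service sshd status

If sshd is running and listening, a firewall between the two servers is the next suspect.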
I need to copy a large number of files, about 1 lakh (100,000), from one server to another. When I tried various commands using scp, ftp, etc., it says "Arg list too long". How can we copy all the files? Both servers run Linux.
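The error comes from the shell expanding * into more arguments than the kernel allows, not from scp itself. Streaming the directory through tar over ssh avoids building an argument list at all. A sketch with placeholder host and paths:

Code:
# pack locally, unpack remotely; no per-file argument list is ever built
tar cf - -C /path/to/source . | ssh user@destserver 'tar xf - -C /path/to/dest'
# rsync handles the same case and can resume after interruption:
rsync -a /path/to/source/ user@destserver:/path/to/dest/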
I have 60+ directories, each containing multiple .doc files. I need to move them to a single directory and keep their file names intact. I don't think cp will do that without listing all the file names. I was thinking of something like cp -r /dir/*.doc /newdir, or should I use a combo like find -type *.doc | cp /newdir?
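cp -r /dir/*.doc only matches .doc files at the top level, and find's -type takes f or d, not a pattern; the name filter is -name. find can hand the matches straight to cp:

Code:
# copy every .doc found anywhere under /dir into /newdir
find /dir -type f -name '*.doc' -exec cp {} /newdir \;

One caveat: files that share a name in different directories will overwrite each other in /newdir.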
I have a directory containing around 2.8 lakh (280,000) files. I want to move them to another directory. If I use cp or mv then I get the error 'argument list too long'. If I write a script like
for file in `ls`; do cp "$file" /destination; done
then, because of the ls command, its performance degrades. How can I do this?
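find streams its results instead of building the whole list up front the way the ls loop does, and -exec ... + batches files into as many invocations as needed, so the kernel's argument-length limit is never exceeded. A sketch with placeholder paths:

Code:
# move the files in batches; never exceeds the argument-length limit
find /path/to/source -maxdepth 1 -type f -exec mv -t /path/to/dest {} +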
I am facing a problem copying a large number of files, 18 lakh (1,800,000), from my personal hard disk to another hard disk. Each file is very small, and the size of the folder is around 3.95 GB. Copying the files using the copy facility in Windows is frustrating, and I am not even able to compress the files; it gives me an error that they are not readable. The other problem is that I am not able to open this drive in Linux: it shows an error telling me to run diskchk in Windows, and the Windows disk check is also not able to repair this drive, going into some unsolvable mode. Is there any way to open the disk with errors in Windows, and if not, any way I can copy the data faster? ERROR: Disk labeled EDU is corrupt, go to windows and chkdsk /f there and reboot into window 2 times.
I want to copy a .tgz file from my computer to an external hard drive. However, I get the following message: cp: cannot create regular file `/mnt/usbkey/ws_2008/misc/minipar-0.5-Windows.tgz': Permission denied. I get this error with any file I try to copy to the external disk. The external disk is recognized; when I mount it, I can see the files and folders I have there, but it seems that I cannot copy anything to it. When I try to copy the same files from my computer to a USB flash drive, everything works.
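This usually means the drive is mounted read-only or with root-only ownership, which is common for NTFS-formatted drives on older systems. Some hedged checks:

Code:
# how is the drive mounted, and with what options (ro vs rw)?
mount | grep usbkey
# who owns the mount point, and can your user write to it?
ls -ld /mnt/usbkey
# if it is NTFS, the ntfs-3g driver gives read-write access:
sudo mount -t ntfs-3g /dev/sdX1 /mnt/usbkey   # /dev/sdX1 is a placeholder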
Suppose I have a tree structure like this: /home/mahmood/sim/a/b/file1.cpp /home/mahmood/sim/a/b/file2.h /home/mahmood/sim/a/c/file3.txt /home/mahmood/sim/d/file4.txt
How can I copy all of them to /home/mahmood/sim, so that when I run "ls" in /home/mahmood/sim, I see all the files: file1.cpp file2.h file3.txt file4.txt
Can 'cp' search for all the files and copy them into another folder?
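cp alone can't search, but find can feed it every file in the tree. Since the destination here is the top of the same tree, -mindepth 2 keeps the files already sitting in sim/ from matching themselves:

Code:
# flatten the tree: copy every file below the top level up into sim/
find /home/mahmood/sim -mindepth 2 -type f -exec cp -t /home/mahmood/sim {} +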
I am trying to find a command which will copy all the files in a folder with the extension ".log" that were created one day before the current date. By going through other threads in this forum I found half a solution to this problem:
find /mnt/hd -mtime -1 -exec scp {} /mnt/usb \;
This command copies all the files created one day before (not only *.log) to the /mnt/usb folder. What is the modification required to the above command?
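Adding a -name test restricts the match to .log files, and plain cp is enough for a local destination. One hedge: -mtime checks modification time, since most Linux filesystems don't record a creation time at all:

Code:
# only .log files modified within the last day
find /mnt/hd -name '*.log' -mtime -1 -exec cp {} /mnt/usb \;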
I have two USB 2 external hard drives. I want to copy about 30 gigs of data from one to another. What command-line command do I use? I was thinking of using cp with the -R and -n options, but I have no idea what devices to refer to. I can't find any external hard drives in /etc/fstab, and I'm not sure what /dev device each USB external hard drive uses. I just want to copy the files and the directories that they are in, just as they are. There are no links, and I do not want to do a backup.
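USB drives usually aren't listed in /etc/fstab; the desktop auto-mounts them somewhere under /media, and df will show where. Once both mount points are known, the /dev device names don't matter. A sketch with placeholder mount points:

Code:
df -h                     # find where each drive is mounted (e.g. /media/...)
# archive mode keeps directories, timestamps and permissions as they are
rsync -a /media/old_drive/ /media/new_drive/
# plain cp works too:
cp -a /media/old_drive/. /media/new_drive/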
I have a problem copying files from a remote computer to my local one using the scp command. I am sure that I am using it correctly; please check it below: --- blah@blah.com:~/g4work> scp blah2@blah2.com:IndirectMethod_Spher...s/H_1.mac.root . --- What I get in return (instead of the statement saying 100% of the file was copied) is: --- On this machine the G4SYSTEM=Linux-g++ ---
The interesting point is that the returned statement refers to one of the environment variables set on both machines, which are necessary to work with a toolkit called Geant4. Here is what I get when I type 'printenv | grep G4', just to show you (note the G4SYSTEM line):

Code:
G4LEVELGAMMADATA=/home/blah/geant4/geant4.9.3.p02/data/PhotonEvaporation2.0
G4INSTALL=/home/blah/geant4/geant4.9.3.p02
G4LEDATA=/home/blah/geant4/geant4.9.3.p02/data/G4EMLOW6.9
G4NEUTRONHPDATA=/home/blah/geant4/geant4.9.3.p02/data/G4NDL3.13
G4VIS_BUILD_OPENGLX_DRIVER=1
G4RADIOACTIVEDATA=/home/blah/geant4/geant4.9.3.p02/data/RadioactiveDecay3.2
G4ABLADATA=/home/blah/geant4/geant4.9.3.p02/data/G4ABLA3.0
G4LIB=/home/blah/geant4/geant4.9.3.p02/lib
G4VIS_BUILD_RAYTRACERX_DRIVER=1
G4LIB_BUILD_SHARED=1
G4VIS_USE_OPENGLX=1
G4UI_USE_TCSH=1
G4VIS_USE_RAYTRACERX=1
G4REALSURFACEDATA=/home/blah/geant4/geant4.9.3.p02/data/RealSurface1.0
G4SYSTEM=Linux-g++
G4WORKDIR=/home/blah/g4work

The other thing I would like to mention is that these Geant4 environment variables are loaded each time a new (bash) shell is started, as a result of the bash login script.
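That "On this machine the G4SYSTEM=..." line is the clue: the remote login script prints it on every shell startup, and scp fails when the remote shell writes anything to stdout before the transfer begins. The usual fix is to keep such output for interactive shells only. A sketch for the top of the remote ~/.bashrc:

Code:
# at the top of the remote ~/.bashrc: bail out for non-interactive shells
case $- in
    *i*) ;;        # interactive shell: carry on, echo freely below
    *)   return;;  # non-interactive (scp, sftp, rsync): print nothing
esac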