General :: Copying A File To Multiple Machines?
Mar 7, 2011: Is it possible to copy a file to multiple remote machines through scp in one command?
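One common answer: scp itself targets a single host, but a shell loop covers several. A minimal sketch, where host1..host3 and the paths are placeholders:
Code:
for host in host1 host2 host3; do
    scp /path/to/file "$host":/destination/path/
done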
I'm new to the forum and fairly new to Linux as well, so my apologies if this is posted in the wrong section. My problem: how can you search for a file across multiple machines (like a server farm)? For example, I log onto machine num1 and want to search for a file named "xxx.yyy" which may be on any one of 4 machines. What I do right now is manually log into each machine and run the find command. However, I have heard that it is possible to do it with a couple of simple commands. I have looked into pssh and cssh as well as ssh tunneling (along with public key authentication to stop the machines requesting a password every time I log in), but unfortunately I was unable to find an answer.
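One reply-style sketch: with key-based authentication already set up, a plain ssh loop runs find on each machine in turn. The hostnames follow the num1 example above; the search root is an assumption:
Code:
for host in num1 num2 num3 num4; do
    echo "== $host =="
    ssh "$host" "find / -name 'xxx.yyy' 2>/dev/null"
done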
Two machines are connected via LAN. The older machine is running RedHat 3 (kernel 2.4.21-50.ELsmp on i686); the newer machine is at a current level of SUSE Linux and has a DVD drive. Here's the question: is it possible to remotely mount the DVD device on the older machine? If so, can it be used to install software packages?
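One commonly suggested approach is exporting the DVD's mount point over NFS; the device path, mount points and network range below are assumptions:
Code:
# on the newer SUSE machine (NFS server):
mount /dev/dvd /mnt/dvd
echo '/mnt/dvd 192.168.0.0/24(ro)' >> /etc/exports
exportfs -ra
# on the older RedHat machine (NFS client):
mount -t nfs suse-machine:/mnt/dvd /mnt/dvd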
I have 60+ directories, each containing multiple .doc files. I need to move them to a single directory and keep their file names intact. I don't think cp will do that without listing all the file names. I was thinking of something like: cp -r /dir/*.doc /newdir. Or should I use a combo like find -type *.doc | cp /newdir?
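A minimal sketch using GNU find and mv (the -t and --backup options are GNU-specific); without --backup, identically named files from different directories would overwrite each other:
Code:
find /dir -type f -name '*.doc' -exec mv --backup=numbered -t /newdir/ {} +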
I have a network of 20 machines, all running Ubuntu 10.04.
Each machine has about 200 GB of data that I'd like to share with the other 19 machines, read-only. Reads should be as fast as possible.
A friend told me to look into setting up HTTP/FTP. Is that really the best way to share data between the machines (better than NFS)? If so, how do I go about it?
UPDATE: Just to clarify, all I want is to be able (from within machine X) to access one of machine Y's files and load it into memory. All of the files are of uniform size (500 KB). Which method is fastest (Samba / NFS / HTTP / FTP)?
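Benchmarking on your own network is the only reliable answer, but for many small read-only files loaded into memory, NFS is usually the simplest of the four and performs well once the client cache is warm. A minimal read-only export sketch (paths and subnet are assumptions):
Code:
# /etc/exports on each serving machine:
/data 10.0.0.0/24(ro,no_subtree_check)
# on each reading machine:
mount -t nfs machineY:/data /mnt/machineY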
Suppose I have a tree structure like this:
/home/mahmood/sim/a/b/file1.cpp
/home/mahmood/sim/a/b/file2.h
/home/mahmood/sim/a/c/file3.txt
/home/mahmood/sim/d/file4.txt
How can I copy all of them to /home/mahmood/sim, so that when I run "ls" in /home/mahmood/sim, I see all the files:
file1.cpp
file2.h
file3.txt
file4.txt
Can 'cp' search for all the files and copy them into another folder?
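cp itself does not search, but find can feed it. A minimal sketch with GNU find and cp; -mindepth 2 skips files already sitting directly in sim, and identically named files would collide:
Code:
find /home/mahmood/sim -mindepth 2 -type f -exec cp -t /home/mahmood/sim {} +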
How can I copy multiple files within the same directory, giving each copy a slightly different name from its source?
Example: '/home/junk' contains A.txt, B.txt
I want to copy /home/junk/A.txt to /home/junk/A1.txt and /home/junk/B.txt to /home/junk/B1.txt using a single command.
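There is no single cp invocation for this, but a one-line loop does it; the sketch assumes the A1.txt/B1.txt pattern shown above:
Code:
for f in /home/junk/*.txt; do cp "$f" "${f%.txt}1.txt"; done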
I have 30 Linux PCs running. I need to check the performance of all PCs (memory, RAM and process usage) with a single command or in GUI mode. In Solaris we have the perf script to check performance in GUI mode; I need the same kind of thing for Linux.
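There is no single built-in equivalent, but an ssh loop gathers the basics from every PC (hostnames are placeholders); dedicated monitoring tools such as Nagios or Munin cover the GUI side:
Code:
for host in pc01 pc02 pc03; do
    echo "== $host =="
    ssh "$host" 'uptime; free -m; ps aux --sort=-%mem | head -5'
done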
Description: I am a newly appointed system engineer taking care of Linux servers. We have a new set of data coming in which needs the configuration below. How do I write a script with a function to do this?
for files with ".txt" in sm:
    copy each of the files to folders sm1 and sm2 (log every copy)
    if successful:
        remove the original
        log it in the log file
    if not successful (copying one particular file to all the folders failed):
        retain and retry
        log it in the log file
        mail the admin with that particular file name
I have already tried a bit:
cd /export/home/
for dir in sm1 sm2; do
    cp -p sm/*.txt "$dir"/    # copy every .txt into each target folder
done
Is my start right? How do I do the rest?
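A minimal sketch of the rest, assuming the log path and admin address below are placeholders and that a later run (e.g. from cron) performs the retry:
Code:
#!/bin/bash
cd /export/home/ || exit 1
LOG=/export/home/copy.log     # assumed log location
ADMIN=admin@example.com       # assumed admin address

for f in sm/*.txt; do
    ok=1
    for dir in sm1 sm2; do
        if cp -p "$f" "$dir"/; then
            echo "$(date) copied $f -> $dir" >> "$LOG"
        else
            echo "$(date) FAILED $f -> $dir" >> "$LOG"
            ok=0
        fi
    done
    if [ "$ok" -eq 1 ]; then
        rm "$f"               # remove the original only after both copies succeeded
        echo "$(date) removed original $f" >> "$LOG"
    else
        # retain the file so the next run retries it, and alert the admin
        echo "copy failed for $f" | mail -s "copy failure: $f" "$ADMIN"
    fi
done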
Assuming I have two files, one large and one small, I want to write the smaller file into the large file without overwriting the remaining part of the larger file.
Both are binary files, and the large file can become very large, so I want to avoid copying the whole file, as that will take some time. Is there any standard Linux console utility to do this, or do I need to write it myself?
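dd with conv=notrunc does exactly this: it writes in place without truncating the rest of the output file. A sketch with placeholder names; seek positions the write in units of bs:
Code:
# overwrite the start of large.bin with small.bin, leaving the rest intact:
dd if=small.bin of=large.bin conv=notrunc
# the same, but starting at byte offset 4096 (bs=4096, seek=1 block):
dd if=small.bin of=large.bin bs=4096 seek=1 conv=notrunc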
Copying permissions from one file to another: I know that the command for changing permissions is "chmod", for example chmod 666 filename. However, I have one file, filename1, and by listing the contents of a directory with ls -al I can see its permissions in a form like -rwxr-x--- and similar. Now I want to apply exactly the same permissions to another file, filename2. How do I use the chmod command to accomplish this? The other way around would be to simply copy permissions from one file to another. Is there a command for this purpose?
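GNU chmod can copy the mode directly from another file:
Code:
chmod --reference=filename1 filename2
# chown --reference=filename1 filename2 does the same for ownership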
The current directory contains a file called "original.txt" and many directories called "source_001", "source_002", "source_003", and so on. From the command line, how do you copy "original.txt" to "source_001" and "source_002" and "source_003", etc.?
The total number of these source directories is unknown; it changes every week.
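A glob loop handles an unknown and changing number of directories:
Code:
for d in source_*/; do cp original.txt "$d"; done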
I just installed Ubuntu 10.04 x64. I have set it up the way I like on my server; however, I now have 3 terminals and a laptop to set up the exact same way. I'd like some advice on what can and can't be done, and how. Preferably I am looking to use programs with GUIs, in the style of Ubuntu One, the software manager, etc. I'm not really into the command line or editing conf files unless there is no other option.
1. I installed all the software I like from the Software Centre. Now I need the same software on the other computers. I know Mint has mintbackup, which just exports and imports a list of installed software; I am looking for an Ubuntu equivalent (a command-line sketch follows after this list).
2. Firefox and Thunderbird add-ons. I install around 20 Firefox and Thunderbird add-ons; sometimes I can't even keep track of them. There must be a way to sync them so that on each computer I install Ubuntu on, I can simply import the list of Mozilla add-ons and have it download them automatically.
3. Mail, contacts and calendar syncing. I need to sync calendar and email/phone/name contact lists between my 3 Ubuntu machines and my Windows Mobile 6.5 device. Email is taken care of by IMAP, but Lightning's calendar isn't, and neither are the contacts. If I enter a contact on my phone, I want it to sync to the other computers I use.
4. Ubuntu One. I have /storage/workfolder in the root directory of my personal server. I want to be able to work on documents on my laptop at work and sync them, so that when I get home they are synced on the server without overwriting changes, etc.
5. Desktop settings, icon themes, preferences from Ubuntu Tweak, etc. I know this one is a bit more of a long shot, but it takes 20 minutes per computer to move the minimize/maximize buttons to the other side, set up Cairo-Dock the way I like, and install the icon theme and backgrounds I like. Isn't there a way to back this up so it doesn't have to be done each time, or so I can make my other computers the same?
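For point 1, a commonly used command-line fallback (no GUI, but only two commands) is dpkg's selections list:
Code:
# on the configured machine:
dpkg --get-selections > packages.list
# on each new machine:
sudo dpkg --set-selections < packages.list
sudo apt-get dselect-upgrade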
I have a few servers that are exposed to the internet. When someone tries to brute-force the ssh login, OSSEC adds their IP to hosts.deny. Then the attacker (read: script kiddie) moves to the next IP up the line and hits my next server, and so on.
I end up getting 20 emails for all the servers that they hit.
My question: is there any way to sync the hosts.deny file across multiple servers, so that if they are locked out of one, they are locked out of all?
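A minimal sketch of one way to do it: each server pushes its hosts.deny to the others whenever it changes (hostnames are placeholders). Note that blindly overwriting discards entries the other servers collected themselves; merging the files with sort -u before pushing avoids that:
Code:
for host in server2 server3 server4; do
    scp /etc/hosts.deny root@"$host":/etc/hosts.deny
done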
I'm getting a video from a camera connected to the computer and saving it to a constantly increasing file.
The thing is that I'm trying to make a non-stop copy of this file over the network (i.e. using scp, rsync or something like that).
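One hedged approach: rsync's --append option transfers only the bytes added since the last run, so looping it approximates a continuous copy (paths, host and interval are assumptions):
Code:
while true; do
    rsync --append /var/video/camera.avi user@backuphost:/backup/camera.avi
    sleep 10
done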
Can anyone recommend a utility to let me automatically apply shell commands I type to a list of given remote machines? I'm configuring and maintaining multiple servers, all running Fedora 12, and I want them to have exactly the same configuration. I also need to check out code from Subversion onto these machines: the same code from the same location into the same directory. I know I could use ssh to run each command individually on each machine, but is there a tool that will make this much easier?
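pssh (packaged as pssh or parallel-ssh, depending on the distribution) does exactly this: it runs one command across a host list. A sketch with an assumed host file and repository URL:
Code:
parallel-ssh -h hosts.txt -i 'svn checkout http://svn.example.com/repo /opt/code'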
I want to run a cron job to download my email logs to my laptop. But the question is how? I'm not sure how to write the php script for the cron job. The file is on "computer A" with permissions set via chmod; I just need to know how to write the php to access the other computer and then download the file to a certain directory on my laptop.
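For what it's worth, PHP isn't required: a crontab entry running scp (with key-based authentication set up) can fetch the file on a schedule. Paths, user and host below are assumptions; note that % must be escaped in crontab lines:
Code:
# fetch the mail log from computer A every day at 02:00:
0 2 * * * scp user@computerA:/var/log/maillog /home/me/maillogs/maillog.$(date +\%F)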
View 3 Replies View RelatedSolaris is the os used. I want to copy files from UNIX Machine to windows network drive.I know smbclient,ftp can be used. But is there any other best option i can use?
I am trying to copy four files from my machine, through a second machine, and finally to the destination. The destination computer can only be reached through the second computer, and I am curious to know if there is an easy way to do this. I am able to ssh to the middle machine and then ssh from there to the destination. I know that I could just copy from the first machine to the second, and then from there to the third, but I'm curious to know what kind of command I can run to do this all at once, if such a thing is possible (which I'm betting it is). I also need to copy these files as root on the destination machine.
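With a reasonably recent OpenSSH, the ProxyJump option makes the middle hop transparent, so one scp command suffices (hostnames are placeholders; older versions need the equivalent ProxyCommand setup):
Code:
scp -o ProxyJump=user@middlehost file1 file2 file3 file4 root@destination:/root/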
I am quite confused about the following description of fork; could you please explain it? "The child process shall have its own copy of the parent's file descriptors. Each of the child's file descriptors shall refer to the same open file description with the corresponding file descriptor of the parent." For example, I open a socket and then fork. Now, does the child have a separate socket, or does it share it with the parent? Does this have any impact on using it in the child?
Regularly I find myself cloning a machine using rsync. I find it understandable, reliable and fast, faster than dd, and I don't have to worry about different partition sizes, etc. However, I usually partition my hard disk into a number of partitions:
Code:
/
/home
/usr
/var
When I start with a new, empty machine, I boot from a USB stick or live CD, and my new, empty hard disk becomes /dev/sdb. After creating the 4 partitions I have /dev/sdb1, /dev/sdb2, etc. My root directory is on the disk I used for booting, usually /dev/sda. So, in order to access my newly created partitions, I mount them under /mnt on my root:
Code:
mounted now later
/mnt/sdb1 /
/mnt/sdb2 /home
/mnt/sdb3 /usr
/mnt/sdb4 /var
In other words, I now mount /dev/sdb1 on /mnt/sdb1, while after copying /dev/sdb1 will become my root directory, /dev/sdb2 my /home directory, etc. When I start the rsync process to copy the image from a remote machine, I have to copy all 4 partitions separately: first the root directory, excluding /home, /usr and /var, then /home, then /usr, then /var, like this:
Code:
action 1:
rsync -a --exclude='/home' --exclude='/var' --exclude='/usr' my.remote.machine:/ /mnt/sdb1/
action 2:
rsync -a my.remote.machine:/home/ /mnt/sdb2/
action 3:
rsync -a my.remote.machine:/usr/ /mnt/sdb3/
action 4:
rsync -a my.remote.machine:/var/ /mnt/sdb4/
That is a lot of typing and waiting, and sometimes I have a different partition scheme, so it is not really feasible to write a script to use every time. Now the question: is there a smarter way of mounting the newly formatted disk (/dev/sdb1, /dev/sdb2, etc.) in my root tree, so I can perform the rsync copy in one go, without all the excludes, while ensuring that the correct source partitions end up on the correct destination partitions?
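One answer-style sketch: mount the new partitions nested, mirroring their final layout, and a single unfiltered rsync then lands every tree on the right partition with no excludes (the /mnt/target path is an assumption):
Code:
mount /dev/sdb1 /mnt/target
mkdir -p /mnt/target/home /mnt/target/usr /mnt/target/var
mount /dev/sdb2 /mnt/target/home
mount /dev/sdb3 /mnt/target/usr
mount /dev/sdb4 /mnt/target/var
rsync -a my.remote.machine:/ /mnt/target/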
I am working with a DM355 target board. Here we record the video coming from IP cameras. Now I have to write a C program to copy the recorded avi files, with date and time, to a NAS server using scp. I wrote a script to copy a single file to the NAS server:
#!/bin/bash
DATE=$(date +%Y%m%d_%H_%M_%S)
mv Camera1.avi Camera1_$DATE.avi    # stamp the recording with date and time
scp Camera1_$DATE.avi root@192.168.1.4:/root/test/
mv Camera1_$DATE.avi Camera1.avi    # restore the original name
But now I have to write a C program to copy multiple avi files, with date and time, to the NAS server.
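As a stepping stone, here is the same logic generalized to a shell loop over every avi file (the NAS address and path are from the script above); a C version could shell out to this or drive scp via system():
Code:
#!/bin/bash
DATE=$(date +%Y%m%d_%H_%M_%S)
for f in Camera*.avi; do
    stamped="${f%.avi}_$DATE.avi"
    mv "$f" "$stamped"
    if scp "$stamped" root@192.168.1.4:/root/test/; then
        mv "$stamped" "$f"    # restore the original name, as in the single-file script
    fi
done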
I'm trying to resume copying from a mounted CIFS device to my local hdd with cURL. I tried
Code:
$ curl -C - -O file://myfile
and also
Code:
$ curl -C <manual offset> -O file://myfile
(looked up the manual offset using "$ wc -c")
This resumes copying if I cancel it, e.g. with ^C. But it does not work if I unmount and remount the CIFS device: cURL then ignores my given offset and silently continues from the start again, as if nothing were there. With "-C -" I get the same effect.
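A hedged workaround while cURL misbehaves: since the source is just a mounted file, tail can append the missing remainder directly (the local and remote names are placeholders):
Code:
offset=$(wc -c < myfile.local)
tail -c +$((offset + 1)) /mnt/cifs/myfile >> myfile.local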
I have a home network with 6+ x86_64 machines, all with similar setups. In the past (FC10 and before), I've had common package repositories (e.g. /var/cache/yum/fedora/packages) shared via NFS with all the machines (and with keepcache=1 in /etc/yum.conf). That way, a given RPM only got downloaded once; the other machines would then pull it from my local package repository. And I don't mind the disk usage of keeping one copy of all my old RPMs around.
It seems that while DRPMs are great for a single machine, they don't make sense in my case: if I have to download the DRPM 6 times (and take the time/CPU hit of recreating the RPM 6 times), I might as well have downloaded the RPM once and been done with it. Is there a not-too-convoluted method to keep a common package repository across multiple machines even with DRPMs? Or, better, to have the first downloading machine pull a DRPM, generate the RPM, then save the RPM in the local shared repository?
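One possibility, with the caveat that concurrent yum runs against a shared cache directory can trip over each other's locks: point cachedir in /etc/yum.conf at the NFS share (the path below is an assumption):
Code:
# /etc/yum.conf on every machine:
[main]
keepcache=1
cachedir=/net/shared/yum/$basearch/$releasever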
I'm trying to write a script that will prompt the user for a username/password, then create that user/password in the right groups on all my machines. I know this is kind of a long way around to avoid a NIS server, but I like making my life more difficult.
This is what I have so far:
Code:
The script has 2 problems: the "if" tests return an error and do not compare the strings successfully, and whatever password is entered does not get applied properly, so the user is unable to log in.
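A minimal sketch of the kind of script described (hostnames and the group are assumptions); note that string comparisons in bash need spaces around the brackets and operator, as in [ "$a" = "$b" ], which is a frequent cause of "if" errors:
Code:
#!/bin/bash
read -rp  "Username: " user
read -rsp "Password: " pass; echo

for host in host1 host2 host3; do
    # create the user and set the password on each machine (quoting is a
    # sketch only; passwords with special characters need more care)
    ssh root@"$host" "useradd -m -G users '$user' && echo '$user:$pass' | chpasswd"
done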
Create a copy of the file above and call it commands.sorted. Use vi commands to manually sort this file, i.e. use yy to copy a line, P or p to paste a line, and dd to delete a line. Order the commands with the two lines starting with double quotes first, then list the rest of the commands in alphabetical order.
Does anyone have any idea what he's talking about? Can I copy a file and rename it at the same time while copying it into the same exact directory again? I'm not sure what the "two lines" thing means either. I have an email out to him, but it usually takes a long time for him to answer me. I've got a lot of work to do, so every time I get hung up it kills me.
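For the copy-and-rename part at least, cp does both in one step; the source file name here is assumed from the exercise:
Code:
cp commands commands.sorted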
I am trying to get preseed working on a bunch of machines with multiple NICs, but it doesn't pick the right interface and/or gets "no link" on all interfaces. My PXE kernel lines look like this (I have auto=true, priority=critical and interface=auto):
label squeeze
kernel debian-installer/squeeze/i386/linux
append vga=788 initrd=debian-installer/squeeze/i386/initrd.gz auto=true priority=critical ramdisk_size=10800 root=/dev/rd/0 rw url=example.com/d-i/squeeze/preseed.cfg interface=auto netcfg/dhcp_timeout=60
[Code]...
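For reference, the preseed documentation's canonical spelling of the interface choice is netcfg/choose_interface; whether this helps with "no link" detection depends on the installer version, so treat it as a hedged suggestion alongside a longer netcfg/dhcp_timeout:
Code:
# in preseed.cfg (or choose_interface=auto on the kernel line):
d-i netcfg/choose_interface select auto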
I am using 10.04 and quite happy with the way it is working; my nephew and my mother have also switched to Ubuntu.
To save on internet usage and load on the Ubuntu servers, can I download the upgrade files on one computer and then upgrade all systems?
I have three Ubuntu desktops that I would like to upgrade from 9.10 to 10.04. Is there a way to avoid having each PC download the same packages? Is there some magic I can do with two of the PCs to maybe point the software source list at the third 'master' PC that does all the downloading?
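Two commonly suggested routes: a caching proxy such as apt-cacher-ng on the "master" PC, or simply copying the downloaded packages between the machines' apt caches before upgrading (the hostname is a placeholder):
Code:
# after the first machine has downloaded everything:
scp /var/cache/apt/archives/*.deb otherpc:/var/cache/apt/archives/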