Software :: Copying All Directories On A Server?
Aug 9, 2010. I would like to copy all the directories (including data) from the Linux box to an external hard drive.
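For what it's worth, a minimal sketch of one common approach, assuming the drive is already mounted (the /mnt/external path is a placeholder): rsync preserves ownership, permissions, ACLs and timestamps, and --one-file-system keeps it out of /proc, /sys and the drive itself.
Code:
# /mnt/external is an assumed mount point; adjust to wherever the drive is mounted
sudo mkdir -p /mnt/external/backup
sudo rsync -aAXv --one-file-system / /mnt/external/backup/
Note that --one-file-system also skips separately mounted partitions such as a standalone /home, so list any such mounts explicitly.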
View 1 Replies
The current directory contains a file called "original.txt" and many directories called "source_001", "source_002", "source_003" ... From the command line, how do you copy "original.txt" into "source_001" and "source_002" and "source_003" ...?
The total number of these source directories is unknown; it changes every week.
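A simple glob loop handles this no matter how many source directories exist in a given week; a sketch, assuming it is run from the directory that holds original.txt:
Code:
for dir in source_*/; do
    cp original.txt "$dir"
done
The trailing slash in the glob matches directories only, so a stray file named source_999 would be skipped.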
I have 60+ directories, each containing multiple .doc files. I need to move them to a single directory and keep their file names intact. I don't think cp will do that without listing all the file names. I was thinking of something like cp -r /dir/*.doc /newdir, or should I use a combination like find -type *.doc | cp /newdir?
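Neither draft quite works as written: a glob like /dir/*.doc does not descend into the 60+ subdirectories, find's -type takes f or d rather than a pattern (that is -name), and cp does not read file names from a pipe. A hedged sketch using find to walk the tree and GNU mv to collect everything (/dir and /newdir are the placeholders from the question):
Code:
# -n refuses to overwrite, which matters if two directories hold files with the same name
find /dir -type f -name '*.doc' -exec mv -n -t /newdir {} +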
View 5 Replies View Related
After scouring the internet I was able to find an Adobe release of Adobe Flash Player "Square" for 64-bit Linux installations, in the form of a single file (libflashplayer.so). After copying the file to several different Mozilla directories, and also making libflashplayer.so executable with chmod +x libflashplayer.so, I was unable to install the plug-in. To what directory must I install this .so file to get it to work? Also, when I open the browser and visit about:plugins I don't see Adobe Flash Player Square yet. Is there an intermediary step?
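The per-user Mozilla plugin directory is the usual destination, and chmod +x is not needed: the browser loads the file as a shared library rather than executing it. A sketch (a system-wide alternative such as /usr/lib/mozilla/plugins varies by distribution):
Code:
mkdir -p ~/.mozilla/plugins
cp libflashplayer.so ~/.mozilla/plugins/
# restart the browser, then check about:plugins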
View 8 Replies View Related
Description: I am a newly appointed system engineer taking care of Linux servers. We have a new set of data coming in which needs the configuration below. How do I write a script, with a function, that does the following?
for each file ending in ".txt" in sm:
    copy the file to the folders sm1 and sm2 (log every copy)
    if successful:
        remove the original
        write an entry to the log file
    if not successful (i.e. one particular file could not be copied to all the folders):
        retain the file and retry
        write an entry to the log file
        mail the admin with that particular file name
Here is what I have already tried:
cd /export/home/
for dir in sm1 sm2; do
cp -p sm/*.txt $dir/
done
Is my start right? How do I do the rest?
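The loop is the right start; what is missing is per-file error handling, logging, and the mail alert. A sketch under stated assumptions: the log path and admin address are placeholders, and the mail(1) command from mailx is assumed to be installed.
Code:
#!/bin/bash
BASE=/export/home
LOG=$BASE/copy.log
ADMIN=admin@example.com          # placeholder address

cd "$BASE" || exit 1
for f in sm/*.txt; do
    [ -e "$f" ] || continue      # glob matched nothing: no .txt files this run
    ok=1
    for dir in sm1 sm2; do
        if cp -p "$f" "$dir/"; then
            echo "$(date '+%F %T') copied $f -> $dir" >> "$LOG"
        else
            echo "$(date '+%F %T') FAILED $f -> $dir" >> "$LOG"
            ok=0
        fi
    done
    if [ $ok -eq 1 ]; then
        rm -f "$f"               # remove the original only after every copy succeeded
        echo "$(date '+%F %T') removed original $f" >> "$LOG"
    else
        # the file is retained, so the next run retries it automatically
        echo "copy failed, $f retained for retry" | mail -s "copy failure: $f" "$ADMIN"
    fi
done
Run it from cron at whatever interval the data feed requires; retained files are simply picked up again on the next pass.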
How do I copy and/or move files to the base folder of a user? I don't know what it is called, so I do not know what to put in my mv command. I know you would normally put mv filename /directoryname, but what is the base folder of a user called?
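It is called the user's home directory, and it normally lives under /home/<username> (~ is shorthand for your own). For example:
Code:
mv filename /home/username/   # username is a placeholder
mv filename ~/                # the same thing, for your own account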
View 1 Replies View Related
I know little if anything about svn, so I need your assistance. I have a Linux server running svn and a Windows server running svn. First, I'm looking for a way to copy an svn repository from the Linux server to a new repository on the Windows server. I've looked for how to do this on the Internet and within the apache.org website (Apache is now the maintainer of SVN).
I haven't found anything at all granular on how to copy a repository from one OS to another; people just say it can be done, not how.
Second, is there a way to do this automatically on a regular basis? I see the manual processes of svnadmin dump and svnadmin load, and there is also the manual svnadmin hotcopy.
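The dump/load cycle is the portable route between operating systems, since a dump file is OS-independent while hotcopy output is tied to the source machine's layout. A sketch with placeholder paths:
Code:
# On the Linux server: serialize the full repository history
svnadmin dump /var/svn/myrepo > myrepo.dump
# Transfer myrepo.dump to the Windows box (scp, a share, ...), then there:
#   svnadmin create C:\Repositories\myrepo
#   svnadmin load  C:\Repositories\myrepo < myrepo.dump
For a recurring sync, svnadmin dump --incremental -r <last>:HEAD from cron can ship only the new revisions, with the last-shipped revision number kept in a small state file.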
I'm trying to rsync files and directories from a RedHat Linux host (v4.5 & v4.7) to a Windows Server 2003 R2 Standard Edition machine running cygwin. I'm executing the rsync command from the cygwin shell. The transfer involves rsync'ing approximately 1 TB of data from the Linux server to the Windows server. After about 280+ GB of data transferred, the transfer just dies.
There seems to be no particular file or directory that the transfer stops at. I'm able to rsync gigabytes of data from other Linux hosts to this cygwin server with no problem; files and directories rsync fine. The network infrastructure is essentially the same regardless of the server being rsync'ed, in that it is gigabit Ethernet running through Cisco gigabit switches. There appear to be no glitches or hiccups across the network path.
I've asked the folks at rsync.samba.org if they know of any problems or issues. Their response has been neutral: if the version of rsync that cygwin has ported is within standards, there is no rsync reason this should happen. I've asked the cygwin support site if they know of any issues and they have yet to reply. So, my question is whether the version of rsync ported to cygwin is standard, and if so, whether there is any reason cygwin & rsync keep failing like this.
I've asked the local rsync-on-Linux gurus and they can't see any reason this should fail from a Linux perspective. Apparently I am our company's cygwin knowledge base by default.
I have one doubt: is copying /var/lib/mysql a good alternative to mysqldump?
I use rsync to copy /var/lib/mysql for backup without dumping the database; rsync does a differential copy of /var/lib/mysql to /var/tmp every minute.
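A caution worth adding: rsyncing /var/lib/mysql while mysqld is running can capture InnoDB files mid-write, so the copy may not be restorable. A consistent file-level snapshot needs the server stopped (or otherwise quiesced); a sketch:
Code:
# A file-level copy is only trustworthy with mysqld stopped
# (the service name varies by distro: mysqld or mysql)
service mysqld stop
rsync -a /var/lib/mysql/ /var/tmp/mysql-backup/
service mysqld start
mysqldump, or a snapshot-capable layer such as LVM, avoids the downtime at the cost of a slower restore.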
I've run into a pickle regarding permissions and ACLs. My users have a folder called "UVdokumenter", in which they want to put documents up for sharing and editing. Everyone needs to be able to edit the files in that folder, so I set the ACL to this:
Code:
# file: UVdokumenter/
# owner: fleten
[code]....
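The piece that usually gets missed here is the *default* ACL: a plain ACL covers existing files, while a default ACL on the directory is what newly created files inherit. A sketch, assuming the sharing group is named "users" (a placeholder):
Code:
setfacl -R -m g:users:rwx UVdokumenter/      # existing files and subdirectories
setfacl -R -d -m g:users:rwx UVdokumenter/   # default ACL: inherited by new files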
My current setup is:
old server:
www.mydomain.com main site
www.mydomain.com/subdirectories related sites on the same server, in different directories
I am adding an additional server that I would initially like to use only for the main site, something like this:
new server:
www.mydomain.com main site
www.mydomain.com/subdirectories would be pointed back to the old server instead
What's the best way to redirect the traffic for the sites found in sub-directories on the old server?
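Since both hostnames are the same, a plain Redirect would loop straight back to the new server; a reverse proxy on the new server avoids that by fetching the subdirectory content from the old machine under a different address. A sketch for Apache with mod_proxy and mod_proxy_http enabled (the old server's internal hostname is a placeholder):
Code:
# In the www.mydomain.com vhost on the new server:
ProxyPass        /subdir1/ http://oldserver.internal/subdir1/
ProxyPassReverse /subdir1/ http://oldserver.internal/subdir1/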
I am in need of Linux help. I am at college and I need this backup/restore script to pass the final part of an assessment. I require a backup script that will not only back up but also restore files to the relevant directories; e.g. users are instructed to store all word-processor files in a directory named wp. So I need to create a backup directory, and three directories within it, and some files within those three directories, and then back them up and restore them. I know I should do this myself, but I've been trying to find and understand the relevant information for the last few days and have come up with zero.
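A minimal sketch of the shape such a script can take; the three data directories (wp, sp, db) are assumed names, so substitute whatever the assessment specifies:
Code:
#!/bin/bash
mkdir -p backup wp sp db             # the backup dir plus three data dirs
case "$1" in
    backup)  tar czf "backup/files_$(date +%Y%m%d).tar.gz" wp sp db ;;
    restore) tar xzf "backup/$2" ;;  # $2 names an archive created above
    *)       echo "usage: $0 backup | restore <archive>" ;;
esac
Because the paths in the archive are relative, the restore unpacks wp, sp and db back into the directory the script is run from.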
View 14 Replies View Related
I have a server with a RedHat distro which stores and updates some files, and a second computer with Windows 7 on it. I must configure both machines so that one can get those files from the server to the Win7 machine using ssh, but I have no idea how to do it.
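One low-friction route is PuTTY's pscp.exe on the Windows 7 side, which speaks scp against the stock RedHat sshd with nothing extra installed on the server. A sketch, with host and paths as placeholders (WinSCP is a GUI alternative doing the same thing):
Code:
:: Run in a Windows command prompt, with pscp.exe in PATH
pscp -r user@redhat-server:/srv/files/ C:\incoming\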
View 3 Replies View Related
I'm trying to copy files from my current server to a new server. Both servers have SSH installed. These are the commands I'm using; however, I'm getting "connection refused". I did a google search and found that the reason for this error could be that my current server doesn't have SSH, but I use SSH often on my current server, so I can say for sure that it has it.
OpenSSH_4.3p2, OpenSSL 0.9.8e-fips-rhel5 01 Jul 2008
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Applying options for *
debug1: Connecting to IP_OF_CURSERVER [IP_OF_CURSERVER] port 22.
debug1: connect to address IP_OF_CURSERVER port 22: Connection refused
ssh: connect to host IP_OF_CURSERVER port 22: Connection refused
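"Connection refused" comes from the machine scp is connecting *to*: nothing is listening on port 22 at that address, or a firewall is rejecting the connection, regardless of whether the machine you type the command on runs SSH. Worth checking on the destination box, for example:
Code:
service sshd status          # is the daemon running at all?
netstat -tlnp | grep :22     # is something listening on port 22?
iptables -L -n | grep 22     # is the firewall letting it through?
If sshd turns out to listen on a non-standard port, pass it to scp with -P <port>.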
I need to copy a very large number of files, about 1 lakh (100,000), from one server to another. When I tried various commands using scp, ftp, etc., I got "Arg list too long". How can we copy all the files? Both servers run Linux.
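"Arg list too long" comes from the shell, not from scp: the glob expands 100,000 names past the kernel's argument-size limit. Anything that walks the directory itself avoids the limit; two sketches with placeholder paths:
Code:
# rsync walks the tree itself, and resumes cheaply if interrupted
rsync -av /data/ user@remote:/data/
# or stream a tar archive over ssh
tar cf - -C /data . | ssh user@remote 'tar xf - -C /data'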
View 4 Replies View Related
I would like to copy all files from my server001 (/var/www/vhosts/*/httpdocs/) to my server002 (/var/www/virtual/*/htdocs/). I would do it via rsync, but I don't want to do it as root! Which would be the right user to log in as via rsync? www-data? It's the group of each domain folder...
Quote:
server002:/var/www/virtual# ls -lh
total 4.0K
drwxrwx--- 10 vu2001 www-data 4.0K Mar 9 09:58 domain.com
server002:/var/www/virtual#
but the files inside htdocs are only accessible to the user!
Code:
server002:/var/www/virtual/domain.com/htdocs# ls -lh | grep index.php
-rwxr-xr-x 1 vu2001 vu2001 397 24. Feb 23:30 index.php
server002:/var/www/virtual/domain.com/htdocs#
server002 will be the backup server if server001 goes down!
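www-data is only the group here; the listing shows the files belong to a per-domain user (vu2001 for domain.com). One hedged approach is to sync per domain as each owning user, assuming matching accounts exist on server001; run on server002 as that user so ownership stays intact:
Code:
# One domain per run, as its owner; hosts and paths are from the question
rsync -av vu2001@server001:/var/www/vhosts/domain.com/httpdocs/ \
         /var/www/virtual/domain.com/htdocs/
A root-run rsync with -a is otherwise the usual way to preserve many different owners in one pass, which is exactly the trade-off being avoided here.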
I am working with a DM355 target board, where we record the video coming from IP cameras. Now I have to write a C program to copy the recorded .avi files, stamped with date and time, to a NAS server using scp. I wrote a script to copy a single file to the NAS server:
#!/bin/bash
DATE=$(date +%Y%m%d_%H_%M_%S)
mv Camera1.avi Camera1_$DATE.avi
scp Camera1_$DATE.avi root@192.168.1.4:/root/test/
mv Camera1_$DATE.avi Camera1.avi
But now I have to write a C program to copy multiple .avi files, with date and time, to the NAS server.
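Before reaching for C, the same logic loops in the shell, and the C program can then be little more than a system() call around it. A sketch extending the script above (same NAS address and path):
Code:
#!/bin/bash
for f in *.avi; do
    [ -e "$f" ] || continue                          # no .avi files present
    stamped="${f%.avi}_$(date +%Y%m%d_%H_%M_%S).avi" # per-file timestamp
    mv "$f" "$stamped"
    if scp "$stamped" root@192.168.1.4:/root/test/; then
        mv "$stamped" "$f"                           # restore the name on success
    fi
done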
I want to make a webserver with multiple users allowed to log in through SFTP to a specific folder, www. Multiple users are added, let's say user1 and user2, all of them belonging to the www-data group. The www directory has owner www-data and group www-data.
I have used chmod -R 775 on the www folder, but after I create a folder "test" through my SFTP session (using FileZilla), the group of the created directory has only r and x permissions, and so I am not able to log in as the second user, user2, and create a directory within www/test, due to the lack of the w permission for the group.
I also tried using chmod 2775 on the www directory, but without luck. Can somebody explain how I can make a newly created directory inherit the parent directory's group permissions?
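chmod 2775 (setgid) is half the answer: it makes new entries inherit the www-data *group*, but the missing group-w bit comes from the users' umask, which SFTP applies to every new file and directory. With OpenSSH's internal-sftp the umask can be forced server-side; a sketch for /etc/ssh/sshd_config (the -u option exists in newer OpenSSH releases; with older ones a pam_umask or profile-based umask is the fallback):
Code:
# sshd_config: force a group-writable umask for SFTP sessions
Subsystem sftp internal-sftp -u 0002
Keep the setgid bit on www itself (chmod 2775 www) so the group keeps propagating, and restart sshd after the change.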
I'm hoping to set up a cron job that takes a file and copies it to a remote password-protected FTP server. I've got a command that formats the file with the correct name, and I've put it in the anacron file in /etc/cron.d (which I think is right; I haven't tested it yet). I'm not sure how to copy the file to a remote server, though. I do have the FTP server bookmarked in my Places menu, so is there a simple way of supplying a file path that will put it straight into that folder? The only problem I can see with this is that the connection won't be open continuously, so it would need to be re-opened when needed (I could presumably save the password in the keyring so that I don't need to be there to type it in).
Or maybe I should set up a cron job that connects to and mounts the FTP server a minute before it has to copy the file over?
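There is no need to mount anything: curl can push a file to a password-protected FTP server in one line, which drops straight into a cron entry. A sketch with placeholder host, credentials, and path:
Code:
# crontab entry: upload at 00:05 every day
5 0 * * * curl -sS -T /home/me/report.csv ftp://ftp.example.com/incoming/ --user myuser:mypass
Putting the credentials in ~/.netrc and using curl --netrc keeps the password out of the crontab.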
I have a problem copying files from a remote computer to my local one using the scp command. I am sure that I am using it correctly; please check below:
---
blah@blah.com:~/g4work> scp blah2@blah2.com:IndirectMethod_Spher...s/H_1.mac.root .
---
What I get in return (instead of the message saying 100% of the file was copied) is:
---
On this machine the G4SYSTEM=Linux-g++
---
The interesting point is that the returned statement is one of the environment variables set on both machines that are necessary to work with a toolkit called Geant4. Here is what I get when I type 'printenv | grep G4' (note the G4SYSTEM line):
---
G4LEVELGAMMADATA=/home/blah/geant4/geant4.9.3.p02/data/PhotonEvaporation2.0
G4INSTALL=/home/blah/geant4/geant4.9.3.p02
G4LEDATA=/home/blah/geant4/geant4.9.3.p02/data/G4EMLOW6.9
G4NEUTRONHPDATA=/home/blah/geant4/geant4.9.3.p02/data/G4NDL3.13
G4VIS_BUILD_OPENGLX_DRIVER=1
G4RADIOACTIVEDATA=/home/blah/geant4/geant4.9.3.p02/data/RadioactiveDecay3.2
G4ABLADATA=/home/blah/geant4/geant4.9.3.p02/data/G4ABLA3.0
G4LIB=/home/blah/geant4/geant4.9.3.p02/lib
G4VIS_BUILD_RAYTRACERX_DRIVER=1
G4LIB_BUILD_SHARED=1
G4VIS_USE_OPENGLX=1
G4UI_USE_TCSH=1
G4VIS_USE_RAYTRACERX=1
G4REALSURFACEDATA=/home/blah/geant4/geant4.9.3.p02/data/RealSurface1.0
G4SYSTEM=Linux-g++
G4WORKDIR=/home/blah/g4work
---
The other thing I would like to mention is that these Geant4 environment variables are loaded each time a new (bash) shell is started, as a result of the bash login script.
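That stray line is the whole problem: scp runs a non-interactive shell on the remote side, and anything the login scripts print is injected into scp's data stream, corrupting the transfer. The usual fix is to print only in interactive shells; a sketch for the remote ~/.bashrc (or the Geant4 setup script it sources):
Code:
# Only chatter when the shell is interactive; scp/sftp sessions are not
case $- in
    *i*) echo "On this machine the G4SYSTEM=$G4SYSTEM" ;;
esac
The environment variables themselves can still be exported unconditionally; only the echo needs guarding.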
I'm planning to copy a production MySQL InnoDB file from one server to another, and the file size is around 300GB. As the file keeps changing all the time, I have to shut down the MySQL instance and copy the large data file to the other server as quickly as possible. I have to find a way to speed up the file copying, and I'm wondering whether there's a way to copy the file block by block: if the destination-side block has the same content, skip it.
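That block-by-block comparison is exactly rsync's delta-transfer algorithm, and --inplace makes it patch the destination file rather than rewrite it. A common pattern is to pre-seed while MySQL is still running, then stop it for a short final pass; the file name, host and paths here are placeholders:
Code:
# Pass 1 (mysqld still running): inconsistent, but moves nearly all blocks
rsync -av --inplace /var/lib/mysql/ibdata1 user@newserver:/var/lib/mysql/
# Stop mysqld, then repeat the same command: only changed blocks transfer
This shrinks the downtime window from "copy 300GB" to "copy whatever changed since the pre-seed".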
View 4 Replies View Related
We have a CentOS 5.5 x86_64 machine running MySQL 5.0.91-community. Currently we have a very high number of Created_tmp_disk_tables (31k in 4.5 hours!). I've read the suggestion that we need to increase tmp_table_size, and we've set tmp_table_size to 64M, but this Drupal module's query still causes MySQL to create its tmp table on disk:
Code:
SELECT DISTINCT node.nid AS nid,
comments.subject AS comments_subject,
[code]...
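Two things commonly keep this counter climbing even after raising tmp_table_size: the effective in-memory limit is min(tmp_table_size, max_heap_table_size), and any temporary table containing TEXT/BLOB columns (common in Drupal's node and comment queries) goes to disk regardless of either limit. A my.cnf sketch for the first issue:
Code:
[mysqld]
tmp_table_size      = 64M
max_heap_table_size = 64M    # the smaller of the two is what actually applies
For the second issue, the only fixes are on the query side, e.g. trimming TEXT columns out of the DISTINCT select list.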
Problem: CentOS 5.3 hangs while data is remotely copied onto this server.
I have CentOS 5.3 x86 installed on a quad-core Xeon with 8 GB DDR2 RAM and RAID 1 configured on SATA drives. This is an old installation and it has many more things running on it, so I can't just reinstall.
I have set up a Samba PDC + LDAP on it. The system hangs when I copy data onto this server from LAN computers.
My troubleshooting steps:
1. I have checked for a network issue; my network works fine when I transfer data between other LAN computers.
2. I have even tried configuring a separate LAN...
3. I thought there might be some problem with my Samba configuration, so I also tried the data transfer over sftp.
If I share directories with NFS, how do I control the users' access to the information?
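Access is controlled in two layers: /etc/exports decides which hosts may mount what, and whether read-only or read-write, while ordinary file permissions on the server decide what each user may touch. An exports sketch with placeholder paths and network:
Code:
# /etc/exports
/srv/share/public   192.168.1.0/24(ro,sync)
/srv/share/projects 192.168.1.0/24(rw,sync,root_squash)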
View 1 Replies View Related
I'm planning an NFS share for a small enterprise (25 NFS clients). I need to create a directory structure, but I'll need to set up different permissions (rw/ro) on some directories of the tree. I wonder if it's possible to grant access using group IDs, which would be ideal for this application. Is it possible? I was thinking that I would need some kind of centralized user info, such as NIS or LDAP. Is that necessary?
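Group IDs work fine for this, with one caveat: classic NFS trusts the numeric UID/GID the client presents, so the same IDs must mean the same accounts on all 25 clients. That is why NIS or LDAP helps; it is not strictly required, but far less error-prone than syncing /etc/passwd and /etc/group by hand. A server-side sketch with placeholder names:
Code:
groupadd -g 5000 projects             # fixed GID, recreated on every client
chgrp -R projects /srv/share/projects
chmod 2770 /srv/share/projects        # setgid: new files/dirs inherit the group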
View 4 Replies View Related
I'm having trouble setting up a vsftpd server correctly. What I want to do is allow a number of users to log on (no anonymous user) and have each of them taken to their own "top level directory", from which they cannot escape.
I've got most of this working, but I can't find a way to automatically send each user to *their own* working area. The "local_root" directive doesn't quite do what I want, as everybody has to share the same working area (potentially users could interfere with each other). On the other hand, I don't want each user to work from their home directory, because there are loads of special files there that I don't want users playing with.
To add one extra complication, I'm also running an HTML server on the same machine. One of the directories the HTML server can see is one of the FTP area root directories. (So what I'm trying to do is give one special user the ability to ftp files onto the HTML server; other users must *NOT* have this ability.)
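vsftpd can do exactly this with per-user config files: user_config_dir points at a directory of files named after each user, and a local_root inside one of those files applies to that user only. A sketch with placeholder paths:
Code:
# /etc/vsftpd.conf
chroot_local_user=YES
user_config_dir=/etc/vsftpd/user_conf

# /etc/vsftpd/user_conf/alice   (one file per user)
local_root=/srv/ftp/alice
The special user's file would point local_root at the html-visible directory; everyone else gets their own tree and, thanks to the chroot, no way out of it.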
I am writing a script. My requirement is this: all the file types are stored in one directory, and from that directory we need to separate them into different directories based on the file types.
For example, in a directory (anish) there are 5 items of different types:
1 directory
2 .txt files
2 .sh files
My requirement is that the directory is moved to a new directory (dir) given in the script, the 2 .txt files are moved to another new directory (test) given in the script, and the 2 .sh files are moved to another new directory (bash) given in the script. Finally, the directory anish should be empty. How is this possible using a bash script?
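A sketch of one way to do it, using the directory names from the question (anish as the source; dir, test and bash as destinations created by the script):
Code:
#!/bin/bash
mkdir -p dir test bash
cd anish || exit 1
for item in *; do
    if   [ -d "$item" ];         then mv "$item" ../dir/
    elif [[ "$item" == *.txt ]]; then mv "$item" ../test/
    elif [[ "$item" == *.sh ]];  then mv "$item" ../bash/
    fi
done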
I'm running an Ubuntu 9.10 Linux server. I'm trying to find a way to back up the machine while it is running, and from what I've seen this rules out the disk-clone utilities: all the disk-clone tools I have seen for Linux require rebooting into a special live CD. So my question is this: what is the best solution for backing up the system while it is running? Also, I don't really care much about the OS config; I just want to be able to keep my stored files and the programs I have installed.
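For a file-level backup of a running server, rsync is the usual answer: it copies only what changed and does not care that the system is live. A sketch with a placeholder destination, paired with the package list so installed programs can be reinstalled after a fresh OS install:
Code:
rsync -a --delete /home /etc /var/www /srv /mnt/backup/
dpkg --get-selections > /mnt/backup/package-list.txt   # restore later via dpkg --set-selections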
View 3 Replies View Related
I have had a server running for a very long time on Ubuntu Server 7.10, and I think it's past time I upgraded. I'll be installing fresh, and I've already backed up /var/www (as well as a home directory with a few files). I've only used this as a web / SFTP / file server. Might there be any other directories that would be good to back up? I set it up so long ago and have made a few changes along the way.
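Beyond /var/www and the home directories, the state that tends to get forgotten is configuration and scheduled jobs. A hedged grab-bag, trimmed to what actually exists on the box:
Code:
tar czf ~/pre-upgrade.tar.gz /etc /var/spool/cron /var/mail /root
/etc in particular carries the ssh, ftp and web-server configuration accumulated over the years; diffing it against the fresh install is easier than reconstructing it from memory.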
View 1 Replies View Related
I want to get all the directories from a remote server using ftp. I know how to use mget for files; I would like to know if there is a similar way to get a whole directory, with the files included, obviously.
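Plain ftp's mget does not recurse, but both wget and lftp will mirror a whole remote tree over FTP. Two sketches with placeholder host, credentials, and paths:
Code:
wget -m ftp://user:password@ftp.example.com/pub/
# or
lftp -u user,password -e 'mirror /pub /local/pub; quit' ftp.example.com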
View 1 Replies View Related