Server :: Backup A Single Directory And Its Subdirectories On Lucid To A FreeNAS Box Across Network
Jul 20, 2010
This should be a quick one. I'm trying to back up a single directory and its subdirectories on my Lucid server to a FreeNAS box across my network. This is what I'm using to do that:

Code:
rsync -r -a -v -z * --delete freenas:dSIBackups

It almost works perfectly, except for one problem: when a file is deleted at the source, this command doesn't seem to delete it on the receiving end. I assumed that --delete would do that, but apparently not. Can anyone think of a reason this would happen?
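The likely culprit is the * glob: the shell expands it before rsync ever runs, so a file deleted at the source simply isn't in the source list any more, and rsync has nothing to match it against on the receiving end. A minimal sketch of the usual fix, assuming the command runs from inside the directory being backed up (note that -a already implies -r):

Code:
# sync the directory itself (trailing slash = its contents); rsync now
# walks the whole tree, so --delete can remove files gone from the source
rsync -avz --delete ./ freenas:dSIBackups/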
I am trying to write a very simple script that will go to every subdirectory of a single directory and run a command (let's call it make_ndx). I know I can write this the long way in a text document with something like:

Code:
cd /"the directory"/"the 1st subdirectory"
make_ndx
cd ..
cd "the 2nd subdirectory"
cd ..
Alternatively, I also tried:

Code:
for i in 'find /path/somemorepath -type d -mindepth 1'; do cd $i; make_ndx -f *.gro; done

which returns the error cd: find: no such file or directory. But if I run the find command by itself, to test whether I am calling the right directories, it gives me exactly the output I am looking for. Any ideas? Should I just write the find results to a file and loop through the contents of the file (which seems a bit like overkill), or am I making a simple typographical mistake that I'm just not seeing?
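The error message gives it away: those are plain single quotes, not backticks, so the literal string "find /path/somemorepath ..." is handed to cd as a directory name. A hedged rewrite using $(...) command substitution (note it still breaks on directory names containing spaces):

Code:
# run make_ndx in every subdirectory; the subshell (...) keeps each
# cd from affecting the next loop iteration
for i in $(find /path/somemorepath -mindepth 1 -type d); do
    (cd "$i" && make_ndx -f *.gro)
done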
I'm trying to set up an Apache server on my computer which will allow browsing of files in a specific directory and its subdirectories, without needing any sort of authentication.
I've got the Apache2 server up and running through YaST, and everything works fine as long as I point it at the /www/htdocs folder. However, I want to point it at another folder, which is on another partition. This partition is formatted as NTFS, if that matters at all (here's some background on some permissions issues I had with the NTFS partitions recently).
When I change the "Directory" setting in the YaST HTTP server configuration utility to the directory on the NTFS partition I wish to use, attempting to access the server results in the following error:
Code:
Access Forbidden: You don't have permission to access the requested directory. There is either no index document or the directory is read-protected. If you think this is a server error, please contact the webmaster.
Error 403 192.168.1.100 Mon Jun 13 23:43:29 2011 Apache/2.2.17 (Linux/SUSE)
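A 403 here usually means Apache lacks either filesystem permission on the new directory or a Directory block granting access to it. A sketch for Apache 2.2 (the version in the error message), assuming the partition is mounted at the hypothetical /data/share:

Code:
# e.g. in /etc/apache2/httpd.conf.local (path and mount point assumed)
<Directory "/data/share">
    Options +Indexes +FollowSymLinks
    AllowOverride None
    Order allow,deny
    Allow from all
</Directory>

One NTFS-specific wrinkle: chmod has no lasting effect on an ntfs-3g mount, so the read and execute bits the Apache user needs have to come from the mount options (uid/gid/umask or fmask/dmask in /etc/fstab).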
I am currently trying to figure out the best way to back up several PCs (about five computers, each with Windows 7) for my family.
As I want the same solution for all computers, I set up my old computer (Windows 7) and added some hard drives; there should now be enough space to back up everyone's data over the network. (Let's call this computer the "Server".)
But now I am wondering what's the best way to do this. What I do not want:
- I do not want to start the Server manually each time a computer tries to back up. (I thought about using Wake-on-LAN, but I do not know if this is a good idea; see the sketch below.)
- I do not want the Server to run permanently.
- I do not want to make the backups manually; they should run automatically about every week.
So which software on the computers, or on the "Server", would you recommend?
Or would you even recommend using Linux on the Server? If so, which software would you use then?
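On the Wake-on-LAN idea: it is a workable pattern, and the logic is simple regardless of OS. A rough sketch of what one client's scheduled job could do, written as a shell script purely for illustration (the MAC address, hostname, and paths are all placeholders; on Windows 7 the same steps would be done with a WOL utility and a Scheduled Task):

Code:
#!/bin/sh
# hypothetical weekly job: wake the Server, wait until it answers, back up
wakeonlan AA:BB:CC:DD:EE:FF                 # Server's MAC (placeholder)
until ping -c1 -W2 backup-server >/dev/null 2>&1; do sleep 5; done
rsync -a --delete /home/ backup-server:/backups/$(hostname)/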
I am going to use 'rsnapshot' to create backups of my local and remote computers. My intention is to use these backups to restore disk images when needed: install Linux from scratch and then copy the subdirectories from the 'rsnapshot' backup to the new Linux installation. My question is: which subdirectories should be excluded from the backup?
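The usual candidates are the pseudo-filesystems and volatile directories that the kernel recreates at boot, plus mount points that could recurse into the backup itself. A common starting point as rsnapshot.conf exclude lines (rsnapshot requires the fields to be separated by TABs, not spaces; the list below is a baseline, not gospel):

Code:
# kernel/pseudo filesystems: regenerated at boot, never restore these
exclude /proc/*
exclude /sys/*
exclude /dev/*
# volatile data and removable-media mount points
exclude /tmp/*
exclude /run/*
exclude /mnt/*
exclude /media/*
exclude /lost+found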
I am trying to create a backup script that will back up a single folder for a class I am in, and I was wondering if I could get some help. If possible, I would also like to know how to write a script that can encrypt that same file. I will be putting the backup in my /home/usr/Backup directory. I am not trying to back up my whole system, just a single folder. I am using Fedora 11.
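A minimal sketch of both steps: tar for the backup and gpg for symmetric encryption. The source folder name is a placeholder, and gpg -c will prompt interactively for a passphrase:

Code:
#!/bin/bash
# back up one folder into ~/Backup, then encrypt the archive
SRC="$HOME/classwork"                      # folder to back up (assumed)
DEST="$HOME/Backup"
STAMP=$(date +%Y%m%d)
tar czf "$DEST/classwork-$STAMP.tar.gz" "$SRC"
# symmetric encryption; remove the plaintext archive only on success
gpg -c "$DEST/classwork-$STAMP.tar.gz" && rm "$DEST/classwork-$STAMP.tar.gz"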
I have a file server on my network. It is accessed mainly by Linux machines through NFS, but sometimes I need to access it from Windows, and I managed to get Samba up and running with only one share and no password, which is what I want. My users have their "private" folders, which are just chmodded 700; under NFS this works fine, but on Samba I get, of course, access denied. How can I configure Samba so that it asks for a password to access those directories? They can become separate shares with their own usernames and passwords (not the ones in /etc/passwd on the server); I don't care.
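Per-user shares with their own passwords are straightforward, since Samba keeps its own password database separate from /etc/passwd (the Unix account still has to exist, but its login password is not used). A sketch for one user, with an assumed path:

Code:
# smb.conf fragment: a password-protected share for one private folder
[alice]
    path = /srv/files/alice
    valid users = alice
    read only = no

The Samba password is then set with smbpasswd -a alice, independently of the system password.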
I've been doing some file sharing with Ubuntu, and I've noticed that only the files in the immediate directory are shared; the rest of the folders are shown on other PCs, but access is denied. How can I share all the subdirectories in a folder without having to do them manually?
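The symptom matches missing permissions on the subdirectories: a remote user needs the execute (search) bit on every directory level, not just the top one. One hedged fix, assuming world-readable sharing is acceptable for this folder:

Code:
# add read for everyone, and execute only where it makes sense
# (capital X sets x on directories but not on plain files)
chmod -R a+rX /path/to/shared/folder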
What's a good cron script for backing up and zipping a directory of files, or multiple directories with files, to a backup directory on my server on a daily basis? I found an easy-to-use MySQL backup script; now I need to back up my site directory, but not all the directories in it. So I need a way for the script to omit certain directories from the backup, i.e. dirs that contain gigs worth of files. This seems like it should be one of the most common cron jobs to set a server up with, but two pages deep into Google (and here) I have yet to find anything remotely resembling a solution.
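A sketch of the usual shape, with placeholder paths and two hypothetical heavyweight directories excluded; the commented line at the end is the matching crontab entry:

Code:
#!/bin/bash
# nightly site backup, skipping directories that are too big to archive
SITE=/var/www/mysite                 # site root (placeholder)
DEST=/backup                         # backup target (placeholder)
tar czf "$DEST/site-$(date +%F).tar.gz" \
    --exclude="$SITE/uploads" \
    --exclude="$SITE/cache" \
    "$SITE"

# crontab entry to run it daily at 02:30:
# 30 2 * * * /usr/local/bin/site-backup.sh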
1. How can you find all first-level subdirectories under the current directory?
2. How will you show the last 100 lines of the file "foo.log"?
3. How will you stream the contents of the log file "foo.log" as it gets written to?
4. How can you grep for a pattern in a gzip'ed file? E.g., find "foo" in bar.gz.
5. Find all lines in the file "foo" which DON'T have the pattern "bar".
6. Your web server is running very slowly. If you can log in to the server, what command will you run to find out CPU and memory use?
7. Extract the file foo, which is part of the tar'ed, gzip'ed file bar.tar.gz.
8. You attach a USB disk to your Linux desktop, but it does not show up. How can you get more information about the error?
9. What is the secure way to log in to remote systems?
10. What is the difference between TELNET and SSH?
11. Given a file 'a' with the following permissions: -rwxrwxrwx 1 rohit rohit 0 2011-01-24 13:30 a. Change its permissions such that it is only readable and writable by its owner, not accessible by anybody else in the group, and only executable by the world.
12. What is the difference between using ' and " for quoting a string / command in a shell?
13. In the attached text file (test.txt), replace all occurrences of 'red' with 'yellow' without using an editor (i.e. from the command line).
14. How would you suppress output written to stderr by a command?
15. What is the meaning of the #! notation in scripts, e.g. #!/bin/sh?
16. What is the output of the attached shell script test.sh?

Scripting questions, all based on the attached file access.log. Use one of Perl, Python, Ruby, or shell scripts to solve these questions. If any answer is obtained using just the command line, please include those commands as well.

17. How many accesses were made between 10am and 11:30am on Jan 24, 2011?
18. How many unique IP addresses accessed this server?
19. For every IP address which accessed this server, output a report showing the number of hits for every type of HTTP status. E.g., IP 192.168.1.20 has 164 hits with status 404 and 1690 hits with status 200.
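For a flavor of the expected answers, several of these are one-liners; a sketch (other correct answers exist):

Code:
find . -mindepth 1 -maxdepth 1 -type d   # 1: first-level subdirectories
tail -n 100 foo.log                      # 2: last 100 lines
tail -f foo.log                          # 3: follow the log as it grows
zgrep foo bar.gz                         # 4: grep inside a gzip'ed file
grep -v bar foo                          # 5: lines NOT matching "bar"
chmod 601 a                              # 11: rw- owner, --- group, --x world
sed -i 's/red/yellow/g' test.txt         # 13: replace without an editor
cmd 2>/dev/null                          # 14: discard a command's stderr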
I have .jpg files in many subdirectories, from which I need to copy all the images into one specific directory. I have used cp -rf *.jpg media/sik/, which only copies the .jpg files of the directory in which I was working.
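That's expected: the shell expands *.jpg against the current directory only, and cp does not search subdirectories for a pattern. A hedged alternative (note that flattening the tree this way will clash on files that share a name):

Code:
# copy every .jpg found anywhere under the current tree into media/sik/
find . -type f -name '*.jpg' -exec cp {} media/sik/ \;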
I am a noob and I am trying to display a count of the number of subdirectories in a directory. I have been able to use find -type d to list directories and subdirs, but I want a numerical count of dirs and subdirs. I know ls -l gives a count, but when I try ls -l -d all it shows is ".". I have also tried a combination with the -R option, but nothing seems to be working for me. Please forgive my ignorance, but I am working on a script for class and this is the first step.
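find was the right instinct; its output just needs to be fed to wc. A sketch:

Code:
# count every directory below the current one (. itself is excluded)
find . -mindepth 1 -type d | wc -l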
In reading the rsync man page and browsing a lot of websites, I ended up a bit confused, or maybe it was just too much eggnog. Anyway, I want to exclude a directory "videos" with everything in it, which is /home/user1/camera/videos, while rsyncing the whole user1 directory to an external drive.
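Exclude patterns are matched relative to the transfer root, so with user1 as the source the pattern should not repeat the /home/user1 prefix. A sketch with an assumed destination path:

Code:
# trailing slash on the source = copy its contents;
# 'camera/videos/' is read relative to /home/user1/
rsync -av --exclude='camera/videos/' /home/user1/ /mnt/external/user1/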
What is the best way to go about setting up a NAS? It will be used for 2-3 weeks' worth of HD video storage and needs to have a redundant power supply and swappable SATA drives. I'm thinking of using either FreeNAS or OpenFiler as the OS.
We are limited to a small amount of available server hardware in South America. An 8TB HP NAS server costs $11,000, quite pricey for us. In Canada I'd just get a couple of Buffalo TeraStations, but that's not an option now.
So I'm thinking of going with an Intel SR2400 ($550), maybe even two of those for more redundancy. Anyway, I was thinking of using 1TB or 1.5TB drives, but according to other web sources that's a bad idea with RAID, as they're more prone to hardware errors.
I'm able to use the following to remove the target directory and, recursively, all of its subdirectories and contents:

Code:
find '/target/directory/' -type d -name '*' -print0 | xargs -0 rm -rf
However, I do not want the target directory to be removed. How can I remove just the files in the target, the subdirectories, and their contents?
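Telling find to skip the starting point itself does exactly this. Two equivalent sketches, one using GNU find's -delete and one keeping the original xargs style (the redundant -name '*' is dropped):

Code:
# delete everything under the target but keep the directory itself
find /target/directory/ -mindepth 1 -delete
# or, closer to the original:
find /target/directory/ -mindepth 1 -print0 | xargs -0 rm -rf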
I've found several posts discussing how to do this with the terminal, but none exactly fit what I am trying to do. And since I'm still very new, I was hoping for some help.
I have a parent directory called "Music". The subdirectories all start with "artist"; some go further, as in "artist/album/cd1". So right now the structure varies in the following ways: code...
How can I move all the files (or the file types that I choose) to the parent directory "Music"?
(By the way, for anyone who is interested, this is so that I can use an external HD with a PS3 ("PlayStation 3", for anyone in my predicament searching the threads).)
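A hedged sketch that flattens the tree, assuming the files of interest are MP3s (swap the pattern for whatever types you choose); the -n flag refuses to overwrite when two albums contain identically named files:

Code:
# move every .mp3 that sits below the top level up into Music/ itself
find Music -mindepth 2 -type f -name '*.mp3' -exec mv -n {} Music/ \;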
I need to write a script that is given a directory as an argument and prints the most recently modified file from that directory and all its subdirectories.
for example:
$ newest /usr/etc --> /usr/etc/httpd/httpd May 28 12:16
If I had to do it only for the current dir, it would be easy... I'd probably use "ls -lt" and then show only the first line.
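Recursion is what makes ls awkward here; GNU find can print a sortable timestamp for every file instead. A sketch that reproduces the example's output format (it relies on GNU find's -printf, so it is Linux-specific):

Code:
#!/bin/sh
# newest: print the most recently modified file under the given directory
# %T@ is the epoch time, used only for sorting and then cut away
find "$1" -type f -printf '%T@\t%p %Tb %Td %TH:%TM\n' \
    | sort -nr | head -n 1 | cut -f2-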
I need to give a user write access to /var/www and its subdirectories. The current directory permissions are as follows: rwxr-xr-x root root
I added the user to the root group, but that didn't seem to help. I read I could chmod -R to change the access to write for the www directory and subdirectories, but I don't want to change things and mess up the website. How can I give the user write access to the www directory and subdirectories without messing anything up? Would changing the www directory's group owner to his group cause an issue anywhere?
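A dedicated group is the usual low-risk route: it adds group write for the one user without touching the owner or the world bits the web server relies on. A sketch, with "bob" and "webedit" as placeholder names:

Code:
groupadd webedit
usermod -aG webedit bob              # user must log out/in to pick it up
# hand the tree's group over and allow group writes;
# the setgid bit makes new subdirectories inherit the group
chgrp -R webedit /var/www
chmod -R g+w /var/www
find /var/www -type d -exec chmod g+s {} \;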
I need to copy all subdirectories and files from one directory to another every 5 minutes or so, with the old data automatically being overwritten by the new data. I'd also like this to run at startup. Is there any way this can be done? If so, what program would I need to schedule the automation, and what is the command line I would need?
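cron covers both requirements, and rsync handles the overwrite-with-new semantics. A sketch with placeholder paths (installed via crontab -e):

Code:
# every 5 minutes: mirror src into dst, removing files gone from src
*/5 * * * * rsync -a --delete /data/src/ /data/dst/
# and once at boot:
@reboot     rsync -a --delete /data/src/ /data/dst/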
I am using Back In Time to back up my home directory to a second HDD that is mounted at /media/backup. The trouble is, I can do this using Back In Time (Root), but not using Back In Time without the root option. This is definitely a permissions issue: it can't write to the folder. But when I checked by right-clicking on the backup directory and looking at the Permissions tab, it said I was the owner.
This script simply deletes files older than a certain age (in this case 7 days) from a certain location; I use it to purge old backups nightly, and it works as expected:
# delete backups older than 7 days
find /mnt/backup/* -mtime +7 -exec rm -Rf {} \;
The problem is, every morning I get an email with an error message something like this:
find: `/mnt/backup/subfolder': No such file or directory
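The message is harmless but explainable: -exec rm -Rf deletes a directory, and find then tries to descend into the directory it has just removed. Restricting find to the top-level entries avoids the problem entirely; a hedged rewrite:

Code:
# only examine the immediate children of /mnt/backup, so find never
# descends into a directory it has just deleted
find /mnt/backup -mindepth 1 -maxdepth 1 -mtime +7 -exec rm -Rf {} \;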
I'm trying to share a file via Samba on an Ubuntu server, where the file is actually stored on a FreeNAS box. The FreeNAS drives are mounted via NFS, and the Samba share contains a symlink to the file on the FreeNAS drive. Browsing the Samba share, I can see the file and its size, but any attempt to read the file fails. It complains about authentication, but all credentials across all machines are the same.
So, is it possible to share a file this way, or is there another way to do this? I know I could create all the profiles on the FreeNAS box, but for convenience and ease of maintenance I was hoping to do this via the Ubuntu server.
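One thing worth ruling out before blaming credentials: by default Samba refuses to follow symlinks that point outside the share, which can produce access errors that look a lot like authentication failures. A sketch of the smb.conf settings involved (loosening them has real security implications, so treat this as a diagnostic, not a recommendation):

Code:
# smb.conf: allow a share's symlinks to lead outside the share
[global]
    unix extensions = no
[share]
    follow symlinks = yes
    wide links = yes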
I think I can eliminate Media Companion as the problem, since all the other Samba servers work with it.
I want to troubleshoot this but don't even know where to start. How do I figure out what makes the OpenWrt Samba server different from the other, 100% working FreeNAS and PC-01 servers?
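A reasonable starting point is to compare the effective configurations and to probe the problem server with the simplest possible client, taking Media Companion out of the loop:

Code:
# dump the effective Samba config on each box, then diff the results
testparm -s > openwrt-smb.conf      # run on the OpenWrt router
testparm -s > freenas-smb.conf      # run on a known-good server
diff openwrt-smb.conf freenas-smb.conf
# query the problem server directly, without credentials
smbclient -N -L //openwrt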
I have a CentOS server that is working fine. I want to clone the complete OS (over the network) so that I can use it for the same functionality on several other machines.
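For a raw, everything-included clone, one common pattern is streaming the disk over SSH; dedicated tools like Clonezilla do the same job with more safety rails. A sketch, assuming the disk is /dev/sda on both ends, the target disk is at least as large, and the target machine is booted from live media so its disk is not in use:

Code:
# on the source machine: compress the disk image and stream it over SSH
dd if=/dev/sda bs=4M | gzip -c | ssh root@target 'gzip -dc | dd of=/dev/sda bs=4M'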
The place I work at has a web/DNS server running openSUSE 10, hosting a few websites (our company one and a few vhosted ones). The box was set up before I got here. Now we want to create a new 11.2 server that is basically a backup/clone of the first server, in case it goes down.
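If a full clone turns out to be overkill, a scheduled rsync pull of the document root and the vhost configuration gets most of the way to a warm standby. A sketch using openSUSE's usual paths, which are assumptions about this particular setup:

Code:
# on the new 11.2 box: mirror site content and vhost configs nightly
rsync -avz --delete server1:/srv/www/ /srv/www/
rsync -avz --delete server1:/etc/apache2/vhosts.d/ /etc/apache2/vhosts.d/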