I've been given an old rsync script to update some data files on a few different servers... I am trying to understand what directory on the server it gets its files from, as the way it seems to refer to the data on the server is:
data_source='random-server.fqdominan.net::Firstlogic'
Not sure what the "::Firstlogic" part with the two :: means.
Here is the entire script:
Code:
#!/bin/sh
# Begin user-configurable options #
# Who can run this script
authorized_user='appdtools'
# From where we rsync the update data
data_source='random-server.fqdominan.net::Firstlogic'
# From where we rsync the rollback data
rollback_data_source='random-server.fqdominan.net::Firstlogic_rollback'
# Where updates will go
data_target='/Firstlogic/postware/dirs'
# How to do the rsync .....
# Here is where we do the "work".
splash                     # Check user and print warning
if [ "${rsync_test}" == "-n" ]; then
    update_address_data    # Test the monthly update
else
    stop_address           # Stop the address standardizer
    update_address_data    # Load the monthly update
    clean_logs             # Empty the old log files
    start_address          # Start the address standardizer
fi
I have not tested this script yet, but evidently it used to work. I just need to find out where to put the files on random-server.fqdomain.net to test it out. The server it points to no longer has the files on it...
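The double colon means rsync is contacting an rsync daemon (usually on port 873) rather than going over a remote shell, and "Firstlogic" is the name of a module defined in that server's rsyncd.conf; the module's "path" line is the directory the files actually come from, so that is where you would need to put them. A minimal sketch of what such a module might look like (the paths here are invented):
Code:
# /etc/rsyncd.conf on random-server.fqdominan.net (hypothetical paths)
[Firstlogic]
    path = /exports/Firstlogic            # directory the module serves
    comment = Firstlogic postal data
    read only = yes

[Firstlogic_rollback]
    path = /exports/Firstlogic_rollback
    read only = yes
You can also list the modules a daemon offers by giving no module name at all, e.g. rsync random-server.fqdominan.net::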
I launched a script using rsync with the option --delete-before, but the destination folder was the wrong one. I noticed it only seconds later, but it was still too late. In less than 5 seconds rsync deleted over 200 GB of data on my external hard drive... It is not in my Trash nor in the HDD's trash, but it shouldn't be possible for the data to have been erased in less than 5 seconds! (I don't know how rsync handles deletions.) I know I am really stupid, but is there a way to get back all my data (mainly HD movies in mkv, mp4 and some avi)?
I have successfully backed up my files using a script to a remote server, with a log file output. However, the log file is appended to each time. I wish to have a different log file each time, with the date and time, and have yet to figure that part out.
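One way is to build the log file name from date at the top of the script; a minimal sketch, where the rsync call, remote host and log directory are placeholders to adapt to your setup:
Code:
#!/bin/bash
# each run writes its output to its own time-stamped log file
logfile="/var/log/backup/rsync_$(date +%Y-%m-%d_%H%M%S).log"
rsync -av /source/dir/ user@remote:/backup/dir/ > "$logfile" 2>&1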
My rsync takes a backup of everything from the different Linux servers to my backup device, which is only 2 TB. Since it takes an almost full backup of the source, it consumes a lot of space on the backup device. So I want to keep only the latest month of backup files on the device; it should remove anything more than one month old.
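Assuming each backup run leaves dated files or archives on the device (rather than one continuously updated mirror), a cron'd find can prune anything older than roughly 30 days; a sketch with a placeholder path, listing first before actually deleting:
Code:
# first see what would go, then delete files not modified in the last 30 days
find /backup -type f -mtime +30 -print
find /backup -type f -mtime +30 -delete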
I am backing up my data using rsync via SSH. I am trying to find a solution that allows me to push my backups from Windows and Ubuntu to an Ubuntu server. I don't like the solutions that require my Windows machines to have shared drives; that's not viable for laptop users who may piggyback on free wireless networks. I would also like a solution that has a GUI for the desktop user, though that's not strictly required. I will need a solution that allows me to restore files by date. I make a lot of changes to files on a daily basis and sometimes need an older copy of a file.
I love using rsync as it is a very simple solution and extremely quick. The only thing I don't know how to do is recover different versions of a file. It would be nice to see a visual representation of all the files and the different dates of each file. I know I might be asking a little too much, but, it is after all, Linux.
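rsync itself can keep per-date versions cheaply with --link-dest: each run goes into a new dated directory, and unchanged files are hard-linked back to the previous snapshot so they take almost no extra space, which also gives you "restore by date" for free. A sketch, assuming snapshots are kept under /backups/laptop (all names here are invented):
Code:
#!/bin/bash
# dated snapshots; unchanged files become hard links into the previous snapshot
today=$(date +%Y-%m-%d)
dest=/backups/laptop
rsync -a --delete --link-dest="$dest/latest" /home/user/ "$dest/$today/"
ln -sfn "$dest/$today" "$dest/latest"   # "latest" always points at the newest snapshot
To get an older copy of a file back, you just copy it out of the snapshot directory for that day.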
Could someone shed some light on what I am doing? I am wondering if I just have things back to front. Server (MESH): Fedora 13. Firewall ports open: tcp 22 (ssh), tcp 873 (rsync). sshd service started.
I'm trying to do an rsync from one RHEL box to another, but when I run it in verbose mode to see why it's not working, all it does is show the root folder, then one folder in, and then it stops. There are about a dozen folders under the root folder where the rsync starts, with about 350 GB of data spread between them. How can I tell why this isn't working? The same command is set up to run as a cron job, which was working.
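One way to narrow it down is a dry run with itemized, extra-verbose output and stderr captured; that usually shows whether rsync is still building the file list, waiting on the remote side, or stopping on a permission error. A sketch, with the paths and host name as placeholders:
Code:
# nothing is transferred; -i itemizes every file, -vv reports skipped/failed items
rsync -avvi --dry-run /data/rootfolder/ otherbox:/data/rootfolder/ 2> rsync.err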
A Windows computer continually generates images (AutoGrabnnn). All downstream analysis is on a Linux computer. I need an analog of rsync that brings the files over as they arise. rdesktop seems to be the approach. Is this already described somewhere, to bypass a lot of trial and error?
I'm looking for the most secure possible solution for transferring data with rsync over the Internet between two Linux servers. I have three options: SSH, IPsec and Kerberos. Which one, in your opinion, is the most secure?
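For two Linux boxes, tunnelling rsync over SSH is usually the simplest of the three to set up and audit: you get encryption and key-based authentication without running a separate daemon, and the other two mainly add infrastructure rather than stronger protection for this use case. A sketch, with host names, key path and directories as placeholders:
Code:
# push /srv/data to the remote box over SSH using a dedicated key
rsync -avz -e "ssh -i /home/user/.ssh/backup_key" /srv/data/ backupuser@remote.example.com:/srv/data/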
I am using read() in C++ to get data from a serial port. However, if no data is available on the serial port, the function blocks until data arrives. Example code:
I have some data files that should be distributed with my program. Using dist_pkgdata_DATA in Makefile.am, I get these files installed to /usr/local/data/share/package-name. The problem is that the data is read-only, and my program needs to modify it. Playing with the dist_sharedstate_DATA, dist_localstate_DATA and dist_data_DATA variables, I got different installation directories, like /usr/local/com and /usr/local/var, but the data is always read-only.
How can I distribute modifiable data files with my package? I need some common directory for all users, or maybe local data in a user directory.
So I am trying to learn a little about Cygwin and rsync. I'd like to rsync some data from a Windows machine to a Linux machine. I've got Cygwin installed, but I can't figure out how to tell Cygwin where rsync will be pulling the data from; basically, how to set the directories that I want rsync'd. I've googled and googled but I can't seem to find the answer to this exact question.
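Under Cygwin the Windows drives show up under /cygdrive, so you point rsync at those paths exactly as you would on Linux; a sketch, assuming you want to push C:\data to the Linux box over SSH (the host, user and directories are placeholders):
Code:
# run from the Cygwin terminal on the Windows machine
rsync -av /cygdrive/c/data/ user@linuxbox.example.com:/home/user/data/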
PHP in FC12 has a lot of issues. Is there any way to roll back to an earlier version, still using yum? I'm running Drupal on an FC11 machine and the site works great. When I move it to an FC12 machine, I get all sorts of errors, which I've traced to problems with PHP 5.3.2. Looking at the PHP site regarding the kinds of issues I'm running into is a scary experience. The developers know it's broken.
I've had so much trouble with PHP and broken releases that I suggest Fedora keep at least 4-5 releases in yum so that users can easily switch versions. Certainly 5.3.2 should not have been released. I've been looking for my FC11 disk, and as soon as I find it, my FC12 machine is going backwards, just so I can run a version of PHP that actually works. This should not be necessary, but.....
I've searched a dozen or more threads; some say there is no such thing as rolling back an update, others say it can be done, but I can't find one that outlines how to do it.
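For what it's worth, yum does have a downgrade command. It only helps if an older php build is still visible in a repository you have enabled (or you point yum at the RPMs from the earlier tree), so this is a sketch rather than a guaranteed fix:
Code:
# see which versions yum can find, then step php and its subpackages back one version
yum --showduplicates list php
yum downgrade php php-cli php-common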
I am on a Fujitsu tablet, and the "low resolution mode" that would run with this used to be really good. Now it looks like someone wrote over it with pastel; I can hardly read it. Is there a way to roll back the upgrade?
I play this game called Minecraft, and I'd like to run the server package as well, but the server is laggy as hell. I did my research, and some people were saying that one of the problems was the Java version. They recommended rolling back specifically to 1.6.0_13, but they didn't say how. So I Googled for a tutorial on how to roll back Java in Ubuntu, and got nothing. So here I am.
I also learned that you can add some extra code into a Java run command to make it run off of a specified version, so I added this to the original command I have to run my server
Code:
-version:1.6.0_13
and got "Unable to locate JRE meeting specification '1.6.0_13'". So now I'm thinking that Ubuntu didn't retain any previous versions of Java. My second question would be: where does Ubuntu keep its versions of Java? Because then I'd just download that version of Java, put it where Ubuntu keeps them, run that command again and hope it works.
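Ubuntu normally keeps its JVMs under /usr/lib/jvm, one directory per version, and update-alternatives decides which one plain "java" points at. A sketch of how you could check what is installed and switch, assuming at least one older JRE is still present:
Code:
# see which JVMs are installed and pick one for the "java" alternative
ls /usr/lib/jvm/
sudo update-alternatives --config java
java -version    # verify which one is now active
If you download a 1.6.0_13 JRE yourself and unpack it under /usr/lib/jvm, you can also point your server start command directly at that JRE's bin/java instead of relying on the -version: switch.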
I was using the 270 series of the NVIDIA driver with my GTS 450 graphics card. After the new 280 version of the driver was published on the NVIDIA site, I tried to install it manually. The installation was successful, but on boot a conflict between versions was found (from kern.log) and only console mode is available. How can I roll back the NVIDIA driver in console mode? The previous xorg.conf has been backed up.
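From the console one possible sequence is to remove the 280 driver with the installer's own uninstall mode, restore your backed-up xorg.conf, and re-run the 270 series .run file you installed originally; a sketch, where the backup location and the exact .run file name are placeholders for whatever you actually have:
Code:
# from a text console, with X not running
sudo nvidia-installer --uninstall
sudo cp /etc/X11/xorg.conf.backup /etc/X11/xorg.conf
sudo sh NVIDIA-Linux-x86_64-270.xx.run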
I have a cron job that runs overnight: an rsync script that backs up my home directory to an external hard disk connected to the computer via USB 2.0. I also output the results to a log and follow it with the tail -f command from a terminal, to monitor it.
Here is the script:
Code:
#!/bin/bash
# backup data on a daily basis via rsync and a cron job
echo
echo backup started `date` >>/home/user/scripts/backup/backuplog
echo
rsync -avh /home/user/ /media/Linux_ext3/
echo
echo backup complete `date` >>/home/user/scripts/backup/backuplog
echo
echo disk used: `du -csh /media/Linux_ext3` >>/home/user/scripts/backup/backuplog
echo
echo disk free: `df -h /media/Linux_ext3` >>/home/user/scripts/backup/backuplog
echo
Two things. First, the rsync is supposed to be doing a complete mirror of my home directory to the USB drive. For example, say I have a .txt file at the root of /home/user; it gets copied over. However, I delete the .txt file the next day, and if I go to /media/Linux_ext3/, the .txt file is still there. I was always under the impression that rsync would mirror the two directories, correct? Second, for my log, I think it is creating an entry for each file. Again, I was under the impression that rsync would only copy over files that have been updated, correct?
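On the first point: by default rsync never removes anything from the destination; deletions only mirror across if you add --delete. On the second, with -v rsync should only list files it actually transferred (new or changed ones), so a long listing on the first run is normal. A sketch of the mirroring form of the same command, using the paths from the script above:
Code:
# true mirror: files removed from /home/user/ are also removed from the USB copy
rsync -avh --delete /home/user/ /media/Linux_ext3/
It's worth running it once with -n (dry run) first to see exactly what it would delete.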
I'm writing a script to automate rsync backups, which so far is working great. I prefer to send the output to a text file instead of watching it fly by, but staring at a blank screen is slightly boring. I would love to use a progress bar; not for each file, but for overall progress. Something like:
syncing /home directory 50% [++++++++]
I already know it's way over my head and maybe bash isn't even the right tool for the job.
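If the rsync on your machine is 3.1 or newer it can print a single whole-transfer progress line itself, and --log-file lets the per-file details go to your text file while the progress stays on the terminal; a sketch, with the paths as placeholders:
Code:
# one overall progress line on screen (needs rsync >= 3.1), per-file details in the log
rsync -a --info=progress2 --log-file=/home/user/rsync.log /home/ /backup/home/
On older rsync versions there is no built-in overall progress, which is why people end up writing their own bar around the output.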
Is there a way to roll back the packages installed in the last software update? I cannot reach the GUI login screen (X server) ever since my last software update, hence I cannot log in. Too bad the old X server restart key (Ctrl+Alt+Backspace) doesn't work anymore. Also, I don't remember the command to view the system logs to find out what went wrong.
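Before rolling anything back it is worth reading the X server's own log from a text console (Ctrl+Alt+F2); assuming an Xorg-based setup, something like this usually shows the error that stops X from starting:
Code:
# lines marked (EE) are errors; read the full log for context
grep EE /var/log/Xorg.0.log
less /var/log/Xorg.0.log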
Using Ubuntu Lucid with the GNOME desktop. I was just playing around trying to find a media player I liked and installed Bangarang via the Software Centre. This took an absolute age, and now I realise why: it has basically installed the entire KDE environment and the associated lib packages as well. I have found that /var/log/dpkg.log shows what has been installed, and of course I can wade through that to make a list of all the packages and uninstall them all via Synaptic. But that will take a long time to do.
Is there any way to somehow automate rolling back any package changes since a certain time? I've checked the man page for dpkg and I can't see any mention of anything like this.
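dpkg has no built-in "undo since a given time", but you can script it from the log you already found: pull out the package names installed after the moment Bangarang went in and feed the list back to apt-get. A sketch, where the timestamp is a placeholder you adjust to just before the install, and the list should be reviewed before anything is removed:
Code:
#!/bin/bash
# packages installed after a given time, according to /var/log/dpkg.log
since="2011-05-20 14:00"
awk -v s="$since" '$3 == "install" && ($1 " " $2) >= s { sub(/:.*/, "", $4); print $4 }' \
    /var/log/dpkg.log | sort -u
# review that list, then pass it to: sudo apt-get remove --purge <those packages>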
I am trying to create a simple bash script to rsync some folders within a directory structure. I am using wildcards in the rsync source path, but my command always fails. I believe it is the way I am using wildcards within my for loop. Here is my command;
Code:
for seq in `cat test.txt` ; do rsync -nvP /folder/folder/folder/folder/folder/**/$seq /folder/folder/folder/ ; done
This always fails, whereas if I do an ls on the same path to test it, it always works.
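The shell expands the wildcards before rsync ever sees them, and in bash ** is only recursive when the globstar option is on (bash 4+); otherwise it behaves like a single *, and when nothing matches, rsync is handed the literal pattern and fails. A sketch of the loop with globstar enabled and the matches checked first (the /folder/... paths are placeholders, as above):
Code:
#!/bin/bash
shopt -s globstar nullglob   # make ** recursive; unmatched globs expand to nothing
while read -r seq; do
    files=(/folder/folder/folder/folder/folder/**/"$seq")
    if [ ${#files[@]} -gt 0 ]; then
        rsync -nvP "${files[@]}" /folder/folder/folder/
    else
        echo "no match for $seq" >&2
    fi
done < test.txt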
I am trying to sync file server data to a backup server machine with the command rsync -avu path/of/data ipaddress-of-backup-server:/path/where/to/save. After running it, it asks for the root password, and done manually it is successful, but I want to make it automatic. For that I also tried a cron job and generated an authentication key, but I have not succeeded in logging in automatically. Does anybody know how to authenticate root to log in for storing data on the backup server?
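The usual way to make this non-interactive is SSH key authentication for the account the cron job runs as; key-based root logins are often disabled, so a dedicated user on the backup server that can write to the target path is the common compromise. A sketch, with the user and host names as placeholders:
Code:
# one-time setup, run as the user whose cron job will do the rsync
ssh-keygen -t rsa                      # accept the default file, empty passphrase
ssh-copy-id backupuser@backup-server
# after this the cron'd rsync should no longer prompt for a password
rsync -avu path/of/data backupuser@backup-server:/path/where/to/save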
Thought I'd post it here because it's more server related than desktop... I have a script that does:
[Code]....
This is used to sync my local development snapshot with the live web server. There has to be a more compact way of doing this? Can I combine some of the rsyncs? Can I make the rsync set or keep the user and group affiliations? Can I exclude .* yet include .htaccess?
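On the last two points: rsync's -a already implies --owner and --group (they only take effect when the receiving user is allowed to chown), and filter rules are applied in order, so an --include for .htaccess placed before the --exclude for the other dot-files does exactly what you describe. A sketch with placeholder paths:
Code:
# the include for .htaccess must come before the exclude for the rest of the dot-files
rsync -av --include='.htaccess' --exclude='.*' /local/site/ user@webserver:/var/www/site/
Folding several source directories into one rsync per destination, with filter rules like these, is usually how the list of separate rsync calls gets shortened.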
When I run rsync --recursive --times --perms --links --delete --exclude-from='Documents/exclude.txt' ./ /media/myusb/
where Documents/exclude.txt is
- /Downloads/
- /Desktop/books/
the files in those directories are still copied onto my USB.
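With the source given as ./, the leading / anchors each pattern to that directory, so those two rules should match ./Downloads/ and ./Desktop/books/ (each rule must be on its own line in the exclude file). A quick way to see whether the exclude list is being honoured, without copying anything, is a dry run using the same paths:
Code:
# -n makes it a dry run; if this prints anything, the excludes are not being applied
rsync -rn -v --exclude-from='Documents/exclude.txt' ./ /media/myusb/ | grep -E '^(Downloads/|Desktop/books/)'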
And...
I used fetchmail to download all my gmail emails. When I run rsync -ar --exclude-from='/home/xtheunknown0/Documents/exclude.txt' ./ /media/myusb/ I get the first image at url.
I need to create a script that will compare the differences between two folders and then copy only the updated and new files to another directory. I know I need to use rsync here, and I can write scripts, so it is not really about how to create a script; it is how to accomplish the transfer of only new or changed files between two folders to a new location. Do I need to link these two folders first and then use the "--compare-dest" switch?
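You don't need to link the folders first; --compare-dest does the comparison for you. Files in the source that are identical to what is already in the comparison directory are skipped, and only new or changed files land in the destination. A sketch with made-up paths:
Code:
# copy into /data/staging only what is new or changed relative to /data/current
rsync -av --compare-dest=/data/current/ /data/incoming/ /data/staging/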
I have a tiny shell script to rsync files between two servers and remove the source files.
This script works fine, when it has been initiated manually or even when the rsync command is executed on the command line.
But the same script doesn't work, when I try to automate it through crontab.
I am using the 'abc' user to execute this rsync instead of root, as root logins are restricted on all of our servers.
As I mentioned earlier, manual execution works like charm!
When this rsync.sh is initiated through crontab, it runs the first command (chown abc.abc ...) perfectly, without any issues. But the second line is not executed at all, and there is no log entry I can find at /mnt/xyz/folder/rsync.log.
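The usual culprit is cron's minimal environment: it does not have the PATH your login shell has, so commands that work by name on the command line need absolute paths (or an explicit PATH line) inside the script, and redirecting stderr into the log usually reveals the real error. A sketch of the crontab entry and the rsync line; the paths, host and options here are placeholders standing in for whatever your script actually uses:
Code:
# crontab -e for user abc
30 2 * * * /bin/sh /mnt/xyz/folder/rsync.sh >> /mnt/xyz/folder/rsync.log 2>&1

# inside rsync.sh, call the commands by absolute path and capture errors
/usr/bin/rsync -av --remove-source-files /mnt/xyz/folder/src/ abc@otherserver:/mnt/xyz/folder/dst/ >> /mnt/xyz/folder/rsync.log 2>&1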