Server :: Backup Of Svn Repository Using Svnadmin Dump Dumpfile Location
Apr 14, 2011
One of my clients needs a backup of his SVN repository. I see that this is possible using the svnadmin dump command. I can see where the source repository is located, but I don't see anything in the documentation about where the actual dump file ends up. I need to know where the dump file is so I can scp or rsync it to another server for backup.
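svnadmin dump writes the dump to standard output rather than to a fixed location, so the file ends up wherever you redirect it. A minimal sketch (repository and destination paths are assumptions):
Code:
# svnadmin dump streams the repository to stdout; redirect it to a file of your choosing
svnadmin dump /var/svn/repo > /var/backups/repo-$(date +%F).dump
# then copy it off-host
scp /var/backups/repo-$(date +%F).dump backup@otherserver:/srv/backups/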
View 5 Replies
Feb 28, 2010
I'm working on setting up a new subversion server and getting an error I'm not having much luck resolving.
I'm attempting to create the repository in /home/svn/foo with the command svnadmin create /home/svn/foo while in /home/svn.
I am greeted with the error: svnadmin: SQLite compiled for 3.6.20, but running with 3.6.17
I installed subversion via yum install subversion
I've installed SQLite 3.6.22 from source and removed/reinstalled subversion, with no change.
What gives? I'd rather not install subversion from source; that seems a bit excessive.
Environment stuff:
[root@COS svn]# yum repolist
Loaded plugins: presto, refresh-packagekit
repo id repo name status
fedora Fedora 12 - i386 enabled: 15,366
updates Fedora 12 - i386 - Updates enabled: 4,731
repolist: 20,097
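The mismatch usually means the svn binaries are loading a different libsqlite3 at runtime than the one they were built against. A quick diagnostic sketch (not a fix) to see which library is actually being picked up:
Code:
# which SQLite shared library does svnadmin load at runtime?
ldd $(which svnadmin) | grep -i sqlite
# what does the command-line sqlite report?
sqlite3 --version
# a source-built SQLite typically lands in /usr/local/lib, which may shadow (or be shadowed by) /usr/lib
ls -l /usr/lib/libsqlite3* /usr/local/lib/libsqlite3* 2>/dev/null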
View 1 Replies
View Related
May 10, 2011
I installed and tested Restore EE Backup Server on a test PC with a basic configuration and it's working fine.
[URL]
The issue I have is: where are these backup snapshots or files being saved? I want to add separate storage to hold the backups.
View 1 Replies
View Related
Sep 6, 2010
I have used the dump command to back up the application files. For a full backup, level 0 works fine. For an incremental backup with level 1 or 2, I get the error:
DUMP: Only level 0 dumps are allowed on a subdirectory
DUMP: The ENTIRE dump is aborted.
The code I used
===============================
#!/bin/bash
#Full Day Backup Script
#application folders backup
#test is the username
now=$(date +"%d-%m-%Y")
[Code]...
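dump itself only supports level 0 on a subdirectory (incrementals require dumping a whole filesystem), so for directory-level incrementals one common workaround is GNU tar's listed-incremental mode. A minimal sketch, assuming GNU tar and example paths:
Code:
#!/bin/bash
# the first run with a fresh snapshot file archives everything (level 0 equivalent);
# subsequent runs archive only what changed since the snapshot was last updated
now=$(date +"%d-%m-%Y")
SNAP=/backup/app.snar
tar --create --gzip --listed-incremental="$SNAP" \
    --file=/backup/app-$now.tar.gz /home/test/application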
View 2 Replies
View Related
Nov 23, 2009
I have my web server with cPanel, and I have an FTP location. I want to back up directly to the FTP location, because I don't have enough space on the server to back up locally first and then rsync the data over. I have this script; I need a little help getting it to back up directly to the FTP location, or is there a simpler way?
Code:
#!/bin/sh
# System + MySQL backup script
# Full backup day - Sun (rest of the day do incremental backup)
# Copyright (c) 2005-2006 nixCraft <url>
# This script is licensed under GNU GPL version 2.0 or above
code....
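One way to avoid staging the archive locally is to stream it straight into an FTP upload. A minimal sketch, assuming curl is available; the paths, host, and credentials are placeholders:
Code:
# tar writes the archive to stdout; curl -T - uploads stdin to the FTP server
tar -czf - /home /var/lib/mysql-dumps \
  | curl -T - --user backupuser:password ftp://ftp.example.com/backups/full-$(date +%F).tar.gz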
View 15 Replies
View Related
Jan 15, 2010
After I spent some time discovering The BIG BANG of the Universe and The Meaning of Life:
I managed somehow to create a script that backs up some files on the server, TARs them, FTPs the archive to another FTP server, and then emails the result.
It also measures the time needed to complete, deletes archives older than XX days (set in find -mtime +20), and makes an incremental backup every weekday and a FULL backup on Sundays (which suits me because there is no heavy load then).
The files for tar to include and exclude are listed in text files, one name per line:
file: including.txt:
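The listing itself was cut off above; independent of its exact contents, this is how GNU tar can consume such include/exclude lists (the exclude file name is an assumption):
Code:
# --files-from reads the paths to archive, --exclude-from reads patterns to skip
tar -czf /backup/files-$(date +%F).tar.gz \
    --files-from=including.txt --exclude-from=excluding.txt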
View 7 Replies
View Related
Mar 27, 2011
I would like dump to back up just my home directory, but I am having problems: the command I am using wants to back up everything and takes hours upon hours; it has been running for about 10 hours and is only 21% done. This is the command: dump -0u -f dp_hd /media/CENTON USB/ /. How can I get this to back up only my home directory?
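If /home is its own filesystem, dump can be pointed at it directly; if it is just a directory on the root filesystem, dump still accepts it, but only at level 0. A minimal sketch (the output file name on the USB drive is an assumption, and the space in the mount point must be quoted):
Code:
# dump only the home directory instead of the whole root filesystem
dump -0f "/media/CENTON USB/home.dump" /home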
View 7 Replies
View Related
May 19, 2009
Using tar, is it possible to back up different types of file systems, e.g. ext3, UFS, or any other file system? I know it is not possible with dump because it reads through the raw device. But what about tar? Where can I get more information about this? In other words, suppose I want to back up files from different file systems using tar; is that possible?
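tar works at the file level through the normal VFS interface, so the underlying filesystem type does not matter as long as the paths are mounted. A tiny sketch with assumed mount points:
Code:
# archives files from two different filesystems in one pass
tar -czf /backup/mixed.tar.gz /mnt/ext3-data /mnt/ufs-data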
View 1 Replies
View Related
May 22, 2010
Does the dump command back up entire file-systems or is it capable of backing up subsets of a file-system? And is tar capable of taking device names (for file systems) as input to be archived?
View 1 Replies
View Related
Apr 7, 2010
I have a repository which I set up using reprepro. I have some packages in it, but it seems the Ubuntu repositories carry more recent versions of those packages. What I want is that when someone does an apt-get install package-name, apt fetches the package from my repository even if other Ubuntu repositories have newer versions. I would like to achieve this with zero configuration on the client. Ideally I want to just point sources.list at my repository for all packages, and if a package does not exist in my repository, apt then falls back to the Ubuntu repos.
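Short of modifying the repository itself, the usual approach is apt pinning on the client (a one-time preferences file rather than literally zero configuration): list both repositories in sources.list and give the local one a priority above 1000 so it wins even with an older version, while anything it lacks still comes from the Ubuntu repos. A sketch, with the repository host name and release as assumptions:
Code:
# /etc/apt/sources.list
deb http://repo.example.com/ubuntu lucid main
deb http://archive.ubuntu.com/ubuntu lucid main universe

# /etc/apt/preferences
Package: *
Pin: origin "repo.example.com"
Pin-Priority: 1001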
View 1 Replies
View Related
Mar 20, 2011
I have a problem with a script I wrote: it runs fine when executed manually, but it doesn't run *fully* when executed via cron.
here's the script :
Code:
#!/bin/bash
FILENAME=mysql_full_dump_`date '+%m.%d.%y'`.sql
`which mysqldump` --all-databases -uroot -p************ -h127.0.0.1 > /root/$FILENAME
RETVAL=$?
[code]....
the script resides in /root/bin and the cron entry is as follows:
Code:
0 0 * * * root "/root/bin/mysql_daily.sh"
the result is the .sql file, but it doesn't archive it.
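Two things commonly bite here. If that six-field entry (with the user name root) lives in a personal crontab (crontab -e) rather than /etc/crontab or /etc/cron.d/, the extra root field would be treated as part of the command; since the dump is produced, the more likely culprit is cron's minimal PATH affecting whatever archive command is in the elided part of the script. A sketch of the dump-plus-archive portion with those points applied (the gzip step and the mysqldump path are assumptions about the missing code):
Code:
#!/bin/bash
# use an explicit PATH so the script behaves the same under cron and an interactive shell
PATH=/bin:/usr/bin
FILENAME=mysql_full_dump_$(date '+%m.%d.%y').sql
/usr/bin/mysqldump --all-databases -uroot -pXXXXXXXX -h127.0.0.1 > "/root/$FILENAME"
/bin/gzip -f "/root/$FILENAME"     # leaves /root/$FILENAME.gz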
View 2 Replies
View Related
Jan 20, 2011
I am using cron for nightly backups to a USB device. I was just wondering, in my backup script, how do I find the location (mount point) of my USB device?
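The most robust approach in a script is not to search for the device at all, but to mount it by label (or UUID) so the path stays stable across reboots and port changes. A sketch with an assumed label and mount point:
Code:
# list block devices with their LABEL/UUID (run once to find yours)
blkid
# in the backup script: mount by label at a fixed path, back up, then unmount
mkdir -p /mnt/usb-backup
mount LABEL=BACKUPDISK /mnt/usb-backup
rsync -a /home/ /mnt/usb-backup/home/
umount /mnt/usb-backup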
View 2 Replies
View Related
May 27, 2009
I have /dev/sda12 mounted on /home. I want to mount /dev/sda12 onto /backup instead. I tried to do this by editing the fstab file, i.e. I replaced /home with /backup. This change caused boot-up problems and I had to change my fstab file back to get going again.
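Two common reasons that edit breaks the boot: the /backup mount point does not exist yet, and users' home directories still expect /home to be populated. A sketch of the relevant pieces (filesystem type assumed to be ext3):
Code:
# create the new mount point first
mkdir /backup
# /etc/fstab entry for the new location
/dev/sda12   /backup   ext3   defaults   0   2
# test it without rebooting; any error shows up immediately
mount -a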
View 2 Replies
View Related
Oct 7, 2009
I want to generate core dump files from my program when it crashes. It is a pretty big process with about 10-11 threads. I have followed the documentation to enable core dumps by setting ulimit to unlimited, etc. I quickly tried "A demo program creating a core dump" from the following web page, which succeeds in segfaulting and dumping a core file in the directory that I configured. However, when I ran my original program and caused it to crash (by calling kill() or raise(), or with the same null-pointer access shown on the web page above), the program crashed but did not generate a core dump file in any of those cases. Am I missing something? My program is in C++ and my environment is Red Hat 9.0 (kernel 2.4.20).
Going through the "Why do I NOT get a core dump?" section on the same web page, I can see two potential problems. One: there are issues with suid/sgid (bullet #6); I am not able to change any suid settings because my system has neither /proc/sys/fs/suid_dumpable nor /proc/sys/kernel/suid_dumpable. Two: my program has threads in it, and bullet #8 is the problem.
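A couple of quick sanity checks before digging further; on a 2.4 kernel with LinuxThreads, threaded programs are a known trouble spot for core dumps, so this is not a guaranteed fix. The sketch assumes a bash shell that launches the program directly:
Code:
# the limit must be raised in the same shell (or startup script) that starts the program
ulimit -c unlimited
ulimit -c                                   # verify it really reports 'unlimited'
# appending the PID to the core file name helps when several processes/threads dump
echo 1 > /proc/sys/kernel/core_uses_pid
./my_program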
View 1 Replies
View Related
Jul 15, 2010
These days I am trying to make a simple sniffer for an embedded system, and it needs to be able to dump all the packets into a file that can be read by Wireshark etc. First I copied some code called simplesniffer.c from the Internet, and now I want to add the dump-file function to it; I am running into some problems.
/* Come from ---- http://blog.chinaunix.net/u/24474/showart_226419.html */
/* simplesniffer.c */
#include <stdio.h>
[code]...
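If the sniffer is built on libpcap (as most of the simple examples are), the usual in-program route is pcap_dump_open() and pcap_dump(), which write a standard pcap file. Purely as a format sanity check from the shell, and assuming tcpdump exists on the target, the file Wireshark expects is the same one tcpdump produces:
Code:
# capture full packets on eth0 into a pcap file that Wireshark can open directly
tcpdump -i eth0 -s 0 -w /tmp/capture.pcap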
View 2 Replies
View Related
Jul 18, 2010
I am trying to install apache on centos-5-i386 (# yum install httpd) and get the following error:
==========================
[root@IDK3 /]# yum install httpd
Loaded plugins: fastestmirror
Loading mirror speeds from cached hostfile
* base: centos.mirrors.tds.net
[Code].....
Edit: the URL works if I leave out "repomd.xml" at the end, yet repomd.xml is right there in that folder. I have downloaded the file manually, but I don't know how to use it.
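When repomd.xml fetches fail even though the file exists on the mirror, the cached metadata (or a cached broken mirror) is often to blame; forcing yum to rebuild its metadata cache is the usual first step. A sketch, run as root:
Code:
yum clean metadata
yum clean all
yum makecache
yum install httpd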
View 1 Replies
View Related
May 11, 2011
I was using Ubuntu before and had no problem backing up my updates or my repository using APTonCD. I have now switched to Fedora to give it a try; so far it's good. My question is: how can I back up my repo, and is there any software similar to APTonCD?
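One approach that needs no extra desktop tool: yum can be told to keep every downloaded rpm, and createrepo can then turn that cache into a local repository you can copy or burn. A sketch with assumed paths:
Code:
# in /etc/yum.conf, keep downloaded packages instead of deleting them after install:
#   keepcache=1
# later, collect the cached rpms and build repository metadata
mkdir -p /srv/localrepo
find /var/cache/yum -name '*.rpm' -exec cp {} /srv/localrepo/ \;
createrepo /srv/localrepo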
View 3 Replies
View Related
Aug 6, 2011
I need to make a scheduled backup of a Subversion repository on Ubuntu, e.g. back up the repository at 13:00 every Monday. Do I need to write hook scripts to do that? I also have to be able to recover the repository from the backup. If possible, I want to back up only the trunk of the repository (a sketch follows the layout below).
my repository is project1
/project1
/trunk
/tags
/branches
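Hook scripts are not required for this: a plain cron job running svnadmin dump is enough, svndumpfilter can reduce the dump to just trunk, and svnadmin load restores it. A sketch with assumed paths:
Code:
# crontab entry (crontab -e): 13:00 every Monday
# 0 13 * * 1  /usr/local/bin/svn-backup.sh

#!/bin/bash
# /usr/local/bin/svn-backup.sh
REPO=/var/svn/project1
OUT=/var/backups/project1-$(date +%F).dump
svnadmin dump "$REPO" > "$OUT"
# optional: keep only the trunk history
svndumpfilter include trunk < "$OUT" > "${OUT%.dump}-trunk.dump"
# recovery later:
#   svnadmin create /var/svn/project1-restored
#   svnadmin load /var/svn/project1-restored < /var/backups/project1-YYYY-MM-DD.dump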
View 6 Replies
View Related
Oct 8, 2010
I'm set up on a virtual dedicated server running Ubuntu, and I'm trying to set up Subversion. Now, I can set it up in, say, /svn/myrepository, but I want to set it up on one of my websites, e.g.
/var/www/vhosts/example.com/httpdocs
but when I run svnadmin create on httpdocs I get the following error:
svnadmin: Repository creation failed
svnadmin: Could not create top-level directory
svnadmin: 'httpdocs' exists and is non-empty
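svnadmin create refuses to build a repository inside a non-empty directory, and an SVN repository is not meant to double as the web root anyway. The usual pattern is to create the repository next to the site and import the existing files into it. A sketch:
Code:
# create the repository in a fresh directory
svnadmin create /var/www/vhosts/example.com/svn
# put the current site contents under version control
svn import /var/www/vhosts/example.com/httpdocs \
    file:///var/www/vhosts/example.com/svn/trunk -m "initial import of httpdocs"
# a working copy can then be checked out over (or in place of) httpdocs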
View 1 Replies
View Related
Sep 7, 2010
I am still new to Linux (Red Hat). I used the dump command to back up the root of the Linux server: #dump -0u -f /dev/st0 / The command completed; how do I restore this dump?
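restore is the companion tool: it extracts the dump from the tape into whatever directory you run it from. A sketch, assuming you restore into a scratch directory first rather than directly over the live root:
Code:
mkdir /mnt/restore && cd /mnt/restore
restore -rf /dev/st0        # -r rebuilds the whole dump into the current directory
# or browse the dump and pull out selected files interactively:
# restore -if /dev/st0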
View 2 Replies
View Related
Aug 7, 2010
Mencoder does not have support for my webcam.
However,
Quote:
works flawlessly.
So the dumpfile option in the man page is there to help us!
But the command
Quote:
results in a 0-kilobyte file.
How do I force mplayer to dump the -tv stream data to a file?
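The quoted commands did not survive above, but the mplayer options intended for this are -dumpstream together with -dumpfile. A sketch of a typical invocation (driver, device, and output name are assumptions):
Code:
mplayer -tv driver=v4l2:device=/dev/video0 tv:// -dumpstream -dumpfile webcam.dump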
View 5 Replies
View Related
Jun 5, 2009
I need to get more detailed information about a kernel panic on my Linux box. Does anybody know a way to capture a kernel dump on Linux?
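On reasonably recent kernels the standard mechanism is kdump: a small capture kernel is reserved at boot and writes a memory image after a panic. A sketch, assuming a RHEL/CentOS/Fedora-style distribution that ships the kdump service:
Code:
# 1. reserve memory for the capture kernel: append to the kernel line in grub.conf
#      crashkernel=128M
# 2. enable the service and reboot once
chkconfig kdump on
service kdump start
# after the next panic, the memory image appears under /var/crash/<timestamp>/vmcore,
# which can be examined with the crash utility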
View 1 Replies
View Related
Jan 11, 2011
My application team is asking me to generate the kernel-dump.
Here are details about my server.
OS: RHEL 4.7/32 Bit
Kernel Version: 2.6.9-89.0.23.ELsmp
Processor: Intel(R) Xeon(R) CPU E5520@ 2.27GHz
Hardware: HP Proliant 380G6 series server.
I am using Electric Cloud applications. Sometimes they cause a kernel panic and the server immediately reboots. The kernel-debuginfo rpm is not installed. In some thread I read that the kernel-debuginfo rpm's version should match the kernel version; in my case I couldn't even find the exact matching kernel-debuginfo version.
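The debuginfo package has to match the running kernel's version-release string exactly, which is easiest to read off uname. A small sketch:
Code:
uname -r                      # e.g. 2.6.9-89.0.23.ELsmp
rpm -qa 'kernel*debuginfo*'   # what, if anything, is already installed
# the kernel-debuginfo package you install (from the RHEL debuginfo channel)
# must carry exactly that version-release, here 2.6.9-89.0.23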
View 4 Replies
View Related
Apr 14, 2010
I'm quite new to Linux, but I've managed to grasp some basics. My intention here is to create a virtual directory, so I resorted to creating an image file that I can mount and give the folder its own dedicated storage. I will mount this image as a loop device. It's not much of a problem, but I would like to know whether this is suitable. Say I want to create a 25GB image.
Code:
dd if=/dev/zero of=/home/disk-img/25GB.ext3 bs=1G count=25
Is this recommended? I'm using a block size of 1G, which is really huge, so I was wondering whether it is actually advisable. From what I read, some say it's only advisable to use 4096k or lower, but those suggestions are very dated (year 2003) and it is now 2010, so I would like to know whether it makes any big difference.
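bs=1G makes dd buffer a full gigabyte in memory per block, which gains nothing; a moderate block size is just as fast and far lighter, and if the space does not need to be pre-allocated, a sparse file is created in seconds. A sketch of both, plus the steps to format and mount the image (mount point is an assumption):
Code:
# same 25 GB image with a saner block size
dd if=/dev/zero of=/home/disk-img/25GB.ext3 bs=1M count=25600
# or create it as a sparse file (blocks are allocated only as they are written)
dd if=/dev/zero of=/home/disk-img/25GB.ext3 bs=1M count=0 seek=25600
# format and mount it as a loop device
mkfs.ext3 -F /home/disk-img/25GB.ext3
mount -o loop /home/disk-img/25GB.ext3 /mnt/virtual-dir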
View 13 Replies
View Related
Dec 29, 2010
I am using RHEL 4.7 (32-bit) on an HP ProLiant 380 G6 series server. We are running Electric Cloud agents on these servers. Nowadays we are facing some memory issues that cause a kernel panic, after which the server restarts. When I reported the issue to my application team, they asked me to come back with the core dump. I googled it enough and then set the ulimit value to unlimited (previously it was 0) by adding the following entry in the /etc/profile file:
ulimit -c unlimited
But still, whenever my server restarts due to that kernel panic, it does not generate a core dump. My application is installed in /opt.
The attached document has the kernel panic logs
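Two things are worth separating here. ulimit -c in /etc/profile only affects login shells, so a daemon started from an init script (as the Electric Cloud agent is) never sees it; and even with the limit raised, a process core dump is only written when a process crashes, so it cannot capture a kernel panic. A panic needs a kernel crash dump mechanism (netdump/diskdump on RHEL 4). A sketch of the process-core side, with the init script path and core directory as assumptions:
Code:
# in the agent's init script (e.g. /etc/init.d/ecagent), before the daemon is launched:
ulimit -c unlimited
# optional: send core files somewhere with space, tagged with program name and PID
mkdir -p /opt/cores
echo "/opt/cores/core.%e.%p" > /proc/sys/kernel/core_pattern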
View 3 Replies
View Related
Jul 7, 2010
Using CentOS 5.5. I have a handful of users that I need to connect to my server via sftp and start in the same directory. For example, user1, user2, user3, etc. will connect via sftp and upon connection will all be in the /some/dir/path/ftp-root directory. I know one way is to create these users all with the same home directory, since by default a user starts in their home directory when connecting via sftp, but before just doing that, I wanted to find out whether that is really the appropriate method to use. Alternatives? Is there some setting on the sftp server end that could direct all users to one starting directory, so that these users don't have to share the same home dir? I'm using the sshd daemon that comes with CentOS 5.5 (with all current updates/patches).
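If the sshd is new enough, the cleaner server-side answer is a Match block with ChrootDirectory and the internal sftp server; note that ChrootDirectory and internal-sftp require OpenSSH 4.8/4.9 or later, while stock CentOS 5.5 ships an older OpenSSH, so this may need an updated openssh package. A sketch for /etc/ssh/sshd_config (the group name is an assumption):
Code:
Subsystem sftp internal-sftp

Match Group sftponly
    ChrootDirectory /some/dir/path/ftp-root
    ForceCommand internal-sftp
    AllowTcpForwarding no
# /some/dir/path/ftp-root must be owned by root and not group/world-writable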
View 4 Replies
View Related
Mar 15, 2010
Having done a short DoD wipe of the hard drive (Darik's Boot and Nuke), I installed Windows XP on the first half of the drive and again zeroed the other half of the drive for installing Debian. Please see the attached screenshot for the command-line input and output. After grepping for non-zero characters on the second half of the drive (sda2), I was puzzled to find that the search actually turned some up. I have no idea why they are there or what, if anything, they mean.
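Note that merely creating a partition table or a filesystem on that half writes non-zero metadata. A quick way to see where (and how much) non-zero data remains, independent of grep's text-oriented matching: od collapses runs of identical bytes into a single '*' line, so a fully zeroed partition prints only a couple of lines, and anything else shows up with its byte offset. A sketch (partition name taken from the post):
Code:
od -Ad -tx1 /dev/sda2 | head -n 40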
View 4 Replies
View Related
Oct 18, 2010
Is there any command available for reading server crash dump files?
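For Linux kernel crash dumps (vmcore files produced by kdump or diskdump), the standard tool is crash, which needs the debug vmlinux matching the crashed kernel. A sketch, with the dump directory name as a hypothetical example:
Code:
# requires the kernel-debuginfo package for the kernel that crashed
crash /usr/lib/debug/lib/modules/$(uname -r)/vmlinux \
      /var/crash/127.0.0.1-2010-10-18-12:00/vmcore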
View 4 Replies
View Related
May 10, 2010
Does anyone know of any decent enterprise-level backup solutions for Linux? I need to back up a few servers and a bunch of desktops onto one backup server. Using rsync/tar.gz won't cut it. I need something like bi-monthly full HDD backups, with a nice GUI to add and remove systems from the backup list. Basically something similar to CommVault or Veritas. I've used Veritas before, but it has its issues, such as leaving 30GB cache files. CommVault, I have no idea how much it costs, or whether it supports backing up to hard drives rather than tape.
View 7 Replies
View Related
Jan 12, 2010
I need confirmation whether the following scenario works for making my client and server identical.
My local(source) Linux server @192.168.0.2
My remote Linux client @192.168.0.70
On the local system :
#df -m
Filesystem Mounted on
/dev/hda3 /
/dev/hda1 /boot
tmpfs /dev/shm
On the local system, issue the following to make the client and server identical:
#dump -0uvf - /dev/hda3 | ssh root@192.168.0.70 -c "restore -rf - /"
#dump -0uvf - /dev/hda1 | ssh root@192.168.0.70 -c "restore -rf - /boot"
#dump -0uvf - /dev/shm | ssh root@192.168.0.70 -c "restore -rf - /tmpfs"
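As written these commands have a few problems: ssh's -c option selects a cipher, not a command (the remote command is simply given as an argument); restore -rf - always extracts into the remote current directory, so you cd there first rather than passing a target path; and /dev/shm is a tmpfs that is recreated at every boot, so there is nothing meaningful to dump there. A sketch of the corrected form for the two real filesystems:
Code:
dump -0uf - /dev/hda3 | ssh root@192.168.0.70 "cd / && restore -rf -"
dump -0uf - /dev/hda1 | ssh root@192.168.0.70 "cd /boot && restore -rf -"
# restoring over a live, mounted root filesystem is risky; doing this from rescue media
# on the client, or restoring into a freshly formatted target, is the safer route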
View 1 Replies
View Related