General :: Way To Take Remote Backup For All Incoming Mails?
Apr 5, 2010
I need to keep two copies of all incoming mail on different servers (local and remote). I am using a plain server without any control panel, running sendmail and Dovecot for mail service. Is there a way (some configuration) to take a remote backup of all incoming mail, other than configuring a forwarder on each mail ID?
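One approach sometimes used with sendmail, assuming procmail is the local delivery agent, is a system-wide /etc/procmailrc that keeps normal delivery and forwards a carbon copy of every message to a mailbox on the remote server; the backup address below is only a placeholder:

    # /etc/procmailrc - runs for every local delivery
    # ":0 c" keeps the normal delivery and makes a carbon copy;
    # "!" forwards that copy to the remote backup mailbox (placeholder address)
    :0 c
    ! mailbackup@remote.example.com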
As a beginner I installed Fedora 11 yesterday. Everything went well until I installed Evolution and Thunderbird: incoming mail works fine, but outgoing mail does not.
I have a mail server that uses Postfix. It runs fine, but I need to add a feature for specific users only: an auto-reply message for some users, not all.
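One way this is often handled, assuming the classic vacation utility is installed and Postfix's local delivery agent honours ~/.forward files, is a per-user .forward entry, so only the chosen users get the auto-reply (user name and path are illustrative):

    # ~/.forward for each user who should auto-reply
    # "\jdoe" keeps a copy in the normal mailbox; the pipe runs vacation
    \jdoe, "|/usr/bin/vacation jdoe"

The user (or the admin) then puts the reply text in ~/.vacation.msg and runs "vacation -i" once to initialise the reply database.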
I am using qmail with Webmin. Everything runs smoothly, but I have users spamming other staff accounts. The question: how do I block a user from sending mail while still letting them receive it, i.e. deny access to sending only? I'd like guidance for doing it from the terminal as well as from Webmin. Why Webmin? Because I tried it there once and it worked, but unfortunately it blocked both incoming and outgoing mail.
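One trick that is sometimes suggested for qmail, assuming the user submits mail through qmail-smtpd on this server, is to list their address in the badmailfrom control file: messages carrying that envelope sender are rejected at SMTP time, while mail addressed to them is still accepted. It is a blunt instrument (mail from that address is also refused when it arrives from outside), so treat it as a sketch rather than a polished fix:

    # append the offending sender to qmail's badmailfrom control file
    echo "spammer@example.com" >> /var/qmail/control/badmailfrom
    # qmail-smtpd reads this file for each connection, so no restart is needed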
I have 2 servers. One is an operational CentOS + cPanel server and the other is a blank CentOS server. I want to use cPanel's backup suite to back up our customers' accounts, but I want the blank server mounted on the cPanel server rather than doing an FTP backup, if that makes sense. I believe from previous experience that NFS is the way to go, but I'm not sure.
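NFS does work for this. A minimal sketch, assuming the blank server (10.0.0.20 below, a placeholder) exports a /backup directory to the cPanel server (10.0.0.10, also a placeholder):

    # on the blank backup server: /etc/exports
    /backup   10.0.0.10(rw,sync,no_root_squash)
    # then re-export and make sure the NFS server is running:
    exportfs -ra

    # on the cPanel server:
    mkdir -p /backup
    mount -t nfs 10.0.0.20:/backup /backup

cPanel's backup configuration should then be able to treat /backup as an ordinary local directory.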
Can I just copy/back up the Postfix mail queues in /var/spool/postfix and put that folder back in place after I am done migrating all users and mail to a new mail server?
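The queue layout is tied to file ownership, inode-based file names and the Postfix version, so a bare copy/paste is risky. A rough outline of the approach usually suggested, assuming the same Postfix version on both machines (host name is a placeholder):

    # 1. stop Postfix on both servers
    postfix stop
    # 2. copy the queue, preserving ownership and permissions
    rsync -a /var/spool/postfix/ newserver:/var/spool/postfix/
    # 3. on the new server, let Postfix re-queue everything
    postsuper -r ALL
    postfix start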
Is it possible with dd to have the output file stored at some remote location? I do not have free space locally for an image of the LVM partition I want to back up via dd.
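Yes: if of= is omitted, dd writes to standard output, so the image can be piped straight over SSH and nothing is stored locally. Device, host and path below are placeholders:

    # stream a compressed image of the LV to the remote host
    dd if=/dev/vg0/mylv bs=4M | gzip -c | \
        ssh user@backuphost "cat > /backup/mylv.img.gz"

For a consistent image it is safest to run this against an LVM snapshot or an unmounted volume.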
This is not a regular backup. I only want to backup selective directories so personal files (photographs, documents, sourcecode) will be kept safe in case of a total system meltdown. This'll be 15GB max. Basically the digital variant of a fire resistant safe. I looked into duplicity but that requires me to install gpg keys on the target machine, which I can not do. I rather have a solution that just relies on just a working shell account and diskspace on the target server.
I thought of writing a simple script to do the following:
1. Mount the remote server with sshfs
2. Mount an encrypted container on the remote server (LUKS, TrueCrypt?)
3. Loop over predefined directories on the local machine and copy them into the encrypted container (rdiff-backup?)
Based on these requirements:
- Target server is "dumb": only ssh access + diskspace (i.e. no installing of gpg keys)
- Encrypted container should grow/shrink to fit contents
- Encrypted container should be easily decryptable on any OS if you have the password
- Once data leaves the client server it should be encrypted: the sysadmin on the target server should never be able to see unencrypted data.
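A rough sketch of steps 1-3 under those constraints (paths, host and container size are placeholders; recent cryptsetup versions attach a loop device for a container file automatically, older ones need losetup first). Because the container is opened on the client over sshfs, all encryption happens client-side, which covers the last requirement; note, however, that a plain LUKS container is fixed-size, so the grow/shrink requirement is only partly met, and loop devices over sshfs can be slow:

    # 1. mount the remote account
    sshfs user@target:/home/user /mnt/remote

    # one-time setup: create and format the container (20G is a placeholder)
    truncate -s 20G /mnt/remote/backup.img
    cryptsetup luksFormat /mnt/remote/backup.img

    # 2. open the container and mount it locally
    cryptsetup luksOpen /mnt/remote/backup.img safebox
    mkfs.ext4 /dev/mapper/safebox        # first time only
    mkdir -p /mnt/safebox
    mount /dev/mapper/safebox /mnt/safebox

    # 3. copy the selected directories into it
    rdiff-backup /home/me/photos /mnt/safebox/photos
    rdiff-backup /home/me/documents /mnt/safebox/documents

    # tear down
    umount /mnt/safebox
    cryptsetup luksClose safebox
    fusermount -u /mnt/remote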
What's the current best practice for automounting a remote drive for automated backup? I want to use Back In Time and maintain snapshots, but it can't do that remotely, so I have to mount a folder outside of Back In Time. I have used sshfs from this howto in the past and it mostly works, but it often seems to lose the connection and not reconnect. [URL]. Is there a more "modern" way? NFS is horribly unreliable and dog slow for me, so it's OUT unless it's changed in the last year.
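sshfs can be made considerably more robust with its reconnect and keep-alive options, which may be enough to avoid switching tools (host name and mount points are placeholders):

    sshfs -o reconnect,ServerAliveInterval=15,ServerAliveCountMax=3 \
        user@server:/backups /mnt/remote-backups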
I have a machine on my network and that machine is a mass storage server that I will eventually use as a media server (to stream movies, videoclips and music on my home theater system). I use slackware 13 on ALL of my machines.
I am trying to automate the backup of the "/home" folder of my laptop onto the mass storage server. I currently use rsnapshot and it works great, but I would like to automate the whole process, even if I am not home or in front of my machines...
Here's what I imagined (in pseudo code):
1) Poll whether the server is active (up)
1.1) If not:
1.1.1) Wake up the server (WOL)
1.1.2) Wait for the server to boot
1.1.3) Confirm the server has made it to the login prompt (normal boot)
1.1.3.1) If not, send an alarm via email
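A rough shell sketch of that pseudo code, assuming the wakeonlan utility, ping and rsnapshot are installed; the host name, MAC address and e-mail address are placeholders:

    #!/bin/sh
    SERVER=storage.local
    MAC=00:11:22:33:44:55

    # 1) poll whether the server is up
    if ! ping -c 1 -W 2 "$SERVER" >/dev/null 2>&1; then
        # 1.1.1) wake it up
        wakeonlan "$MAC"
        # 1.1.2) wait up to ~2 minutes for it to boot
        for i in $(seq 1 24); do
            sleep 5
            ping -c 1 -W 2 "$SERVER" >/dev/null 2>&1 && break
        done
        # 1.1.3) confirm it is reachable, otherwise 1.1.3.1) alarm by mail
        if ! ping -c 1 -W 2 "$SERVER" >/dev/null 2>&1; then
            echo "backup server did not wake up" | mail -s "backup alarm" me@example.com
            exit 1
        fi
    fi

    # server is up: run the usual rsnapshot backup
    rsnapshot daily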
Attempting to create a backup script to copy files from one file system to a remote file system.
When I try this I get:
Quote:
# tar -cf - /mnt/raid_md1 | gzip -c | ssh -i ~/.ssh/key -l user@192.168.1.1 "cat > /mnt/backup/fileserver.md1.tar.gz"
tar: Removing leading `/' from member names
Pseudo-terminal will not be allocated because stdin is not a terminal.
ssh: Could not resolve hostname cat > /mnt/backup/fileserver.md1.tar.gz: Name or service not known
[Code].....
I know that the remote file system dir is RW and the access is working fine. I am stumped...
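The error comes from the -l option: ssh takes "user@192.168.1.1" as the login name, so the next argument (the quoted cat command) is parsed as the host name, hence "Could not resolve hostname cat ...". Dropping -l and passing user@host directly should fix it:

    tar -cf - /mnt/raid_md1 | gzip -c | \
        ssh -i ~/.ssh/key user@192.168.1.1 "cat > /mnt/backup/fileserver.md1.tar.gz"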
I already have an Ubuntu backup server at my location and need this one server to be backed up remotely in another state. The other location is a helpdesk, so there's a danger that they could gain access to confidential data. I'll be setting up this new server as an FTP server, but I need the FTP folder to allow access only to the backup server and me. Because it's remote on the helpdesk side, they'll need some access to the file system, but they must be completely blocked off from the FTP folder where all the data is. How can I make sure I keep them away from my data and am still able to retrieve or copy files over without permission issues between both servers?
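The simplest safeguard is ordinary Unix permissions: give the FTP data directory to a dedicated backup account and remove all access for everyone else, so helpdesk logins can use the rest of the file system but cannot enter that directory (path and user name are placeholders):

    # dedicated account that owns the backup data
    adduser backupuser
    mkdir -p /srv/backup
    chown backupuser:backupuser /srv/backup
    chmod 700 /srv/backup      # only backupuser (and root) can enter

If the helpdesk users also log in over FTP, confining them to their home directories (for example with vsftpd's chroot_local_user=YES) adds another layer.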
Rsnapshot is a utility written in Perl for backing up local and remote file systems. The well-proven rsync is behind it. rsnapshot does not need root-user intervention to restore a normal user's data, and it does not take much space on your backup server. It can easily be automated (scheduled) to make life easier: a set-it-up-once-and-forget-it configuration. Basically, it takes snapshots of a file system (or part of one) at regular intervals, such as hourly, daily, weekly and monthly.
This can be configured easily through a simple text-based configuration file, and the above task can be set up in a few easy steps in a few minutes. The two major tasks are configuring rsnapshot and setting up automatic OpenSSH login. To run the backup automatically, we need to automate the remote login in a secure way, which can be done with the OpenSSH tools. This scenario depicts backing up desktop data (assuming the desktop's IP address is 192.168.0.100) to a backup server. My desktop runs Ubuntu 10.04 and the backup server runs Debian Squeeze. [URL]
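As a concrete illustration of those two tasks, something like the following, run on the backup server, would cover the scenario described. The "backup" account on the desktop is a placeholder; exact directive names vary a little between rsnapshot versions (older releases use "interval" where newer ones use "retain"), and the fields in rsnapshot.conf must be separated by tabs:

    # 1. one-time passwordless login from the backup server to the desktop
    ssh-keygen -t rsa
    ssh-copy-id backup@192.168.0.100

    # 2. minimal /etc/rsnapshot.conf fragment (TAB-separated fields)
    snapshot_root   /backup/snapshots/
    retain  daily   7
    retain  weekly  4
    backup  backup@192.168.0.100:/home/    desktop/

    # 3. schedule it from cron, e.g. daily at 02:00 in /etc/crontab
    # 0 2 * * *  root  /usr/bin/rsnapshot daily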
I'm looking for a free backup solution that works client-server across both environments: Linux (server) and Windows (clients). In my case, I want to assign a disk space quota on my Linux server to each remote Windows client.
I am trying to create a backup script that will back up a single folder for a class I am in, and I was wondering if I could get some help. If possible, I would also like to know how to write a script that can encrypt that same file. I will be putting the backup in my /home/usr/Backup directory. I am not trying to back up my whole system, just a single folder. I am using Fedora 11.
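A minimal sketch of both parts, using tar for the copy and gpg's symmetric mode (-c) for the encryption so no key setup is needed; the source folder name is a placeholder, and ~/Backup stands in for the Backup directory mentioned above:

    #!/bin/sh
    # back up one folder into an encrypted archive under ~/Backup
    SRC="$HOME/class_folder"                         # placeholder folder to back up
    DEST="$HOME/Backup/class-$(date +%F).tar.gz.gpg"

    mkdir -p "$HOME/Backup"
    # tar+gzip the folder and pipe it into gpg; -c prompts for a passphrase
    tar czf - "$SRC" | gpg -c -o "$DEST"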
I'm just setting up a partition on a separate HDD in my system. I plan to use the partition to back up the important files on my main HDD (to guard against an HD crash).
The question I have is: where would be the typical location to auto-mount this partition? Which of these would it be normal to go for:
I'm running SuSE 11 with sendmail and Dovecot. I'm sending and receiving mail, but as soon as someone sends me a mail with an attachment over 1 MB it seems to fail. I have no problem sending big attachments.
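When incoming messages fail around a fixed size, the usual suspect is sendmail's message size limit. Assuming that is the cause, it is set with confMAX_MESSAGE_SIZE in the .mc file (value in bytes; 10 MB shown) or, equivalently, with the MaxMessageSize option in sendmail.cf; file locations and the rebuild step vary by distribution, and on SuSE the .cf may be regenerated by the distribution's own config tools:

    dnl in the sendmail .mc file - allow messages up to ~10 MB
    define(`confMAX_MESSAGE_SIZE', `10485760')

    # or directly in sendmail.cf:
    # O MaxMessageSize=10485760

    # then rebuild sendmail.cf (if using the .mc route) and restart sendmail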
In an office network, how can incoming downloads be queued and later scheduled by priority? Is there any open source project/tool for this? I have heard about the Squid proxy, but does it allow rescheduling?
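Squid does not really re-schedule queued downloads, but its delay pools can throttle and prioritise bandwidth per group of clients, which is the closest built-in mechanism. A small squid.conf sketch (the network and the rates are examples):

    # squid.conf: cap the office at ~512 KB/s overall, ~64 KB/s per client
    acl office src 192.168.1.0/24
    delay_pools 1
    delay_class 1 2
    delay_parameters 1 524288/524288 65536/65536
    delay_access 1 allow office
    delay_access 1 deny all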
I'm trying to set up my Thunderbird 3.1.8 (running on Ubuntu 10.04 LTS) to play a .wav file of my choice when new mail arrives. The file I've chosen is 8,8 KB in size and lasts 3 seconds. I've selected the "Play a sound" + "Use the following sound file" options in my Preferences, but only the first second (if that) of the sound gets played - both when I click the "Play" button in the Preferences menu to test it, as well as when receiving mail. Restarting the program with my new settings doesn't help.
Actual question(s): - Is there a limit on the size/duration of the sound one can use for such a signal in this program? - Is there any way to modify this limit?
Request your help in writing a shell script for the following requirement:
1) scan each incoming mail, with conditions on sender name and subject
2) scan the mail body for a set of characters and e-mail them
General usage is as follows. I get a mail like this:
from: [URL]...
subject : Urgent, reply
body :
name : xyz
contact : 12345
The script should mail the following as the subject line: n:xyz c:12345. I know procmail is a good option for this, but I'm compelled to use a shell script.
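A hedged sketch of such a script, assuming it receives one message on standard input (for example from a .forward file that pipes mail into it); the sender address, subject keyword and destination address are placeholders:

    #!/bin/sh
    # read the whole message from stdin
    msg=$(cat)

    # conditions on sender and subject (placeholders)
    printf '%s\n' "$msg" | grep -qi '^From:.*sender@example.com' || exit 0
    printf '%s\n' "$msg" | grep -qi '^Subject:.*Urgent'          || exit 0

    # pull the "name :" and "contact :" lines out of the body
    name=$(printf '%s\n' "$msg" | sed -n 's/^name[[:space:]]*:[[:space:]]*//p' | head -n 1)
    contact=$(printf '%s\n' "$msg" | sed -n 's/^contact[[:space:]]*:[[:space:]]*//p' | head -n 1)

    # mail the extracted values as the subject line
    echo "" | mail -s "n:$name c:$contact" admin@example.com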
How can I drop or forward an incoming connection coming from part of a host name, like *.alicedsl.de? For example: a user connects from *.alicedsl.de on port 12345. How can I drop this connection, or forward it to google.com on port 80?
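iptables matches IP addresses, not host-name wildcards, so *.alicedsl.de first has to be translated into the provider's address ranges (the 198.51.100.0/24 range below is only a documentation placeholder, not alicedsl.de's real allocation). With a range in hand, dropping or redirecting looks like this:

    # drop connections from that range to port 12345
    iptables -A INPUT -p tcp --dport 12345 -s 198.51.100.0/24 -j DROP

    # or redirect them elsewhere (placeholder target address; redirecting to
    # an external site such as google.com rarely works as hoped)
    iptables -t nat -A PREROUTING -p tcp --dport 12345 -s 198.51.100.0/24 \
        -j DNAT --to-destination 192.0.2.80:80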
I want to set it up so mail from server22 only goes to [URL] and not to others (like Yahoo, Hotmail, Gmail etc.). I searched the threads, but no luck.
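Assuming server22 runs Postfix, one way to restrict destinations is a transport map that accepts the one allowed domain and maps everything else to the error transport (allowed.example.com stands in for the [URL] domain):

    # /etc/postfix/transport
    allowed.example.com   :
    *                     error:mail to outside domains is not permitted

    # main.cf:
    #   transport_maps = hash:/etc/postfix/transport
    # then:
    postmap /etc/postfix/transport
    postfix reload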
I am failing to receive e-mail on client machines. My server runs Red Hat Linux 7.3. Can you help me with the Linux command to view the sendmail configuration?
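On Red Hat 7.3 the main sendmail configuration usually lives in /etc/sendmail.cf (later layouts use /etc/mail/sendmail.cf), so viewing it is just:

    less /etc/sendmail.cf
    # or, on systems with the newer layout:
    less /etc/mail/sendmail.cf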
It downloads today's mail again and again whenever I click the Get Mail button. It's very difficult for me to configure Thunderbird again, as it would take a lot of time to re-download the older mail.