I've been a DOS/Windows guy for 20 years, and recently became a SW test lab helper. My company uses CentOS for a lot, so I've become familiar with it, but obviously not as comfortable as I am with Windows.
Here's what I have planned:
machine: Core 2 Duo E8400, 8GB DDR2, 60GB SSD OS drive, ATI 4650 video card, other storage is flexible (I have 3 1TB drives and 4 750GB drives around that can be used in this machine.)
uses: HTPC, network storage, and VMware Server host (SMTP, FTP, and web server virtual machines)
I've figured out how to do much of this, but I haven't figured out how to do backups in Linux. I've been spoiled by Windows, with its built-in backup system that's so simple to use. I find myself overwhelmed by the array of backup software and unable to determine which to use. None of them seems to do everything I need, but some come close, I think. I'm hoping someone here can help me figure out which program to use and how to use it.
Here is what I need the backup software to do:
1. scheduled unattended backups, with alerts if the backups fail
2. a weekly full backup with incrementals every 12 hours (roughly the schedule sketched below)
3. removal of the old backups when the new full backup runs; I would prefer to keep 2 weeks of backups, but that's not necessary
4. a GUI would be preferable, since my arthritic fingers don't always do what I want them to do. I typo things a lot, as the worn-off label on my Backspace key can attest.
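To make point 2 concrete, this is the kind of schedule I have in mind. It's only an untested sketch; backup.sh is a hypothetical wrapper for whatever tool gets recommended, and the paths are made up:

# /etc/crontab entries (sketch only)
# full backup every Sunday at 01:00
0 1 * * 0      root  /usr/local/bin/backup.sh --full        /mnt/backup
# incremental every 12 hours (01:00 and 13:00) on the other days
0 1,13 * * 1-6 root  /usr/local/bin/backup.sh --incremental /mnt/backup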
I have a Eucalyptus instance (VM) running an older version of CentOS (5.3?). As a VM it has no graphical display. I'd like to run a graphical app there so that it displays back on my local machine.
I used xhost locally and it shows the remote IP (the Eucalyptus instance) as enabled. On the remote side (the Eucalyptus instance) I set DISPLAY with:
export DISPLAY=xx.xx.xx.xx:0.0
where xx.xx.xx.xx is my local IP address. Oh, I did install X in the VM instance (yum groupinstall "X Window System"). X is installed but not running there (does the point of origin of the X app need to have X running as well, and what does that mean in a VM, which is a non-graphical environment?).
Anyway, I try to run xclock (from remote to local) and get the typical error.
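For reference, the exact sequence I'm attempting is roughly this (IP addresses are placeholders):

# on my local machine (where the X server is actually running):
xhost +<instance-ip>                 # allow the eucalyptus instance to connect
# on the eucalyptus instance (over ssh):
export DISPLAY=xx.xx.xx.xx:0.0       # xx.xx.xx.xx = my local machine's IP
xclock &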
I installed syslog-ng so I can receive remote logs. This is working; however, since I disabled syslog on my syslog-ng server, I am no longer logging locally to /var/log/messages, cron, and some others. I know this is because my syslog-ng.conf only references remote sources and not local ones. How can I edit the syslog-ng.conf file so that I receive both remote and local logs? I tried this, but when I added in portions of the default config, I only received local logs and no remote logs anymore. I am forwarding my config.
# syslog-ng configuration file.
#
# This should behave pretty much like the original syslog on RedHat. But
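In case it helps frame what I'm after, here is the shape of what I think the config needs. This is only a sketch based on the stock CentOS 5 layout; the source/destination names are just my own:

source s_local {
        unix-stream("/dev/log");
        internal();
        file("/proc/kmsg");
};
source s_remote {
        udp(ip(0.0.0.0) port(514));
};
destination d_messages { file("/var/log/messages"); };
# log both local and remote messages to /var/log/messages
log { source(s_local);  destination(d_messages); };
log { source(s_remote); destination(d_messages); };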
In the office there is a local network with a samba+openldap PDC. The local domain name is company.net. The company decided to create a corporate website on remote hosting and decided that the site's domain should be company.net, which is the same as the local network's domain name. So now it is not possible to reach that corporate website from within the company's local network because, I guess, the bind9 installed on the above-mentioned PDC resolves company.net to a local web server. Is there a way to let people on this local network browse the remote site?
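To show what I'm imagining (not sure it's the right approach): adding records to the internal company.net zone that point the website names at the hosting provider's public address, while everything else stays local. The IP below is just a placeholder:

; internal company.net zone on the PDC's bind9 (placeholder IP)
www.company.net.   IN  A   203.0.113.10
; the bare name is trickier, since the domain controller may need
; company.net itself to keep resolving locally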
I've become reliant on NetObjects Fusion for building and maintaining my web site. NOF is still only offered as a Windows app, and it is the only reason I have to keep Windows installed. Can anyone suggest a comparable application (preferably open source) that will run on Linux?
I am trying to set up a web-based FTP site on my home server. I have FTP up and running and I can log into it using an FTP client, but I want to set it up so I can get to it from the web. I put the directory in the www/html folder, but that does not seem to help.
If anyone could point me in the right direction, that would be great. I also need to let anonymous users access it.
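For what it's worth, this is roughly what I was expecting to need on the Apache side. It's only a guess on my part, and /var/ftp/pub is just where my FTP root happens to live:

# guess at an Apache config snippet (e.g. dropped into conf.d/ftpdir.conf)
Alias /files /var/ftp/pub
<Directory /var/ftp/pub>
    Options +Indexes
    Order allow,deny
    Allow from all
</Directory>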
I'm currently backing up our home data (pictures, videos, our CDs ripped to FLAC, which I spent a lot of time tagging accurately), totalling almost 300 GB, on two external USB drives, one of which is meant to stay at a friend's. I left the factory MS-DOS filesystem as it was, thinking it could be useful to be able to connect the drives to a Windows machine with no problems. It's certainly useful to have «normal» data that I can take with me, e.g. when visiting my family.
I'm simply using rsync manually, checking for suspicious changed or deleted files before committing the change. I do that every 2 weeks or so.
Now I want to add file integrity management to my backup scheme: I want to be able to check that new data I'll be committing has not been tampered with (an integrity check before updating tags on my main drive), and I want to be able to check that the backed-up data is still intact on my USB drives, especially if I ever need to recover from data corruption on my main drive.
Since I'm essentially mirroring the data, I thought run-of-the-mill integrity software would let me just rsync the integrity database along with it, and I'd be done.
But after browsing through the docs of Tripwire, AFICK and the like, I fear they work only with absolute paths, so the database for my main drive wouldn't work for my USB drive, which is obviously mounted elsewhere when I plug it in.
So, I feel I'm missing something. It looks to me like I'm trying to solve a very common problem; how do people do it?
Did I miss a file integrity software that works with backups?
Is there a trick like using a symbolic link pointing to whatever file hierarchy I want to check, and having Tripwire/AFICK/... monitor that link?
Should I run a more elaborate backup system than plain rsync? Which one? (storeBackup, for instance, looks promising since it involves MD5 sums, but it's targeting a completely different problem, and I'm not sure I can use it at all for what I need.)
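For what it's worth, the kind of thing I'm picturing is relative-path checksums generated from the top of each tree, so the same list works no matter where the drive is mounted. A rough sketch (the mount points are mine):

# on the main drive: build a checksum list with relative paths
cd /data && find . -type f -print0 | xargs -0 md5sum > /tmp/checksums.md5

# on the USB drive (mounted anywhere): verify against the same list
cd /media/usb-backup && md5sum -c /tmp/checksums.md5 | grep -v ': OK$'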
Does anyone know of an application for making copies of web sites that can be read offline? I've tried using wget, but with very mixed results. Something a bit more reliable would be useful.
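For reference, the sort of invocation I've been trying looks roughly like this (example.com is a placeholder):

wget --mirror --convert-links --page-requisites --no-parent http://example.com/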
As the title says, I'm trying to build myself a local RPM mirror. I have multiple laptops and a desktop that use Fedora 11, so I used 'rsync' to set up and sync my directories. I next went on to create my repo with the 'createrepo' tool. I run my server backend on FreeBSD, so I moved my data over there and set up my 'lighttpd' service.
Everything went fine until I used 'rsync' and synced up my data again. Am I supposed to run 'createrepo' after each sync? If so, does anyone else use the same kind of setup, even if not FreeBSD but some other non-Linux OS for the server they run this from? I've been dealing with this for 2 weeks now and finally gave up researching and testing and decided to ask, which is not something I'm good at doing. Check my registration date against my first post date.
Edit: FreeBSD doesn't have an official or unofficial port for this. I noticed it seems to be written as a Python script, so I thought I could somehow get it to run on FreeBSD with Linux emulation.
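My current update cycle looks roughly like this (the mirror URL is a placeholder and the paths are just my own layout); the createrepo step at the end is the part I'm asking about:

# pull the upstream mirror into my local tree
rsync -avz --delete rsync://mirror.example.org/fedora/releases/11/Everything/i386/os/ /data/repo/fedora11/

# regenerate the repo metadata after every sync
createrepo --update /data/repo/fedora11/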
I installed a web server on Fedora 13 and configured everything the way I wanted, but when I go to localhost the screen is just white even though the site has files in it; this also happens with phpMyAdmin.
I am starting to develop websites in Ubuntu. I have set up the Apache server, PHP, MySQL, and phpMyAdmin. I have also set up a local domain for the site. Everything works great with HTML files, but how do I get Firefox to open .php files instead of trying to save them?
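My guess is that Apache isn't actually handing .php files to PHP, so the browser just gets offered a download. Is the fix roughly this (package and module names from memory, so treat as approximate)?

sudo apt-get install libapache2-mod-php5
sudo a2enmod php5
sudo /etc/init.d/apache2 restart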
I am running Ubuntu 10.04 and have one website up and running from my local server. From my local computer on the same network I can type in the IP address of my server and I do get the default "web server is running" page; however, when I try to connect to my domain, it cannot be found. I have run a ping, and it reaches it just fine.
I have a problem with Ubuntu accessing a local site within my company network. This network has access to the Internet, but the company's site does not have a public address; it is something like "http://e-learning.local". I can access this site from any Windows machine, but I can't access it from Ubuntu. Only if I use the IP address can I access it. However, I have normal access to other sites on the Internet.
Do you have any suggestions on how to solve this problem?
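In case it's relevant, the hosts line in my Ubuntu machine's /etc/nsswitch.conf looks roughly like the default below. I have read that the mdns4_minimal entry can intercept .local names before DNS is ever consulted, but I'm not sure whether that's my problem:

# /etc/nsswitch.conf (default-ish hosts line on my machine)
hosts: files mdns4_minimal [NOTFOUND=return] dns mdns4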
I'm looking at building a small NAS setup for my home to hold all my media (DVDs, pictures, music, etc.). I'm not too concerned with backups of it at this point, as I have all the data elsewhere as it is, but I would like one central spot to access it.
A friend of mine turned me on to Ubuntu for this, but he's never tried it, so I'm looking to see if anyone else has done this or if there are issues with it. I've used Linux off and on for several years, but this would be my first Ubuntu install.
I'm looking at either using some older hardware I have sitting around, or something cheap like a small Dell server (T310 or something). I'd get four 2TB drives and just use them in a software RAID 5 array. I'm thinking it would be accessed through a few different Samba shares.
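The Samba side of what I'm picturing would be a handful of shares along these lines (share names and paths are just placeholders):

# /etc/samba/smb.conf excerpts
[movies]
   path = /srv/media/movies
   read only = yes
   guest ok = yes

[pictures]
   path = /srv/media/pictures
   read only = no
   valid users = myuser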
1) Should questions about websites be posted here or in the server platform forum?
2) I'm trying to host a website from an old computer I have. I'm running Ubuntu 10.04, and I have installed the LAMP package. I've made its IP address static on my router (Linksys WRT160N v3). I can SSH into it from my desktop on the same home network.
I've been following this guide: URL... I'm trying to make the server public. I've set up a DMZ on my router to forward to my server's local IP address.
So now, after looking up my external address on whatsmyipaddress.com, I can log onto my server from my desktop, which is on my home network, but not from anywhere else. I'm fairly sure that my IP address from my ISP is dynamic, but it hasn't changed in the last hour. Let me know if you need any more details; I'm very new to all of this.
How do you get rsync to do incremental backups rather than full backups? At the moment I have a script that will create a backup folder (if it doesn't already exist) and then copy the source files into the backup directory with the command
Target is where the files will be backed up to, Sources is the directory (or directories) to be backed up, Exclude files is the list of files not to back up, and Log file is where the output will be saved. At the moment it only does full backups, but I would like to do incremental ones; how would this be achieved? Am I missing an option in rsync that is required?
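From reading the man page, I suspect the answer involves --link-dest, something like the sketch below. The variable names match the items described above, and the dated-folder layout is just my guess at how it would be organised:

# hypothetical incremental run: unchanged files become hard links to the
# previous backup, so each dated folder looks like a full backup but only
# changed files take up new space
TODAY=$(date +%Y-%m-%d_%H%M)
rsync -av --delete \
    --exclude-from="$EXCLUDE_FILE" \
    --link-dest="$TARGET/latest" \
    $SOURCES "$TARGET/$TODAY" >> "$LOGFILE" 2>&1
# repoint "latest" at the backup that just finished
ln -nsf "$TARGET/$TODAY" "$TARGET/latest"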
I need some advice, tips, or maybe your own experiences about building a home data storage box or NAS. Here are some thoughts/requirements I think it should meet:
It should be expandable: I'll stick in a couple of 1TB HDDs and a little later I'll add some more.
It should integrate easily with both Ubuntu and Windows 7; ideally it'll be an integrated part of the filesystem.
I'm thinking of some sort of RAID as a way of backing up my data. RAID 1 seems like such a waste, but then again, these days HDDs are cheap.
And when I do add more HDDs, I'd like them to appear as one big storage unit instead of separate drives.
Any suggestions and tips on how to go about this are welcome. Questions are plenty: should I go with server hardware, or is a bigger ATX case and standard hardware enough? I'll need some pointers, so keep 'em coming.
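One approach I've been reading about for the "one big storage unit that can grow" part is mdadm RAID 1 pairs with LVM on top. A rough sketch of the commands, if I understand it right (device names are placeholders):

# mirror the first pair of 1TB disks
sudo mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sdb /dev/sdc
# put LVM on top so the pool can grow later
sudo pvcreate /dev/md0
sudo vgcreate storage /dev/md0
sudo lvcreate -l 100%FREE -n data storage
sudo mkfs.ext4 /dev/storage/data

# later, when the next mirrored pair (/dev/md1) is added:
sudo pvcreate /dev/md1
sudo vgextend storage /dev/md1
sudo lvextend -l +100%FREE /dev/storage/data
sudo resize2fs /dev/storage/data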
However, the page I'm downloading has remote content from a domain other than somedomain.com. I've been asked to download that content too. Is this possible with wget?
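What I've pieced together from the man page, though I haven't confirmed it covers this case, is something like the following (the second domain is a placeholder for wherever the remote content lives):

# follow page requisites onto other hosts, but only the listed domains
wget -p -k -H -D somedomain.com,othercdn.example.com http://somedomain.com/page.html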
I am trying to set up a home web server for my personal site using Ubuntu 11.04 and Apache. I have set up a user called www and given it FTP access to its home area (/home/www) using vsftpd. I then edited /etc/apache2/sites-available/default and set the DocumentRoot directive to /home/www. When I made a test index.html file in that directory it worked fine. Then I FTP'd to the server (as www) from another PC and uploaded the site files. Now when I try to access the site I get a 403 (Forbidden) error. Obviously I'm doing something wrong here, but I'm not sure what. What should I do to fix this?
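My current suspicion is permissions on the uploaded files, since vsftpd may have created them without world-read access. This is what I'm planning to try, using the paths above:

# let Apache descend into the home directory and read the uploaded files
chmod o+x /home /home/www
find /home/www -type d -exec chmod o+rx {} \;
find /home/www -type f -exec chmod o+r {} \;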
I have a few mail servers, a mail log server, and a web server running on CentOS 5. Now I have a task: to avoid accidental crashes on the production servers while installing updates, my boss asked me to make clones of the servers (excluding the actual e-mails and log contents) and run those clones as VMware virtual machines on VMware Server. This way, I will first install and test updates on the clones and, if they run without crashes, I will apply the updates to the real production servers themselves.
I have already installed VMware Server 2.0. I have a few questions: How do I build the virtual machines so that they exclude the actual mail files and mail logs? Can I use VMware Converter for this purpose, or do I have to use another program? How do I actually do this cloning? Is there a tutorial on how to do this?
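The only part I can picture so far is copying the system over while skipping the mail spools and logs, something like the rsync below (as opposed to a block-level VMware Converter clone, which I assume would drag everything along). Whether that is the right way to build the clone is exactly my question; clone-vm is a hypothetical hostname:

# copy the live system into a clone that already has a base OS, skipping data
rsync -av \
    --exclude='/var/spool/mail/*' \
    --exclude='/var/log/*' \
    --exclude='/proc/*' --exclude='/sys/*' --exclude='/dev/*' --exclude='/tmp/*' \
    / root@clone-vm:/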
Running Ubuntu 9.10. In the Remote Desktop configuration dialog I get: "Your desktop is only reachable over the local network. Others can access your computer using the address 127.0.0.1 or tabatha.local." I understand this means only the loopback IP address is available. All my other machines show their true local IP address (e.g., 192.168.1.104) in this dialog. Thus I cannot log on to this desktop from other machines.
When I try to do a remote logon from another Ubuntu 9.10 box (or from an XP box using a VNC viewer), I get: "Connection to 192.168.1.102 has been closed." What steps are needed to make this machine show its actual IP address? All file sharing between the various machines is working properly, and all Windows shares back and forth between XP and Linux, and among the various XP and Linux boxes, are available as designed.
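I suspect /etc/hosts is involved, since (as I understand it) Ubuntu ties the hostname to a loopback address by default. This is the kind of change I'm wondering about trying, assuming the machine's LAN address really is 192.168.1.102:

# /etc/hosts as Ubuntu typically sets it up (hostname tied to loopback)
127.0.0.1   localhost
127.0.1.1   tabatha.local tabatha

# what I'm wondering about using instead
127.0.0.1   localhost
192.168.1.102   tabatha.local tabatha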
What this is doing is simply wiping the remote files within the folders (not the folders themselves) and not actually syncing anything down to my local folder (as in, no files at all end up in /local/folder).
I need to SSH into a Slackware 12 box to provide remote support. I came up with this, but it doesn't provide real two-way communication.
while : ; do read -p "Enter text to Local: " TXT ; DISPLAY=:0 kdialog --inputbox "$TXT" ; done
So this loops and all, but it doesn't keep a history, and I have to wait for a response from the local operator. If the operator has changed focus I can be waiting all day for a response, and I would have to start another session to post a second comment.
What is nice is that it's small and I can create the .sh when I remote in.
-----Update since I started
I now have two scripts to take over from the first one. I have to have four SSH sessions running to get this to work.
1st SSH: moves (archives) any old chat.txt and creates a new one; it also fires off a .sh that opens a console tailing chat.txt so the local operator can see the chat history.
2nd SSH: fires off a .sh that loops a local kdialog input box which appends to chat.txt.
3rd SSH: runs tail -f on chat.txt on the remote box so I can see the chat history.
4th SSH: loops a read -p in the SSH session so I can append to chat.txt.
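For clarity, the local-side helpers boil down to something like this. It's only a sketch, and it assumes KDE with konsole available on the Slackware box; the chat.txt path is just my choice:

#!/bin/sh
# local-chat.sh - run once on the remote box over SSH
CHAT=/tmp/chat.txt
mv -f "$CHAT" "$CHAT.$(date +%s)" 2>/dev/null   # archive any old chat
touch "$CHAT"
# window the local operator watches for the running history
DISPLAY=:0 konsole -e tail -f "$CHAT" &
# loop an input box for the local operator; each reply is appended
while : ; do
    REPLY=$(DISPLAY=:0 kdialog --inputbox "Reply to support:") || break
    echo "local: $REPLY" >> "$CHAT"
done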
..and the "listeners" with: load-module module-rtp-recv
Then, playing on the sender and using PulseAudio Volume Control's Playback tab to set "Null Output", my listeners all start working as expected. The outstanding problem is that the sender itself is silent: nothing comes from its speakers. Perhaps that's not surprising after the "Null" setting above.
Is it possible to stream like this and also listen on the sender at the same time?
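What I'm wondering is whether something like the following on the sender would do it: keep the RTP send on the null sink's monitor, and add a loopback from that monitor back to the real sound card. The sink name on the last line is a placeholder for whatever "pactl list short sinks" reports on the sender:

# sender's /etc/pulse/default.pa (sketch)
load-module module-null-sink sink_name=rtp
load-module module-rtp-send source=rtp.monitor
# also play the same audio on the sender's own speakers
load-module module-loopback source=rtp.monitor sink=alsa_output.pci-0000_00_1b.0.analog-stereo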