Ubuntu Servers :: FTP From Server To Remote File Server?
Jul 6, 2010
I support a small business which runs a headless Ubuntu Server (10.04 32-bit) as a file server that is accessed by Windows machines. Although the company has its own backup procedure, they have decided to back up some (non-sensitive) files online. They have chosen FileFactory (http://www.filefactory.com/) as the host for this. FileFactory allows files to be uploaded to their server by FTP, however I do not know how to set this up on the server.
The idea, if it is possible, is to connect to FileFactory through FTP and then synchronise the data with an rsync-style command. I normally access the server through Webmin and it has vsftpd installed. I can access the company's server by FTP from inside and outside of the network, so I know that vsftpd is working for incoming connections, however I cannot work out how to configure it to connect to the FileFactory server.
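The closest I have come up with so far is something along these lines with lftp, since as far as I can tell rsync itself cannot talk to a plain FTP server; the host name, login and paths here are just placeholders, not FileFactory's real details:

# mirror -R uploads local -> remote, --only-newer skips files that have not changed
lftp -u myuser,mypass ftp.filefactory.com <<'EOF'
mirror -R --only-newer /srv/share/backup /uploads
quit
EOF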
I have an Ubuntu file server and an Ubuntu laptop both running 10.10. For some reason I can't connect to the server (file or remote terminal) from the laptop, even though I can access SSH through the terminal on my Mac and have been able to mount the filesystem on another computer running the Ubuntu live CD. I just get the error 'no route to host'. I've tried turning off the firewall on the laptop and reinstalling SSH on both computers, but I don't have a clue what to do next!
Does anyone know the best and simplest way to do this? I'd like the share to be mounted over the tunnel on boot with as little scripting as possible, and to be as secure as possible without exposing more than one port to the outside. I will be trying this method: [URL]... once the tunnel is established and 'always on', NFS would obviously take care of the file system mount. Lots of the information I have been reading seems out of date. Does anyone have any experience with this?
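What I was picturing, very roughly, is something like the following, assuming NFSv4 (which only needs TCP port 2049) and with the host name and export path as placeholders:

# forward a local port to the server's NFS port, then mount through the tunnel
ssh -f -N -L 3049:localhost:2049 user@fileserver.example.com
sudo mount -t nfs4 -o port=3049 localhost:/export /mnt/share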
I have a compressed backup that I want to encrypt and upload to a remote server through SSH once in a while. The problem is the size, more than 4 GB. If the connection drops, how does scp know to resume? This should be an automated process.
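From what I have read, scp itself cannot resume, but rsync over SSH is supposed to pick up where a broken transfer left off. Something like this sketch (file names and host are placeholders):

# encrypt the archive, then transfer; --partial keeps a half-finished file
# on the remote side so a re-run resumes instead of starting over
gpg --symmetric --output backup.tar.gz.gpg backup.tar.gz
rsync --partial --progress -e ssh backup.tar.gz.gpg user@remote.example.com:/backups/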
I am running a local webserver, mainly for development. I have everything set up. The issue I currently have is that I cannot shut down the machine from the command line. I can issue the command but the machine remains on. I also cannot get to the desktop via VNC. The reason for this is that there is no monitor attached, so Ubuntu says it is trying to run in low-graphics mode. I found this out when I plugged a monitor into the server. So the question is, how do I get around this? How can I set up Ubuntu to get past this, or do I need to install Ubuntu Server?
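For reference, this is the sort of thing I am issuing over SSH (assuming sudo rights); the gdm line is only a guess at getting past the low-graphics prompt:

sudo shutdown -h now        # or: sudo poweroff
sudo service gdm stop       # stop the display manager first if it is blocking shutdown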
I am looking for a client/server application that can allow me to have remote GUI access (eg: VNC) but with the ability to use sound.
I see that TeamViewer has a linux application out now but for my company, it would compromise security protocol because it reaches out to the TeamViewer servers to do any work.
Basically, we have applications that require a GUI and we need to hear sounds from them if they throw an alert (the app is a monitoring tool).
We are running a VNC server on it currently and when our monitoring machine was on Windows, TightVNC was able to get the sound...that is not the case with any Linux client so I am wondering if there is actually one that exists.
I have limited access to several servers (key-based auth) but the cron facility is not available to me. Those servers are getting filled up by large Apache logs and I have to log in to each node manually and clean them every day.
I tried to write a script to run from the login box, but when I try that it looks like it is looking for logs on the local server (the login box).
So the current situation is:
How can I modify this so that the script on server1 will look for files on that server and zip them?
Google showed another command called rsh, but in my environment it is also not available.
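The rough idea I want to end up with is below: push the whole command to each box over SSH so it runs remotely instead of on the login box (host names and log paths are placeholders):

# single-quoting the command keeps the glob and $(...) from expanding locally
for host in server1 server2 server3; do
    ssh "$host" 'cd /var/log/apache2 &&
        for f in *.log; do gzip -c "$f" > "$f.$(date +%F).gz" && : > "$f"; done'
done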
I already have an Ubuntu backup server at my location and need this one server to be backed up remotely in another state. The other location is a helpdesk, so there's a danger that they could gain access to confidential data. I'll be setting up this new server as an FTP server but need to set the FTP folder to allow access only to the backup server and me. Because it's remote on the helpdesk side, they'll need some access to the file system but need to be completely blocked off from the FTP folder where all the data is. How can I make sure I keep them away from my data and still be able to retrieve or copy files over without permission issues between the two servers?
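My rough plan so far, with user, group and path names as placeholders, is to give the FTP data directory to a dedicated user and group and rely on plain filesystem permissions plus a chrooted vsftpd login:

sudo groupadd backupftp
sudo useradd -d /srv/ftp/backup -g backupftp -s /usr/sbin/nologin backupuser
sudo chown backupuser:backupftp /srv/ftp/backup
sudo chmod 770 /srv/ftp/backup      # accounts outside the group get no access at all
# in /etc/vsftpd.conf: chroot_local_user=YES keeps the FTP login inside its home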
I discovered this while running a script on a remote server. Take this little demonstration script:
[Code]...
This runs as expected (it does nothing) on both my local desktop and the remote server. However, if I log in via SSH to the remote server and run this script, everything on the server freezes. I do mean everything: ftp/ssh/nfs/login requests... even if I go and physically plug in a monitor and keyboard there is no response. It basically needs a full restart to clear. I have repeated this several times. Now something is clearly wrong here, but does anyone know what? I didn't think a non-admin user should be able to lock up system processes.
I need to parse a file of the same name which exists on different servers and calculate the count of a string that exists in both files. Say a file abc.log exists on 2 servers. I want to search for the string "test" in both files and calculate the total count of the search string's occurrences. For example, if abc.log on server 1 contains "test" 2 times and abc.log on server 2 contains it 4 times, then the output would be:
StringName : Count
test : 6 times
Note: I have created passwordless connectivity using ssh-keygen.
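The sort of thing I have in mind, assuming the key-based SSH logins already work (host names and the log path are placeholders):

STRING=test
TOTAL=0
for host in server1 server2; do
    # grep -o prints every match on its own line, so wc -l counts occurrences rather than lines
    COUNT=$(ssh "$host" "grep -o '$STRING' /var/log/abc.log | wc -l")
    TOTAL=$((TOTAL + COUNT))
done
echo "$STRING : $TOTAL times"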
I am playing around with the idea of being able to use a cloud or instance-based service to install Ubuntu 9.10 Server. This will enable me to have remote access via an SSH command line. So far, I've installed Ubuntu 9.10 Server + Ubuntu Desktop to a virtual machine. I can access this via SSH and locally via the desktop. However, in the real environment the only access I am going to have initially is via SSH.
I would like to be able to connect using Windows Remote Desktop or VNC (whichever is easier and, most importantly, most secure) to the machine. Even though the desktop is on there, I need to somehow configure the remote access all from the command line. I've had a read of various forums and have been trawling support forums for days, but can't find a working solution for 9.10 Server or one that fits my situation, where I will not have any physical access to the desktop or machine to configure remote desktop. It all has to be done via SSH/command line.
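One route I am considering, which can apparently be set up entirely over SSH (the host and display number here are placeholders):

sudo apt-get install vnc4server
vnc4server :1 -geometry 1280x800 -depth 24   # starts a VNC desktop on display :1 (port 5901)
# from the client, tunnel the VNC port instead of opening 5901 to the world:
ssh -L 5901:localhost:5901 user@server.example.com
# then point a VNC viewer at localhost:5901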
I was trying to install PEAR on a remote Ubuntu server using PuTTY. I ran an apt-get install php-pear command and everything went smoothly, but now I cannot access the website: it says 'can't establish a connection to the server' and in Firebug it shows the status 'aborted'. I even tried adding the PEAR path to the php.ini file and restarting Apache, but no luck.
I have a fast computer in my office and I want the person using the slow computer in the same office to boot up, see the login window (gdm), log in from there into the fast computer, and be able to use their session on the fast computer at the same time I am locally logged in to the fast computer as a different user and session. Is this best done through XDMCP? Where is a good tutorial on how to set this up?
The RSA key was generated on the example.com server using example.com as the CN (Common Name).
GoDaddy's website adds the extra names to a CSR you provide, does the checks and grants the cert.
My problem is that whilst the certificate works fine on the server example.com (from which the CSR was created), it comes up with two errors when restarting Apache on the remote servers.
1>> Certificate common name does not match server name
2>> SSL Library error - check private key: key mismatch
I don't understand how these keys could ever work, as no reference to the private keys of the remote servers is ever used in creating the UCC certificate.
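The only check I know of is to compare the modulus of the certificate and of whatever key file each remote Apache points at (file names are placeholders):

openssl x509 -noout -modulus -in example_com.crt | openssl md5
openssl rsa  -noout -modulus -in example_com.key | openssl md5
# the two hashes must be identical; a UCC/SAN cert is still tied to the one
# private key behind the original CSR, so the remote servers would need a copy
# of that key rather than their own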
I have a problem with my Terminal Server Client: I cannot seem to connect to any Windows server via IP address. Can anyone recommend a tool I could use to connect? I need to work on the servers' admin interfaces, for example Admin Kits for workspace protection.
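As a fallback I have been testing with rdesktop from a terminal, just to see whether RDP itself is reachable (the IP address and account are placeholders):

sudo apt-get install rdesktop
rdesktop -u Administrator -g 1280x800 192.168.1.50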
I'm hoping to set up a cron job that takes a file and copies it to a remote password-protected FTP server. I've got a command that formats the file with the correct name and I've put it in the anacron file in /etc/cron.d (which I think is right, I haven't tested it yet). I'm not sure how to copy the file to a remote server, though. I do actually have the FTP server bookmarked in my Places menu. So is there a simple way of supplying a file path that will put it straight into that folder? The only problem I can see with this is that the connection won't be open continuously, so it would need to be re-opened when needed (I could presumably save the password in the keyring so that I don't need to be there to type it in).
Or maybe set up a cron job that connects to and mounts the ftp server a minute before it has to copy the file over?
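Or is it simpler to let the cron job itself do the upload non-interactively? Something like this /etc/cron.d entry with curl (host, credentials, paths and schedule are placeholders):

# m h dom mon dow  user        command
15 2 * * *   backupuser  curl -T /home/backupuser/report.csv ftp://ftp.example.com/incoming/ --user login:password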
I am able to telnet to the server via a remote connection, but for some reason it will not accept mail. Here is the bounce-back email I am getting.
Quote:
This is the mail system at host smtp.mydomain.net. I'm sorry to have to inform you that your message could not be delivered to one or more recipients. It's attached below. For further assistance, please send mail to postmaster. If you do so, please include this problem report. You can delete your own text from the attached returned message. The mail system
<root@mydomain.net>: temporary failure. Command output: pipe: fatal: pipe_command: execvp /usr/bin/perlbin/vendor/spamc: No such file or directory
Reporting-MTA: dns; smtp.mydomain.net
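The 'No such file or directory' seems to point at the spamc path in the Postfix pipe transport, so my plan is to check where spamc really lives and correct the path (the perlbin/vendor path looks like a leftover from another system):

which spamc                               # e.g. /usr/bin/spamc
grep -n 'spamc' /etc/postfix/master.cf    # find the pipe transport using the bad path
# fix that argv= line to use the real path, then:
sudo postfix reload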
I have a web GUI in PHP where users can make certain settings. How can I edit a file on a remote server from my PHP webserver? Currently I use my FTP client, vsftpd and a chrooted user in a specific directory where the file resides. I think this is pretty safe as long as nobody else uses my FTP client. How can I make changes to this file on the remote server from within my PHP code on my webserver (so that my users can make the changes from an HTML form rather than me)? I found this, but the credentials for the FTP connection are in plain text:
<?php
$file = fopen("ftp://login:passwd@server", "w");
if (!$file) {
    echo "<p>Unable to open remote file for writing.";
[code].....
I use HTTPS for the web GUI, but I guess this does not mean the connection to the remote server will also be encrypted? Can I use my FTP user (which has no shell) from within PHP to edit the file?
I have installed a Samba server. Now I am trying to shift data from the Linux machine to a Windows machine, and it gives the following error: NT_STATUS_ACCESS_DENIED opening remote file history.txt. The firewall and iptables are also disabled.
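For reference, this is roughly how I am copying the file (host, share and credentials are placeholders); my guess is that if this still returns NT_STATUS_ACCESS_DENIED, the share or NTFS permissions on the Windows machine need to grant write access to that account:

smbclient //winbox/shared -U winuser%winpass -c 'put history.txt'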
If I need to append a set (or sets) of data to a file(or files) on remote hosts what is the best mechanism by which to do that? My first thought was ssh but the command syntax to append to a remote file isn't clear to me. Can anyone point me in the right direction here?
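The closest I have got is piping the data into a remote shell whose only job is to append it (hosts and paths are placeholders), but I am not sure if this is the sane way:

cat new-records.txt | ssh user@remotehost 'cat >> /var/data/combined.log'
# or, for several hosts:
for host in host1 host2; do
    ssh "$host" 'cat >> /var/data/combined.log' < new-records.txt
done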
I have been working on NTP to find a resolution to my issue but have been unable to. Let me briefly explain. I have three servers and no server is fully synced with the remote NTP server. I don't know why they sync time alternately with the remote NTP server and LOCAL, whereas there is no issue with connectivity/reachability between the NTP server and the NTP clients. Also, server 1 reports kernel time sync disabled 0001.
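So far the only check I know is to look at which source each server actually selects:

ntpq -p    # the '*' in the first column marks the selected peer
# if the selection keeps flipping to LOCAL(0), removing the local-clock lines
# (server 127.127.1.0 and its fudge entry) from /etc/ntp.conf stops the local
# clock from competing with the real upstream server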