Ubuntu Servers :: Encrypt And Upload To A Remote Server Through SSH?
Jan 5, 2010
I have a compressed backup that I want to encrypt and upload to a remote server through SSH once in a while. The problem is its size: more than 4 GB. If the connection drops, how would scp know to resume? This should be an automated process.
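scp itself cannot resume a partial copy, but rsync over SSH can, which makes it the usual tool for an automated job like this. A minimal sketch, assuming GnuPG for the encryption step; paths, passphrase file, and hostname are placeholders:

    # Encrypt the backup symmetrically (AES-256) without prompting:
    gpg --batch --yes --symmetric --cipher-algo AES256 \
        --passphrase-file /root/backup.pass \
        --output /backups/backup.tar.gz.gpg /backups/backup.tar.gz
    # --partial keeps interrupted transfers so a rerun resumes them:
    rsync --partial --progress -e ssh \
        /backups/backup.tar.gz.gpg user@remote.example.com:/backups/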
I have set up a LAMP server with Ubuntu 10.04 Server Edition x86 for my studies, in VMware Workstation 7.1. For an assignment I had to write a PHP script that uploads a file to the server and stores its name in a MySQL database. According to the book, the server should place the image, under a cryptic name, in the /tmp/ folder.
This isn't working, and I also tried locate and find to track down the image I uploaded. I checked php.ini and file uploads were on, but no upload folder was set, so I set it to /tmp/; still no images. Can anyone help me with enabling this function?
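A likely explanation, assuming a stock setup: PHP stores each upload under a random ("cryptic") temporary name and deletes it as soon as the request ends unless the script moves it with move_uploaded_file(), so searching /tmp afterwards finds nothing. To confirm the relevant settings (the php.ini path is Ubuntu 10.04's):

    grep -E '^(file_uploads|upload_tmp_dir)' /etc/php5/apache2/php.ini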
I support a small business which runs a headless Ubuntu Server (10.04, 32-bit) as a file server accessed by Windows machines. Although the company has its own backup procedure, they have decided to back up some (non-sensitive) files online. They have chosen FileFactory (http://www.filefactory.com/) as the host for this. FileFactory allows files to be uploaded to their server by FTP; however, I do not know how to set this up on the server.
The idea, if it is possible, is to connect to FileFactory through FTP and then synchronise the data using an rsync command. I normally access the server through Webmin, and it has vsftpd installed. I can access the company's server by FTP from inside and outside the network, so I know that vsftpd is working for incoming connections; however, I cannot work out how to configure it to connect to the FileFactory server.
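Two clarifications first: vsftpd is purely a server for incoming FTP connections, so it plays no part in uploading, and rsync cannot speak FTP at all. A client such as lftp can give an rsync-like one-way sync over FTP; a sketch, with hostname, credentials, and paths as placeholders:

    lftp -u ftpuser,ftppass ftp.filefactory.com \
        -e "mirror --reverse --only-newer /srv/share/backup /uploads; quit"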
Does anyone know the best and simplest way to do this? I'd like the share to be mounted over the tunnel at boot with as little scripting as possible, and to be as secure as possible without exposing more than one port to the outside. I will be trying this method: [URL]... Once the tunnel is established and 'always on', NFS would take care of the filesystem mount, obviously. Much of the information I have been reading seems out of date. Does anyone have any experience with this?
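For what it's worth, a minimal sketch of the 'always on' tunnel plus mount, assuming NFSv4 (which needs only port 2049) and autossh installed; host and export names are placeholders:

    # Keep a persistent tunnel from client port 3049 to the server's NFS port:
    autossh -M 0 -f -N -L 3049:localhost:2049 user@fileserver.example.com
    # Mount the export through the tunnel (NFSv4 over TCP):
    sudo mount -t nfs4 -o port=3049,proto=tcp localhost:/export /mnt/share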
I am running a local web server, mainly for development, and have everything set up. The issue is that I cannot shut down the machine from the command line: I can issue the command, but the machine remains on. I also cannot get to the desktop via VNC. The reason is that there is no monitor attached, so Ubuntu reports that it is trying to run in low-graphics mode; I found this out after plugging a monitor into the server. So the question is: how do I get around this? How can I set up Ubuntu to get past this, or do I need to install Ubuntu Server?
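One workaround worth trying for a headless box, assuming the low-graphics dialog is what blocks things: boot straight to text mode so the dialog never appears (GRUB 2 syntax):

    # In /etc/default/grub, set:
    #   GRUB_CMDLINE_LINUX_DEFAULT="quiet text"
    sudo update-grub
    sudo shutdown -h now    # should then halt cleanly from SSH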
I am looking for a client/server application that can give me remote GUI access (e.g. VNC) but with the ability to use sound.
I see that TeamViewer has a Linux application out now, but for my company it would compromise security protocol, because it reaches out to the TeamViewer servers to do any work.
Basically, we have applications that require a GUI, and we need to hear sounds from them if they throw an alert (the app is a monitoring tool).
We are currently running a VNC server on it, and when our monitoring machine was on Windows, TightVNC was able to get the sound; that is not the case with any Linux client, so I am wondering if one actually exists.
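One approach that keeps everything inside your own network: tunnel PulseAudio back over the same SSH session you use for VNC, so the remote applications play through the local speakers. A hedged sketch, assuming PulseAudio on both ends:

    # 1. On the local (viewing) machine, accept loopback audio clients:
    pactl load-module module-native-protocol-tcp auth-ip-acl=127.0.0.1
    # 2. Connect with a reverse tunnel so the server can reach local audio:
    ssh -R 4713:localhost:4713 user@monitored-server
    # 3. On the server, point the monitored GUI apps at the tunnel:
    export PULSE_SERVER=tcp:localhost:4713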
I already have an Ubuntu backup server at my location and need this one server to be backed up remotely in another state. The other location is a helpdesk, so there is a danger that staff there could gain access to confidential data. I'll be setting up this new server as an FTP server, but I need the FTP folder to allow access only to the backup server and me. Because the machine is on the helpdesk side, they'll need some access to the file system but must be completely blocked off from the FTP folder where all the data lives. How can I keep them away from my data and still be able to retrieve or copy files without permission issues between the two servers?
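A sketch using plain Unix permissions plus vsftpd's chroot option; all names are placeholders:

    # Only members of the 'backup' group may enter the FTP data directory:
    sudo groupadd backup
    sudo usermod -aG backup backupuser
    sudo chown root:backup /srv/ftp/backups
    sudo chmod 770 /srv/ftp/backups
    # In /etc/vsftpd.conf, confine FTP logins to their home directories:
    #   chroot_local_user=YES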
I discovered a problem while running a script on a remote server. Take this little demonstration script:
This runs as expected (it does nothing) on both my local desktop and the remote server. However, if I log in via ssh to the remote server and run this script, everything on the server freezes. I do mean everything: ftp/ssh/nfs/login-requests/... Even if I physically plug in a monitor and keyboard, there is no response; it basically needs a full restart to clear. I have repeated this several times. Something is clearly wrong here, but does anyone know what? I didn't think a non-admin user should be able to lock up system processes.
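Without the script itself this is only a guess, but total lock-ups like this usually point to runaway resource consumption (for example uncontrolled process creation), which an unprivileged user can cause unless limits are enforced:

    ulimit -u    # show the current per-user process cap
    # A system-wide cap can be set in /etc/security/limits.conf, e.g.:
    #   *  hard  nproc  512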
I have limited access to several servers (key-based auth), but the cron facility is not available to me. The servers keep filling up with large Apache logs, and I have to log in to each node manually and clean them every day.
I tried to write a script to run from the login box, but when I run it, it looks for the logs on the local server (the login box) instead.
So the current situation is:
How can I modify this so that the script will look for the files on server1 itself and zip them?
Google turned up another command called rsh, but in my environment it is not available either.
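A common cause of that symptom: globs and variables in the remote command expand on the login box before ssh ever runs, and single-quoting the command string defers expansion to the remote side. A minimal sketch, with hostnames and log paths as assumptions:

    # The single quotes make each server expand the glob itself:
    for host in server1 server2 server3; do
        ssh "$host" 'gzip -f /var/log/apache2/*.log.1'
    done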
I am playing around with the idea of using a cloud or instance-based service to install Ubuntu 9.10 Server, which will give me remote access via the SSH command line. So far, I've installed Ubuntu 9.10 Server plus the Ubuntu desktop in a virtual machine; I can access it via SSH and locally via the desktop. However, in the real environment the only access I am going to have initially is via SSH.
I would like to be able to connect to the machine using Windows Remote Desktop or VNC (whichever is easier and, most importantly, most secure). Even though the desktop is installed, I need to configure the remote access entirely from the command line. I've been trawling support forums for days, but I can't find a working solution for 9.10 Server that fits my situation, where I have no physical access to the desktop or machine to configure remote desktop. It all has to be done via SSH/command line.
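A sketch that can be done entirely over SSH: run a virtual desktop with vnc4server and reach it only through an SSH tunnel, so no extra port is exposed (the package name is from the 9.10-era repositories):

    sudo apt-get install vnc4server
    vncpasswd        # set a VNC password once
    vncserver :1     # starts a virtual X session listening on port 5901
    # From the client, tunnel the port and point a VNC viewer at
    # localhost:5901:
    ssh -L 5901:localhost:5901 user@server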
I was trying to install PEAR on a remote Ubuntu server using PuTTY. I ran apt-get install php-pear and everything went smoothly, but now I cannot access the website: the browser says it "can't establish a connection to the server", and Firebug shows the request status as "aborted". I even tried adding the PEAR path to php.ini and restarting Apache, but no luck.
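Since the browser reports a failed connection rather than a PHP error, my first guess would be that Apache did not come back up cleanly after the restart. Worth checking before blaming PEAR (the log path is the Ubuntu default):

    sudo apache2ctl configtest            # syntax-check the configuration
    tail -n 50 /var/log/apache2/error.log
    sudo /etc/init.d/apache2 restart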
I have a fast computer in my office, and I want the person using the slow computer in the same office to boot up, see the login window (GDM), log in from there to the fast computer, and use their session on the fast computer at the same time that I am locally logged in to it as a different user with my own session. Is this best done through XDMCP? Where is a good tutorial on how to set this up?
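A sketch of the XDMCP route, with the caveat that the configuration file location varies across GDM versions (this is the classic one):

    # On the fast machine, in /etc/gdm/custom.conf:
    #   [xdmcp]
    #   Enable=true
    # Restart GDM, then from the slow machine query it:
    X :1 -query fast-machine.local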
I have an Ubuntu file server and an Ubuntu laptop, both running 10.10. For some reason I can't connect to the server (file shares or a remote terminal) from the laptop, even though I can reach it over SSH from a terminal on my Mac and have been able to mount the filesystem from another computer running the Ubuntu live CD. I just get the error 'no route to host'. I've tried turning off the firewall on the laptop and reinstalling SSH on both computers, but I don't have a clue what to do next!
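'No route to host' is normally a network- or firewall-level failure rather than an SSH problem, so basic connectivity is the place to start (the server address is a placeholder):

    ping 192.168.1.10      # can the laptop reach the server at all?
    ip route               # confirm laptop and server share a subnet/gateway
    sudo iptables -L -n    # on the server: look for REJECT/DROP rules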
I have a problem with my Terminal Server Client: I cannot seem to connect to any Windows server via IP address. Can anyone recommend a tool I could use to connect? I need to work on the servers' admin interfaces, for example Admin Kits for workspace protection.
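For what it's worth, a command-line client can help separate a tool problem from a network problem; rdesktop is one option (IP and username are placeholders):

    rdesktop -u Administrator 192.168.1.10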
I am able to telnet to the server via a remote connection, but for some reason it will not accept mail. Here is the bounce-back email I am getting:
Quote:
This is the mail system at host smtp.mydomain.net.
I'm sorry to have to inform you that your message could not be delivered to one or more recipients. It's attached below.
For further assistance, please send mail to postmaster. If you do so, please include this problem report. You can delete your own text from the attached returned message.
The mail system
<email@example.com>: temporary failure. Command output: pipe: fatal: pipe_command: execvp /usr/bin/perlbin/vendor/spamc: No such file or directory
Reporting-MTA: dns; smtp.mydomain.net
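The execvp error means Postfix is configured to pipe mail through spamc at a path that does not exist on this system (/usr/bin/perlbin/vendor/ looks like a path from a different distribution). A hedged first step:

    # Find where spamc actually lives, if it is installed at all:
    which spamc
    # Then point the pipe entry in /etc/postfix/master.cf at that path
    # and reload Postfix:
    sudo postfix reload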
I am running PHP on my Ubuntu server, and I am trying to upload files to it.
Here's my code:
I use move_uploaded_file(), but I can't get the destination to work correctly: with "/var/www/upload/" the files end up in "/var/www/". This is the only way I can get this code to work, but I would like the uploads to land in "/var/www/upload/".
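A guess, since the code isn't shown: move_uploaded_file()'s second argument must be the full destination file path, not just a directory, so a missing filename (or a missing slash when concatenating one) can drop the file a level up, which matches the symptom. It's also worth confirming the directory exists and that Apache's user can write to it:

    ls -ld /var/www/upload                        # must actually exist
    sudo chown www-data:www-data /var/www/upload  # www-data is Apache's user
    sudo chmod 775 /var/www/upload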
I have been trying to complete the following project: 1) configure an FTP server where we can upload and download files; 2) the server must start at 9 pm and stop at 9 am automatically. Although the first task was easy, I have no idea how to accomplish the second (not to mention I'm a new user).
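Assuming vsftpd, and that "the server" means the FTP service rather than the whole machine, root's crontab can start and stop it on schedule:

    # Edit with: sudo crontab -e
    # minute hour  (0 21 = 9 pm, 0 9 = 9 am)
    0 21 * * * /usr/sbin/service vsftpd start
    0 9  * * * /usr/sbin/service vsftpd stop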
I have been trying to make it possible to upload images to my website via a web browser. I have created the HTML form, and I can browse to a file on my desktop and upload it. But my question is: where is it on my server after I upload it? I am running 8.04 on my server.
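Assuming a typical PHP handler, the file ends up wherever the script's move_uploaded_file() call puts it; if the script never moves it, PHP silently discards the temporary copy when the request finishes. A quick way to spot recently written files under the web root:

    find /var/www -mmin -10 -type f -ls    # files modified in the last 10 min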
I would like to upload files via FTP or SFTP to my web directory at /var/www/...
Originally I installed openssh-server (through apt-get, before learning about tasksel). I assumed this only offered SSH support and not FTP, so after a quick search I installed vsftpd. I would like to learn how to configure OpenSSH, and I mention vsftpd in case there is a conflict.
Right now I am able to log in to my server box through ssh/ftp, but I can only modify my home directory. I created a directory /var/www/andrew and set its permissions to drwxr-xr-x, but I am unable to upload files to this folder.
What do I have to do to resolve this, and is there anything else I should know about my situation?
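Two notes: openssh-server already includes an SFTP subsystem, so vsftpd is not strictly needed for SFTP; and drwxr-xr-x lets only the directory's owner write, so if root created /var/www/andrew your own account cannot upload into it. Assuming your username is andrew:

    sudo chown andrew:andrew /var/www/andrew
    ls -ld /var/www/andrew    # should now show andrew as the owner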
But I can't upload files or pictures at all. Every time I try to upload, it tells me: "can't create file or folder ...... Is its parent folder writable by the server?" At this point it's looking like a bug to me. I have tried three methods suggested on the net, but none of them worked for me:
1. Change the folder permissions to 777.
2. Use "wp-content/uploads" rather than "/wp-content/uploads" as the upload path.
3. Use the full URL instead of "wp-content/uploads".
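On Ubuntu the usual fix is ownership rather than mode bits, because Apache runs as www-data and 777 on one folder does not help if a parent directory blocks it. A sketch, with the WordPress path as an assumption:

    sudo chown -R www-data:www-data /var/www/wordpress/wp-content/uploads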
When I try to upload files via FTP to a specific folder (/var/www/myfolder/), I get the message "550 permission denied". I set CHMOD 777 on this folder (I had to do it via SSH, because I was not allowed to in FileZilla).
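A 550 from vsftpd despite mode 777 often means uploads are disabled server-side rather than at the filesystem level; a hedged sketch:

    # In /etc/vsftpd.conf make sure uploads are allowed:
    #   write_enable=YES
    #   local_umask=022
    sudo service vsftpd restart    # apply any change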
I am setting up a small competition for a forum where there is a set of programming tasks to complete (in various languages). At the end of the allocated time, the fastest program that successfully completes the task wins. I would like to set up a server similar to the way the Google AI Challenge works, where users can upload code to the server via an HTML page and the code is executed and profiled on the server.
The problem is obviously security. So: how can I configure my server so that the uploaded scripts run in a completely jailed environment, with no access to anything outside their folder and no ability to execute commands on the server?
The chain of events looks like this: user writes script on local machine -> user zips script -> user browses to server HTML page -> user uploads script -> script is compiled and executed on the server -> user is shown output from script.
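Real jailing needs several layers (chroot or containers, resource limits, no network), and anything short of that is unsafe for untrusted code. Purely as an illustration of one layer, a sketch that runs a submission as a throwaway user with CPU and memory caps; every name here is a placeholder, and this alone is NOT a complete sandbox:

    # Create an unprivileged account with no home directory:
    sudo useradd --no-create-home --shell /usr/sbin/nologin sandbox
    # Run the compiled submission with CPU (seconds) and memory (KB) caps:
    sudo -u sandbox bash -c 'ulimit -t 10 -v 262144; cd /srv/jail/job1 && ./entry'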
I have a server (Fedora 11, LAMP). I want to know whether I can upload something to my server via HTTP (I mean from the WAN) and have that data stream go directly into the MySQL database. Do I need to write some special code on my web page, or just change Apache's configuration file?
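To be clear about the division of labour: Apache only hands the HTTP request to your application, so a change to the Apache configuration alone cannot write into MySQL; you need a server-side script (PHP, CGI, etc.) that reads the upload and performs the INSERT. Assuming such a script existed at a hypothetical upload.php, you could exercise it from the WAN like this:

    # curl posts the file as a multipart form field named "datafile":
    curl -F "datafile=@backup.csv" http://your-server.example.com/upload.php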
I've written PHP code to upload files with move_uploaded_file(). I've given 777 to the folder, but the upload fails and the error log shows a permission error. I tested the code on CentOS and it works. If I change the destination folder on Fedora to /tmp it works, but it only works on /tmp: neither /tmp/abc nor /temp. I guess it's a configuration problem in Apache or PHP. I also copied httpd.conf and php.ini from CentOS, but mod_file_cache and mod_mem_cache are not found on Fedora.
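The pattern "works in /tmp but nowhere else, even with 777" on Fedora is the classic signature of SELinux rather than of Apache or PHP settings. Worth checking (the destination path is an assumption):

    getenforce    # "Enforcing" means SELinux is active
    # Label the upload directory so Apache may write to it:
    sudo chcon -R -t httpd_sys_rw_content_t /var/www/html/uploads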
We are running into issues with a file upload script written in PHP. We can upload files without issue except for .*x files (such as .docx), which give permission denied errors. The error occurs when we call move_uploaded_file() to move the upload into a directory within our PHP app. If we give the uploads folder 777 access it works without error; I don't like that. So I set it to 775 (I don't like this either), but it didn't work until I gave group ownership to www-data (which I really don't like).
This issue only happens on our production server, which is Ubuntu 9.04 running Apache 2.2 and PHP 5 with all the newest updates. We also have all MIME types configured and are able to download the file from Apache without error. The first thing we noticed, before the file permission error, was that the MIME type changed to .zip when we used the mime_content_type() function; yet in the $_FILES array it still showed .docx.
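Two notes. First, .docx files genuinely are ZIP containers, so mime_content_type() reporting zip is expected and harmless. Second, since Apache runs as www-data on Ubuntu, some form of www-data write access is unavoidable without 777; the conventional compromise is a group-writable setgid directory (the path is a placeholder):

    sudo chgrp www-data /var/www/app/uploads
    sudo chmod 2775 /var/www/app/uploads   # setgid: new files keep the group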
I'm looking to implement a website where business partners can download and upload documents. The files and the "partner areas" should be password-protected. Are there open source projects or Ubuntu packages readily available for implementing this type of web-based file-sharing service?
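Beyond ready-made packages, one minimal approach with stock Apache is HTTP basic auth per partner directory; a sketch with placeholder names:

    # Create a password file and a first partner account:
    sudo htpasswd -c /etc/apache2/partners.htpasswd partner1
    # In the vhost (or an .htaccess) for /var/www/partners/partner1:
    #   AuthType Basic
    #   AuthName "Partner Area"
    #   AuthUserFile /etc/apache2/partners.htpasswd
    #   Require user partner1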