How do I set up a script to log on to a server through SSH, copy a file from the server to the local machine, and then run a script on the downloaded file? More specifically, I've got a Minecraft server running on Ubuntu.
I know that I can do
# ssh username@ipaddress
to log on to the server through the terminal.
After this it asks me for a password. Once I type in the password I have access to the directories on the server. How can I set up a script to log on to the server and enter the password? Once I can do that, how do I automate copying a file from that server to ~/Desktop? If I can get that far, I already have a script that will run from there.
I've learned that I can do
# scp -r remoteuser@remotebox:/remote/directory /local/directory
to copy files from a server to my local machine, but it still asks me for the server password. How do I make it so that the password is entered automatically?
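For what it's worth, the usual answer here is to stop needing the password at all: key-based authentication lets both ssh and scp run non-interactively, and the copy can then live in a plain shell script. A minimal sketch, with hypothetical file names:

Code:
# one-time setup: create a key pair and install the public key on the server
ssh-keygen -t rsa
ssh-copy-id username@ipaddress

#!/bin/bash
# fetch-and-run.sh -- file names below are examples, adjust to your setup
scp remoteuser@remotebox:/remote/directory/backup.tar.gz ~/Desktop/
~/Desktop/process-backup.sh ~/Desktop/backup.tar.gz

After ssh-copy-id has run once (it prompts for the password that one last time), every later ssh/scp to that account goes through without a prompt.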
As a former TVersity user, I enjoyed being able to remotely access my server on my phone outside of my home network to search for and download/stream songs. The ability to search for a song and download it to my phone (via HTTP) was my favorite feature. Looking through the popular media servers available for Ubuntu Server, I have not found anything that can accomplish this (they seem designed to stream only on the internal network).
I have a question about using Ubuntu to download files from an HTTP server remotely and didn't know where to put it, so hopefully it falls under general support. Anyway, I am about to move into a place with an incredibly slow internet connection and a tiny data allowance. My brother has said that, if possible, I can use his internet connection to download any large files to a box I can just leave at his place; then I can simply come over every few weeks, copy said files to a hard drive, and all will be well. The problem is that I am not sure how to do this.
Today I went out, bought a few parts, and built a cheap computer with a HDD big enough to hold whatever I need. However, when I got home I realised I had no idea how I was going to handle the software aspect of this. Is there any way that I can access that computer remotely over the internet and schedule fairly large downloads from an HTTP server? After talking to a friend I was told that I need to install the server version of Ubuntu for this to work; is this correct? Also, if it's relevant, the computer I have for this uses an "Intel Desktop Board D510M0 + Intel Atom Processor D510", which is 64-bit.
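A common pattern for this job, as a sketch with a hypothetical hostname (any Ubuntu flavour with the openssh-server package can do it; the server edition just skips the desktop):

Code:
# from anywhere, assuming the router at your brother's place forwards a port to the box
ssh user@brothers-box.example.com

# on the box: start a big download that keeps running after you log out
nohup wget -c http://example.com/big-file.iso -P /srv/downloads &

wget -c also lets an interrupted download resume where it stopped, which matters on a box you only check every few weeks.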
I've been using PAN for quite some time; I recently installed 0.133 from the Ubuntu Software Centre. It worked fine for a while, no issues. Then, a week or so ago, it started downloading .msg files along with the binary files I was getting: sometimes one .msg file for one binary, sometimes quite a few. It seems to correlate with the size of the binary, that is, the larger the binary, the greater the number of .msg files downloaded. This morning, it would ONLY download .msg files. I could see the decoded binary in the PAN viewer pane, but it isn't present on my system. I have made NO changes to any configuration files, other than installing the recommended updates.
How do I correct this? Can anyone tell me what these .msg files are, and how to stop them from downloading? Are there as-good or better newsreaders out there that I can/should try?
When I try to load a PHP file, my browser downloads the file instead of displaying the page. Here is my Apache conf file.
Code:
#
# Based upon the NCSA server configuration files originally by Rob McCool.
# This is the main Apache server configuration file. It contains the
# configuration directives that give the server its instructions.
# See [URL] for detailed information about the directives.
# Do NOT simply read the instructions in here without understanding what
# they do. They're here only as hints or reminders. If you are unsure
# consult the online docs. You have been warned.
#
# The configuration directives are grouped into three basic sections:
#  1. Directives that control the operation of the Apache server process as a
#     whole (the 'global environment').
#  2. Directives that define the parameters of the 'main' or 'default' server,
#     which responds to requests that aren't handled by a virtual host.
#     These directives also provide default values for the settings of all
#     virtual hosts.
#  3. Settings for virtual hosts, which allow Web requests to be sent to
#     different IP addresses or hostnames and have them handled by the same
#     Apache server process.
#
# Configuration and logfile names: If the filenames you specify for many of
# the server's control files begin with "/" (or "drive:/" for Win32), the
# server will use that explicit path. If the filenames do *not* begin with
# "/", the value of ServerRoot is prepended -- so "/var/log/apache2/foo.log"
# with ServerRoot set to "" will be interpreted by the server as
# "//var/log/apache2/foo.log".
#
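Nothing in that comment block decides how .php is handled; the download-instead-of-render symptom usually means no PHP handler is wired in at all. A sketch of the usual checks on a Debian/Ubuntu-style Apache (paths assumed):

Code:
# is mod_php enabled? a working setup lists php5.load / php5.conf here
ls /etc/apache2/mods-enabled/ | grep php
# if nothing shows up, enable it and restart
sudo a2enmod php5
sudo /etc/init.d/apache2 restart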
I'm trying to download Star Wars: The Clone Wars torrents from two different feeds. The scantime below is temporary while I'm trying to get it to work; I'll change it to 15 min later on. Below is the verbose output.
Quote:
INFO --- RSSDler 0.4.2
DEBUG writing daemonInfo
INFO [Waking up] Wed Oct 27 18:12:11 2010
DEBUG checking working dir, maybe changing dir
I have rented a remote dedicated Fedora 7 system. The service provider provisioned the system and does not allow direct "root" access to it.
I currently use PuTTY to log in as "user1", switch to "root" using su -, and then edit the file.
I would like to edit root-owned files directly from my home XP system, which has an SFTP connection to the remote Fedora 7 system (using Aptana RadRails).
I am not sure how to do that. I had 2 ideas but I am just guessing and afraid to proceed with so little knowledge.
I considered editing /etc/group and changing the line from root:x:0:root to
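For what it's worth, rather than loosening /etc/group, a less drastic route may be to grant user1 a narrow sudo rule and edit through it. A sketch, with an assumed file path:

Code:
# on the server, added via visudo (user and path are assumptions):
user1 ALL=(root) NOPASSWD: sudoedit /etc/httpd/conf/httpd.conf

# then, from any SSH-capable client, edit in one step:
ssh -t user1@server sudoedit /etc/httpd/conf/httpd.conf

This doesn't give the SFTP client direct write access, but it avoids changing root's group entry, which could lock you out or open the system wide.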
I have an SSH server set up on CentOS 5.5 which I can successfully use to access my file system remotely.
On this machine, I also have a partition with XP installed on it. Is there a way I can set up the SSH server so that I can remotely access the files on the XP partition?
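SSH/SFTP simply exposes whatever the Linux filesystem has mounted, so mounting the XP partition should be enough. A sketch, with the device name as an assumption (check fdisk -l for the real one):

Code:
sudo mkdir -p /mnt/xp
sudo mount -t ntfs-3g /dev/sda2 /mnt/xp
# permanent version, as an /etc/fstab line:
# /dev/sda2  /mnt/xp  ntfs-3g  defaults  0  0

On CentOS 5.5 the ntfs-3g driver is not in the base repos; it's typically pulled in from a third-party repository such as EPEL.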
I have an Ubuntu 9.10 server set up at my house with Apache2 and PHP5 installed on it. Every time I browse to the server and try to load the PHP index page, it downloads instead of displaying.
I have virtual servers set up and the files are stored at /home/cusinndzl. If anyone needs to take a look, I can let them into the Webmin panel.
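Same symptom as the post above, and on Ubuntu the first thing worth confirming is that the Apache PHP module itself is installed and enabled; having PHP5 the language present isn't enough. A sketch for 9.10-era packages:

Code:
sudo apt-get install libapache2-mod-php5
sudo a2enmod php5
sudo /etc/init.d/apache2 restart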
Ubuntu Lucid 32-bit desktop machine. I run "aptitude update" and "aptitude safe-upgrade" manually from the command-line regularly to upgrade my machine. I usually don't pay that much attention to the downloading, but I noticed yesterday that aptitude appeared to be downloading four files simultaneously, rather than the one at a time that I've always seen. Would this be correct? How can I change that? I can't say I've ever noticed it before.
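If I recall the apt.conf(5) documentation correctly (treat this as an assumption worth verifying), APT opens one connection per target host by default, so sources spread across several mirrors download in parallel; the relevant knob is Acquire::Queue-Mode. A sketch:

Code:
# /etc/apt/apt.conf.d/99queue -- "access" queues by URI type instead of per host
Acquire::Queue-Mode "access";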
I would like to be able to access the data files that reside on my Linux machine at home from the Internet, but I don't want to open any "doors", for lack of a better word, that would compromise the security of my files. I am running F11 and I am using cable broadband and a Linksys router. I have been able to get SSH working with OpenSSH while I am at home, but I don't really need or want a remote shell; I would rather set up what I think is called FTP. I just want to be able to upload and download files to my Linux machine.
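The nice part is that OpenSSH already includes this: sftp does FTP-style transfers over the same port 22 you would forward on the Linksys, with no extra daemon to secure. A sketch with an assumed hostname and paths:

Code:
sftp user@home.example.com
sftp> get /home/user/data/report.ods      # download to the local machine
sftp> put photo.jpg /home/user/data/      # upload

Graphical clients (FileZilla, WinSCP, Nautilus's "Connect to Server") speak the same SFTP protocol if the command line feels clunky.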
I understand wget is used to download files. Is there a way I can search a URL for what files are available for me to download? I need to install a plug-in from an Adobe website.
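Plain HTTP has no real directory listing, but if the server publishes an index page, wget can either crawl it and keep only matching files or just show what it links to. A sketch (URL and file pattern are placeholders):

Code:
# fetch everything one level down that matches the pattern
wget -r -l1 -np -nd -A '*.rpm' http://example.com/downloads/
# or only list the links on the index page without saving anything
wget -q -O - http://example.com/downloads/ | grep -o 'href="[^"]*"'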
I want users of my website to be able to download files from it.
When they request a download, I would like them to see a window with the following attributes: the ability to choose where the downloaded file goes; the progress of the download, with an estimate of the remaining time; and a button to cancel the download.
There are hundreds of sites out there that do this, so obviously it's possible, but can it be done from a PHP script on the server? If not, does anyone know of a script in Java(Script)?
I would also like to monitor when a download failed and succeeded, and, if the failure was not because the cancel button was pressed, pick up whatever system information is available about the failure cause.
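The window described is the browser's own download dialog; the save location, progress bar, time estimate, and cancel button all come for free once the server marks the response as an attachment. A PHP script only needs to emit headers along these lines (via header()) before streaming the file with readfile(); the values here are illustrative:

Code:
Content-Type: application/octet-stream
Content-Disposition: attachment; filename="session-recording.mp3"
Content-Length: 1048576

Detecting failure server-side is the harder part: if the script streams the file in chunks itself, PHP's connection_aborted() reports a dropped connection, but it cannot distinguish a pressed cancel button from a network failure.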
I'm not terribly new to Linux, but I am new to the forums, so hear me out! I am in the process of creating an electronic mapwall for our meteorology program, and have designed the computing system from scratch. I have two Linux boxes, each with capabilities for 6 attached monitors: a total of 12 displays driven by two machines. My intention is to have one machine be the master; it has a touchpanel control. The inputs to the touchpanel will then trigger events for both the master and the slave machine to display. Each of them has a specific IP address (DNS entry), and they are not on a subnet.
Now, is there a way to remotely log in to the slave machine and have it display on its OWN monitors? The code is Java, and it works on the master machine to animate directories of .gifs on each of the master's attached monitors. I will most likely have Java execute shell commands for the remote login (ssh), but I believe the answer lies somewhere in the X configuration. Do I have the machines in an adverse configuration (would creating a subnet be better)? Lots of questions... lots of desire... few answers!
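X forwarding is the wrong direction here; the trick is to point DISPLAY at the slave's own local X server instead of tunnelling it back. A sketch run from the master (user, path, and display number are assumptions):

Code:
ssh user@slave 'DISPLAY=:0 nohup java -jar /opt/mapwall/viewer.jar >/dev/null 2>&1 &'

If the X server refuses the connection, the SSH user needs authority over display :0, e.g. by pointing XAUTHORITY at the logged-in session's ~/.Xauthority or via an xhost grant on the slave. No subnet is required for this; plain IP reachability is enough.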
I have some scripts that need a newer version of PHP. I'm running Debian 6, which ships PHP 5.3.3, and I found I could install PHP 5.4 using [URL]. This worked and updated my PHP to a newer version; the only issue is that since the install completed, Apache now downloads the PHP file instead of rendering it.
I'm guessing this has something to do with the Apache configs, but I don't know what to do.
Code:
tom@vps:~$ dpkg --list | grep -E '(apache)|(php5-)'
ii  apache2      2.2.16-6+squeeze12   Apache HTTP Server metapackage
ii  apache2-doc  2.2.16-6+squeeze12   Apache ...
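A package-sourced PHP upgrade can leave the Apache module disabled or replaced, so the first thing worth checking is whether mod_php is still wired in. A sketch:

Code:
# does anything PHP-ish remain enabled?
ls /etc/apache2/mods-enabled/ | grep php
# if not, re-enable and restart
sudo a2enmod php5
sudo /etc/init.d/apache2 restart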
I was thinking of migrating my apt-mirror repository to the recommended ftpsync scripts: [URL] .....
I pre-populated my pool with already-downloaded files and set up the scripts.
However, if I run the bin/ftpsync, and monitor rsync with lsof -p, I can see that it is still downloading files from oldstable (wheezy) despite exclude options.
I'm guessing it's a configuration error, but I can't seem to figure it out. Any thoughts? My etc/ftpsync.conf is as follows:
Actually, I don't think it works like I thought it did. A few guides I found listed the exclude options, but the sample config file has this:
Code:
## If you do want to exclude files from the mirror run, put --exclude statements here.
## See rsync(1) for the exact syntax, these are passed to rsync as written here.
## DO NOT TRY TO EXCLUDE ARCHITECTURES OR SUITES WITH THIS, IT WILL NOT WORK!
#EXCLUDE=""
So it looks like it doesn't exclude the suites at all.
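Architectures, at least, have their own dedicated settings in ftpsync.conf; if I'm reading the stock config right (worth verifying against your version), those are the ARCH_EXCLUDE/ARCH_INCLUDE variables, while suites genuinely cannot be filtered out. A sketch:

Code:
## in etc/ftpsync.conf -- list the architectures you do not want mirrored
ARCH_EXCLUDE="armel ia64 mips mipsel s390 sparc"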
I keep downloading tar.gz files into my Downloads folder and I can't do anything with them. What do I need to do to install the file so I can use it? As an example, I am trying to install Frets on Fire, and am failing badly.
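The generic recipe is to extract the archive and then follow its own instructions, since tar.gz is just a container, not an installer. A sketch (the file name is an example):

Code:
tar -xzf ~/Downloads/some-program.tar.gz
cd some-program/
cat README      # or INSTALL; each project documents its own build/run steps

For Frets on Fire specifically, it may be easier to skip the tarball: the game has been packaged in the Ubuntu repositories, so sudo apt-get install fretsonfire is worth trying first.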
I need to download some very large files (circa 75 GB) from a remote server via SFTP. I've been using SFTP via the command line on my Linux netbook. Around halfway through, the transfer stops and says "stalled." Can anybody recommend a reliable way to download these files?
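When single long-lived SFTP connections keep stalling, a resumable transfer tool tends to be the pragmatic fix; rsync over SSH can pick up a partial file where the last attempt died. A sketch with assumed names:

Code:
# re-run the same command after every stall; --partial keeps what already arrived
rsync --partial --progress -e ssh user@remote.example.com:/data/bigfile.tar .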
How do I download all the files from here: [URL]? I am on FreeBSD 7.0 and I tried wget with the -r switch and it gives me URLs only. Maybe this is simply not an FTP site, I don't know. How can I download all those files with the same directory structure?
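If the links are on plain HTTP, wget's mirror mode usually does what's wanted here; a sketch with a placeholder URL:

Code:
# -m mirrors recursively, -np refuses to climb above the starting directory
wget -m -np http://example.com/files/
# some servers block crawlers via robots.txt; wget honors it unless told not to
# wget -m -np -e robots=off http://example.com/files/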
Is there an easy way to get all of the multilib files from [URL]? I've tried:
1.) wget - won't use wildcards on an http server, and trying to get the whole folder just gets me index.html
2.) ftp - it's not an ftp server and I can't login anonymously via gftp. I wasn't expecting this to work, it was just something else to try.
3.) rsync - If this could work, I may not be using the right syntax. I tried (-n for a dry run first):
Code:
rsync -avn [URL]
And it just sat there doing nothing until I hit Ctrl-C.
Obviously I could download each file from my web browser, but I figured there had to be a more elegant Unix-y way without all the clicky-clicky. Are the files hosted on an FTP server anywhere?
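Two notes on the attempts above, offered as general behaviour rather than knowledge of that particular mirror: wget's wildcards do work over HTTP when given as an accept list instead of in the URL, and rsync only talks to hosts running an rsync daemon (an rsync:// URL), so pointing it at an http:// address gets misparsed and appears to do nothing. A sketch:

Code:
# accept-list wildcard over HTTP (URL and pattern are placeholders)
wget -r -l1 -np -nd -A '*multilib*' http://example.com/packages/
# rsync form, valid only if the mirror exposes an rsync module
# rsync -avn rsync://mirror.example.com/packages/multilib/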
I have a server at home running as a file server and DHCP server, connected to a switch with a wireless AP in there as well. With this setup I can access the files and do some configuration via SSH on the actual server any time I can get the wireless signal. Lately I've felt the need to be able to do the same over the internet. I've read somewhere already that I'm going to need a router with port forwarding and NAT, and then to know the IP address of the server. My problem is that once you start talking about routers, you need a broadband connection, which is something I don't have. Getting a router is not much of a problem, but without an ADSL-like connection I guess it's useless. What do I need to do?
I currently mirror the updates repository to my computer using rsync. I was wondering if I could save space and bandwidth by only rsyncing the .delta.rpm files? Are there any disadvantages to this, or does zypper/YaST handle updates just fine with it?
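Mechanically, rsync's filter rules make a deltas-only mirror straightforward; whether zypper is happy resolving against deltas alone is a separate question I'd test first. A sketch with an assumed mirror URL:

Code:
# descend into every directory, keep only delta RPMs, skip everything else
rsync -av -m --include='*/' --include='*.delta.rpm' --exclude='*' \
    rsync://mirror.example.com/updates/ /srv/mirror/updates/

The include/exclude order matters: '*/' lets rsync recurse, and -m prunes the directories that end up empty.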
I want to limit bandwidth for downloading files with Squid. I want to reserve bandwidth for other traffic (esp. web browsing). I know about delay pools but I don't understand them well. Some users use download managers to download large movie files. I don't want to block downloading, but I want to give them limited bandwidth, maybe 5 KB/s or 6 KB/s, because I have only a 512 Kbps (64 KB/s) connection.
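Delay pools are indeed the mechanism. A minimal squid.conf sketch that throttles matching downloads to roughly 6 KB/s in aggregate (the file-extension list is an assumption; tune it to what your users actually fetch):

Code:
acl bigfiles urlpath_regex -i \.(avi|mkv|iso|zip|rar|exe)$
delay_pools 1
delay_class 1 1
delay_parameters 1 6000/6000
delay_access 1 allow bigfiles
delay_access 1 deny all

Class 1 is a single shared bucket; if each downloader should get an individual cap instead, class 2 pools take a second restore/max pair for the per-client limit.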
I am looking for a file sharing program to install on my dedicated server that will allow me to upload large MP3 files and allow my clients to download them. These files are recordings of counseling sessions for families who are seeking help for their children.
What I am looking for is similar to the system this company uses [URL].
New Apache install; PHP files are downloading rather than displaying. Does anyone know what's causing this? When I compiled PHP with apxs, it automatically added this line:
Code:
LoadModule php5_module modules/libphp5.so
I restarted Apache, and also tried adding this line:
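For reference, the PHP manual's classic companion to that LoadModule line is a MIME/handler mapping; without one, Apache serves .php files as plain downloads even with the module loaded. A sketch for httpd.conf:

Code:
AddType application/x-httpd-php .php

After adding it (in the main config or the relevant Directory block) and restarting Apache, a phpinfo() test page is the quickest way to confirm the handler fires.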
For my project, I'm interested in the Scatterometer Products of Oceansat 2 from an Indian page [URL]
It's no problem for students to get a password to access and download their data for free. Nevertheless, it's quite complicated to download the files by hand, since you have to mark every file individually and click a download button at the end of the page. When I tried it with my script (which is below), an internal server error 500 occurred. I hope you're not too busy and could have a look at the script, where the cookie and IP are entered manually for trial purposes.
The construction of the page is as follows. The address where you have to log in: http://218.248.0.134:8080/OCMWebSCAT...controller.jsp. The address once you're logged in: http://218.248.0.134:8080/OCMWebSCAT...controller.jsp
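Without seeing the script, one generic way to reproduce a logged-in session from the command line is to replay the browser's session cookie; everything below is a placeholder, not the site's real parameters:

Code:
# <session-id> and the file path must come from a logged-in browser session
wget --header="Cookie: JSESSIONID=<session-id>" \
     "http://218.248.0.134:8080/OCMWebSCAT/<path-to-data-file>"

A 500 from the server often means a required form field or header from the normal download flow is missing, so comparing the request against what the browser sends (e.g. via its network inspector) is usually the next step.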
Just during the last three days, when running Update Manager (clicking Check, downloading all available updates, then clicking Install Updates), the Downloading Files window pops up but it is blank, and it stays blank. The little rotating icon indicates something's going on, but no: nothing has happened, even after a couple of hours. Trying to close Downloading Files brings up a window offering to Force Quit. Then trying to close Update Manager has no effect. To shorten the story, finally a window pops up saying that the "AT SPI Registry" is not responding. I'm sure trying the same thing again will not yield different results. I'm running 10.04.
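As a way to make progress (and to see where it actually hangs), the same update run can be done from a terminal, bypassing the Update Manager GUI entirely:

Code:
sudo apt-get update
sudo apt-get upgrade

If that completes cleanly, the problem is in the front end rather than in the package system itself.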
I can SSH to the server, but it won't let me edit files, even though I have basic text editors like gedit and Notepad installed on my Windows computer. Does anyone have an idea what the problem is? I get an error message like this:
(gedit:23978): Gtk-WARNING **: cannot open display
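That warning means gedit is a graphical program trying to open a window with no X display available over the SSH session. Two usual ways out, sketched with placeholder names:

Code:
# forward X11 (needs an X server running on the Windows side, e.g. Xming)
ssh -X user@server
gedit /path/to/file &

# or avoid X entirely with a terminal editor that runs on the server
nano /path/to/file

Note that Notepad on Windows can't edit remote files by itself; something like WinSCP's built-in editor over SFTP is the usual substitute if a terminal editor won't do.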