Ubuntu Networking :: File Uploading Using SFTP In Filezilla?
Apr 27, 2010
I have set up a local server for testing on my home network and installed OpenSSH. I can log in using FileZilla over SFTP and can even download files, but uploads fail with error messages saying it cannot find the directory (the directory I am trying to upload).
Do I need to configure OpenSSH to allow this? I am using my usual Ubuntu login. Maybe I need to set up another user for SFTP.
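One way to see what is going on is to log in with the command-line sftp client and check where you land and whether the target directory really exists on the server; the host, user, and directory names below are placeholders:
Code:
sftp myuser@192.168.1.10   # same credentials FileZilla uses
sftp> pwd                  # the remote directory you start in
sftp> ls -la               # does the target directory exist, with this exact name/case?
sftp> mkdir uploads        # create it if it is missing (needs write permission)
sftp> put testfile.txt uploads/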
I can't connect to SourceForge with SFTP from FileZilla. I'm using the correct username [URL].. and password (the same one that lets me log in to web administration), so why do I keep getting "Authentication failed; Critical error"?
I have been provided with a PPK file to connect to a remote server using SFTP, along with a username and password. I created a user, test, and created a .ssh directory in their home directory. I put the PPK file in there and renamed it id_rsa (though I'm pretty sure that isn't correct). When I SFTP using sftp test@remoteserver.com I get prompted for the key passphrase, which I don't know and presume is blank, so I just hit Enter. Then it prompts me for the user password, which I give it, and it falls down there. I used the -v option to see where it stops, which tells me:
debug1: Trying private key: /home/test/id_rsa
debug1: PEM_read_PrivateKey failed
I presume this is because sftp expects a PEM key instead of a PPK?
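OpenSSH's sftp cannot read PuTTY .ppk files, which is what the PEM_read_PrivateKey failure is saying. Assuming the command-line puttygen tool is available (putty-tools package on Ubuntu), the key can be converted to OpenSSH format; the filenames are examples:
Code:
puttygen provided-key.ppk -O private-openssh -o /home/test/.ssh/id_rsa
chmod 600 /home/test/.ssh/id_rsa
sftp test@remoteserver.com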
I have a Windows Vista laptop with a 20 GB file that I want to transfer to my Ubuntu desktop using FileZilla. I installed the FileZilla client on both computers. For the host, do I type the IP address of the Ubuntu desktop on the Vista machine, and the Vista machine's IP address on the Ubuntu desktop, using the same port number on both? With the Vista machine at 192.168.0.3 and the desktop at 192.168.0.6, on port 22, I am getting a message like:
Status: Connecting to 192.168.0.6
Response: fzSftp started
Command: open "name@192.168.0.6" 22
Error: Connection timed out
Error: Could not connect to server
I hit Quickconnect on both sides and they both give me the same result (obviously adjusted for the IP address). Could someone help me figure out what the heck I am doing? I have not used FileZilla before.
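FileZilla on both machines gives you two clients and no server, so neither side has anything to connect to. A minimal sketch, assuming the Ubuntu desktop (192.168.0.6) is to act as the SFTP server and the Vista laptop only runs the FileZilla client:
Code:
# on the Ubuntu desktop
sudo apt-get install openssh-server
# then, in FileZilla on the Vista laptop:
#   Host: sftp://192.168.0.6   Port: 22
#   User/password: your normal Ubuntu login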
I am using Ubuntu Karmic Koala with the Firefox web browser. Whenever I try to upload files to some file hosting sites, Firefox hangs. I think those file hosting sites use some kind of uploader that requires Java/JavaScript.
Let's say we have machine 1 and machine 2. Machine 2, which is Linux based, is receiving a file (let's say cat.jpg) sent via SCP from machine 1 (it doesn't really matter which operating system). How can I know on machine 2 that the file has finished uploading from machine 1? Please note that by "knowing" I mean by running a command (via a cron job that runs every minute), not in an event-driven way ("make the scp issue a command when it is finished uploading"). Another limitation I have is that I can't change anything on machine 1 or in the way it SCPs files.
If it really matters: the files being transferred are video files with the MP4 extension, and machine 2 is Ubuntu 10.10.
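Since machine 1 can't signal completion, one common trick on the receiving side is to treat a file as done when nothing holds it open any more: while scp is still writing, the receiving process on machine 2 keeps the file open, and lsof will report it. A minimal cron-friendly sketch, with all paths as placeholders:
Code:
#!/bin/bash
# check-done.sh - hypothetical sketch run from cron on machine 2
# A file still being written is held open by the receiving sshd/scp process,
# so if lsof finds nothing holding it, treat the upload as finished.
WATCH_DIR=/srv/incoming      # where the files arrive (placeholder)
DONE_DIR=/srv/incoming/done  # where finished files are moved (placeholder)
for f in "$WATCH_DIR"/*.mp4; do
    [ -e "$f" ] || continue
    if ! lsof -- "$f" > /dev/null 2>&1; then
        mv "$f" "$DONE_DIR"/
    fi
done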
I got an error while uploading a file named "xxxxxxxx.php" to a Linux server.
Error in detail:
Upload of file 'xxxxxxxx.php' was successful, but error occurred while setting the permissions and/or timestamp. If the problem persists, turn on 'Ignore permission errors' option.
Permission denied.
Error code: 3
Error message from server: Permission denied
Request code: 9
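That message generally means the file data arrived but the follow-up chmod/utime failed, often because the logged-in account doesn't own the existing remote file, or the server refuses those operations. If you have shell access, a quick check; the path and user below are placeholders:
Code:
# who owns the file and its directory?
ls -l /var/www/xxxxxxxx.php
ls -ld /var/www
# if another account owns it, hand it to the account you upload with
sudo chown uploaduser:uploaduser /var/www/xxxxxxxx.php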
I am trying to write a shell script to automate the process of uploading a file to an FTP server using the built-in FTP commands in Ubuntu Server (Lucid). In order to connect I can use the following:
Code:
ftp wsbeorchids.org.uk
Name (wsbeorchids.org.uk:danielgroves): USERNAME
Password: PASSWORD
I need to pass my username and password in at the prompts. How should I go about doing this? I have tried echoing the values without success. Please note that I am something of an amateur with scripting.
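The stock ftp client ignores credentials echoed on stdin during auto-login; a common pattern is to disable auto-login with -n and supply them through a here-document. USERNAME, PASSWORD, and the file name are placeholders, and a script containing a password should be kept chmod 700:
Code:
#!/bin/bash
# ftp_upload.sh - minimal sketch of a scripted FTP upload
HOST=wsbeorchids.org.uk
USER=USERNAME
PASS=PASSWORD

ftp -inv "$HOST" <<EOF
user $USER $PASS
binary
put backup.tar.gz
bye
EOF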
I just installed Fedora 12 on a Core i3 machine... everything looks fine, but I have a huge problem: every time I upload a file (using FTP or SFTP) some weird characters are included inside the file. For example:
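Without the example it's hard to say what the stray characters are, but comparing the local copy with what actually landed on the server narrows it down quickly; if plain FTP in ASCII mode turns out to be the culprit, forcing binary mode is the usual fix. The file names below are placeholders:
Code:
cmp original.txt copy-downloaded-back.txt                         # identical, or not?
diff <(xxd original.txt) <(xxd copy-downloaded-back.txt) | head   # where and what differs
# in the classic ftp client, force binary transfers before uploading:
#   ftp> binary
#   ftp> put original.txt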
I have a problem with my FileZilla (or my FTP server). When I want to connect to my FTP server (and also other FTP servers!), after the MLSD command I get a "Connection timed out" error.
Log:
Code:
Status: Resolving address of khanemashroote.ir
Status: Connecting to 46.4.196.109:21...
Status: Connection established, waiting for welcome message...
Response: 220 ProFTPD 1.3.4rc2 Server (Debian) [::ffff:46.4.196.109]
Command: USER badihi
Response: 331 Password required for badihi
.....
Command: MLSD
Error: Connection timed out
Error: Failed to retrieve directory listing
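A listing that times out after a successful login points at the data connection (passive ports) rather than authentication, and since it happens with other FTP servers too, the client side is worth testing first: try switching FileZilla's transfer mode to Active, or check the local firewall/router. If it turns out to be your own ProFTPD server, fixing the passive range server-side is the usual approach. The directive names are real ProFTPD directives; the values are examples that also need to be opened on the server's firewall:
Code:
# /etc/proftpd/proftpd.conf (excerpt, example values)
PassivePorts       49152 50000
MasqueradeAddress  khanemashroote.ir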
I have a dynamic IP connection, so I got a hostname from [URL]. Now I want to configure an FTP server with it. I am choosing the following options in the network configuration wizard:
1. Default transfer mode: passive
2. Use server's external IP instead
3. Get external IP address from the following address: XXXXXX.dyndns.org
4. Ask operating system for a port
5. Test result:
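The wizard's test result isn't quoted above, but passive mode behind a dynamic-IP NAT only works if a fixed passive port range is set in the server and forwarded on the router alongside port 21. A quick external check, assuming a range such as 50000-50100 was chosen (the hostname and values are placeholders):
Code:
# run from a machine outside your own network
nc -vz XXXXXX.dyndns.org 21       # control connection
nc -vz XXXXXX.dyndns.org 50000    # one port from the forwarded passive range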
Desktop and laptop, both with static IPs, can ping each other with no issue; both run 10.04 LTS. I am attempting to transfer files using FileZilla (which worked when I tried it about two years ago!) and the overall response is 'No route to host'. I have scoured the interwebs and found no solution. I'm pretty sure I'm putting in the correct details.
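'No route to host' between machines that can ping each other is often a host firewall rejecting the port rather than a real routing problem. A hedged check on the machine being connected to (port 22 assumed for SFTP; substitute 21 for plain FTP):
Code:
sudo ufw status verbose            # is a firewall active?
sudo ufw allow 22/tcp              # allow the port FileZilla is using
sudo netstat -tlnp | grep :22      # is anything actually listening there?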
I have a Philips SNU5600 wifi dongle. My wifi network isn't particularly strong, but it has always been reliable, even when downloading and uploading large files such as ISOs to the internet.
I recently bought a NAS, and when I run rsync or try and copy a large file (eg 6 gigs) to the NAS, the connection will eventually fail.
The error message in /var/log/debug is:
Code:
I have read elsewhere that this is a problem with the Minstrel algorithm, so I have tried disabling that by creating /etc/modprobe.d/80211.conf with the following line in it:
Code:
And when I reboot, I am able to run "cat /sys/module/mac80211/parameters/ieee80211_default_rc_algo" and the output is "pid".
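The actual contents of 80211.conf are not shown above; for reference, the line that selects the PID rate-control algorithm (matching the "pid" value read back from sysfs) would normally look like the following. Treat it as an assumption about what the file contains, not a quote from the post:
Code:
# /etc/modprobe.d/80211.conf  (presumed contents)
options mac80211 ieee80211_default_rc_algo=pid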
However, in my /var/log/debug, there is still the line:
Code:
Here is the output from a few other commands for extra information:
Code:
- The problem has only been around since I got the NAS, and only happens when copying files TO the NAS. Downloading/uploading to the internet doesn't cause the same problem.
- This problem happens on two machines, both with the same wifi dongle.
- This problem doesn't happen on a laptop running the Intel iwl3945 driver, nor with machines connected via cables, so it is not the NAS which is at fault.
Trying to sftp (get) a file, and am getting the following message:
spawn sftp -oPort=10022 jn000JN@sftp.section111.cms.hhs.gov
Connecting to sftp.section111.cms.hhs.gov...
The authenticity of host 'sftp.section111.cms.hhs.gov (204.76.173.42)' can't be established.
DSA key fingerprint is 66:64:07:cc:39:89:56:2b:3b:4c:fd:cc:3d:2a:7a:9c.
Are you sure you want to continue connecting (yes/no)?
Is this an issue with keys? Where are the keys stored on an sftp client? I am running this sftp script from a different directory than normal, if that matters.
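That prompt concerns the server's host key rather than your own key pair; accepted host keys are cached per user in ~/.ssh/known_hosts, regardless of the directory the script is run from. For an unattended script, the key can be accepted once ahead of time (the host and port below are copied from the transcript):
Code:
# accept the host key interactively once...
sftp -oPort=10022 jn000JN@sftp.section111.cms.hhs.gov
# ...or pre-seed it non-interactively
ssh-keyscan -p 10022 sftp.section111.cms.hhs.gov >> ~/.ssh/known_hosts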
Running: Red Hat Enterprise Linux Server 5.2 (Tikanga). I need to be able to automate transferring a few files from one server to another using scp or the sftp protocol. I have received a text file which looks like a key file, along with username and passphrase information for the target server in question.
Instructions were given to me to import the provided text file into PuTTYgen, then save the imported key as a private key to be used by scp or sftp. My assumption is that this is for Windows utilities, which I am not using. My frustration comes in trying to automate logging into this server via sftp or scp to automate some file transfers. I am asked for a password every time because the public and private key methods failed to find my keys. How can I call scp or the sftp utilities and use the provided key file (the one I generated using PuTTYgen or the original one provided to me) to log in to this server? I've tried taking the generated PPK file from PuTTYgen and adding it with the ssh-add command, but that still did not work.
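One approach, assuming the Linux command-line puttygen (from the putty package) is available: convert the PPK to an OpenSSH-format private key once, then point scp/sftp at it explicitly. Filenames and hosts below are placeholders; if the key itself has a passphrase, an unattended job still needs ssh-agent or a passphrase-free copy of the key.
Code:
puttygen provided-key.ppk -O private-openssh -o ~/.ssh/partner_key
chmod 600 ~/.ssh/partner_key
scp  -i ~/.ssh/partner_key report.txt user@target.example.com:/incoming/
sftp -oIdentityFile=~/.ssh/partner_key user@target.example.com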
Weird problem: I have set up SSH on my 10.04 server. I can PuTTY to it over my LAN from my Win 7 box, but when I try to SFTP I get "connection timed out".
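Since interactive SSH to the same box works, a hedged first step is to rule out the server and look at the SFTP client's settings (port, protocol). Two quick checks on the server itself:
Code:
sftp localhost                              # does the SFTP subsystem answer at all?
grep -i '^subsystem' /etc/ssh/sshd_config   # is an sftp subsystem configured?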
I use the following code to send a make-up file daily via sftp with expect. When I run it from the command line, it has no problem sending to the remoteSERVER side, but when it runs via a crontab task, it does not do the put, so it does not execute the batch file defined with the -b option.
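Scripts that work from a shell but not from cron usually trip over cron's minimal environment (different PATH and HOME, no tty). A hedged example of a crontab entry that sets the basics and uses absolute paths everywhere; the script and log paths are placeholders, and the -b batch file inside the expect script should be referenced by an absolute path too:
Code:
# crontab -e  (run daily at 02:15 and capture all output for debugging)
15 2 * * * HOME=/home/myuser PATH=/usr/bin:/bin /usr/bin/expect /home/myuser/bin/send_makeup.exp >> /home/myuser/log/send_makeup.log 2>&1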
I need to upload a file to a business partner of ours in another country. Currently they have an SFTP server set up for us that I am using to download a daily generated file from a previous requirement. I use a bash script to download it since it's fairly simple.
sftp's manual page only gives a hint about using a batch file, but I still cannot get it to work. Does anyone know another way, or can you suggest another method or application? It seems like a bit of a cop-out that you can EASILY download using the sftp command but can't upload.
EDIT - forgot to mention I already have password-less login set up using SSH keys.
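With key-based login already in place, a batch-mode upload is normally just a one-line batch file passed with -b. A minimal sketch; the host, remote directory, and file names are placeholders:
Code:
#!/bin/bash
# sftp_upload.sh - scripted upload via sftp batch mode
BATCH=$(mktemp)
cat > "$BATCH" <<EOF
cd /incoming
put /path/to/daily_report.csv
EOF
sftp -b "$BATCH" ouruser@partner.example.com
rm -f "$BATCH"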
What PuTTY SFTP syntax is required to copy a file from computer 10.0.2.2, on user t0p's Desktop (e.g. /t0p/Desktop/file.txt), to the XP computer? The PuTTY instructions seem to make no sense to me.
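Run from the XP machine with PuTTY's command-line copy tool pscp, the transfer is a single command; the remote path below assumes the file actually lives under /home/t0p/Desktop, and the local target is a placeholder:
Code:
rem from a Windows command prompt on the XP box
pscp t0p@10.0.2.2:/home/t0p/Desktop/file.txt C:\temp\file.txt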
Where is the config file for the SFTP bit? At the moment it shows all the hidden (dot) files and I don't want it to. Don't laugh: I had just configured my ProFTPD for this, and then realised, hang on, this isn't the program that dishes out SFTP!
I'm trying to establish a connection between two laptops using sftp but am getting the following error message:
Connecting to <IP>...
ssh: connect to host <IP> port 22: Connection refused
Couldn't read packet: Connection reset by peer
ftp isn't working either. Both machines are running Ubuntu and connect to the internet through the same wifi router in case that's relevant. What could be the problem?
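"Connection refused" on port 22 almost always means no SSH server is listening on the machine being connected to (Ubuntu ships the ssh/sftp client by default, but not the server). A minimal sketch:
Code:
# on the laptop you are connecting TO
sudo apt-get install openssh-server
sudo service ssh status        # confirm sshd is running
# then, from the other laptop
sftp youruser@<IP>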
I'm using FileZilla to connect to a remote server over a site-to-site VPN. Even when I'm sending a small file with SFTP, the connection times out and reconnects; it's happening again and again. An SCP connection behaves the same way, BUT SSH CONNECTION IS WORKING FINE.
Filezilla log
ravindika@ravindika:~$ tail -f filezilla.log
2010-07-07 10:35:12 2690 2 Response: fzSftp started
2010-07-07 10:35:12 2690 2 Command: open "root@XXX.XXX.XXX.XXX" 22
2010-07-07 10:35:19 2690 2 Command: Trust new Hostkey: Once
2010-07-07 10:35:21 2690 2 Command: Pass: *******
2010-07-07 10:35:25 2690 2 Status: Connected to XXX.XXX.XXX.XXX
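Interactive SSH surviving while SFTP/SCP transfers stall is a classic MTU/fragmentation symptom on VPN links: small packets pass, full-size ones are silently dropped. A hedged way to test from the client is to shrink the ping payload until it gets through, then lower the interface MTU accordingly (the address and interface are placeholders):
Code:
ping -M do -s 1472 XXX.XXX.XXX.XXX   # 1472 bytes + 28 header = 1500 MTU
ping -M do -s 1372 XXX.XXX.XXX.XXX   # try smaller values if the first is dropped
sudo ifconfig eth0 mtu 1400          # example: lower the MTU on the client interface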
I really hope someone can help me with this problem; I've been stuck on it for a month. I am using the sftp command to upload files from a bash script. The problem is that it is extremely slow to do it this way, as many of you would know if you have a shared server somewhere. I would use scp if the remote server supported it, but it doesn't.
Anyway, if any of you have ever used FileZilla: in Settings, under "Transfers", there is a place where you can set the "maximum simultaneous transfers". This feature works wonders with SFTP (and FTP too); it really speeds things up. How do I accomplish the same thing with the sftp command, because I don't want to use a GUI? I wouldn't even mind using FileZilla through the command line if possible, but that does not seem to be possible. I've been stuck on this for a month!!! I've searched everywhere and tried a lot of things to no avail...
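The stock sftp client moves one file at a time, but FileZilla-style parallelism can be approximated from a script by running several batch-mode sessions at once, or with lftp, which speaks SFTP and has parallel transfers built in. A sketch of both, with the host, paths, and parallel count as placeholders:
Code:
# option 1: up to 4 sftp sessions at a time via xargs
ls /local/uploads/*.mp4 | xargs -P 4 -I{} \
    sh -c 'echo "put {} /remote/uploads/" | sftp -b - user@host.example.com'

# option 2: lftp reverse-mirrors a directory over SFTP with parallel transfers
lftp -e 'mirror -R --parallel=4 /local/uploads /remote/uploads; quit' sftp://user@host.example.com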
I want to use a cron job to back up my files to my server. When I run the script manually, I get an error while backing up (something about an sftp file being in use or so). I only get this when I'm simultaneously connected to my server with sftp. So to be sure this doesn't happen when I won't be there to look at the log, I would like to know if there is a command to kill all sftp connections. I would put this command in the backup script cron uses.
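Killing leftover sftp sessions before the backup starts can be done with pkill; which process to target depends on which end holds the connection (the sftp client on your machine, or your user's sftp-server processes on the server). A hedged sketch for the top of the backup script:
Code:
pkill -x sftp                       # on the client: end any sftp clients still running
pkill -u "$USER" -x sftp-server     # on the server: end this user's sftp sessions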