Networking :: Using Kermit To Transfer Files With A 3-Server System
Mar 23, 2011
Here's the system:
1 server running regular Ubuntu, on a scientific balloon 40 km above the surface of the earth, behind an Iridium modem (RUDICS) connected to its serial port
1 server on the ground running Ubuntu Server
1 intermediate server used for contacting the Iridium system from the ground server
I'm not sure all of the above details are necessary, but I included them for completeness. I would like to be able to log into the balloon server and transfer files in both directions. The procedure for connecting is to telnet to the intermediate server and then issue some modem commands to call the balloon. The balloon server has getty running on the serial port connected to the modem. The way I have figured out to transfer files is to run kermit on the ground server, connect to the balloon server through the intermediate server, then run kermit on the balloon server and put it in file-server mode with the server command.
However, there is some sort of timeout or something, and only a few kB of any file get transferred before the connection is broken. After that, the ground server seems to be trying to get the file from the intermediate server (which has no useful files on it at all). The file-transfer screen stays open and it keeps retrying until I type ^C. I don't know whether there is a kermit command for detecting that the connection has dropped, or some switch to make the transfer stop automatically once it has stalled.
I have been reading about No Kermit Server (NKS) protocol, which seems to be designed for a system like this where the connection is across a third server. Is this likely to do a better job of keeping the connection open and the file transfer going? How can it be implemented? Is there any kermit command to determine from the ground server whether the connection is actually still open? Is there any way of telling whether the connection goes all the way to the balloon server or whether it ends at the intermediate server? I actually just learned about kermit today.
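For what it's worth, C-Kermit does have settings intended for slow, lossy links like this one. A minimal, untested sketch of both ends, assuming the connection through the intermediate hop is already up (the file name is a placeholder; Ctrl-\ c is C-Kermit's default escape back to its prompt):
Code:
; on the balloon server, at the C-Kermit prompt, just: server
; on the ground server, after escaping back from CONNECT:
set parity none
set flow none
robust              ; built-in macro: cautious settings for bad connections
set streaming off
set retry 20        ; allow more retransmissions per packet before giving up
get telemetry.dat   ; placeholder file name
statistics          ; afterwards, shows what the transfer actually did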
On a related note, is it possible to have the balloon server running getty on the serial port but still have the port accessible for reading and writing by, say, a python script (which could use the modem to dial down to the ground when it isn't in use)? It doesn't seem to work but I'm wondering if there is a way. Is there a way to temporarily stop getty, then restart it, or is this potentially hazardous? Keep in mind there will be no way to contact it if something goes wrong since it will be 40km above the earth.
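On Ubuntu releases that start getty from an upstart job, it can in principle be stopped and restarted around the script's use of the port. This is entirely untested and clearly risky on a machine that can never be touched again; the job name /etc/init/ttyS0.conf and the script name are assumptions:
Code:
sudo stop ttyS0        # frees the serial port (assumed upstart job name)
python dialdown.py     # hypothetical script that opens /dev/ttyS0 itself
sudo start ttyS0       # hand the port back to getty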
I'm carrying out a project for my university (CIT in Cork, Ireland) and I'm using CentOS running over VMware. I have a server and a client. The server has no GUI (command-line UI) while the client has one. I need to install a Simple Forum Machine application and I'm told to FTP the files into the server. I figured the best option is to load the files onto the client via the GUI and then FTP them to the server. How do I transfer the files from the client to the server using FTP? I'm totally new to Linux, so the more details the better. Also, I'm trying to mount a USB key on the server but have had no luck.
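A hedged sketch of both tasks, assuming an FTP daemon (e.g. vsftpd) is already running on the server; the IP address, paths and device name are examples only:
Code:
ftp 192.168.1.10
# at the ftp> prompt: binary, then cd /var/www/forum, then mput *
# on the server, to mount the USB key (check dmesg for the real device):
mkdir /mnt/usb
mount /dev/sdb1 /mnt/usb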
I have a Sun server that can only be configured via a serial interface. It has one serial port with an RJ-45 connector. However, my laptop has no serial interface. Is there a way to use C-Kermit or any other software to establish a serial connection via the ethernet port? It would have to be some kind of virtual serial port that transmits the information over the ethernet port without encapsulating it in ethernet frames.
I have two computers on the same network that I need to link together to transfer files: one is a web server, the other is a Minecraft server. The problem is that the file transfer will be constant, as the Minecraft server constantly updates files on the web server, and I don't want the traffic to go to the router and then come back to the web server. I want to add a second network card to each computer, link them together, and use this second connection to transfer the files. Is that possible?
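It is possible; the direct link just needs its own small subnet. A sketch, with interface names and addresses assumed (note that most gigabit NICs auto-negotiate, so an ordinary cable usually works):
Code:
# on the web server (second NIC assumed to be eth1)
ifconfig eth1 10.0.0.1 netmask 255.255.255.0 up
# on the minecraft server
ifconfig eth1 10.0.0.2 netmask 255.255.255.0 up
# then aim the recurring transfer at the direct link, e.g.
rsync -av /srv/minecraft/stats/ user@10.0.0.1:/var/www/stats/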
I run a web server with CentOS 5 and would like to change hardware. I run a 1U Supermicro 6014T server with 4 x 500 GB drives in RAID 6 and would like to downgrade to a smaller but more efficient server. The problem is this: I'd like to transfer all the content and the whole OS to the new system, but how do I do that?
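One common approach, sketched below and untested here: boot the new box from a live CD, partition and mount its disks, and copy the old system over the network. Separate /boot partitions, fstab device names and the bootloader all need manual attention afterwards:
Code:
# run on the new server, with the target root mounted at /mnt (assumed)
rsync -avxH --numeric-ids root@oldserver:/ /mnt/
# then edit /mnt/etc/fstab for the new disks and reinstall grub on the new drive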
I have a RHEL 5 system running vsftpd. If I do a put to the box, files larger than 1 GB fail around the 1 GB point. Smaller files don't have a problem. If I do a get from the box there is no problem. Both the FTP server and client are plugged into a Netgear switch. I'm trying to create an FTP server and web server so my friends and I can upload our vacation pictures. Ultimately I want my friends to be able to access my server. Right now I'm just trying to make it work between my two machines.
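A few cheap things worth ruling out on the RHEL box while a large put is running; the /var/ftp path is an assumption, and xferlog only exists if vsftpd's transfer logging is enabled:
Code:
df -h /var/ftp             # is the target filesystem filling up?
ulimit -f                  # any per-process file-size limit in effect?
tail -f /var/log/xferlog   # vsftpd's transfer log during the failure
dmesg | tail               # kernel or filesystem complaints at the 1 GB point?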
When trying to transfer files from my server to my desktop, the files never get fully transferred. I have Windows cmd running a ping to my server, and all is fine until I try to transfer files. Some will transfer fine, and then out of nowhere the server will stop replying. I then have to restart the server to regain access to it. A 2 GB file will transfer about halfway, and then the server can no longer be pinged. I have static IPs on the server and my desktop. Running XP x64 and Ubuntu Server 32-bit 9.04. Tried using Samba and NFS exports. I have both my desktop and server connected to the same gigabit switch, and the switch connected to my router.
I have a server with a private IP and no public IP. I want to transfer files to it. I can log in to it through SSH. So basically I installed vsftpd on the server with a public IP and tried to FTP to the public IP from the private machine, but it is not working.
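Since SSH to the private machine already works, scp (or sftp) rides on that same connection and needs no extra daemon at all; a sketch with placeholder names:
Code:
scp localfile.tar.gz user@PRIVATE_IP:/home/user/   # push a file to the server
scp user@PRIVATE_IP:/var/log/messages .            # pull a file from it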
I need to transfer a large number of files from my Linux laptop to my Windows desktop machine. Can I connect the two computers with a simple crossover cable and just navigate into the Windows machine and move the files manually? If not, what's the best way to do this? I don't want to burn a bunch of disks.
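A crossover cable can work; both ends just need addresses on the same subnet. A hedged sketch, with the addresses, share name and user name all assumed:
Code:
# Linux side of the cable
sudo ifconfig eth0 192.168.100.1 netmask 255.255.255.0
# set the Windows adapter to 192.168.100.2 / 255.255.255.0 and share a folder,
# then copy from the Linux side:
smbclient //192.168.100.2/shared -U winuser
# at the smb: \> prompt: prompt off, then mput *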
I've found what I needed by searching this site, so I haven't had a need to sign up, since I can't really help anyone as of yet. With that said, here is my problem: I'm running a VPS with CentOS RHEL 5 host-in-a-box. I just did a rebuild of the server, and after a day or two pure-ftpd and sshd unexpectedly close out any incoming connections. I am the only one that uses ssh and ftp, so I'm not sure what the problem could be. I checked the logs and there is nothing about failing to bind to the address.
I tried connecting through ssh in verbose mode and it connects to the server just fine, but drops the connection before it asks me for my key passphrase. If I enable password access, it drops before it asks me for the password. I've tried restarting sshd and ftpd. I've tried rebooting the machine. I've tried Google, but this problem seems to need a little more specific troubleshooting. I can get in through console access, but that doesn't help me much when I need to transfer files.
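Since console access still works, one standard move is to watch sshd from there while a connection attempt comes in; both commands below are ordinary OpenSSH/CentOS practice, with the spare port an arbitrary choice:
Code:
tail -f /var/log/secure      # watch the auth log during a connection attempt
/usr/sbin/sshd -d -p 2222    # second sshd in debug mode; connect to port 2222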
I use Ubuntu Lucid and use the terminal to access my virtual server (GoDaddy - Red Hat Fedora Core 6). Using the terminal and entering ssh [account name]@IP gets me there, and I can then manage the server.
But how do I transfer files between my Ubuntu machine and the Fedora server? I want to email a file from the server to someone using Evolution.
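scp can do both directions from the local Ubuntu terminal (not from inside the SSH session); a sketch reusing the post's own placeholders:
Code:
scp '[account name]@IP:/path/to/remote/file' ~/Desktop/   # server -> laptop
scp ~/Desktop/somefile '[account name]@IP:/home/account/' # laptop -> server
Once the file is local, it can be attached to an email in Evolution as usual.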
I currently have two computers (one Windows, one Linux) connected to each other via a crossover ethernet cable. Each computer can see the other and I can ping both ways. Also, I can ssh into the Linux box from Windows (PuTTY, Cygwin) as well as ssh into my Windows machine from the Linux box. Here's the problem: I can send files from my Linux machine to my Windows machine with no problems doing this:
Now, it seems like everything went fine. However, when I look in /home/jqweezy, that_file.txt is not there.
P.S. I don't know if this helps, but here is some extra info: the Linux machine has only one NIC. The Windows machine has two NICs; one is set up for automatic network detection, the other for communication with the Linux machine via the crossover cable (see above).
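The actual command is missing from the post, but one classic cause of exactly this symptom - scp reports success yet nothing appears on the far side - is a missing colon after the host, which turns the "remote copy" into a local one. An illustration with a made-up address:
Code:
scp that_file.txt jqweezy@192.168.0.2                 # wrong: creates a local file named "jqweezy@192.168.0.2"
scp that_file.txt jqweezy@192.168.0.2:/home/jqweezy/  # right: the colon makes it a remote copy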
When I try to copy a file from a shared folder on another laptop, all of the data passes through the router. This affects the internet bandwidth within the network. Is there a way to access the shared files without necessarily going through the router and without affecting the internet connectivity?
I am using remote boot via PXE to boot a remote machine. That much works, but when I load the files via NFS, I cannot transfer /lib to my station. I recompiled the kernel with all the NFS options built in. Thanks.
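For comparison, a minimal NFS export on the PXE server might look like the sketch below; the path and subnet are assumptions:
Code:
# /etc/exports on the PXE/NFS server
/srv/nfsroot 192.168.0.0/24(ro,no_root_squash,no_subtree_check)
# re-export and verify what the server is offering
exportfs -ra
showmount -e localhost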
I have two CentOS 5 servers that I'm trying to transfer files between. They're on the same LAN switch, same subnet and everything. So far, everything I've attempted has failed, but scp still exits with a return code of 0. It only displays a line of *** and exits immediately. It's almost as if the file transfers instantly, but no file actually gets copied. Here is the verbose output from scp:
My main PC is this Fedora 10 PC. I have two other PCs that run different Linux distros from time to time. What is the basic setup to share and transfer files between the two or three PCs? They are connected through a 2Wire modem/router.
Do I need Samba installed, or is that only needed to network with a Windows PC?
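Samba mainly matters for talking to Windows; between Linux machines the ssh-based tools are usually enough. A sketch with example addresses (sshfs is a separate package):
Code:
scp pictures.tar.gz user@192.168.1.65:/home/user/   # one-off copy
sshfs user@192.168.1.65:/home/user ~/remote         # mount the other PC's home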
I have Fedora 12 (with all the latest patches, including the 2.6.31.6-162 kernel) installed on a new Supermicro SYS-5015A-H 1U Server [Intel Atom 330 (1.6GHz) CPU, Intel 945GC NB, Intel ICH7R SB, 2x Realtek RTL8111C-GR Gigabit Ethernet, Onboard GMA950 video]. This all works great until I try to transfer a large file over the network, then the computer hard locks, forcing a power-off reset.
Some info about my setup:
Code:
[root@Epsilon ~]# uname -a
Linux Epsilon 2.6.31.6-162.fc12.i686.PAE #1 SMP Fri Dec 4 00:43:59 EST 2009 i686 i686 i386 GNU/Linux
[root@Epsilon ~]# dmesg | grep r8169
r8169 Gigabit Ethernet driver 2.3LK-NAPI loaded
[code]....
I'm pretty sure this is an issue with the r8169 driver (what I'm seeing is somewhat reminiscent of the bug reported here). The computer will operate fine for days as a (low volume) web server, and is reasonably stable transferring small files, but as soon as I try to transfer a large file (say during a backup to a NAS or an NFS share), the computer will hard lock (no keyboard, mouse, etc.) at some point during the transfer. It doesn't seem to matter how the file is transferred (sftp, rsync to an NFS share, etc.).
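A workaround often suggested for r8169 lockups of this kind is disabling the NIC's hardware offloads; untested on this particular board, and eth0 is assumed:
Code:
ethtool -K eth0 tso off gso off   # turn off TCP/generic segmentation offload
# if the lockups stop, that points firmly at the driver/NIC
Another option sometimes reported to help is Realtek's out-of-tree r8168 driver in place of the in-kernel r8169.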
I've been using Ubuntu for about two years now, but still have trouble with some of the finer workings of Linux. I have a laptop that I use for general computing, and a desktop hooked up to a TV as a sort of remote backup/HTPC. A problem I run into is that when I transfer files, they get transferred with the owner set to the original computer's account, and I can't do anything until I open a remote viewer and gksudo nautilus to change the permissions of the file. I looked at articles about permissions, UIDs, GIDs, and umask, but can't figure out how to apply them to my situation.
I thought about doing something with groups but am not sure exactly what, and anyway, default group settings only give read access; what I'm really looking for is the ability to manipulate files and folders across the entire /home dir on my desktop from my laptop. The desktop is running 8.04 and the laptop is running 9.10. BTW, I am currently sharing through smbfs. I read that this has been replaced by cifs, but at the moment I would prefer not to mess with things if I don't need to.
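With cifs (and most smbfs versions) the ownership shown locally can simply be forced at mount time, which may be all that's needed here; the share name and the uid/gid of 1000 are assumptions:
Code:
sudo mount -t cifs //desktop/home /mnt/desktop \
    -o username=me,uid=1000,gid=1000,file_mode=0664,dir_mode=0775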
I'm trying to automate the transfer and processing of files between two systems to help test and compare a new server installation. The workflow is a bit complex but I'm basically modifying a script on server 'A' to push a file to server 'B' as standard input to another script.
[Code]...
But no luck. I've tried it without the port in the server_args parameter and without the '-l' option; I've also tried setting the server parameter to 'tcpd' with the call to '/bin/nc' in server_args. No success. Can anyone point out what I'm doing wrong with the config? P.S. I've restarted xinetd, and server B is listening on port 1112 and accepting connections - but nothing gets piped into the script on server B.
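Without the original config it's hard to say what's wrong, but for comparison, a minimal xinetd service of this shape might look like the sketch below (the service name, user and script path are assumptions). xinetd hands the accepted socket to the server program as stdin/stdout, so no nc is needed on the receiving side at all:
Code:
# /etc/xinetd.d/filesink on server B
service filesink
{
    type        = UNLISTED   # port 1112 isn't in /etc/services
    port        = 1112
    socket_type = stream
    protocol    = tcp
    wait        = no
    user        = nobody
    server      = /usr/local/bin/process_file.sh
}
# on server A, pushing a file into the script:
cat somefile | nc serverB 1112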
I'm looking for the most secure possible solution for transferring data with rsync over the Internet between two Linux servers. I have three options: SSH, IPsec and Kerberos. Which one, in your opinion, is the most secure solution?
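For what it's worth, rsync's usual transport is simply SSH, which is encrypted end to end and needs no extra infrastructure beyond the existing sshd:
Code:
rsync -avz -e ssh /data/ user@remote.example.com:/backup/data/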
I want to connect a laptop running Ubuntu 10.04 to a laptop running Windows 7 via a direct connection in order to transfer files like music, documents, pictures, etc. I have an ethernet cable that I thought I would need in order to do it. Is that even possible? If so, how would I go about doing it?
Now, I have tried to share the files wirelessly, but for some reason, when I pick up the workgroup on the Ubuntu laptop and enter the password to connect to the Windows laptop, it says my password is wrong, when I know for a fact that it is not. I know I can transfer files with a flash drive and whatnot, but I want to try to get this working.
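One common Windows 7 gotcha is that the share expects the machine name prefixed to the user name. It can be tested from the Ubuntu command line before fighting the GUI; the IP and names below are assumptions:
Code:
smbclient -L //192.168.1.100 -U 'WIN7-LAPTOP\username'   # lists shares if the credentials are right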
I have set up a master DNS server at 192.168.50.9 and a slave DNS at 192.168.50.6. Both servers are BIND9. The machines are for testing/experimenting, hence the IP addresses. Initially, the zone transfer was blocked by the firewall on the master, as the slave uses randomly selected non-privileged ports for the zone-transfer query. So, as far as I understand, there are two possible approaches:
1. Allow connections based on source, which should be
Code:
-A RH-Firewall-1-INPUT -p tcp -m state --state NEW,ESTABLISHED -s 192.168.50.6 --sport 1024:65535 --dport 53 -j ACCEPT
(and it works fine for me)
2. Allow ESTABLISHED and RELATED connections, which would be something like
Code:
-A RH-Firewall-1-INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
which was my initial idea but didn't work, but it has inspired me to dig deeper into firewall configuration topics :).
Question: Does the zone-change notification message count as opening a dialog, or are the notification from the master and the slave's zone-update request two completely separate actions? If the latter is true, that, of course, explains why option #2 didn't work.
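One way to settle it empirically is to watch the wire on the master and trigger a transfer by hand from the slave; the zone name below is a placeholder:
Code:
tcpdump -i eth0 host 192.168.50.6 and port 53   # watch NOTIFY vs. AXFR connections
dig @192.168.50.9 example.com AXFR              # request the zone directly from the slave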
In order to debug a problem, I want to list as many reasons as possible for a server (a SOAP server running on Fedora 10) sending RST in the middle of a packet transfer. Please note that in my case, the SYN and ACK of the initial handshake went through. The server received the request from the client and started the data transfer, but then suddenly sent a RST in the middle of it.
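A packet capture on the server makes it possible to see exactly what precedes the RST; the interface and port below are assumptions to adjust for the actual SOAP service:
Code:
tcpdump -i eth0 -s 0 -w rst.pcap host CLIENT_IP and port 8080
# open rst.pcap in wireshark and look at the packets just before the RST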
This is a recent problem, and I can't pinpoint any change/upgrade that would cause it. Rsync transfer from client to server:
Code:
sent 11756196 bytes  received 1032741 bytes  138258.78 bytes/sec
total size is 144333466390  speedup is 11285.81
Pinging back and forth between the machines is fine. ifconfig shows no errors on the client, but the server has RX packet errors.
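The RX errors on the server are worth chasing first; two standard checks, with eth0 assumed:
Code:
ifconfig eth0 | grep -i errors   # error/dropped/overrun counters
ethtool -S eth0 | grep -i err    # per-driver error statistics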
I am working on a cluster for a molecular dynamics class and I have to edit my FORTRAN code (only the newest and best for me!). In order to get through to the cluster I have to ssh in. The network on which the cluster resides is behind a firewall, so I have to ssh through the firewall into the network first.
This is fine; I can log in and move files and folders as needed, including sftp-ing into host 1 and then into the cluster, so I can transfer files from the cluster to the host and then from the host to me. This gets rather tiresome, so it would be nice to edit the files in place.
The problem is that when I access my code with emacs it launches the emacs client on Host 1, with no mouse support. I know the purists will howl about how I should be using keyboard shortcuts, but I am a chemist and not a programmer, so the mouse is very nice for me. Is there any way I can perhaps mount the cluster using sshfs so that when I open my code it launches a local instance of emacs? Sorry if this is the wrong forum, but I thought it was network related.
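sshfs can tunnel through the firewall host with an ssh ProxyCommand, so the cluster's files appear as a local directory and a local graphical emacs can edit them in place. The host names are placeholders, and this assumes sshfs on the laptop and nc on the firewall host:
Code:
sshfs -o ProxyCommand='ssh firewallhost nc %h %p' me@cluster:/home/me ~/cluster
emacs ~/cluster/md_code.f90   # hypothetical file, now edited with a local emacs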
I ran a script on a Unix machine and want to copy the results to a Windows machine. The two machines are on different networks. On the Linux machine, trying to FTP to the Windows machine gives "connection refused". How do I check whether FTP is running on that Linux machine or not? I also tried scp and ssh, and both are failing.
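Checking for a listening FTP daemon on the Linux machine is quick; vsftpd is assumed as the daemon name on a RHEL-style system:
Code:
netstat -tlnp | grep :21   # is anything listening on the FTP port?
service vsftpd status      # check/start the daemon if it's installed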