Networking :: Automate The Transfer And Processing Of Files Between Two Systems?
Oct 30, 2010
I'm trying to automate the transfer and processing of files between two systems to help test and compare a new server installation. The workflow is a bit complex but I'm basically modifying a script on server 'A' to push a file to server 'B' as standard input to another script.
[Code]...
But no luck. I've tried it without the port in the server_args parameter and without the '-l' option; I've also tried setting the server parameter to 'tcpd' with the call to '/bin/nc' in server_args. Still no success. Can anyone point out what I'm doing wrong with the config? PS. I've restarted xinetd, and server B is listening on port 1112 and accepting connections - but nothing gets piped into the script on server B.
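A minimal sketch of what the receiving side's xinetd entry could look like. When xinetd launches the server program the socket is already attached to its standard input, so the receiving script can read the data directly and no nc is needed in server_args; the service name, user and script path below are only placeholders.
Code:
service filedrop
{
    type        = UNLISTED
    port        = 1112
    socket_type = stream
    protocol    = tcp
    wait        = no
    user        = nobody
    server      = /usr/local/bin/receive.sh
    disable     = no
}
# on server A, push the file into the script's stdin (hostname is a placeholder)
nc -w 3 serverB 1112 < datafile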
View 2 Replies
Jul 30, 2010
I've inherited the following Virtual Machine scenario and am new to Linux Administration and Patch Management. The Host Operating System is Windows 2003 Enterprise, which has VMware Server 2.0.2 installed. Under the VMware Server 2.0.2 I have a Ubuntu 32-bit OS web server running Apache2 Web Services. When I log onto the Ubuntu server (9.10 32-bit) I see the following two lines just above the new mail/last logon lines.
85 packages can be updated
55 updates are security updates
I would like to see at least a summary of each update and its urgency so I can notify the various developers/server owners and get their input on whether we should apply that particular update to the server. We apply the patches in our test/dev environment first; once they are vetted there, we roll them out to our production servers. What I am looking for is a way to automate gathering this information and, once approval has been received, to automate the actual patching process so that I do not have to run apt-get manually for each approved package.
Ideally I would like a recommendation for a GUI-based package to manage this process, one capable of generating the appropriate reports for the 'powers that be' on the current security/patch-management situation. For proof of concept I would like a free version that is not hamstrung in functionality, with a production version that is not too costly and has no limitations.
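As an interim, command-line way to get that summary, a sketch using apt's dry-run mode; the 'Inst' lines name each pending package and the archive it comes from, so security updates can be picked out by their '-security' origin:
Code:
# list every pending update without changing anything
apt-get -s upgrade | grep '^Inst'
# only the ones coming from the security pocket
apt-get -s upgrade | grep '^Inst' | grep -- '-security'
# once a set has been approved, apply it (per-package approval would instead
# pass explicit package names to apt-get install)
sudo apt-get update && sudo apt-get upgrade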
View 4 Replies
View Related
Feb 13, 2011
I need to transfer a large number of files from my Linux laptop to my Windows desktop machine. Can I connect the two computers with a simple crossover cable and navigate into the Windows machine and move the files manually? If not, what's the best way to do this? I don't want to burn a bunch of disks.
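One possible starting point: give each end of the crossover link a fixed address, then use the Linux machine's sshd with an scp/SFTP client (such as WinSCP) on the Windows side. The addresses below are only examples:
Code:
# on the Linux laptop
sudo ifconfig eth0 192.168.77.1 netmask 255.255.255.0 up
# set the Windows side to 192.168.77.2 / 255.255.255.0 by hand, then test
ping -c 3 192.168.77.2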
View 2 Replies
View Related
Jul 21, 2010
I want to transfer files through FTP on my Fedora 13 system. When I type ftp in the terminal, it shows the error "command not found". Why?
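The command-line FTP client is packaged separately on Fedora, so it is most likely just not installed; a sketch:
Code:
su -c 'yum install ftp'
# lftp is a more capable alternative
su -c 'yum install lftp'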
View 6 Replies
View Related
Feb 13, 2009
I currently have two computers (one windows one linux) connected to each other via a crossover ethernet cable. Now, each computer can see each other and I can ping both ways. Also, I can ssh into the linux box from windows (putty, cygwin) as well as ssh into my windows machine from the linux box. Here's the problem: I can send files from my linux machine to my windows machine with no problems doing this:
scp this_file.txt windows_user@10.10.10.11:/cygdrive/c
however, I can't seem to send from windows (cygwin) to the linux machine. Here is what I see at cygwin x-term prompt:
$ scp that_file.txt jqweezy@10.10.10.10:/home/jqweezy
jqweezy@10.10.10.10's password:
Executing /etc/profile ...
Now, it seems like everything went fine. However, when I look in /home/jqweezy, that_file.txt is not there.
p.s. Don't know if this helps, but here is some extra info. Linux machine has only one NIC. Windows has two NICs, one NIC is setup for automatic network detection the other is setup for communication with linux machine via crossover cable (see above).
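The 'Executing /etc/profile ...' line is a hint: if a login script on either end prints to stdout during a non-interactive session, scp's protocol stream is corrupted and the transfer silently fails. A quick check and the usual guard, assuming bash on the receiving account:
Code:
# this should print nothing at all; any output here can break scp
ssh jqweezy@10.10.10.10 true
# if something is printed, wrap the offending echo in the startup file so it
# only runs for interactive shells
if [ -n "$PS1" ]; then
    echo "some banner"
fi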
View 2 Replies
View Related
Jul 8, 2010
When I try to copy a file from a shared folder on another laptop, all of the data passes through the router. This affects the internet bandwidth within the network. Is there a way to access the shared files without going through the router and without affecting internet connectivity?
View 1 Replies
View Related
Jun 17, 2011
I am booting a remote machine via PXE. That part works, but when I load the files via NFS I cannot transfer /lib to my station. I have already recompiled the kernel with all the NFS options built in. Thanks.
View 1 Replies
View Related
Dec 17, 2010
I would like to transfer files from my PC to my account at a sftp server, and I don't know how to do it.
My PC is running with:
User: User1
Address: 10.0.2.3
My sftp account is:
User: SFTPUser1
Address: sftp-server
I can access the sftp server with the command:
Code:
sftp SFTPUser1@sftp-server
The sftp server doesn't answer to ssh requests.
How can I transfer files to the sftp server?
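A sketch of the upload itself: inside an sftp session the put command sends files (put -r needs a reasonably recent OpenSSH), and a batch file does the same thing non-interactively. The file names below are only examples:
Code:
sftp SFTPUser1@sftp-server
sftp> put report.txt
sftp> put -r results
sftp> quit
# scripted upload via a batch file, since the server only speaks SFTP
echo "put report.txt" > upload.batch
sftp -b upload.batch SFTPUser1@sftp-server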
View 7 Replies
View Related
Apr 11, 2009
I have two CentOS 5 servers that I'm trying to transfer files between. They're on the same LAN switch, same subnet and everything. So far, everything I've attempted has failed, but scp still exits with a return code of 0. It only displays a line of *** and exits immediately. It's almost as if the file transfers instantly, but no file actually gets copied. Here is the verbose output from scp:
scp -v kickstart.tar ****@192.168.xxx.xxx:/home/****
Executing: program /usr/bin/ssh host 192.168.xxx.xxx, user ****, command scp -v -t /home/****
OpenSSH_4.3p2, OpenSSL 0.9.8e-fips-rhel5 01 Jul 2008
[code]....
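One thing worth checking, as a guess: a row of asterisks printed by the remote account's startup files (a banner echoed from ~/.bashrc, for instance) corrupts scp's protocol stream even though scp still exits 0. On a clean setup this command prints nothing at all:
Code:
ssh ****@192.168.xxx.xxx /bin/true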
View 4 Replies
View Related
Mar 15, 2009
My main pc is this Fedora 10 pc. I have two other pcs that run different Linux distros from time to time. What is the basic setup to share and transfer files between the 2 or 3 pcs? They are connected through a 2wire modem/router.
Do I need Samba installed, or is that only needed to network with a Windows PC?
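Samba is only needed when a Windows machine is involved. Between Linux boxes, ssh-based tools are enough once sshd is running on the target; the hostnames, users and paths below are placeholders:
Code:
scp -r ~/Documents user@otherpc:/home/user/
# or mount a remote directory locally over ssh (Fedora package: fuse-sshfs)
sshfs user@otherpc:/home/user ~/otherpc-home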
View 14 Replies
View Related
Dec 9, 2009
I have Fedora 12 (with all the latest patches, including the 2.6.31.6-162 kernel) installed on a new Supermicro SYS-5015A-H 1U Server [Intel Atom 330 (1.6GHz) CPU, Intel 945GC NB, Intel ICH7R SB, 2x Realtek RTL8111C-GR Gigabit Ethernet, Onboard GMA950 video]. This all works great until I try to transfer a large file over the network, then the computer hard locks, forcing a power-off reset.
Some info about my setup:
[root@Epsilon ~]# uname -a
Linux Epsilon 2.6.31.6-162.fc12.i686.PAE #1 SMP Fri Dec 4 00:43:59 EST 2009 i686 i686 i386 GNU/Linux
[root@Epsilon ~]# dmesg | grep r8169
r8169 Gigabit Ethernet driver 2.3LK-NAPI loaded
[code]....
I'm pretty sure this is an issue with the r8169 driver (what I'm seeing is somewhat reminiscent of the bug reported here). The computer will operate fine for days as a (low volume) web server, and is reasonably stable transferring small files, but as soon as I try to transfer a large file (say during a backup to a NAS or an NFS share), the computer will hard lock (no keyboard, mouse, etc.) at some point during the transfer. It doesn't seem to matter how the file is transferred (sftp, rsync to NFS share, etc.).
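A workaround that is sometimes reported for r8169 lockups, though not a confirmed fix for this exact board, is to turn off the hardware offloads; the interface name below is an assumption:
Code:
ethtool -K eth0 tso off gso off gro off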
View 10 Replies
View Related
Feb 14, 2010
I've been using Ubuntu for about 2 years now, but still have trouble with some of the finer workings of Linux. I have a laptop that I use for general computing, and a desktop hooked up to a TV as a sort of remote backup/HTPC. A problem I run into is that when I transfer files, they arrive with the owner set to the original computer's account, and I can't do anything until I open a remote viewer and gksudo nautilus to change the permissions of the file. I looked at articles about permissions, UIDs, GIDs and umask, but can't figure out how to apply them to my situation.
I thought about doing something with groups, but am not sure exactly what; in any case, default group settings only give read access, and what I'm really looking for is the ability to manipulate files and folders across the entire /home dir on my desktop from my laptop. The desktop is running 8.04 and the laptop is running 9.10. BTW, I am currently sharing through smbfs. I read that this has been replaced by cifs, but at the moment I would prefer not to mess with things if I don't need to.
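One sketch of a workaround: mount the share with explicit ownership and permission options so everything copied over it is usable by the local account. Shown here with cifs, but smbfs accepts similar uid/gid options; the share name is a placeholder, and 1000 is only the typical uid/gid of the first Ubuntu user (check with id):
Code:
sudo mount -t cifs //desktop/homeshare /mnt/desktop -o username=me,uid=1000,gid=1000,file_mode=0664,dir_mode=0775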
View 3 Replies
View Related
Mar 23, 2011
Here's the system:
1 server running regular Ubuntu, 40km above the surface of the earth on a scientific balloon behind an iridium modem (RUDICS) connected to its serial port
1 server on the ground running Ubuntu server
1 intermediate server used for contacting the iridium system from the ground server
I'm not sure if all the above details are completely necessary but I included them for completeness. I would like to be able to log into the balloon server and transfer files in both directions. The procedure for connecting is to telnet to the intermediate server and then to issue some modem commands to call the balloon. The balloon server is set with getty running on the serial port connected to the modem. The way I have figured out to transfer files is to run kermit on the ground server and connect to the balloon server through the intermediate server, then run kermit on the balloon server, and set it as a file server with the server command.
However, there is some sort of timeout or something, and only a few kB of any file gets transferred before the connection is broken. After that it seems like the ground server is trying to get the file from the intermediate server (which has no useful files on it at all). The file transfer screen stays open and it keeps trying and trying to transfer, until I type ^C. I don't know if there is a way of detecting through a kermit command whether the connection is still open or if there is some sort of switch to make the transfer automatically stop once it has stalled.
I have been reading about No Kermit Server (NKS) protocol, which seems to be designed for a system like this where the connection is across a third server. Is this likely to do a better job of keeping the connection open and the file transfer going? How can it be implemented? Is there any kermit command to determine from the ground server whether the connection is actually still open? Is there any way of telling whether the connection goes all the way to the balloon server or whether it ends at the intermediate server? I actually just learned about kermit today.
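For the stalling transfers, conservative C-Kermit settings are usually the first thing to try on a high-latency, lossy link; these are starting guesses from the standard command set, not values tested against this setup:
Code:
; in kermit on the ground server, before connecting
set parity none
set flow none
set carrier-watch off
robust
set receive packet-length 400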
On a related note, is it possible to have the balloon server running getty on the serial port but still have the port accessible for reading and writing by, say, a python script (which could use the modem to dial down to the ground when it isn't in use)? It doesn't seem to work but I'm wondering if there is a way. Is there a way to temporarily stop getty, then restart it, or is this potentially hazardous? Keep in mind there will be no way to contact it if something goes wrong since it will be 40km above the earth.
View 6 Replies
View Related
Feb 27, 2009
I'm carrying out a project for my university (CIT in Cork, Ireland) and I'm using CentOS running over VMware. I have a server and a client. The server has no GUI (command-line UI) while the client has a GUI. I need to install a Simple Forum Machine application and I'm told to FTP the files onto the server. I figured out that the best option is to download the files on the client via the GUI and then FTP them to the server. How do I transfer the files from the client to the server using FTP? I'm totally new to Linux, so the more details the better. Also, I'm trying to mount a USB key on the server but have had no luck.
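A sketch of both tasks; the address and paths are placeholders. If sshd is already running on the CentOS server (it normally is), scp avoids setting up FTP at all; otherwise an FTP daemon has to be installed and started on the server first:
Code:
# copy from the client to the server over ssh
scp -r forum_files/ root@192.168.1.10:/var/www/html/forum/
# or, if FTP is required, on the server:
yum install vsftpd
service vsftpd start
# mounting a USB key on the server (the device name varies; check dmesg | tail)
mkdir -p /mnt/usb && mount /dev/sdb1 /mnt/usb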
View 2 Replies
View Related
Mar 29, 2011
I need to tar these logs, but I don't know how to make this simpler for myself. These five logs are created every day, and at the end of the month I need to make tar files for every day from these files.
For example
Until now I have tarred them manually (copying every file).
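A rough sketch of one way to do it per day, assuming the log names contain the date (e.g. app-2011-03-29.log); the pattern has to be adjusted to the real naming scheme:
Code:
for day in $(seq -w 1 31); do
    files=$(ls *-2011-03-${day}.log 2>/dev/null)
    [ -n "$files" ] && tar czf logs-2011-03-${day}.tar.gz $files
done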
View 2 Replies
View Related
Jul 28, 2010
Here is what I currently do and want to automate:
1) I manually enter a particular web site address in the browser.
2) When the page displays on my machine, it shows a number of links I need to visit, one after the other.
3) Each of these links displays another page (file) from which I "cut and paste" information. I do this by manually highlighting the wanted info, clicking "copy", then selecting an open file on my computer, selecting "undo" if necessary to remove any previous content, clicking "paste" and then "save".
4) I then call a Yabasic program that reads the saved file and trims unwanted info.
5) At the completion of the Yabasic program, I click the web page tag, click the "back" button to return to the first page (since I am in the second) and click the next link in this first page till all links have been visited.
6) visit the next known web site and repeat 1 to 5
In an automated program, what I need to do is:
1) Visit the known page of the web site showing the links
2) save the page showing the links (the first page)
3) make a list of those links
4) visit each link one after the other and save its page from which I will programmatically (Yabasic) select info.
5) repeat 1 to 4
I can do this in Yabasic (which can issue Linux commands) or PHP, although I do not know PHP much. The purpose of this is to associate towns and cities of the world with their respective political/geographical divisions and their respective time zones. This has to be done often because that data changes regularly.
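Since Yabasic can shell out to Linux commands, here is a rough sketch of steps 1-4 using wget: fetch the index page, pull out its links, and download each linked page for later processing. The URL is a placeholder, and relative links would still need the site prefix prepended:
Code:
mkdir -p pages
wget -q -O index.html "http://example.org/towns/"
grep -o 'href="[^"]*"' index.html | sed 's/^href="//;s/"$//' > links.txt
while read -r url; do
    wget -q -P pages/ "$url"
done < links.txt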
View 3 Replies
View Related
Jun 20, 2010
I want to connect a laptop running Ubuntu 10.04 to a laptop running Windows 7 via a direct connection in order to transfer files like music, documents, pictures, etc. I have an ethernet cable that I thought I would need in order to do it. Is that even possible? If so, how would I go about doing it?
Now, I have tried to share the files wirelessly, but for some reason, when I pick up the workgroup on the Ubuntu laptop and enter the password to connect to the Windows laptop, it says my password is wrong, when I know for a fact that it is not. I know I can transfer files with a flash drive and whatnot, but I want to try to get this working.
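Once the two machines can reach each other over the cable (modern NICs usually auto-sense, so a plain cable often works too), the Windows 7 share can be reached directly from the Ubuntu side; the address, share name and user below are placeholders:
Code:
smbclient -L //192.168.137.1 -U windowsuser
smbclient //192.168.137.1/Users -U windowsuser -c 'recurse; prompt; mget Music'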
View 1 Replies
View Related
Sep 30, 2009
Description: I am a newly appointed system engineer taking care of Linux servers. We have a new set of data coming in which needs the configuration below. How do I write a script for this?
for files matching "*.txt" in sm:
copy each of the files to the folders sm1 and sm2 (log every copy)
if successful:
remove the original
write an entry to the log file
if not successful (i.e. one particular file could not be copied to all the folders):
retain the file and retry
write an entry to the log file
mail the admin with that particular file name
I have already tried a bit:
cd /export/home/
for dir in sm1 sm2; do
cp -p sm/*.txt $dir/
done
Is my start right? How do I do the rest?
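A rough sketch of the whole flow; the log file, admin address and the exact retry behaviour (here simply left for the next run) are assumptions:
Code:
#!/bin/bash
SRC=/export/home/sm
LOG=/export/home/copy.log
ADMIN=admin@example.com

for f in "$SRC"/*.txt; do
    [ -e "$f" ] || continue
    ok=1
    for dir in /export/home/sm1 /export/home/sm2; do
        if cp -p "$f" "$dir/"; then
            echo "$(date '+%F %T') copied $f to $dir" >> "$LOG"
        else
            echo "$(date '+%F %T') FAILED copying $f to $dir" >> "$LOG"
            ok=0
        fi
    done
    if [ "$ok" -eq 1 ]; then
        rm -f "$f" && echo "$(date '+%F %T') removed original $f" >> "$LOG"
    else
        # file is retained, so the next run retries it automatically
        echo "copy failed for $f, file retained" | mail -s "copy failure: $f" "$ADMIN"
    fi
done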
View 6 Replies
View Related
Aug 14, 2011
I would like to back up important files (totaling about 400GB) on my ext4 RAID 5 array to an ext4 external hard drive over USB (the external drive is mounted at /mnt). In the future I'd like to automate the process using rsync and cron, so for now I'm using rsync to transfer the files. My problem is that when I use the rsync command like this: # rsync -Pr "/dir1" "/dir2" "/dir3" "/dir4" /mnt
rsync shows me the checks and transfers for a while and then throws up an I/O error (wish I had a screenshot to show, but I don't). When I ls /mnt I get a similar I/O error. I then check /dev for the drive and find that it no longer shows up. Originally the partition was /dev/sdc1. I tried unplugging the USB drive at this point, plugging it back in, and mounting it back to /mnt; however, it has now been assigned to (you guessed it) /dev/sdd1. I get the drive mounted and try the original rsync command again, hoping the first error was a fluke or some kind of one-time drive hiccup. This time it makes it quite a bit further and then throws up the exact same problem. Am I doing something terribly wrong here? As I said, I'm very new to bash, so I hope I'm not making some absolutely moronic newbie mistake.
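A few checks worth running, as a sketch; the device name may well have changed again by the time these are run, and -a is used instead of -r because it also preserves ownership and timestamps, which suits a future cron backup:
Code:
# right after a failure, the kernel log usually says why the drive dropped
dmesg | tail -n 50
# verify the external filesystem before retrying
sudo umount /mnt
sudo fsck.ext4 -f /dev/sdd1
# -P implies --partial, so an interrupted run can pick up where it stopped
rsync -aP "/dir1" "/dir2" "/dir3" "/dir4" /mnt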
View 9 Replies
View Related
Apr 30, 2011
I have a bunch of .7z files in a directory, and I need to put each one of them into a separate directory named after the file (without the extension). The command line I use:
Code:
find . -type f | mkdir `sed -e "s:..(.*)...:1:"` ; ls | grep .7z | cp * `sed -e "s:(.*)...:./1/:"`
Copying fails though:
[Code]....
PS. I don't want to use scripts, I want to do it using simple commands and piping.
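A one-liner that stays within those constraints might look like this (a sketch; swap cp for mv if the originals should be moved instead of copied):
Code:
for f in *.7z; do d="${f%.7z}"; mkdir -p "$d" && cp -- "$f" "$d"/; done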
View 5 Replies
View Related
May 12, 2010
Doing "ls -a" provides a listing of the hidden files. To process all the .bak files you can use *.bak, for:
that.bak
this.bak
But how do you process all the hidden files like:
.that.bak
.this.bak
What is the equivalent of *.bak for only the hidden files?
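Two possibilities, sketched for bash: include the leading dot in the pattern, or tell the shell that globs should match dotfiles too:
Code:
ls .*.bak
# or, for the current bash session
shopt -s dotglob
ls *.bak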
View 6 Replies
View Related
Aug 23, 2010
I am looking for suggestions, if possible, regarding processing files with a Perl script. The scenario: I have a location where new files are always being added. I need to process these files for some validation. I wrote a Perl script to do this, and I thought I could rename the files once they are processed so that I don't process the same files again. But now I can't rename the files due to some restrictions. My second thought was to process them based on the date stamp, but as my Perl script is automated and runs every hour to process the files, I can't go by date stamp.
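One common approach is to keep a small state file listing what has already been handled and skip anything in it on the next run. Sketched here in shell to show the idea (process_file and the paths are placeholders); the same logic ports directly into the Perl script:
Code:
PROCESSED=/var/tmp/processed.list
touch "$PROCESSED"
for f in /data/incoming/*; do
    grep -qxF "$f" "$PROCESSED" && continue
    process_file "$f" && echo "$f" >> "$PROCESSED"
done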
View 5 Replies
View Related
May 6, 2011
I'm running apache2 and I installed php5 with yast. httpd2 -M states that php5 is loaded. /etc/apache2/conf.d/php5.conf is being included in httpd.conf and it contains:
Code:
<IfModule mod_php5.c>
AddHandler application/x-php .php
AddHandler application/x-httpd-php .php4
AddHandler application/x-httpd-php .php5
[Code]....
The first line was added by me since that's what mime.types actually contains.
php.ini is in /etc/php5/apache2 and since I was not sure apache/php was finding it I added a PHPIniDir "directive" into httpd.conf. I have not changed it.
Test file is the typical /srv/www/htdocs/info.php with <?php infophp(); ?>
Normal index.htm is working fine. php -a is working fine.
I spent a long afternoon around this configuration and I looked in plenty of pages for solutions. I only do occasional system administration so I might have easily overlooked something trivial, but I run out of ideas.
There is a thread in this web site with a similar problem but no solution:
Php5 not recognized in SUSE 10.2 Apache2
What could be missing or wrong if php5 is loaded and the AddHandler is defined? How can I test further?
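For comparison, a sketch of how the handler and test page commonly look; note in particular that the test function is phpinfo(), not infophp(), so the snippet quoted above would fail even with a correctly configured module:
Code:
# /etc/apache2/conf.d/php5.conf
<IfModule mod_php5.c>
    AddHandler application/x-httpd-php .php
    DirectoryIndex index.php
</IfModule>
# /srv/www/htdocs/info.php
<?php phpinfo(); ?>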
View 9 Replies
View Related
Mar 1, 2010
New to Fedora (coming from Windows), I am up and running OK with packages from the repository, but only half OK with Processing, the Java graphics programming front end from processing.org. Their download gave me a .tgz file which the package manager extracted for me into a location of my choice, where there is now a "processing" shell script. This works OK and I have managed to create a launcher on the desktop. That starts OK, but always with Processing's default action of giving you a new and automatically named work file. In Windows an existing Processing file (.pde file) could be "opened with" Processing. Trying to do the same in Fedora, I find that I am expected to nominate an application to open it with, but Processing has not been installed as an application. I guess the question is: how do I promote Processing to be an application? Or is there a different approach?
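A sketch of one way to register it: a desktop entry pointing at the extracted launcher script makes Processing show up as an application in the "open with" dialog. The Exec path is a placeholder, and whether Processing actually opens the file passed as %f depends on the version:
Code:
# ~/.local/share/applications/processing.desktop
[Desktop Entry]
Type=Application
Name=Processing
Comment=Processing IDE
Exec=/home/me/processing/processing %f
Terminal=false
Categories=Development;IDE;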
View 4 Replies
View Related
Jan 20, 2010
I'm unable to install any package on my Ubuntu; apt-get terminates with the following error message. I'm using Ubuntu Intrepid 8.10.
Code:
baipaneni@Baipaneni:~$ sudo apt-get install python-tagpy
[sudo] password for baipaneni:
Reading package lists... Done
Building dependency tree
[Code]....
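Without the full error text it is hard to say what is actually wrong, but these are the usual first steps when apt-get aborts mid-install, offered only as a sketch:
Code:
sudo dpkg --configure -a
sudo apt-get -f install
sudo apt-get update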
View 3 Replies
View Related
Apr 12, 2011
Ubuntu 10.04 with an HP d530 SFF. From lspci:
00:1f.5 Multimedia audio controller: Intel Corporation 82801EB/ER (ICH5/ICH5R) AC'97 Audio Controller (rev 02)
When trying to do sudo apt-get install -f, I get this:
Errors were encountered while processing:
libdirac-encoder0
libfaac0
[code].....
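As a guess only, since the full error output is truncated above: reinstalling just the packages named in the error sometimes clears a failed configure step:
Code:
sudo apt-get install --reinstall libdirac-encoder0 libfaac0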
View 1 Replies
View Related
Nov 8, 2010
I managed to get Ethernet over FireWire working between my Windows XP desktop and my Ubuntu 10.10 laptop. However, I am getting tired of having to manually issue the ifconfig command every time. How can I automate it so that the command is run at boot?
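One option on Ubuntu is a static stanza in /etc/network/interfaces so the link comes up at boot. The interface name and addresses below are assumptions; the real name shows up in ifconfig -a:
Code:
auto firewire0
iface firewire0 inet static
    address 192.168.2.1
    netmask 255.255.255.0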
View 1 Replies
View Related
Mar 4, 2010
I am working on a cluster for a molecular dynamics class and I have to edit my FORTRAN code (only the newest and best for me!). In order to get through to the cluster I have to ssh in. The network on which the cluster resides is behind a firewall, so I have to ssh through the firewall into the network first.
This is fine: I can log in and move files and folders as needed, including sftp-ing into host 1 and then into the cluster, so I can transfer files from the cluster to the host and then from the host to me. This gets rather tiresome, though, so it would be nice to edit the files in place.
The problem is that when I access my code with emacs it launches the emacs client on Host 1, with no mouse support. I know the purists will howl about how I should be using keyboard shortcuts, but I am a chemist and not a programmer, so the mouse is very nice for me. Is there any way I can perhaps mount the cluster using sshfs so that when I open my code it launches a local instance of emacs? Sorry if this is the wrong forum, but I thought it was network related.
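sshfs can do exactly that if ssh is told to hop through the firewall host transparently; a sketch, with host names, user and file names as placeholders:
Code:
# ~/.ssh/config on the local machine
Host cluster
    HostName cluster.internal
    User chemist
    ProxyCommand ssh firewallhost nc %h %p
# then mount the cluster locally and edit with a local emacs
sshfs cluster:/home/chemist ~/cluster
emacs ~/cluster/md_code.f90 &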
View 3 Replies
View Related
Jul 6, 2010
I ran a script on a Unix machine and want to copy the results to a Windows machine. Both machines are on different networks. When I try to FTP from the Linux machine to the Windows machine, it gives 'connection refused'. How do I check whether FTP is running on that Linux machine or not? I also tried scp and ssh; both are failing.
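Two quick checks, sketched with placeholder addresses; 'connection refused' usually means nothing is listening on the FTP port of the machine being contacted:
Code:
# from the Linux machine: is anything answering on the target's FTP port?
nc -zv 192.168.1.20 21
# on the Linux machine itself: is a local FTP daemon listening?
netstat -tlnp | grep ':21 '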
View 6 Replies
View Related
Nov 28, 2010
For test purposes I'm running Ubuntu 10.10 from a USB stick with a USB WLAN stick. The system finds the WLAN router and acquires an IPv6 address, but no IPv4 address. With "dhclient wlan0", though, it does get an IPv4 address and connects OK. How do I set up the system so that this dhclient call is made by the system at startup (or whenever it is necessary)?
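One way, assuming NetworkManager is left out of the picture, is a DHCP stanza in /etc/network/interfaces; the wireless association itself (SSID/key settings via wpa_supplicant) is not shown here:
Code:
auto wlan0
iface wlan0 inet dhcp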
View 1 Replies
View Related