Networking :: Script That Downloads And Uploads Files From 1 Ftp To Another?
Apr 9, 2011
I have Debian and want to be able to connect to an FTP server, download some (or all) files, disconnect from this server, connect to another FTP server and upload everything to it (and delete the temporary files on my PC). This should be done from the command line. I am no expert in Linux (although I am accustomed to it). How can I do this, or part of it? In the end I would like to write a script that mirrors my site from one place to another.
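One way to do this from the command line (a minimal sketch, assuming the lftp package is installed; the hosts, logins and /public_html paths are placeholders, not values from the post):
Code:
#!/bin/sh
TMP=$(mktemp -d)
# 1. pull the site down from the first FTP server into a temporary directory
lftp -u user1,pass1 -e "mirror /public_html $TMP; bye" ftp://source.example.com
# 2. push the same files up to the second FTP server
lftp -u user2,pass2 -e "mirror -R $TMP /public_html; bye" ftp://dest.example.com
# 3. delete the temporary copy on the PC
rm -rf "$TMP"
Once this works interactively, the same three steps can go into a cron job to keep the two sites mirrored.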
View 3 Replies
Dec 11, 2010
I've looked around and can't seem to find an answer to this. If I use any of my Places menus (Home, Downloads, etc.), MPlayer starts instead of Firefox and loads all the files into a playlist.
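If folders have become associated with the wrong application, one hedged fix is to point the inode/directory MIME type back at the file manager (nautilus.desktop is an assumption; substitute your own file manager's .desktop name):
Code:
xdg-mime query default inode/directory              # shows which application currently opens folders
xdg-mime default nautilus.desktop inode/directory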
View 1 Replies
Jul 15, 2011
We are a small company running half a dozen servers in a data center. Recently we got charged heavily for over-utilizing our data transfer allowance, so we are looking for a way to measure uploads and downloads on a per-IP and per-port basis. We have a mixed environment (Windows 2008/Ubuntu), so the tool should work for both. I am not sure whether MRTG provides per-port (i.e. per-application) analysis.
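For the Ubuntu servers, plain iptables counters give a rough per-port breakdown without installing anything; this is only a sketch (the Windows 2008 machines would need a separate tool, or something like ntop watching a mirrored switch port):
Code:
# rules with no -j target only count traffic, they never block it; ports 80/21 are examples
sudo iptables -I INPUT  -p tcp --dport 80
sudo iptables -I OUTPUT -p tcp --sport 80
sudo iptables -I INPUT  -p tcp --dport 21
# read the exact byte counters later
sudo iptables -L -v -n -x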
View 4 Replies
Sep 23, 2010
We're having an issue with HTTP POST file uploads on our two Ubuntu PCs. For some reason, whenever one of our users attempts to submit a file in an HTML form, the request times out, usually with a 500 Internal Server Error message. This problem is not limited to one site, but occurs on all sites that use file uploads. Also, the problem does not appear to be with our network, as a Windows 7 PC on the same network can upload files to the same sites without any difficulties. The problem is not browser-specific; we have tested with Firefox, Epiphany, and Google Chrome and all produce the same results. The issue is relatively new, and was first observed within the last month; before this time, both machines had no problems uploading files.
Does anyone have ANY idea what could be causing this? I've tried a number of things, including rebooting the PCs, rebooting the network, disabling IPv6, etc. I'm not very experienced in Linux system administration, but I can use the terminal and am familiar with some terminal-based diagnostic tools, so if you need any additional info or want me to try something, please let me know! I've exhausted my own computer knowledge with regards to finding a solution to this problem.
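One more test worth adding: posting a file with curl from the same Ubuntu PC takes the browser out of the picture entirely (the URL and form field name below are placeholders, not from the original post):
Code:
curl -v -F "userfile=@/home/user/test.jpg" http://example.com/upload.php
If curl times out the same way the browsers do, the problem is in the network stack rather than the browsers.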
View 3 Replies
Sep 20, 2009
I keep downloading tar.gz files into my Downloads folder and I can't do anything with them. What do I need to do to install from such a file so I can use it? As an example, I am trying to install Frets on Fire, and am failing badly.
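A tar.gz is just a compressed archive, so the first step is to unpack it and read its own instructions. The following is a generic sketch for a source tarball (the filename is hypothetical); Frets on Fire may also be installable straight from the Ubuntu repositories instead:
Code:
cd ~/Downloads
tar xzf fretsonfire-1.x.tar.gz        # use the real filename
cd fretsonfire-1.x
less README                           # most tarballs describe their own install steps
# many (not all) source tarballs then build and install with:
./configure && make && sudo make install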
View 9 Replies
Dec 10, 2009
I am using Ubuntu 8.04. When I install software from the internet, the files are downloaded to /var/cache/apt/archives, so I keep those Debian packages in a safe place so that I can install the same software on another stand-alone computer.
My question: when we run 'apt-get update' for the first time after a fresh install, it downloads some index files. Can I store those files and point to them on a networkless computer which has no internet? If this is possible it would allow me to run 'apt-get check' on the stand-alone computer to see if any package is broken.
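The index files that 'apt-get update' downloads live in /var/lib/apt/lists, so one hedged approach is simply to carry that directory across along with the cached .deb files:
Code:
# on the machine with internet access
sudo apt-get update
tar czf apt-lists.tar.gz -C /var/lib/apt lists
# copy apt-lists.tar.gz (plus the .debs from /var/cache/apt/archives) to the offline machine, then:
sudo tar xzf apt-lists.tar.gz -C /var/lib/apt
sudo apt-get check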
View 13 Replies
Sep 1, 2011
There is some issue with the latest version of LinuxDC++ (version 1.2.0~pre1~bzr, core version 0.75). I upgraded to the latest version a few days ago from the Launchpad repos because of frequent crashes and system hang-ups. My complete and incomplete download folders are the same. While my download was going on, LinuxDC++ deleted all the files from the downloads folder.
After about half an hour of tweaking, I updated my database using updatedb and found my files in ~/.dc++/Filelists/anantwqqwe.BMIU2NFCFXB7ERTSG62PRSQPRJIN63A56EEGO6Q. What is that supposed to mean, and why is it doing things this way? This is the second time this has happened to me; the last time I was unable to locate the deleted files on my system, and I suppose they were not there. I use 64-bit Ubuntu 10.10 (Maverick).
View 2 Replies
Aug 29, 2010
curl -L http://URL/file[1-2].txt -o $(date +%m-%d-%y) newfilename{1,2}.txt
Basically, this command goes to the URL and downloads file1.txt and file2.txt, but it saves BOTH files as newfilename1.txt. I would like the script to name the second download (file2.txt) newfilename2.txt. Before you suggest curl's -O switch, please understand that I want to rename the files so that they are not what they were on the server (the names are too long): file1.txt becomes newfilename1.txt, file2.txt becomes newfilename2.txt. Is this possible? The command I listed works only up to the newfilename{1,2}.txt part; it always saves as newfilename1.txt.
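It should be possible: curl substitutes the bracket-glob match into the output name wherever #1 appears, so each remote file gets its own renamed local copy (a sketch against the same placeholder URL, not tested against that server):
Code:
curl -L "http://URL/file[1-2].txt" -o "$(date +%m-%d-%y)-newfilename#1.txt"
Quoting the URL keeps the shell from touching the brackets, so curl itself expands [1-2] and fills in #1.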
View 3 Replies
Dec 26, 2010
In order to download files from a particular website, I have to include a header containing the text of a cookie, to indicate who I am and that I am properly logged in. So the wget command ends up looking something like:
Code:
wget --header "Cookie: user=stringofgibbrish" http://url.domain.com/content/porn.zip
Now, this does work in the sense that the command downloads a file of the right size with the expected name. But the file does not contain what it should: the .zip files cannot be unzipped, the movies cannot be played, etc. Do I need some additional option, like the "binary" mode in the old FTP protocols? I tried installing gwget; it is easier to use, but has no way to include the --header stuff, so the downloads never happen in the first place.
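wget has no binary/ASCII mode to worry about; a likely culprit is the server ignoring the cookie and sending an HTML login or error page under the expected filename. Two quick checks on the downloaded file:
Code:
file porn.zip          # "Zip archive data" means a real zip; "HTML document" means an error page
head -c 300 porn.zip   # peek at the first bytes to see what the server actually sent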
View 3 Replies
Nov 27, 2010
I'm having a very strange problem with my Ubuntu apache2 server running WordPress. I want to download media files (from within an on-site Flash MP3 player or by direct link [url]), but the file transfer just stops after a while (at least sometimes), at random positions. After that I have to clear the browser's cache and try again.
It is really annoying, since it is my band's website and we want to share our songs with our friends. I checked from several clients; it seems to happen everywhere (Linux, Mac or Windows clients).
The system:
Code:
With wget it seems to work...
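One hedged workaround worth testing (an assumption, not a diagnosis): Apache's sendfile/mmap path is known to truncate large static downloads on some setups, and disabling it is harmless:
Code:
printf 'EnableSendfile Off\nEnableMMAP Off\n' | sudo tee /etc/apache2/conf.d/no-sendfile.conf
sudo /etc/init.d/apache2 restart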
View 4 Replies
Feb 15, 2009
I just switched from Xubuntu to Fedora 10 and have to say I will most likely change back, unless someone can tell me how to speed up the internet connection in Fedora. I have tried wireless and wired, and it doesn't matter which I use, the download rate is painfully slow. With wired I get a meg every minute. It really hurts to watch the monitor: it downloads for three seconds, then takes a break for eight seconds, then downloads for three more seconds, and this goes on and on. I have a very fast connection and am used to getting amazing unbroken streams of data at no less than 1 meg every 2 seconds. I hope this is a rectifiable situation.
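Before blaming Fedora itself, it is worth checking whether the kernel is logging link or driver trouble during those pauses (a quick diagnostic, not a fix; eth0 is assumed):
Code:
dmesg | grep -iE 'eth0|link is|NETDEV'   # repeated link up/down or watchdog messages would explain the stop-start pattern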
View 4 Replies
Nov 25, 2010
I have a fresh install of F14, using an Intel DP55WB mobo with integrated NIC. I have access to the network and to the internet. When downloading large files (the F14 ISO or a ..... vid), after ~35 seconds the data stream drops to 0. Other computers on the network do not have this problem. The F14 computer is wired into the router. I have tried a Biostar mobo with integrated NIC with the same results, changed the patch cable, and changed MTUs both higher and lower from the default of 1500; no change.
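One cheap test (an assumption, not a diagnosis): offload features on some onboard NICs cause exactly this kind of mid-transfer stall, and they can be switched off temporarily without harm (eth0 is assumed; the settings revert on reboot):
Code:
sudo ethtool -K eth0 tso off gso off gro off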
View 2 Replies
Mar 20, 2011
We (3 of us) have download limit problems with a 50G/month cap (including uploads). The router I cannot change, and it offers no useful options, so I am considering using IP forwarding from my own box. I think I would need a second NIC and router: eth0 would run a DHCP server, eth1 would run a client. What do I run on the box to monitor downloads and uploads, and is there a way of adding PC and laptop downloads to limit luser downloads? Does this stuff strangle speed? I'm running Slackware 13.1.
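For the monitoring part, vnstat keeps running per-interface totals, which is enough to watch a monthly cap; this is a sketch assuming vnstat is installed and eth1 faces the upstream router (older vnstat syntax):
Code:
sudo vnstat -u -i eth1    # create/update the database for the upstream interface
vnstat -m -i eth1         # monthly totals to compare against the 50G limit
Per-machine accounting and actual throttling would still need iptables counters and tc on the forwarding box.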
View 3 Replies
Jun 14, 2011
I have seen this post and was wondering why that would be so.
Quote: Originally Posted by DaveG
you do not want to download software through the privoxy filters - they could corrupt the data.
Privoxy is supposed to either block or allow specific adverts, why would that corrupt a single file, like an .exe or .tar.gz?
View 1 Replies
Oct 19, 2010
Transmission says my port is closed. If I google the problem, it just gives me advice on how to open a port in Windows. There's no firewall in Ubuntu 10.10 by default, right? There isn't any router in use either; I'm living in a dorm and I just plug the LAN cable into the box fixed to the wall.
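Two quick local checks before blaming the dorm network (a diagnostic sketch only):
Code:
sudo ufw status                            # stock Ubuntu ships ufw installed but inactive, so this should say "inactive"
sudo netstat -plnt | grep -i transmission  # confirm Transmission is actually listening on its peer port
If both look fine, the port is being blocked upstream by the dorm's own router or firewall, which cannot be opened from your side.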
View 3 Replies
May 6, 2009
I used www.speedtest.net to test my download speed through Firefox, first on Vista and then on Ubuntu (9.04); it is a dual-boot machine.
Whilst I get between 18 and 19 Mbps on Vista, I only get 4.5 on Ubuntu. How do I fix this? Is it an issue with my ethernet drivers or my setup?
I am on a 20 Mbps cable modem; I connect via an ethernet cable to a router, which obviously connects to the cable modem.
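Two things worth checking on the Ubuntu side (a diagnostic sketch, assuming the wired interface is eth0):
Code:
sudo ethtool eth0 | grep -iE 'speed|duplex'   # a wrong speed/duplex negotiation is a common cause of throughput loss
lspci -nnk | grep -iA3 ethernet               # shows which kernel driver the NIC is actually using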
View 4 Replies
Apr 6, 2011
I just downloaded Cleanus 1 system sounds. I've got a bunch of .ogg files in my Downloads directory. Where should these files live to be available system-wide?
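System-wide sound themes usually live under /usr/share/sounds; a minimal sketch (the "cleanus" folder name is just a suggestion):
Code:
sudo mkdir -p /usr/share/sounds/cleanus
sudo cp ~/Downloads/*.ogg /usr/share/sounds/cleanus/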
View 2 Replies
Mar 19, 2011
I just did a fresh install of 10.04 on my system and everything runs perfectly and extremely smoothly; however, the update manager and Firefox are taking forever to download (the progress meter reads something like 2d 17hr 23m), even though the internet connection is very fast. The logs are all insanely clean, with the fewest errors I've ever seen, so this shouldn't be too terribly difficult to solve. Also, I seem to be having a problem with the font in Firefox: every fourth letter or so has a bright streak in it. That might be related or not, but my main concern right now is getting my download speed up to par.
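A guess rather than a diagnosis: broken IPv6 connectivity was a frequent cause of absurd download ETAs on 10.04, and disabling it temporarily is a quick way to rule it out (this does not persist across reboots):
Code:
sudo sysctl -w net.ipv6.conf.all.disable_ipv6=1
sudo sysctl -w net.ipv6.conf.default.disable_ipv6=1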
View 1 Replies
May 21, 2011
I have a server and a few computers connected to it via an Airport Extreme, using network cable. When I'm uploading (FTP), i.e. using a lot of the network "space", the other computers on the network get kicked out. What is going on? My Airport Extreme is doing fine, but my other clients just get kicked out. If I pause the upload, everything is okay again. The whole network is 1 gigabit: clients, everything.
[Code].....
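If the uploads can be scripted, one workaround is to cap the upload rate so the link is never fully saturated; a sketch assuming lftp (host, login, paths and the ~500 KB/s figure are placeholders):
Code:
lftp -u user,pass -e 'set net:limit-total-rate 0:500000; mirror -R ./share /upload; bye' ftp://server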
View 4 Replies
Nov 22, 2008
I have a site that users upload files to. It's on a dedicated server with 2 HDDs and the first HDD is 97% full. Is it possible to use the other HDD for the files users upload? If so, how?
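Yes; the usual approach is to mount the second disk somewhere and point the upload directory at it. A sketch assuming the second disk already has a partition at /dev/sdb1 and the uploads live in /var/www/site/uploads (both are placeholders):
Code:
sudo mkfs.ext3 /dev/sdb1                         # skip if the partition is already formatted
sudo mkdir -p /mnt/uploads
sudo mount /dev/sdb1 /mnt/uploads
sudo mv /var/www/site/uploads/* /mnt/uploads/    # move the existing uploads across
sudo mount --bind /mnt/uploads /var/www/site/uploads
# add matching entries to /etc/fstab so both mounts survive a reboot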
View 1 Replies
Sep 16, 2010
Can we create an alert whenever an upload to the FTP server happens? We have Red Hat 5.
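One way is to watch the upload directory with inotify; a sketch assuming inotify-tools is installed (EPEL packages it for RHEL 5) and that uploads land in /var/ftp/pub:
Code:
inotifywait -m -e close_write /var/ftp/pub |
while read dir event file; do
    echo "New FTP upload: $file (in $dir)" | mail -s "FTP upload alert" root
done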
View 1 Replies
May 18, 2011
Sometimes I notice that there are high upload speeds for 10 minutes or so. At the time of the screenshot I was sitting in a public wireless place, only Chromium was open, and I don't see any reason why there should be sustained upload speeds. Is there a GUI or CLI tool so I can find out which process is using the internet?
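nethogs gives exactly that: a live per-process bandwidth table (wlan0 is an assumption; use whichever interface is active):
Code:
sudo nethogs wlan0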
View 1 Replies
Jul 13, 2011
All my music is already synced, but every time a song finishes playing in Banshee, a notify OSD message appears letting me know that song is being uploaded to my Ubuntu One account. I'm running Ubuntu 11.04 32 bit.
View 2 Replies
Feb 10, 2011
What is the least painful way to temporarily prevent uploads to an FTP server by certain accounts? They all upload directly to the home directory set up in /etc/passwd.
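If the server is vsftpd (an assumption), per-user overrides make this reversible without touching the accounts themselves; a cruder alternative is removing write permission on the affected home directories:
Code:
# /etc/vsftpd/vsftpd.conf -- enable per-user override files
user_config_dir=/etc/vsftpd/user_conf

# /etc/vsftpd/user_conf/alice -- "alice" is a hypothetical account name
write_enable=NO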
View 1 Replies
Feb 3, 2009
I upload picture files to an FTP server. I can't do much about my upload speed, but I think a multi-threaded upload may yield the same kind of improvement that a multi-threaded download yields.
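lftp can do something close to this: its mirror mode uploads a directory with several transfers in flight at once, which often helps with many small picture files (a sketch; host, login and paths are placeholders):
Code:
lftp -u user,pass -e 'mirror -R --parallel=4 ./pictures /remote/pictures; bye' ftp://server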
View 4 Replies
Aug 18, 2010
I am working on a PHP enabled webpage that will allow a user to select multiple files and directories to upload from a local machine to an ftp server. I am comfortable with uploading the files from the machine to the server. The problem is making it easy to select all the desired files. What I would like to do is create an expandable file tree that lists all the directories and files on the local filesystem. From there, the user should be able to select directories and files using checkboxes. Upon clicking submit, all of the selected files should be fed into an array of files that can be sequentially uploaded to the ftp server.
View 1 Replies
Feb 2, 2010
Hope you can help me out. I'm trying to set up a "drop-box" on an Ubuntu 9.10 server with vsftpd. I'm able to log in and land in the /home/user directory, but I cannot write anything.
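The stock vsftpd.conf ships with write_enable commented out, which produces exactly this "can log in but cannot write" behaviour; a hedged first thing to check:
Code:
# /etc/vsftpd.conf
local_enable=YES
write_enable=YES
# then: sudo /etc/init.d/vsftpd restart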
View 5 Replies
Jul 14, 2011
I've made a simple headless home server based on:
1. Motherboard Asus AT4NM10-I (Intel NM10, PCI)
2. CPU integrated Intel Atom D410
3. 2 GB of RAM
4. Network Card D-Link DGE-528T 10/100/1000 Mbit/s
5. OS Ubuntu Server 10.04.2 (All installed packages are up to date)
Storage built under LVM, based on:
1. Samsung HD103SI - 1 TB, 5400 rpm / 32 MB cache.
2. Hitachi Deskstar 7K3000 SATA - 2 TB, 7200 rpm / 64 MB cache.
So I found one issue: when the torrent upload speed reaches its peak (160-200 KB/s), a huge read slowdown happens. The server becomes almost unreachable; it still allows connections via PuTTY, but they take a long time.
Tested top stats during those lags (Deluge, Transmission) - 10-15% CPU usage.
So I think the problem is in the LVM setup and not in the CPU.
How is it possible to find the weak point in the system and avoid those lags? Because when a torrent is seeding, it's impossible to watch movies over the network from that server.
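A way to confirm where the bottleneck is before redesigning the storage (assuming the sysstat package is installed):
Code:
iostat -x 5    # watch %util and await per disk while a torrent is seeding
vmstat 5       # a consistently high "wa" column means the box is stuck waiting on I/O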
View 9 Replies
Aug 20, 2010
I am uploading incremental backups using duply/duplicity with the sftp module. As the initial upload is pretty big and runs for several days (more than 50GB over a 1 Mbps line), I am confronted with the problem that other users on the network experience slowdowns while I upload.
I would like to run a script every n minutes which pings a host on the internet (the second hop of the traceroute, for example). If the response time is above a threshold (150 ms), the script throttles the upload for one specific host and protocol. Traffic to the local net (Samba mainly) should be unaffected. I cannot use the QoS of the firewall/router. I would also like the penalty to be removed once the ping is quick again (less than 70 ms, for example). I looked at trickle and some other out-of-the-box shaping tools, but they do not give me the possibility of changing the rate while the upload is running.
I would now write a script in Perl which uses [URL]some wrapper for iptables combined with some ping module [URL]. I was also trying to get a proof of concept before I start coding (I haven't verified if this works yet):
# root cbq qdisc on the outgoing interface
sudo tc qdisc add dev eth0 root handle 11: cbq bandwidth 100Mbit avpkt 1000 mpu 64
# shaping class capped at 100kbit
sudo tc class add dev eth0 parent 11:0 classid 11:1 cbq rate 100kbit allot 1514 prio 1 avpkt 1000 bounded
# send traffic for the backup server into that class (the flowid must match the classid, i.e. 11:1, not 1:1)
sudo tc filter add dev eth0 parent 11:0 protocol ip prio 16 u32 match ip dst MyserverIP flowid 11:1
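The missing piece, changing the rate while the upload is already running, can be done with "tc class change"; a sketch that raises the cap from 100kbit to 300kbit on the fly:
Code:
sudo tc class change dev eth0 parent 11:0 classid 11:1 cbq rate 300kbit allot 1514 prio 1 avpkt 1000 bounded
The script would call this (with different rate values) instead of tearing the qdisc down and rebuilding it.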
View 2 Replies
Jun 24, 2011
I have a home server based on Ubuntu Linux 10.04.2.
Hardware:
Motherboard - Asus AT4NM10-I (Intel NM10, PCI)
CPU - Integrated Intel Atom D410
RAM - 2 GB
Lan - D-Link DGE-528T Gigabit Adapter
Provider gives 8/2 Mbit ADSL connection.
So I tried Deluge and Transmission, with both the integrated and an external network card, and had no luck.
When a torrent file is being seeded at top speed the network starts freezing: the server becomes almost unreachable, video freezes when watching it over the LAN from the server... etc...
When I pause the upload, everything starts working OK!
The network is based on a gigabit switch and copper UTP cables...
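A workaround rather than a fix, assuming Transmission with its remote interface enabled: cap the global upload rate so seeding can never saturate the disk or NIC (the credentials and the 120 KB/s figure are placeholders):
Code:
transmission-remote -n 'user:pass' --uplimit 120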
View 7 Replies