I am unable to transfer files in Pidgin. I am working under the assumption that I need to create an exception in iptables to rectify this problem. I have done a bunch of Google searches trying to figure out how to do this, specifically to allow file transfers using Pidgin.
I am unfamiliar with iptables (I know, I know, read the man page) and I'm not sure which port Pidgin uses to transfer files. Can anyone help me figure out how to add an exception so I can transfer files?
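One common approach is to pin Pidgin to a fixed listening range and open just that range. This is a sketch, not a definitive fix: it assumes you tick "Manually specify range of ports to listen on" under Tools → Preferences → Network in Pidgin, and the 4443-4453 range here is an arbitrary example you must match to whatever you enter there.

```shell
# Allow inbound TCP on the port range Pidgin was told to listen on
# (4443-4453 is a placeholder -- use the range set in Pidgin's prefs).
sudo iptables -A INPUT -p tcp --dport 4443:4453 -m state --state NEW -j ACCEPT

# Persisting the rule varies by distro; on Red Hat-style systems:
sudo sh -c 'iptables-save > /etc/sysconfig/iptables'
```

If the other party is behind NAT as well, direct transfers may still fail regardless of the firewall, and the IM protocol falls back to a relay where one is available.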
I am trying to transfer a file from my live Linux machine to a remote Linux machine. It is a mail server, and a single .tar.gz file contains all the data, but the transfer stops partway through. How can I troubleshoot this? Is there a better way to transfer a huge 14 GB file over the network (VPN/WAN)? The link speed is 1 Mbps, and the rest of the file does copy.
[root@sa1 logs_os_backup]# less remote.log
Wed Mar 10 09:12:01 AST 2010
building file list ... done
bkup_1.tar.gz
deflate on token returned 0 (87164 bytes left)
rsync error: error in rsync protocol data stream (code 12) at token.c(274)
building file list ... done
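rsync error code 12 usually means the data stream was cut mid-transfer, which is plausible on a 1 Mbps VPN link carrying a 14 GB file. A sketch of a more drop-tolerant invocation, assuming hypothetical remote details (`backup@mailserver:/backup/`) and a reasonably recent rsync:

```shell
# --partial keeps a partially-transferred file so a rerun can resume it;
# --append-verify continues an interrupted file rather than restarting
#   (older rsyncs only have --append);
# --timeout stops rsync from hanging forever on a stalled link.
# No -z: the file is already a .tar.gz, so recompressing wastes CPU.
rsync -av --partial --append-verify --timeout=60 --progress \
    bkup_1.tar.gz backup@mailserver:/backup/
```

Rerunning the same command after a drop picks up where it left off, so even a flaky link eventually completes. Wrapping it in a retry loop (`until rsync ...; do sleep 30; done`) automates that.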
Having a bit of an issue with Debian Squeeze and transferring files to the Sony PSP. I hook the PSP up to a USB port and Debian mounts it. I go to drag a 125 MB mp4 to the video folder; the copy window takes about 10 seconds to transfer it. I exit USB mode and there is no video there. I go back into USB mode, look at the video folder on the PSP memory stick, and there is no video; it vanished. Another time, after the copy progress window closed, I right-clicked the PSP and unmounted it.
It errored out saying the device was busy and could not be unmounted. Looking at the light on the PSP, I saw the memory stick was still being written to, so I waited for the light to stop flashing, about a minute or so. Then I was able to unmount it; going to PSP video, there was the video, ready to be watched. Debian isn't accurately showing the copy progress: it shows complete when it isn't, and I have to watch the light on the PSP to know when it is truly finished.
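What this describes is write-back caching: the copy dialog finishes when the data has reached the kernel's page cache, not when it has actually hit the memory stick. A minimal sketch of forcing the flush before disconnecting, assuming the PSP mounts at `/media/PSP` (mount point and folder names are placeholders):

```shell
cp video.mp4 /media/PSP/VIDEO/
sync                    # block until all cached writes reach the device
umount /media/PSP       # now completes immediately and safely
```

Unmounting through the desktop does the same flush, which is why the "device busy" wait ends exactly when the PSP's write light stops.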
I have a LAN. I want to transfer a file from one PC (connected to the LAN) to another one. I know that the best way to do this is SSH because it's very safe, but I know Samba too. Are there other ways to transfer a file on a LAN?
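Beyond Samba and an SSH login, a few other common options, sketched with hypothetical hostnames/IPs (`192.168.1.20`, `user`) and ports:

```shell
# scp and rsync ride on SSH, so they stay encrypted:
scp bigfile.iso user@192.168.1.20:/home/user/
rsync -av --progress bigfile.iso user@192.168.1.20:~/

# On a trusted LAN, plain netcat skips encryption overhead
# (flag syntax varies between netcat flavors):
#   receiver:  nc -l -p 9000 > bigfile.iso
#   sender:    nc 192.168.1.20 9000 < bigfile.iso

# Quick one-off sharing over HTTP from any machine with Python:
python3 -m http.server 8000   # then browse to http://192.168.1.20:8000/
```

NFS is another standard choice for permanent Linux-to-Linux shares, at the cost of more setup than a one-shot command.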
I have connected two computers to each other, both with Fedora Core installed. When I try to transfer a file from one computer to another using the scp command, sometimes the file transfers very slowly and sometimes very fast. I want to know why it sometimes transfers slowly. By slow I mean much slower than a file downloaded over DSL.
I managed to use SSH over the LAN for file transfer, but now I want to connect over SSH for file transfer again from another city, not over the LAN. What IP do I type in the Host IP field? I mean, I can tell them to start the SSH server, but then I need to know the external IP.
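Yes, it is the router's public address you need, and the router must forward the SSH port to the machine running the server. A sketch, where `203.0.113.45` stands in for the real public IP and `user` for the account:

```shell
# On the server's network, find the public address:
curl ifconfig.me          # or read it off the router's status page

# The router must forward TCP port 22 (or a custom port) to the
# server's LAN IP before any of this works.

# From the other city:
ssh -p 22 user@203.0.113.45
scp -P 22 file.txt user@203.0.113.45:~/
```

If the public IP changes often, a dynamic DNS hostname saves asking for it every time.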
I have Ubuntu 10.04 on my netbook and all my files on my XP desktop. How can I connect the two so that I can transfer files from the XP machine to the Ubuntu netbook? Both have WiFi and Ethernet. I am not fluent in Windows networking, but I did manage to join two XP machines using an Ethernet cable by naming the two with the same workgroup and then setting a folder as shared!
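Since XP already knows how to share folders, the least-effort route is usually to share a folder on the XP side and access it from Ubuntu over SMB. A sketch assuming a hypothetical XP address (`192.168.1.5`), share name (`SharedDocs`), and account:

```shell
# List the shares the XP box exposes:
smbclient -L //192.168.1.5 -U username

# Mount a share (Ubuntu 10.04 needs the smbfs package; later
# releases call it cifs-utils):
sudo mkdir -p /mnt/xpshare
sudo mount -t cifs //192.168.1.5/SharedDocs /mnt/xpshare -o username=username

# Then it's an ordinary copy:
cp /mnt/xpshare/somefile.txt ~/
```

Nautilus can also browse to the same share GUI-style via Places → Network without mounting anything manually.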
I am trying to set up a computer so that I can remotely torrent files and then transfer the files to my PC while I am at college. I set up my computer so that it has a dynamic dns as well as hosting via DynDns. I have recently hit a wall when trying to find a way to transfer my files from the computer at home to my Windows 7 computer at school.
We have a server and we have installed openSUSE 10.3 on it. We also set up a Samba server and made a shared folder that we access over the network from other computers running XP.
The problem is that if we try to copy from the server, it is very slow, only 100-300 KB/s. The strange thing is that if I copy one file it's slow, but if I start copying another one at the same time, the speed goes up to 10-15 MB/s. Every time I want to copy something or install from that server, I need to start a second copy. If I copy from a computer to the server the speed is normal; it's only slow when copying from the server.
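One thing worth trying for slow server-to-client Samba reads is socket tuning in smb.conf. These are commonly suggested starting values, not a guaranteed fix, and the buffer sizes are assumptions to experiment with:

```
# /etc/samba/smb.conf -- add under [global], then restart Samba
socket options = TCP_NODELAY IPTOS_LOWDELAY SO_RCVBUF=65536 SO_SNDBUF=65536
read raw = yes
write raw = yes
```

It is also worth ruling out a duplex mismatch on the server's NIC, since "slow one direction, fine the other" is its classic symptom.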
Ok, the only thing I remember doing is changing out my hard drive because it went bad. I swapped in a new one and reformatted with Windows 7. (I can hear the boos and hisses as I type.) Anyway, now file transfers are exceedingly slow: ~10 KB/s when they should be ~40 MB/s. The odd thing is that it mostly occurs on large files, but not all large files, and sometimes smaller files are affected. It happens when copying files from one hard disk to another on the server, as well as when copying from the server to Win7. Local files are not affected. The problem does not occur on my laptop, which runs Slackware Linux.
Another thing about the problem: when I mount a disc image on a virtual drive, access time is very slow, and Windows Explorer is very unresponsive and often locks up completely.
I have a Samba server with Slackware Linux on it and a Windows 7 client. DHCP is configured through dnsmasq (dhcp-host=<mac address>,<computer name>). Every now and again, dnsmasq seems to conk out when I try to access the network with my Win7 machine. I then have to reboot the server, as my only access to it is via SSH and I don't have a video card in it. After the server reboots, everything is fine until accessing the network with my Win7 machine conks it out again.
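For reference, the dnsmasq static-lease syntax looks like the following; the MAC, hostname, and address here are hypothetical placeholders. Pinning the Win7 box to a fixed lease, and restarting only dnsmasq instead of rebooting the server, narrows down whether dnsmasq itself is the part that dies:

```
# /etc/dnsmasq.conf -- one static lease per client:
dhcp-host=00:11:22:33:44:55,win7box,192.168.1.50,infinite

# After editing, restart the daemon rather than the whole server, e.g.
# on Slackware:  /etc/rc.d/rc.dnsmasq restart
# and check what it logged:  grep dnsmasq /var/log/messages
```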
I've tried different MTU settings, different network cards on the Win7 machine (D-Link and Realtek), and various regedit hacks, but none of them produce any results.
This is all on a LAN. Whenever I transfer files through router A to a PC connected wirelessly, the internet hits the speed of molasses. What can I do to remedy this? I was thinking about having my Fedora box connect wirelessly to the internet through router A, then bridge via Ethernet to router B, and have my PC always connected to router B, so that when I transfer files, router A remains unaffected. Is this correct logic?
I am having a problem trying to transfer a large file (~700 MB) from one station in my home to another. I have 3 PCs hooked up through a router: one is wired to the router and the other two are wireless. One wireless machine is a laptop with a built-in Atheros wireless card that was supported during the FC13 install. The other has a plug-in wireless card made by Belkin (F7D1101); I had to use ndiswrapper to get it to work on FC13. (BTW, all PCs are running FC13.)
The one with the Belkin card is, I think, the problem. The one with the Atheros card will transfer files at a rate of about 8MB/sec to the one with the wired ethernet connection. The one with the Belkin card will not transfer at rates over 300KB/sec to either of the other two PCs. I have tried file transfers both encrypted and not and it makes no difference.
I am working on Ubuntu 10.04 and I want to transfer the file 'hello.txt' from my computer to a mobile phone (Nokia 7210). When I run the command, my mobile asks whether to accept the connection from Ubuntu (the device name connected to Ubuntu). I accept the connection, and then I get the error log given above.
I am running Ubuntu Linux 9.10 on my laptop and desktop. I would like to transfer files from one to the other using an Ethernet crossover cable. Is there software already installed in the bundle that would allow me to set up network drives for each machine when the cable is connected? Also, how would I set up the configuration?
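With a crossover cable there is no DHCP server between the machines, so each end needs a static address in the same subnet. A sketch, where the interface name (`eth0`) and the 192.168.5.x addresses are assumptions:

```shell
# On the desktop:
sudo ifconfig eth0 192.168.5.1 netmask 255.255.255.0 up
# On the laptop:
sudo ifconfig eth0 192.168.5.2 netmask 255.255.255.0 up

# Verify the link:
ping -c 3 192.168.5.1

# With openssh-server installed on the desktop, copy from the laptop:
scp user@192.168.5.1:/path/to/file .
```

For a GUI "network drive" feel, Nautilus's Places → Connect to Server (SSH, host 192.168.5.1) mounts the remote home directory using the same setup.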
I do IT work for a decently-sized bookstore. Over the past year or so we have had a recurring problem crop up where file transfers with networks outside of our building are interrupted, or those receiving our uploads get truncated files. It is an intermittent problem, and I have not been able to narrow it down to any particular day of the month or time of day. Files that are dealt with in our web browsers have no issues; I can happily download and upload large files all day without interruption.

Yet when our little postage meter machine tries to connect to its update server, it sometimes has its connection interrupted while downloading its update files. Our credit card machines sometimes have trouble holding a connection long enough to batch out, and our storewide POS system often has to resend its credit batch multiple times before it can maintain a connection long enough to push the whole file through. One server FTPs a ~11 MB file to an associate daily, and a few days a month they will receive a truncated file that appears to have been cut off mid-transfer.

Our bandwidth is handled through three T1 lines that share phone line and internet service. Our voice calls never get dropped and have good line quality, but mobile credit card terminals have trouble maintaining a connection when they dial out.
I'm thinking this is either an ISP issue or an internal network problem. I don't know what protocol the mobile machines use, but it's possible it could be restricted to ftp or, I dunno, telnet? Maybe a port forwarding conflict? Files sent internally have no problems, so I'm thinking it's something at the firewall/proxy/router stage.
Our servers run a number of *nixes: our central databasing/POS is handled by an IBM RISC box running AIX 4.3; our DHCP server runs Debian Sarge 3.1; we have an Ubuntu 8.04 LTS box running a Squid proxy setup using squidGuard for URL filtering; and firewalling is handled by a Debian 5 box running Untangle. Our ISP provides us a 4.5 Mbps connection via three T1 lines that supply shared bandwidth for our digital telephone lines and internet.
I've fixed almost everything except this problem. My network card worked fine on Windows, but since I switched from Windows, file transfer over the network using my network card is horribly slow: 30 kbps max on a 100 Mbps wired connection. I know my network card is working fine. I'm trying to copy files between my Ubuntu PC and a Windows PC. I've set up Samba as recommended by these forums, and I can transfer files when using a wireless connection, but I still can't transfer files faster than 30 kbps on a wired connection. The funny part is, it's fast when copying files from Ubuntu to Windows but slow the other way, and I've tried this with more than 4 different Windows computers. lspci returns this for the network controller:

Code: Silicon Integrated Systems [SiS] 191 Gigabit Ethernet Adapter

Is there any way of updating drivers, or anything else I can do to fix this problem?
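Asymmetric throughput (fast one direction, crawling the other) is a classic duplex-mismatch or offload symptom, so ethtool is a reasonable first diagnostic. A sketch, where `eth0` is an assumed interface name:

```shell
# See what the link actually negotiated:
sudo ethtool eth0 | grep -E 'Speed|Duplex|Auto-negotiation'

# As a test, force 100 Mbps full duplex (undo by re-enabling autoneg):
sudo ethtool -s eth0 speed 100 duplex full autoneg off

# Hardware offloads are a known weak spot on some budget NICs;
# try turning them off:
sudo ethtool -K eth0 tso off gso off rx off tx off

# Watch for errors/drops while a slow transfer runs:
ifconfig eth0 | grep -E 'errors|dropped'
```

If disabling an offload fixes it, the setting can be made permanent in the distro's network interface configuration.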
I use basic ssh and scp on a regular basis and sometimes for file transfer... certainly over public networks. However, often I want to transfer large (several GB) files from one place on my LAN to another.
I've read that FTP is the fastest, and when I'm transferring many GB, fast matters. Of course I know it's not secure, but I don't need security on my home wired LAN. What I do need, many times, is preservation of the date and time of each file when uploading. FileZilla does that and I use it, but sometimes I need a CLI solution. I have used NcFTP and been happy with it until I realized it doesn't preserve date and time on uploads (probably not on downloads either; I haven't looked).
Is there a CLI FTP solution out there that will preserve file date and time information on file transfers (including uploads)?
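lftp is one CLI client that can do this, provided the server supports a way to set remote timestamps (SITE UTIME or MFMT). A sketch with placeholder credentials, host, and paths:

```shell
# Upload a tree, preserving modification times where the server allows:
lftp -u user,pass ftp.example.com <<'EOF'
set ftp:use-site-utime true
mirror -R --only-newer /local/dir /remote/dir   # -R = reverse mirror (upload)
bye
EOF
```

If the server refuses SITE UTIME, no FTP client can preserve upload times; in that case rsync over SSH (`rsync -a`, where `-a` implies `-t` for times) is the usual fallback even on a LAN where encryption overhead is acceptable.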
I have a problem that I can't seem to fix. When I try to transfer a large file, let's say 700 MB or so, my wireless shuts down and I have to restart my Ubuntu computer; the other computer is Vista. Ubuntu is on a WUSB54G ver. 4, and Vista is going through a WRT54G-TM running DD-WRT mega. I have tried everything I know with no luck.
I set up a virtual bridge for a virtual machine (qemu-kvm) using a qemutap interface:
Code:
# brctl show
bridge2		8000.0022648a3dcc	no	eth5
						qemutap2

I can ping and SSH to the server. Then I try to copy (scp) a file (2.8 MB) to the server and, near the end, the copy stops; the server stops responding to ping and is unable to ping anything except itself.
"ifconfig down ; ifconfig up" and the server is on line again.
I am using Ubuntu 9.04. After doing sudo apt-get update, when I try to install pidgin via the terminal it shows:

XXX@XXX-desktop:~$ sudo apt-get install pidgin
Reading package lists... Done
Building dependency tree
Reading state information... Done
Some packages could not be installed. This may mean that you have
requested an impossible situation or if you are using the unstable
distribution that some required packages have not yet been created
or been moved out of Incoming.
The following information may help to resolve the situation:

The following packages have unmet dependencies:
  pidgin: Depends: libpurple0 (>= 1:2.6.0) but it is not going to be installed
          Depends: perl (>= 5.10.0-19ubuntu1.1) but 5.10.0-19ubuntu1 is to be installed
E: Broken packages
Previously, after doing aptitude upgrade, my pidgin seemed broken, with no GUI shown. So I decided to download the latest source from the Pidgin website and build it from source, but I think I made the situation worse, because it then complained that an SSL library was needed. Then I removed libpurple (e.g. aptitude purge libpurple0 libpurple-bin libpurple-dev) and reinstalled pidgin (aptitude install pidgin). Now it shows the error
pidgin: symbol lookup error: pidgin: undefined symbol: purple_media_element_info_get_type
I searched on the internet and cannot find a solution. The clues on the internet say it's because the pidgin I am running is linked against an older version of libpurple. But I think I've removed it all with purge and reinstalled. Maybe some leftover binary is being referenced. What or where might it be, or where might related information be found?
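A source build installs to /usr/local by default, which shadows the packaged binaries and libraries and produces exactly this symbol-lookup mismatch. A cleanup sketch; the /usr/local paths are the usual autotools defaults, not confirmed locations on this system:

```shell
# If the source tree is still around, from the build directory:
sudo make uninstall

# Otherwise remove the likely leftovers by hand:
sudo rm -f /usr/local/bin/pidgin
sudo rm -f /usr/local/lib/libpurple*
sudo ldconfig                       # refresh the shared-library cache

# Confirm which binary now runs, then reinstall the packaged version:
which pidgin                        # should print /usr/bin/pidgin
sudo apt-get install --reinstall pidgin libpurple0
```

`ldd $(which pidgin) | grep purple` shows which libpurple the binary actually resolves to, which confirms whether the shadowing is gone.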
I have set up SSH between an F15 box and a remote CentOS box. I am using ssh -X, then nautilus/gnome-session, to open a GUI file browser/desktop environment for the remote machine. But anything I try to copy from the remote machine to the local machine via the GUI shows a 'path not found' error. The CLI works just fine, but is it possible to transfer files between remote and local machines via a GUI over SSH?
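The "path not found" is expected with X-forwarding: the forwarded Nautilus runs on the remote machine, so local paths don't exist for it. Two GUI-friendly alternatives that keep the file manager local, with placeholder host/user names (`user@centosbox`):

```shell
# Option 1: mount the remote tree locally with sshfs
sudo yum install fuse-sshfs            # package name on Fedora
mkdir -p ~/remote
sshfs user@centosbox:/home/user ~/remote
# ...any local file manager can now copy to/from ~/remote...
fusermount -u ~/remote                 # unmount when done
```

Option 2: point the local Nautilus at the server directly via its built-in SFTP support: Places → Connect to Server, or open the location `sftp://user@centosbox/`. Both approaches ride on the same SSH connection already set up, with no extra server configuration.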