Software :: Download Emails To Access From Different Computers Offline?
Feb 15, 2010
I need to download emails via IMAP in Thunderbird. However, I want the downloaded emails to be accessible from other computers offline. How can I download them so that the downloaded copy is readable by Thunderbird on any computer?
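One approach that may fit (a sketch, not the only way): keep the whole Thunderbird profile, which is where the downloaded IMAP copies live, on a portable or shared drive, and start every copy of Thunderbird against that same profile. The path below is just a placeholder.

thunderbird -profile /media/usbdrive/tbird-profile

The -profile switch makes Thunderbird use that folder instead of the one under the local home directory, so whichever machine launches it sees the same downloaded mail; only one machine should have it open at a time.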
I use Firefox 3.0.7 and GNU Wget 1.11.4. I have a question about downloading web pages. If I download with "Web Page, complete", will I be able to open the pages without being online, or will there be some pages that I will still need to log in for? If a web browser is not sufficient, is there some command I can use with Wget to accomplish this?
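If the browser's "Web Page, complete" isn't enough, a rough Wget invocation for a single page might look like this (example.com is a placeholder). Anything that sits behind a login will still not come down unless you also pass session cookies, e.g. with --load-cookies.

# -p fetches the images/CSS the page needs, -k rewrites links for offline
# viewing, -E saves pages with an .html extension
wget -p -k -E http://example.com/somepage.html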
I have a little problem with Evolution and Exchange 2007. I have managed to connect to the Exchange server without problem using MAPI and I can access my emails fine.
The only problem I have is that although I have checked all the options to download messages and automatically synchronise the account locally, it won't store the email contents locally until I first open a message (or at least preview it).
Unfortunately I have over 5000 emails so this presents a little problem.
I don't know if this is a bug or just how evolution works.
I set up Evolution with a Gmail account using TLS both for sending and receiving messages. Then I marked the Inbox and All Mail folders to be synced for offline usage from their properties (right-click the folder and select the Properties item). I tried File > Download Messages for Offline Usage, the Send/Receive button, and Work Offline / Work Online, and pressed "sync for offline usage" when it asks before going offline. But in all situations Evolution only shows messages in its status-bar area about syncing messages, checking for new mail, etc., and does nothing, with no real network activity. From time to time it sends and receives some kilobytes, but I could not make it start the multi-gigabyte Gmail sync transfer.
I am at a university where my bandwidth is severely capped. I can start several other computers near me and download at the limited speed simultaneously. Is there any way for me to share the download between the computers to get the cumulative speed?
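One crude way to do this, assuming the server supports HTTP range requests and you're fetching one big file, is to pull different byte ranges on each machine and stitch them together afterwards. The URL and sizes below are made up.

# Computer 1 grabs the first half of a hypothetical 100 MB file...
curl -r 0-52428799 -o part1 http://example.com/big.iso
# ...computer 2 grabs the rest...
curl -r 52428800- -o part2 http://example.com/big.iso
# ...then join the pieces on one machine:
cat part1 part2 > big.iso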
I am about to lose my internet soon, I am not sure for how long, but I am curious: can I go to another computer that has internet, download updates for my computer, take them back to my computer and install them so I can stay up to date?
I know I can build a local repository, but I'd like to try just moving the appropriate .deb files. My problem is not knowing which files I need and in what order. Example: I want to install nfs-common.
Running apt-get install nfs-common does it all for me when I'm online. So I looked in /var/cache/apt/archives to see what was installed. I found two nfs files: nfs-common_1.2.0-4ubuntu4.1_amd64.deb and nfs-kernel-server_1.2.0-4ubuntu4.1_amd64.deb
But when I tried to install those on another machine I found I was missing additional files. libgssglue1_0.1-4_amd64.deb libnfsidmap2_0.23-2_amd64.deb librpcsecgss3_0.19-2_amd64.deb portmap_6.0.0-1ubuntu2.1_amd64.deb
For future installations, how do I find all the dependencies and the order they need to be installed in, so I can write my own script and install them on a machine that is offline?
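One way I'd approach it, sketched under the assumption that the online and offline machines run the same release and architecture: let apt print the full dependency closure as URLs, fetch those, and then hand dpkg the whole pile at once (when given all the packages together, dpkg works out the configure order itself).

# Print every .deb the install would need, without installing anything:
apt-get --print-uris --yes install nfs-common | grep -o "'http[^']*'" | tr -d "'" > urls.txt
# Download them on the online machine:
wget -i urls.txt
# On the offline machine, install the whole set in one go:
sudo dpkg -i *.deb

Note that --print-uris only lists what is missing on the machine you run it on, so run it somewhere the package isn't already installed.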
I am away for two weeks in an Internet-free zone - unless I can get it back on - so I would like to download the wiki, if possible, to browse and try new things. I can update my computer, but will need to take it to a friend's to connect.
I am helping a friend start with Ubuntu and he doesn't have as fast an Internet connection as I do. I was wondering how I could easily download all the deb packages for the software I want to install for him. It seems doing:
sudo apt-get install -d --reinstall <package>
Will download the packages for me, but it doesn't get the dependencies because I have already downloaded them... is there a way to get apt-get to get the dependencies as well?
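One workaround, sketched with a placeholder package name and with the caveat that the exact apt-cache flags vary a little between releases: enumerate the package plus its whole recursive dependency tree yourself, then force-download each one even though your own machine already has it installed.

# List <package> and everything it recursively depends on:
PKGS=$(apt-cache depends --recurse --no-recommends --no-suggests \
      --no-conflicts --no-breaks --no-replaces --no-enhances <package> \
      | grep '^[a-z0-9]' | sort -u)
# -d downloads only; --reinstall forces the fetch even for installed packages:
sudo apt-get -y -d install --reinstall $PKGS
# The .deb files land in /var/cache/apt/archives, ready to copy over.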
For some reason it seems to be downloading too much and taking forever for a small website. It seems that it was following a lot of the external links that the page linked to.
Or it downloaded too little. How much depth should I use with -r? I just want to download a bunch of recipes for offline viewing while staying at a Greek mountain village. Also, I don't want to be a prick and keep experimenting on people's webpages.
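Assuming the recipes all live under one directory of the site, something like this is what I'd try first (example.com is a placeholder); the delay keeps the load on the site polite.

# -r -l 2: recurse, but only two levels deep
# -np: never ascend to the parent directory (other hosts are skipped by default)
# -k -p -E: make the saved pages viewable offline
# -w 2: wait two seconds between requests
wget -r -l 2 -np -k -p -E -w 2 http://example.com/recipes/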
Is there a way to get the URLs of the packages that have been updated and then download them on another computer? Like this Ubuntu feature: HOWTO: Download package dependencies for offline installation - Ubuntu Forums.
It's a simple feature and it's present in Smart and Synaptic, yet it's not in YaST (or I haven't found it yet).
I would use the Smart package manager, but on my home connection YaST is better for checking for updates (Smart downloads filelist.xml.gz, which is far bigger than what YaST downloads, though it lets Smart show a package's file list BEFORE installing). So at home I can check for package updates with YaST, but downloading them is very hard. My connection is very bad (I live in Iran) and the YaST mirrors are not the best servers, so YaST gets interrupted in the middle of downloading an RPM and the whole process waits for me to press retry, so I can't do updates and installs overnight. By the way, is there some way to tell it to retry always, or a number of times, automatically?
I need the URL links of the RPMs so I can download them separately and install them.
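I can't answer the YaST retry setting off-hand, but if you can get hold of the URL list by some means (newer zypper versions have an install --download-only option that just fills the package cache, if I remember right), wget at least makes the retrying automatic. rpm-urls.txt below is a hypothetical file with one URL per line.

# -c resumes partial downloads, -t 0 retries forever, --waitretry spaces the retries out
wget -c -t 0 --waitretry=10 -i rpm-urls.txt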
I am new to Ubuntu; I just installed 11 and so far love it. I am setting up my email on it. I use a Gmail account. I like how on Android and iOS the device does not download any emails but keeps them on my server, only views them, and then updates the changes, so all of my email stays on the Gmail server. What option will I need to use to get that behavior?
When I first started with Thunderbird version 3.0.5, everything worked fine and attachments etc. downloaded, but Thunderbird was automatically upgraded to 3.1, and since then attachments don't download and pictures don't appear in emails. What do I need to do to correct this?
I changed to Ubuntu 10.10 and set up my Gmail account in the Evolution client, and it downloaded about a third of my inbox emails in the first sync. It hasn't downloaded any newer email since then, even though several have arrived.
Is there any setting I should use to tell the inbox it can go over a fixed capacity, or something like that?
You see, I have two computers, both running Ubuntu 10.10. Now I want to upgrade them to 11.04, or whatever the latest version is. Also, both of them need normal upgrading too (as in package updating in the Update Manager).
So is there any way not to download everything individually for both computers? Can I download for one computer and use it on the other as well?
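If both machines are on the same release and architecture, the simplest trick I know is to reuse the first machine's apt cache: everything Update Manager fetched is still sitting there as .deb files. The USB-stick path below is a placeholder.

# On computer 1, after running the updates:
cp /var/cache/apt/archives/*.deb /media/usbstick/
# On computer 2, seed its cache before updating:
sudo cp /media/usbstick/*.deb /var/cache/apt/archives/
sudo apt-get upgrade    # only downloads whatever is still missing

The release upgrade to 11.04 itself will still pull its packages per machine unless you set up a shared cache or local mirror.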
I used to have this setup on an old server and I'm trying to move it to a new server. I have a new box installed with Ubuntu 10.10, sendmail, courier-imap and courier-pop. I've configured virtusertable/local-host-names/virtual-domains/sendmail.cf and such. Everything is set up to take any mail arriving at @mydomain.com and move it to a user called "mobileinbox". When I log in as mobileinbox (su mobileinbox) and check my inbox using mutt, I see all the emails, but when I log in from the outside using POP3 or IMAP, it says I have no new emails.
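A guess: mutt happily reads the mbox spool, but courier-imap and courier-pop only serve Maildir-format mailboxes, so if sendmail is delivering into /var/mail/mobileinbox (mbox) that would explain exactly this split. If so, the fix is to deliver into ~/Maildir instead; a hedged sketch, assuming procmail is acting as the local delivery agent:

# Create the Maildir that Courier expects (maildirmake ships with courier):
maildirmake /home/mobileinbox/Maildir
chown -R mobileinbox:mobileinbox /home/mobileinbox/Maildir

# /home/mobileinbox/.procmailrc -- the trailing slash tells procmail it's a Maildir:
# DEFAULT=$HOME/Maildir/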
I have two computers running Debian Lenny 5.0.3; suddenly they cannot access the Debian repositories, and even when I type debian.org, the site does not come up! I used another hard disk with a different distro in one of the computers, and it did access the Debian site. Any reason for this?
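It sounds like it could be plain name resolution rather than the repositories themselves; a couple of quick checks from one of the affected machines would narrow it down.

# Does the name resolve at all?
host debian.org
# If that fails but a raw IP still answers, it's a DNS problem, not connectivity:
ping -c 3 8.8.8.8
# Then compare the nameservers the Debian install uses with the working distro's:
cat /etc/resolv.conf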
I am interested in sharing an external drive between two computers. I do not want to disconnect the drive from one and then connect it on the other one - I want to share it.
Would this work with an external USB drive and a normal USB hub? Or is it something more complicated/impossible?
Also connecting the machines via network is not possible - it has to be USB, or I can connect it to one machine also via Ethernet but the second connection has to be USB.
My main computer's hostname is home, and the others are ubuntu and eduardo. On home, I tried to configure Samba, installing it with "sudo apt-get install samba" and then installing samba-common-bin from the Synaptic package manager. I shared my folders as root on home and I cannot access them from ubuntu and eduardo. Then I googled and found this site: [URL]. Well... I followed all the steps and I still can't access these files. What do I need to do to share files between these computers?
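Sharing as root is usually what trips this up. A minimal sketch of what tends to work instead, with the path and username as examples only: define the share in /etc/samba/smb.conf for a normal user, give that user a Samba password, and restart the daemons.

# Add to /etc/samba/smb.conf on "home":
[shared]
   path = /home/youruser/shared
   browseable = yes
   read only = no
   valid users = youruser

# Give the user a Samba password and restart Samba:
sudo smbpasswd -a youruser
sudo /etc/init.d/samba restart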
I'm thinking of completely switching to Ubuntu, but all of my friends are on Windows. We have a LAN of about 100-150 machines. Is there any GUI software through which I can see all the files being shared on the network by the Windows PCs? I know about Samba, but that is only computer-specific and also works the other way around, and doing it from the command line would be a tiresome task for all those PCs.
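Graphically, Nautilus's Network location (or typing smb:// in the location bar) will browse the whole Windows workgroup without any per-PC setup. From a terminal, smbtree does the same enumeration in one shot:

# Lists every server and every share visible in the workgroup (-N skips the password prompt):
smbtree -N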
The problem started happening a few days ago. Only my linux computers are affected. Yup, that's right. My roommates running windows have zero problems.
What's the problem? Suddenly, I cannot access 2 websites: namely facebook and netflix. I just get a "waiting for facebook.com" status from my browser, and it waits there patiently until the browser finally gives up. I haven't found any other sites that give me this issue. Gmail, ....., flickr, etc all work fine.
This happens using both firefox and chrome browsers. I've tried using Ubuntu 10.10 (on my desktop) and Peppermint (distro based on ubuntu, runs on my laptop). Both machines access the internet via wifi. Both have the same problem! o_O
Both machines are up-to-date. I've rebooted many times. I've tried booting an old kernel. I haven't installed any new software lately. I've tried disabling all plugins for the browsers. I've tried power-cycling our internet modem. I've tried changing my DNS settings to use Google's Public DNS service. Nothing helps.
Actually, one small piece of information: If I put my browser in incognito mode, I can get to the "sign-in" page for both facebook and netflix. But upon putting in my credentials, I still cannot reach my custom user home page for either site.
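One thing worth testing, purely as a guess, is an MTU problem on the wifi path; it can produce exactly this symptom where most sites work but a few heavyweight ones hang forever. The interface name below is a placeholder.

# Find the largest packet that gets through unfragmented; 1472 + 28 bytes of
# headers corresponds to the usual 1500-byte MTU. Work downwards if it fails.
ping -c 3 -M do -s 1472 www.facebook.com
# If only smaller sizes get through, try a lower MTU on the wifi interface:
sudo ifconfig wlan0 mtu 1400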
I want to configure a remote internet-facing server as a git server. I would like to restrict access to the server to a few systems (access is restricted to select computers, not users). I first thought of using an SSH key, but the key can be copied to another system, hence that alone is not sufficient. I have a dynamic IP, so simple IP-based firewall blocking is also not possible. I was thinking about the possibility of using both an SSH key and IP-based access. Is it possible to update the firewall rule whenever my IP changes?
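Yes, with the caveat that this is only a sketch: give your home connection a dynamic-DNS hostname, then have the server re-resolve it from cron and rebuild a small iptables chain so only that address can reach sshd. The hostname and chain name below are made up, and the SSH key stays in place as the second factor.

#!/bin/sh
# Re-run every few minutes from cron.
IP=$(host -t A home.example.dyndns.org | awk '/has address/ {print $4; exit}')
[ -n "$IP" ] || exit 0                 # don't wipe the rule if the lookup fails
iptables -N GITSSH 2>/dev/null         # create the chain the first time only
iptables -F GITSSH                     # then rebuild it from scratch
iptables -A GITSSH -s "$IP" -j ACCEPT
iptables -A GITSSH -j DROP

Hook the chain into the SSH port once, with something like iptables -I INPUT -p tcp --dport 22 -j GITSSH, and the cron job keeps it pointed at whatever your current IP is.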
At school, the shop I work in has machines that run windows xp and cannot be updated to the latest SP (consider these machines "B"). This means that they are quarantined whenever connected to the network. There are also workstations that we would like to be able to connect to "B" for the sole purpose of dropping a file into a directory. These machines we will call "A" and are considered trusted.
I have no control of the school's network. I have a spare PC with two NICs as well as a 5-port switch. My thought was to use the spare PC as a gateway/router/VPN and set up an isolated "network B" consisting of all the untrusted systems, disallow all traffic other than the VPN connection, and connect via VPN from the four or so trusted workstations "A" to network B. I could use MAC filtering (I think) to accomplish this and disallow any computer not specifically authorized, thereby isolating the untrusted computers completely.
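For the MAC-filtering part, a hedged sketch of what the gateway's rules could look like (eth1 faces the isolated network B, and the MAC address is a placeholder): answer only specifically authorized machines on that interface and never forward between the two NICs, so the quarantined boxes stay cut off from the campus LAN.

# On the gateway PC:
iptables -A INPUT -i eth1 -m mac --mac-source 00:11:22:33:44:55 -j ACCEPT
iptables -A INPUT -i eth1 -j DROP
iptables -P FORWARD DROP

MAC addresses can be spoofed, though, so the VPN login is still what actually keeps it honest.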
I am trying to make my home server accessible to the whole web. I have installed Nginx on my Fedora 15 64-bit Linux machine, and it works with localhost but it doesn't work online or allow other computers on the network to access it via the IP address. It keeps coming back with: Could not connect
I have port forwarding. I have even tried different ports but they all seem to be blocked. What could be wrong? I have a netgear router.
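"Could not connect" from other machines on the LAN usually points at the server's own firewall or a loopback-only listen directive rather than the router; two hedged checks on the Fedora box:

# Is nginx listening on all interfaces (0.0.0.0:80) or only on 127.0.0.1?
sudo netstat -tlnp | grep nginx
# Fedora's default firewall blocks inbound HTTP; open the port and save the rule:
sudo iptables -I INPUT -p tcp --dport 80 -j ACCEPT
sudo service iptables save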
I have installed Debian to run Squid as a caching proxy. I've been bashing away now for 2 days and I have managed to install Squid (I first tried manually, but that did not work, so I used the Synaptic package manager to install it from the Administration menu). That went well; thereafter I installed Webmin to work with Squid in a GUI.
I have managed to start Squid, added my range of IP addresses to the ACL list, and added the proxy restriction too.
Now I tried to test it. I opened the Iceweasel web browser (on the same machine) and set it to use Proxy server: localhost and port: 3128. That works fine.
But when I try to change the proxy setting to my machine's IP (where Squid is installed), Proxy server: 10.0.0.35 and port: 3128, that does not work. Am I missing something? I then tried to set another Windows PC on the network to Proxy server: 10.0.0.35 and port: 3128. That also does not work.
I also edited the conf file to http_access allow all, but I do not know if I have done it correctly; maybe there is another problem?
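For comparison, this is roughly the minimal squid.conf arrangement that makes a 10.0.0.0/24 LAN work, with the acl defined before the http_access lines that reference it. If this is already in place and localhost still behaves differently from 10.0.0.35, the next suspect is a firewall on the Squid box blocking port 3128.

# /etc/squid/squid.conf (relevant lines only):
acl localnet src 10.0.0.0/24
http_access allow localnet
http_access deny all
http_port 3128

# Reload and watch the log while a client tries to connect:
sudo /etc/init.d/squid reload
sudo tail -f /var/log/squid/access.log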
I am thinking of installing a Linux mail server to use for my company. We are about 3 people using email and I do not want to spend a significant amount to get an Exchange server. However, I need a mail server that will give me the functionality Exchange provides, such as email (on my mobile, web access, and using the Outlook client), calendar, tasks and contacts. Can you recommend which Linux mail server I should use for all of the above functionality?
I have three computers in my network, but two will be mentioned. Computer A is a Linux Mint 9/Windows 7 dual-boot, and I have just installed Mandriva Free 2010.2, which I will call Computer B.
Now my main problem is that while Computer B can see and access Computer A's shares as well as the third computer's, those computers cannot access Computer B. The message was: "Unable to mount location/Failed to mount Windows share." The SMB protocol was used because the third computer and Computer A have Windows OSs installed on them.
What I originally wanted was to share Computer B's NTFS partition, namely Documents and Downloads, with the other computers, and I can't do that because of the error message.
What I can do, however, is use Computer B to view shares from the other two computers (Computer A, for example). From my experience in Linux Mint, I understand that I'd have to mount my Windows partitions in order to share them. I don't even know if the NTFS drive in Computer B is mounted, though that is what was described.
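A sketch of the order I'd do it in on Computer B, with the device name and mount point as placeholders: first make sure the NTFS partition really is mounted at boot, then export the mounted folders through smb.conf and restart Samba.

# /etc/fstab -- mount the NTFS partition somewhere fixed:
#   /dev/sda5   /mnt/windata   ntfs-3g   defaults,uid=1000   0   0

# /etc/samba/smb.conf -- share the mounted folders:
[documents]
   path = /mnt/windata/Documents
   read only = no
   guest ok = yes

# Restart Samba on Mandriva and re-test from the Windows machines:
sudo service smb restart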