Ubuntu :: Is 'Sweeper' A Good Tool For Cookies / Web History?
Jun 8, 2010
Is "Sweeper" a good tool for cookies web history & such?Also, I want to set up personal file sharing for my house & it says I do not have the correct packages installed, but it also doesn't say which packages I need to install.
IMDb gives me a "recently viewed" list of pages at the bottom of every page I view (a few of the entries are from months ago, since I don't go there too often). My cookies are enabled for the session only, I have Adblock Plus, and my IP changes every day, so how are they doing that? How can I prevent it? What other websites are using the same trick? At this page [URL] there's a link that says "Clear entire history", but I want to stop them from being able to track me like that in the first place; if they can track me, so can others.
I used to use CCleaner so I could keep specific cookies from being deleted while I deleted all the others. Is there any way I can do this with Ubuntu? Firefox doesn't seem to allow it other than manual deletion, which is not as fast or automated as CCleaner made the task.
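One way to get that selective sweep back is to edit Firefox's cookies.sqlite directly. Below is a minimal Python sketch, assuming the usual ~/.mozilla/firefox/*.default profile layout and a whitelist you fill in yourself; run it only while Firefox is fully closed.

# keep_cookies.py - delete all Firefox cookies except whitelisted hosts (sketch)
import glob
import sqlite3

KEEP_HOSTS = ("ubuntuforums.org", "imdb.com")   # example whitelist, edit to taste

def keep(host):
    # True if the cookie's host is, or is a subdomain of, a whitelisted domain
    host = host.lstrip(".")
    return any(host == h or host.endswith("." + h) for h in KEEP_HOSTS)

for db_path in glob.glob("/home/youruser/.mozilla/firefox/*.default/cookies.sqlite"):
    conn = sqlite3.connect(db_path)
    cur = conn.cursor()
    cur.execute("SELECT id, host FROM moz_cookies")          # one row per cookie
    doomed = [(row_id,) for row_id, host in cur.fetchall() if not keep(host)]
    cur.executemany("DELETE FROM moz_cookies WHERE id = ?", doomed)
    conn.commit()
    conn.close()
    print(db_path, "- removed", len(doomed), "cookies")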
I was wondering, does anyone here know a good SQL management tool for Ubuntu? I just want something with a nice GUI that doesn't require much hassle to add, remove, or even migrate databases.
I need a good bi-directional sync tool like unison that works properly over ssh and allows manual decisions and merging if both sides (local & remote) contain changes. unison meets most of my needs; the only drawback is the compatibility - it needs to be installed on both sides, and with the same version. Nowadays that's a bit of a problem if you have, say, OS 11.3 on one node, SL3 on another and MacOS on a third. In my view one only needs the ssh connection; all the rest of the file analysis should be done by the local software. Does anyone know of such a tool?
Any easy-to-install/configure network/server monitoring tool? Please note I'm looking for something a little lightweight here (not something like Zenoss), but I'd still like to get performance graphs and event-notification alerts. Also note this is to monitor fewer than 50 servers and perhaps a firewall or two.
I am using a Squid proxy server to share Internet access on my internal network. I would like to know how I can check the web surfing history of individual users by their IP addresses.
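Squid already logs every request, so one rough approach is just to filter its access log by client address. A small Python sketch, assuming the default native log format (client IP in field 3, URL in field 7) and the default log path; the IP shown is an example.

# squid_history.py - list URLs requested by one client IP (sketch)
import sys
from collections import Counter

LOG = "/var/log/squid/access.log"          # adjust to your access_log setting
client_ip = sys.argv[1] if len(sys.argv) > 1 else "192.168.1.50"

hits = Counter()
with open(LOG) as log:
    for line in log:
        fields = line.split()
        if len(fields) > 6 and fields[2] == client_ip:
            hits[fields[6]] += 1           # the URL this client requested

for url, count in hits.most_common(20):    # top 20 URLs for that user
    print(count, url)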
I'm trying to find a good desktop search tool. Beagle is dead, Recoll and Strigi are KDE, and Tracker doesn't have many features (it can't even search Thunderbird 3). Am I missing something? Is desktop search on Linux dead? Should I use Google Desktop Search instead?
When I quit Ubuntu Forums, I get a vBulletin message: "All cookies cleared!" When I restart, I have to log in again. Where do I fix it so that cookies are not cleared on logout? (I'm using Win2K and Firefox 3.6.3.) In Firefox Privacy settings: "Clear History when Firefox closes"; "Accept cookies from Third Party" is checked.
I need to implement a company-wide policy on Adobe Flash Player settings. Each individual can visit manager07.html and change their settings, but that's far from ideal. I suspect I could get somewhere by locking down permissions on temp directories, but I wonder if there is a more elegant way?
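One central mechanism worth checking, assuming it covers the settings you care about, is the system-wide mms.cfg policy file described in Adobe's Flash Player administration guide (on Linux it normally lives at /etc/adobe/mms.cfg). A small example of the kind of policy it can express:

AVHardwareDisable=1
LocalStorageLimit=1
ThirdPartyStorage=0

Here AVHardwareDisable blocks camera/microphone use, LocalStorageLimit=1 disallows local shared objects ("Flash cookies"), and ThirdPartyStorage=0 stops third-party content from storing data; verify the exact option names and values against the administration guide for your player version before rolling anything out.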
Browsers under Debian distros used to allow cookies per site but now it seems it is all or nothing. Perhaps I should install another web browser. I have Epiphany and Iceweasel.
My main problem is that I cannot make the number of results per page in Google stay set to 100; the setting does not persist.
I understand that a website can set a cookie, which means a little text file with a number in it will be created on the client side, and then the number will get sent to the server with every visit. But how is this used to customize a page? Does this mean that a MySQL database *has* to be created, which can be used to match up that number with a particular login name, for instance? How exactly is this pulled off?
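Roughly, yes: the cookie only carries an identifier, and the server keeps its own lookup from that identifier to whatever it knows about you - but that lookup does not have to be MySQL; any server-side store works. A toy Python sketch (names, port and the "alice" login are made up) using the standard library and a plain dictionary in place of a database:

# cookie_demo.py - customize a page using a cookie and an in-memory "database"
import uuid
from http.cookies import SimpleCookie
from http.server import BaseHTTPRequestHandler, HTTPServer

sessions = {}   # session id -> login name; stands in for a MySQL table

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        cookie = SimpleCookie(self.headers.get("Cookie", ""))
        sid = cookie["sid"].value if "sid" in cookie else None
        self.send_response(200)
        if sid not in sessions:
            sid = uuid.uuid4().hex                 # first visit: issue an id...
            sessions[sid] = "alice"                # ...and remember who it belongs to
            self.send_header("Set-Cookie", "sid=" + sid)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        # every later request carries the id back, so the page can be personalized
        self.wfile.write(("Hello, " + sessions[sid] + "\n").encode())

HTTPServer(("localhost", 8000), Handler).serve_forever()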
So I want to remove all cookies from my Firefox. There seem to be lots of ways to do this, none of which work. I tried Edit > Preferences > Privacy > Show Cookies > Remove All Cookies: all cookies appear gone, but on restart they are all back again. I tried Tools > View All Cookies > Remove All Cookies: same result. Does anyone have any idea how I can do this? I tried to manually delete the cookie file from ~/.mozilla/firefox/somecode.default, but that didn't work either. Failing that, how do I remove Firefox and all its config files with yum, like the --purge option on .deb systems?
Firefox used to work fine, then I played around (only) with privacy settings, and now it deletes all cookies (including ones set to expire in a month) at the end of the session. It is set to "remember history". Moving to a new profile is not an option.
I was curious whether I would be able to view cookies from a command prompt when ssh'd into a machine. On a test machine running Fedora 13, I found that the cookies were stored in cookies.sqlite. I made sure that all instances of Firefox were closed and attempted to view the file by running the following command
sqlite3 cookies.sqlite
It loaded, but I was unable to view any of the information because the database was locked. There were no instances of Firefox running, and I checked to make sure there were no Firefox services running either. Am I doing something wrong? Is this not the correct way to view cookies from the command line? I have tried Google searches and have so far been unable to come up with anything.
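A common workaround for that lock is to query a private copy of the database instead of the live file. A minimal Python sketch; the profile path is an example:

# dump_cookies.py - read cookies.sqlite via a temporary copy (sketch)
import shutil
import sqlite3
import tempfile

src = "/home/youruser/.mozilla/firefox/somecode.default/cookies.sqlite"

with tempfile.NamedTemporaryFile(suffix=".sqlite") as tmp:
    shutil.copyfile(src, tmp.name)        # the copy dodges the lock on the live file
    conn = sqlite3.connect(tmp.name)
    for host, name, value in conn.execute(
            "SELECT host, name, value FROM moz_cookies ORDER BY host"):
        print(host, name + "=" + value)
    conn.close()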
I had an extension I started making and had loaded in Chromium. Well, I forgot to unload it and then deleted the files for it. Now Chromium crashes on startup saying it can't find the manifest file for it. How can I remove the entry for it in the extension cookies file?
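If Chromium has recorded the extension where it usually does on Linux - the JSON Preferences file in the profile, under extensions -> settings - then pruning that entry (with Chromium closed and a backup kept) may clear the crash. A sketch; the extension id is a placeholder:

# prune_extension.py - drop a stale extension record from Chromium's Preferences
import json
import os
import shutil

prefs_path = os.path.expanduser("~/.config/chromium/Default/Preferences")
stale_id = "abcdefghijklmnopabcdefghijklmnop"     # id of the deleted extension

shutil.copyfile(prefs_path, prefs_path + ".bak")  # keep a backup first

with open(prefs_path) as f:
    prefs = json.load(f)
removed = prefs.get("extensions", {}).get("settings", {}).pop(stale_id, None)
with open(prefs_path, "w") as f:
    json.dump(prefs, f)
print("removed entry" if removed else "id not found")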
I use different browsers for sites like Facebook and general browsing (specifically rekonq for Facebook, Firefox for others). However, I find that Flash cookies are shared between browsers, and are not cleared when I clear my browser cookies.
Flash cookies are kept in ~/.macromedia, and it's OK for me to clear this periodically with a little cron job. However, I would really like separate places for flash cookies from rekonq and firefox.
Does anyone know how to do this? Maybe there is an environment variable which allows this?
My second implementation option would be to make a chrooted environment for each browser or something like that.
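There does not appear to be a dedicated environment variable for ~/.macromedia, but since the Flash plugin resolves that directory from $HOME, a wrapper that launches each browser with its own HOME has much the same effect - and it is lighter than a chroot. A rough Python sketch; paths and browser names are examples, and the symlinks keep the browser's real profile so that only .macromedia diverges:

# flash_home.py - launch a browser with a private HOME so Flash cookies stay separate
import os
import sys

REAL_HOME = os.path.expanduser("~")
browser = sys.argv[1] if len(sys.argv) > 1 else "firefox"     # e.g. firefox, rekonq

fake_home = os.path.join(REAL_HOME, ".browser-homes", browser)
os.makedirs(fake_home, exist_ok=True)

# Link the real profile/config dirs into the fake HOME so only .macromedia differs.
for d in (".mozilla", ".kde", ".config"):
    link, target = os.path.join(fake_home, d), os.path.join(REAL_HOME, d)
    if not os.path.lexists(link) and os.path.isdir(target):
        os.symlink(target, link)

env = dict(os.environ, HOME=fake_home)     # Flash will now use fake_home/.macromedia
os.execvpe(browser, [browser] + sys.argv[2:], env)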
I am trying to set up Apache and Webmin so that I can access Webmin by going to [URL]. I am using the directions at [URL] under the "Webmin In A Sub-Directory Via A Proxy" section. I had this setup working before, but I think an update of either Webmin or Apache broke it. Now I can go to the webpage and I see the login screen. However, when I try to log in, I get an error.
Quote:
Error - No cookies
Your browser does not support cookies, which are required for this web server to work in session authentication mode.

I have tried adding the ProxyPassReverseCookieDomain and ProxyPassReverseCookiePath directives to my virtual host config file, but it still doesn't work.
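For reference, this is roughly the shape of proxy stanza that setup needs inside the virtual host; treat it as a sketch - the port assumes Webmin's default 10000 with SSL turned off in miniserv.conf, and the hostnames are placeholders:

ProxyPass        /webmin/ http://localhost:10000/
ProxyPassReverse /webmin/ http://localhost:10000/
# Map the session cookie Webmin sets back onto the public domain and the
# /webmin/ prefix; if the domain or path never matches, the browser drops
# the cookie and Webmin reports "No cookies".
ProxyPassReverseCookieDomain localhost www.example.com
ProxyPassReverseCookiePath   /        /webmin/

The HowtoForge-style setup also adjusts Webmin's own configuration (the web prefix and allowed referers); if the Webmin update reset those files, the login cookie will not survive the round trip even with the Apache side correct.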
Is there a way I can send cookies/site login information to computers within my internal network, i.e. push them through in packet headers? Not FTP. For example, if I want to send cookies with my shopping cart or login information from one computer to another within my network, how can I accomplish this?
For some reason I was trialing a SUSE 11.1 SP1 version for a while, and somehow - I have NO clue how - it attempted to change itself into an OpenSUSE 11.1 system. I have no clue what I did, please don't ask. Now, though, I have been successful in turning it into an OpenSUSE 11.2 system by changing the repositories to OpenSUSE 11.2 and doing a "zypper refresh", "zypper in zypper", "zypper dup -d", "zypper dup". I also did a repair and refreshed the base packages with an OpenSUSE 11.2 DVD, and that seemed to help as well. The only thing I can't seem to get right is that yast/yast2 gives me this error: Download failed: Failed to download ./repo/repoindex.xml from https://nu.novell.com/?cookies=0&cre...NCCcredentials