Debian :: Downloading Pages In A Clean Way Using Wget?
Oct 30, 2010
I want to download pages so that they look the way they do when visited normally in a browser. For example, I tried this on Yahoo, and here is part of the file I got:
[Code].....
But I just want the normal text, and nothing else...
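If the goal is just the readable text of a page rather than the raw HTML, one approach is to pipe the download through a text-mode renderer. A minimal sketch, assuming lynx is installed (the URL is only a placeholder):
Code: Select all
# fetch the page and render it as plain text, dropping markup and scripts
wget -q -O - http://www.yahoo.com/ | lynx -dump -stdin > page.txt
# or let lynx fetch and dump it directly
lynx -dump http://www.yahoo.com/ > page.txt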
View 1 Replies
Apr 4, 2011
I get a copy of rssfeed.php in my root directory, and whenever I run it once more there is a new rssfeed.php.0, rssfeed.php.1, etc.; the number keeps increasing, but all of them stay empty. Is there a flag that avoids this? I am not sure which one to use from the man page; there is -o for output, but none for no output?
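If the downloaded file itself isn't needed, wget can be told to overwrite a single name or to throw the result away instead of creating numbered copies. A minimal sketch (the URL is a placeholder):
Code: Select all
# always write to the same file instead of creating .0, .1, ... copies
wget -q -O rssfeed.php http://example.com/rssfeed.php
# fetch without keeping anything at all
wget -q -O /dev/null http://example.com/rssfeed.php
# --delete-after removes the file once the download finishes
wget -q --delete-after http://example.com/rssfeed.php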
View 1 Replies
Jul 19, 2011
Example: [url]
This is what I tried: wget -A rar [-r [-l 1]] <mediafireurl>
That is to say, I tried with and without the recursive option. It ends up downloading an HTML page of a few KB, while what I want is a RAR file in the 90-100 MB range.
What happens with MediaFire, for those who may not be aware, is that it first says
Processing Download Request...
This text after a second or so turns into the download link and reads
Click here to start download..
How do I write a proper script for this situation?
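Since the direct link only appears after MediaFire's page has been fetched, a plain wget of the page URL can never get the archive itself. A rough sketch of the usual workaround — fetch the page, pull the real file URL out of the HTML, then download that — assuming the direct link shows up in the page source as an absolute URL ending in .rar (the pattern is a guess and will need adjusting, and it will not work at all if the link is generated by JavaScript):
Code: Select all
#!/bin/sh
# usage: ./mfget.sh <mediafire page url>
page=$(wget -q -O - "$1")
# grab the first absolute URL ending in .rar (hypothetical pattern)
link=$(echo "$page" | grep -o 'http[^"]*\.rar' | head -n 1)
wget "$link"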
View 1 Replies
Jan 29, 2011
I'm trying to download phpMyAdmin from SourceForge => http://sourceforge.net/projects/phpm...r.bz2/download . I'm using the wget command followed by the direct link from the page. All I get is some irrelevant file that has nothing in common with phpMyAdmin-3.3.9-all-languages.tar.bz2. The direct link is meant for clients with web browsers that trigger an automatic download to the user's desktop, but I need to download the package to a server. What is the wget option to get the file from this kind of link?
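SourceForge's /download URLs redirect to a mirror, and wget saves the result under the last path component ("download") unless told otherwise. Two hedged options (the URL below is shortened exactly as in the post and is only a placeholder):
Code: Select all
# use the filename the server announces in its Content-Disposition header
wget --content-disposition "http://sourceforge.net/projects/.../download"
# or name the output file explicitly
wget -O phpMyAdmin-3.3.9-all-languages.tar.bz2 "http://sourceforge.net/projects/.../download"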
View 1 Replies
Dec 29, 2010
I'm trying to have wget retrieve the pictures from a list of saved URLs: I have a list of Facebook profiles from which I need the main profile picture saved. When I pull such a page up in my browser I see everything just fine; however, when wget reads from a file (or even when I manually specify a page to download), what I receive is the HTML file with everything intact minus the main photo of the page (that page's user picture). I believe I need the -A switch, but I think that is also what is causing the issue (because the page itself is not a .jpg, it is getting deleted).
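Worth noting: -A applies to every downloaded file, so accepting only jpg makes wget delete the HTML page after fetching it, and the profile photo usually lives on a different host than the page, so host-spanning must be allowed. A hedged sketch (urls.txt is a hypothetical list of profile URLs; whether Facebook serves the photo without a login session is a separate question):
Code: Select all
# -p grabs page requisites (images, css); -H lets those requisites come from other hosts
wget -p -H -i urls.txt
# or follow one level of links, keep only images, and drop the directory structure
wget -r -l 1 -H -A jpg,jpeg,png -nd -i urls.txt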
View 1 Replies
Feb 2, 2010
I am trying to download the contents from [URL] to my local system for off-line browsing, but am having little to no success. I have tried using wget and httrack; although I can download the directory structure, there do not seem to be any SWF files in it.
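If the Flash files are what is missing, explicitly accepting that extension during a recursive fetch is worth a try. A minimal sketch (depth and URL are placeholders; this will still fail if the .swf paths are only produced by JavaScript):
Code: Select all
wget -r -l 2 -A swf,SWF -e robots=off http://example.com/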
View 7 Replies
Feb 5, 2010
I am Vijaya, glad to meet you all via this forum. My question: I set up a crontab to download files from the internet automatically using wget, but when it runs, several processes for the same download end up running in the background. My concern is to get only one copy, not many copies of the same file, and I have not been able to find out where it is actually downloading to.
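Two things usually tame this: give wget an explicit download directory, and stop overlapping cron runs from piling up. A hedged sketch of a crontab line (paths and URL are placeholders; flock comes from util-linux):
Code: Select all
# hourly; flock -n skips the run if the previous one still holds the lock,
# -N only re-downloads when the remote file is newer, -P sets the target directory
0 * * * * flock -n /tmp/mydownload.lock wget -N -P /home/vijaya/downloads http://example.com/file.iso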
View 1 Replies
Aug 5, 2010
I'm trying to update Ubuntu 10.04 after a clean installation. It downloaded 245 files, and there is an error in the indexes of the last 2 packages: "Failed to fetch [URL] difiere (bad size)", "Failed to fetch [URL] difiere (bad size)". This happens even after changing the repository to the main server or another one through "Software origins". Update-manager cannot complete the update because it can't download all the packages.
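A "bad size" on a package usually means the locally cached index or .deb no longer matches the archive, so clearing the cache and refreshing the lists before retrying is a reasonable first step (standard apt commands, not specific to this particular error):
Code: Select all
sudo apt-get clean
sudo apt-get update
sudo apt-get dist-upgrade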
View 1 Replies
Jun 19, 2011
If a wget download is interrupted (for example if I have to shut down prematurely), I get a wget.log along with the partial download. How can I later resume the download using the data in wget.log? I have searched high and low (including the wget manual) and cannot find how to do this. Is it so obvious that I did not see it? The -c option with wget.log as its argument does not work. What I do instead is open wget.log, copy the URL, paste it into the command line and run another wget. This works, but the download starts from the beginning, which means nothing in wget.log is used.
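For what it's worth, -c does not take a file argument at all; the partial data lives in the half-downloaded file itself, not in the log, and wget resumes by comparing that file against the server. A minimal sketch (the URL is a placeholder):
Code: Select all
# run from the directory containing the partially downloaded file,
# with the same URL as the original attempt
wget -c http://example.com/big-file.iso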
View 2 Replies
Feb 19, 2010
I have CentOS 5.4 32-bit (final) installed on my dedicated server.
I used to run lighttpd with PHP on this server and all was fine. But yesterday I switched to a website that needs Apache to run, so I installed Apache using the yum install httpd command.
Then I added the virtual host for my domain in the Webmin panel, but when I try to open my PHP script in a browser it does not serve the PHP pages.
Instead, it downloads the PHP files (like index.php) when I open them in the browser. So I guess Apache is not able to interpret and run PHP pages; only HTML pages are working right now.
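Apache serving the raw .php file usually means no PHP handler is loaded for it (lighttpd's PHP setup is not inherited by Apache). A hedged sketch of the usual CentOS 5 steps (package names per the stock repos, restart command per SysV init):
Code: Select all
yum install php php-mysql        # installs mod_php and drops /etc/httpd/conf.d/php.conf
service httpd restart
# then confirm with a test page
echo '<?php phpinfo(); ?>' > /var/www/html/info.php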
View 2 Replies
Sep 9, 2009
When I input man <cmd>, it says "Formatting page..." and then displays a blank page.
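One quick way to tell whether the pager or the formatter is at fault is to bypass the pager entirely: if the text appears, the pager is the problem; if not, it is the formatting side. A small diagnostic (ls is just an example page):
Code: Select all
man -P cat ls         # -P chooses the pager; cat prints straight to the terminal
MANPAGER=cat man ls   # same idea via the environment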
View 6 Replies
Dec 18, 2010
I have a problem with my printer, an HP Deskjet D1460. The printer is configured and works: when I send a file to print, it grabs a sheet of paper and starts to print, but the paper comes out just as blank as it went in.
View 2 Replies
Oct 11, 2010
I configured cron to clean my /tmp directory. Should I also add other locations to clean, especially /var/tmp?
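A common pattern is a single cron entry that ages out old files from both locations with find. A hedged sketch as an /etc/crontab entry (note the user field; the 10-day threshold is arbitrary, and Debian's standard boot scripts already clean /tmp on reboot):
Code: Select all
# delete regular files untouched for 10+ days in both temp directories
0 4 * * * root find /tmp /var/tmp -xdev -type f -mtime +10 -delete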
View 4 Replies
Nov 25, 2015
These are the command-line switches I am using:
Code: Select all
wget -p -k -e robots=off -U 'Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.1.6) Gecko/20070802 SeaMonkey/1.1.4' -r www.website.com
For some reason it seems to be downloading too much and taking forever for a small website. It seems that it was following a lot of the external links that the page linked to.
But when I tried:
Code: Select all
wget -E -H -k -K -p www.website.com
It downloaded too little. How much depth should I use with -r? I just want to download a bunch of recipes for offline viewing while staying in a Greek mountain village. Also, I don't want to be a prick and keep experimenting on other people's webpages.
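For a "grab this part of the site for offline reading" job, limiting recursion to the same host and directory and adding a polite delay usually lands between the two extremes above. A hedged sketch (depth 2 and the /recipes/ path are guesses to adjust; www.website.com is the placeholder from the post):
Code: Select all
wget -r -l 2 -np -p -k -E -w 2 --random-wait www.website.com/recipes/
# -np: don't climb to parent directories; -w 2 --random-wait: pause between requests;
# -k/-E: fix links and extensions for offline viewing; omitting -H keeps it off external hosts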
View 3 Replies
Feb 4, 2010
Is there any difference between apt-get clean and aptitude clean? Do they both remove the same caches? Are there any other commands I should know for cleaning up wasted space on my Ubuntu laptop?
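For reference, the cache-cleaning commands usually mentioned together (the first two operate on /var/cache/apt/archives; autoremove removes packages rather than cached files):
Code: Select all
sudo apt-get clean        # delete every cached .deb
sudo apt-get autoclean    # delete only cached .debs that can no longer be downloaded
sudo apt-get autoremove   # remove packages installed as dependencies that nothing needs anymore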
View 1 Replies
Feb 13, 2016
I have a fairly slow connection over a cell phone tether, and when it's not so slow it's metered and I only get 5 gigs a month of faster data (Straight Talk). It's a wonderful thing to be able to do apt-get update and apt-get upgrade every day but I can't afford the bandwidth. Even trying to upgrade once a month takes several days of downloading.
I never wanted bleeding-edge code; I'd rather have something stable. I also use OpenBSD, and I update that about every 2 years. I took everything mentioning testing out of my sources.list, but it's not clear what should be in there instead.
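For a stable-only system the list can be very short. A hedged sketch of a sources.list for the stable release at the time of the post (jessie; swap in whatever the current stable codename is):
Code: Select all
deb http://httpredir.debian.org/debian jessie main
deb http://security.debian.org/ jessie/updates main
deb http://httpredir.debian.org/debian jessie-updates main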
View 9 Replies
Jan 10, 2011
APT really has me mystified at times, so I'm looking for an idiot's guide on how to use it. I've googled and read the APT HOWTO on the Debian site, as well as a lot of other APT pages, so I understand what it does and the command structure, but I can't seem to download one-off packages from the Debian site.
I've managed to get updates to work (I ran an update the other day), so I know my sources file is working (my sources.list points to deb http://ftp.uk.debian.org/debian/ lenny main); however, I can't figure out how to get single packages from Debian.
As an example, I want to get the rsync package, which has a download page on the Debian site, and the mirror in my sources.list file is one that can be used. However, when I do apt-get rsynch I get an error message that says it can't be found.
Looking at the Debian package website, it does say the rsync package can be fetched from the pool/main/r/rsync/ subdirectory at any one of the listed download sites (and the site in my sources.list file is one of those listed). Do I have to add the pool/main/r/rsync/ path to my sources.list file, or add it to the apt-get command?
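For what it's worth, apt never needs the pool/... path in either place; it works that out from the package name. The command does need the install sub-command, and the package is spelled rsync (no h). A minimal sketch of the usual sequence, run as root:
Code: Select all
apt-get update
apt-get install rsync
# to fetch the .deb into the cache without installing it:
apt-get install --download-only rsync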
View 7 Replies
Jan 10, 2010
I have Debian on a VPS. I think the installation is FUBAR.
Can I completely reinstall Debian on it? I don't suppose I need to format the disk. Could I put some sort of network install in some special directory and run the install from that?
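One hedged possibility, if the provider offers no rescue or reimage option, is to bootstrap a fresh system into a spare directory or partition with debootstrap and then switch over to it; this is the "install from within a running system" approach and the details depend heavily on the VPS (the target path below is hypothetical, and lenny was stable at the time of the post):
Code: Select all
apt-get install debootstrap
debootstrap lenny /mnt/newroot http://ftp.uk.debian.org/debian/
# then chroot in, set up fstab/network/bootloader, and point the VPS at the new root
chroot /mnt/newroot /bin/bash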
View 2 Replies
Jun 4, 2010
This is Debian testing. I can't scroll down in man pages by typing 'j'; I have to hit <ENTER> to scroll down. Why is that?
And I can't scroll up in man pages at all.
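That behaviour (Enter scrolls, j does nothing, no scrolling back) is what the old more pager does, whereas less supports j/k and backward movement, so one guess worth testing is which pager man is actually using:
Code: Select all
echo $PAGER $MANPAGER     # see what is currently set
man -P less ls            # force less for one invocation
export PAGER=less         # make it the default for the session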
View 7 Replies
Jul 12, 2011
Everything was running fine in Debian Squeeze, but the last time I booted the system and connected to the Internet over the dial-up connection, the graphical browsers (Firefox and Epiphany) stopped working and could not open web pages; the error is "Problem loading page". Other clients such as nslookup, ping, FTP clients and the text browser lynx work fine.
View 4 Replies
Jan 8, 2016
I have installed the network controller driver (iwlwifi) in Debian Jessie and connected to the Wi-Fi, but I could not reach any websites. I installed the network tools and ran iwconfig; it does show a wlan0.
Then I opened resolv.conf, but there were two resolv.conf files in my /etc: one contains "search lan" and "nameserver 192.168.199.1", the other contains "nameserver 8.8.8.8" and "nameserver 8.8.4.4".
I changed the content of the first one to 8.8.8.8 and restarted networking, but I still could not open pages over the wireless network.
When I opened the first one again, it had changed back to 192.168.199.1, with an extra line "domain lan" above the line "search lan".
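resolv.conf being rewritten on every (re)connect is normal: the DHCP client regenerates it with whatever the router advertises. If the goal is to force Google's resolvers anyway, one hedged option is to override them at the dhclient level rather than editing the file by hand (path per a stock Jessie install):
Code: Select all
# add to /etc/dhcp/dhclient.conf, then bounce the interface (ifdown wlan0 && ifup wlan0)
supersede domain-name-servers 8.8.8.8, 8.8.4.4;
Whether that fixes browsing depends on whether DNS is the real problem; if pinging a raw IP such as 8.8.8.8 also fails, the issue is routing rather than name resolution.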
View 6 Replies
Aug 4, 2015
I have some scripts that need a newer version of PHP. I'm running Debian 6, which ships PHP 5.3.3, and I found I could install PHP 5.4 using [URL]. This worked and updated my PHP to a newer version; the only issue is that since the install completed, Apache now downloads the PHP file instead of rendering it.
I'm guessing this has something to do with the Apache configs, but I don't know what to do.
Code: Select all
tom@vps:~$ dpkg --list | grep -E '(apache)|(php5-)'
ii  apache2      2.2.16-6+squeeze12   Apache HTTP Server metapackage
ii  apache2-doc  2.2.16-6+squeeze12   Apache
[Code] ...
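Apache handing back the raw .php source usually means the PHP module was disabled or removed when the newer packages went in. A hedged sketch of the usual re-enable steps on Squeeze (package and module names assume the mod_php route rather than FastCGI):
Code: Select all
apt-get install --reinstall libapache2-mod-php5
a2enmod php5
service apache2 restart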
View 1 Replies
Sep 6, 2015
I was thinking of migrating my apt-mirror repository to the recommended ftpsync scripts: [URL] .....
I pre-populated my pool with the already-downloaded files and set up the scripts.
However, if I run bin/ftpsync and monitor rsync with lsof -p, I can see that it is still downloading files from oldstable (wheezy) despite the exclude options.
I'm guessing it's a configuration error, but I can't seem to figure it out. Any thoughts? My etc/ftpsync.conf is as follows:
Code: Select all
MIRRORNAME=`hostname -f`
TO="/server_storage/srv/mirrors/debian"
RSYNC_PATH="debian"
RSYNC_HOST=ftp.us.debian.org
LOGDIR="${BASEDIR}/log"
[Code] ....
Actually, I don't think it works like I thought it did. A few guides I found listed the exclude options, but the sample config file has this:
Code: Select all
## If you do want to exclude files from the mirror run, put --exclude statements here.
## See rsync(1) for the exact syntax, these are passed to rsync as written here.
## DO NOT TRY TO EXCLUDE ARCHITECTURES OR SUITES WITH THIS, IT WILL NOT WORK!
#EXCLUDE=""
So it looks like it doesn't exclude the suites at all.
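That matches ftpsync's design: it mirrors whole suites so the archive stays consistent, and only architectures can be filtered (ARCH_INCLUDE/ARCH_EXCLUDE in ftpsync.conf). If only certain suites are wanted, a partial-mirror tool is the usual alternative; a hedged sketch with debmirror (target path, architecture and suite list are placeholders):
Code: Select all
debmirror /server_storage/srv/mirrors/debian-partial \
  --host=ftp.us.debian.org --root=debian --method=http \
  --dist=jessie,jessie-updates --arch=amd64 --section=main,contrib,non-free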
View 5 Replies
Apr 24, 2011
In Windows, using Firefox, you can use the Backspace key to go back a page in the history. In Debian I try that and nothing happens. Does anyone know how to change that?
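On Linux builds of Firefox the Backspace shortcut is governed by a preference and is disabled by default, so one hedged fix is to flip it in about:config (or via user.js); 0 means "go back", while the Linux default reportedly means "do nothing":
Code: Select all
// about:config -> browser.backspace_action = 0, or add this line to user.js:
user_pref("browser.backspace_action", 0);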
View 10 Replies
May 21, 2011
I have a new Debian Linux 6.0 server installed, running Linux 2.6.38.3-linode32 on i686, with apache2 in front.
The master plan is to run Railo on this one, so I can continue my mad ColdFusion schemes to rule the world, so I got Tomcat set up with it, as instructed by some guy I met in a chat. He seemed very reliable.
I got the whole thing working and can view my regular HTML files and whatnot, but as soon as I try to run a .cfm file,
I get an error message on screen:
I have no idea whatsoever what to do next. This is my site with the HTML file: [url] and here is my test index.cfm file: [url]
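Without the actual error text this is only a guess, but the usual wiring for Railo behind Apache is a proxy rule that hands anything ending in .cfm/.cfc to the Tomcat connector; if that rule is missing or wrong, Apache tries to deal with the file itself. A hedged sketch for the Apache vhost (the port is a common Railo installer default and may differ on this box):
Code: Select all
# requires mod_proxy and mod_proxy_http to be enabled (a2enmod proxy proxy_http)
ProxyPassMatch ^/(.+\.cf[cm])(/.*)?$ http://127.0.0.1:8888/$1$2
ProxyPassReverse / http://127.0.0.1:8888/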
View 1 Replies
Nov 20, 2010
Did a clean minimal install of Testing in a virtual machine (VirtualBox). Logged in as root and typed "shutdown now". It starts shutting down, then says:
INIT: Sending processes the TERM signal
Give root password for maintenance (or type Control-D to continue):
If I press Control-D it goes back to a login prompt. Okay, maybe I am missing a point, since "shutdown -h now" gives the expected behaviour. Call me old-fashioned, but I think a "shutdown now" should shut down a system, and not effectively reboot it; there is a reboot command for that.
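For reference, on sysvinit shutdown without -h or -r does not halt; it switches the system to single-user (maintenance) mode, which is exactly the password prompt seen above. The variants, hedged only in that later systemd-based releases behave differently:
Code: Select all
shutdown -h now   # halt / power off
shutdown -r now   # reboot
shutdown now      # sysvinit: drop to single-user mode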
View 3 Replies
Dec 10, 2010
I want to try a cleanly compiled Linux kernel on a system, to see what I can do with just a clean kernel. If I take a freshly partitioned HDD, put only GRUB and the Linux kernel on it and boot it up, what do I get? Can I enter commands like ls?
View 5 Replies
Dec 15, 2010
With Ubuntu it was possible to clean up packages with the commands
sudo apt-get autoremove
sudo apt-get autoclean
All unnecessary packages were removed: packages that no longer had anything depending on them.
Can this also be done under Debian?
View 12 Replies
Jul 27, 2011
I have just upgraded my Lenny box to Squeeze. I did it by clean-installing Squeeze. The installation was successful, but I just noticed that I had forgotten to back up some important files I had on this machine before the installation...
Now, is there any way to recover those files?
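Recovery odds after a reinstall are poor, since the new system has probably overwritten much of the old data, but if the files lived on an ext3/ext4 partition that has not been written to much since, carving tools are worth a try from a live CD, ideally against an image of the disk rather than the disk itself (device names below are hypothetical):
Code: Select all
apt-get install extundelete testdisk
# work on a copy, never the original partition
dd if=/dev/sda1 of=/mnt/external/sda1.img bs=4M
extundelete sda1.img --restore-all   # attempts recovery into a RECOVERED_FILES/ directory
photorec sda1.img                    # file-carving alternative from the testdisk package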
View 2 Replies
Mar 26, 2010
The purpose of this topic is to find out whether there is any way to totally clean a Debian system and make it like a freshly installed system (of course I am not referring to packages, because aptitude handles those just fine).
View 3 Replies