Server :: Wget Indirect Link Downloading?
Jan 29, 2011
I'm trying to download phpMyAdmin from SourceForge => http://sourceforge.net/projects/phpm...r.bz2/download . I'm using the wget command followed by the direct link from the page. All I get is some irrelevant file that has nothing in common with phpMyAdmin-3.3.9-all-languages.tar.bz2. The direct link is meant for clients with web browsers, where it triggers an automatic download to the user's desktop, but I need to download the package to a server. What is the wget option to get the file from this kind of link?
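One thing worth sketching out: SourceForge's /download links serve a redirect page meant for browsers, so pointing wget at the mirror's direct file URL usually works better. The path below is only a guess at the layout, not the project's real URL, and --content-disposition simply keeps the filename the server suggests:
Code:
# use the mirror's direct file URL instead of the /download landing page
wget --content-disposition "http://downloads.sourceforge.net/project/phpmyadmin/phpMyAdmin/3.3.9/phpMyAdmin-3.3.9-all-languages.tar.bz2"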
View 1 Replies
Oct 30, 2010
I want to download pages as they appear when visited normally in a browser. For example, I tried this on Yahoo, and here is part of the file I got:
[Code].....
But I just want the normal text, and nothing else...
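A minimal sketch of one common approach: fetch the page with wget and strip the markup with a text-mode browser dump. This assumes lynx is installed; html2text or w3m -dump would work the same way:
Code:
# fetch the page, then dump it as plain text
wget -q -O page.html http://www.yahoo.com/
lynx -dump page.html > page.txt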
View 1 Replies
View Related
Jul 19, 2011
Example: [url]
This is what I tried: wget -A rar [-r [-l 1]] <mediafireurl>
That is to say, I tried with and without the recursive option. It ends up downloading an HTML page of a few KB in size, while what I want is a RAR file in the 90-100 MB range.
What happens with MediaFire, for those who may not be aware, is that it first says
Processing Download Request...
This text after a second or so turns into the download link and reads
Click here to start download..
How do I write a proper script for this situation?
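A rough sketch of the usual approach, offered only as a starting point: fetch the MediaFire page first, pull the real file URL out of its HTML, then wget that. The grep pattern below is a guess at how the link appears and would need adjusting to the page's actual markup; if the link is assembled by JavaScript rather than present in the HTML, wget alone won't see it.
Code:
# 1) grab the landing page
wget -q -O page.html "<mediafireurl>"
# 2) extract the first direct .rar link (pattern is a guess)
link=$(grep -o 'http://[^"]*\.rar' page.html | head -n 1)
# 3) download the real file
wget "$link"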
View 1 Replies
View Related
Dec 29, 2010
I'm trying to have wget retrieve the pics from a list of saved URLs. I have a list of Facebook profiles from which I need the main profile picture saved. When I pull one up in my browser I see everything just fine; however, when I run the included wget command reading in a file (or even manually specifying a page to download), what I receive is the HTML file with everything intact minus the main photo of the page (that page's user picture). I believe I need the -A switch, but I think that is what is causing the issues (because the page is not a .jpg, it's getting deleted).
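One thing to try, as a sketch only: -p (--page-requisites) tells wget to also fetch the images a page displays, -H lets it follow them onto another host (profile photos are usually served from a separate image host), and putting html in the -A list keeps the page itself from being rejected and deleted before its picture is fetched. Facebook may still refuse non-browser clients, so this only covers the wget side; url_list.txt stands in for the saved list of profile URLs:
Code:
# fetch each page plus the images it displays, keeping html and jpg
wget -p -H -A "html,jpg,jpeg" -i url_list.txt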
View 1 Replies
View Related
Feb 5, 2010
I am Vijaya, glad to meet you all via this forum. My question: I set up a crontab for automatic downloading of files from the internet using wget, but when I left it running, several processes for the same download were running in the background. My concern is to get only one copy, not many copies of the same file, and I am not able to find out where it's actually downloading to.
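A minimal sketch of two guards that are often combined for this: -N makes wget fetch a file only when the remote copy is newer than the one already on disk, -P pins the download directory, and a simple lock file stops overlapping cron runs from starting a second copy. The paths and URL below are placeholders:
Code:
#!/bin/sh
# skip this run entirely if a previous one is still going
[ -e /tmp/mydownload.lock ] && exit 0
touch /tmp/mydownload.lock
# -N: only download if the remote file is newer; -P: fixed download dir
wget -N -P /home/vijaya/downloads http://example.com/file.zip
rm -f /tmp/mydownload.lock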
View 1 Replies
View Related
Jul 28, 2009
What I'm trying to do is wget images; however, I'm not sure how to do it 100% right. What I've got is an index.html page with thumbnail images that link to the full-size images. How do I grab the full-size images?
Example of links on the page:
<a href="images/*random numbers*.jpg" target="_blank"><img border=0 width=112 height=150 src="images/tn_*random numbers*.jpg" style="position:relative;left:3px;top:3px" /></a>
I tried:
wget -A.jpg -r -l1 -np URLHERE
but only got the thumbs.
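A sketch of something that might help, under the assumption that the thumbnails all carry the tn_ prefix shown above: keep the recursion, explicitly reject the thumbnail names, and allow one extra level of depth in case the full-size images are only reached through intermediate pages:
Code:
# accept jpg, reject the tn_*.jpg thumbnails, recurse a little deeper
wget -r -l2 -np -A "*.jpg" -R "tn_*.jpg" URLHERE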
View 1 Replies
View Related
Jun 8, 2010
I have a damn TP-LINK WG422G USB adapter that I love. I installed the latest compat-wireless driver package and used the latest ar9271 firmware, and after that it worked properly. There is no problem when surfing the net; the problem usually occurs when downloading something via torrent, and sometimes from websites. I start downloading and then the USB adapter suddenly freezes. If I remove and re-plug the USB adapter it works again.
View 1 Replies
View Related
Sep 8, 2010
How can we create a soft link and a hard link in RHEL5? When I use the command it gives a format error.
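A minimal sketch of the standard ln syntax; the file names are placeholders:
Code:
# hard link: a second name pointing at the same inode
ln /path/to/original.txt hardlink.txt
# soft (symbolic) link: a small file that stores the path to the target
ln -s /path/to/original.txt softlink.txt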
View 6 Replies
View Related
Sep 24, 2010
I just checked that APTOnCD is a .deb package. The internet connection I have is on OpenSuse 11.2, so I can't install APTOnCD on OpenSuse, can I?
I have to create a repository for LinuxMint (which is Ubuntu based) on a CD/DVD.
What is the way out?
View 13 Replies
View Related
Feb 3, 2011
We have 2 servers: one is the webserver and the other is the MySQL server.
When transferring a 2GB file from the webserver to the MySQL server, the webserver's connection to the MySQL DB server dies completely.
We need to restart the MySQL process in order for it to come back online.
During this connection downtime, using phpMyAdmin on the MySQL server shows no problem running queries etc.
View 2 Replies
View Related
Jun 19, 2011
If a wget download is interrupted (like if I have to shut down prematurely), I get a wget.log with the partial download. How can I later resume the download using the data in wget.log? I have searched high and low (including the wget manual) and cannot find how to do this. Is it so obvious that I did not see it? The wget -c option with wget.log as the argument to -c does not work. What I do instead is open wget.log, copy the URL, paste it into the command line and do another wget. This works, but the download is started from the beginning, which means nothing in wget.log is used.
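For what it's worth, -c resumes from the partially downloaded file on disk rather than from the log, so the log isn't needed at all. A minimal sketch, assuming the partial file is still in the working directory and the URL (a placeholder here) is supplied again:
Code:
# re-run in the same directory with the same URL; -c makes wget
# continue from the size of the existing partial file
wget -c http://example.com/path/to/file.iso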
View 2 Replies
View Related
Feb 20, 2011
What is indirect rendering and what sort of implications does it have for graphics performance? Also, is it a Linux-specific term or can it be used in the context of other operating systems?
View 2 Replies
View Related
Dec 19, 2010
I am trying to copy a directory and everything in it from one server to another. No matter what I type into scp, it just gives me back:
usage: scp [-1246BCpqrv] [-c cipher] [-F ssh_config] [-i identity_file]
[-l limit] [-o ssh_option] [-P port] [-S program]
I tried: scp -r -P 1133 root@XX.XX.XX.XX:/home/images
Shouldn't that recursively copy /home/images from the server XX.XX.XX.XX through SSH on port 1133? Btw, I know you can do it with tar or just a regular FTP program. The folder I am trying to copy is 40 GB; there isn't enough free space to make a tar (if the server would even do it).
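scp prints that usage message when it can't parse both a source and a destination, and the command above only names the remote source. A minimal sketch, assuming the files should land in the current directory on the local machine:
Code:
# remote source and local destination (here ".") must both be given
scp -r -P 1133 root@XX.XX.XX.XX:/home/images .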
View 6 Replies
View Related
Apr 13, 2016
I have Debian 8.4.0 with gnome gdm3 and NVIDIA GeForce 8400 GS. When I run some applications the following message appears:
Code:
name of display: :0
X Error of failed request: BadValue (integer parameter out of range for operation)
Major opcode of failed request: 155 (GLX)
Minor opcode of failed request: 24 (X_GLXCreateNewContext)
Value in failed request: 0x0
Serial number of failed request: 33
Current serial number in output stream: 34
Previously, I had lightdm instead of gdm3 and I fixed this problem by allowing the indirect GLX protocol, simply setting xserver-command=X -core +iglx in /usr/share/lightdm/lightdm.conf.d/50-xserver-command.conf... Now I can't find how to allow the indirect GLX protocol. I have tried the following:
Set Option "AllowIndirectGLXProtocol" "On" in xorg.conf Set serverargs="+iglx" in /usr/bin/startx Set exec /usr/bin/X in /etc/X11/xinit/xserverrc
Despite all this, the error message persists and Xorg.0.log indicates that indirect GLX is disabled (line 19.477):
Code:
[ 17.604]
X.Org X Server 1.16.4
Release Date: 2014-12-20
[ 17.604] X Protocol Version 11, Revision 0
[ 17.604] Build Operating System: Linux 3.16.0-4-amd64 x86_64 Debian
[ 17.604] Current Operating System: Linux federueda-pc 3.16.0-4-amd64 #1 SMP Debian 3.16.7-ckt25-1 (2016-03-06) x86_64
[ 17.604] Kernel command line: BOOT_IMAGE=/boot/vmlinuz-3.16.0-4-amd64 root=UUID=4156658f-15d7-43ae-b0ff-2dfc7484c3ab ro quiet
[code]...
In short, I can't find a way to enable indirect GLX with gdm3.
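For reference, and only as a sketch: AllowIndirectGLXProtocol is an option of the proprietary NVIDIA driver, so it belongs inside the Screen section of xorg.conf rather than at the top level. The identifiers below are placeholders, and this assumes the NVIDIA driver in use actually honors the option:
Code:
Section "Screen"
    Identifier "Screen0"
    Device     "Device0"
    # NVIDIA driver option to permit indirect GLX clients
    Option     "AllowIndirectGLXProtocol" "on"
EndSection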
View 1 Replies
View Related
Aug 4, 2010
I'm trying to use wget on an Ubuntu 10.04 server, 64-bit, with 16GB RAM and 1.1 TB of free disk space. It exits with the message "wget: memory exhausted". I'm trying to download 1MB of some sites. After various tries, this is the command I'm using:
Code:
wget -r -x -Q1m -R "jpg,gif,jpeg,png" -U Mozilla http://www.onesite.com
(I only need the HTML documents, but if I run it with the -A option only the first page is downloaded, so I changed to -R.)
This happens with wget version 1.12. I've tried the same command on other computers with less RAM and disk space (Ubuntu 8.04, wget 1.10.2) and it works just fine.
View 1 Replies
View Related
Oct 13, 2010
I can't update PHP on my RHEL Server 5.1 using yum. Another site told me to use wget to solve this problem. How do I use wget to upgrade PHP? This is my first time handling a Linux server.
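A minimal sketch of what that advice usually amounts to: download the PHP RPMs with wget and install them with rpm. The URL and package name below are placeholders, not a real repository path, and the packages would have to match the server's RHEL 5.1 dependencies:
Code:
# fetch a PHP RPM (placeholder URL) and upgrade the installed package
wget http://example.com/repo/php-5.x.x-1.el5.x86_64.rpm
rpm -Uvh php-5.x.x-1.el5.x86_64.rpm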
View 9 Replies
View Related
Oct 2, 2010
I have multiple distros that I chainload, and I have installed a grub2 shell to the partition and can boot manually. I cannot seem to get a grub.cfg file to work. Is there a directory that needs to be built for this file?
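As a sketch of the usual layout: grub2 looks for grub.cfg in the grub directory on the partition it was installed to, and the file is normally generated rather than written by hand. The paths below assume a standard /boot/grub install and a distro whose tool is named grub-mkconfig (some call it grub2-mkconfig):
Code:
# grub.cfg normally sits alongside the other grub2 files
ls /boot/grub/
# generate it from the detected OSes and the /etc/grub.d/ scripts
grub-mkconfig -o /boot/grub/grub.cfg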
View 9 Replies
View Related
Feb 23, 2010
I've got a situation. I'm running GNU bash version 3.00.16(1) on Solaris 10. I need to declare an array, say arr1, which will be populated by the output of a command.
declare -a arr1
arr1=( $(/some/command) )
Supposing it will eventually (after executing the command) have these element values:
arr1[0]=1234
arr1[1]=5678
arr1[2]=7890
Now, I need to declare another set of arrays, one for each of the element values above - e.g.
declare -a arr1_1234
declare -a arr1_5678
declare -a arr1_7890
And I also need to populate the elements of each of the above 3 arrays with the output of another command in a loop. So these arrays will hold values something like:
arr1_1234[0]="abc"
arr1_1234[1]="def"
arr1_1234[2]="ghi"
arr1_5678[0]="jkl"
arr1_5678[1]="mno"
arr1_5678[2]="pqr"
arr1_7890[0]="tuv"
arr1_7890[1]="xyz"
arr1_7890[2]="aab"
I'm able to declare and populate arr1[*]. My question is: how do I declare, populate and print the subsequent arrays and their elements? I am feeling rather thick about getting this to work.
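A minimal sketch using eval, since bash 3.00 has neither associative arrays nor name references; /some/command is taken from above and /other/command is a placeholder for the second command:
Code:
declare -a arr1
arr1=( $(/some/command) )

# create and fill one arr1_<id> array per element of arr1
for id in "${arr1[@]}"; do
    eval "arr1_${id}=( \$(/other/command \"${id}\") )"
done

# print each derived array with its elements
for id in "${arr1[@]}"; do
    eval "echo arr1_${id}: \"\${arr1_${id}[*]}\""
done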
View 7 Replies
View Related
Jan 12, 2011
I need to write a script that executes a wget. The difficulty is, if wget just sits there with no reply for a very long time, I don't want the script to run that long.
How can I make the wget command in the script time out if it doesn't give me a reply within a minute or so?
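A minimal sketch using wget's own timeout and retry options (the URL is a placeholder):
Code:
# give up if there is no response within 60 seconds, and don't retry
wget --timeout=60 --tries=1 http://example.com/file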
View 3 Replies
View Related
Jul 12, 2010
There is a partnering website that provides an RSS feed to display on the website I am working on. The website fetches the feed every time a user accesses it, and the feed changes almost every day. For bandwidth and speed reasons, I would like the server to download the feed once via a crontab job (my website is in a Linux shared hosting environment). The problem lies with the URL structure, which I have no control over.
Here is the URL:
Code:
[code]....
I am aware that there are characters that need escaping, and this is where I am getting my errors. I have never written a shell script, but I am also assuming some of the characters are keywords in shell scripting or in Linux. I am also aware that I can avoid having to escape them by enclosing the URL in single or double quotes. You will notice that the URL has BOTH single and double quotes, so it's not that simple.
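A minimal sketch of one way to quote a URL containing both kinds of quotes: wrap the whole thing in single quotes and splice each literal single quote back in as '\''. The URL below is made up, since the real one isn't shown; percent-encoding the quotes as %22 and %27 is another option:
Code:
# single quotes protect everything; each embedded ' becomes '\''
wget -O /path/to/feed.xml 'http://example.com/feed?a="x"&b='\''y'\''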
View 1 Replies
View Related
Feb 18, 2011
I wish to download a webpage, which is secured by a username and password, using wget. The thing is, there are many forms on that page and I don't know how to tell wget which one it should send the parameters to (by the POST method). This is as far as I have got:
wget --post-data="predmet=xxx" --http-user="yyy" --http-password="zzz" [URL]
It gets through the authentication but it will not submit the form.
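For what it's worth, wget doesn't pick a form at all; it just POSTs the given string to the given URL. So the request has to target the form's own action URL and carry every field the form expects (hidden fields and the submit button included), all read out of the page's HTML. A sketch under that assumption, with made-up action path and field names:
Code:
# POST directly to the form's action URL with all of its fields
wget --http-user="yyy" --http-password="zzz" \
     --post-data="predmet=xxx&hidden_field=value&submit=Send" \
     http://example.com/path/form_action.php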
View 3 Replies
View Related
Sep 23, 2009
I'm debugging some kind of SOAP problem. I can use wget for any domain I want, except domains that are hosted on the server itself.
What could it be? The CentOS firewall and SELinux are turned off.
(domains / ips removed from code)
[root@http1 ~]# wget http://www.google.com
--12:09:53-- http://www.google.com/
Resolving www.google.com... 74.125.93.103, 74.125.93.104, 74.125.93.105, ...
Connecting to www.google.com|74.125.93.103|:80... connected.
[Code].....
View 4 Replies
View Related
Jun 24, 2011
I am using an FC9 server. I installed the Apache web server and kept a data file in my html folder. When I try to download the file remotely through a web browser I can download it, but when I try to get the file remotely through the wget command I am unable to get it; it fails with "Connection timed out" and keeps retrying. Below are the steps I tried:
my target file is http://X.X.X.X/test.zip
wget -T 0 http://X.X.X.X/test.zip
wget http://X.X.X.X/test.zip
[code]...
View 1 Replies
View Related
Jun 29, 2010
I use the
Code:
wget -r -A <extension> <site>
command to download all files from a certain site. This time I already have some of the files downloaded and listed in a text file via
Code:
ls > <text file name>
How can I make wget download from the site I want but ignore the filenames listed in the text file?
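A minimal sketch of one way: turn the list of already-downloaded names into wget's -R (reject) list, which takes a comma-separated set of file names or patterns. Here files.txt stands in for the text file, and <extension>/<site> are the same placeholders as above. Alternatively, if the existing files are already sitting in the target directory, -nc (no-clobber) makes wget skip anything already on disk:
Code:
# build a comma-separated reject list from the file of existing names
reject_list=$(paste -sd, files.txt)
wget -r -A <extension> -R "$reject_list" <site>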
View 2 Replies
View Related
Jul 9, 2010
Hi everyone, has anybody experienced this? I have an Ubuntu server with PHP5 and Apache2. Every time I visit my website the script is not executed by Apache; instead the browser downloads my script with the application/x-httpd-php file type. How can I fix this?
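That symptom usually means Apache has no PHP handler active. A minimal sketch of the usual fix on an Ubuntu Apache2/PHP5 setup, assuming mod_php is the wanted handler:
Code:
# install and enable the Apache PHP5 module, then restart Apache
sudo apt-get install libapache2-mod-php5
sudo a2enmod php5
sudo /etc/init.d/apache2 restart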
View 5 Replies
View Related
Jun 26, 2009
I want to install Ubuntu on client machines. I tried to install using an Apache server. I installed that fine, it is working well, and I tested it. I did every configuration as in this link [url]
But when I give it the image server IP address, it shows a message saying that the release file cannot be downloaded.
I don't know why I'm getting this error.
In that link there is an image called the netboot installer. I booted from that .iso; am I correct, or did I misunderstand that part?
View 2 Replies
View Related
Mar 24, 2011
Since yesterday Kmail has started downloading ALL messages from the server, not just the new ones. It is a real pain, as the mail check takes about twenty minutes and Kmail usually freezes afterwards. Then I have to close it down, and after a restart it downloads several thousand messages all over again. This has only started happening since yesterday's Fedora update, which I didn't think had anything to do with Kmail?
I am using Fedora 14.
EDIT: I have just noticed it is not downloading ALL messages, just the read ones. The new messages are being left on the server, but the old ones are downloaded and flagged as new. So it seems as if something has got flipped around somewhere?
View 7 Replies
View Related
Jan 3, 2010
When I download fast and a lot, my server freezes. If I don't download anything the server can run for weeks without problems, but as soon as I start downloading fast it doesn't take long until the server freezes again. I mean totally frozen, so I have to manually power it down and then turn it on again. This started happening recently; I think it started after I updated to 9.10, and it had always worked perfectly before. Is there any way to fix this? If not, is there a way to downgrade back to 9.04 or 8.10, where everything was working fine?
View 2 Replies
View Related
Mar 17, 2010
I have an Ubuntu 9.10 server set up at my house with Apache2 and PHP5 installed on it. Every time I go to the server in a web browser and try to load the PHP index page, it downloads instead of displaying.
I have virtual servers set up and have the files stored at /home/cusinndzl. If anyone needs to take a look I can let them into the webmin panel.
View 4 Replies
View Related
May 22, 2010
I need to ask how I can make some fetchmail jobs download mail from my Gmail account into my mail account on my own server. My server is:
Linux Centos 5.4
Postfix Mail Server
I made a file named .fetchmailrc in my home directory and set its permissions to 755, and the content of this file is:
set postmaster "postmaster"
set bouncemail
set no spambounce
set properties ""
poll imap.googlemail.com:993 protocol imap username "username@gmail.com" password "password"
I think something is not complete; I need to make it fetch for a specific user only, not all users.
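A minimal .fetchmailrc sketch for routing one Gmail account to a single local user via the "is ... here" clause; "localuser" is a placeholder. Note also that fetchmail expects the rc file not to be readable by others (chmod 600 rather than 755), and Gmail's IMAP requires SSL:
Code:
set postmaster "postmaster"
set no spambounce
poll imap.googlemail.com protocol imap
    username "username@gmail.com" password "password"
    is "localuser" here
    ssl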
View 7 Replies
View Related