Server :: How To Upgrade PHP Using Wget Method
Oct 13, 2010
I can't update PHP on my RHEL Server 5.1 using yum. Another site told me to use wget to solve this problem. How do I use wget to upgrade PHP? This is my first time handling a Linux server..
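For reference, the usual wget-based route (a sketch only; the URL and package version below are placeholders, and the RPMs must be built for RHEL 5) is to fetch the packages and install them with rpm:
Code:
# download a PHP RPM built for RHEL 5 (URL is a placeholder), then
# install/upgrade it; dependent packages (php-cli, php-common, etc.)
# may need to be fetched and installed the same way
wget http://example.com/repo/php-5.2.17-1.el5.x86_64.rpm
rpm -Uvh php-5.2.17-1.el5.x86_64.rpm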
View 9 Replies
Jul 26, 2010
I have executed the command
Code:
sudo wget -r -Nc -mk [URL]
(referring to the site: [URL]) to download an entire website with wget for offline viewing on Linux.
In the middle of the download I shut down my laptop (the download was not finished), and when I started it again I executed the same command to continue downloading, but I got this error:
Code:
test@test-laptop:/data/Applications/sites/googlejam$ sudo wget -r -Nc -mk [URL]
--2010-07-25 19:41:32-- [URL]
Resolving code.google.com... 209.85.227.100, 209.85.227.101, 209.85.227.102, ...
Connecting to code.google.com|209.85.227.100|:80... connected.
HTTP request sent, awaiting response... 405 Method Not Allowed
2010-07-25 19:41:33 ERROR 405: Method Not Allowed.
Converted 0 files in 0 seconds.
test@test-laptop:/data/Applications/sites/googlejam$
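A possible workaround (my assumption, not confirmed in the thread): -m already implies -r and -N, and -N (timestamping) mixes poorly with -c (continue), so dropping -N and resuming with -c alone may behave better:
Code:
# resume an interrupted recursive fetch; -c continues partial files,
# -k rewrites links for offline viewing (URL is a placeholder)
sudo wget -r -c -k http://example.com/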
View 8 Replies
Feb 19, 2011
I would like to download a webpage using wget that requires a form submission (POST method) before it appears. I can do that with this command:
wget --post-data="computer=hosts&style=list" http://www.example.com
The problem is there is more than one form on the requested page and I don't know how to tell wget which one it should POST the data to.
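For what it's worth, wget does not parse HTML forms at all; --post-data is simply sent to whatever URL you give it. So the approach (a sketch; the action path here is hypothetical) is to find the action attribute of the specific form in the page source and POST directly to that URL:
Code:
# POST the chosen form's fields straight to that form's action URL
# (the cgi path is a made-up example)
wget --post-data="computer=hosts&style=list" http://www.example.com/cgi-bin/hostlist.cgi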
View 3 Replies
Feb 13, 2010
1. Make sure you have the Sun VirtualBox repository installed and enabled. If not, you can download this file [URL] and copy it into the directory /etc/zypp/repos.d on your system.
2. Refresh the repository (only needed if autorefresh is turned off for this repo):
(sudo) zypper ref virtualbox
3. Upgrade to the latest Sun VirtualBox:
(sudo) zypper up -r virtualbox
4. Recompile the VirtualBox kernel module:
(sudo) service vboxdrv setup
Done! No need to reboot.
Code:
# rpm -qa | grep VirtualBox
before -> VirtualBox-3.1-3.1.2_56127_openSUSE111-1.x86_64
after -> VirtualBox-3.1-3.1.4_57640_openSUSE111-1.x86_64
View 5 Replies
Sep 18, 2010
I will be version-upgrading a friend's (Ubuntu-only) laptop very soon. It is 9.04 now and the new version will ideally be 10.04.1. The machine has a large unused area on the hard drive, and my plan is to use that unallocated space for a complete fresh install of 10.04.1, leaving the 9.04 system unchanged (useful insurance). Then copy the /home directory from 9.04 into the newly installed 10.04.1, overwriting the installed directory.
Opinions seem to support the notion that such a copy into 10.04.1 is likely to be successful and trouble-free as long as the 10.04.1 install username is the same as the 9.04 username with the same privilege level. I would be grateful for comments here, particularly any details or gotchas you can see.
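A minimal sketch of the copy step, assuming the 9.04 partition is mounted at /mnt/old and the username matches (both are assumptions):
Code:
# -a preserves permissions, ownership, and timestamps; the trailing
# slashes copy the contents of the old home into the new one
sudo rsync -a /mnt/old/home/username/ /home/username/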
View 1 Replies
Feb 3, 2011
We have 2 servers, 1 is the webserver and the other is the Mysql server.
When transferring a 2GB file from the webserver to the MySQL server, the webserver's connection to the MySQL DB server dies completely. We need to restart the MySQL process for it to come back online. During this connection downtime, phpMyAdmin running on the MySQL server itself shows no problem running queries etc.
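A first thing worth checking (purely my assumption; the thread doesn't say what kind of transfer this is): if the 2GB file is a large SQL import, an undersized max_allowed_packet can drop connections mid-transfer:
Code:
# check the packet ceiling on the DB server (host name is a placeholder)
mysql -h db-host -u root -p -e "SHOW VARIABLES LIKE 'max_allowed_packet';"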
View 2 Replies
Jun 19, 2011
If a wget download is interrupted (like if I have to shut down prematurely), I get a wget.log with the partial download transcript. How can I later resume the download using the data in wget.log? I have searched high and low (including the wget manual) and cannot find how to do this. Is it so obvious that I did not see it? The wget -c option with wget.log as the argument to -c does not work. What I do instead is open wget.log, copy the URL, paste it into the command line, and run another wget. This works, but the download starts from the beginning, which means nothing in wget.log is used.
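For reference (my reading of wget's behavior, not an answer from the thread): -c takes no argument at all; it resumes from the partially downloaded file itself, which must still be in the current directory, and you supply the same URL again:
Code:
# -c continues the existing partial file; wget.log is only a transcript
wget -c http://example.com/bigfile.iso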
View 2 Replies
Dec 19, 2010
I am trying to copy a directory and everything in it from one server to another. No matter what I type into scp, it just gives me back:
usage: scp [-1246BCpqrv] [-c cipher] [-F ssh_config] [-i identity_file]
[-l limit] [-o ssh_option] [-P port] [-S program]
I tried: scp -r -P 1133 root@XX.XX.XX.XX:/home/images
Shouldn't that recursively copy /home/images from the server XX.XX.XX.XX through SSH on port 1133? By the way, I know you can do it with tar or a regular FTP program. The folder I am trying to copy is 40 GB; there isn't enough free space to make a tar (if the server could even manage it).
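The likely fix (a standard scp detail, though the thread's replies aren't shown): scp needs an explicit destination as well as a source, and omitting it triggers exactly that usage message:
Code:
# '.' names the current local directory as the destination
scp -r -P 1133 root@XX.XX.XX.XX:/home/images .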
View 6 Replies
Sep 17, 2010
What is the best method for backing up a VPS server? (A guest instance). I'm assuming you can't copy the image file while the VM is active. And if you stop the VM you have downtime.
View 3 Replies
Aug 4, 2010
I'm trying to use wget on an Ubuntu 10.04 server (64-bit, 16GB RAM, 1.1TB of free disk space). It exits with the message "wget: memory exhausted". I'm only trying to download 1MB from some sites. After various tries, this is the command I'm using:
Code:
wget -r -x -Q1m -R "jpg,gif,jpeg,png" -U Mozilla http://www.onesite.com
(I only need the HTML documents, but if I run with the -A option only the first page is downloaded, so I changed to -R.)
This happens with wget version 1.12. I've tried the same command on other computers with less RAM and disk space (Ubuntu 8.04, wget 1.10.2) and it works just fine.
View 1 Replies
Jan 12, 2011
I need to write a script that executes a wget. The difficulty is that if wget just sits there with no reply for a very long time, I don't want the script to run that long.
How can I time out the wget command in the script if it doesn't give me a reply within a minute or so?
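Two standard options (general tools, not from the thread's replies; timeout(1) assumes GNU coreutils is installed):
Code:
# cap total runtime at 60 seconds no matter what wget is doing
timeout 60 wget http://example.com/file
# or use wget's own timers: give up each attempt after 60s, try once
wget --timeout=60 --tries=1 http://example.com/file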
View 3 Replies
Jul 12, 2010
There is a partnering website that provides an RSS feed to display on the website I am working on. The website fetches the feed every time a user accesses it, and the feed changes almost every day. For bandwidth and speed reasons, I would like the server to download the feed once via a crontab job (my website is in a Linux shared-hosting environment). The problem is the URL structure, which I have no control over.
Here is the URL:
Code:
[code]....
I am aware that there are characters that need escaping, and this is where I am getting my errors. I have never written a shell script, but I am also assuming some of the characters are special to the shell or to Linux. I am aware that I can avoid having to escape them by enclosing the URL in single or double quotes; you will notice, though, that the URL has BOTH single and double quotes, so it's not that simple.
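The original URL is elided above, so as a general pattern only (the URL here is made up): wrap the whole thing in double quotes and backslash-escape any embedded double quotes; single quotes inside double quotes are literal:
Code:
# hypothetical URL containing both quote types: inside double quotes,
# single quotes need no escaping and embedded double quotes take a backslash
wget "http://example.com/feed?tag='daily'&title=\"news\""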
View 1 Replies
Jan 29, 2011
I'm trying to download phpMyAdmin from SourceForge => http://sourceforge.net/projects/phpm...r.bz2/download. I'm using the wget command followed by the direct link from the page. All I get is some irrelevant file that has nothing in common with phpMyAdmin-3.3.9-all-languages.tar.bz2. The direct link is for clients with web browsers that trigger an automatic download to the user's desktop, but I need to download the package to a server. What wget option gets the file from this kind of link?
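My understanding (an assumption about SourceForge's setup, not an answer from the thread): the /download URL serves an HTML landing page rather than the archive, so fetching from the mirror host directly usually works; the exact path below is illustrative:
Code:
# fetch the archive from the download host instead of the landing page
wget http://downloads.sourceforge.net/project/phpmyadmin/phpMyAdmin/3.3.9/phpMyAdmin-3.3.9-all-languages.tar.bz2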
View 1 Replies
Feb 18, 2011
I wish to download a webpage that is secured by a username and password, using wget. The thing is, there are many forms on that page and I don't know which one wget should send the parameters to (by POST). I have solved it this far:
wget --post-data="predmet=xxx" --http-user="yyy" --http-password="zzz" [URL]
It gets through the authentication, but it will not submit the form.
View 3 Replies
Sep 23, 2009
I'm debugging some kind of SOAP problem. I can use wget for any domain I want, except domains that are hosted on the server itself.
What could it be? The CentOS firewall and SELinux are turned off.
(domains / ips removed from code)
[root@http1 ~]# wget http://www.google.com
--12:09:53-- http://www.google.com/
Resolving www.google.com... 74.125.93.103, 74.125.93.104, 74.125.93.105, ...
Connecting to www.google.com|74.125.93.103|:80... connected.
[Code].....
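A common culprit in this situation (my guess; the thread's log is truncated above): the server's own domains resolve to a public IP it cannot reach from inside (e.g., hairpin NAT). Pointing the name at the local machine is a quick test:
Code:
# quick test: make the box resolve its own domain locally, then retry
echo "127.0.0.1 www.exampledomain.com" >> /etc/hosts
wget http://www.exampledomain.com/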
View 4 Replies
Mar 29, 2010
1. What method is used?
2. Does it require a separate machine?
3. Will the backup server never crash?
View 8 Replies
Feb 22, 2010
Is setting up NFS the best method for mounting shared drives between Mac and Linux?
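For reference, a minimal NFS export on the Linux side (the path and subnet are placeholders; macOS can then mount it with its built-in NFS client):
Code:
# share /srv/share read-write with the local subnet, then reload exports
echo "/srv/share 192.168.1.0/24(rw,sync)" | sudo tee -a /etc/exports
sudo exportfs -ra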
View 6 Replies
Jun 24, 2011
I am using an FC9 server on which I installed the Apache web server. I kept a data file in my html folder. When I try to download the file remotely through the web, I can download it, but when I try to get the file remotely through the wget command, I am unable to get it; it fails with "Connection timed out" and keeps retrying. Below are the steps I tried:
my target file is http://X.X.X.X/test.zip
wget -T 0 http://X.X.X.X/test.zip
wget http://X.X.X.X/test.zip
[code]...
View 1 Replies
Jun 29, 2010
I use the
Code:
wget -r -A <extension> <site>
command to download all files of a given extension from a certain site. This time I already have some of the files downloaded, and they are listed in a text file via
Code:
ls > <text file name>
How can I make wget download from the site I want but ignore the filenames listed in the text file?
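One approach (a sketch, assuming the text file holds one bare filename per line): join the list into a comma-separated reject list for -R:
Code:
# paste -sd, turns the file's lines into "name1,name2,..." for wget's
# reject list; <extension> and <site> are the original placeholders
wget -r -A <extension> -R "$(paste -sd, downloaded.txt)" <site>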
View 2 Replies
Sep 16, 2010
We have a production web site running Apache 2.2.3 across several web servers. We also have a major problem with spam comments right now. Our method of identifying valid IPs (whether external clients/customers or internal personnel) versus spammers is not ideal: it is prone to erroneously labeling legitimate IPs as targets to be blacklisted.
What we need is a way to see how much distinct request traffic is coming from any given IP address to the site in real time (or very near real time). Essentially we want a graphic/chart view of requests per second to Apache, per IP, sorted by requests per second. Would ntop do this? I've only used it in a limited way at a branch office, not on a production web server.
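Even without a graphing tool, a rough per-IP picture is available from the access logs (the log path is an assumption; it varies by distro and vhost config):
Code:
# count requests per client IP, busiest first
awk '{print $1}' /var/log/httpd/access_log | sort | uniq -c | sort -rn | head -20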
View 3 Replies
Dec 29, 2010
What is the best method for debugging the boot process? Is there a cheatcode (I tried debug and failsafe)? This is Slackware 13.1 with the standard Linux kernel 2.6.35.8. I am having a problem with a Dell 2860 server booting my build. It hangs on
[Code]....
Booting my previous kernel, 2.6.27.27 (a different disk with a different initrd.gz), it loads the same as above but continues with
Code:
Loading iSCSI transport class v2.0-870
iscsi: registered transport (tcp)
I'm stumped whether the problem is with the ata_piix module, the iSCSI transport module, or some firmware I might be missing. I booted other PCs without issue.
View 3 Replies
Apr 3, 2011
I have a very strange problem. In this example the DNS name and IP are hidden for security reasons. When I connect to my Debian box from PuTTY and run the following command:
[Code]...
View 1 Replies
Jul 7, 2010
I have a dedicated server with Fasthosts. It's currently running Ubuntu Server 8.10 and I want to upgrade it to 10.04.
The only issue I have so far is that it runs Matrix Server Appliance 2.0-38. Does anyone know if this will be affected by the upgrade in a bad way? I have had no luck finding a site for Matrix or any other information.
For the upgrade itself, I was hoping to run do-release-upgrade.
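One detail worth knowing (standard Ubuntu behavior, not from the thread): a non-LTS install can't jump straight to 10.04; do-release-upgrade moves one release at a time, so the path is 8.10 -> 9.04 -> 9.10 -> 10.04, assuming the old release archives are still reachable:
Code:
# repeat this cycle until the server reports 10.04
sudo apt-get update && sudo apt-get upgrade
sudo do-release-upgrade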
View 2 Replies
Apr 30, 2011
I am running 2 Ubuntu servers, one as the UEC frontend and the other as a UEC node. Last night I started the server upgrade on the frontend. As it promised to take several hours, I left it alone to proceed through the night. This morning it was waiting on a prompt; after I hit Esc to wake the monitor, it continued upgrading packages. At some point after that it paused on a discrepancy between the eucalyptus.conf file the install wanted to write and the eucalyptus.conf that already existed. I reviewed the deltas and determined that the current file, as it existed, would suffice. I then received 3 more messages on the console:
Installing new version of config file /etc/init/eucalyptus-network.conf. Installing new version of config file /etc/init/eucalyptus-conf...
eucalyptus start/running process 15290
Then there were no further console messages, nor any apparent disk or CPU activity.
I can shell into the machine and see the following:
Code:
root 7077 6784 0 Apr29 tty1 00:00:33 /usr/bin/python /tmp/update-manager-0r8nH5/natty --mode=server --frontend=DistUpgradeViewText
[Code]...
View 9 Replies
Jun 7, 2011
I've just upgraded my dedicated server with OVH from 9.04 to 10.04, and it now refuses to boot with the error "kernel panic: unable to mount rootfs on unknown-block(2,0)". With OVH you must use their custom kernel; this has been installed and seems to boot up to a point, but fails as above. OVH also recommend adding a dev line to fstab, which I've done: [URL]. I'm now at a loss as to what to do, as it seems the kernel can't see the hard drives, yet when I boot in rescue mode I can mount all the drives fine.
Files of interest below.
lilo.conf
Code:
boot=/dev/sda
prompt
timeout=50
default=linux_updated
[code]....
View 2 Replies
Jun 27, 2011
Our production server currently runs Fedora 8. I know it's a very old version; I just joined this company as server admin, and my first task is to upgrade the server with all updated packages and patches, without production downtime, because we have nearly 400 clients accessing the server.
1. Is it possible to do this without production loss?
2. What things do I need to do before the upgrade?
3. Is there any possibility that currently working functionality will break under the newly upgraded packages?
View 6 Replies
May 18, 2010
I had a running server (Mandrake 10.1) that I wanted to migrate to a better version of Linux, so I decided to install the new version on a new hard drive, adding the old hard drive (which contained the data files) as a slave. When I finished the installation I tried to find the old data files but couldn't (/dev/hdb); the drive is already mounted, but when I look inside, all the files are hidden.
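One thing worth checking (my guess; the thread doesn't show the mount command): mount a partition rather than the bare disk device, and list dotfiles too:
Code:
# mount the first partition of the slave disk and list everything
sudo mkdir -p /mnt/olddisk
sudo mount /dev/hdb1 /mnt/olddisk
ls -la /mnt/olddisk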
View 2 Replies
Sep 10, 2010
I'm typing this on my Linux laptop at work. My Firefox works fine, but I cannot apt-get or wget anything. To get Firefox to work, I just went into the Firefox preferences, checked "Automatic proxy configuration URL", and entered the URL that I have. Now Firefox works fine, but the rest of my system does not. There appears to be a similar setting in System > Preferences > Network Proxy: a checkbox for "Automatic proxy configuration" and a field for an "Autoconfiguration URL". I put the same URL there that I put into Firefox and told it to apply system-wide, but apt still does not work. This is a big deal because I need to install software and I really don't want to start manually downloading packages; plus I need ssh.
I have googled extensively on how to get apt to work from behind a proxy, but nothing seems to be working. I don't have a specific proxy server and port; rather, I have some kind of autoconfiguration URL. Also, my system has no /etc/apt.conf file at all. Any ideas on how I can get my system to access the internet? It's very strange to me that Firefox can, but apt, ping, wget, etc. cannot.
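A likely explanation and workaround (my assumption, not from the thread's replies): apt and wget can't read PAC files, so you need the concrete proxy host and port. The PAC file is just JavaScript; fetch it and read the proxy address out of it, then set it explicitly (host and port below are placeholders):
Code:
# view the PAC file to find the real proxy host:port
wget -O - http://proxy.example.com/proxy.pac
# point apt at it (the apt.conf.d file may not exist yet; this creates it)
echo 'Acquire::http::Proxy "http://proxyhost:8080/";' | sudo tee /etc/apt/apt.conf.d/01proxy
# shell tools such as wget honor this environment variable
export http_proxy=http://proxyhost:8080/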
View 10 Replies
Sep 6, 2011
I need to mirror a website. However, each of the links on the site's pages is actually a 'submit' to a CGI script that produces the resulting page. AFAIK wget will fail on this, since it needs static links.
View 1 Replies
Dec 17, 2010
I am trying to set up a cool effect where Gnome Scheduler uses wget to download this image every three hours. However, even when I do it manually in the terminal, it doesn't seem to download correctly. When I open the .jpg, a big red bar at the top says "Could not load image '1600.jpg'. Error interpreting JPEG image file (Not a JPEG file: starts with 0x47 0x49)".
However, when I go to the picture at the link above and right-click "Save Image As", it downloads fine.
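A side note (an inference from the error message, not from the thread): 0x47 0x49 is ASCII "GI", the start of a GIF signature, so the server is likely handing wget a different file than it gives the browser. Mimicking the browser's headers sometimes helps; the URLs below are placeholders:
Code:
# some hosts serve a placeholder to non-browser clients; a browser
# User-Agent and a Referer header make the request look like Firefox's
wget --user-agent="Mozilla/5.0" --referer="http://example.com/page.html" http://example.com/1600.jpg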
View 4 Replies