Software :: Find Download Site For IPlanet WebServer Enterprise Edition 6.1 SP13
Jan 6, 2011
I need help finding a download site for iPlanet WebServer Enterprise Edition 6.1 SP13. I can't find it anywhere on Sun's website. Can someone please post the download link here for me? Please, many thanks.
I use openSUSE 11, 64-bit. I would like to install Java but do not know how. I can find the Java site and download to my hard disk, but I can't install it.
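Sun's Linux Java downloads of that era are self-extracting .bin installers, so "installing" is mostly just running the file. A minimal sketch, assuming the downloaded file is called jre-6u21-linux-x64.bin (use your actual filename):

    # Make the installer executable, then run it; it unpacks into the current directory
    chmod a+x jre-6u21-linux-x64.bin
    ./jre-6u21-linux-x64.bin
    # Afterwards, add the extracted jre1.6.0_21/bin directory to your PATH

openSUSE's repositories may also carry a packaged Sun Java that zypper can install directly, which avoids the manual step.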
My config: PI945GZD motherboard, 2 GB RAM, Windows 7 Ultimate. My processor supports a 64-bit OS, but I have not tried one, so should I download and use the 32-bit edition or the 64-bit edition? I also have a Sound Blaster 5.1 (not the Sound Blaster Live! 5.1). Would it work in Ubuntu?
I installed this OS (Red Hat Enterprise edition 5.1) on my desktop and found there is no sound. Step 1: during installation the sound test failed, with errors as follows: "You can create /root/scsconfig.log, /root/scsrun.log on the System tab and file a new bug at". Step 2: I then went to /root/scsrun.log and found the error information below:
ALSA lib pcm_hw.c:1357:(_snd_pcm_hw_open) Invalid value for card
aplay: main:550: audio open error: No such device
amixer: Mixer attach default error: No such device
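"No such device" usually means ALSA cannot see a sound card at all. A few hedged diagnostic commands (standard ALSA utilities, nothing specific to this machine):

    aplay -l            # list the playback devices ALSA detected
    lsmod | grep snd    # check whether any snd_* driver modules are loaded
    alsamixer           # if a card does show up, make sure channels aren't muted

If aplay -l lists no cards and no snd_* modules are loaded, the kernel is likely missing a driver for the onboard chip.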
I am trying to install Red Hat Linux 4.0 Enterprise edition on my system. The system boots from CD-ROM and starts installation. It prompts for language selection (English by default) and then keyboard type (US type by default). After this it prompts me to select drivers, with the following options: "SELECT DRIVERS", "CHOOSE DRIVERS FROM LOCATION", "BACK". Why is it prompting for drivers? What type of drivers is it looking for? What is the solution? My system's motherboard type is Intel 80865.
The title pretty much says it all: is there, at this point in time (December 2010), still a place where I can download Ubuntu Netbook Edition 10.04? I want to do that because I think it looks better and, more importantly, it uses Compiz.
I have set up an Apache webserver inside my local network to host my website. The domain name was purchased through GoDaddy, and they provide our DNS control. Now I have set up Postfix on this same box. I can send email to the WWW from this box, but I can only receive mail on it if I send the mail from this box to this box.
When I send mail from the WWW it bounces back, saying the connection to the mail exchanger has failed. My question is: how do I find out what my MX record should be, or how do I go about setting up my MX record? It needs to be entered on GoDaddy's Total DNS Control page for my domain.
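An MX record simply names the host that accepts mail for the domain. A hedged sketch in zone-file notation, with example.com standing in for your real domain and 203.0.113.10 for your public IP (GoDaddy's Total DNS Control page asks for the same fields, just in a form):

    example.com.        3600  IN  MX  10  mail.example.com.
    mail.example.com.   3600  IN  A   203.0.113.10

    dig MX example.com +short    # verify what resolvers actually see

Also make sure your router forwards port 25 to the Postfix box, or outside senders will still fail to connect.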
I've just unwrapped my new netbook. It's got the new Atom N450 processor, and the first thing I want to do is get XP Home off of it. I'm just wondering if I should install 9.10 full or download the "netbook edition". What's the difference? Will I see better performance with the netbook or the full edition? (Just asking because I'd like to avoid the ~650 MB download if I can.)
I have just done a fresh install of Ubuntu 10.04 Netbook Edition on my Asus Eee PC 1000H, which was originally running Windows XP. Right after the installation I used the Update Manager to download and install updates. It works fine so far, and I think I am beginning to like it.
I just could not get my head around one issue with Firefox 3.6.6. For some strange reason I could not download and install Firefox add-ons or themes. I would click the "Add to Firefox" button. The download window with the progress bar would appear. The progress bar would get stuck at 0% with the message "waiting". After 30 seconds or so, this error message would appear:
I want to enable the communication b/n ibm tsm server and ESX server . for that reason i want instal /etc/init.d/vsftpd in my sys. But iam unable to find the link which can provide me this software [/COLOR].
I am a new user of Red Hat Enterprise Linux 5. I want to install OpenOffice, but I can't download and install it. Is there any way to do this successfully?
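A hedged sketch, assuming the system is registered so yum can reach Red Hat's channels; the package names below are the usual RHEL 5 ones, but verify them with yum search openoffice first:

    yum install openoffice.org-writer openoffice.org-calc openoffice.org-impress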
I use this command to download: wget -m -k -H URL. But if some file can't be downloaded, it retries again and again. How do I skip that file and download the other files?
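wget retries a failing file 20 times by default. Capping the retries and the timeouts makes it give up quickly and move on; a hedged sketch built on that command:

    # -t 2 limits retries per file; --timeout bounds how long each attempt may hang
    wget -m -k -H -t 2 --timeout=30 URL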
For some reason it seems to be downloading too much and taking forever for a small website. It seems that it was following a lot of the external links that the page linked to.
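If the crawl is wandering off-site, the usual fix is either to drop -H so wget stays on the starting host, or to keep -H but whitelist the hosts you want with -D. A hedged sketch with placeholder domains:

    # Stay on one host; -np refuses to climb above the starting directory
    wget -r -k -np http://somedomain.com/section/
    # Or span hosts, but only to the listed domains
    wget -r -k -H -D somedomain.com,images.somedomain.com http://somedomain.com/section/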
It downloaded too little. How much depth should I use with -r? I just want to download a bunch of recipes for offline viewing while staying in a Greek mountain village. Also, I don't want to be a prick and keep experimenting on people's webpages.
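A hedged starting point: begin shallow and stay polite. -l caps the recursion depth and -w pauses between requests, so an experiment costs the server little:

    # Depth 2 covers pages linked from the index plus one level below them
    wget -r -l 2 -w 1 --random-wait -k -np http://example.com/recipes/

If depth 2 misses some recipes, bump -l by one and rerun.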
OK, let's say I have Apache webservers on two different machines within my network. I have http://outterABC.com set up at dyndns.org to point to my modem at home, and my router forwards port 80 to the ServerA machine (i.e. 192.168.0.3). I can access the webpage I set up for the ServerA machine.
But what I want to do is somehow access my ServerB machine's website, which is on the same network. I tried something like http://ServerB.outterABC.com, and the Apache page came up with something like "the page wasn't available". I want to access the content of the ServerB website, but because I have only one router, I can only forward port 80 traffic to my ServerA machine. I'm sure it's a different syntax I should use, but I'm just not sure what I should enter to bring up the Apache root web page for the ServerB website via http://outterABC.com.
I tried setting up DNS A records on ServerA, but I don't think that will work for what I'm trying to do above.
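A common way to do this with a single forwarded port is name-based routing on ServerA: give ServerB its own hostname under the dyndns domain, and have ServerA reverse-proxy requests for that name to ServerB. A hedged sketch; ServerB's address (192.168.0.4) and the hostname serverb.outterABC.com are assumptions, and Apache needs mod_proxy and mod_proxy_http loaded:

    # In ServerA's Apache config (Apache 2.2 also wants: NameVirtualHost *:80)
    <VirtualHost *:80>
        ServerName outterABC.com
        DocumentRoot /var/www/servera
    </VirtualHost>

    <VirtualHost *:80>
        ServerName serverb.outterABC.com
        ProxyPass        / http://192.168.0.4/
        ProxyPassReverse / http://192.168.0.4/
    </VirtualHost>

For this to work, serverb.outterABC.com must also resolve to the same public IP (an extra hostname or wildcard at dyndns.org). That is why A records on ServerA itself didn't help: outside machines never query ServerA for DNS.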
How do you instruct wget to recursively crawl a website and only download certain types of images? I tried using this to crawl a site and only download JPEG images:
However, even though page1.html contains hundreds of links to subpages, which themselves have direct links to images, wget reports things like "Removing subpage13.html since it should be rejected" and never downloads any images, since none are directly linked from the starting page. I'm assuming this is because my --accept is being used both to direct the crawl and to filter what gets downloaded, whereas I want it used only to decide what to download. How can I make wget crawl all links but only download files with certain extensions like *.jpeg?
EDIT: Also, some pages are dynamic and are generated via a CGI script (e.g. img.cgi?fo9s0f989wefw90e). Even if I add cgi to my accept list (e.g. --accept=jpg,jpeg,html,cgi), these still always get rejected. Is there a way around this?
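Since --accept both steers the crawl and filters the downloads, one workaround is a two-pass approach: first mirror only the HTML so every subpage actually gets visited, then pull the image URLs out of the saved pages and fetch just those. A hedged sketch with placeholder names (example.com and page1.html are assumptions):

    # Pass 1: crawl everything, but keep only the pages themselves
    wget -r -l inf --accept html,cgi http://example.com/page1.html
    # Pass 2: harvest .jpg/.jpeg URLs from the saved pages, then download them
    grep -rhoE "http://[^\"' ]+\.jpe?g" example.com/ | sort -u | wget -i -

The same grep pattern can be widened to catch the CGI-generated images (e.g. match img\.cgi\?[^\"' ]+), since extension-based --accept rules will keep rejecting those.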
I recently upgraded to Ubuntu 10.10 and did not notice any change. How can I verify that the new Ubuntu 10.10 is installed? Also, I want to know a good site from which I can download themes and install them.
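The installed release is quick to check from a terminal; lsb_release ships with Ubuntu:

    lsb_release -a          # prints distributor, release (e.g. 10.10), and codename
    cat /etc/lsb-release    # the same information, read straight from the release file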
I want to download the Android developer guide from Google's site, but code.google is forbidden in my country. I want to use wget to download the entire Android dev guides through the proxy that I set in Firefox for opening forbidden sites (127.0.0.1, port 8080). I use this command to download the entire site:
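Whatever the exact command, wget honors the standard proxy environment variables, so it can be pointed at the same local proxy Firefox uses. A hedged sketch; the guide URL is an assumption:

    # Route wget through the local proxy, then mirror the guide
    export http_proxy=http://127.0.0.1:8080/
    wget -m -k -p http://developer.android.com/guide/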
When I use the BBC site, I see messages: "Cannot play media. You do not have the correct version of the Flash player. Download the correct version." However, when I go to Applications > Ubuntu Software Centre, I get the message "Adobe Flash Player is installed on this computer. It is used by 1 piece of installed software." If I try to use the Firefox Adobe add-ons feature, I'm directed to the Adobe page. Is it safe to use this feature rather than, for example, Synaptic Package Manager?
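Installing Flash from the Ubuntu repositories is generally the safer route, since the plugin then gets updated along with the rest of the system. A hedged sketch; flashplugin-installer was the usual package name around this release:

    sudo apt-get install flashplugin-installer   # fetches Adobe's plugin via the repo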
I have downloaded the Fedora 9 ISO to my XP OS so I can dual-boot my machine. I can't seem to find a place to plug in my RJ-45 to download the extras package as an RPM or a tar file so that I can transfer it onto my Linux OS, so I need a site I can download from over wireless.
This is our first time choosing and installing Linux. Our other servers are all Windows 2008 x64. We were told to install Fedora 13. I can only find a download for the desktop version, and we're looking for the SERVER x64 download. Could I please get a link?
However, the page I'm downloading has remote content from a domain other than somedomain.com. I was asked to download that content too. Is this possible with wget?
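Yes; -p (--page-requisites) fetches everything the page needs to render, and -H allows wget to leave the starting domain for those requisites. A hedged sketch with placeholder domains; drop -D if you don't know the remote hosts in advance:

    # -D limits host-spanning to the listed domains instead of the whole web
    wget -p -k -H -D somedomain.com,cdn.otherdomain.com http://somedomain.com/page.html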
Does anyone know of a good download manager for Ubuntu that supports multiple threads as well as site logins? I do a lot of downloading from RapidShare, and all the download managers I've tried don't support site logins.