Ubuntu :: Wget Failed Urls Output / Log The Urls That Have Failed?

Jan 22, 2010

I'm using wget to retrieve a long list of URLs, a small proportion of which fail, hence:

Code:
wget --input-file=urls.txt
Is there a way to log the URLs that have failed? Unfortunately, wget does not print the current URL being processed followed by its status, so I can't see grepping the output helping.

Or should I use some alternative like curl or wmget?
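
One hedged approach, since wget returns a non-zero exit status when a download fails: run it once per URL and log the failures (the file names here are just examples):

Code:
# log each URL that wget could not retrieve
while read -r url; do
    wget -q "$url" || echo "$url" >> failed-urls.txt
done < urls.txt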

View 1 Replies



Ubuntu :: Using SED To Grab URLs?

May 30, 2010

Is there a way with SED to read this link:

[URL]

and then pull out the individual links and list them in order, each on a separate line? Like...

[URL]

If SED can't do this, is there another command or program that will?
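
sed struggles here because one input line can hold several links, while grep -o prints each match on its own line. A rough sketch, with the page address as a placeholder:

Code:
# fetch the page, then print every href target on its own line
wget -q -O page.html 'http://example.com/'
grep -o 'href="[^"]*"' page.html | sed 's/^href="//; s/"$//'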

View 1 Replies View Related

Ubuntu :: Linking To External URLs In Tomboy?

Jul 18, 2010

Is there any way to embed a URL to an external web page in the text of a Tomboy note, in the same way that other notes are linked? I know I can just paste the URL into the note and have it link out, but when the link is over a hundred characters long (not kidding) that stops being an option.

View 1 Replies View Related

Ubuntu :: Really Mount Windows Share (SMB URLs)

May 6, 2011

I am currently in a more mixed environment than I would like, and I need to mount Samba shares because I need to work with the data. I noticed that Nautilus does not really mount the shares, and some applications cannot deal with smb:// URLs. I searched and found this old thread: [URL]. Is it possible that after all these years this is still unchanged? Permanently mounting at boot time is not an option for me, as the drive will not always be available - it already changes when I move within the office from a fixed connection to WLAN (e.g. when going to a meeting, and vice versa).
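
For a real kernel-level mount (as opposed to the GVFS mount Nautilus does), mount.cifs is the usual route; a sketch, with server, share, and user names as placeholders:

Code:
# mount the share so ordinary applications see a normal directory
sudo mkdir -p /mnt/share
sudo mount -t cifs //server/share /mnt/share -o username=youruser
# and unmount before leaving the office network
sudo umount /mnt/share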

View 1 Replies View Related

Fedora Networking :: Can Resolve Some URLs But Not Others?

Dec 29, 2009

Over the weekend I upgraded my home PC from Fedora 9 to Fedora 12, and now I'm having problems connecting to the Internet. Basically, I am able to connect to some URLs but not others, and it happens in both Firefox and Konqueror. I am able to connect to url, url, url and url with no problems. However, when I try to connect to slashdot.org, url, fedoraforum.org and rpmfusion.org I cannot. All the other Windows PCs in my home, using the same 2Wire home portal, are able to get to the sites using IE with no problem.

I first suspected a DNS issue, but the "host" command returns a valid IP address for all the URLs that I cannot reach. Another symptom is that the following command

Code:
su -c 'rpm -Uvh url url'
(from rpmfusion.org/Configuration/) also doesn't work when entered at the command prompt. However, when I did "host download1.rpmfusion.org" and edited the command to use the IP address returned instead of "download1.rpmfusion.org" it worked. But then, the next time I ran "yum" it failed because it couldn't find the rpmfusion.org URL in the installed repository entries.

After reading some other threads, I tried disabling avahi-daemon, but that had no effect. I also tried examining /var/lib/dhcpd and /var/lib/dhclient, but neither file existed on my system.
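
Since names resolve but connections stall, comparing name resolution with raw reachability, and ruling out an MTU problem, is a reasonable next step; a hedged sketch for one failing site:

Code:
# name resolution vs. reachability
host slashdot.org
ping -c 3 slashdot.org
# a full-size non-fragmenting ping; failures here often point at an MTU problem
ping -c 3 -M do -s 1472 slashdot.org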

View 3 Replies View Related

Programming :: Extracting URLs From Strings?

Jun 28, 2009

I have a PHP script that checks a string to see if it contains a link (i.e. a URL). I have the following if statement, which uses three possible regular expressions to determine whether there is a link or not.

Code:
// check if we found a link
// links are denoted by strings that:
// - contain http://
// - contain www.*.*

[Code]....

I'm not convinced yet that writing a shell script to do this is the best course of action. If someone is capable of doing this with a Perl or a Python script, that's fine too. If you want to make it super high performance and write it in assembly, go for it.
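
For what it's worth, the same check is short in shell too; a sketch using grep -E, with deliberately simple patterns matching the two cases in the comments above:

Code:
# succeeds if $string looks like it contains a link
if printf '%s' "$string" | grep -Eq 'https?://|www\.[^ ]+\.[^ ]+'; then
    echo "contains a link"
fi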

View 1 Replies View Related

Networking :: Block The Urls With Iptables?

Feb 22, 2010

I have a server with Slackware 12 and I am trying to block two websites, without success. I put these rules in /etc/iptables.conf:

iptables -A INPUT -s web.org -j DROP
iptables -A OUTPUT -d web.org -j DROP

but they have no effect. What rule must I write to block URLs?
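
One likely reason the rules fail: iptables resolves web.org to an IP address once, when the rule is inserted, and many sites have several or changing IPs. A hedged workaround is the string match module, which inspects packet payloads (so it catches the Host header of plain HTTP), crude but proxy-free:

Code:
# drop packets whose payload contains the site name
iptables -A FORWARD -m string --string "web.org" --algo bm -j DROP
iptables -A OUTPUT -m string --string "web.org" --algo bm -j DROP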

View 4 Replies View Related

Ubuntu :: Post All The Repo Urls That Come With 10.10 64-bit Desktop Edition?

Nov 7, 2010

I'm missing some of the repos in my package manager. Can someone post all the repo URLs that come with Ubuntu 10.10 64-bit Desktop Edition?
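
The stock /etc/apt/sources.list for 10.10 (maverick) looks roughly like this - treat it as a sketch, since mirror hostnames vary by country:

Code:
deb http://archive.ubuntu.com/ubuntu/ maverick main restricted universe multiverse
deb http://archive.ubuntu.com/ubuntu/ maverick-updates main restricted universe multiverse
deb http://security.ubuntu.com/ubuntu/ maverick-security main restricted universe multiverse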

View 1 Replies View Related

Ubuntu :: Mutt Not Handling Mailto: URLs / Getting Error?

Dec 10, 2010

Ubuntu Lucid amd64, mutt-1.5.20-7ubuntu1

I'm pretty sure the following used to work:

Code:
$ mutt mailto:url?subject="Whoop dee doo"
Now mutt exits silently with status=1. Tried setting EDITOR/VISUAL, no difference.

Edit: I had tempdir pointed at a non-existent directory in my .muttrc. Still annoying that there was no error message, but at least things are working again.

View 1 Replies View Related

Server :: Cannot Access Apache Web URLs Remotely?

Jun 7, 2010

I have a fresh Fedora 13 install. I managed to set up phpMyAdmin and browse everything locally, but I cannot browse the web site from any other machine in my network. All my machines get their IPs from my DHCP server (192.168.1.0). I googled and read a thread in this forum suggesting it might be due to SELinux. I disabled it and rebooted, but the behavior is the same: I can browse my Apache locally but not from other machines. I did a telnet from one of my machines using the IP, as follows: telnet 192.168.1.11 80, and got the following: Connecting To 192.168.1.11... Could not open connection to the host, on port 80: Connect failed. I checked the error_log and access_log files and found no hint. I think it must be something in the Fedora system, firewall, or SELinux config that is not allowing access.
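
With SELinux ruled out, the stock Fedora firewall is the usual suspect; a hedged sketch to check and, if that is the cause, open port 80:

Code:
# see whether port 80 is accepted, then open it and persist the rule
iptables -L -n | grep -w 80
iptables -I INPUT -p tcp --dport 80 -j ACCEPT
service iptables save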

View 4 Replies View Related

Server :: Disable Caching For Certain URLs In Varnish?

Mar 13, 2011

There is this server that runs a lot of websites and uses Varnish for caching, for a performance boost. I want to exclude certain URLs, which change frequently, from caching - not whole domains, just certain URLs within the websites. Is there any way to exclude those pages from caching?
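
In Varnish this is normally done in the VCL; a sketch, where the /dynamic/ prefix is a placeholder for whatever pattern identifies your frequently-changing pages:

Code:
sub vcl_recv {
    # bypass the cache for frequently-changing pages
    if (req.url ~ "^/dynamic/") {
        return (pass);
    }
}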

View 2 Replies View Related

Programming :: Firefox Extension To Handle URLs?

Jan 11, 2011

I want to write a simple Firefox extension / script or anything to change URLs from HTTP to HTTPS for selected websites (e.g. Facebook). That traffic is actually bypassing some security checks in my network.

Can anyone tell me how to proceed? I can deal with the language as long as it's C++ or Python (anything else would just take more time, that's all).

View 1 Replies View Related

Ubuntu Servers :: Using Iptables To Get Web Usage Statistics And Filter Urls?

Dec 16, 2010

I'm deploying a new Ubuntu server which should act as a router. I've already set up NAT for the local network and done some shaping for different groups of users, but now I'm facing a new problem. I need to make a scheduled URL filter. I know it's not a problem with cron and a simple script, but maybe there is an existing way to do that? I also need web-traffic statistics: a list of URLs visited by users (source IP, destination URL). Is that possible with iptables, or with any other software, without using a proxy server?
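
One caveat worth stating up front: iptables works at the packet level, so it can log source and destination IPs but never full URLs - those exist only inside the HTTP payload, which is proxy territory. A hedged sketch of the IP-level logging part (the LAN interface name is an assumption):

Code:
# log each new outbound HTTP connection from the LAN
iptables -A FORWARD -i eth1 -p tcp --dport 80 -m state --state NEW -j LOG --log-prefix "HTTP: "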

View 9 Replies View Related

Debian :: Mapping URLs To Filesystem Locations With Apache 2.2?

Jul 17, 2011

I am trying to map a URL to a file system location. According to the official documentation at [URL], under 'Files Outside the DocumentRoot', it says: On Unix systems, symbolic links can bring other parts of the filesystem under the DocumentRoot. For security reasons, Apache will follow symbolic links only if the Options setting for the relevant directory includes FollowSymLinks or SymLinksIfOwnerMatch.
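
Besides symlinks, the same documentation describes the Alias directive, which maps a URL prefix straight to a directory; a sketch in Apache 2.2 syntax, with hypothetical paths:

Code:
Alias /static /srv/static-files
<Directory /srv/static-files>
    Order allow,deny
    Allow from all
</Directory>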

View 1 Replies View Related

Fedora :: Clicking URLs In Thunderbird Does Not Start Firefox In F15?

May 27, 2011

Does anyone have the problem where clicking a URL in an email in Thunderbird doesn't start up Firefox? It used to work in previous Fedora releases.

View 6 Replies View Related

OpenSUSE :: Dragging Urls On Plasma Doesn't Allow To Link As

Feb 17, 2010

I just updated, and dragging URLs onto Plasma no longer offers me the "link as..." choices (preview, link, ...).

View 2 Replies View Related

OpenSUSE Multimedia :: Mplayer Configuration - Save URLs

Jan 3, 2010

I'm not able to reproduce a configuration which I had before.

The plot: openSUSE 11.1 i586, seamonkey 1.1.18, MPlayer dev-SVN-r30099-4.3-openSUSE Linux 11.1 (i686)-Packman = MPlayer-1.0rc2_r30099-1.pm.1.1, mplayerplug-in-3.55+cvs20090923-0.pm.2.2

My old system, which died in a catastrophic hardware failure, wrote the URLs of all movies played with mplayerplug-in to a file located at $HOME/Movies/playlist.

How can I configure the new system to do it again?

View 5 Replies View Related

General :: Script(urls.sh) Scheduled To Run With Crontab At Every Hour?

Jan 26, 2011

I have a script (urls.sh) scheduled to run with crontab every hour. The script itself is fine and executes manually with [root@server cron.hourly]# ./urls.sh, but it is not executing on schedule. This is what I see in /var/log/cron every hour: crond[4729]: (root) CMD (run-parts /etc/cron.hourly/urls.sh). My crontab looks like this:

SHELL=/bin/bash
PATH=/sbin:/bin:/usr/sbin:/usr/bin
MAILTO=root

[code]....
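
The log line is the clue: run-parts expects a directory, not a single file, so "run-parts /etc/cron.hourly/urls.sh" does nothing useful. A hedged fix is to invoke the script directly in the crontab entry:

Code:
# /etc/crontab-style entry: run the script itself at the top of every hour
0 * * * * root /etc/cron.hourly/urls.sh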

View 2 Replies View Related

Security :: Force Squid To Log Urls Allowed As An Exception?

Aug 10, 2010

Squid ACL rules can be configured to give specific IPs full access - that is, to skip the blocked-site list.

acl <tag> src x.x.x.x
http_access allow <tag>
http_access deny blocksites

From all the ways I tried, squid does not log these URLs. Is there a way to have squid log the URLs requested from the allowed IPs?

Specs:
squid ver : (squid/2.6.STABLE21)
OS : CentOS 5.5
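
In squid 2.6 the access_log directive accepts trailing ACL names, so the exempted traffic can be written to its own log; a sketch for squid.conf, reusing the <tag> ACL from above:

Code:
# log requests matching the exempted ACL to a separate file
access_log /var/log/squid/allowed.log squid <tag>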

View 1 Replies View Related

Ubuntu Servers :: Setup Reverse Proxy (Apache2) Drupal Clean Urls?

Apr 16, 2010

I have a scenario. A domain [URL].. and there are 4 private computers on which applications are hosted on port 80. So when someone from outside accesses the site it looks like [URL].. I added

[Code]...
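
The usual shape of this with mod_proxy, hosts and paths being placeholders (mod_proxy and mod_proxy_http must be enabled, e.g. via a2enmod):

Code:
# forward /app1 on the public host to an internal machine
ProxyPass /app1 http://192.168.0.11/
ProxyPassReverse /app1 http://192.168.0.11/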

View 1 Replies View Related

General :: Script For Automated Torrent Downloads (from Announce Urls In IRC)

Oct 12, 2010

Some time ago I read a discussion in /r/linux where someone mentioned a script that checks new messages in an IRC channel (in the irssi client) and, if they contain announce URLs for torrents, passes them to a torrent client (rtorrent) for downloading. Sadly, I didn't bookmark it, nor can I find it now. Maybe someone has a similar script?

View 1 Replies View Related

Server :: Non Latin Urls / Giving A 404 Error (file Not Found)?

Mar 17, 2011

My web host runs a Linux server, and when I try to load a file in my browser (which I have uploaded to my web space) with non-Latin words in its name, it gives a 404 error (file not found). For example, I have uploaded mydomain.com/νεο.html - the word "νεο" is non-Latin - so when I try to reach this document from my browser I get the error.
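
A quick hedged check: browsers send such paths UTF-8 percent-encoded, so the server must see a file whose name matches those bytes. νεο encodes as %CE%BD%CE%B5%CE%BF; requesting that form directly shows whether the filename on disk was stored in a different encoding:

Code:
curl -I 'http://mydomain.com/%CE%BD%CE%B5%CE%BF.html'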

View 2 Replies View Related

Programming :: Apache Redirect - Core SEF URLs On - Using Htaccess.txt File That Came With Joomla

Apr 7, 2009

I have recently merged two Joomla 1.0 sites I ran into one. I imported the articles I wanted to keep to the new site, and I have the old site's domain pointing as an alias at the new site. The new site is www.theouthousers.com . The old site was www.bludblood.com .

I also have the core SEF URLs on, using the htaccess.txt file that came with Joomla.

I have one writer for the old site who linked to his articles in various places, so I am trying to set up redirects for him so that he doesn't have to change all of his links.

For instance, I need something like:

http://www.bludblood.com/joomla/inde...d=25&Itemid=51

To redirect to the equivalent location on the new site:

[url]

And I also need specific links like:

[url]

To redirect to their new counterparts:

[url]

Keeping in mind that www.bludblood.com is now an alias of www.theouthousers.com, is there any way to do this? I have been trying with rewrite rules and redirects, and cannot seem to achieve the desired effect.

Tried various versions of:

Code:
Redirect [url] [url]

With the http, without, as regexps, as 301s, as permanents, etc., and it just will not work. I also tried it as a RewriteRule.
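
One likely reason the plain Redirect fails: mod_alias matches only the URL path, and Joomla article links live entirely in the query string. mod_rewrite can test the query string explicitly; a hedged sketch for .htaccess, with the target path and the id value as placeholders:

Code:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?bludblood\.com$ [NC]
RewriteCond %{QUERY_STRING} (^|&)id=25(&|$)
# the trailing ? drops the old query string from the redirect target
RewriteRule ^joomla/index\.php$ http://www.theouthousers.com/example-article? [R=301,L]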

View 2 Replies View Related

Programming :: Bash - Read User Input: URLs Without The Enter Key Stroke?

Sep 23, 2010

Here's a challenge I've been struggling for months with:

I have a bash script that reads URL addresses of our internal server and then executes some test commands on them. Something like this:

Code:
read -p "Enter URL: " url
sh execute-what-ever-to $url

After copy-pasting the URL the user taps the enter key and the script proceeds, but here comes the tricky part: I want this to work without the need to press the enter key after copy-pasting the URL.

"read -n" does not work in this case, as the URLs vary greatly in length. However, the URLs always end to the same string. They could be like "http://url1/END", "http://url2/END" and so on. So this ending string "END" could be theoretically used to recognize that the whole URL has been pasted.

View 2 Replies View Related

Fedora Installation :: Update 15 - Transaction Error - Check That The Correct Key Urls Are Configured For This Repository

Jul 28, 2011

I've been trying to update Fedora 15 for weeks. I always end up with a transaction error and the update stalls. The error reads: GPG keys listed for the RPM Fusion for Fedora Rawhide-free repository are installed but they are not correct for this package. Check that the correct key URLs are configured for this repository.
This is far too involved for a Linux newbie; I think my only option is to reformat and reinstall. This is so frustrating - there are 250 MB of updates available that I can't access.
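
Before reformatting, it may be worth checking which keys and repos are actually present - the message names a Rawhide repo on a stable release, which hints that the wrong RPM Fusion release package is configured. A hedged sketch:

Code:
# list installed GPG keys and the enabled repositories
rpm -qa gpg-pubkey --qf '%{name}-%{version}-%{release} %{summary}\n'
yum repolist enabled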

View 6 Replies View Related

Ubuntu Networking :: Wget Failed To Download Image From URL?

Jan 21, 2011

I tried to download some images from Google using wget: wget cbk0.google.com/cbk?output=tile&panoid=2dAJGQJisD1hxp_U0xlokA&zoom=5&x=0&y=0. However, I get the following errors:

--2011-01-21 04:39:05-- http://cbk0.google.com/cbk?output=tile
Resolving cbk0.google.com... 209.85.143.100, 209.85.143.101
Connecting to cbk0.google.com|209.85.143.100|:80... connected.

[code]....
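
The error output is the giveaway: wget only saw the URL up to output=tile, because each unquoted & tells the shell to run the command in the background and start a new one. Quoting the whole URL should fix it:

Code:
wget 'http://cbk0.google.com/cbk?output=tile&panoid=2dAJGQJisD1hxp_U0xlokA&zoom=5&x=0&y=0'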

View 3 Replies View Related

General :: Wget Authenticated HTTP Connection Failed

Jul 7, 2010

We have a Red Hat server and I'm using wget in crontab to run some PHP scripts. We've been doing this for some time now and it's been working fine. I tried to add another script using wget to run a PHP script behind HTTP authentication. However, despite the fact that the URL works fine and the username and password are correct, we are getting Connection Timed Out errors each time. What might cause wget to work for unauthenticated URLs, but not authenticated ones?

I've tried --user=/--password=, --http-user=/--http-password and Username:Password@ in the URL and all three fail the same way. Here's the command in question:

[blahblah user]# wget -t 5 -O /dev/null 'http://Username:Password1!@test.example.com/sub/dir/file-name.php'
--2010-07-07 10:11:55-- http://Username:*password*@test.example.com/sub/dir/file-name.php
Resolving test.example.com... 000.000.000.000

[code]....

Again, wget works, the file with authentication works, but wget calling the file with authentication does not work.

UPDATE: Actually, I get the same timeout if I access the authenticated URL without authentication. Could that mean that Apache is rejecting wget requests for authentication outright?
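
If the timeout happens even without credentials, the authentication itself is probably not the problem; some servers or intermediate firewalls filter on the client instead. A hedged check is to retry with a browser-like User-Agent:

Code:
wget -t 5 -O /dev/null --user-agent="Mozilla/5.0" \
    --http-user='Username' --http-password='Password1!' \
    'http://test.example.com/sub/dir/file-name.php'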

View 2 Replies View Related

Ubuntu :: Failed To Open Output Device /dev/dsp

Jan 21, 2011

I'm trying for the first time to work with sounds and edit them. I'm trying to use Gnome Wave Cleaner and don't know what to do about the error.
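
/dev/dsp is the legacy OSS device, which modern Ubuntu (running PulseAudio) no longer provides directly; the padsp wrapper from pulseaudio-utils emulates it for one program. A sketch, assuming the Gnome Wave Cleaner binary is called gwc:

Code:
padsp gwc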

View 1 Replies View Related

OpenSUSE :: Subprocess Failed - Error: RPM Failed: Error: Failed Dependencies

Mar 31, 2010

When the RPM runs, it comes up with this error. How do I install the required dependencies? I have added more repositories, but a few dependencies are still missing. Is there a zypper / apt-get-style command or something available? openSUSE 11.1, GNOME.

[code]...
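
Rather than running rpm directly, letting zypper install the file pulls the dependencies from the enabled repositories; a hedged sketch with a placeholder filename:

Code:
sudo zypper install ./package.rpm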

View 4 Replies View Related

Ubuntu Servers :: ERROR Installation Failed, Please Check The Terminal Output

Apr 24, 2010

I'm installing and configuring my first server using RackSpace CloudServers running Ubuntu Karmic Koala (9.10), and I'm now installing iRedMail. The installation runs successfully until I receive this error:

Code:
The following packages have unmet dependencies: mysql-server-5.0: Depends: mysql-server-core-5.0 (>= 5.1.30really5.0.83-0ubuntu3) but it is not going to be installed

E: Broken packages < ERROR > Installation failed, please check the terminal output. I understand this is telling me there is some software that iRedMail (or something iRedMail depends upon) needs installed. Is this correct? And if so, what needs to be installed and how do I install it (aptitude install example-package?)?
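
Two hedged first steps: ask apt to repair the broken state, and install the named dependency directly so apt prints its full reasoning:

Code:
sudo apt-get -f install
sudo aptitude install mysql-server-core-5.0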

View 4 Replies View Related






