[Techtalk] Robust link checker?
ktb
x.y.f at home.com
Mon Oct 15 11:37:49 EST 2001
On Mon, Oct 15, 2001 at 12:26:10PM -0400, Raven, corporate courtesan wrote:
> Heya --
>
> Posting a question for a friend of mine -- I couldn't help her,
> but I'm hoping someone else can.
>
> She has recently taken on Webmistress duties for a large and
> robust site (http://www.pbs.org). She's looking for a link checker
> program that can run through the site and give her a list of broken
> links so that she can fix them. The problem is, there are hundreds of
> thousands of links on the site, and all the programs she's tried so far
> have crashed, unable to handle a site of that size.
>
> Anyone have any favorites or recommendations for her? It can
> run on either a Linux or a Windows platform.
>
> Thanks in advance for your help.
>
If she is interested in rolling her own, she might be able to put
together something with wget. From the man page:
----------------------------------
  --spider
      When invoked with this option, Wget will behave as a Web
      spider, which means that it will not download the pages, just
      check that they are there. You can use it to check your
      bookmarks, e.g. with:

          wget --spider --force-html -i bookmarks.html

      This feature needs much more work for Wget to get close to the
      functionality of real WWW spiders.
----------------------------------
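If --spider cooperates with recursive retrieval on her version of wget
(worth testing on a small subtree first, since the man page itself
admits the feature is rough), something along these lines might get her
a log she can grep for dead links. The hostname, log file name, and
grep pattern below are just examples, not something I've tried against
a site that size:

    # crawl the whole site without saving pages; -nv keeps the log terse
    # (--spider combined with -r may not be fully supported in older wgets)
    wget --spider -r -nv -o spider.log http://www.pbs.org/

    # pull the failures out afterwards -- the exact wording of the error
    # lines varies between wget versions, so adjust the pattern to match
    grep -E -B 2 'ERROR|404' spider.log > broken-links.txt

She could also cap the depth with -l or keep the crawl inside one
directory tree with -np if a full run turns out to be too much in one
pass.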
Probably not close to what she wants but...
hth,
kent
--
"The significant problems we face cannot be solved at the
same level of thinking we were at when we created them."
--Albert Einstein