[GLLUG] Web Whacker type program

Melson, Paul PMelson@sequoianet.com
Wed, 5 Jun 2002 14:52:30 -0400


Depending on your specific need, I've had great luck in classroom/lab
settings using Squid and Niels Provos' crawl utility
(http://www.monkey.org/~provos/crawl/).  If I knew the sites I wanted to
demo, I'd set crawl loose on them (through the Squid proxy-cache) a few
hours before the class.  That approach leaned on our existing Squid
infrastructure, though, and may not be worth the extra effort unless
you have multiple machines (all behind the same Squid proxy) that need
to view the same sites offline.
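
Roughly, the pre-fetch step looked like this.  I'm assuming here that
crawl honors the standard http_proxy environment variable (check its
docs to be sure), and the proxy host and site URL below are just
placeholders:

  # Send HTTP requests through the lab's Squid cache
  # (assumes crawl respects http_proxy; verify before relying on it)
  export http_proxy=http://squid.example.lan:3128/

  # Warm the cache before class; everything crawl fetches gets
  # stored by Squid and served locally to the lab machines later
  crawl http://www.example.com/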

There's another tool called HTTrack that is a little more like Web
Whacker.  I've never used it, but it might be a better fit.
http://www.httrack.com
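
From a quick look at its docs, the basic command-line form appears to
be something like this, with the URL and output directory just
examples:

  # Mirror a site into a local directory for offline browsing;
  # -O sets where the copy gets written
  httrack http://www.example.com/ -O /tmp/mirror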

Hope that helps!

PaulM

-----Original Message-----
From: Jeff Vanderlaan [mailto:jvan@kvc.net]
Sent: Wednesday, June 05, 2002 2:39 PM
To: linux-user@egr.msu.edu
Subject: [GLLUG] Web Whacker type program


What is the best way to copy a website for off-line viewing?  In the
past we've used a Windows program called Web Whacker.  Is there an
open-source alternative?

Jeff

-- 
Jeff Vanderlaan
Sr. Media & Electronics Specialist
Lansing School District
jvan@lsd.k12.mi.us
517-325-6620