[rescue] SS2 memory?

Joshua D Boyd jdboyd at cs.millersville.edu
Thu Mar 7 19:32:24 CST 2002


On Thu, Mar 07, 2002 at 06:27:40PM -0500, Greg A. Woods wrote:

> A caching HTTP proxy?  That's one thing I don't think an SS2 would be
> worthwhile for, unless of course it's your only choice _AND_ you have a
> very slow Internet connection.  Squid in particular has high VM and CPU
> demands, and HTTP caching is a high I/O job by definition.  I suspect
> (though I've never tried it) that an SS2 would only increase your
> browsing latency and unless you have reason to believe that a
> significant amount of the web content you view will be viewed multiple
> times before it expires then there's not going to be any gain.
> 
> If you have a high-speed connection and only a few users then a local
> caching HTTP proxy won't do you much good even if it's running on a much
> more capable machine.
> 
> I found a local squid running on a Pentium-133 to be useful when my
> connection was a 28.8kbps PPP link, but now with ADSL and/or cable modem
> it became a very noticeable hindrance.  I do find using the cable
> provider's cache to be beneficial though, except maybe when it is
> overloaded at peak times.
> 
> The squid servers I run for various ISPs really do make a dent in the
> backbone bandwidth required (10-30% reduction) once they've got a few
> hundred or more clients.  However you've really got to have serious
> high-end hardware to run a cache for several thousand users or more! ;-)

Hmm.  We are talking maybe 5 users on a 640k downlink DSL line.  Perhaps it
will be totally awful.  It seems to me that just caching images on common
sites should be a pretty big help, though.  But then, for that, a simple
proxy written in Python should do the job.
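The caching logic for such a simple proxy might look something like the
sketch below: an in-memory store keyed by URL, with a time-to-live and a
pluggable fetch function.  Everything here (the SimpleCache name, the ttl
parameter, the fetch callable) is illustrative, not something from this
thread; a real proxy would also need an HTTP server front end and
honor cache-control headers.

```python
import time


class SimpleCache:
    """Minimal URL cache sketch: hit if fresh, otherwise fetch upstream."""

    def __init__(self, fetch, ttl=3600):
        self.fetch = fetch   # callable taking a URL, returning bytes
        self.ttl = ttl       # seconds before a cached entry expires
        self.store = {}      # url -> (timestamp, body)

    def get(self, url):
        entry = self.store.get(url)
        if entry is not None:
            stamp, body = entry
            if time.time() - stamp < self.ttl:
                return body  # cache hit: no upstream fetch needed
        body = self.fetch(url)  # miss or expired: go upstream
        self.store[url] = (time.time(), body)
        return body
```

For the five-user case above, even this much would keep repeated image
fetches off the 640k link, since the second request for the same URL is
served from memory instead of going upstream.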

-- 
Joshua D. Boyd
