[linux] web-server help ( squid ?? )

Michal michalg at gmail.com
Mon Oct 4 21:36:51 PDT 2004


On Mon, 4 Oct 2004 17:35:10 -0700 (Pacific Daylight Time), Ed Mulligan
<mulligan at u.washington.edu> wrote:

> Greetings UW Linux-land
>
> After taking over 9 million hits on our web server in the last week (&
> that's not counting the 14 million that went to a matching C&C server in
> the same time period), we have decided that we need to do something before
> the Mt St Helens deluge does us in. (Yes, we were slashdotted too.)
>
> Any opinions on the use of "squid" as a reverse-proxy-caching server??
>
> Any direct experience?
>
> I seem to recall that if a site is mostly static pages (like ours), a big
> memory machine dedicated to running squid can really spew it out.
> (Faster than Apache alone???)


If you have _only_ static pages (no SSI, CGI, PHP), then consider
publicfile. It's simple, lightweight, secure, and fast. Its author
claims:

"On a Pentium II-350 under OpenBSD, according to various HTTP
benchmarking tools, publicfile handles 200-300 small single-fetch
connections per second. That's 20 million connections per day."

More at:

http://cr.yp.to/publicfile.html

It takes some work to get running, including installing two other
packages by the same author (daemontools and ucspi-tcp), but I've heard
very good things about it.
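
The basic shape, once daemontools and ucspi-tcp are in place, is a
supervised "run" script that hands port 80 to publicfile's httpd.
Something roughly like this -- a sketch only, with the account name,
install path, and limits as placeholders rather than the exact script
from the install docs:

    #!/bin/sh
    # sketch of a daemontools run script for publicfile's httpd
    exec 2>&1
    # envuidgid sets $UID/$GID, which httpd uses to drop root;
    # softlimit caps resource usage; tcpserver listens on port 80
    # and runs one httpd per connection.
    exec envuidgid pubfile softlimit -o20 -d50000 \
      tcpserver -v -R -H -l 0 0 80 \
      /usr/local/publicfile/bin/httpd /public/file

httpd chroots into the directory you hand it (/public/file here) and
serves only what's under it, which is where much of the security story
comes from.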

Though, maybe it's a bit too bare-metal for your needs.
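
If you do stay with Apache and just want squid in front of it, the
accelerator (reverse-proxy) mode in squid 2.5 is only a handful of
squid.conf lines. A rough sketch, with the hostname, port, and memory
size made up:

    # squid.conf -- reverse-proxy ("accelerator") sketch for squid 2.5
    # squid takes the public hits on port 80 and forwards misses to the
    # real Apache box ("www-origin.internal" is a made-up name).
    http_port 80
    httpd_accel_host www-origin.internal
    httpd_accel_port 8080
    httpd_accel_single_host on
    httpd_accel_uses_host_header off
    # keep the hot static pages in RAM
    cache_mem 512 MB

You'd still need the usual acl / http_access lines so outside clients
are allowed through, but for a mostly-static site that big cache_mem is
where the "big memory machine" payoff comes from.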

-Michal

