The first point I would make is that 100 hits/sec of a 1 KB page (not a big HTML file, no images) works out to 100 KB/second, or roughly 0.8 Mbps, and with protocol overhead that in itself would pretty much use up a T1 line (1.544 Mbps). So you should plan to have at least double that for a Slashdot-level event (or more). In reality many pages are bigger than 1 KB, so you are really talking about 5 to 10 Mbps of network bandwidth. That's a lot.
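A quick back-of-the-envelope check of those numbers (the 40% overhead factor is a rough assumption, not a measured figure):

```python
# Bandwidth math for 100 hits/sec of a 1 KB page.
hits_per_sec = 100
page_bytes = 1024                              # 1 KB HTML page, no images

payload_bps = hits_per_sec * page_bytes * 8    # bits per second of payload
print(payload_bps)                             # 819200 -> about 0.8 Mbps

# Assume roughly 40% extra for TCP/IP and HTTP header overhead (a guess).
with_overhead_bps = payload_bps * 1.4
t1_bps = 1_544_000                             # T1 line capacity
print(with_overhead_bps / t1_bps)              # ~0.74 -> most of a T1
```

With bigger pages the payload term scales linearly, which is where the 5 to 10 Mbps figure comes from.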
Some testing has been done, and the underlying software is pretty reliable. Most sites under heavy load seem to end up caching parts of pages and "memoizing" expensive SQL queries (basically caching the result of an expensive query and re-running it at most once every x seconds).
I have done some stress testing of PG and have found it to be very reliable. I haven't had problems with database corruption.
Maintaining the site, in my opinion, has more to do with the changes you need to make to stay current (adding content, changing the HTML layout, etc.) than with worrying about the server staying up.