Forum OpenACS Q&A: Response to Encoded Email for Webpages

Posted by Ken Mayer on
I have an emotional response every time I see this topic come up. First, because I think spammers are evil; second, because in order to thwart harvesting, I have to reduce the utility of a web page; third, because no matter what mapping we use, it only takes a bit of code to translate the mapping back to the original address, and once that code is written it can be used everywhere. Meanwhile, we've made our own work more tedious. My personal experiences were pretty painful, too.
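To illustrate the "it is only a matter of code" point: any fixed obfuscation scheme, such as HTML entity-encoding the address, is undone by one library call. A minimal Python sketch, with a made-up address standing in for whatever the page encodes:

```python
import html

# What the obfuscated page source might contain:
encoded = "&#106;&#111;e&#64;example&#46;com"

# One call recovers the plain address; a harvester can do the same
# to every page on the web once this line is written.
decoded = html.unescape(encoded)
print(decoded)  # joe@example.com
```

The same argument applies to ROT13, JavaScript string concatenation, or any other deterministic mapping: the decoder is written once and reused everywhere.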

I used to get ~1000 spams a month, and since I was checking my e-mail perhaps only once a month, in Internet cafes in Mexico (with connection speeds varying from 9600 to 54000 bps), POP3 transfer times were significant and costly.

I have since installed SpamAssassin. It seems to work well, scoring spam based on regexp patterns, and I have the option of adjusting the threshold level. So far, I toss anything that scores over 14 to /dev/null, and anything greater than 5 goes into my grey bucket. I still get spam, but less than one message per day.
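The routing described above can be sketched as a simple threshold check. This is just an illustration of my thresholds, not SpamAssassin's own code (in practice the routing would live in a delivery filter that reads the score header):

```python
def route(score: float) -> str:
    """Route a message by its spam score, using the thresholds above."""
    if score > 14:
        return "/dev/null"  # confident spam: discard outright
    if score > 5:
        return "grey"       # suspect: grey bucket for occasional review
    return "inbox"          # probably legitimate

print(route(20.0))  # /dev/null
print(route(8.0))   # grey
print(route(1.2))   # inbox
```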

I guess the point of all this is that, with SpamAssassin, rewriting web pages to protect against spam harvesters proved unnecessary.

YMMV