Forum OpenACS Q&A: Re: Approach to Google-optimizing

Posted by Tom Jackson on

You will screw yourself bigtime if you try to trick googlebot. Here are some links; read these first. I'm launching a campaign to redo a few sites myself.

First, is the Greenpeace page you are interested in ranking actually in Google? Beyond that point, page rank depends on the keywords you choose. You shouldn't judge yourself, or allow Greenpeace to judge you, based on how a page ranks, but you can ask yourself: what are the keywords I want Google to index? Are those words used on the page as a main theme? How relevant are other pages on the net when you search those keywords? Also consider what users are actually typing into Google. You really want Google to return Greenpeace on subjects where they have some authority, but you cannot choose what users will type.

Bottom line is there are no tricks, only sound writing skills and webmastering. Maybe one exception: hopefully a search for "Greenpeace" will return their home page...

Posted by Jeff Davis on
We should also try to set noindex,nofollow robots meta tags to reduce the number of duplicates returned from some of the applications. In particular, something like bug-tracker or photo-album returns the same content many times, and since Google limits the number of pages it indexes when they carry query variables, the duplicates reduce the depth of the indexing on the site.

My idea for bug-tracker would be to only allow indexing of pages when no state variables are set, and for photo-album to mark the medium-size view noindex,nofollow so that the large image is not indexed.
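For anyone unfamiliar with the mechanism being discussed: the robots meta tag goes in the page's `<head>`, and crawlers that honor it will skip the page and its outbound links. A minimal sketch (the exact template where you'd emit this in bug-tracker or photo-album is up to the implementer):

```html
<head>
  <!-- Tell compliant crawlers not to index this page
       and not to follow links from it -->
  <meta name="robots" content="noindex,nofollow">
</head>
```

In an OpenACS template you would emit this tag conditionally, e.g. only when query/state variables are present on a bug-tracker page, so that the canonical, parameter-free view remains indexable.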