Googlebot Page Selection

Does anyone know how Googlebot decides which of the pages it has crawled
to include in its index? On my site, it seems to have picked the least
relevant page among many that have a much higher keyword density.

Also, is using the same META information (description, etc.) on all pages
of a website considered a form of spam, and/or does it affect page ranking?

thanks in advance,


Re: Googlebot Page Selection

On Sat, 06 May 2006 16:45:34 +0200,  


It's considered stupidity ;)

It is not spam. However, by using the same meta tags on every page you lose
the chance to differentiate your pages. Analysis of the title text seems to
be the most important part of the G algo (the on-page part, at least;
external links can be/are more important). The description is probably not
used in ranking (or carries much less weight), but G can use it as the
snippet describing your page in the search results. A good snippet can win
you a visitor even if you are well down the SERPs page; a weak one can lose
a visitor even if you are #1.
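To make the point concrete, here is a minimal sketch of what "diversified" meta tags might look like across two pages of the same site (the site name, file names, and wording are all invented for illustration):

```html
<!-- products.html : title and description specific to this page -->
<title>Acme Widgets - Product Catalogue</title>
<meta name="description"
      content="Browse Acme's full range of widgets, with prices and specifications.">

<!-- support.html : different title and description, matching this page's content -->
<title>Acme Widgets - Support &amp; FAQ</title>
<meta name="description"
      content="Answers to common widget questions and how to reach Acme support.">
```

Each page's title and description then reflect that page's actual content, rather than one generic blurb repeated site-wide.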


Re: Googlebot Page Selection

On 6 May 2006 07:45:34 -0700,



Not spam so much as a wasted asset. Description tags can be used as
snippets, so if there's something enticing in there it can encourage
clickthroughs.

