Google Sitemaps - a sure way to a lazy Googlebot?


Hello good people in a.i.s-e!

I want to run this observation by you and get some responses based on your
personal experience with G sitemaps. I have to admit, this may be a case
of ever increasing paranoia, but I can’t help thinking that my using G
sitemaps changed Googlebot behavior on my site(s) in an adverse way.
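Just so we are talking about the same thing: by “sitemaps” I mean the plain XML files submitted through the Sitemaps program. Here is a minimal sketch in Python of the kind of file I mean, assuming the standard sitemaps.org 0.9 schema; the URLs and output filename are placeholders, not my real site, and the early Google Sitemaps beta used its own schema URL, so adjust to whatever your account expects.

# Minimal sketch: write a sitemap in the sitemaps.org 0.9 format.
# The URL list and the output filename are placeholders.
from xml.sax.saxutils import escape

urls = [
    "http://www.example.com/",
    "http://www.example.com/some-page.html",
]

with open("sitemap.xml", "w") as out:
    out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    out.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for u in urls:
        out.write("  <url><loc>%s</loc></url>\n" % escape(u))
    out.write("</urlset>\n")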

It has been two months since I started using sitemaps. Right after I put
them up, the G saturation jumped from 100K+ pages to 1.2M+, which got me
very excited indeed. However, traffic has not increased a bit, which you’d
think is strange, because by sheer luck alone having 12 times more pages
should get you at least a couple of times more visits; that was not the
case. Additionally, having been watching Googlebot’s activity, especially
that of “Deepbot” (I am a true believer in the Freshbot/Deepbot concept),
I noticed a strange trend: the number of Deepbot visits went down sharply,
from 2K-3K per day to 50-300 per day. In line with that, the actual
measurable traffic from Google has stayed pretty steady: down one day, up
another, but stable over the long run (Jagger time included).
There are deviations from this behavior, such as an enormous spike in
Deepbot’s activity two days ago (75K pages/day), but the end result of
using the Sitemap seems to be a grand ZERO, if not negative. Traffic from
G was rising steadily before I put the maps up, and then it stopped at
that level, which makes me believe the number of Deepbot visits does
matter.
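For anyone wondering how I get per-day numbers like those: a rough sketch of the kind of tally I mean, in Python, assuming an Apache combined-format access log. The log path is a placeholder, and the user-agent string alone cannot separate Freshbot from Deepbot; that split is usually guessed from the crawler’s IP range, so you would filter further for a real Deepbot count.

# Rough sketch: tally Googlebot requests per day from an Apache
# combined-format access log. "access.log" is a placeholder path.
# The user-agent alone cannot tell Freshbot from Deepbot, so in
# practice you would also filter by the crawler's IP range.
import re
from collections import Counter
from datetime import datetime

hits = Counter()
with open("access.log") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        # Pull the date out of the timestamp, e.g. [21/Nov/2005:...
        m = re.search(r"\[(\d{2}/\w{3}/\d{4})", line)
        if m:
            day = datetime.strptime(m.group(1), "%d/%b/%Y").date()
            hits[day] += 1

for day, count in sorted(hits.items()):
    print(day, count)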

Can anyone confirm this trend? I’m contemplating yanking the sitemaps off
my sites, but I obviously don’t want to do any damage, so I’m looking for
some additional supporting data.


