Sitemap.txt - Google reports 404 errors
Steve Campbell
Posted on December 29, 2006, 3:07 am
I've been using a sitemap.txt file with Google for about three months.
I've updated it faithfully with changes to my web pages and Google
seemed to upload my sitemap and crawl my site without difficulty.
But starting last week, I've gotten 4 or 5 HTTP 404 errors (page cannot be
found) every time I've uploaded the sitemap to my Google webmaster
account. The funny thing is that it seems to be a different 4 or 5 pages
each time. The site has 39 files listed in the sitemap.
I never get a 404 error when I visit my site with my browser. My host
said it had to be an issue with the Google robot (of course) because
their servers were working fine.
Is anyone else getting unexplainable 404 errors from the Google robot?
Could it be because I'm using the TXT file and not ASP?
Re: Sitemap.txt - Google reports 404 errors
Steve Campbell wrote:
A text sitemap file is OK; it makes no difference in this case
whether you use a text sitemap file or an
ASP dynamically generated sitemap file in XML format.
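For reference, the two formats might look like this (the example.com URLs below are made up, not taken from the poster's site): a text sitemap is just one fully qualified URL per line, while the XML version wraps each URL in a url/loc element.

```
# sitemap.txt - one absolute URL per line, nothing else
http://www.example.com/
http://www.example.com/page1.html

# sitemap.xml - the same URLs in Sitemap protocol XML
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/page1.html</loc></url>
</urlset>
```

Both list the same URLs, so Googlebot fetches the same pages either way.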
There have been some postings about 404 errors in the
Google Webmaster Help > Sitemap Protocol group.
If possible, check your server's raw access logs to see whether
you can find those 404 HTTP status responses to Googlebot
for those specific URLs; that may give you an idea of what happened.
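One quick way to pull those entries out of a combined-format access log is to filter on the Googlebot user agent and the status code field. A minimal sketch (the log lines below are fabricated samples, not real traffic; adjust the filename and field position to your host's log format):

```shell
# Create a small fabricated sample of a combined-format access log
cat > access.log <<'EOF'
66.249.66.1 - - [22/Dec/2006:10:15:01 +0000] "GET /page1.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [22/Dec/2006:10:15:07 +0000] "GET /page2.html HTTP/1.1" 404 1024 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
10.0.0.5 - - [22/Dec/2006:10:16:00 +0000] "GET /page2.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0"
EOF

# Keep only Googlebot requests, then only those with HTTP status 404
# (in combined log format the status code is the 9th whitespace field)
grep 'Googlebot' access.log | awk '$9 == 404'
```

This prints just the Googlebot hits that got a 404, so you can see which URLs and timestamps were involved.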
Also check if/when those URLs were cached by Google with
the cache: operator in a Google search.
If the URLs are well indexed (not Supplemental) and recently cached
by Google, then it is not that bad. But if not, there might
be some problem, so keep an eye on the
Google Webmaster Tools stats and on the server access logs
for 404 HTTP status responses for good URLs.
- Charles Sweeney
December 29, 2006, 11:06 pm