is this idea valid?

We have ASP pages on our site that are rich in keywords.

Each of these pages has the same static text about a particular location
(save for the name of the location, which is dynamically changed for
each page), but also includes some dynamic text.

The dynamic text is placed around 3 paragraphs down the page, after the
static text, and changes roughly every 3 days to keep the site fresh for
the Google bots.

Now I think this is crazy, but my boss reckons that if the dynamic text
were placed higher in the source code of the page, the Google bots would
be more likely to discover the newly updated text (and hence re-cache
the page) than if it were placed after 3 paragraphs of static
(i.e. non-changing) text.

How does Google decide whether a page has been updated? Does it compare
only the first few paragraphs, the file size, or the text of the whole page?

What percentage of the page has to be original for it not to be
considered duplicate content?

Any advice would be great.

"I hear ma train a comin'
... hear freedom comin"

Re: is this idea valid?

Nobody knows. It's generally held that just changing the date or the
equivalent won't excite a stampede of bots, though. And the file size
probably does need to change significantly - though I wouldn't try to
pin down what "significantly" means.
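
For what it's worth, here is a rough Python sketch of the kind of check I
mean: strip out the obviously volatile bits (tags, date stamps), hash
what's left, and only treat the page as updated when that hash changes.
To be clear, this is just an illustration of the idea - nobody outside
Google knows what they actually compare - and the fetch step and regexes
are my own assumptions.

import hashlib
import re
from urllib.request import urlopen

# Illustrative patterns only: a crude tag stripper and a numeric-date matcher.
TAG_PATTERN = re.compile(r"<[^>]+>")
DATE_PATTERN = re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b")   # e.g. 12/03/2006

def content_fingerprint(html):
    """Hash the visible text with obviously volatile tokens removed."""
    text = TAG_PATTERN.sub(" ", html)          # drop markup
    text = DATE_PATTERN.sub("", text)          # ignore 'last updated' style dates
    text = " ".join(text.split()).lower()      # collapse whitespace
    return hashlib.sha1(text.encode("utf-8")).hexdigest()

def has_meaningfully_changed(url, last_fingerprint):
    """True only when the stripped-down page text differs from last time."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    return content_fingerprint(html) != last_fingerprint

On a scheme like that, shuffling where the dynamic paragraph sits in the
source makes no difference at all - only whether the text itself changed.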

I like to avoid duplicate content filters by making sure 0% of my
content is original. Hey - works for me!

Actually, that was too easy. We dunno, really; no set limit has ever
been defined anywhere. Why worry? Write different pages and the problem
will never arise.
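
If you want a feel for how much two of your own pages overlap, a
home-made word-shingle comparison like the Python sketch below is one way
to eyeball it. It's a rough similarity score of my own devising, purely
for illustration - Google has never published any threshold.

import re

def shingles(text, size=4):
    """Return the set of overlapping word n-grams ("shingles") in the text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def overlap(page_a, page_b):
    """Jaccard similarity of the shingle sets: 0.0 = disjoint, 1.0 = identical."""
    a, b = shingles(page_a), shingles(page_b)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Prints a score between 0 and 1; the higher it is, the more wording the
# two snippets share.
print(overlap("Hotels in Leeds are great for visitors and tourists alike.",
              "Hotels in York are great for visitors and tourists alike."))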


         home of SEO that's shiny!
