October 15, 2005, 10:35 am
I just read a forum thread about Google's duplicate content filter, and it brings up some questions.
The thread can be found here:
It basically theorizes that the duplicate content filter has recently gone haywire and is removing too many pages.
My site is made up of many products, and each product has three pages: an overview, a specifications page, and a large image of the product. All three have similar titles since they all describe the same product.
Example of page titles, where the product name is ABC:
ABC (the main product page, with a small image of the product and an overview of its features)
ABC large (a page with a hi-res image)
ABC specifications (a page for the techies who want to dig deep into the specs)
All 3 pages will be linked for ease of navigation.
According to the theory in the forum thread I just read, Google's duplicate filter has gone wacky: instead of filtering only the two pages linked from ABC as dupes, it is removing all three pages.
If this is the case, then a solution would be to plop a "noindex" meta tag on the two supporting pages.
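For reference, the tag would go in the <head> of each supporting page. A minimal sketch (assuming the "ABC large" and "ABC specifications" pages are the ones to exclude) might look like:

```html
<!-- Placed in the <head> of the "ABC large" and "ABC specifications"
     pages only, not the main ABC page. "noindex" keeps the page out of
     the index, while "follow" still lets crawlers follow its links
     back to the main product page. -->
<meta name="robots" content="noindex, follow">
```

The main ABC page would carry no such tag, so it alone remains in the index.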
Has anyone here experimented with this kind of thing?