Latest posts made by mreeves

- RE: Complex duplicate content question

I thought the same, but a couple of weeks ago, when I was reviewing the impact of implementing rel canonical (which was done several months earlier), I found quite a number of situations where Google was not taking notice of the rel canonical guidance and was using the wrong site within its search results (I'm having difficulty finding a specific example at the moment, though). The conclusion I came to was that Google simply views rel canonical as a suggestion rather than a rule.
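For reference, the canonical hints I was checking are just link elements in the head of each duplicate page, along these lines (the URL below is one of my own listing pages, used purely as an illustration):

    <!-- On the copies of this listing published on the other two sites,
         pointing search engines at the preferred alderleyedge.com URL -->
    <link rel="canonical" href="http://dev.alderleyedge.com/directory/listing/138/the-grill-on-the-edge" />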
- Complex duplicate content question
We run a network of three local web sites covering three places in close proximity. Each site has a lot of unique content (mainly news), but there is a business directory that is shared across all three sites. My plan is that the search engines only index the directory listings for businesses that are actually located in the place each site focuses on, i.e. listing pages for businesses in Alderley Edge are only indexed on alderleyedge.com and businesses in Prestbury only get indexed on prestbury.com, but every business has a listing page on each site.
What would be the most effective way to do this? I have been using rel canonical but Google does not always seem to honour this.
Would using meta noindex tags where appropriate be the way to go? Or would changing the URL structure to include the place name and using robots.txt be a better option?
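To be clear, by the noindex route I mean something along these lines in the head of each out-of-area listing page (illustrative only, on the assumption that the templates know which place a business belongs to):

    <!-- e.g. a Prestbury business rendered on alderleyedge.com: keep the
         page out of the index but still allow its links to be followed -->
    <meta name="robots" content="noindex, follow" />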
As an aside, my current URL structure is along the lines of:
http://dev.alderleyedge.com/directory/listing/138/the-grill-on-the-edge
Would changing this have any SEO benefit?
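To make the robots.txt option concrete: as far as I can see it would only work if the paths included the place name, e.g. /directory/alderley-edge/138/the-grill-on-the-edge, so that each site could block the areas it does not cover. A rough, purely hypothetical sketch of the robots.txt on alderleyedge.com:

    # Hypothetical robots.txt for alderleyedge.com, assuming listings
    # were re-organised under place-name path segments
    User-agent: *
    Disallow: /directory/prestbury/
    # ...plus a similar Disallow for the third site's area

I realise robots.txt only controls crawling rather than indexing, which is partly why I'm unsure whether it is any better than noindex here.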
Thanks
Martin