Complex duplicate content question
-
We run a network of three local websites covering three places in close proximity. Each site has a lot of unique content (mainly news), but there is a business directory that is shared across all three sites. My plan is that the search engines only index the businesses in the directory that are actually located in the place each site focuses on, i.e. listing pages for businesses in Alderley Edge are only indexed on alderleyedge.com and businesses in Prestbury only get indexed on prestbury.com - but every business has a listing page on each site.
What would be the most effective way to do this? I have been using rel=canonical, but Google does not always seem to honour it.
Would using meta noindex tags where appropriate be the way to go? Or would changing the URL structure to include the place name, and using robots.txt, be a better option?
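For the noindex route, I'm picturing something like this on alderleyedge.com for a business that is actually in Prestbury (the page itself is hypothetical):

<!-- alderleyedge.com listing page for a Prestbury business -->
<head>
  <meta name="robots" content="noindex, follow" />
</head>

My understanding is that "follow" would still let crawlers pass through the page's links even though the page itself stays out of that site's index.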
As an aside, my current URL structure is along the lines of:
http://dev.alderleyedge.com/directory/listing/138/the-grill-on-the-edge
Would changing this have any SEO benefit?
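If I did restructure to put the place name in the URL, I imagine the robots.txt on alderleyedge.com would end up looking something like this (paths purely hypothetical):

# robots.txt on alderleyedge.com (hypothetical structure)
User-agent: *
Disallow: /directory/prestbury/
Disallow: /directory/third-town/

Though as I understand it, robots.txt only stops crawling, not indexing - blocked URLs can still surface in results if they're linked to - which is partly why I'm unsure it's the right tool here.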
Thanks
Martin
-
I went through both sites and noticed that you don't have canonical tags on any of your pages. I also don't see any XML sitemaps (I've sketched a minimal one at the end of this answer).
If you can show only the businesses that are in each site's particular area, and remove the ones that are not, I would probably go for that, unless you are doing local search as well.
I would be curious to see the example of canonical issue you mentioned.
Are these Joomla sites? I'm just asking because there are no meta description tags on anything except the home page.
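On the sitemaps: a minimal XML sitemap is quick to put together. Using one of your listing URLs, it would look something like this (one <url> entry per indexable page):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://dev.alderleyedge.com/directory/listing/138/the-grill-on-the-edge</loc>
  </url>
</urlset>

One per site would also make it easier to see which listing pages each site actually has indexed.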
-
Maybe this article on the Matt Cutts blog can help you: http://www.mattcutts.com/blog/rel-canonical-html-head/
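The short version of that post: the canonical link element only counts if it sits in the <head> of the page, e.g.:

<head>
  <link rel="canonical" href="http://dev.alderleyedge.com/directory/listing/138/the-grill-on-the-edge" />
</head>

Google ignores rel=canonical found in the <body>.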
-
I thought the same, but a couple of weeks ago, when I was reviewing the impact of implementing rel=canonical (which had been done several months earlier), I found quite a number of situations where Google was not taking notice of the rel=canonical guidance and was using the wrong site in its search results (I'm having difficulty finding a specific example at the moment, though). The conclusion I came to was that Google simply views rel=canonical as a suggestion rather than a rule.
-
I'd suggest that rel=canonical is the perfect way to handle this. What seems to be the problem with this approach that makes you want to change it?
Related Questions
-
Geo Targeting Content Question
Hi, all. First question here, so be gentle, please. My question is around geo-targeted dynamic content. At the moment we run a .com domain with, for example, an article about running headphones, and at the end - taking up about 40% of the content - is a review of some that people can buy, with affiliate links. We have a .co.uk site with the same page about running headphones and then 10 headphones for the UK market. Note: rel alternate is used on the pages to point to each other, therefore (hopefully) removing duplicate content issues. This design works well, but it means having to build links to two pages in the case of this example. What we are thinking of doing is to use just the .com domain and have the product part of the page served dynamically, i.e. people in the UK see UK products and people in the US see US products. What are people's thoughts on this technique, please? From my understanding, it wouldn't be a problem with Google for cloaking etc., because a googlebot and a human from the same country will see the same content. The site is made in Wordpress and has <html lang="en-US"> (for the .com) in the header. Would this cause problems for the page ranking in the UK etc.? The ultimate goal of doing this would be to reduce link-building efforts by halving the number of pages that links would have to be built for. I welcome any feedback. Many thanks
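For reference, the alternate markup we have on the two pages looks something like this (URLs swapped for hypothetical ones):

<!-- On the .com article, mirrored on the .co.uk article -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/running-headphones/" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/running-headphones/" />

My worry with the dynamic approach is that Googlebot mostly crawls from US IP addresses, so it might only ever see the US version of the product block.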
Technical SEO | TheMuffinMan
-
404 Error Pages being picked up as duplicate content
Hi, I recently noticed an increase in duplicate content, but all of the pages are 404 error pages. For instance, Moz site crawl says this page: https://www.allconnect.com/sc-internet/internet.html has 43 duplicates, and all the duplicates are also 404 pages (https://www.allconnect.com/Coxstatic.html, for instance, is a duplicate of this page). Looking for insight on how to fix this issue: do I add a rel=canonical tag to these 60 error pages that points to the original error page? Thanks!
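One thing I haven't been able to confirm is whether these URLs return a real 404 status or a 200 with an error page (a soft 404), which I gather is what makes a crawler see them as duplicates of each other. I believe a quick check from the command line would be something like:

curl -s -o /dev/null -w "%{http_code}\n" https://www.allconnect.com/Coxstatic.html

If that prints 200 rather than 404, presumably the fix is server-side (returning a genuine 404 or 410) rather than canonical tags.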
Technical SEO | kfallconnect
-
Duplicate Content - Different URLs and Content on each
Seeing a lot of duplicate content instances between seemingly unrelated pages. For instance, http://www.rushimprint.com/custom-bluetooth-speakers.html?from=topnav3 is being tracked as a duplicate of http://www.rushimprint.com/custom-planners-diaries.html?resultsperpg=viewall. Does anyone else see this issue? Is there a solution anyone is aware of?
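My only guess so far is that the query strings ("?from=topnav3", "?resultsperpg=viewall") are spawning URL variants that get compared against one another. Would a self-referencing canonical that drops the parameters be the fix? Something like this on the speakers page:

<link rel="canonical" href="http://www.rushimprint.com/custom-bluetooth-speakers.html" />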
Technical SEO | ClaytonKendall
-
Rel=canonical overkill on duplicate content?
Our site has many different health centers - many of which contain duplicate content, since there is topic crossover between health centers. I am using rel=canonical to deal with this. My question is this: Is there a tipping point for duplicate content where Google might begin to penalize a site even if it has the rel=canonical tags in place on cloned content? As an extreme example, a site could have 10 pieces of original content, but could then clone and organize this content in 5 different directories across the site, each with a new URL. This would ultimately result in the site having more "cloned" content than original content. Is this at all problematic even if rel=canonical is in place on all cloned content? Thanks in advance for any replies.
Eric
Technical SEO | Eric_Lifescript
-
Hosted Wordpress Blog creating Duplicate Content
In my first report from SEOmoz, I see that there are a bunch of "duplicate content" errors that originate from our blog hosted on Wordpress. For example, it's showing that the following URLs all have duplicate content:
http://blog.kultureshock.net/2012/11/20/the-secret-merger/ys/
http://blog.kultureshock.net/2012/11/16/vendome-prize-website/gallery-7701/
http://blog.kultureshock.net/2012/11/20/the-secret-merger/sm/
http://blog.kultureshock.net/2012/11/26/top-ten-tips-to-mastering-the-twitterverse/unknown/
http://blog.kultureshock.net/2012/11/20/the-secret-merger/bv/
They all lead to the various images that have been used in various blog posts. But I'm not sure why they are considered duplicate content, because they have unique URLs and the title meta tag is unique for each one. Even so, I don't want these extraneous URLs cluttering up our search results, so I'm removing all of the links that were automatically created when placing the images in the posts. But once I do that, will these URLs eventually disappear, or continue to be there? Because our blog is hosted by Wordpress, I unfortunately can't add any of the SEO plugins I've read about, so I'm wondering how to fix this without special plugins. Thanks!
Tom
Technical SEO | TomHu
-
Taking descriptions from Manufacturer sites and Duplicate content
We are doing some inventory improvements, e.g. new photographs from various angles, etc. We are also writing descriptions for each product. As one of our suppliers has perfect descriptions on their site, what is the theory on how duplicate content will affect our ranking for these products if we copy and paste? Also, if we change the descriptions, just how different do they need to be? Thanks
Technical SEO | seanmccauley
-
Caps in URL creating duplicate content
I'm getting a bunch of duplicate content errors where the crawl is saying www.url.com/abc has a duplicate at www.url.com/ABC. The content is in Magento and the URL settings are lowercase, and I can't figure out why it thinks there is duplicate content. These are pages with a decent number of inbound links.
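I've read that a server-level 301 redirect from uppercase to lowercase URLs is the usual belt-and-braces fix. For Apache it would apparently look something like this, with the RewriteMap line in the main server or vhost config (it isn't allowed in .htaccess):

# Define a lowercasing map in the server/vhost config
RewriteMap lc int:tolower

# Then redirect any URL containing uppercase letters
RewriteEngine On
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lc:$1} [R=301,L]

Does that sound right, or is there a Magento-side setting I'm missing?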
Technical SEO | JohnBerger
-
I am posting an article on my site and another site has asked to use the same article - is this a duplicate content issue with Google if I am the creator of the content, and will it penalize our sites - or one more than the other?
I operate an ecommerce site for outdoor gear and was invited to guest post on a popular blog (not my site) about a trip I had been on. I wrote the article for them, and I will also post this same article on my website. Is this a duplicate content problem with Google for my site and/or the other site? Any help appreciated. Also, what if I wanted to post this same article to 1 or 2 other blogs, as long as they link back to me as the author of the article?
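From what I've read, the standard safeguard when syndicating is a cross-domain rel=canonical on each republished copy, pointing back to the version that should rank - something like this in the head of the guest post (URL hypothetical):

<link rel="canonical" href="http://www.example.com/my-original-trip-article/" />

Would that be enough to keep the copies from competing with my original?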
Technical SEO | isle_surf