HELP! How does one prevent regional pages from being counted as "duplicate content," "duplicate meta descriptions," et cetera...?
-
The organization I am working with has multiple versions of its website, each geared towards a different region:
- US - http://www.orionhealth.com/
- CA - http://www.orionhealth.com/ca/
- DE - http://www.orionhealth.com/de/
- UK - http://www.orionhealth.com/uk/
- AU - http://www.orionhealth.com/au/
- NZ - http://www.orionhealth.com/nz/
Some of these sites have very similar pages which are registering as duplicate content, meta descriptions and titles. Two examples are:
Now even though the content is the same, the navigation is different, since each region has different product options and services. A redirect won't work, because the navigation on the main US site differs from the navigation on the UK site. A rel=canonical tag seems like a viable option, but (correct me if I'm wrong) it tells search engines to index only the canonical page, which in this case would be the US version, and I still want the UK site to appear to search engines. So what is the proper way of treating similar pages across different regional directories?
Any insight would be GREATLY appreciated! Thank you!
-
Are they showing as duplicate content within Moz Analytics? As long as the search engines don't see them as duplicate content, you shouldn't have a problem. Here are some things to check:
- The content should be localised as best as possible (e.g. the differences between US, Canadian and British English).
- I'm not sure whether it's possible with your sites, but things like contact details and local currency can help send the right signals to Google etc. - so include the address and phone number if you have an HQ in each country.
- Each site should be registered separately with Bing and Google Webmaster Tools and geotargeted to the relevant location.
- Use the hreflang attribute if you're not already (triple-check to make sure it is set up properly): https://support.google.com/webmasters/answer/189077?hl=en - you can set Canadian English (en-ca) and Canadian French (fr-ca) - the language always comes before the country.
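To make that concrete, here is a rough sketch of what the hreflang annotations could look like in the head of each regional version of a page. The "/products/" path is hypothetical (substitute your real URLs), and note that the UK code is en-gb, not en-uk. Every regional version of the page should carry the full set of alternates, including a reference to itself:

```html
<!-- Hypothetical page path: repeat this block in the <head> of every regional version -->
<link rel="alternate" hreflang="en-us" href="http://www.orionhealth.com/products/" />
<link rel="alternate" hreflang="en-ca" href="http://www.orionhealth.com/ca/products/" />
<link rel="alternate" hreflang="de-de" href="http://www.orionhealth.com/de/products/" />
<link rel="alternate" hreflang="en-gb" href="http://www.orionhealth.com/uk/products/" />
<link rel="alternate" hreflang="en-au" href="http://www.orionhealth.com/au/products/" />
<link rel="alternate" hreflang="en-nz" href="http://www.orionhealth.com/nz/products/" />
<!-- Fallback for visitors who match none of the above -->
<link rel="alternate" hreflang="x-default" href="http://www.orionhealth.com/products/" />
```

Unlike rel=canonical, this doesn't hide the regional versions from search engines; it tells Google which version to serve to which audience, which is usually what resolves the duplicate flags.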
As for your terms and conditions: I noticed a lot of companies use very similar wording to yours, so that could be a duplicate content issue across completely different websites. You could rewrite the Ts and Cs, though I wouldn't see any harm in noindexing those particular pages.
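If you do go the noindex route for the terms and conditions pages, it's one line in the head of each of those pages - using "follow" so that crawlers still pass through any links on them:

```html
<!-- Only in the <head> of the terms and conditions pages: keep them out of the index, but let crawlers follow their links -->
<meta name="robots" content="noindex, follow" />
```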
Related Questions
-
B2B site targeting 20,000 companies with 20,000 dedicated "target company pages" on own website.
An energy company I'm working with has decided to target 20,000-odd companies on their own B2B website, by producing a new dedicated page per target company - each page including unique copy and a sales proposition (20,000-odd new pages to optimize! Yikes!). I've never come across such an approach before... what might be the SEO pitfalls (other than that being a helluva number of pages to optimize)? Any thoughts would be very welcome.
How do i prevent Google and Moz from counting pages as duplicates?
I have 130,000 profiles on my site. When not connected to them, they have very few differences, so a bot (not logged in, etc.) will see a login form and "Connect to Profilename". Moz and Google call the links the same, even though they're unique, such as example.com/id/328/name-of-this-group and example.com/id/87323/name-of-a-different-group. So how do I separate them? Can I use Schema or something to help identify that these are profile pages, or that the content on them should be ignored as it's help text, etc.? Take Facebook - each Facebook profile for a name renders simple results: https://www.facebook.com/public/John-Smith and https://www.facebook.com/family/Smith/. Would that be duplicate data if Facebook had a "Why to join" article on all of those pages?
Duplicate Page Content Issues Reported in Moz Crawl Report
Hi all, We have a lot of 'Duplicate Page Content' issues being reported on the Moz Crawl Report and I am trying to get to the bottom of why they are deemed errors. This page, http://www.bolsovercruiseclub.com/about-us/job-opportunities/, has (admittedly) very little content and is duplicated with:
- http://www.bolsovercruiseclub.com/cruise-deals/cruise-line-deals/explorer-of-the-seas-2015/ - basically an image with just a couple of lines of static content.
- http://www.bolsovercruiseclub.com/cruise-lines/costa-cruises/costa-voyager/ - relates to a single cruise ship and again has minimal content.
- http://www.bolsovercruiseclub.com/faq/packing/ - an FAQ page, again with only a few lines of content.
- http://www.bolsovercruiseclub.com/cruise-deals/cruise-line-deals/exclusive-canada-&-alaska-cruisetour/ - another page that just features an image and NO content.
- http://www.bolsovercruiseclub.com/cruise-deals/cruise-line-deals/free-upgrades-on-cunard-2014-&-2015/?page_number=6 - a cruise deals page that has a little static content and a lot of dynamic content (which I suspect isn't crawled).
So my question is: is the duplicate content issue caused by the fact that each page has thin or no content? If that is the case, then I assume the simple fix is to add or increase the content? I realise that I may have answered my own question, but my brain is pickled at the moment and so I am just seeking assurances! 🙂 Thanks Andy
Redirecting thin content city pages to the state page, 404s or 301s?
I have a large number of thin content city-level pages (possibly 20,000+) that I recently removed from a site. Currently, I have it set up to send a 404 header when any of these removed city-level pages are accessed. But I'm not sending the visitor (or search engine) to a site-wide 404 page. Instead, I'm using PHP to redirect the visitor to the corresponding state-level page for that removed city-level page. Something like:

    if (this city page should be removed) {
        header("HTTP/1.0 404 Not Found");
        header("Location: http://example.com/state-level-page");
        exit();
    }

Is it problematic to send a 404 header and still redirect to a category-level page like this? By doing this, I'm sending any visitors to removed pages to the next most relevant page. Does it make more sense to 301 all the removed city-level pages to the state-level page? Also, these removed city-level pages collectively have very little to no inbound links from other sites. I suspect that any inbound links to these removed pages are from low-quality scraper-type sites anyway. Thanks in advance!
Using "Read More" buttons as a tool to cram in Content
Hi Mozzers! Let's say our website is clean, professional, and minimalistic. Can we use a "read more" button that will expand the text on the page to increase the amount of content while (unless clicked) not impacting the appearance? I want to make sure I am not violating Google's Webmaster Guidelines on "Hidden Text". Thanks!
Duplicate description problem in Wordpress.
Webmaster Tools is flagging up duplicate descriptions for the page http://www.musicliveuk.com/live-acts. The page is one page in the WordPress page editor, and the web designer set it up so that I can add new live acts from a separate page editor on the left menu, which feeds into the page 'live-acts' (it says 'live-acts-feed' under template). The problem is that as I add more acts, it creates new URLs, e.g. http://www.musicliveuk.com/live-acts/page/2 and http://www.musicliveuk.com/live-acts/page/3 etc. I use the All in One SEO Pack, and Webmaster Tools tells me that pages 2/3/4 etc. all have the same description. How can I overcome this? I can't write new descriptions for each page, as the All in One SEO Pack will only allow me to enter one for the page 'live-acts'.
The system shows duplicate content for the same page (main domain and index.html). Is this an error in SEOmoz?
Should I be worried that this will affect SEO? Most sites redirect to the index.html page, right? [edited by staff to remove toolbar data]
Should "View All Products" be the canonical page?
We currently have "view 12" as the default setting when someone arrives at www.mysite.com/subcategory-page.aspx. We have been advised to change the default to "view all products" and make that the canonical page, to ensure all of our products get indexed. My concern is that doing this will increase the page load time and possibly hurt rankings. Does it make sense to change all of our subcategory pages to show all the products when someone visits the page? Most sites seem to have a smaller number of products as the default.