How to resolve duplicate content issues when using geo-targeted subfolders to separate the US and Canada
-
A client of mine is about to launch into the USA market (currently only operating in Canada) and they are trying to find the best way to geo-target. We recommended they go with the geo-targeted subfolder approach (___.com and ___.com/ca).
I'm looking for any ways to keep these pages from being flagged as duplicate content.
Your help is greatly appreciated.
Thanks!
-
Presuming your client is planning to launch in the US with largely the same content, perhaps with minor regional amendments, this would be an ideal situation for hreflang.
There is plenty out there on the correct use and implementation of this. You could start by reading this article. Hope this helps.
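For illustration, a minimal hreflang sketch, assuming the US pages sit at the root and the Canadian pages under /ca (example.com and the page path are placeholders); the same set of tags would go in the <head> of both versions of each page:

    <!-- On both the US and Canadian versions of the page -->
    <link rel="alternate" hreflang="en-us" href="https://example.com/some-page/" />
    <link rel="alternate" hreflang="en-ca" href="https://example.com/ca/some-page/" />
    <!-- x-default picks the version shown to users matching neither locale -->
    <link rel="alternate" hreflang="x-default" href="https://example.com/some-page/" />

Each version should also keep a self-referencing canonical so the hreflang annotations and canonicals don't contradict each other.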
-
The simple answer would be unique content or canonical URLs.
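For reference, a canonical tag is a single line in the <head> of the duplicate or variant page; a minimal sketch with placeholder URLs:

    <!-- On the duplicate/variant page; example.com is a placeholder -->
    <link rel="canonical" href="https://example.com/preferred-page/" />

One caveat: canonicalising the /ca pages to their US equivalents would stop the Canadian versions from ranking in their own right, so for geo-targeted duplicates hreflang (above) is usually the better fit.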
Related Questions
-
Is there a difference between .us and .org for a website targeting the US market?
Hello, we searched for some evidence regarding this and couldn't find much, hence the question. We have a service that is related only to the US market. If we buy a .us domain name with the service we provide in the domain name, will Google treat it the same as a .org? We did searches regarding this but didn't see too many .us domains popping up. Unfortunately, the .com is not available. Thank you
Intermediate & Advanced SEO | anitawapa
-
A lot of news / Duplicate Content - what to do?
Hi All, I have a blog with a lot of content (news and PR messages), and I want to move my blog to a new domain. What is your recommendation?
1. Keep it as is: 301 each old article to the same article at its new URL (a sketch of this follows below).
2. Remove all the duplicate content and create a 301 from each old URL to my homepage.
3. Keep it as is, but add a noindex meta tag to the duplicate articles.
Thanks!
Intermediate & Advanced SEO | JohnPalmer
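For option 1, a minimal sketch of a domain-wide 301 in an Apache .htaccess file on the old domain, assuming the article paths carry over unchanged (olddomain.com and newdomain.com are placeholders; if the paths change per article, individual redirect rules would be needed instead):

    # Send every request to the same path on the new domain with a 301
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
    RewriteRule ^(.*)$ https://www.newdomain.com/$1 [R=301,L]

-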
Parameter Strings & Duplicate Page Content
I'm managing a site that has thousands of pages due to all of the dynamic parameter strings that are being generated. It's a real estate listing site that allows people to create a listing, and it is generating lots of new listings every day. The Moz crawl report is continually flagging A LOT (25k+) of the site's pages for duplicate content due to all of these parameter-string URLs. Example: sitename.com/listings & sitename.com/listings/?addr=street name

Do I really need to do anything about those pages? I have researched the topic quite a bit but can't seem to find anything concrete as to the best course of action. My original thinking was to add the rel=canonical tag to each of the main URLs that have parameters attached. I have also read that you can bypass that by telling Google which parameters to ignore in Webmaster Tools. We want these listings to show up in search results, though, so I don't know if either of these options is ideal, since each would cause the listing pages (pages with parameter strings) to stop being indexed, right? Which is why I'm wondering if doing nothing at all will hurt the site?

I should also mention that I originally recommended the rel=canonical option to the web developer, who has pushed back, saying that "search engines ignore parameter strings." Naturally, he doesn't want the extra workload of setting up the canonical tags, which I can understand, but I want to make sure I'm giving him both the most feasible option to implement and the best option to fix the issues.
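For reference, the canonical approach would mean each parameterized URL carrying a tag in its <head> that points at the clean version (URLs adapted from the question's own example; the https protocol is an assumption):

    <!-- On sitename.com/listings/?addr=street-name -->
    <link rel="canonical" href="https://sitename.com/listings/" />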
Intermediate & Advanced SEO | garrettkite
-
URLs: Removing duplicate pages using anchor?
I've been working on removing duplicate content on our website. There are tons of pages created based on size, but the content is the same. The solution was to create a page with 90% static content and 10% dynamic content that changes depending on the size. Users can select the size from a dropdown box. So instead of 10 URLs, I now have one URL. Users can access a specific size by adding an anchor (strictly, a query parameter) to the end of the URL (?f=size1, ?f=size2).
For example, old URLs:
www.example.com/product-alpha-size1
www.example.com/product-alpha-size2
www.example.com/product-alpha-size3
www.example.com/product-alpha-size4
www.example.com/product-alpha-size5
New URLs:
www.example.com/product-alpha-size1
www.example.com/product-alpha-size1?f=size2
www.example.com/product-alpha-size1?f=size3
www.example.com/product-alpha-size1?f=size4
www.example.com/product-alpha-size1?f=size5
Do search engines read these parameters or drop them? Will the rank juice be transferred to just www.example.com/product-alpha-size1?
Intermediate & Advanced SEO | Bio-RadAbs
-
Using IP to deliver different sidebar content on homepage
We have a site with a generic top-level domain and we'd like to use a small portion of the homepage to cater content based on the IP of a visiting user. The content is for product dealerships around different regions/states of the US, not internationally. The idea being that someone from Seattle would see dealerships for this product near their location in Seattle. The section on the homepage is relatively small and would churn out 5 links and images according to location. The rest of the homepage would be the same for everyone, which includes links to news and reviews and fuller content.

We have landing pages for regional/state content deeper in the site that don't use an IP to deliver content and also have unique URLs for the different regions/states. An example being a "Washington State Dealerships" landing page with links to all the dealerships there.

We're wondering what kind of SEO impact there would be from having a section of the homepage delivering different content based on IP, and if there's anything we should do about it (or if we should be doing it at all!). Thank you.
Intermediate & Advanced SEO | seoninjaz
-
Duplicate Content Question
Hey everyone, I have a question regarding duplicate content. If your site is penalized for duplicate content, is it just the pages containing that content that are affected, or is the whole site affected? Thanks 🙂
Intermediate & Advanced SEO | jhinchcliffe
-
Duplicate Page Content / Titles Help
Hi guys, my SEOmoz crawl diagnostics throw up thousands of duplicate page content / title errors, which mostly come from the forum attached to my website. In particular, it's the forum users' profiles that are causing the issue; below is a sample of the URLs being penalised: http://www.mywebsite.com/subfolder/myforum/pop_profile.asp?mode=display&id=1308

I thought that adding http://www.mywebsite.com/subfolder/myforum/pop_profile.asp to my robots.txt file under 'Ignore' would cause the bots to overlook the thousands of profile pages, but the latest SEOmoz crawl still picks them up.

My question is, how can I get the bots to ignore these profile pages (they don't contain any useful content), and how much will this be affecting my rankings (bearing in mind I have thousands of errors for duplicate content and duplicate page titles)? Thanks guys Gareth
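For reference, robots.txt has no 'Ignore' directive; crawlers are blocked with Disallow rules. A minimal sketch using the path from the question:

    # Block all crawlers from the forum profile script
    User-agent: *
    Disallow: /subfolder/myforum/pop_profile.asp

Note that robots.txt only blocks crawling, not indexing, so URLs that are already indexed can linger until they are removed or served with a noindex directive.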
Intermediate & Advanced SEO | gaz3342
-
Finding Duplicate Content Spanning more than one Site?
Hi forum, SEOmoz's crawler identifies duplicate content within your own site, which is great. How can I compare my site to another site to see if they share "duplicate content"? Thanks!
Intermediate & Advanced SEO | Travis-W