Same site serving multiple countries and duplicated content
-
Hello!
Though I browse Moz resources every day, I've decided to ask you a question directly, despite the numerous questions (and answers!) already covering this topic, as there are a few specific variants each time:
I have a site serving content (and products) to different countries, built using subfolders (one subfolder per country).
Basically, it looks like this:
site.com/us/
site.com/gb/
site.com/fr/
site.com/it/
etc.
The first problem was fairly easy to solve:
Avoid duplicated content issues across the board, considering that both the ecommerce part of the site and the blog are replicated for each subfolder in its own language. Correct me if I'm wrong, but using our copywriters to translate the content and adding the right hreflang tags should do it.
But then comes the second problem: how to deal with duplicated content when it's written in the same language? E.g. /us/, /gb/, /au/ and so on.
Given the following requirements/constraints, I can't see any positive resolution to this issue:
1. This structure needs to be maintained (it's not possible to consolidate the same language into one single subfolder, for example),
2. Articles can't be canonicalized from one subfolder to another, as that would interfere with our internal tracking tools,
3. The amount of content being published prevents us from producing bespoke content for each region of the world that shares the same language.
Given those constraints, I can't see a way to solve this, and it seems I'm cursed to live with those duplicated content red flags right under my nose.
Am I right, or can you think of anything to sort that out?
Many thanks,
Ghill
-
Thanks Kristina, this is in place now!
-
I would recommend setting up each country's subdirectory as a separate property in Google Search Console. Then, in the original Search Console interface, go to Search Traffic > International Targeting, click the Country tab, and identify which country you're targeting users in.
That should give GSC enough information to not flag the content as duplicate.
Good luck!
-
A quick additional question to my initial one, though: it seems that there is no difference between using HTML tags, the HTTP header, or the XML sitemap to include hreflang annotations.
But is there any difference when it comes to GSC, SEO tools, online hreflang checkers and so on? E.g. if a [random] SEO tool spots duplicated content between two regions for a similar page whilst there are hreflang tags within the sitemap, shall I just ignore this warning (provided that the job has been done correctly), or does it mean that there is still something wrong?
Pretty much the same for GSC: if I find warnings around duplicated content whilst hreflang is in place, what does it mean?
Thanks!
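As a point of reference for the sitemap method mentioned above, hreflang annotations in an XML sitemap are expressed as xhtml:link elements on each URL entry. A minimal sketch, using the subfolder structure from the original question (the page path is a placeholder, not a real URL):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://site.com/us/some-page/</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="https://site.com/us/some-page/" />
    <xhtml:link rel="alternate" hreflang="en-gb" href="https://site.com/gb/some-page/" />
    <xhtml:link rel="alternate" hreflang="en-au" href="https://site.com/au/some-page/" />
  </url>
  <!-- The /gb/ and /au/ URLs each need their own <url> entry carrying the same set of links -->
</urlset>
Whichever placement is used, every URL in the group lists the full set of alternates, including a self-reference.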
-
Hi Kristina,
Reading quite a lot of literature on the topic, I was confident that hreflang would not help with duplicate content, and then I realized it was mainly outdated, old blog posts.
Out of curiosity, has the way hreflang is used evolved since its introduction, or is it just me going crazy?
Anyway, thanks loads for your help. It seems much "easier" (so to speak, as rolling out hreflang is not easy for huge international websites) than I thought.
-
It's for different regions as well. Check out the link I shared. Google lists the reasons for hreflang. The second reason is:
"If your content has small regional variations with similar content, in a single language. For example, you might have English-language content targeted to the US, GB, and Ireland."
-
Hi Kristina,
Thanks for your reply.
But from my understanding of hreflang, it mainly helps Google understand that the content is available in different languages or for other regions. It doesn't sort out duplicate content issues if the language remains the same across different regions.
-
For any duplicate content you have between countries, use hreflang to differentiate regions. Google lays out how to do that here.
Hope this helps!
Related Questions
-
Will I be flagged for duplicate content by Google?
Hi Moz community, I had a question regarding duplicate content that I can't seem to find the answer to on Google. My agency is working on a large number of franchisee websites (over 40) for one client, a print franchise, that wants a refresh of new copy and SEO. Each print shop has its own 'microsite'; all services and products are the same, the only difference being the location. Each microsite has its own unique domain. To avoid writing the same content over and over in 40+ variations, would all the websites be flagged by Google for duplicate content if we were to use the same base copy, with the only changes being to the store locations (i.e. where we mention a Toronto print shop on one site, it may change to a Kelowna print shop on another)? Since the print franchise owns all the domains, I'm wondering if that would be a problem, since the sites aren't really competing with one another. Any input would be greatly appreciated. Thanks again!
Intermediate & Advanced SEO | | EdenPrez0 -
Duplicate content in Shopify - subsequent pages in collections
Hello everyone! I hope an expert in this community can help me verify that the canonical code I'll add to our store is correct. Currently, in our Shopify store, the subsequent pages in the collections are not indexed by Google; however, the canonical URLs on these pages aren't pointing to the main collection page (page 1), e.g. the canonical URL of page 2, page 3 etc. is used instead of the first page of the collection. I have the canonical code attached below; it would be much appreciated if an expert could urgently verify that this code is good to use and will solve the above issues. Thanks so much for your kind help in advance!
-----------------CODES BELOW---------------
<title>
{{ page_title }}{% if current_tags %} – tagged "{{ current_tags | join: ', ' }}"{% endif %}{% if current_page != 1 %} – Page {{ current_page }}{% endif %}{% unless page_title contains shop.name %} – {{ shop.name }}{% endunless %}
</title>
{% if page_description %} {% endif %} {% if current_page != 1 %} {% else %} {% endif %}
{% if template == 'collection' %}{% if collection %}
{% if current_page == 1 %} {% endif %}
{% if template == 'product' %}{% if product %} {% endif %}
{% if template == 'collection' %}{% if collection %} {% endif %}
Intermediate & Advanced SEO | | ycnetpro101
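For context on the question above: the inner tags of the poster's snippet were lost in this archive, but a common pattern for pointing paginated collection pages back at page 1 looks something like the sketch below. This is an illustration only, assuming a standard theme.liquid; it is not the poster's verified code.
{% comment %} Sketch: canonicalize page 2, 3, ... of a collection to page 1 of that collection {% endcomment %}
{% if template contains 'collection' and current_page != 1 %}
  <link rel="canonical" href="{{ shop.url }}{{ collection.url }}" />
{% else %}
  <link rel="canonical" href="{{ canonical_url }}" />
{% endif %}
Whether consolidating paginated pages to page 1 is the right call is a separate question; the sketch only shows the mechanics the poster describes.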
Cached version of my site is not showing content?
Hi mozzers, I am a bit worried since I looked at a cached version of my site and somehow the content is only partially showing up and the navigation has completely disappeared. Where could this come from? What should I be doing? Thanks!
Intermediate & Advanced SEO | | Taysir0 -
Duplicate content on URL trailing slash
Hello, Some time ago, we accidentally made changes to our site which modified the way URLs in links are generated. At once, trailing slashes were added to many URLs (only in links). Links that used to point to
example.com/webpage.html
were now linking to
example.com/webpage.html/
URLs in the XML sitemap remained unchanged (no trailing slash). We started noticing duplicate content (because our site renders the same page with or without the trailing slash). We corrected the problematic PHP URL function so that now, all links on the site point to a URL without a trailing slash. However, Google had time to index these pages. Is implementing 301 redirects required in this case?
Intermediate & Advanced SEO | | yacpro13
-
Duplicate Content through 'Gclid'
Hello, We've had the known problem of duplicate content through the gclid parameter caused by Google AdWords. As per Google's recommendation, we added the canonical tag to every page on our site so that when the bot came to each page it would go 'Ah-ha, this is the original page'. We also added the parameter to the URL parameters in Google Webmaster Tools. However, it now seems as though a canonical is automatically being given to these newly created gclid pages; see below: https://www.google.com.au/search?espv=2&q=site%3Awww.mypetwarehouse.com.au+inurl%3Agclid&oq=site%3A&gs_l=serp.3.0.35i39l2j0i67l4j0i10j0i67j0j0i131.58677.61871.0.63823.11.8.3.0.0.0.208.930.0j3j2.5.0....0...1c.1.64.serp..8.3.419.nUJod6dYZmI Therefore these new pages are now being indexed, causing duplicate content. Does anyone have any idea what to do in this situation? Thanks, Stephen.
Intermediate & Advanced SEO | | MyPetWarehouse0 -
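For context, the fix the poster describes relies on every landing page emitting a canonical that points at the clean, parameter-free URL, so that ?gclid= variants consolidate to it. A minimal, hypothetical sketch (the domain and path are placeholders, not the poster's actual pages):
<!-- Served identically on /some-page/ and /some-page/?gclid=XYZ -->
<link rel="canonical" href="https://www.example.com/some-page/" />
If the gclid copies instead emit a canonical built from the requested URL (parameter included), they will keep getting indexed separately, which matches the behaviour described above.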
Removing duplicate content
Due to URL changes and parameters on our ecommerce sites, we have a massive amount of duplicate pages indexed by Google, sometimes up to 5 duplicate pages with different URLs.
1. We've instituted canonical tags site-wide.
2. We are using the parameters function in Webmaster Tools.
3. We are using 301 redirects on all of the obsolete URLs.
4. I have had many of the pages fetched so that Google can see and index the 301s and canonicals.
5. I created HTML sitemaps with the duplicate URLs, and had Google fetch and index the sitemap so that the dupes would get crawled and deindexed.
None of these seems to be terribly effective. Google is indexing pages with parameters in spite of the parameter (clicksource) being called out in GWT. Pages with obsolete URLs are indexed in spite of them having 301 redirects. Google also appears to be ignoring many of our canonical tags as well, despite the pages being identical. Any ideas on how to clean up the mess?
Intermediate & Advanced SEO | | AMHC0 -
Focusing on Multiple Niches for one site: good or bad?
Is it wise to focus on multiple niches for one site, rather than zeroing in on one or two different niches? On one hand, you can target many more topics and go after tons of keywords, but on the other hand, doesn't Google get confused about what your site is really about? Won't Google just focus on one of the niches that you provide more than all the others? Any input would be great!
Intermediate & Advanced SEO | | WebServiceConsulting.com0 -
How to get rid of duplicate content, titles, etc. on a PHP Cartweaver site?
My website http://www.bartramgallery.com was created using PHP and Cartweaver 2.0 about five years ago by a web developer. I was really happy with the results of the design and was inspired to get into web development, and I have been studying ever since. My biggest problem at this time is that I am not knowledgeable about PHP and the Cartweaver product, but I am learning as I read more. The issue is that SEOmoz tools are reporting tons of duplicate content and duplicate title pages etc. This is likely from the dynamic URLs and the same pages with secondary results etc. I just made a new sitemap with AuditMyPC, I think it was called, in an attempt to get rid of all the duplicate page titles, but is that going to solve anything or do I need to find another way to configure the site? There are many pages with the same content competing for page rank and it is a bit frustrating to say the least. If anyone has any advice it would be greatly appreciated, even pointing me in the right direction. Thank you, Jesse
Intermediate & Advanced SEO | | WSOT0