I have a site that has 65 different versions of itself.
-
I've just started managing a site that serves over 50 different countries, and the entire web enterprise is being flagged for duplicate content because so much of it is repeated across versions. What's the best approach to stop this duplicate content while still serving all of the countries we need to?
-
Yes sir, I agree it will be a "bit of an effort". Thank you both for some great guidance, and if anybody else has other solutions to these types of issues, I welcome your feedback as well.
-
It may be a bit of an effort, but is it possible to work your way through the pages and make the content, titles, and descriptions unique so that they don't get flagged as duplicate content?
This has the added advantage of giving you a large number of pages targeted at your various keyphrases, whereas other approaches involving 301 redirects or rel="nofollow" reduce the duplicate content issue but also reduce the number of pages on which you can target those keyphrases. If the pages span 50 countries, is there a local spin that can be put on the content so that all the relevant terms are targeted in their regions, but Google doesn't see 50+ versions of the same site?
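For illustration, a minimal sketch of what "making titles and descriptions unique" per region can look like; the company name, copy, and keyphrases here are invented:

<!-- US version (hypothetical copy, localized selling points) -->
<title>Industrial Pumps for US Manufacturers | ExampleCo USA</title>
<meta name="description" content="Browse ExampleCo's industrial pumps, shipped from our US warehouses with 48-hour delivery.">

<!-- UK version (same product, locally reworded) -->
<title>Industrial Pumps for UK Manufacturers | ExampleCo UK</title>
<meta name="description" content="Browse ExampleCo's industrial pumps, dispatched from our UK depots with next-day delivery.">

Even small regional differences in wording, currency, and delivery terms give each page distinct titles and descriptions while still targeting the same product terms in each market.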
-
Thank you, Flatiron.
Yes, the content is on different servers due to the different countries they serve, as well as the languages. The client's US site is what I am working to improve, and they currently have over 2,500 duplicate title tags and meta descriptions out there. Would modifying the robots.txt file to instruct the search engines to crawl only the one main site and ignore the others be the best solution? My train of thought goes back to a case at a previous company, where their product list pages were seen as duplicates because each of the "sort" parameters was recognized as a separate duplicate URL by the search engines. We had to write an instruction to crawl only the first sorted results.
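As a rough sketch of that earlier fix (assuming the query parameter is literally named "sort"; real parameter names will differ), the robots.txt rules looked something like:

# robots.txt at the site root (hypothetical "sort" parameter)
User-agent: *
# Block every sorted variant of the product list pages...
Disallow: /*?sort=
Disallow: /*&sort=
# ...while the default, unsorted listing URLs stay crawlable.

Note that a robots.txt file only governs the host it is served from, so it cannot tell the engines to ignore the other country sites; each of those would need its own directives.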
-
Hi Ken,
Is the content actually exactly the same but running on different domains? That will determine how to approach this issue. If all the content is the same, you can either utilize 301 redirects or rel=canonical tags to help the engines view the multiple sites as a single site and consolidate any link juice associated with each of the 50 sites. If the content isn't an exact duplicate, then it (or the page titles) must be extremely similar. In the long run I would recommend localizing your content, not only to help from an SEO perspective but also to improve the user experience and, hopefully, the conversion rates as well.
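For reference, a minimal sketch of both options (domains are hypothetical). The rel=canonical tag goes in the head of each duplicate page and points at the version you want credited:

<!-- In the <head> of the duplicate page, e.g. on example.de (hypothetical) -->
<link rel="canonical" href="https://www.example.com/widgets">

The 301 alternative, shown here in Apache mod_rewrite syntax, redirects the duplicate host outright:

# .htaccess on the duplicate host (hypothetical domains)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.de$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

The practical difference: a canonical leaves the duplicate pages accessible to users while consolidating ranking signals, whereas a 301 removes them from circulation entirely.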
Related Questions
-
Multiregional site indexing problems
Hello there! I have a multiregional site and am dealing with some indexing problems: Google has only indexed our USA site. We have:
- set up hreflang tags
- set up specific subdirectories: https://www.website.com/ (our en-us and main site), https://www.website.com/en-gb, https://www.website.com/en-ca, https://www.website.com/fr-ca, https://www.website.com/fr-fr, https://www.website.com/es-es, ...
- set up automatic geo-IP redirects (301 redirects)
- created a sitemap index with a separate sitemap for each regional site
- created a Google Webmaster Tools property for each targeted country
- created translations for each language, and added canonicals pointing to the US site where its English content is reused.
The problem is that Google is not indexing our regional sites. I think the cause is that Google spiders the site from US IP addresses, so it is always 301-redirected to the US version. I have used Fetch as Google on some of our regional folders and requested "Indexing requested for URL and linked pages", but am still waiting. Any ideas? Changing the 301s to 302s? I really don't know what to do. Thank you so much!
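For reference, reciprocal hreflang annotations for the subdirectories described above might look like this (a minimal sketch using the URLs from the question; every version of a page lists all alternates, including itself):

<!-- In the <head> of each version of the page -->
<link rel="alternate" hreflang="en-us" href="https://www.website.com/">
<link rel="alternate" hreflang="en-gb" href="https://www.website.com/en-gb/">
<link rel="alternate" hreflang="en-ca" href="https://www.website.com/en-ca/">
<link rel="alternate" hreflang="fr-ca" href="https://www.website.com/fr-ca/">
<link rel="alternate" hreflang="fr-fr" href="https://www.website.com/fr-fr/">
<link rel="alternate" hreflang="es-es" href="https://www.website.com/es-es/">
<link rel="alternate" hreflang="x-default" href="https://www.website.com/">

The asker's own hypothesis is plausible: Googlebot crawls largely from US IP addresses, so an unconditional geo-IP 301 can prevent it from ever fetching the non-US folders, no matter how correct the hreflang markup is.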
International SEO | Alejandrodurn
-
Need help with search results for US site for a company that has many international sites
I am tasked with optimizing a US site for a company that has many international sites. Currently, if you search for just the main company name and don't include "USA" in your search, it won't even give you the US site on the SERP. It displays the Italian, French, etc. sites, even though I'm searching on Google in the US with a preferred language of English. Unfortunately, I don't have any control over the other sites, only the US one. Is there anything I can add to the US site (aside from setting the country code in GSC) so that when someone searches from within the USA, they get the US site and not all of the other ones? Thanks!
International SEO | SEOIntouch
-
In the U.S., how can I stop the European version of my site from outranking the U.S. version?
I've got a site with two versions – a U.S. version and a European version. Users are directed to the appropriate version through a landing page that asks where they're located; both sites are on the same domain, except one is .com/us and the other is .com/eu. My issue is that for some keywords, the European version is outranking the U.S. version in Google's U.S. SERPs. Not only that, but when Google displays sitelinks in the U.S. SERPs, it's a combination of pages on the European site and the U.S. site. Does anyone know how I can stop the European site from outranking the U.S. site in the U.S.? Or how I can get Google to only display sitelinks for pages on the U.S. site in the U.S. SERPs? Thanks in advance for any light you can shed on this topic!
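One commonly suggested pattern for this setup (a hedged sketch; the paths follow the question, the domain is hypothetical, and treating the EU version as generic "en" is an assumption, since hreflang has no Europe-wide region code):

<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/page">
<link rel="alternate" hreflang="en" href="https://www.example.com/eu/page">
<link rel="alternate" hreflang="x-default" href="https://www.example.com/">

Here en-us tells Google the /us page is meant for US users, and x-default marks the country-selector landing page as the fallback for anyone unmatched.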
International SEO | matt-14567
-
International Sites - Sitemaps, Robots & Geolocating in WMT
Hi Guys, I have a site that has now been launched in the US, having originally been UK-only. To accommodate this, the website has been set up using directories for each country. Example:
domain.com/en-gb
domain.com/en-us
As the site was originally set up for the UK, the sitemap, robots file & Webmaster Tools account were added to the main domain. Example:
domain.com/sitemap.xml
domain.com/robots.txt
The question is: does this now need changing to make it specific to each country? Example: the sitemap and robots.txt for the UK would move to:
domain.com/en-gb/sitemap.xml
domain.com/en-gb/robots.txt
and the US would have its own separate sitemap and robots.txt. Example:
domain.com/en-us/sitemap.xml
domain.com/en-us/robots.txt
Also, in order to geolocate this in WMT, would it need to be done for each directory version instead of the main domain? Currently the WMT account for the UK site is verified at www.domain.com; would this need reverifying at domain.com/en-gb? Any help would be appreciated! Thanks!
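One detail worth noting when weighing this up: crawlers only request robots.txt at the host root, so a file at domain.com/en-gb/robots.txt would never be fetched as a robots file. A single root robots.txt can, however, reference a separate sitemap per country directory. A minimal sketch, assuming the layout above:

# domain.com/robots.txt (the only location crawlers will check)
User-agent: *
Sitemap: https://domain.com/en-gb/sitemap.xml
Sitemap: https://domain.com/en-us/sitemap.xml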
International SEO | CarlWint
-
Include mobile and international versions of pages in the sitemap or not?
My pages already have alternate and hreflang references pointing to the international and mobile versions of the content. If I add 5 desktop language versions and 5 mobile language versions as https://support.google.com/webmasters/answer/2620865?hl=en explains, my sitemap will get bulky. What are the pros and cons of referencing all page versions in the sitemap versus including just the general (English/desktop) version?
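For context, the markup that Google article describes looks roughly like this (a trimmed sketch with hypothetical URLs); the bulk comes from repeating every alternate reference under every <url> entry:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/en/page</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/en/page"/>
    <xhtml:link rel="alternate" hreflang="fr" href="https://www.example.com/fr/page"/>
    <!-- one xhtml:link per language/device variant, repeated in every <url> block -->
  </url>
</urlset>

With 10 variants per page, each <url> entry carries 10 alternate lines, which is where the bulk concern comes from.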
International SEO | poiseo
-
Huge increase in US direct visits to a UK site, why?
Hi all, My UK website usually gets around 10,000 direct visits ("Direct" in Analytics) per month; however, for August this shot up to 24,000! The majority of these direct visits seem to be coming from the US, and as a result the bounce rate is through the roof: 84%! Why would my UK-based site suddenly be receiving huge amounts of US visits? Any ideas?
International SEO | MarkHincks
-
Non-US site pages indexed in US Google search
Hi, We are having a global, site-wide issue with non-US pages being indexed by Google and served up in US search results. Conversely, we have US English pages showing in Japan's Google search results. We currently use IP detection to direct users to the correct regional site, but it isn't effective if users enter through an incorrect regional page. At the top of each of our pages we have a drop-down menu that lets users manually select their preferred region. Is it possible that Googlebot is crawling these links and indexing the other regional pages as US pages without detecting the region, due to our URL structure? Below are examples of two of our URLs for reference, one from Canada, the other from the US:
/ca/en/prod4130078/2500058/catalog50008/
/us/en/prod4130078/2500058/catalog20038/
If that is, in fact, what is happening, would setting the links within the drop-down to 'nofollow' address the problem? Thank you. Angie
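For reference, the nofollow change being asked about would look like this on the drop-down links (a sketch reusing the URLs from the question):

<!-- Region selector links with rel="nofollow" (the change under consideration) -->
<a href="/ca/en/prod4130078/2500058/catalog50008/" rel="nofollow">Canada (EN)</a>
<a href="/us/en/prod4130078/2500058/catalog20038/" rel="nofollow">United States (EN)</a>

Note, though, that nofollow only discourages crawling through those links; hreflang annotations on the pages themselves are the more usual fix for serving the right regional page in each country's results.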
International SEO | Corel
-
Multi-language / target-market site
What is the best way to deal with multiple languages and multiple target markets? Is it better to use directories or sub-domains?
English.domain.com
Portuguese.domain.com
Or:
Domain.com
Domain.com/Portuguese
Also, should I use language meta tags (e.g. the sketch below) to help the different language versions rank in different geographic areas? Are there any examples of where this has been done well?
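For reference, the kind of tag being asked about looks like this (a hedged sketch; the Portuguese-for-Brazil value is assumed). Google has said it determines language from the visible content and largely ignores this tag, so the hreflang alternative is shown alongside it:

<!-- Language meta tag, placed in the <head> -->
<meta http-equiv="content-language" content="pt-BR">
<!-- The hreflang equivalent, which Google does use (hypothetical URL) -->
<link rel="alternate" hreflang="pt-br" href="https://domain.com/Portuguese/">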
International SEO | RodneyRiley