UK website to be duplicated onto 2 ccTLDs - is this duplicate content?
-
Hi
We have a client who wishes to have a site created and duplicated onto three servers hosted in three different countries: the United Kingdom, Australia and the USA. All of the sites will, of course, be in English.
Long story short, the website will give the user three options on the homepage, asking which "country site" they wish to view. (I know I could detect the user's IP and auto-redirect, but this is not what they want.) Once they choose an option, it will direct them to the appropriate ccTLD. Now, the client wants the same information to appear on all three sites, with some slight variations in the products available and English/US spelling differences, but for the most part the sites will look the same, with the same content on each page.
So my question is: will these three sites be seen as duplicates of each other, even though they are hosted in different countries and are on ccTLDs?
Are there any considerations I should pass on to the client with this approach?
Many thanks for reading.
Kris -
Hi Guys
Thanks for the quick response. Okay, having read through the links and other resources on hreflang="x" and how to implement it, I have a couple of follow-up questions, if you don't mind.
Question 1:
Using http://www.abercrombiekids.com as an example I found, the source of their homepage has many alternate links to other subdomains or ccTLDs. The idea here, I presume, is that it notifies the search engine where alternative versions of the same content can be found (even though some of that content is presumably in a different language, which I find strange...). So the idea is for Google to index those alternate pages but somehow assign them to the appropriate www.google.?? search engine?
Question 2:
On the homepage of my .co.uk site, I would have one alternate href set as "x-default" (for the .co.uk domain itself) and then two alternate hrefs pointing to the homepages of the USA site and the Australian site. This would then change depending on which site you are looking at, i.e. the USA site would set itself as the "x-default" and link to the other two domains as alternates - is that correct? (I've sketched what I mean below.)
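To make Question 2 concrete, this is roughly what I'm picturing in the <head> of the .co.uk homepage - the domains below are just placeholders, and the en-gb/en-au/en-us values are my best guess at the right language/country codes:

<!-- hypothetical example only - placeholder domains, not our real URLs -->
<link rel="alternate" hreflang="x-default" href="https://www.example.co.uk/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/" />
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/" />

The USA and Australian homepages would then each carry the equivalent set of links, and that's the part I'm unsure about - whether the "x-default" swaps around per site or stays pointing at one version.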
Question 3:
Finally, do I need to apply this markup to EVERY page across all three sites, indicating which is the default and which are the alternates?
Kind regards, and my appreciation for your feedback.
Kris -
As Karl is doing, this is exactly what hreflang was made for - something of a saving grace for many sites with this same problem.
-Andy
-
We are going through the same thing at the moment and have opted to use the rel="alternate" hreflang="x" tag. There is more info on it on this Google page for you: https://support.google.com/webmasters/answer/189077?hl=en
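In practice it's just a link element in the <head> of each page for every country/language version, including the page itself, and each version needs to point back at the others. A rough example for a UK/US pair (made-up domains):

<!-- illustrative only - swap in the real domains and codes for your sites -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/" />

If I remember rightly, that Google article also covers doing the same thing via your XML sitemap or HTTP headers, if you'd rather not touch the page templates.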
Related Questions
-
Content Page URL Question
Our main website is geared toward the city where we are located and includes the city name in content page URLs. We also have separate websites for three surrounding cities; these websites have duplicate content except the city name:
MainWebsite.com
City2-MainWebsite.com
City3-MainWebsite.com
City4-MainWebsite.com
We're restructuring to eliminate the location websites and only use the main website. The new site will have city pages. We have well established Google business locations for all four cities. We will keep all locations, replacing the location websites with the main website. Should we remove City-IL from all content page URLs in the new site? We don't want to lose traffic/ranking for City2 or City3 because the content pages have City1 in the URL. Page URLs are currently formatted as follows:
www.MainWebsite.com/Service-1-City1-IL.html
www.MainWebsite.com/Service-2-City1-IL.html
www.MainWebsite.com/Service-3-City1-IL.html
www.MainWebsite.com/Service-4-City1-IL.html
Thanks!
Local Website Optimization | sharon75025
-
Checking subdomains/site structure of a website for International SEO
Dear Moz community, I am looking into two websites for a friend and we want to understand the following:
What is the site structure as per the subdomains? e.g. currently it is .com/en/ or .com/ru/ or .com/zh/
Using the crawl report, each page has an en or other language version. I take it this means that we have to create copy, meta titles and descriptions for each of the languages, even if the page is the same but in a different language?
To avoid duplication of content, would you suggest canonical tags be put in place?
To check hreflang markup, I couldn't find anything in the code, which makes me think a script is automatically translating this?
This is the first time I have started to look at international SEO and want to understand what to look for in an audit of existing sites. Thank you,
Local Website Optimization | TAT1000
-
Is it deceptive to attempt to rank for a city you're located just outside of?
I live in Greenville, SC (which has a large "Greater Greenville" reach). I work for an agency with many clients who are located just outside of the city in smaller towns, sometimes technically in counties other than Greenville. Often, they provide services in the city of Greenville and aim to grow business there, so we'll use "Greenville, SC" throughout site copy, in titles, and in meta descriptions. Are there any negative implications to this? Any chance search engines think these clients are being deceptive? And is it possible these clients are hurting their ranking in their actual location by trying to appear to be a Greenville-based company? Thank you for any thoughts!
Local Website Optimization | engeniusbrent1
-
Client wants to rebrand but insists on keeping their old website live as well...
I am working with a client in the dental space who has an existing (11-year-old) website for his practice. His domain is tied to his last name, which he would like to get away from because he plans to sell the practice in the next couple of years.
Backstory: Prior to taking him on, he was working with an SEO agency out of India that built him quite an ugly backlink profile. Once we discovered it, we immediately notified him about the risk of a penalty if it was left alone. He was riding high in Google SERPs, so of course it was of no concern to him. Needless to say, about a year ago he was inducted into Google's "manual penalty club" for suspicious links. His site vanished in Google and all! Hooray! But no, not really... We met with him to discuss the options, suggesting we clean up his backlink profile and then submit for reconsideration. Based on the time we told him it could take to make progress and be back up and running, he wasn't very excited about that approach. He said he wanted us to rebuild a new site, with a new domain, and start fresh. In addition, he wanted to keep his original site live, since it is tied to his already thriving practice. To sum it all up, his goal is to keep what he has live, since his customers are accustomed to using his existing (penalized) website, while building a new brand/website that he can use to build a cleaner backlink profile and rank in Google, as well as to sell off down the line without having his name tied to the practice.
Question: Being that he has an existing site with the company NAP info throughout, and the new site will also have the same NAP (just a different domain/brand), is there a "best way" to approach this? The content on the new site would be completely unique. I understand this approach is iffy, but in his situation it makes sense to some extent. Any feedback or ideas on how to best handle having two sites running for the same dental practice? If any part of my question is confusing or you need further details to help make a suggestion, please fire away and I will be happy to give as much detail as possible. Thanks Mozzers!
Local Website Optimization | Bryan_Loconto1
-
Subdomain for ticketing of a client website (how to solve SEO problems caused by the subdomain/domain relationship)
We have a client in need of a ticketing solution for their domain (let's call it www.domain.com), which is on Wordpress - as is our custom ticket solution. However, we want to have full control of the ticketing, since we manage it for them, so we do not want to build it inside their original Wordpress install. Our proposed solution is to build it on tickets.domain.com. This will exist only for selling and issuing the tickets. The question is, is there a way to do this without damaging their bounce rate and SEO scores? Since customers will come to www.domain.com, then click the ticketing tab and land on tickets.domain.com, Google will see this as a bounce. In reality, customers will not notice the difference, as we will clone the look and feel of domain.com. Should we perhaps have the canonical URL of tickets.domain.com point to www.domain.com? And also, can we install Webmaster Tools for tickets.domain.com and set the preferred domain as www.domain.com? Are these possible solutions to the problem, or not - and if not, does anyone else have a viable solution? Thank you so much for the help.
Local Website Optimization | Adam_RushHour_Marketing
-
Ecommerce Site with Unique Location Pages - Issue with unique content and thin content?
Hello All, I have an ecommerce site specializing in hire, and we have individual location pages in each of our categories for each of our depots. All these pages show the NAP of the specific branch. Given the size of our website (approx. 10K pages), it's physically impossible for us to write unique content for each location against each category. So what we are doing is writing unique content for, say, our top 10 locations in a category, and the remaining 20-odd locations against the same category have the same content, but it brings in the location name and the individual NAP of that branch - so in effect I think this is thin content. My question is, I am quite sure we are getting some form of algorithmic penalty with regards to the thin/duplicate content. Using the example above, should we 301 redirect the 20-odd locations with the thin content, or should we only 301 redirect 10 of them, so we end up with more of a 50/50 split on a category with regards to unique content versus thin content? Alternatively, should we 301 all the thin content pages so we only have 10 locations against the category and therefore 100% unique content? I am trying to work out which would help most with regards to local rankings for my location pages. Also, does anyone know if a thin/duplicate content penalty is site-wide or can it just affect specific parts of a website? Any advice greatly appreciated. Thanks, Pete
Local Website Optimization | PeteC120
-
Duplicate content on a proxy site?
I have a local client with a 500-page site. They advertise online and use traditional media like direct mail. A print media company, Valpak, has started a website and wants the client to use their trackable phone number and a proxy website. When I type the proxy domain into the browser, it appears to be the client's home page at this proxy URL. The vendor wishes to track activity on its site to prove their value or something. My question is: is there any "authority" risk to my client's website by allowing this proxy site??
Local Website Optimization | TFinder
-
How can I rank my .co.uk using content on my .com?
Hi, we currently have a .com site ranking second for our brand term in the .co.uk SERP. This is mainly because we don't own the exact-match brand term, which comes from not having a clue what we were doing when we set up the company. Would it be possible to outrank this term, considering the weighting that Google puts towards exact matches in the URL? N.B. - There are a few updates we could do to the homepage to make the on-page optimisation better, and we have not actively done any link building yet, which will obviously help.
Competitor: SERP rank 1 - Moz PA 38, DA 26
Our site: SERP rank 2 - Moz PA 43, DA 32
Thanks, Ben
Local Website Optimization | benjmoz0