UK website to be duplicated onto 2 ccTLDs - is this duplicate content?
-
Hi
We have a client who wishes to have a site created and duplicated onto 3 servers hosted in three different countries: the United Kingdom, Australia and the USA. All of the sites will of course be in English.
Long story short, the website will give the user 3 options on the homepage asking which "country site" they wish to view. (I know I can detect the user's IP and auto-redirect, but this is not what they want.) Once they choose an option, it will direct them to the appropriate ccTLD. Now, the client wants the same information to appear on all 3 sites with some slight variations in the products available and English/US spelling differences, but for the most part the sites will look the same, with the same content on each page.
So my question is: will these 3 sites be seen as duplicates of each other even though they are hosted in different countries and are on ccTLDs?
Are there any considerations I should pass on to the client with this approach?
Many thanks for reading.
Kris -
Hi Guys
Thanks for the quick response. Okay, having read through the links and other resources on hreflang="x" and how to implement it, I have a couple of follow-up questions if you don't mind.
Question 1:
Using http://www.abercrombiekids.com as an example which I found, the source of their homepage has many alternate links to other subdomains and ccTLDs. The idea here, I presume, is that it notifies the search engine where alternative versions of the same content can be found (even though some of the content is presumably in a different language, which I find strange...). So the idea is for Google to index those alternate webpages and somehow assign them to the appropriate www.google.?? search engine?
Question 2:
On the homepage of my .co.uk site, I would have 1 alternate href set as "x-default" (for the .co.uk domain) and then 2 alternate hrefs pointing to the homepages of the USA site and the Australian site. And this would change depending on which site I am looking at, i.e. the USA site would set itself as the "x-default" and link to the other 2 domains as alternates - is that correct?
Question 3:
Finally, do I need to apply this methodology to EVERY page on all 3 sites, indicating which is the default and which are the alternates?
Kind regards and appreciation for your feedback.
Kris -
As Karl is doing, this is exactly what hreflang was made for. It has been something of a saving grace for many sites with this same problem.
-Andy
-
We are going through the same thing at the moment and have opted to use the rel="alternate" hreflang="x" tag. There is more info on it on this Google page for you: https://support.google.com/webmasters/answer/189077?hl=en
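To make that concrete, here is a minimal sketch of the annotations for the setup described above (the domains below are placeholders, and en-GB / en-US / en-AU targeting is an assumption). The same complete set of tags, including the self-referencing one, goes in the &lt;head&gt; of the equivalent page on all three sites:

```html
<!-- Placeholder domains; swap in the real .co.uk, .com and .com.au sites. -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
<link rel="alternate" hreflang="en-us" href="http://www.example.com/" />
<link rel="alternate" hreflang="en-au" href="http://www.example.com.au/" />
<!-- Optional: x-default marks the version for users who match none of the
     locales above. Per Google's doc, it suits a country-selector page,
     so it would not normally rotate from site to site. -->
<link rel="alternate" hreflang="x-default" href="http://www.example.co.uk/" />
```

On question 3: yes, the annotations go on every page that has equivalents on the other sites, with each page pointing to its direct counterparts (homepage to homepages, product page to product pages). The Google doc linked above also allows delivering the same information via an XML sitemap instead of link elements, which can be easier to maintain across three sites.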
Related Questions
-
Using the Onpage Grader for Local Business websites
Hey guys, I'm curious how people use the On-Page Grader for optimizing pages for local businesses. Specifically, I'm interested in whether people use keywords with or without a geo modifier, since adding a geo modifier will prevent more natural writing to increase the score. If you don't use a geo modifier, do you have some general rules for where the city needs to appear, e.g. in the H1 and first paragraph? Any tips for using the page grader for local businesses would be great. Thanks!
Local Website Optimization | | solidlocal0 -
Does having 2 separate domains with similar content always = duplicate content?
I work for a global company which is in the process of launching its US and European websites (it just re-launched the Australian site, migrated from an old domain), all on separate domains for the purpose of localising. However, the US website content will essentially be the same as the Australian one with minor changes (z instead of s, slightly different service offerings, etc.), but the core information will be the same as the AU site. Will this be seen as duplicate content, and is there a way we can structure this so that the content won't be seen as duplicate but is still a separate localised website? Thank you.
Local Website Optimization | | PGAUE0 -
Content spinning or duplicate content — a potential penalty or a safe technique?
Currently I'm working on the local UK business website www.londonlocksmith.london, and I have to say a few practices of the competition have me confused. For example, websites like these:
http://lambeth-trusted-local-locksmith.co.uk/
http://clapham-trusted-local-locksmith.co.uk/
http://streathamhill-trusted-local-locksmith.co.uk/
http://hernehillse24-trustedlocallocksmith.co.uk/
All of them rank decently for the main regional keyword (e.g. "Lambeth locksmith") and have an ok-ish DA. But as you scroll through these websites you see that the content is the same for all of them except for the location name, and they all link to each other (see the footer). Now my question is: can this be a good technique for higher local rankings, creating dedicated websites (not just landing pages) with the target keyword in the domain name? And also: what is your experience with such ways of keyword targeting? What do you think in general about content spinning for local services with high competition? What are your suggestions?
Local Website Optimization | | PayPro0 -
Why has my site dropped to page 2?
I haven't been paying attention to my site's rankings for the past year, and only just realized I've dropped to page 2 on a keyword search. Specifically, on Google.ca, searching the keywords "wedding invitations", my site, www.stephita.com, used to consistently rank in the top 3 links, while my competitors have leapfrogged me. 😞 I realized that my site wasn't "mobile-friendly" and had a few other issues like keyword stuffing and long meta descriptions and titles. I've fixed these issues now, but wanted to know: does this mean my site was severely penalized by the Panda/Penguin updates over the last few years? Does having a PR3 site mean anything? My competitors who outrank me in the SERP are all PR1 sites. Greatly appreciate any feedback you can give me! 🙂
Local Website Optimization | | TysonWong0 -
How to approach SEO for a national website that has multiple chapter/location websites all under different URLs
We are currently working with a client who has one national site - let's call it CompanyName.net - and multiple, independent chapter sites listed under different URLs, structured, for example, as CompanyNamechicago.org, and sometimes specific to neighborhoods, as in CompanyNamechicago.org/lakeview.org. The national umbrella site is a .net, while all the others are .orgs. These are not subdomains or subfolders, as far as we can tell. You can use a search function on the .net site to find a location near you and click through to that specific local website. They are looking for help optimizing and increasing traffic to certain landing pages on the .net site... but similar landing pages also exist at the local level, and these appear to be competing with the national site. (Example: there is a landing page on the national .net umbrella site for a "dog safety" campaign they are running, but that campaign has also led to a landing page created independently on the local CompanyNameChicago.org website, which seems to rank higher for a user searching for this info while located in Chicago.) We are wondering if our hands are tied here, since they appear to be competing for traffic with all their localized sites, or if there are best practices for handling a situation like this. Thanks!
Local Website Optimization | | timfrick0 -
2 clients. 2 websites. Same City. Both bankruptcy attorneys. How to make sure Google doesn't penalize...
Hi Moz'ers! I am creating 2 new websites for 2 different bankruptcy attorneys in the same city. I plan to use different templates, but from the same template provider. I plan to host with the same hosting company (unless someone here advises me not to). The content will be custom, but similar, as they both practice bankruptcy law. They have different addresses, as they are different law firms. My concern is that Google will penalize them for duplicate content because they both practice the same area of law in the same city, with the same hosting and the same template maker, and that neither will rank. What should I do to make sure that doesn't happen? Will it be enough that they have different business names, addresses, and phone numbers? Thanks for any help!!
Local Website Optimization | | BBuck0 -
Duplicate Theme, different Content, competing keywords?
Hello, We have 2 versions of a website. Notes: they will use the same theme and slightly different images, but the written content is different. SEO optimization is the same for both sites, targeting the same city, and they will be competing for certain keywords, mainly vanity keywords. So we have websites, for example:
http://mycompanytor.com
http://mycompanytoronto.com
How does Google handle 2 websites like this? Will one get penalized, or will they be treated as 2 different sites, even though the company name, which is the brand, shows up in the main URL? Thanks for your help
Local Website Optimization | | EVERWORLD.ENTERTAIMENT0 -
Does Google play fair? Is 'relevant content' and 'usability' enough?
It seems there are 2 opposing views, and as a newbie I find this very confusing. One view is that as long as your site's pages have relevant content and are easy for the user, Google will rank you fairly. The other view is that Google has 'rules' you must follow, and even if the site is relevant and user-friendly, if you don't play by the rules your site may never rank well. Which is closer to the truth? No one wants a great website that won't rank because Google wasn't sophisticated enough to see that they weren't being unfair. Here's an example to illustrate one related concern I have: I've read that Google doesn't like duplicated content. But here are 2 cases in which it is more 'relevant' and 'usable' for the user to have duplicate content. Say a website helps you find restaurants in a city. Restaurants may be listed by city region and by type of restaurant. The home page may have links to 30 city regions. It may also have links for 20 types of restaurants. The user has a choice. Say the user chooses a region. The resulting page may still be relevant and usable by listing ALL 30 regions, because the user may want to choose a different region. Alternatively, say the user chooses a restaurant type for the whole city. The resulting page may still be relevant and usable by giving the user the ability to choose another type OR another city region. In other words, there may be a 'mega-menu' at the top of the page which is duplicated on every page of the site but is very helpful: instead of requiring the user to go back to the home page to click a new region or a new type, the user can do it on any page. That's duplicate content in the form of a mega-menu, but it is very relevant and usable. YET, my sense is that Google MAY penalize the site even though it is arguably the most relevant and usable approach for someone who may or may not have a specific region or restaurant type in mind. Thoughts?
Local Website Optimization | | couponguy0