US and UK Websites of Same Business with Same Content
-
Hello Community,
I need your help to understand whether I can use the US website's content on my UK website or not.
US Website's domain: https://www.fortresssecuritystore.com
UK Website's domain: https://www.fortresssecuritystore.co.uk
Both websites have the same content on all pages, including testimonials/reviews.
I am trying to gain business from AdWords and organic SEO marketing.
Thanks.
-
Yup, but it doesn't matter. Hreflang works for this situation whether cross-domain or on a subdirectory/subdomain basis (and in fact it is even more effective when cross-domain, as you're also getting the benefit of the geo-located ccTLD).
P.
-
Hi Paul,
If I understood correctly, we are talking about two different websites, not a website with subdomains.
Hreflang can be used for other languages and countries, although not for masking 100% duplicated content, as I stated above.
site A: https://www.fortresssecuritystore.com
site B: https://www.fortresssecuritystore.co.uk
The recommendations that Google gives are for the purpose of having the pages crawled and indexed, not for succeeding with 100% duplicate content, which does not provide a good UX, leads to a high bounce rate, and in turn hurts overall SEO.
Mª Verónica
-
Unfortunately, your information is incorrect, Veronica.
Hreflang is specifically designed for exactly this situation. As Google engineer Maile Ohye clearly states, one of the primary uses of hreflang markup is:
- Your content has small regional variations with **similar content in a single language**. For example, you might have English-language content targeted to the US, GB, and Ireland.
(https://support.google.com/webmasters/answer/189077?hl=en)
There's no question differentiating similar content in the same language for different regions/countries is more of a challenge than for totally different languages, but it can absolutely be done, and in fact is a very common requirement for tens of thousands of companies.
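For illustration, a minimal cross-domain setup for the two domains in the question might look like the following. This is a hedged sketch: the homepage URLs come from the question, every page on each site would need its own reciprocal pair (including a self-reference), and pointing `x-default` at the .com is an assumption, not something from the thread.

```html
<!-- In the <head> of https://www.fortresssecuritystore.com/ (US page) -->
<link rel="alternate" hreflang="en-us" href="https://www.fortresssecuritystore.com/" />
<link rel="alternate" hreflang="en-gb" href="https://www.fortresssecuritystore.co.uk/" />
<link rel="alternate" hreflang="x-default" href="https://www.fortresssecuritystore.com/" />

<!-- In the <head> of https://www.fortresssecuritystore.co.uk/ (UK page) -->
<link rel="alternate" hreflang="en-us" href="https://www.fortresssecuritystore.com/" />
<link rel="alternate" hreflang="en-gb" href="https://www.fortresssecuritystore.co.uk/" />
<link rel="alternate" hreflang="x-default" href="https://www.fortresssecuritystore.com/" />
```

Note that both pages carry the identical set of annotations — each page must reference all of its alternates plus itself, or the markup is ignored.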
Paul
-
Hi CommercePundit,
Sadly, there is not a painless way to say this.
You cannot gain business from AdWords and organic SEO marketing with 100% duplicated content. The options, canonical and hreflang, would not work in this case.
The only option is language "localization", meaning having the whole content rewritten by a local writer.
Canonical can be used for up to 10% of the content, not for the whole 100%. Hreflang can be used for other languages and countries, although not for masking 100% duplicated content.
Sorry to bring the bad news. Good luck!
Mª Verónica
-
The more you can differentiate these two sites, the better they will each perform in their own specific markets, CP.
First requirement will be a careful, full implementation of hreflang tags for each site.
Next, you'll need to do what you can to regionalise the content - for example, changing to UK spelling for the UK site's content, making sure prices are referenced in pounds instead of dollars, and changing up the language to use British idioms and locations as examples where possible. It'll also be critical to work towards having the reviews/testimonials on each site come from that site's own country, rather than being generic. This will help dramatically from a marketing standpoint and also help differentiate the sites for the search engines, so it's a double win.
And finally, you'll want to make certain you've set each site up in its own Google Search Console property and used geographic targeting for the .com site to specify its target as the US. (You won't need to target the UK site, as the .co.uk is already geo-targeted, so you won't get that option in GSC.) If you have an actual physical address/phone in the UK, it would also help to set up a separate Google My Business profile for the UK branch.
Bottom line is - you'll need to put in significant work to differentiate the sites and provide as many signals as possible for which site is for which country in order to help the search engines understand which to return in search results.
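As a quick sanity check once hreflang is in place, a short script can verify the return links: Google ignores hreflang annotations that aren't reciprocal, and a missing return tag is the most common implementation mistake. This is a hedged sketch rather than a production tool - it uses only the Python standard library, the parsing is deliberately minimal, and fetching the pages is left to you:

```python
from html.parser import HTMLParser

class HreflangCollector(HTMLParser):
    """Collects hreflang -> href mappings from <link rel="alternate"> tags."""
    def __init__(self):
        super().__init__()
        self.alternates = {}

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "alternate" and "hreflang" in a:
            self.alternates[a["hreflang"]] = a.get("href")

def extract_hreflang(html_text):
    """Return {hreflang: href} for one page's HTML."""
    parser = HreflangCollector()
    parser.feed(html_text)
    return parser.alternates

def find_missing_return_links(pages):
    """pages maps URL -> raw HTML. Returns (source, target) pairs where
    source annotates target but target has no hreflang link back to source."""
    annotations = {url: extract_hreflang(html) for url, html in pages.items()}
    problems = []
    for url, alts in annotations.items():
        for href in alts.values():
            target = annotations.get(href)
            if target is None:
                continue  # target page wasn't fetched; can't verify it
            if url not in target.values():
                problems.append((url, href))
    return problems
```

Run it over the fetched HTML of matching US/UK page pairs; an empty result means every annotated alternate links back as required.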
Hope that all makes sense?
Paul
-
Hi!
Yep, you can target the UK market with the US site version. Always keep in mind that it's possible you might not perform as well as in the main market (US).
Also, before making any decision and/or implementing, take a look at these articles:
Multi-regional and multilingual sites - Google Search Console
International checklist - Moz Blog
Using the correct hreflang tag - Moz Blog
Guide to international website expansion - Moz Blog
Tool for checking hreflang annotations - Moz Blog
Hope it helps.
Best of luck.
GR.