Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
US and UK Websites of Same Business with Same Content
-
Hello Community,
I need your help to understand whether I can use the US website's content on my UK website.
US Website's domain: https://www.fortresssecuritystore.com
UK Website's domain: https://www.fortresssecuritystore.co.uk
Both websites have the same content on all pages, including testimonials/reviews.
I am trying to gain business from AdWords and organic SEO marketing.
Thanks.
-
Yup, but it doesn't matter. Hreflang works for this situation whether cross-domain or on a subdirectory/subdomain basis (and in fact it is even more effective when cross-domain, as you're also getting the benefit of the geo-located ccTLD).
P.
-
Hi Paul,
If I understood correctly, we are talking about two different websites, not a website with subdomains.
Hreflang can be used for other languages and countries, although not for masking 100% duplicated content, as I stated above.
Site A: https://www.fortresssecuritystore.com
Site B: https://www.fortresssecuritystore.co.uk
The recommendations Google gives are aimed at getting pages crawled and indexed, not at succeeding with 100% duplicate content, which does not provide a good UX; that leads to a high bounce rate, and then the overall SEO falls down.
Mª Verónica
-
Unfortunately, your information is incorrect, Veronica.
Hreflang is specifically designed for exactly this situation. As Google engineer Maile Ohye clearly states, one of the primary uses of hreflang markup is:
- Your content has small regional variations with **similar content in a single language**. For example, you might have English-language content targeted to the US, GB, and Ireland.
(https://support.google.com/webmasters/answer/189077?hl=en)
There's no question differentiating similar content in the same language for different regions/countries is more of a challenge than for totally different languages, but it can absolutely be done, and in fact is a very common requirement for tens of thousands of companies.
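For illustration, here's a minimal hreflang sketch for the two homepages (the x-default assignment to the .com is just one possible choice, and every localised page pair needs its own equivalent set, not just the homepages):

```html
<!-- In the <head> of the US homepage (fortresssecuritystore.com) -->
<link rel="alternate" hreflang="en-us" href="https://www.fortresssecuritystore.com/" />
<link rel="alternate" hreflang="en-gb" href="https://www.fortresssecuritystore.co.uk/" />
<link rel="alternate" hreflang="x-default" href="https://www.fortresssecuritystore.com/" />

<!-- The UK homepage (fortresssecuritystore.co.uk) needs the same set of annotations,
     pointing back at both versions; hreflang is ignored if the return tags are not reciprocal -->
<link rel="alternate" hreflang="en-us" href="https://www.fortresssecuritystore.com/" />
<link rel="alternate" hreflang="en-gb" href="https://www.fortresssecuritystore.co.uk/" />
<link rel="alternate" hreflang="x-default" href="https://www.fortresssecuritystore.com/" />
```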
Paul
-
Hi CommercePundit,
Sadly, there is no painless way to say it.
You cannot gain business from AdWords and organic SEO marketing with 100% duplicated content. The options of canonical and hreflang would not work in this case.
The only option is language "localization", meaning the whole content should be rewritten by a local writer.
Canonical can be used for up to 10% duplication, not for the whole 100%. Hreflang can be used for other languages and countries, although not for masking 100% duplicated content.
Sorry to deliver the bad news. Good luck!
Mª Verónica
-
The more you can differentiate these two sites, the better they will each perform in their own specific markets, CP.
First requirement will be a careful, full implementation of hreflang tags for each site.
Next, you'll need to do what you can to regionalise the content - for example, changing to UK spelling for the UK site content, making sure prices are referenced in pounds instead of dollars, and changing up the language to use British idioms and locations as examples where possible. It'll also be critical to work towards having the reviews/testimonials on each site come from that site's own country, rather than being generic. This will help dramatically from a marketing standpoint and also help differentiate the sites for the search engines, so it's a double win.
And finally, you'll want to make certain you've set up each site in its own Google Search Console and used geographic targeting for the .com site to specify the US as its target. (You won't need to target the UK site, as the .co.uk is already geo-targeted, so you won't get that option in GSC.) If you have an actual physical address/phone in the UK, it would also help to set up a separate Google My Business profile for the UK branch.
Bottom line is - you'll need to put in significant work to differentiate the sites and provide as many signals as possible for which site is for which country in order to help the search engines understand which to return in search results.
Hope that all makes sense?
Paul
-
Hi!
Yep, you can target the UK market with the US site version. Always keep in mind that it's possible you might not perform as well as in the main market (US).
Also, before making any decision and/or implementation, take a look at these articles:
Multi-regional and multilingual sites - Google Search Console
International checklist - Moz Blog
Using the correct hreflang tag - Moz Blog
Guide to international website expansion - Moz Blog
Tool for checking hreflang annotations - Moz Blog
Hope it helps.
Best Luck.
GR.
Related Questions
-
Content in Accordion doesn't rank as well as Content in Text box?
Does content rank better in a full-view text layout rather than in a clickable accordion? I read somewhere that because users need to click into an accordion, it may not rank as well, as it may be considered hidden on the page - is this true? Accordion example (see features): https://www.workday.com/en-us/applications/student.html
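For reference, a minimal sketch of an accordion where the panel text is collapsed by default but still present in the delivered HTML (the labels here are made up, and the Workday page itself may well use a JavaScript accordion rather than the native element):

```html
<!-- Native HTML accordion: the panel is hidden until clicked, but its text
     is still in the page source, so crawlers can read it without any interaction -->
<details>
  <summary>Student records</summary>
  <p>The full feature description sits here even while the panel is closed.</p>
</details>
```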
Technical SEO | Aug 24, 2017, 2:52 PM | DigitalCRO
Problem with Yoast not seeing any of this website's text/content
Hi, My client has a new WordPress site http://www.londonavsolutions.co.uk/ and they have installed the Yoast Premium SEO plug-in. They are having issues with getting the lights to go green and the main problem is that on most pages Yoast does not see any words/content – although there are plenty of words on the pages. Other tools can see the words, however Yoast is struggling to find any and gives the following message:- Bad SEO score. The text contains 0 words. This is far below the recommended minimum of 300 words. Add more content that is relevant for the topic. Readability - You have far too little content. Please add some content to enable a good analysis. They have contacted the website developer who says that there is nothing wrong, but they are frustrated that they cannot use the Yoast tools themselves because of this issue, plus Yoast are offering no support with the issue. I hope that one of you guys has seen this problem before, or can spot a problem with the way the site has been built and can perhaps shed some light on the problem. I didn't build the site myself so won't be offended if you spot problems with it. Thanks in advance, Ben
Technical SEO | Sep 12, 2016, 12:41 PM | bendyman
Will Google crawl and rank our ReactJS website content?
We have 250+ products dynamically inserted and sorted on our site daily (more specifically our homepage... yes, it's a long page). Our dev team would like to explore rendering the page server-side using ReactJS. We currently use a CDN to cache all the content, which of course we would like to continue using. SO... will Google be able to crawl that content? We've read some articles with different ideas (including prerendering): http://andrewhfarmer.com/react-seo/
http://www.seoskeptic.com/json-ld-big-day-at-google/ If we were to only load the schema important to the page (like product title, image, price, description, etc.) from the server and then let the client render the remaining content (comments, suggested products, etc.), would that go against best practices? It seems like that might be seen as showing Googlebot one version and showing the site visitor a different (more complete) version.
Technical SEO | May 24, 2016, 8:33 AM | Jane.com
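For illustration of the server-rendered structured data discussed in the ReactJS question above, here's a minimal JSON-LD Product sketch (all values are hypothetical placeholders):

```html
<!-- Emitted by the server so crawlers see the key product data
     without having to execute any client-side JavaScript -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "image": "https://www.example.com/images/example-product.jpg",
  "description": "Short description rendered on the server.",
  "offers": {
    "@type": "Offer",
    "priceCurrency": "USD",
    "price": "19.99",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```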
Car Dealership website - Duplicate Page Content Issues
Hi, I am currently working on a large car dealership website. I have just had a Moz crawl through and it's flagging a lot of duplicate page content issues; these are mostly for used car pages. How can I get round this, as the site stocks many of the same car: model, colour, age, mileage, etc.? The only unique thing about them is the reg plate. How do I get past this duplicate issue if all the info is relatively the same? Has anyone experienced this issue when working on a car dealership website? Thank you.
Technical SEO | Jan 24, 2016, 8:10 PM | karl62
.com and .co.uk duplicate content
hi mozzers, I have a client that has just released a .com version of their .co.uk website. They have basically re-skinned the .co.uk version with some US amends, so all the content and title tags are the same. What do you recommend? A canonical tag to the .co.uk version? Rewriting titles?
Technical SEO | May 7, 2014, 5:38 AM | KarlBantleman
Squarespace Duplicate Content Issues
My site is built through Squarespace, and when I ran the campaign in SEOmoz it's come up with all these errors saying duplicate content and duplicate page title for the blog portion of my site. I've heard that canonical tags help with this, but with Squarespace it's hard to add code at the page level; only site-wide is possible. I was curious if someone experienced in Squarespace and SEO can give some suggestions on how to resolve this problem? Thanks
Technical SEO | Jul 16, 2017, 4:17 AM | cmjolley
Duplicate content and http and https
Within my Moz crawl report, I have a ton of duplicate content caused by identical pages on http and https URLs. For example: http://www.bigcompany.com/accomodations https://www.bigcompany.com/accomodations The strange thing is that 99% of these URLs are not sensitive in nature and do not require any security features. No credit card information, booking, or carts. The web developer cannot explain where these extra URLs came from or provide any further information. Advice or suggestions are welcome! How do I solve this issue? THANKS MOZZERS
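For illustration, one common supporting signal (usually alongside a sitewide 301 redirect to HTTPS) is a shared canonical tag on both protocol versions; the URL below is just the placeholder from the question:

```html
<!-- Served on both the http:// and https:// versions of the page,
     both declaring the HTTPS URL as the preferred version -->
<link rel="canonical" href="https://www.bigcompany.com/accomodations" />
```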
Technical SEO | May 28, 2018, 11:47 PM | hawkvt1
How to tell if PDF content is being indexed?
I've searched extensively for this, but could not find a definitive answer. We recently updated our website and it contains links to about 30 PDF data sheets. I want to determine if the text from these PDFs is being archived by search engines. When I do this search http://bit.ly/rRYJPe (Google - site:www.gamma-sci.com and filetype:pdf) I can see that the PDF URLs are getting indexed, but does that mean that their content is getting indexed? I have read in other posts/places that if you can copy text from a PDF and paste it, that means Google can index the content. When I try this with PDFs from our site I cannot copy text, but I was told that these PDFs were all created from Word docs, so they should be indexable, correct? Since WordPress has you upload PDFs like they are images, could this be causing the problem? Would it make sense to take the time and extract all of the PDF content to HTML? Thanks for any assistance, this has been driving me crazy.
Technical SEO | Dec 14, 2011, 8:06 PM | zazo