Will hreflang eliminate duplicate content issues for a corporate marketing site on 2 different domains?
-
Basically, I have 2 company websites running.
The first resides on a .com and the second resides on a .co.uk domain.
The content is simply localized for the UK audience, not necessarily 100% original for the UK.
The main website is the .com, but we have expanded into the UK, IE and AU markets.
The .co.uk domain, however, is targeting the UK, IE and AU.
I am using the hreflang tag for the pages. Will this prevent duplicate content issues? Or should I use 100% new content for the .co.uk website?
-
Thanks guys, very helpful!
-
Hi Jeffrey,
Did Logan and/or Nikhilesh answer your question? If so, would you mind marking one or both responses as a "Good Answer"?
Otherwise, where are you still getting stuck?
-
Will this prevent duplicate content issues?
The short answer is: maybe. Sometimes, when the content is duplicated, Google chooses to ignore hreflang and "fold" multiple URLs into a single URL. See John Mueller's answer on this thread: https://productforums.google.com/forum/#!msg/webmasters/ezMvrlRWuDk/6XWuM1fIDgA
So make sure that the content is genuinely localized to each country. In your case, use "en" for your .co.uk site so that it applies to English speakers worldwide and covers IE and AU as you intended. For the .com site, use "en-US" and make sure the content there is Americanized.
Examples of localizing your content include spelling (localising vs. localizing), currency for prices (pounds vs. US dollars), and addresses/contact info in the footer.
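As a rough sketch, the annotations could look like this in the head of each equivalent page (the example.com / example.co.uk domains and the /services/ path are placeholders, not your real URLs):

<!-- On the US page, e.g. https://www.example.com/services/ -->
<link rel="alternate" hreflang="en-US" href="https://www.example.com/services/" />
<link rel="alternate" hreflang="en" href="https://www.example.co.uk/services/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/services/" />
<!-- The .co.uk page (https://www.example.co.uk/services/) carries the same three tags. -->

Note that hreflang annotations must be reciprocal: every page in the set lists all of the alternates, including itself, or Google may ignore the markup. The x-default line is optional, but it tells Google which version to show searchers outside your targeted regions.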
-
Hi Jeffrey,
Hreflang tags are intended to help Google match identical (or near-identical) content to the right country version of its results. Done correctly, that keeps your .com content in the US version of Google and your .co.uk content in the UK version. So yes, the intended outcome of the hreflang tag is the proper handling of duplicates served to different geographies.
If you haven't already, I highly recommend checking out this guide: https://moz.com/learn/seo/hreflang-tag
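For completeness, the same relationships can also be declared in an XML sitemap rather than in page-level link tags (the guide above covers both approaches). A minimal sketch using the same placeholder URLs as the example earlier in this thread:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <!-- Each url entry lists itself plus every alternate version -->
  <url>
    <loc>https://www.example.com/services/</loc>
    <xhtml:link rel="alternate" hreflang="en-US" href="https://www.example.com/services/" />
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.co.uk/services/" />
  </url>
  <url>
    <loc>https://www.example.co.uk/services/</loc>
    <xhtml:link rel="alternate" hreflang="en-US" href="https://www.example.com/services/" />
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.co.uk/services/" />
  </url>
</urlset>

Whichever method you choose, stick to one consistently so the annotations are easier to audit in Search Console.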
Related Questions
-
Should Multi-Location Businesses "Local Content Silo" Their Services Pages?
I manage a site for a medical practice that has two locations. We already have a location page for each office location and we have the NAP for both locations in the footer of every page. I'm considering making a change to the structure of the site to help it rank better for individual services at each of the two locations, which I think will help pages rank in their specific locales by having the city name in the URL. However, I'm concerned about diluting the domain authority that gets passed to the pages by moving them deeper in the site's structure. For instance, the services URLs are currently structured like this:
www.domain.com/services/teeth-whitening (where the service is offered in each of the two locations)
Would it make sense to move to a structure more like:
www.domain.com/city1name/teeth-whitening
www.domain.com/city2name/teeth-whitening
Does anyone have insight from dealing with multi-location brands on the best way to go about this?
-
I have a WordPress site that ranks well and a blog (on Blogger) with a slightly different URL/domain that also ranks decently. Should I combine the two under the website domain or keep both?
I realize that I am essentially building two different sites even though they are connected, but on some local town pages I have 2-3 results on page #1. Nice problem to have, eh? But I am worried because for a lot of my surrounding towns my competitor has the top listing or is definitely ahead of me, so I am wondering: if I combine or convert my blog into the same domain as my site, all of that content + links should hopefully propel my site to #1. Has anyone had an experience like this? Thanks, Chris
-
Site Audit: Indexed Pages Issue
Over the last couple of months I've been working through some issues with a client. One of my starting points was a site audit, following a post written by Geoff Kenyon: https://moz.com/blog/technical-site-audit-for-2015. One of the main issues seems to be that when I run a "site:domain.com" query in Google, my homepage isn't the first page listed; in fact, it isn't listed in this search at all when I go through the results. I understand that it isn't required to have your homepage listed first when running this type of query, but I would prefer it. Here are some things I've done and observed. I ran another query, "info:homepage.com", and the home page is indexed by Google. When I run a branded search for the company name, the home page does come up first. The page currently showing up first in the "site:domain.com" listing is my blog index page. Several months back I redirected the index.php page to the root of the domain; not sure if this is helping or hurting. In the sitemap I removed index.php and left only the root domain as the page to index. All interior links now point to the root; index.php has been eliminated from all internal links. The main site navigation does not include a "Home" link; instead, my logo is the link to the home page. Should I noindex my blog/index.php page? This page is only a compilation of posts and has no original content; in fact, it throws up duplicate content warnings. Any help would be much appreciated. I apologize if this is a silly question, but I'm getting frustrated/annoyed at the whole situation.
-
Moving from an HTML site to a WordPress site - 301s
Hello, I recently took control of my website from a web designer. I have been reading as much as I can about SEO etc. to make long-term improvements to my site. The site was a basic four-page website for a local cleaning company, consisting of a homepage, services page, testimonial page and contact page. The site performed reasonably given its lack of detail or SEO, but probably only because the level of competition isn't great. I am in the process of rebuilding the site in WordPress and, with SEO in mind, I intend to have more than one page for services. I have 301'd my services.html page to my number 1 keyword term to gain any little link juice that is available. Now to my questions... Should I be doing this with the other pages? Is it worth 301'ing my contact us page? Is there anything to be gained by doing so? Likewise, should I 301 the index.html to the new homepage? I've been reading about this and the issues relating to loops etc. but cannot find a definite answer on whether it's needed. Last scenario: let's say my testimonials.html page has some link juice; would it be beneficial to 301 that to one of my new service pages to give it a kick start, as opposed to making a less important page like another testimonials page more powerful? Hope this makes sense, I am a beginner just thinking out loud. Thanks
-
Ecommerce Site with Unique Location Pages - Issue with unique content and thin content?
Hello all, I have an ecommerce site specializing in hire, and we have individual location pages in each of our categories for each of our depots. All of these pages show the NAP of the specific branch. Given the size of our website (approx. 10K pages), it's physically impossible for us to write unique content for each location against each category. So what we are doing is writing unique content for, say, our top 10 locations in a category, while the remaining 20-odd locations in the same category have the same content, except that it brings in the location name and the individual NAP of that branch; in effect, I think this is thin content. My question is, I am quite sure we are getting some form of algorithmic penalty with regards to the thin/duplicate content. Using the example above, should we 301 redirect the 20-odd locations with the thin content, or should we only 301 redirect 10 of them, so we end up with more of a 50/50 split in a category between unique content and thin content? Alternatively, we could 301 all the thin content pages so we only have 10 locations in the category and therefore 100% unique content. I am trying to work out which would help most with regards to local rankings for my location pages. Also, does anyone know if a thin/duplicate content penalty is site-wide or can it just affect specific parts of a website? Any advice greatly appreciated, thanks Pete
-
What's in a domain name (TLD)?
We are setting up a new site for a Business Improvement District (BID) for our local town. Initially we would name the new site TownNameBID.co.uk (or .com). However, with the new domain TLDs out, we are thinking of getting TownName.bid using the new .bid TLD. .bid is meant to be reserved for sites such as auction sites; however, this will actually be more of a community support site. I would have thought that technically it should not really make much difference, particularly once all the appropriate Local Business information is placed on the site. But what is the possibility that search engines may perceive this as an auction site as opposed to a community site? And, technical issues aside, are there any anecdotal issues where the wrong TLD may put people off? Thoughts?
-
UK website to be duplicated onto 2 ccTLDs - is this duplicate content?
Hi, we have a client who wishes to have a site created and duplicated onto 3 servers hosted in three different countries: the United Kingdom, Australia and the USA. All of them will of course be in the English language. Long story short, the website will give the user 3 options on the homepage asking them which "country site" they wish to view. (I know I can detect the user's IP and auto-redirect, but this is not what they want.) Once they choose an option, it will direct the user to the appropriate ccTLD. Now, the client wants the same information to appear on all 3 sites, with some slight variations in products available and English/US spelling differences, but for the most part the sites will look the same with the same content on each page. So my question is, will these 3 sites be seen as duplicates of each other even though they are hosted in different countries and are on ccTLDs? Are there any considerations I should pass on to the client with this approach? Many thanks for reading.
Kris
-
Sites Verification Issues
We have a group of automotive dealership sites built by a website provider, which causes issues when we try to verify our sites. Because they use Analytics for their data program, they install their own code into our websites, stopping us from doing so properly in our back end. We also cannot verify ourselves in Webmaster Tools or AdWords. We can't actually "own" any of our sites since they run a jQuery script from within the website. They also do not allow the use of iframes or scripts, so we can't even use the container to verify these sites. Any help or insight would be greatly appreciated, as I am sure there is some way to work around this to get our data and be verified.