Rel="alternate" hreflang="x" or Unique Content?
-
Hi All,
I have 3 sites: brand.com, brand.co.uk and brand.ca.
They all have the same content with very minor changes. What's best practice: to use rel="alternate" hreflang="x" or to have unique content written for each of them?
Just wondering, after Panda, Penguin and the rest of the Zoo, what the best way is to run multinational sites and achieve top positions for each of them in its own country.
If you think it would be better to have unique content for each of them, please let us know your reasons.
Thanks!
-
Hello there,
In an ideal world I would recommend (wherever possible) that completely different content is created for UK / US / Canadian markets.
I recommend this mainly because there are a lot of differences in consumer behaviour. Although we all speak English, the English we speak, the way we search, the messaging we respond to and so on are all different.
Obviously the option to create separate content isn't open to everyone (budgets, resources, etc.). As such, if you can't stretch to creating separate content for each market, I'd probably go with the rel="alternate" hreflang implementation.
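If you do go the hreflang route, every regional variant of a page needs to carry a set of reciprocal annotations listing all the variants, including itself. As a rough sketch, assuming the three homepages from the question (the region codes and the x-default choice are illustrative assumptions, not from the thread):

```python
# Hypothetical sketch: building the reciprocal hreflang <link> tags
# each page variant would carry in its <head>. Region codes and the
# x-default target are assumptions for illustration.
ALTERNATES = {
    "en-us": "https://brand.com/",
    "en-gb": "https://brand.co.uk/",
    "en-ca": "https://brand.ca/",
    "x-default": "https://brand.com/",  # fallback for all other locales
}

def hreflang_tags(alternates):
    # Every variant must list ALL variants, itself included;
    # non-reciprocal annotations get ignored.
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in alternates.items()
    ]

for tag in hreflang_tags(ALTERNATES):
    print(tag)
```

The same four tags go on every one of the three homepages, which is what makes the annotations reciprocal.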
I hope this helps,
Hannah
Related Questions
-
"Update" in Search Console is NOT an Algo Update
We've had a few questions about the line labeled "Update" in Google Search Console on the Search Analytics timeline graph (see attached image). Asking around the industry, there seems to be a fair amount of confusion about whether this indicates a Google algorithm update. This is not an algorithm update - it indicates an internal update in how Google is measuring search traffic. Your numbers before and after the update may look different, but this is because Google has essentially changed how they calculate your search traffic for reporting purposes. Your actual ranking and traffic have not changed due to these updates. The latest update happened on April 27th and is described by Google on this page: Data anomalies in Search Console. Given the historical connotations of "update" in reference to Google search, this is a poor choice of words and I've contacted the Webmaster Team about it.
Algorithm Updates | Dr-Pete
-
Duplicate Content: Almost the same site on different domains
Hi, I own a couple of casting websites, and I'm at the moment launching "local" copies of them all over the world. When I launch my website in a new country, the content is basically always the same, except the language changes from country to country. The domains will vary, so the site name would be site.es for Spain, site.sg for Singapore, site.dk for Denmark and so on. The websites will also feature different jobs (castings) and different profiles on the search pages and so on, BUT the more static pages have the same content (About us, The concept, FAQ, Create user and so on). So my questions are: Is this something that is bad for Google SEO? The sites are atm NOT linking to each other with language flags or anything - should I do this? Basically to tell Google that the business behind all these sites is somewhat big. Is there a way to inform Google that these sites should NOT be treated as duplicate content? (A canonical tag won't do, since I want the "same" content to be listed on the local Google sites.) Hope there are some experts here who can help. /Kasper
Algorithm Updates | KasperGJ
-
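For what it's worth, the mechanism Google documents for exactly this situation (same site, localized per country) is hreflang annotation rather than canonical tags - for example via xhtml:link entries in the XML sitemap. A rough sketch for one URL, where the language codes and homepage paths are illustrative assumptions:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://site.es/</loc>
    <!-- every local version lists ALL local versions, itself included -->
    <xhtml:link rel="alternate" hreflang="es" href="https://site.es/"/>
    <xhtml:link rel="alternate" hreflang="en-sg" href="https://site.sg/"/>
    <xhtml:link rel="alternate" hreflang="da" href="https://site.dk/"/>
  </url>
  <!-- repeat a <url> block for site.sg and site.dk with the same set -->
</urlset>
```

This tells Google the pages are deliberate local alternates rather than duplicates, while still letting each ccTLD rank on its own local Google.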
Ecommerce SEO: Is it bad to link to product/category pages directly from content pages?
Hi! In Moz's Whiteboard Friday video Headline Writing and Title Tag SEO in a Clickbait World, Rand talks about (among other things) best practices related to linking between search, clickbait and conversion pages. For a client of ours, a cosmetics and make-up retailer, we are planning to build content pages around related keywords - for example video, pictures and text about make-up and fashion - in order to best target and capture search traffic related to make-up that occurs earlier in the customer journey. Among other things, we plan to use these content pages to link directly to some of the products. (For example, a content piece about how to achieve full lashes will link to particular mascaras and/or the mascara category.) Thing is, in the Whiteboard video Rand says:
"..So your click-bait piece, a lot of times with click-bait pieces they're going to perform worse if you go over and try and link directly to your conversion page, because it looks like you're trying to sell people something. That's not what plays on Facebook, on Twitter, on social media in general. What plays is, 'Hey, this is just entertainment, and I can just visit this piece and it's fun and funny and interesting.'"
Does this mean linking directly to product pages (or category pages) from content pages is bad? Will Google think that, since we are also trying to sell something with the same piece of content, we do not deserve to rank that well on the content, and won't consider us that relevant for a search query where people are looking for make-up tips and make-up guides? Also, is there any difference between linking from content to categories vs. products? I mean, a category page is not a conversion page the same way a product page is. Looking forward to your answers 🙂
Algorithm Updates | Inevo
-
How much content is it safe to change?
I have read that it is unsafe to change more than 20% of your site's content in any update. The rationale is that "changing too much at once can flag your site within the Google algorithm as having something suspicious going on." Is this true? Has anyone had any direct experience of this or similar?
Algorithm Updates | GrouchyKids
-
Content Caching Memory & Removal of 301 Redirect for Relieving Links Penalty
Hi, A client site has had a very poor link legacy, stretching back over 5 years. I started the campaign a year ago, providing valuable good-quality links. Link removals and a disavow submission to Google have been done, however after months and months of waiting nothing has happened. If anything, after the recent Penguin update, results have been further affected. A 301 redirect was undertaken last year, consequently associating those bad links with the new site structure. I have since removed the 301 redirect in an attempt to detach this legacy, however with little success. I have read up on this and not many people appear to agree on whether this will work. Therefore, my new decision is to start afresh using a new domain, switching from the .com to the .co.uk version, helping remove all legacy and all association with the spam-ridden .com. However, my main concern with this is whether Google will forever cache content from the spammy .com and remember it, because the content on the new .co.uk site will be exactly the same (content of great quality, receiving hundreds of visitors each month from the blog section alone). The problem is definitely link related and NOT content related, as I imagine people may first query. This could then cause duplicate content, knowing that this content pre-existed on another domain. I will implement a robots.txt file removing all of the .com site, as well as a noindex, nofollow - and I understand you can submit a site removal request to Google within Webmaster Tools to help fast-track the deindexation of the spammy .com. Then, once it has been deindexed, the new .co.uk site will go live with the exact same content. So my question is whether Google will then completely forget that this content has ever existed, allowing me to use exactly the same content on the new .co.uk domain without the threat of a duplicate content issue? Also, any insights or experience in the removal of a 301 redirect, detaching legacy and its success would be very helpful!
Thank you, Denver
Algorithm Updates | ProdoDigital
-
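One technical note on the robots.txt-plus-noindex plan described in that question: the two mechanisms conflict, because a robots.txt Disallow stops Googlebot from crawling the pages at all, so it never sees the noindex tag. A minimal sketch of each (contents illustrative):

```text
# robots.txt on the old .com - blocks crawling entirely,
# which also hides any noindex tag from Googlebot:
User-agent: *
Disallow: /

# per-page meta tag - requests removal from the index,
# but only takes effect while the page can still be crawled:
<meta name="robots" content="noindex, nofollow">
```

Typically you would let the noindex pages be crawled until they drop out of the index, and only add the blanket Disallow afterwards.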
Is all duplicate content bad?
We were badly hit by Panda back in January 2012. Unfortunately, it is only now that we are trying to recover. CASE 1: We develop software products. We send out a 500-1000 word description of each product to various download sites so that they can add it to their product listing. So there are several hundred download sites with the same content. How does Google view this? Did Google penalize us for this reason? CASE 2: In the above case the product description does not match any content on our website. However, there are several software download sites that copy and paste the content from our website as the product description. So in this case, the duplicate content matches our website. How does Google view this? Did Google penalize us for this reason? Along with all the download sites, there are also software piracy & crack sites that have the duplicate content. So, should I remove duplicate content only from the software piracy & crack sites, or also from genuine download sites? Does Google reject all kinds of duplicate content? Or does it depend on who hosts the duplicate content? Confused 😞 Please help.
Algorithm Updates | Gautam.Jain
-
How to build good content and choose the right keywords?
I have started building content for our website using WordPress. I use GA and the AdWords Keyword Tool, and I have selected a few exact-match keywords. How do I know if these keywords are actually the ones that will give me good traffic? How can I select good keywords and write content around them? I don't wish to overstuff articles with the keywords - how can I refrain from doing so? Is there an optimum limit for how many times a keyword needs to occur within an article? Please give some good insights as to how this is accomplished. Thanks
Algorithm Updates | shanky1
-
Content below the fold and Panda Update
Hi, I was at the LinkLove conference and I heard some worrying stories about the way content is formatted on a page being a factor in how eHow has avoided being slapped. It was the first time I had heard the expression "below the fold". I am producing some very sexy SERP results, and other sexier metrics are up too, but I am concerned that thefurnituremarket.co.uk has a ton of images on the home page and the nice content is below all of them. Firstly, is this content "below the fold"? Secondly, I know the site is old, but do you think when this Panda update hits the UK we will be penalised for the look of the site? I know there was talk yesterday at the conference of coming up with a tool to check this out. My gut says that this will be a factor sooner rather than later, hence I am looking at Magento and how we can skin it to look nice and present products better. I would be really interested to know what exactly is "below the fold" on thefurnituremarket.co.uk, and some thoughts on the whole eHow formatting issue.
Algorithm Updates | robertrRSwalters