Rel="alternate" hreflang="x" or Unique Content?
-
Hi All,
I have three sites: brand.com, brand.co.uk and brand.ca.
They all have the same content with very, very minor changes. What's best practice: to use rel="alternate" hreflang="x", or to have unique content written for each of them?
Just wondering, after Panda, Penguin and the rest of the Zoo, what the best way is to run multinational sites and achieve top positions for each of them in its own country.
If you think it would be better to have unique content for each of them, please let us know your reasons.
Thanks!
-
Hello there,
In an ideal world I would recommend (wherever possible) that completely different content be created for the UK / US / Canadian markets.
I recommend this mainly because there are a lot of differences in consumer behaviour. Although we all speak English, the English we speak, the way we search, the messaging we respond to and so on are all different.
Obviously the option to create separate content isn't open to everyone (budgets, resources, etc.). As such, if you can't stretch to creating separate content for each market, I'd probably go with the rel="alternate" hreflang implementation.
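To illustrate, here's a minimal sketch of what that implementation could look like on the three homepages (the URL schemes and the x-default choice are my assumptions, not from the question):

```html
<!-- Placed in the <head> of ALL three homepages: each page lists itself
     plus both alternates, and the set must be identical across the sites -->
<link rel="alternate" hreflang="en-us" href="http://brand.com/" />
<link rel="alternate" hreflang="en-gb" href="http://brand.co.uk/" />
<link rel="alternate" hreflang="en-ca" href="http://brand.ca/" />
<!-- Optional fallback for searchers outside the three target countries -->
<link rel="alternate" hreflang="x-default" href="http://brand.com/" />
```

The same reciprocal set has to be repeated for every equivalent page across the three sites (or declared in an XML sitemap); if the return links are missing, Google ignores the annotations.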
I hope this helps,
Hannah
Related Questions
-
Blog-post pages are dominating in Search Console "Internal Links" - only the home page is at the top!
Hi all, ours is a WordPress website with a blog at website.com/blog/. All the important pages on the site are well linked from the top and footer menus, yet in the Webmaster Tools internal links section only the home page sits at the top, and the rest of the list is dominated by blog posts rather than our main pages. I wonder why the blog pages are dominating. Please give your suggestions. Do you think Google will give more priority to the blog posts than to the main site pages because they are technically more linked? Thanks
Algorithm Updates | vtmoz
-
HREFLANG for multiple country/language combinations
We have a site set up with English, German, French, Spanish and Italian, and we offer these languages for every European country (over 30). That gives 150+ different URL combinations, as we use the /country/language/ subdirectory path. Should I list out every combination in hreflang? Or should I simply choose the most applicable combinations (/de/de/, /fr/fr/, etc.)? If we go the latter path, should I block Googlebot from crawling the atypical combinations? Best, Sam
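For illustration, full annotation would mean that every URL in a page group carries one entry per live combination. A truncated sketch, assuming the /country/language/ paths described in the question (example.com and the specific pairs are placeholders):

```html
<!-- Each of the 150+ URLs in the group repeats the complete set -->
<link rel="alternate" hreflang="de-DE" href="http://example.com/de/de/" />
<link rel="alternate" hreflang="en-DE" href="http://example.com/de/en/" />
<link rel="alternate" hreflang="fr-FR" href="http://example.com/fr/fr/" />
<link rel="alternate" hreflang="fr-BE" href="http://example.com/be/fr/" />
<!-- ...one entry per remaining country/language combination... -->
<link rel="alternate" hreflang="x-default" href="http://example.com/" />
```

Note that hreflang values are written language-REGION (en-DE is English content for users in Germany), the reverse of the /country/language/ URL order used here.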
Algorithm Updates | JohnnyECCO
-
I am wondering if there is a right answer for keywords with alternate spellings.
I work in insurance, specifically CoOp Insurance. I researched the three different spellings (Co Op, Co-Op and CoOp) on Google Trends, and there is search volume for all three but no big trends. Moz shows optimization opportunities for all three. Is there a right one to go after? Is the correct spelling the one to target, even if it doesn't have the highest search volume? Does going after the misspellings dilute branding, or enhance search visibility?
Algorithm Updates | Trent.Warner
-
How does Google's "Temporarily remove URLs" tool in Search Console work?
Hi, we have created a new sub-domain with new content that we want to highlight for users, but our old content on a different sub-domain tops the Google results thanks to its established reputation. How can we highlight the new content and suppress the old sub-domain in the results? Many pages have similar title tags and other information. We are planning to hide the old URLs via Google Search Console so that the new pages slowly attain the traffic. How does it work?
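Worth noting: the tool's removals are temporary, as its name suggests, so hiding the old URLs in Search Console alone won't produce a lasting change. For the old pages to stay out of the results they also need a noindex directive (or a redirect to their replacements). A minimal sketch, assuming the old pages should stay live for existing visitors:

```html
<!-- In the <head> of each old sub-domain page to be dropped from the index -->
<meta name="robots" content="noindex" />
```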
Algorithm Updates | vtmoz
-
Duplicate Content
I was just using a program (Copyscape) to see if the content on a client's website has been copied. I was surprised that the site's content showed as 70% duplicated, with the same content appearing on a few other sites at varying percentages (ranging from 35% to 80%). I have been informed that the content on the client's site is original and was written by the client. My question is: does Google know or understand that the client's content is the original and that the other sites copied it word for word? Does the client need to rewrite the content to make it unique again? I just want to be sure before I tell him to rewrite all the content on the site. I'm well aware that duplicate content is bad, but I'm curious whether it's hurting the client's site even though they originally created the content. Thanks for your input.
Algorithm Updates | Kdruckenbrod
-
Big rise in "Keyword not defined"
Hi all. Has anyone else seen a massive increase in "not provided" keywords in their analytics in the past couple of weeks? It's probably related to this (source: http://searchengineland.com/post-prism-google-secure-searches-172487): "In the past month, Google quietly made a change aimed at encrypting all search activity — except for clicks on ads. Google says this has been done to provide 'extra protection' for searchers, and the company may be aiming to block NSA spying activity." Other than the unreliable stats from WMT, there don't seem to be many ways left to find out what is sending traffic to our sites!
Algorithm Updates | GrumpyCarl
-
Rich Snippets: rel="author" CTR?
Hi everybody, I want to add rel="author" to my websites so that my Google+ profile image appears next to my results in Google search. Does anyone have statistics or case histories on the effects (positive or negative) this can have on CTR? Logically I think it should increase CTR, but I'm not sure that is the case for all sectors. Thanks in advance for your answers.
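For anyone setting this up, the authorship markup itself was a single link from the page to a Google+ profile, with the profile linking back to the site from its "Contributor to" section. A sketch, with a placeholder profile ID:

```html
<!-- On each article page; the numeric profile ID below is a placeholder -->
<link rel="author" href="https://plus.google.com/112233445566778899000" />
```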
Algorithm Updates | BizonwebItaly
-
Duplicate Page Content - 404s or 301s?
I deleted about 100 pages of stale content six months ago, and they currently return 404s. The crawl diagnostics have flagged 77 duplicate pages because of this. Should I 301-redirect these pages to get rid of the errors, or keep them as 404s? Most of the pages still have some page authority, but I don't want to get penalized. Just looking for the best solution. Thanks!
Algorithm Updates | braunna