Implementation advice on fighting international duplicate content
-
Hi All,
Let me start by explaining that I am aware of the rel="canonical" and **rel="alternate" hreflang="x"** tags, but I need advice on implementation.
The situation is that we have 5 sites with similar content. Out of these 5:
- 2 use the same URL structure and have no suffix
- 2 have a different URL structure with a .html suffix
- 1 has an entirely different URL structure with a .asp suffix
The sites are quite big, so it will take a lot of work to go through and add rel="alternate" hreflang="x" tags to every single page (as we know, the tag should be applied at page level, not site level).
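For anyone following along, the page-level implementation being discussed is a set of `<link>` elements in the `<head>` of every page, where each page lists itself plus every alternate version. A minimal sketch (the domains and paths below are hypothetical, chosen only to mirror the three URL structures mentioned above):

```html
<!-- In the <head> of the en-gb page; every language version carries the same full set -->
<link rel="alternate" hreflang="en-gb" href="http://example.co.uk/widgets" />
<link rel="alternate" hreflang="en-us" href="http://example.com/widgets.html" />
<link rel="alternate" hreflang="zh-cn" href="http://example.cn/widgets.asp" />
<link rel="alternate" hreflang="x-default" href="http://example.com/widgets.html" />
```

Because the annotations must be reciprocal, missing tags on the fifth site would weaken the annotations on the other four as well.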
Four of the five sites are managed by us and already have the tag implemented, which makes things easier, but the fifth is managed in Asia and we fear the amount of manual work required will put that team off implementing it. The site is due to launch at the end of the month, and we need to sort this issue out before it goes live so that we are not penalised for duplicate content.
Is there an easy way to go about this, or is manual addition the only option?
Has anyone had a similar experience?
Your advice will be greatly appreciated.
Many thanks,
Emeka.
-
Unfortunately yes, the process would need to be rerun with the tool.
-
Thanks Gianluca,
Have you had experience using the tool above? Presumably each time a new page is added to the site the tool would have to be run again?
I agree that an in-house solution would be best, but given the time limit we are open to ideas.
I appreciate your response.
Emeka.
-
When it comes to massive sites and hreflang annotations, the ideal solution is to implement hreflang using the sitemap.xml method.
It is explained here by Google: https://support.google.com/webmasters/answer/2620865?hl=en.
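The sitemap method moves the same annotations out of the page templates and into a single file, which is useful when you cannot touch the fifth site's markup. A minimal sketch of what Google's documentation describes (URLs hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://example.co.uk/widgets</loc>
    <xhtml:link rel="alternate" hreflang="en-gb" href="http://example.co.uk/widgets"/>
    <xhtml:link rel="alternate" hreflang="en-us" href="http://example.com/widgets.html"/>
    <xhtml:link rel="alternate" hreflang="zh-cn" href="http://example.cn/widgets.asp"/>
  </url>
  <!-- repeat a <url> block, with the same full alternate set, for every URL in the cluster -->
</urlset>
```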
A tool that makes it easier to implement hreflang in a sitemap file is the one The Mediaflow created:
http://www.themediaflow.com/tool_hreflang.php.
Right now, that is the only tool I know of for this kind of task, so you could also consider building an in-house solution if you have developers who can be dedicated to it.
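If you do go the in-house route, the script can be quite small: take a mapping of equivalent pages across the five sites and emit the sitemap entries, so it can be rerun automatically whenever pages are added rather than redone by hand. A rough sketch, assuming a simple list-of-dicts mapping format (nothing here reflects how the tool above works):

```python
# Build an hreflang sitemap from a mapping of equivalent pages across sites.
# Each cluster maps an hreflang code to that language/region's URL for one page.
from xml.sax.saxutils import escape

def build_hreflang_sitemap(page_clusters):
    """page_clusters: list of dicts like {"en-gb": url, "en-us": url, ...}."""
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"',
        '        xmlns:xhtml="http://www.w3.org/1999/xhtml">',
    ]
    for cluster in page_clusters:
        # one <url> entry per language version in the cluster
        for url in cluster.values():
            lines.append('  <url>')
            lines.append(f'    <loc>{escape(url)}</loc>')
            # every version lists the full set of alternates, itself included
            for lang, href in cluster.items():
                lines.append(
                    f'    <xhtml:link rel="alternate" hreflang="{lang}" '
                    f'href="{escape(href)}"/>'
                )
            lines.append('  </url>')
    lines.append('</urlset>')
    return "\n".join(lines)

if __name__ == "__main__":
    clusters = [{
        "en-gb": "http://example.co.uk/widgets",
        "en-us": "http://example.com/widgets.html",
        "zh-cn": "http://example.cn/widgets.asp",
    }]
    print(build_hreflang_sitemap(clusters))
```

The mapping file is the only manual piece; if the five CMSs can export their URL lists, even that could be generated by matching pages on a shared identifier.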