Implementation advice on fighting international duplicate content
-
Hi All,
Let me start by explaining that I am aware of the rel="canonical" and **rel="alternate" hreflang="x"** tags, but I need advice on implementation.
The situation is that we have 5 sites with similar content. Out of these 5:
- 2 use the same URL structure and have no suffix
- 2 have a different URL structure with a .html suffix
- 1 has an entirely different URL structure with a .asp suffix
The sites are quite big, so it will take a lot of work to go through and add rel="alternate" hreflang="x" tags to every single page (as we know, the tag should be applied at a page level, not a site level).
4 out of the 5 sites are managed by us and already have the tag implemented, which makes things easier, but the 5th is managed in Asia and we fear the amount of manual work required will put them off implementing it. The site is due to launch at the end of the month, and we need to sort this issue out before it goes live so that we are not penalised for duplicate content.
Is there an easy way to go about this, or is manual addition the only way?
Has anyone had a similar experience?
Your advice will be greatly appreciated.
Many thanks,
Emeka.
-
Unfortunately yes, the process needs to be rerun with the tool each time.
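If you do go the in-house route, regeneration can be scripted so that new pages are picked up automatically on each run. Below is a minimal sketch in Python, assuming (hypothetically) that the sites share the same page slugs and differ only in domain and URL suffix; the example.* domains and locale codes are placeholders, not your real sites:

```python
# Illustrative sketch only: domains, suffixes, and locale codes are placeholders.
# Each locale maps to (base URL, page suffix), mirroring the thread's mix of
# no-suffix, .html, and .asp URL structures.
SITES = {
    "en-gb": ("https://example.co.uk/", ""),
    "en-us": ("https://example.com/", ".html"),
    "zh-cn": ("https://example.cn/", ".asp"),
}

def hreflang_urls(slug):
    """Map each locale code to the full URL of the page on that locale's site."""
    return {lang: f"{base}{slug}{suffix}" for lang, (base, suffix) in SITES.items()}

def sitemap_entry(slug, site_lang):
    """Render one <url> block for the given site's sitemap: the page's own
    <loc> plus one xhtml:link alternate per locale, including a
    self-referencing annotation as Google's guidelines require."""
    urls = hreflang_urls(slug)
    links = "\n".join(
        f'    <xhtml:link rel="alternate" hreflang="{lang}" href="{href}"/>'
        for lang, href in sorted(urls.items())
    )
    return f"  <url>\n    <loc>{urls[site_lang]}</loc>\n{links}\n  </url>"

if __name__ == "__main__":
    # Regenerating the annotations is then just rerunning this over the page list.
    print(sitemap_entry("products/garage-doors", "en-us"))
```

Rerunning the script over an up-to-date page list would regenerate the full sitemap, which avoids the manual per-page work that might put the Asian team off.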
-
Thanks Gianluca,
Have you had experience using the tool above? Presumably each time a new page is added to the site, the tool would have to be run again?
I agree that an in-house solution would be best, but given the time limit we are open to ideas.
I appreciate your response.
Emeka.
-
When it comes to massive sites and hreflang annotations, the ideal solution is to implement hreflang using the sitemap.xml method.
It is explained here by Google: https://support.google.com/webmasters/answer/2620865?hl=en.
A tool that makes it easier to implement hreflang in a sitemap file is the one The Mediaflow created:
http://www.themediaflow.com/tool_hreflang.php.
Right now, that is the only tool I know of for that kind of task, so you could also consider creating an in-house solution, if you have internal developers who can be dedicated to this.
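For reference, a single page's entry with the sitemap method looks roughly like this (the example.* URLs are placeholders for your own sites). Note that every entry must list all language versions, including a self-referencing link, and each of your five sitemaps needs the full set of annotations:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/page.html</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="https://example.com/page.html"/>
    <xhtml:link rel="alternate" hreflang="en-gb" href="https://example.co.uk/page"/>
    <xhtml:link rel="alternate" hreflang="zh-cn" href="https://example.cn/page.asp"/>
  </url>
</urlset>
```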
Related Questions
-
Content Page URL Question
Our main website is geared toward the city where we are located and includes the city name in content page URLs. We also have separate websites for three surrounding cities; these websites have duplicate content except for the city name:
- MainWebsite.com
- City2-MainWebsite.com
- City3-MainWebsite.com
- City4-MainWebsite.com
We're restructuring to eliminate the location websites and only use the main website. The new site will have city pages. We have well-established Google business locations for all four cities. We will keep all locations, replacing the location websites with the main website. Should we remove City-IL from all content page URLs in the new site? We don't want to lose traffic/ranking for City2 or City3 because the content pages have City1 in the URL. Page URLs are currently formatted as follows:
- www.MainWebsite.com/Service-1-City1-IL.html
- www.MainWebsite.com/Service-2-City1-IL.html
- www.MainWebsite.com/Service-3-City1-IL.html
- www.MainWebsite.com/Service-4-City1-IL.html
Thanks!
Local Website Optimization | sharon75025
-
Current advice or best practice for personalization by geolocation?
What is the current advice for displaying content based on a user's geolocation? On the one hand, I know the rule of thumb is that you are not supposed to treat Googlebot any differently than any other user to your site, and you shouldn't show different content than what you would show a regular user. On the other hand, if we personalize the content based on geography, the content that is indexed would be specific to Mt. View, CA in Google's index, correct? I know I heard years ago that the best practice was to use JavaScript to personalize the content client-side, and to block the JS with robots.txt so that Google indexes a default page and not a geo-specific page. Any insights or advice appreciated.
Local Website Optimization | IrvCo_Interactive
-
Best SEO practice for a project gallery (image gallery)? I need SEO professionals' advice.
Hi, I have a website that is powerful and I don't want to hurt it: http://dreamgaragedoor.com/. Right now I need a projects gallery page where people can go to find images of the models, products, and services. I have created the page, and it will have 6 sliders, each with at least 10 images inside. My first question is whether having this many images would hurt my website. Second, what ALT text should I use for this many pictures on one page? For example, I think having ALT text like the following on one page would be bad SEO-wise: Sliding-gate-1, Sliding-gate-2, Sliding-gate-3, Sliding-gate-4, ... Please take a look at the gallery page and let me have your pro ideas: http://dreamgaragedoor.com/galleries/. Thanks
Local Website Optimization | Mishel298
-
Duplicate Content - Local SEO - 250 Locations
Hey everyone, I'm currently working with a client that has 250 locations across the United States. Each location has its own website, and each website has the same 10 service pages, all with identical content (the same 500-750 words) with the exception of unique meta-data and NAP, which has each respective location's name, city, state, etc. I'm unsure how duplicate content works at the local level. I understand that there is no penalty for duplicate content; rather, any negative side-effects arise because search engines don't know which page to serve if there are duplicates. So here's my question: if someone searches for my client's services in Miami, and my client only has one location in that city, does duplicate content matter? Because that location isn't competing against any of my client's other locations locally, search engines shouldn't be confused by which page to serve, correct? Of course, in other cities, like Phoenix, where they have 5 locations, I'm sure the duplicate content is negatively affecting all 5 locations. I really appreciate any insight! Thank you,
Local Website Optimization | SEOJedi51
-
Using geolocation for dynamic content - what's the best practice for SEO?
Hello, We sell a product globally, but I want to use different keywords to describe the product based on location. For this example, let's say in the USA the product is a "bathrobe" and in Canada it's a "housecoat" (same product, just a different name). What this means: I want to show "bathrobe" content in the USA (lots of global searches) and "housecoat" in Canada (fewer searches). I know I can show the content using a geolocation plugin (I also found a caching plugin which will get around the issue of people seeing cached versions), using JavaScript, or HTML5. I want a solution which enables someone in Canada searching for "bathrobe" to find our site through Google search too; I want to rank for "bathrobe" in BOTH the USA and Canada. I have read articles which say Google can read the dynamic content in JavaScript, as well as the geolocation plugin. However, the plugins suggest Google crawls the content based on location too. I don't know about JavaScript. Another option is having two separate pages (one for "bathrobe" and one for "housecoat") and using geolocation for the main menu (if they find the other page, i.e. the bathrobe page, through a Canadian search, they will still see it). This may have an SEO impact by splitting the traffic, though. Any suggestions or recommendations on what to do? What do other websites do? I'm a bit stuck. Thank you so much! Laura
PS. I don't think we have enough traffic to add subdomains or subdirectories.
Local Website Optimization | LauraFalls
-
Content writing for single entity business (The use of I)
Most of my clients are single-entity law firms, whose lawyers repeatedly use the pronoun "I" to describe every service they provide. I have always preferred using the business name, "The Law Office of [put lawyer name here]". Is it OK to repetitively use the pronoun "I" in the content? To me it feels lackluster and childish, not very professional; however, I have a hard time convincing the lawyers of this. What are your thoughts? Can good content be written with the repetitive use of "I"? If not, is the business name sufficient, or maybe another pronoun? I will be showing responses to my clients, if that is OK.
Local Website Optimization | donsilvernail
-
Expert Advice Needed: Single Domain vs Multiple Domain for 2 Different Countries?
Hi MOZers, We are looking for some advice on whether to have a single TLD (.com) or 2 separate domains (.ca & .com). Our website will have different products & pricing for each of our US users (.com) and Canada users (.ca). Since we are targeting different countries & user groups with each domain, we are not concerned about "duplicate content". So, does it make more sense to have a single domain for compounding our content marketing efforts? Or will it be more beneficial to have separate domains for the geo-targeting benefits on Google.ca & Google.com? Looking forward to some great suggestions.
Local Website Optimization | ScorePromotions
-
How can I rank my .co.uk using content on my .com?
Hi, We currently have a .com site ranking second for our brand term in the .co.uk SERP. This is mainly because we don't own the exact-match brand term, which comes from not having a clue what we were doing when we set up the company. Would it be possible to outrank this term, considering the weighting that Google puts on exact matches in the URL? N.B. There are a few updates we could make to the homepage to improve the on-page optimisation, and we have not actively done any link building yet, which will obviously help.
- Competitor: SERP rank 1 - Moz PA 38, DA 26
- Our site: SERP rank 2 - Moz PA 43, DA 32
Thanks, Ben
Local Website Optimization | benjmoz