Duplicate content - news archive
-
Most of them are due to news items having more than one category, which is pretty normal. Also, /us/blog, /uk/blog and /ca/blog are effectively the same page. None of them are actually duplicate content, just alternate URLs for the same page: http://www.fdmgroup.com/category/news/
-
From developer: "Looking into this, we need to have /uk/blog, /us/blog and /ca/blog in order for them to appear on the menus – we could put a noindex meta tag on the us and ca pages to avoid duplicates?"
Or do you recommend the hreflang tag? Thanks.
-
Hi Christopher,
Google has definitely become a lot better in recent years at identifying this sort of duplication and dealing with it, largely because this has to be one of the most common accidental, non-malicious causes of duplication: categories on blogs and news sites. That said, cleaning it up is for the best.
I have been meaning to clean up a family member's blog in this manner for months (years...) because the version of each piece of content Google has chosen is wrong: date archive pages, e.g. example.com/2014/01, kept ranking better than the original posts in that date range. So even when Google makes the decision for you, it won't necessarily make the right one.
You're risking visitors landing on a page they didn't expect, or one that doesn't answer their query as succinctly as the "best" version would have. If you're in e-commerce of any sort, or focusing on conversions, this can make a big difference to how well optimised your visitors' on-site experience is.
Where you'll be "penalised" for duplicate content, especially by Panda, is as you cite above: when the duplication looks like it has been done for spam purposes. This has happened accidentally to people when their content management systems have gone mad with infinite duplication, but it likely won't happen with simple blog categories.
In short, Google sees this sort of duplication all day, every day and will choose its favourite version to rank. However, if you can guide its choice, you're in control of what your visitors see.
You mention country-based categories in your original question. If internationalisation and duplicate content are a concern, you might want to check out hreflang annotations (implemented as rel="alternate" hreflang link elements, so the community uses both names for them). They could be useful if you're publishing the same thing in different countries.
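As a minimal sketch of how that looks, assuming the three blog URLs from the original question and en-gb/en-us/en-ca as the intended regional targets, each version of the page would carry the full set of annotations in its head:

```html
<!-- Sketch only: URLs taken from the question; the regional codes and
     the choice of the UK blog as x-default are assumptions.
     The same set of link elements goes in the <head> of all three
     versions, so the annotations are reciprocal. -->
<link rel="alternate" hreflang="en-gb" href="http://www.fdmgroup.com/uk/blog" />
<link rel="alternate" hreflang="en-us" href="http://www.fdmgroup.com/us/blog" />
<link rel="alternate" hreflang="en-ca" href="http://www.fdmgroup.com/ca/blog" />
<link rel="alternate" hreflang="x-default" href="http://www.fdmgroup.com/uk/blog" />
```

This tells Google the pages are regional alternates of one another rather than duplicates, and lets it serve the right URL to searchers in each country. Note that hreflang only works if every version lists the complete set, including itself.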
-
From my developer: "Doing a bit of research, Google have explicitly stated that they don't penalise duplicate content unless it appears to be deliberately deceptive. The only issue is which version appears in the search results.
'Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results. If your site suffers from duplicate content issues, and you don't follow the advice listed above, we do a good job of choosing a version of the content to show in our search results.'
https://support.google.com/webmasters/answer/66359?hl=en
Matt Cutts, Google's head of search spam, posted a video today about duplicate content and the repercussions of it within Google's search results. Matt said that somewhere between 25% and 30% of the content on the web is duplicative: of all the web pages and content across the internet, over one quarter of it is repetitive or duplicative.
But Cutts says you don't have to worry about it. Google doesn't treat duplicate content as spam. It is true that Google only wants to show one of those pages in their search results, which may feel like a penalty if your content is not chosen, but it is not. Google takes all the duplicates and groups them into a cluster, then shows the best result from that cluster.
Matt Cutts did say Google reserves the right to penalise a site that is excessively duplicating content in a manipulative manner. But overall, duplicate content is normal and not spam.
http://searchengineland.com/googles-matt-cutts-25-30-of-the-webs-content-is-duplicate-content-thats-okay-180063
http://searchengineland.com/googles-matt-cutts-duplicate-content-wont-hurt-you-unless-it-is-spammy-167459
Cheers"
-
I'm afraid your blog pages are in fact duplicate content, in Google's eyes anyway.
The /us/blog, /uk/blog and /ca/blog examples are all separate URLs that you are asking Google to index (separate canonical tags for each and no robots instructions that I can see). Google is going to look at these and any blog posts within them as separate pages. Once it realises they all have the same content, it will likely result in a Panda algorithmic penalty.
The risk here is that this penalty might affect your entire domain rather than just the offending pages, and I really don't see that as a risk worth taking. Therefore, I strongly advise removing the separate versions of the blogs and consolidating into one blog, with 301 redirects from the local blog URLs to the new one. Failing that, choose one version and instruct Google not to index the other versions by putting a meta robots noindex tag in their headers (bear in mind that disallowing them in robots.txt only stops them being crawled; it doesn't reliably keep already-known URLs out of the index).
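The consolidation option could look something like this. This is a sketch only: it assumes an Apache server with mod_rewrite enabled, and it assumes /blog as the consolidated target, which isn't specified in the thread:

```apache
# Sketch only: assumes Apache + mod_rewrite, and /blog as the
# hypothetical consolidated blog URL. Goes in .htaccess or the vhost.
RewriteEngine On
# Permanently redirect each regional blog (and everything under it)
# to the single consolidated blog, preserving the rest of the path.
RewriteRule ^uk/blog(.*)$ /blog$1 [R=301,L]
RewriteRule ^us/blog(.*)$ /blog$1 [R=301,L]
RewriteRule ^ca/blog(.*)$ /blog$1 [R=301,L]
```

The 301 (permanent) status is what tells Google to pass the old URLs' signals to the consolidated blog rather than treating the move as temporary.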
I also advise that you noindex the category page to be sure that its content isn't being seen as duplicate either. More info on how to do that can be found in the Moz Robots Guide.
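For the noindex approach, a minimal sketch of the tag (the "follow" directive is my suggestion, not something specified above):

```html
<!-- Goes in the <head> of each category page (or regional blog
     version) you want kept out of the index. "follow" lets Google
     still crawl through the links on the page. -->
<meta name="robots" content="noindex, follow" />
```

One caveat: for Google to see this tag, the page must remain crawlable, so don't combine it with a robots.txt disallow on the same URLs.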