Duplicate content - news archive
-
Most of them are due to news items having more than one category, which is pretty normal. Also, /us/blog, /uk/blog and /ca/blog are effectively the same page. None of them are actually duplicate content, just alternate URLs for the same page: http://www.fdmgroup.com/category/news/
-
From developer: "Looking into this, we need to have /uk/blog, /us/blog and /ca/blog in order for them to appear on the menus – we could put a noindex meta tag on the us and ca pages to avoid duplicates?"
Or do you recommend href lang tag? Thanks.
-
Hi Christopher,
Google has definitely become a lot better in recent years at identifying this sort of duplication and dealing with it, largely because categories on blogs and news sites have to be one of the most common accidental, non-malicious causes of duplication. That said, cleaning it up is for the best.

I have been meaning to clean up a family member's blog in this manner for months (years...) because the version of each piece of content Google has chosen is wrong: pages marked up with dates, e.g. example.com/2014/01, kept ranking better than the original posts in that date range. So even when Google makes the decision for you, it won't necessarily make the right one. You're risking visitors coming to a page they didn't expect, or a page that doesn't answer their query as succinctly as the "best" version would have, and if you are in e-commerce of any sort, or focusing on conversions, this can make a big difference to your visitors' on-site experience.
Where you'll be "penalised" for duplicate content, especially by Panda, is as you cite above: when the duplication looks like it has been done for spam purposes. This has happened accidentally to people when their content management systems have gone mad with infinite duplication, but it likely won't happen with simple blog categories.
In short, Google sees this sort of duplication all day, every day and will choose its favourite version to rank. However, if you can guide its choice, you're in control of what your visitors see.
You mention country-based categories in your original question. If internationalisation and duplicate content are a concern, you might want to check out the hreflang attribute (implemented as rel="alternate" hreflang link elements, so the community uses both names). It could be useful if you're publishing the same thing in different countries.
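If you did go the hreflang route, a minimal sketch of the annotations (using the /uk/, /us/ and /ca/ blog paths from your question purely as an illustration) would look like this in the <head> of every version:

```html
<!-- Illustrative hreflang annotations; the full set goes in the <head> of all three pages -->
<link rel="alternate" hreflang="en-gb" href="http://www.fdmgroup.com/uk/blog/" />
<link rel="alternate" hreflang="en-us" href="http://www.fdmgroup.com/us/blog/" />
<link rel="alternate" hreflang="en-ca" href="http://www.fdmgroup.com/ca/blog/" />
<!-- Optional: a fallback for visitors who match none of the locales above -->
<link rel="alternate" hreflang="x-default" href="http://www.fdmgroup.com/uk/blog/" />
```

Note that each page must list every version, including itself, and the annotations need to be reciprocal across all three pages or Google will ignore them.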
-
From my developer:

"Doing a bit of research, Google have explicitly stated that they don't penalise duplicate content unless it appears to be deliberately deceptive. The only issue is which version appears in the search results.

'Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results. If your site suffers from duplicate content issues, and you don't follow the advice listed above, we do a good job of choosing a version of the content to show in our search results.'
https://support.google.com/webmasters/answer/66359?hl=en

Matt Cutts, Google's head of search spam, posted a video today about duplicate content and the repercussions of it within Google's search results. Matt said that somewhere between 25% and 30% of the content on the web is duplicative. Of all the web pages and content across the internet, over one-quarter of it is repetitive or duplicative. But Cutts says you don't have to worry about it: Google doesn't treat duplicate content as spam. It is true that Google only wants to show one of those pages in its search results, which may feel like a penalty if your content is not chosen, but it is not. Google takes all the duplicates and groups them into a cluster, then shows the best of the results in that cluster. Matt Cutts did say Google reserves the right to penalise a site that is excessively duplicating content in a manipulative manner, but overall, duplicate content is normal and not spam.
http://searchengineland.com/googles-matt-cutts-25-30-of-the-webs-content-is-duplicate-content-thats-okay-180063
http://searchengineland.com/googles-matt-cutts-duplicate-content-wont-hurt-you-unless-it-is-spammy-167459

Cheers"
-
I'm afraid your blog pages are in fact duplicate content, in Google's eyes anyway.
The /us/blog, /uk/blog and /ca/blog examples are all separate URLs that you are asking Google to index (separate canonical tags for each and no robots instructions that I can see). Google is going to look at these and any blog posts within them as separate pages. Once it realises they all have the same content, it will likely result in a Panda algorithmic penalty.
The risk here is that this penalty might affect your entire domain, rather than just the offending pages. I really don't see that as a risk worth taking. I therefore strongly advise removing the separate versions of the blogs and consolidating into one blog, with 301 redirects from the local blogs to the consolidated one. Failing that, choose one version and instruct Google not to index the other versions by using a meta robots noindex tag in your header. (A robots.txt disallow only blocks crawling, not indexing, so it is not a reliable way to keep the duplicate pages out of the index.)
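As a sketch, if you kept the UK blog as the indexable version, the meta robots tag for the /us/ and /ca/ pages (assuming you can edit their head templates) would look like this:

```html
<!-- In the <head> of /us/blog and /ca/blog only; the /uk/blog version stays indexable -->
<meta name="robots" content="noindex, follow" />
```

Using "noindex, follow" keeps the duplicates out of the index while still allowing crawlers to follow the links on them. One caveat: don't also block those URLs in robots.txt, because Google has to be able to crawl a page to see its noindex tag.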
I also advise that you noindex the category page to be sure that its content isn't being seen as duplicate either. More info on how to do that can be found in the Moz Robots Guide.
Related Questions
-
Duplicate store (subdomain) not ranking
I have a store ( www.grocare.com ) and I made a duplicate store recently ( in.grocare.com ) for a different region. Both have different currencies and target different regions. I even targeted the new store ( in.grocare.com ) to that particular country in google search console. They both have different href lang tags to mark different regions too. Now its been a month since this has been done. But the new store is not ranking in the region. The old one is still ranking and I have to redirect the traffic from old to new based on IP.
International SEO | grocare
I thought making a new store and targeting specifically would help with rankings. Am I doing something wrong here?
Duplicate Page Content due to Language and Currency
Hi folks, hoping someone can help me out, please. I have a site that I'd like to rank in France and the UK, but I'm getting a stack of duplicate content errors due to English and French pages and GBP and EUR prices. Below is an example of how the home page is duplicated: http://www.site.com/?sl=en?sl=fr
International SEO | Marketing_Today
http://www.site.com/?sl=fr?sl=fr
http://www.site.com
http://www.site.com/?currency=GBP?sl=fr
http://www.site.com/?currency=GBP?sl=en
http://www.site.com/?sl=fr?sl=en
http://www.site.com/?currency=EUR?sl=fr
http://www.site.com/?currency=EUR?sl=en
http://www.site.com/?currency=EUR
http://www.site.com/?sl=en&currency=EUR
http://www.site.com/?sl=en&currency=GBP
http://www.site.com/?sl=en
http://www.site.com/?currency=GBP
http://www.site.com/?sl=en?sl=en Each page has the following code in the <head> that updates according to the page you are on: How do I simplify this and what's the correct approach?
Multilanguage duplicate content question
I have the following situation: the first site is in four languages.
International SEO | nans
The second site is in one language. Let's say we have the following setup:
www.domain1.be/nl (Dutch)
www.domain1.be/fr (French)
www.domain1.be/en (English)
www.domain1.be/de (German)
www.domain2.be/ (French only)

The possible problem is the content on
www.domain1.be/fr
www.domain2.be
Content on domain2 is a copy of domain1/fr, so the French content is duplicated. For domain1, the majority (80%) of clients are Dutch-speaking; domain2 is 100% French.
Both companies operate in the same country, one in the north, the other in the south. Question: what about the duplicate content?
Can we 'fix' that by using the canonical tag? A canonical on domain1 (fr pages) pointing to domain2? Or vice versa?
Domain1 is more important than domain2, but customers of domain2 should not be pointed to domain1. Does anybody have any advice?
Multi-Country Duplicate Content
Hello, we have an ecommerce site that serves several countries on the same .com domain: US, UK and CA. We have duplicate content across these countries because they are all English-speaking, so there is little variance in the pages and they each sell most of the same products. We have implemented hreflang in our sitemaps, but we need to address the duplicate content. We were advised to canonicalize our UK and CA pages back to the duplicate US pages (our US pages account for the majority of our traffic and sales). This would cause the UK and CA pages to fall out of the index, but the visitor would still be taken to the correct country's page due to the hreflang. I'm leery about doing this because it is across countries. Is this OK to do? If not, how do we address the duplicate content, since they are not on their own ccTLDs?
International SEO | Colbys
Duplicate content or not ?
Hello, I would like your expert opinion. I have a site in Spanish for Spain and Mexico. As domain names, I have .es and .mx. This is the same site. We do not have any redirects, from .mx to .es for example. What is your opinion?
International SEO | android_lyon
If I declare targeting for Spain in Google Webmaster Tools (in settings) and, in another profile, for Mexico, do we have duplicate content? Thank you for your feedback. Sorry for my English, I'm French 😉
Freelancer.com: Same Content on Different TLD?
Take a look at freelancer.com and freelancer.in. Both have the same content. I checked for rel=canonical, and freelancer.in has one pointing to itself, not to the .com version. Both sites are indexed in Google as well. Do you think high-authority sites like Freelancer can get away with duplicate content?
International SEO | jombay
Fresh content has had a negative affect on SERPs
Hi there, I was ranking pretty well for highly competitive keywords without actually doing any link building (please see the graph attached), so I thought I had an opportunity here to get to page 1 for these keywords. The plan was to write fresh and original content for these pages, because hey, Google loves fresh content, right? Well, it seems not. After one week of these pages being re-written (21st Feb 2012), all of these pages dropped altogether. Please note: all the pages were under the same directory:
/health/flu/keyword-1
/health/flu/keyword-2
and so on...

I have compared both pages, as I have backups of the old content:
On average there are more words on each of the new pages compared to the previous pages
Lower bounce rate by at least 30% (via AdWords)
More time on site by at least 2 minutes (via AdWords)
More page visits (via AdWords)
Lower keyword density, on average 4% (new pages) compared to 9% (old content) across all pages

So since the end of February, these pages are still not ranked for these keywords. The funny thing is, these keywords are on page 1 of Bing. Another note: we launched an Irish version of the website using the exact same content. I have done all the checks via Webmaster Tools, making sure it's pointing to Ireland, and I have also got hreflang tags on both websites (just in case). If anyone can help with this, that would be very much appreciated. Thanks
International SEO | Paul78
Duplicated 404 Pages (Travel Industry)
Our website has created numerous "future pages" with no alt tag or class tag that are showing up as 404 pages. To make matters worse, they are causing duplicate 404 pages because we have different languages. The visitors can't find the 404s, but the search bots can. Would it be better to remove the links, add them to robots.txt, or add a nofollow/noindex tag? These are examples:
http://www.solmelia.com/nGeneral.LINK_FAQ
http://www.solmelia.com/nGeneral.LINK_HOTELESDESTINOS_BODAS
http://www.solmelia.com/nGeneral.LINK_CONDICIONES
http://www.solmelia.com/nGeneral.LINK_MAPSITE
http://www.solmelia.com/nGeneral.LINK_HOTELESDESTINOS_EMPRESA
International SEO | Melia