I am trying to correct an error report of duplicate page content. However, across more than 100 blog posts I am unable to find the page that contains content similar to the page SEOmoz reported. Is my only option to just delete the blog page?
-
I am trying to correct duplicate content. However, SEOmoz only reports and shows the page that contains the duplicate content.
I have five years' worth of blog posts and cannot find the duplicate page.
Is my only option to just delete the page to improve my rankings?
Brooke
-
You can also confirm duplicate content (titles, descriptions and URLs) with either:
- Google Webmaster Tools - it will tell you about most of the errors you need to take action on
- Screaming Frog SEO Spider
You can also check out this post I just did about WordPress SEO to get some more in-depth info on finding duplicate content issues with WordPress blogs (I assume you're on WordPress).
Thanks!!
-Dan
-
I have always published multiple updated blog posts with new information under the same individual category.
I know Google likes current information; however, do I need to create a new category when I want to write new information about the same topic?
-
If you have lots of duplicate content, chances are your URL structure is causing it, in which case the problem will return with every blog post, so deleting is probably not the answer.
Check the obvious things, such as whether it is caused by having a www and non-www version, or whether pages are accessible via different capitalisation, and fix any related issues.
The other common things to check are tag and category pages, which can easily end up looking exactly the same. Fix what you can before deleting the content, or else you risk masking the problem.
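As a rough illustration (not specific to your site), the www/non-www issue is usually fixed with a single 301 rule in .htaccess, assuming an Apache server and using example.com as a placeholder domain:
RewriteEngine On
# Send any request for the bare domain to the www version with a 301 (placeholder domain)
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]
Capitalisation duplicates are usually better handled by keeping internal links consistent and adding a rel=canonical tag pointing at the preferred version of each page, rather than trying to rewrite every variant.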
Related Questions
-
Blog archive pages are meta noindexed but still flagged as duplicate
Hi all. I know there are several threads related to noindexing blog archives and category pages, so if this has already been answered, please direct me to that post. My blog archive pages have preview text from the posts. Each time I publish a post, the last post on any given archive page shifts to the first spot on the next archive page. Moz seems to report these as new duplicate content issues each week. I have my archive pages set to meta noindex, so can I feel good about continuing to ignore these duplicate content issues, or is there something else I should be doing to prevent penalties? TIA!
Technical SEO | | mkupfer1 -
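For reference, the meta noindex described in that question is normally a single tag in the <head> of each archive page (usually output by the theme or an SEO plugin); a minimal sketch:
<!-- on archive/category pages only: keep them out of the index but let crawlers follow the links -->
<meta name="robots" content="noindex, follow" />
With that tag in place and respected, duplicate warnings for those archive pages in a crawl report are generally not a ranking concern, since the pages are not competing in the index.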
Is it possible to deindex old URLs that contain duplicate content?
Our client is a recruitment agency and their website used to contain a substantial amount of duplicate content, as many of the listed job descriptions were repeated and recycled. As a result, their rankings rarely progress beyond page 2 on Google. Although they have started using more unique content for each listing, it appears that old job listing pages are still indexed, so our assumption is that Google is holding down the ranking due to the amount of duplicate content present (one tool returned a score of 43% duplicate content across the website). Looking at other recruitment websites, it appears that they block the actual job listings via the robots.txt file. Would blocking the job listing pages from being indexed, either via robots.txt or a noindex tag, reduce the negative impact of the duplicate content, but also remove any link juice coming to those pages? In addition, expired job listing URLs stay live, which is likely to be increasing the overall duplicate content. Would it be worth removing these pages and setting up 404s, given that any links to these pages would be lost? If these pages are removed, is it possible to permanently deindex these URLs? Any help is greatly appreciated!
Technical SEO | | ClickHub-Harry0 -
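For context on the two options mentioned in that question (the path below is a placeholder, not the client's actual URL structure): a robots.txt block stops the listing pages being crawled at all, so links pointing at them pass on little value, for example:
User-agent: *
Disallow: /job-listings/
whereas a meta robots tag on each expired or duplicated listing lets the page be crawled and its links followed while dropping it from the index:
<meta name="robots" content="noindex, follow" />
Removed pages can also return 410 (gone) rather than 404, which generally encourages somewhat faster deindexing.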
Duplicate Content Mystery
Hi Moz community! I have an ongoing duplicate content mystery here and I'm hoping someone can answer my question. We have an ecommerce site that has a variety of product pages and category pages. There are rel canonicals in place, along with parameters in GWT, and there are also URL rewrites. Here are some scenarios; maybe you can give insight as to what exactly is going on and how to fix it. All the duplicates look to be coming from category pages specifically. For example:
Technical SEO | | Ecom-Team-Access
This link: http://www.incipio.com/cases/tablet-cases/amazon-kindle-cases-sleeves.html?cat=407&color=152&price=20-
rewrites to: http://www.incipio.com/cases/tablet-cases/amazon-kindle-cases-sleeves.html
The rel canonical tag looks like this: <link rel="canonical" href="http://www.incipio.com/cases/tablet-cases/amazon-kindle-cases-sleeves.html" />
The CONTENT is different, but the URLs are the same. It thinks that the product category view is the same as the all-products view, even though there is a canonical in there telling it which one is the original. Some of them don't have anything to do with each other. Take a look:
Link identified as duplicate: http://www.incipio.com/cases/smartphone-cases/htc-smartphone-cases/htc-windows-phone-8x-cases.html?color=27&price=20-
Link it is a duplicate of: http://www.incipio.com/cases/macbook-cases/macbook-pro-13in-cases.html
Any idea as to what could be happening here? -
Why is Copyscape showing a content duplication error even after implementing a 301 redirect?
We are maintaining the corporate website of one of our prestigious clients, "FineTech Toolings" (http://www.finetechtoolings.in). Recently I raised a question regarding "2 websites running in parallel on 2 different domains, i.e. 1 organisation having 2 different websites on 2 different domains". Recently my domain changed from http://www.finetechtoolings.co.in to http://www.finetechtoolings.in via a 301 redirect, but I am still facing a content duplication issue as per Copyscape. Hence I have a small doubt regarding the same. Please note the following question carefully and provide me the exact problem and the solution for the same: even though I have implemented the 301 redirect (http://www.finetechtoolings.co.in is redirected to http://www.finetechtoolings.in), which is completely OK as per the SEO rules, why is Copyscape still showing that duplicate content exists on the former website? I think I am clear enough with my question.
Technical SEO | | KDKini0 -
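For reference, a domain-level 301 of the kind described there is commonly done in the old domain's .htaccess; this is a generic sketch of that pattern (Apache assumed), not the configuration actually running on that site:
RewriteEngine On
# Send every request on the old domain to the same path on the new domain
RewriteCond %{HTTP_HOST} ^(www\.)?finetechtoolings\.co\.in$ [NC]
RewriteRule ^(.*)$ http://www.finetechtoolings.in/$1 [L,R=301]
If the redirect is in place, Copyscape may simply still be seeing the old URLs in the index it checks against; it can take some time after a 301 before those drop out.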
Duplicate content on report
Hi, I just had my Moz campaign scan 10K pages, of which 2K were flagged as duplicate content, and the URLs are:
http://www.Somesite.com/modal/register?destination=question%2F37201
http://www.Somesite.com/modal/register?destination=question%2F37490
The title for all 2K pages is "Register". How can I deal with this? All my pages have the register and login links, and after registering or logging in the visitor comes back to the page they left, so these are not actually duplicates, but we still need to deal with them properly. Thanks
Technical SEO | | mtthompsons0 -
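For context, URLs like that register modal are usually kept out of crawl reports and the index either by disallowing the path or by noindexing the page; a rough sketch using the path from the question:
# robots.txt - stop crawlers fetching the register modal and its many ?destination= variations
User-agent: *
Disallow: /modal/register
Alternatively, a <meta name="robots" content="noindex, follow" /> tag on the register page itself, or a rel=canonical pointing at a single register URL, consolidates the variations without blocking crawling.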
Duplicate page content
Hello, My site is being checked for errors by the PRO dashboard thing you get here, and some odd duplicate page content errors have appeared. Every page has a duplicate because you can see both the page and the page at /~username, so www.short-hairstyles.com is the same as www.short-hairstyles.com/~wwwshor. I don't know if this is a problem or how the crawler found this (I'm sure I have never linked to it), but I'd like to know how to prevent it in case it is a problem, if anyone knows, please? Ian
Technical SEO | | jwdl0 -
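For reference, the /~username pattern in that question usually comes from the hosting account's temporary URL resolving to the same site. A hedged sketch of one common fix, assuming an Apache host where the ~ path is served from the same document root (account name taken from the question):
RewriteEngine On
# If the request came in via the hosting account's /~wwwshor path, 301 it to the normal URL
RewriteCond %{REQUEST_URI} ^/~wwwshor(/.*)?$ [NC]
RewriteRule ^ http://www.short-hairstyles.com%1 [R=301,L]
Whether this works depends on how the host maps the ~ URL; a rel=canonical tag pointing at the www URL on every page is the safer belt-and-braces option.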
SEOMoz Crawl Diagnostic indicates duplicate page content for home page?
My first SEOMoz Crawl Diagnostic report for my website indicates duplicate page content for my home page. It lists the home page's page title and URL twice. How do I go about diagnosing this? Is the problem related to the following code in my .htaccess file? (The purpose of the code was to redirect any non-"www" backlink referrals to the "www" version of the domain.) RewriteCond %{HTTP_HOST} ^whatever.com [NC]
Technical SEO | | Linesides
RewriteRule ^(.*)$ http://www.whatever.com/$1 [L,R=301] Should I get rid of the "http" reference in the second line? Related to this is a notice in the "Crawl Notices Found" -- "301 Permanent redirect" -- which shows my home page title as "http://whatever.com" and shows the redirect address as http://http://www.whatever.com/. I'm guessing this problem is again related to the redirect code I'm using. Also, the report indicates duplicate content for those links that have different parameters added to the URL, i.e. http://www.whatever.com?marker=Blah Blah&markerzoom=13. If I set up a canonical reference for the page, will this fix this? Thank you.
Duplicate Content Issue
Hello, We have many pages in our crawler report that are showing duplicate content. However, the content is not duplicated on the pages. It is somewhat close, but different. I am not sure how to fix the problem so it leaves our report. Here is an example. It is showing these as duplicates of each other: www.soccerstop.com/c-119-womens.aspx www.soccerstop.com/c-120-youth.aspx www.soccerstop.com/c-124-adult.aspx Any help you could provide would be most appreciated. I am going through our crawler report and resolving issues, and this seems to be a big one for us, with lots in the report, but I am not sure what to do about it. Thanks
Technical SEO | | SoccerStop
James