My page says it has 16 errors, need help
-
My page says it has 16 errors, and all of them are due to duplicate content. How do I fix this? I believe it's only due to my meta description tag.
-
Glad to help
Hope it's an easy fix.
-
and apparently in unison!
-
You are very welcome and good luck!
-
HAHA! What can I say other than great minds think alike!
-
Thank you both! I will look into Dr. Pete's guide, pronto!
Cheers!
-
Hey Jake,
SNAP!
Sha
-
Hi Gajendra,
The Pro App identifies two types of duplicate errors (the red button in your Crawl Diagnostics Summary). These are Duplicate Page Content (a significant amount of content on the page has been identified as duplicate) and Duplicate Page Title (only the page title has been identified as duplicate).
Duplicate Page Title errors are most often an internal issue, where many pages in the site have been given the same page title.
Duplicate Page Content errors can be an internal issue, an external issue, or both. It may be that identical pages within the site are visible via multiple URLs, and/or the content on pages may be a duplicate of content on other websites. This happens a lot with sites that use product descriptions, content feeds, etc. from other sites.
To identify the actual URLs where duplicated content has been detected, click the red error button in the Diagnostics Summary and you will see a list of pages where the error has been identified. In the second column, you will see a blue link which tells you how many duplicates there are for that particular item. When you click the link you will see a full list of URLs which are duplicates.
There are a number of things which can cause duplicate content errors on a site - many are due to the way the site is structured or functions. To really understand what is happening and how to deal with it, you should read Dr. Pete Meyers' landmark post Duplicate Content in a Post-Panda World.
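To picture what a crawler is doing when it flags these errors, here is a toy sketch (not the Pro App's actual algorithm, and the URLs are made up for illustration) of one common approach to duplicate-content detection: group pages by a hash of their normalized body text, and treat any group containing more than one URL as a duplicate cluster.

```python
# Toy duplicate-content detector: pages whose normalized text hashes to
# the same value are grouped together. Illustrative only.
import hashlib
from collections import defaultdict

def find_duplicate_groups(pages):
    """pages: dict mapping url -> page text.
    Returns sorted URL lists for every group of 2+ identical pages."""
    groups = defaultdict(list)
    for url, content in pages.items():
        # Collapse whitespace and case so trivial differences don't hide duplicates.
        normalized = " ".join(content.split()).lower()
        digest = hashlib.sha256(normalized.encode("utf-8")).hexdigest()
        groups[digest].append(url)
    return [sorted(urls) for urls in groups.values() if len(urls) > 1]

crawl = {
    "http://example.com/widgets": "Widgets   for sale",
    "http://example.com/widgets/": "widgets for sale",
    "http://example.com/about": "About our company",
}
print(find_duplicate_groups(crawl))  # the two /widgets URLs form one cluster
```

The same idea scales up with fuzzier matching (shingling, similarity thresholds) so that near-duplicates are caught, not just exact copies.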
Hope that helps,
Sha
-
My advice would be to dive deep into the campaign you have running for your page and check what is causing the issue. It could be that your URLs aren't normalized, resulting in a separate copy of the page for each URL variation. This can be solved a few different ways, but unfortunately I am not quite sure what is causing the duplicate content from the information you have provided. Dr. Pete has put together a fantastic Duplicate Content Guide I would recommend checking out. He goes over each variation and some great ways to deal with duplicate content issues.
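As an illustration of the "one copy per URL variation" problem: tracking parameters and stray trailing slashes create many addresses for a single page. A minimal normalization sketch, assuming the common Google Analytics parameter names (`utm_*`, `_ga`) - adjust the lists for whatever your site actually appends:

```python
# Collapse URL variants that differ only by tracking parameters or a
# trailing slash. Parameter names are the common GA ones, illustrative only.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PARAMS = ("_ga",)      # exact-match parameters to drop
TRACKING_PREFIXES = ("utm_",)   # prefix-match parameters to drop

def canonicalize(url: str) -> str:
    parts = urlsplit(url)
    kept = [
        (k, v)
        for k, v in parse_qsl(parts.query, keep_blank_values=True)
        if k not in TRACKING_PARAMS and not k.startswith(TRACKING_PREFIXES)
    ]
    # Normalize the stray trailing slash as well.
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((parts.scheme, parts.netloc, path, urlencode(kept), ""))

variants = [
    "http://www.example.com/page/?utm_source=blog&utm_medium=organic",
    "http://www.example.com/page?_ga=1.2.3.4",
    "http://www.example.com/page",
]
print({canonicalize(u) for u in variants})  # all three collapse to one URL
```

In practice you would pair this kind of normalization with a rel=canonical tag on the page itself, so search engines consolidate the variants the same way.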
Related Questions
-
Subdomain 403 error
Hi Everyone, A crawler from our SEO tool detects a 403 error on a link from our main domain to a couple of subdomains. However, these subdomains are perfectly accessible. What could be the problem? Is this error caused by the server, the crawl bot, or something else? I would love to hear your thoughts.
Technical SEO | WeAreDigital_BE | Jens
-
Site Crawl -> Duplicate Page Content -> Same pages showing up with duplicates that are not
These, for example:
| https://im.tapclicks.com/signup.php/?utm_campaign=july15&utm_medium=organic&utm_source=blog | 1 | 2 | 29 | 2 | 200 |
| https://im.tapclicks.com/signup.php?_ga=1.145821812.1573134750.1440742418 | 1 | 1 | 25 | 2 | 200 |
| https://im.tapclicks.com/signup.php?utm_source=tapclicks&utm_medium=blog&utm_campaign=brightpod-article | 1 | 119 | 40 | 4 | 200 |
| https://im.tapclicks.com/signup.php?utm_source=tapclicks&utm_medium=marketplace&utm_campaign=homepage | 1 | 119 | 40 | 4 | 200 |
| https://im.tapclicks.com/signup.php?utm_source=blog&utm_campaign=first-3-must-watch-videos | 1 | 119 | 40 | 4 | 200 |
| https://im.tapclicks.com/signup.php?_ga=1.159789566.2132270851.1418408142 | 1 | 5 | 31 | 2 | 200 |
| https://im.tapclicks.com/signup.php/?utm_source=vocus&utm_medium=PR&utm_campaign=52release |
Any suggestions/directions for fixing, or should I just disregard this "High Priority" Moz issue? Thank you!
Technical SEO | writezach
-
404 Errors for Form Generated Pages - No index, no follow or 301 redirect
Hi there, I wonder if someone can help me out and provide the best solution for a problem with form-generated pages. I have blocked the search results pages from being indexed by using the 'noindex' tag, and I wondered if I should take this approach for the following pages. I have seen a huge increase in 404 errors since the new site structure went live and forms started being filled in. This is because every time a form is filled in, it generates a new page, which only Google Search Console is reporting as a 404. Whilst some 404s can be explained and resolved, I wondered what is best to prevent Google from crawling these pages, like this: mydomain.com/webapp/wcs/stores/servlet/TopCategoriesDisplay?langId=-1&storeId=90&catalogId=1008&homePage=Y
1. Implement 301 redirects using rules, which will mean that all these pages redirect to the homepage. Whilst in theory this will protect any linked-to pages, it does not resolve the issue of why GSC is recording 404s in the first place. It could also come across to Google as 100,000+ redirected links, which might look spammy.
2. Place a noindex tag on these pages too, so they will not get picked up, in the same way the search result pages are not being indexed.
3. Block in robots.txt - this will prevent any 'result' pages being crawled, which will improve the crawl time currently being taken up. However, I'm not entirely sure if the block will be possible? I would need to block anything after the domain/webapp/wcs/stores/servlet/TopCategoriesDisplay?. Hopefully this is possible?
The noindex tag will take time to set up, as it needs to be scheduled with the development team, but the robots.txt change is a quicker fix as this can be done in GSC. I really appreciate any feedback on this one. Many thanks
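On the robots.txt idea: a Disallow rule is a prefix match, so blocking the servlet path also covers every query-string variation of that URL. A sketch using the path quoted above (note two caveats: robots.txt lives on your server and GSC only lets you test it, not deploy it; and a crawl block also stops Google from ever seeing a noindex tag on those pages):

```
User-agent: *
Disallow: /webapp/wcs/stores/servlet/TopCategoriesDisplay
```

Because of the second caveat, choose either the robots.txt block or the noindex tag for a given set of URLs, not both.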
Technical SEO | Ric_McHale
-
Indexing page content that is not needed
Hi All, I have a site that has articles and a side block that shows interesting articles in a column. When we google for a keyword, I can see the page, but the meta description is picked from the side block "interesting articles" and not from the actual article on the page. How can I prevent that block alone from being indexed? Thanks
Technical SEO | jomin74
-
Need help please with URL guidelines.
Hi SEO pros, I have a website and I am planning to change all the URLs. I need to know the right way of constructing them. Here is some information: we are based in Brooklyn, NY and we sell our services to Manhattan clients, and Manhattan goes by a few names: NY, NYC and Manhattan. So by looking at my service area I came up with this URL. http://www.signsny.com/brooklyn-ny/awnings is my current URL. http://www.signsny.com/sign-types/awnings-canopies-brooklyn-NYC is what I am planning to change it to. Please guide me in the right direction, so in future I don't have to redo them again. Thanks Abie
Technical SEO | signsny
-
Need some help: PageRank N/A
Hi All! We have a small e-commerce site, www.ruggedpcstore.com, built on WordPress. The home page has a PR of 3, our About page is a PR 0, and all other pages are PR N/A. We've been pulling our hair out trying to fix what's wrong. Any ideas?
Technical SEO | CsmBill
-
I am trying to correct an error report of duplicate page content, but in over 100 blogs I am unable to find the page which contains content similar to the page SEOmoz reported. Is my only option to just delete the blog page?
I am trying to correct duplicate content. However, SEOmoz only reports and shows the page with duplicate content. I have 5 years' worth of blogs and cannot find the duplicate page. Is my only option to just delete the page to improve my rankings? Brooke
Technical SEO | wianno168
-
Does it really matter to set up 301 redirects for not-found error pages?
I have a very simple question about not-found error pages. Is it really necessary to set up 301 redirects for all the not-found error pages detected in Google Webmaster Tools? Honestly, I only want to set up 301 redirects for pages that have external links pointing to them. So, what impact will this approach have on rankings?
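If you do end up redirecting selected not-found URLs rather than all of them, a per-URL mapping in Apache looks something like this (paths and domain are hypothetical; a blanket rule sending every 404 to the homepage is generally what you want to avoid):

```
# .htaccess (Apache mod_alias): map one retired URL to its closest live equivalent
Redirect 301 /old-page https://www.example.com/new-page
```

Each redirect should point to the most relevant live page, not just the homepage, so the link equity and user intent are preserved.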
Technical SEO | CommercePundit