My page says it has 16 errors, need help
-
My page says it has 16 errors, and all of them are due to duplicate content. How do I fix this? I believe it's only due to my meta description tag.
-
Glad to help
Hope it's an easy fix.
-
and apparently in unison!
-
You are very welcome and good luck!
-
HAHA! What can I say other than great minds think alike!
-
Thank you both! I will look into Dr. Pete's guide, pronto!
Cheers!
-
Hey Jake,
SNAP!
Sha
-
Hi Gajendra,
The Pro App identifies two types of duplicate Errors (the red button in your Crawl Diagnostics Summary). These are Duplicate Page Content (a significant amount of content on the page has been identified as duplicate) and Duplicate Page Title (only the page title has been identified as duplicate).
Duplicate Page Title errors are most often an internal issue, where many pages in the site have been given the same page title.
Duplicate Page Content errors can be an internal issue, an external issue, or both. It may be that identical pages within the site are visible via multiple URLs, and/or the content on your pages may duplicate content on other websites. This happens a lot with sites that use product descriptions, content feeds, etc. from other sites.
To identify the actual URLs where duplicate content has been detected, click the red error button in the Diagnostics Summary and you will see a list of pages where the error has been identified. In the second column you will see a blue link which tells you how many duplicates there are for that particular item. When you click the link you will see a full list of the URLs which are duplicates.
There are a number of things which can cause duplicate content errors on a site - many are due to the way the site is structured or functions. To really understand what is happening and how to deal with it, you should read Dr. Pete Meyers' landmark post, Duplicate Content in a Post-Panda World.
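As one example of the kind of fix Dr. Pete's post covers: if the same page is reachable at several URLs, a rel="canonical" tag can point every variant at your preferred version. This is just a minimal sketch with made-up URLs, not your actual pages:

```html
<!-- Placed in the <head> of every variant URL, e.g.
     http://www.example.com/widgets?sort=price and
     http://www.example.com/widgets/print -->
<link rel="canonical" href="http://www.example.com/widgets" />
```

Search engines then treat the variants as a single page and consolidate ranking signals on the canonical URL.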
Hope that helps,
Sha
-
My advice would be to dive deep into the campaign you have running for your page and check what is causing the issue. It could be that your URLs aren't normalized, resulting in a copy of the page for each URL variation. This can be solved a few different ways, but unfortunately I am not quite sure what is causing the duplicate content from the information you have provided. Dr. Pete has put together a fantastic Duplicate Content Guide that I would recommend checking out. He goes over each variation and some great ways to deal with duplicate content issues.
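If it does turn out to be URL variations - and this is only a guess without seeing your site - one common variation is the same page resolving on both the www and non-www hostname. A sketch of how you might collapse that with a 301 redirect, assuming an Apache server with mod_rewrite enabled (example.com is a placeholder for your domain):

```apache
# Hypothetical .htaccess rule: force a single hostname so each
# page has only one crawlable URL.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Other variations (trailing slashes, tracking parameters, session IDs) have similar server-side or canonical-tag fixes, which the guide walks through.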