Duplicate content errors
-
Hi
I am getting duplicate content errors in my crawl report for some of our products:
www.....com/brand/productname1.html
www.....com/section/productname1.html
www.....com/productname1.html
We have a canonical tag in the head of all three pages:
<link rel="canonical" href="www....com/productname1.html" />
-
Hi
I'm still getting duplicate content errors in my campaign reports. Any ideas?
Thanks
-
Hi
Sorry if I was not clear. I added the canonical tags after the first week's scan, but these errors have still appeared.
Thanks, Paul
-
If the errors were fixed after the last weekly crawl, you will not see the changes reflected until the next week's crawl data comes in.
-
This campaign has been running for three weeks and I have corrected many of the errors; in fact, I thought I had corrected all of them. I was waiting for this week's crawl to confirm the errors were fixed so that I could report back to my client, but now I have these new errors.
-
I'd love to help get an answer for you! This is Abe from the Moz Help Team. Were these changes made after your weekly crawl took place?
-
Well, if everything is correct in the head, then technically you don't have a problem. You're not really waiting for something to clear up; you're just waiting for Moz to catch up. If you could post one of the URLs here, I could provide a second set of eyes to double-check your canonical.
-
Right, that's a bit painful. I need to wait a week to see if it clears itself.
-
Occasionally I've had issues where Moz doesn't seem to recognize the canonical right away (or sometimes randomly forgets the canonical is there when it's seen it before) and reports duplicate content to me. It's ultimately always cleared itself up, though. Double-check that your canonicals are definitely written correctly, placed in the right spot, and pointing to the right URL. If they are, I'd say wait and see if Moz fixes itself on the next crawl.
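If you'd rather not wait a week for the next crawl, you can check the canonicals yourself with a small script. Here's a minimal sketch using only Python's standard library; the example.com URL and the extract_canonical helper are hypothetical, just to illustrate pulling rel="canonical" out of a page's HTML:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects href values from <link rel="canonical"> tags."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            attr_map = dict(attrs)
            if attr_map.get("rel", "").lower() == "canonical" and attr_map.get("href"):
                self.canonicals.append(attr_map["href"])

def extract_canonical(html_text):
    """Return every canonical URL declared in the page's markup."""
    parser = CanonicalFinder()
    parser.feed(html_text)
    return parser.canonicals

# All three duplicate URLs should declare the same canonical.
page = '<html><head><link rel="canonical" href="https://www.example.com/productname1.html" /></head></html>'
print(extract_canonical(page))  # → ['https://www.example.com/productname1.html']
```

Fetch each of the three product URLs, run the HTML through extract_canonical, and confirm every page declares exactly one canonical pointing at the same absolute URL. If they do, the markup is fine and it really is just a matter of waiting for the crawler.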