Why does SEOmoz Crawl show that I have 5,769 pages with duplicate content?
-
Hello...
I'm trying to do some analysis on my site (http://goo.gl/JgK1e) and SEOmoz Crawl Diagnostics is telling me that I have 5,769 pages with duplicate content.
Can someone, anyone, please help me understand:
-
How does SEOmoz determine whether I have duplicate content?
-
Is it correct? Are there really that many pages of duplicate content?
-
How do I fix this, if true? <---- ** Most important **
Thanks in advance for any help!!
-
-
Looks like it's now sorted!
Checked link: http://www.plasticstorage.com/
Type of redirect: 301 Moved Permanently
Redirected to: http://plasticstorage.com/
-
Hey Lavellester,
I believe we got our 302/301 issue resolved. Any way you can check through whatever tool you were using last week?
We are still trying to figure out how to fix the code creating the dupe content, but can you advise whether I fixed the redirect properly?
Thanks,
-
It will be related to the code/logic of your site creating the duplicate pages. You could work out why and update the code so you only have one page for each product, OR you could use the rel=canonical tag to resolve the issue.
Just a thought: as you appear to have so many duplicate pages, it may be quicker to look at the logic of the site and fix it all in one go.
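If you go the rel=canonical route, it's a one-line addition to the <head> of every duplicate variant of a product page, pointing at whichever URL you want search engines to index. A minimal sketch, assuming the short product URL (not the /catalog/product/view/... variants) is the one you want to rank:

```html
<!-- Placed in the <head> of every duplicate variant, e.g.
     /catalog/product/view/id/5079/s/305b1/category/100/ and .../category/101/.
     All variants then consolidate to the single preferred URL: -->
<link rel="canonical" href="http://plasticstorage.com/305b1.html" />
```

Google treats this as a strong hint, so the duplicate URLs can stay reachable for users while their ranking signals consolidate onto the canonical page.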
-
Lavellester, thanks for all of your help. I am going to have that redirect changed to a 301 ASAP.
Do you have any idea where that duplicate content is coming from since i only have that item in my database once?
Thanks
-
The redirect is still a 302:
Checked link: http://www.plasticstorage.com/
Type of redirect: 302 Moved Temporarily
Redirected to: http://plasticstorage.com
Change the type of redirect to a 301 - this is usually done at the server level.
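On an Apache server this is typically a couple of lines in the vhost config or .htaccess. A minimal sketch, assuming mod_rewrite is available (the exact mechanism depends on your host and server software):

```apache
# Permanently (301) redirect www.plasticstorage.com to plasticstorage.com,
# preserving the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.plasticstorage\.com$ [NC]
RewriteRule ^(.*)$ http://plasticstorage.com/$1 [R=301,L]
```

Without the explicit R=301 flag, mod_rewrite defaults to a 302, which is likely how the temporary redirect ended up there in the first place.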
-
Unless I'm mistaken they appear to be identical duplicate pages.
-
Little update:
In our GWT we have both the www version and the non-www version.
On the non-www property, we had set the preferred domain as the non-www.
On the www property, we also had set the preferred domain as the non-www.
Does this mean I've done it correctly or not?
Thanks
-
I am looking into the 301/302 issue right now and I will report back...
Here are the supposedly duplicate content pages:
plasticstorage.com/catalog/product/view/id/5079/s/305b1/category/100/
plasticstorage.com/catalog/product/view/id/5079/s/305b1/category/101/
Those pages, when entered into your browser, will appear identical... There is only one 305b1 item in the database. Somehow you can change the "100" or the "101" to "166666662" and you will still see that same "305b1" item... Can you please help me figure out how or why, and whether this is the "DUPLICATE CONTENT" or perhaps something else is being considered...
Thanks again
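One way to see why every category number produces the same page: if the /category/<n>/ segment doesn't change the content, then every value of <n> maps back to one product page. A rough sketch of that normalization (the regex is an assumption based on the URL pattern shown above, not Magento's actual routing code):

```python
import re

def canonical_product_url(url: str) -> str:
    """Strip the trailing /category/<id>/ segment from a Magento-style
    product URL, since it does not change the page content."""
    return re.sub(r"/category/\d+/?$", "/", url)

urls = [
    "plasticstorage.com/catalog/product/view/id/5079/s/305b1/category/100/",
    "plasticstorage.com/catalog/product/view/id/5079/s/305b1/category/101/",
    "plasticstorage.com/catalog/product/view/id/5079/s/305b1/category/166666662/",
]

# All three URLs collapse to a single canonical URL:
# one "real" page, arbitrarily many duplicate addresses.
canonical = {canonical_product_url(u) for u in urls}
print(canonical)
```

Multiply that by every product and every category (real or invented) and it is easy to reach thousands of reported duplicates from a catalog that contains each item only once.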
-
I'd still strongly recommend fixing the 301 redirect. The preferred-domain setting in GWT is only a soft preference.
Can you show 2 URLs that are deemed to be the same/duplicate?
-
We have set the canonical through Google to choose the non-www version.
Some examples of pages that are deemed duplicate:

| Page Title | URL | Other URLs | Page Authority | Linking Root Domains |
| Ultra-Clear InSight Bin - Stacking 10x5x5 305B1 Akro-Mils | http://plasticstorage.com/305b1.html | 50+ | 15 | 1 |
| Lids for Ultra-Clear InSight Bin - 305B1 - Akro-Mils 305B2 | http://plasticstorage.com/305b2.html?related=1 | 50+ | 1 | 0 |
| Dividers for Ultra-Clear InSight Bin - 305B1 - Akro-Mils 405B1 | http://plasticstorage.com/405b1.html?related=1 | 50+ | 1 | |
-
Hi there,
One issue is that your www.plasticstorage.com is a 302 redirect to plasticstorage.com. You should update this to a 301 redirect. This will be causing a few issues.
I think the SEOMoz tool shows content that it deems duplicate in the report. Can you post some examples of pages that are deemed duplicate?
Cheers.
-
I also had duplicate content. The cause was that I had two similar domains that both led to the same page. Mine were (1) www.msperformanceonline.com and (2) without the "www.", just msperformanceonline.com. Look and see if this is the cause for you.