Why are these pages considered duplicate content?
-
I have a duplicate content warning in our PRO account (well, several, really), but I can't figure out WHY these pages are considered duplicate content.
They have different H1 headers and different sidebar links, and while a couple are relatively scant as far as content goes (so I could believe those might be seen as duplicates), the others seem to have a substantial amount of content that is different. It is a little perplexing.
Can anyone help me figure this out?
Here are some of the pages that are showing as duplicate:
http://www.downpour.com/catalogsearch/advanced/byNarrator/narrator/Seth+Green/?bioid=5554
http://www.downpour.com/catalogsearch/advanced/byAuthor/author/Solomon+Northup/?bioid=11758
http://www.downpour.com/catalogsearch/advanced/byNarrator/?mediatype=audio+books&bioid=3665
http://www.downpour.com/catalogsearch/advanced/byAuthor/author/Marcus+Rediker/?bioid=10145
http://www.downpour.com/catalogsearch/advanced/byNarrator/narrator/Robin+Miles/?bioid=2075
-
Hey Jay,
I checked two of the pages, http://www.downpour.com/catalogsearch/advanced/byNarrator/narrator/Seth+Green/?bioid=5554 and http://www.downpour.com/catalogsearch/advanced/byAuthor/author/Solomon+Northup/?bioid=11758, against each other in a duplicate content checker (http://www.webconfs.com/similar-page-checker.php), and they returned a similarity percentage of 67%, which we definitely shouldn't be flagging as duplicate. (We consider pages at 90% similarity or more to be dupes.)
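For anyone curious how a similarity percentage like that 67% figure can be produced, here is a rough sketch using only the Python standard library. It compares raw HTML with difflib, which is not the method webconfs.com (or Moz) uses, so treat the output as illustrative rather than comparable to their scores.

```python
# Rough sketch only: compares the raw HTML of two pages with difflib from the
# Python standard library. This is NOT the algorithm webconfs.com or Moz use,
# so the number is illustrative, not comparable to their scores.
import difflib
import urllib.request

def fetch_html(url):
    # A real checker would strip navigation, markup, and boilerplate first.
    with urllib.request.urlopen(url, timeout=15) as resp:
        return resp.read().decode("utf-8", errors="ignore")

def similarity_pct(url_a, url_b):
    a, b = fetch_html(url_a), fetch_html(url_b)
    # Ratio of matching character runs between the two documents, as a percentage.
    return difflib.SequenceMatcher(None, a, b).ratio() * 100

if __name__ == "__main__":
    pct = similarity_pct(
        "http://www.downpour.com/catalogsearch/advanced/byNarrator/narrator/Seth+Green/?bioid=5554",
        "http://www.downpour.com/catalogsearch/advanced/byAuthor/author/Solomon+Northup/?bioid=11758",
    )
    print(f"Similarity: {pct:.0f}% (we'd only flag pages at 90% or more)")
```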
I went to check on your crawl to see if it might be a bug. It looks like the number of duplicate content errors has gone down a lot with the crawl that took place today, and none of these pages are included as duplicates, so it may have been a temporary bug. If you see these pages counted as duplicates again, please let us know so that we can look into it further.
Hopefully, this helps!
Chiaryn
-
Beautiful, I will try it out!
-
A decent free tool for checking internal site duplication is siteliner.com. It is made by Copyscape, I believe, and is quite helpful for any duplicate content concerns.
-
If we ever meet I will gladly buy!
Thanks!
-
These pages aren't duplicates at all. I wouldn't worry about it. The SEOmoz crawl tool isn't perfect, and you can rest assured that Google won't consider these pages duplicate content.
You owe me a Coke.
Related Questions
-
SEO - Massive duplication of same page, but different link.
Hi!
I'm dealing with a big client whose site has a big (approx. 39,000) duplication of the "same" page (same content), but each page has a different URL. The duplicated page is a "become a member" page.
I've checked the backlinks in Google Search Console and there are no sites linking to any of the duplicated pages.
The developers have no clue where or how the pages came to be duplicated, but my guess is that every time a new customer sets up an account the page becomes duplicated. The customer wants us to just remove the pages and sort out the duplication, but removing the pages might cause a big drop in backlinks/traffic and whatnot. I would much rather redirect the duplicated pages to the original page, but given that there are 39,000 pages it might mess with the site speed. I'm looking for ideas and suggestions on what the next step should be: remove or redirect.
Thanks so much!
Intermediate & Advanced SEO | | jennisprints
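For illustration only, since the real URLs and the original "become a member" page aren't given above: if the redirect route is chosen, a spot-check like the sketch below can confirm that a sample of the duplicated URLs now return a 301 pointing at the original page. Every URL in it is a made-up placeholder, and it assumes the third-party requests library is installed.

```python
# Hypothetical spot-check, assuming the redirect option is chosen: confirm that
# a sample of the duplicated URLs now return a 301 pointing at the original
# "become a member" page. All URLs below are made up; the real ones aren't
# known from the question. Requires the third-party "requests" library.
import requests

CANONICAL = "https://www.example.com/become-a-member"  # assumed original page

SAMPLE_DUPLICATES = [
    "https://www.example.com/become-a-member?session=abc123",  # hypothetical
    "https://www.example.com/become-a-member/copy-2",           # hypothetical
]

def check(url):
    resp = requests.head(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and location.rstrip("/") == CANONICAL.rstrip("/")
    return resp.status_code, location, ok

for url in SAMPLE_DUPLICATES:
    status, location, ok = check(url)
    print(f"{'OK  ' if ok else 'FAIL'} {status} {url} -> {location or '(no redirect)'}")
```
-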
HELP! How does one prevent regional pages from being counted as "duplicate content," "duplicate meta descriptions," et cetera...?
The organization I am working with has multiple versions of its website geared towards different regions:
US - http://www.orionhealth.com/
CA - http://www.orionhealth.com/ca/
DE - http://www.orionhealth.com/de/
UK - http://www.orionhealth.com/uk/
AU - http://www.orionhealth.com/au/
NZ - http://www.orionhealth.com/nz/
Some of these sites have very similar pages, which are registering as duplicate content, meta descriptions and titles. Two examples are:
http://www.orionhealth.com/terms-and-conditions
http://www.orionhealth.com/uk/terms-and-conditions
Now, even though the content is the same, the navigation is different, since each region has different product options/services, so a redirect won't work: the navigation on the main US site is different from the navigation for the UK site. A rel=canonical seems like a viable option, but (correct me if I'm wrong) it tells search engines to only index the main page, which in this case would be the US version, and I still want the UK site to appear to search engines. So what is the proper way of treating similar pages across different regional directories? Any insight would be GREATLY appreciated! Thank you!
Intermediate & Advanced SEO | | Scratch_MM
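As a side note, a quick way to see how regional copies like these currently reference each other is to pull the canonical tag, if any, from each one. The sketch below uses the two example URLs from the question; it only reports what is there today and does not decide what the canonical should be.

```python
# Small audit sketch using the two example URLs from the question: fetch each
# regional copy and print any <link rel="canonical"> it declares. It only
# reports what is there today; it doesn't decide what the canonical should be.
import re
import urllib.request

PAGES = [
    "http://www.orionhealth.com/terms-and-conditions",
    "http://www.orionhealth.com/uk/terms-and-conditions",
]

LINK_TAG_RE = re.compile(r"<link\b[^>]*>", re.IGNORECASE)
HREF_RE = re.compile(r'href=["\']([^"\']+)["\']', re.IGNORECASE)

for url in PAGES:
    try:
        html = urllib.request.urlopen(url, timeout=15).read().decode("utf-8", "ignore")
    except Exception as exc:
        print(f"{url}: fetch failed ({exc})")
        continue
    canonicals = []
    for tag in LINK_TAG_RE.findall(html):
        if "canonical" not in tag.lower():
            continue
        href = HREF_RE.search(tag)
        if href:
            canonicals.append(href.group(1))
    print(f"{url}: canonical = {', '.join(canonicals) or '(none declared)'}")
```
-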
Joomla Duplicate Page content fix for mailto component?
Hi, I am currently working on my site and have the following duplicate page content issues:
My Uni Essays http://www.myuniessays.co.uk/component/mailto/?tmpl=component&template=it_university&link=2631849e33
My Uni Essays http://www.myuniessays.co.uk/component/mailto/?tmpl=component&template=it_university&link=2edd30f8c6
This happens 15 times. Any ideas on how to fix this, please? Thank you.
Intermediate & Advanced SEO | | grays0180
-
Duplicate Content/ Indexing Question
I have a real estate WordPress site that uses an IDX provider to add real estate listings to my site. A new page is created as a new property comes to market, and then the page is deleted when the property is sold. I like the functionality of the service, but it creates a significant number of 404s, and I'm also concerned about duplicate content, because anyone else using the same service here in Las Vegas will have thousands of the exact same property pages that I do. Any thoughts on this, and is there a way that I can have the search engines only index the core 20 pages of my site and ignore future property pages? Your advice is greatly appreciated. See this link for an example: http://www.mylvcondosales.com/mandarin-las-vegas/
Intermediate & Advanced SEO | | AnthonyLasVegas
-
Will this pose duplicate content?
I have a domain, let's say abcshoesonlinestore.com, and the inside pages of abcshoesonlinestore.com are ranking very well, such as the affiliate page, the knowledge base page, and other pages. However, I would like to move my home page and product page to a shorter URL, abcshoes.com, and keep the inside pages, like www.abcshoesonlinestore.com/affiliate or www.abcshoesonlinestore.com/knowledgebase, as they are. Will this pose a duplicate content problem?
This is my plan: the home page and product page will be www.abcshoes.com, and when people click www.abcshoes.com/affiliate it will 301 redirect to abcshoesonlinestore.com/affiliate. However, if someone types abcshoesonlinestore.com or abcshoesonlinestore.com/product, it will redirect to abcshoes.com or its product page (I want to use a 302 instead of a 301, assuming that if the home page or product page has a manual penalty or anything bad, we want to leave it behind and start fresh; I read some posts saying that a 301 will carry any bad thing over to the new site too).
The reason I do not want to 301 from abcshoesonlinestore.com to abcshoes.com is that many of those pages are ranking in the top 3 in Google (I worry we will lose these rankings, since they bring traffic for us).
Is this a good idea or a bad idea, is there a better idea, or should I just try it and see the outcome? 🙂 My only concern is that going from abcshoesonlinestore.com to abcshoes.com will pose duplicate content if I do not use a 301. Or can I use Google Webmaster Tools to remove the home page and product page for abcshoesonlinestore.com? Can we tell Google that?
PS: The home page and product page will have new, revised content and minor design changes, but the inside pages will keep the same design. Please give me some advice.
Intermediate & Advanced SEO | | owen2011
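Because the plan above is easy to misread, here is a sketch that simply restates it as explicit data, so each hop and status code can be reviewed before anything is implemented. Every domain, path, and status code comes from the hypothetical example in the question, not from a real site.

```python
# Restates the plan above as explicit data so each hop and status code can be
# reviewed before anything is implemented. All domains and paths come from the
# hypothetical example in the question, not from a real site.
PLANNED_REDIRECTS = [
    # (source URL, target URL, planned status code)
    # Inside pages on the new short domain forward permanently to the old site:
    ("http://www.abcshoes.com/affiliate",
     "http://www.abcshoesonlinestore.com/affiliate", 301),
    ("http://www.abcshoes.com/knowledgebase",
     "http://www.abcshoesonlinestore.com/knowledgebase", 301),
    # The old home page and product page forward temporarily to the new domain:
    ("http://www.abcshoesonlinestore.com/",
     "http://www.abcshoes.com/", 302),
    ("http://www.abcshoesonlinestore.com/product",
     "http://www.abcshoes.com/product", 302),
]

for source, target, status in PLANNED_REDIRECTS:
    print(f"{status}  {source}  ->  {target}")
```
-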
Duplicate content - canonical vs link to original and Flash duplication
Here's the situation for the website in question: the company produces printed publications which go online as a page-turning Flash version and as a separate HTML version. To complicate matters, some of the articles from the publications get added to a separate news section of the website. We want to promote the news section of the site over the publications section. If we were to forget the Flash version completely, would you: a) add a canonical in the publication version pointing to the version in the news section, b) add a link in the footer of the publication version pointing to the version in the news section, c) do both of the above, or d) do something else? What if we add the Flash version into the mix? As Flash still isn't as crawlable as HTML, should we noindex the Flash versions? Is HTML content duplicated in Flash as big an issue as HTML-to-HTML duplication?
Intermediate & Advanced SEO | | Alex-Harford
-
Any experience regarding what % is considered duplicate?
Some sites (including one or two I work with) have a legitimate reason to have duplicate content, such as product descriptions. One way to deal with duplicate content is to add other unique content to the page. It would be helpful to have guidelines regarding what percentage of the content on a page should be unique. For example, if you have a page with 1,000 words of duplicate content, how many words of unique content should you add for the page to be considered OK? I realize that a) Google will never reveal this and b) it probably varies a fair bit based on the particular website. However... does anyone have any experience in this area? (Example: you added 300 words of unique content to all 250 pages on your site, each of which had 100 words of duplicate content before, and that worked to improve your rankings.) Any input would be appreciated! Note: just to be clear, I am NOT talking about "spinning" duplicate content to make it "unique". I am talking about adding unique content to a page that has legitimate duplicate content.
Intermediate & Advanced SEO | | AdamThompson
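As a quick illustration of the arithmetic behind the question, the sketch below computes what share of a page remains duplicate as unique words are added. The word counts are the ones mentioned above; the output is just math, not a known threshold or guideline.

```python
# Back-of-the-envelope arithmetic only: what share of a page stays duplicate as
# unique words are added. The word counts are the ones mentioned above; the
# output is just math, not a known threshold or guideline.
def duplicate_share(duplicate_words, unique_words):
    total = duplicate_words + unique_words
    return duplicate_words / total * 100 if total else 0.0

examples = [
    (1000, 0),    # the 1,000-word page before any unique content is added
    (1000, 300),  # the same page with 300 unique words added
    (100, 300),   # the 250-page example: 100 duplicate words + 300 unique words
]
for dup, uniq in examples:
    print(f"{dup} duplicate + {uniq} unique -> {duplicate_share(dup, uniq):.0f}% duplicate")
```
-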
Subdomains - duplicate content - robots.txt
Our corporate site provides MLS data to users, with the end goal of generating leads. Each registered lead is assigned to an agent, essentially in a round-robin fashion. However, we also give each agent a domain of their choosing that points to our corporate website. The domain can be whatever they want, but upon loading it is immediately directed to a subdomain. For example, www.agentsmith.com would be redirected to agentsmith.corporatedomain.com. Finally, any leads generated from agentsmith.easystreetrealty-indy.com are always assigned to Agent Smith instead of the agent pool (by parsing the current host name). In order to avoid being penalized for duplicate content, any page that is viewed on one of the agent subdomains always has a canonical link pointing to the corporate host name (www.corporatedomain.com). The only content difference between our corporate site and an agent subdomain is the phone number and contact email address, where applicable. Two questions: 1) Can/should we use robots.txt or robots meta tags to tell crawlers to ignore these subdomains, but obviously not the corporate domain? 2) If the answer to question 1 is yes, would it be better for SEO to do that, or to leave it how it is?
Intermediate & Advanced SEO | | EasyStreet
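To make the described setup concrete, here is a simplified sketch of the two mechanisms in the question: picking the agent by parsing the host name, and emitting the same corporate canonical no matter which subdomain served the page. The host and path names are hypothetical stand-ins.

```python
# Simplified sketch of the two mechanisms described above: the agent is picked
# by parsing the request's host name, while every page declares the corporate
# URL as its canonical regardless of which subdomain served it. Host and path
# names are hypothetical stand-ins.
CORPORATE_HOST = "www.corporatedomain.com"

def agent_from_host(host):
    # "agentsmith.corporatedomain.com" -> "agentsmith"; the corporate host (or
    # anything without an agent subdomain) falls back to the round-robin pool.
    host = host.lower()
    if host == CORPORATE_HOST:
        return None
    subdomain = host.split(".", 1)[0]
    return None if subdomain == "www" else subdomain

def canonical_link(path):
    # Same canonical no matter which subdomain the page was viewed on.
    return f'<link rel="canonical" href="http://{CORPORATE_HOST}{path}">'

print(agent_from_host("agentsmith.corporatedomain.com"))  # agentsmith
print(agent_from_host("www.corporatedomain.com"))         # None -> agent pool
print(canonical_link("/listings/123-main-st"))
```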