Duplicate Page Content on pages that appear to be different?
-
Hi everyone! My name's Ross, and I work at CHARGED.fm. I worked with Luke, who has asked quite a few questions here, but he has since moved on to a new adventure, so I am trying to step into his role. I'm very much a beginner in SEO and learning a lot of this on the fly, so bear with me if this is something simple.
In our latest Moz crawl, over 28K high-priority issues were detected, and they are all Duplicate Page Content issues. However, when I look at the issues laid out, the examples given as "Duplicate URLs" under each individual issue appear to be completely different pages. They have different page titles, different descriptions, etc. Here's an example.
For "LPGA Tickets", it is giving 19 Duplicate URLs. Here are a couple it lists when you expand those:
http://www.charged.fm/one-thousand-one-nights-tickets
http://www.charged.fm/trash-inferno-tickets
http://www.charged.fm/mylan-wtt-smash-hits-tickets
http://www.charged.fm/mickey-thomas-tickets

Internally, one reason we thought this might be happening is that even though the pages themselves are different, their structure is nearly identical, especially if there are no events listed or there isn't any content in the News/About sections. We are going to try to noindex pages that don't have events or news content on them as a temporary fix, but is there possibly a different underlying issue somewhere that would cause all of these duplicate page content issues to begin appearing?
Any help would be greatly appreciated!
-
Nothing will positively affect this issue more than updating the content and giving searchers solid, informative, unique content to read.
One way to do that might be to aggregate some reviews for these individual shows, give a short, unique bio of the performers, or rate the venues. 500-800 words of unique content will go a long way in this case.
Something else to work on is the volume of internal links back and forth. When links are all the robot sees, that becomes part of your duplicate content issue too. You can't do much about that in this case, since most of the links come from the nav bars, so the way to counter it is, again, adding great content.
-
Well, if it were one of my clients' sites… I wouldn't do that. While I understand your logic with a noindex, I wouldn't want to create a situation where the pages could not be found at all in search engines. Although it will drop your duplicate content numbers here on Moz, it's only a temporary fix. A good question to explore is how long you would need to keep them noindexed versus how long it would take to fix the content issues.
-
Hey Adam!
Thanks for the response; that kind of confirms what we were thinking. We are planning to put a noindex, follow on those pages while we work on adjusting the content/descriptions. Is that a good fix while we work on the pages, or is there something else we should be doing?
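For reference, a noindex, follow directive of the kind described here normally goes in each page's <head>. This is a minimal sketch; exactly where it belongs in your templates is an assumption:

```html
<!-- In the <head> of each thin event page. "noindex" asks search -->
<!-- engines to keep the page out of their index, while "follow" -->
<!-- still lets them crawl the links on it. -->
<meta name="robots" content="noindex, follow">
```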
-
Hey Ross!
Those pages are not "different" when it comes to search engines. Or maybe I should say, not different enough. The content is extremely thin and only switching out a word or two will absolutely make them come up as duplicate content. I would strongly suggest optimizing the page content and meta descriptions to be unique.
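As a purely hypothetical illustration (these values are invented, not taken from the site), unique metadata for one of those pages might look like:

```html
<!-- Hypothetical, page-specific metadata for a single event page. -->
<title>LPGA Tickets | Tournament Schedule &amp; Seating | CHARGED.fm</title>
<meta name="description" content="Compare LPGA Tour ticket prices by tournament, date, and venue, with seating details for each course.">
```

The point is that every page gets a title and description written for that specific event or performer, rather than one template with a keyword swapped in.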
Related Questions
-
Large site with content silos - best practice for deep indexing silo content
Thanks in advance for any advice/links/discussion. This honestly might be a scenario where we need to do some A/B testing.

We have a massive (5 million) content silo that is the basis for our long-tail search strategy. Organic search traffic hits our individual "product" pages, and we've divided our silo with a parent category and then secondarily with a field (so we can cross-link to other content silos using the same parent/field categorizations). We don't anticipate, nor expect, top-level category pages to receive organic traffic; most people are searching for the individual/specific product (long tail). We're not trying to rank or get traffic for searches of all products in "category X", and others are competing and spending a lot in that area (head). The intent of the site structure/taxonomy is to make it easier for bots/crawlers to get deeper into our content silos. We've built the pages for humans but included link structure/taxonomy to assist crawlers.

So here's my question on best practices: how to handle categories with 1,000+ pages of pagination. In our most popular product categories, there might be hundreds of thousands of products in one category. My top-level hub page for a category looks like www.mysite/categoryA, and the page shows 50 products with pagination from 1 to 1,000+. Currently we're using rel=next for pagination, and for pages like www.mysite/categoryA?page=6 we make the page reference itself as canonical (not the first/top page, www.mysite/categoryA). Our goal is deep crawling/indexation of our silo.

I use ScreamingFrog and the SEOmoz campaign crawl to sample (the site takes a week+ to fully crawl), and with each of these tools it "looks" like crawlers have gotten a bit "bogged down" in large categories with lots of pagination. For example, rather than crawling multiple categories or fields to reach multiple product pages, some bots will hit all 1,000 (rel=next) pages of a single category. I don't want to waste crawl budget going through 1,000 pages of a single category instead of discovering/crawling more categories. I can't seem to find a consensus on how to approach the issue. I can't have a page that lists "all"; there's just too much, so we're going to need pagination. I'm not worried about category pagination pages cannibalizing traffic, as I don't expect any (should I make pages 2-1,000 noindex and canonically reference the main/first page in the category?). Should I worry about crawlers going deep in pagination within one category versus getting to more top-level categories? Thanks!
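For readers following along, the setup this question describes would look roughly like the sketch below in the <head> of a paginated category page. The URLs reuse the question's own placeholder pattern, and the rel="prev" companion tag is an assumption (the question only mentions rel=next):

```html
<!-- On www.mysite/categoryA?page=6: the page self-canonicalizes -->
<!-- and declares its neighbours in the pagination series. -->
<link rel="canonical" href="http://www.mysite/categoryA?page=6">
<link rel="prev" href="http://www.mysite/categoryA?page=5">
<link rel="next" href="http://www.mysite/categoryA?page=7">
```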
Moz Pro | DrewProZ
-
Why do I see duplicate content errors when the rel="canonical" tag is present
I was reviewing my first Moz crawler report and noticed the crawler returned a bunch of duplicate page content errors. The recommendations to correct this issue are to either put a 301 redirect on the duplicate URL or use the rel="canonical" tag so Google knows which URL I view as the most important and the one that should appear in the search results. However, after poking around the source code I noticed all of the pages that are returning duplicate content in the eyes of the Moz crawler already have the rel="canonical" tag. Does the Moz crawler simply not catch whether that tag is being used? If I have that tag in place, is there anything else I need to do in order to get that error to stop showing up in the Moz crawler report?
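For context, the tag being discussed sits in the page's <head> and looks like this (the URL here is a hypothetical example, not from the poster's site):

```html
<!-- In the <head> of the duplicate page: the href names the -->
<!-- preferred URL that should appear in search results. -->
<link rel="canonical" href="http://www.example.com/preferred-page/">
```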
Moz Pro | shinolamoz
-
Duplicate Content Issue because of root domain and index.html
SEOmoz crawl diagnostics is suggesting that my root domain and rootdomain/index.html are duplicate content. What can be done to ensure that both are treated as a single page only?
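One common fix, sketched here with a hypothetical domain, is a canonical tag in the <head> of the index.html version pointing at the root. A server-side 301 redirect from /index.html to / consolidates the two even more strongly, though the exact rule depends on the server:

```html
<!-- In the <head> of http://www.example.com/index.html (hypothetical -->
<!-- domain): declares the bare root URL as the preferred version. -->
<link rel="canonical" href="http://www.example.com/">
```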
Moz Pro | h1seo
-
About Duplicate Content found by SEOmoz... that is not duplicate
Hi folks, I am hunting for duplicate content with SEOmoz's great tool for that 🙂 I have some pages that are flagged as duplicates, but I can't say why. They are video pages. The content is minimal, so I guess it might be because all the navigation is the same, but for instance http://www.nuxeo.com/en/resource-center/Videos/Nuxeo-World-2010/Nuxeo-World-2010-Presentation-Thierry-Delprat-CTO and http://www.nuxeo.com/en/resource-center/Videos/Nuxeo-World-2010/Nuxeo-World-2010-Presentation-Cheryl-McKinnon-CMO are flagged as duplicates. Any idea? Is it hurting us? Cheers,
Moz Pro | nuxeo
-
How To Solve Too Many On-Page Links In Blogger?
Hi, I have a "too many on-page links" issue on my site: there are more than 300 on-page links on my home page URL. My site is hosted on Blogger, so please tell me how to fix this problem in Blogger.
Moz Pro | MaherHackers
-
How to check Page Authority in bulk?
Hey guys, I'm on the free trial for SEOmoz PRO and I'm in love. One question, though. I've been looking all over the internet for a way to check Page Authority in bulk. Is there a way to do this? Would I need the SEOmoz API? And what is the charge? All I really need is a way to check Page Authority in bulk--no extra bells and whistles. Thanks, Brandon
Moz Pro | thegreatpursuit
-
Is there a tool to compare duplicate content for content that is not live on the web?
Is there a tool that can give me a percentage of duplicate content when comparing two pieces of content that are not live on the web? Something like Copyscape, but for content that may not be indexed by Copyscape or not live on the web. Does Word or any other program allow you to do this?
Moz Pro | bozzie311
-
Solving duplicate content errors for what is effectively the same page.
Hello,

I am trying out your SEOmoz and I quite like it. I've managed to remove most of the errors on my site; however, I'm not sure how to get around this last one. If you look at my errors, you will see most of them revolve around pairs like this:

http://www.containerpadlocks.co.uk/categories/32/dead-locks
http://www.containerpadlocks.co.uk/categories/32/dead-locks?PageSize=9999

These are essentially the same page, because the Dead Locks category does not contain enough products to spread over more than one page, so when I click 'View all products' on my webpage, the results are the same. This works as intended for categories with more than the 20-per-page limit. My question is, should I be: removing the link to 'show all products' (which adds the PageSize query string value) when no more products would be shown, putting a noindex meta tag on the page, or taking some other action entirely? Looking forward to your reply and to you showing how effective Pro is.

Many thanks,
James Carter
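One possible approach, sketched here with the URLs from the question (whether it fits depends on the site's templates), is for the 'view all' variant to declare the plain category URL as canonical, consolidating both addresses into one page in the eyes of search engines:

```html
<!-- In the <head> of .../dead-locks?PageSize=9999: points search -->
<!-- engines at the plain category URL as the preferred version. -->
<link rel="canonical" href="http://www.containerpadlocks.co.uk/categories/32/dead-locks">
```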
Moz Pro | jcarter