Duplicate and thin content - advanced..
-
Hi Guys
Two issues to sort out..
So we have a website that lists products and has many pages for:
a) The list pages, which list all the products for that area.
b) The detailed pages, which, when clicked into from the list page, show the specific product in full. On the list page we perhaps have half the description written down; when clicked into you see the full description.
If you search Google for a phrase on a detailed page, you will see results for that specific page, including 'multiple' list pages it appears on. For example, let's say we are promoting 'trees' situated in Manhattan, and we are also promoting trees in Brooklyn; there is a crossover. So a tree listed in Manhattan will also be listed in Brooklyn as it's close by (not from America, so don't laugh if I have areas muddled).
We then have quite a few pages with the same content as a result. I read a post a while back from the mighty Cutts who said not to worry about duplicates unless they're spammy, but what is fine for one person is spammy to another..
Does anyone have any ideas as to whether this is a genuine problem, and how you would solve it?
Also, we know we have a lot of thin content on the site, but we don't know how to identify it. It's a large site, so it needs something automated (I think)..
Thanks in advance
Nick
-
Thanks William. We found Screaming Frog recently. It's amazing that nobody ever told us about it before.
-
If you are worried about duplicate content on your search pages, that should be pretty easily solved with canonical tags. These tell search engines which pages should be indexed, even if that page's content is seen somewhere else on the site. Here's a link to more information on that: http://moz.com/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
Even though Cutts said it shouldn't be an issue, he speaks in general terms (to put it lightly). Maybe Google tries to pick up the canonical version, but there's no harm in helping point Google in the right direction, just in case it doesn't crawl your site properly.
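As a minimal sketch (the URL is a made-up placeholder, not your actual site structure): every page that shows a given tree, whether it's the Manhattan list page, the Brooklyn list page, or the detailed page itself, would carry the same tag pointing at the one version you want indexed.

```html
<!-- Placed in the <head> of every duplicate/variant page.
     The href below is hypothetical -- it should be the full,
     preferred URL of the detailed page. -->
<head>
  <link rel="canonical" href="https://www.example.com/trees/oak-tree-123" />
</head>
```

The detailed page can also carry a self-referencing canonical, which is harmless and guards against URL-parameter variants of that page being indexed separately.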
There are a few automated tools out there that crawl tons of pages and surface the potential issues on them. Screaming Frog may be of use. There are also higher-level enterprise solutions to the problem, like Conductor's Searchlight.
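For the thin-content side, once you have the HTML for each page (e.g. from a Screaming Frog export or your own crawl), a rough word-count filter can shortlist pages for manual review. This is a minimal sketch using only the standard library; the 200-word threshold is an arbitrary assumption you'd tune for your site.

```python
# Sketch: flag "thin" pages by visible word count. Assumes you already
# hold each page's HTML in memory; the threshold is a starting guess.
from html.parser import HTMLParser


class _TextExtractor(HTMLParser):
    """Collects text content, skipping <script> and <style> blocks."""

    def __init__(self):
        super().__init__()
        self._skip = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)


def word_count(html: str) -> int:
    """Count whitespace-separated words in the visible text of a page."""
    parser = _TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.chunks).split())


def thin_pages(pages: dict, threshold: int = 200) -> list:
    """Return URLs whose visible text falls below the word threshold."""
    return [url for url, html in pages.items() if word_count(html) < threshold]
```

Running this over a full-site export gives you a ranked to-do list; pages just under the threshold may only need a longer description, while near-empty ones are candidates for canonicalisation or removal.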
Related Questions
-
Duplicate content penalty
When Moz crawls my site, it says I have 2x the pages I really have, and that I am being penalized for duplicate content. I know years ago I had my old domain resolve over to my new domain. It's the only thing that makes sense as the source of the duplicate content, but would search engines really penalize me for that? It is technically only on one site. My business took a significant sales hit starting early July 2013, and I know Google did an algorithm update then that had SEO aspects. I need to resolve the problem so I can stay in business.
-
Plagiarism or duplicate checker tool?
Do you know a plagiarism or duplicate checker tool where I can receive an email alert if someone copies my content? I know there's a tool like this (similar to http://www.tynt.com/ though people can still remove the link from the original source) but I forgot the name or site. It's a bit of source code that you must insert in each of your webpages. Thanks in advance!
-
Duplicate content
Hi everybody, I have been thrown into an SEO project for a website with a duplicate content problem because of a version with and a version without 'www'. The strange thing is that the version with www. has more than 10 times more backlinks but is not in the organic index. Here are my questions: 1. Should I go on using the "without www" version as the primary resource? 2. Which kind of redirect is best for passing most of the link juice? Thanks in advance, Sebastian
-
Why does the SEOMOZ crawl show that I have 5,769 pages with duplicate content?
Hello... I'm trying to do some analysis on my site (http://goo.gl/JgK1e) and SEOMOZ Crawl Diagnostics is telling me that I have 5,769 pages with duplicate content. Can someone, anyone, please help me understand: How does SEOMOZ determine if I have duplicate content? Is it correct? Are there really that many pages of duplicate content? How do I fix this, if true? <---- ** Most important ** Thanks in advance for any help!!
-
Page without content
Hey Everyone, I've started an SEO on-page analysis for a website and I've found a lot of duplicate content and useless pages. What should I do? Delete these useless pages, redirect, or add a canonical tag? If I have to delete them, what is the best way to do it? Should I use GWT, or just delete them from the server? This URL, for example: http://www.sexshopone.com.br/?1.2.44.0,0,1,13,0,0,aneis-evolved-boss-cock's.html [admin note: NSFW page] There is no content and it is a duplicate of this: http://www.sexshopone.com.br/?1.2.44.0,0,1,12,0,0,aneis-evolved-boss-cock's.html [admin note: NSFW page] and the correct page of the product is: http://www.sexshopone.com.br/?1.2.44.0,423,anel-peniano-evolved-boss-cock's-pleasure-rings-collar-white-reutilizavel-e-a-prova-d'agua-colecao-evolved.html [admin note: NSFW page] What is happening is that we have 8,000 pages like this, useless and without any content. How do I proceed? Thanks!
-
Tools for finding duplicate content offsite?
Hi, is there a tool that will spider my site and then find similar text on external sites?
-
How to avoid duplicate content on ecommerce pages?
I am currently building the site architecture for a very large ecommerce site. I am wondering how I should build it out if I have products that I want to include in multiple categories within my site. For example: let's say I sell fitness equipment and I have categories for things such as: Treadmills, Exercise Bikes, Stair Steppers, Weight Benches, etc. But then I also have specific brand category pages such as: Precor, Life Fitness, Hammer, Body Solid. So my question is how do I structure this so I am building this correctly? If I sell a Precor treadmill, I will want to include that product under the "Treadmill" category page as well as under the "Precor Equipment" category page. Can I get some advice on the best way to structure this? It's obviously something I want to avoid at all costs doing improperly and having to fix later. Thank you, Jake
-
How would you deal with blog TAG & CATEGORY listings that are marked as 'duplicate content' in SEOmoz campaign reports?
We're seeing "Duplicate Content" warnings / errors in some of our clients' sites for blog / event calendar tags and category listings. For example, the link to http://www.aavawhistlerhotel.com/news/?category=1098 provides all event listings tagged to the category "Whistler Events". The Meta Title and Meta Description for the "Whistler Events" category are the same as for other category listings. We use Umbraco, a .NET CMS, and we're working on adding some custom programming within Umbraco to develop a unique Meta Title and Meta Description for each page, using the tag and/or category and post date in each Meta field to make it more "unique". But my question is .... in the REAL WORLD, will taking the time to create this programming really positively impact our overall site performance? I understand that while Google, Bing, etc. are constantly tweaking their algorithms, as of now having duplicate content primarily means that this content won't get indexed, and there won't be any really 'fatal' penalties for having this content on our site. If we don't find a way to generate unique Meta Titles and Meta Descriptions, we could 'no-follow' these links (for tag and category pages) or just not use these within our blogs. I am confused about this. Any insight others have about this and recommendations on what action you would take are greatly appreciated.