Duplicate and thin content - advanced
-
Hi guys,
Two issues to sort out.
We have a website that lists products and has many pages for:
a) List pages, which list all the products for a given area.
b) Detail pages, which you reach by clicking through from a list page and which show the specific product in full. On the list page we show perhaps half the description; when you click through you see the full description.
If you search Google for a phrase from a detail page, you will see results for that specific page plus the multiple list pages it appears on. For example, say we are promoting trees situated in Manhattan, and we are also promoting trees in Brooklyn; there is a crossover. A tree listed in Manhattan will also be listed in Brooklyn as it's close by (I'm not from America, so don't laugh if I have the areas muddled).
As a result we have quite a few pages with the same content. I read a post a while back from the mighty Cutts who said not to worry about duplicate content unless it's spammy, but what is fine to one person is spammy to another.
Does anyone have any idea whether this is a genuine problem and how you would solve it?
Also, we know we have a lot of thin content on the site, but we don't know how to identify it. It's a large site, so it needs something automated (I think).
Thanks in advance
Nick
-
Thanks William. We found Screaming Frog recently; it's amazing that nobody ever told us about it before.
-
If you are worried about duplicate content in your search pages, that should be pretty easily solved with canonical tags. These tell search engines which version of a page should be indexed, even if that page's content appears somewhere else on the site. Here's a link to more information on that: http://moz.com/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
Even though Cutts said it shouldn't be an issue, he speaks in general terms (to put it lightly). Google may well pick up the canonical version on its own, but there's no harm in pointing it in the right direction, just in case it doesn't crawl your site properly.
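If you want to sanity-check what each page currently declares, here's a minimal sketch in Python (assuming the requests and beautifulsoup4 packages; the product URLs are made-up placeholders, not your real pages) that prints the canonical URL each page points to:

```python
# Minimal sketch: report the canonical URL each page declares.
# Assumes the requests and beautifulsoup4 packages are installed;
# the URLs below are made-up placeholders, not real pages.
import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/trees/manhattan/oak-tree",
    "https://www.example.com/trees/brooklyn/oak-tree",
]

for url in urls:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    link = soup.find("link", rel="canonical")
    canonical = link.get("href") if link else None
    print(f"{url} -> canonical: {canonical}")
```

If the Manhattan and Brooklyn list entries both resolve to (or link through to) the same detail page, you'd expect both to declare that detail page's preferred URL as the canonical.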
There are a few automated tools out there that will crawl tons of pages and flag potential issues on them. Screaming Frog may be of use. There are also higher-level enterprise solutions such as Conductor Searchlight.
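If you'd rather roll something quick yourself for the thin-content side, here's a rough sketch of the idea (again assuming requests and beautifulsoup4, with an illustrative word-count threshold and placeholder URLs): flag pages below a word-count cut-off as potentially thin, and group pages whose body text hashes to the same value as exact duplicates.

```python
# Rough sketch: flag potentially thin pages (low word count) and
# exact duplicates (identical body text) across a list of URLs.
# The threshold and URLs are illustrative assumptions only.
import hashlib
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

THIN_WORD_THRESHOLD = 250  # arbitrary cut-off; tune it for your site

urls = [
    "https://www.example.com/trees/manhattan/",
    "https://www.example.com/trees/brooklyn/",
]

duplicates = defaultdict(list)  # body-text hash -> URLs sharing that text

for url in urls:
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(separator=" ", strip=True)
    word_count = len(text.split())
    if word_count < THIN_WORD_THRESHOLD:
        print(f"THIN ({word_count} words): {url}")
    duplicates[hashlib.sha1(text.encode("utf-8")).hexdigest()].append(url)

for pages in duplicates.values():
    if len(pages) > 1:
        print("DUPLICATE group:", ", ".join(pages))
```

In practice a crawler export gets you much the same data without writing code, and near-duplicates (like your half-description list snippets) need something fuzzier than an exact hash, such as shingling or a similarity ratio, but this shows the shape of an automated pass over a large site.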
Related Questions
-
I have an eCommerce site with, in some cases, hundreds of versions of the same product. How do I avoid "duplicate content" without literally writing hundreds of unique product descriptions for the exact same product?
For instance, one item where the only difference is the sports team logo, or it comes in a variety of color variants. I'm using Shopify.
On-Page Optimization | | pstone291 -
Duplicate pages
Hi, I have recently signed up to Moz Pro and the first crawl report on my WordPress site has brought up some duplicate content issues. I don't know what to do with this data! The original page: http://www.dwliverpoolphotography.co.uk/blog/ and the duplicate content page: http://www.dwliverpoolphotography.co.uk/author/david/ If anyone can point me to a resource or explain what I need to do, thanks! David.
On-Page Optimization | | WallerD0 -
Links to Paywall from Content Pages
Hi, my site is funded by subscriptions. We offer lengthy excerpts, and then direct people to a single paywall page, something like domain.com/subscribe/ This means that most pages on the site link to /subscribe, including all of the high-value pages that bring people in from Google. This is a page with an understandably high bounce rate, as most users are not interested in paying for content on the web. My question is: are we being penalized in Google for having so many internal links to a page with a very high bounce rate? If anyone has worked with paywall sites before and knows the best practices for this, I'd be really grateful to learn more.
On-Page Optimization | | enotes0 -
Internal Duplicate Content/Canonical Issue/ or nothing to worry about
Unfortunately, my developer cannot give me an answer to this, so I really do hope someone can help. The homepage of my website is http://www.laddersfree.co.uk; however, I also have a page http://www.laddersfree.co.uk/index.php that has PageRank and essentially duplicates the homepage. Does anyone know what this is? Do I need to get my developer to do a 404? It is worrying that he has not come back to me. Thanks, Jason
On-Page Optimization | | gymmad0 -
Duplicate content when using "visibility classes" in responsive design layouts - an SEO problem?
I have text in the right column of my responsive layout which shows up below the principal content on small devices. To do this I use visibility classes on DIVs. So I have a DIV with uniquely styled text that is visible only at large screen sizes. I copied the same text into another DIV which shows up only on small devices, while the first DIV is hidden at that point. Technically I have the same text twice on my page. Might this be detected as duplicate content, or as spam? I'm concerned because hidden text on a page (as in expandable/collapsible text blocks) is read by bots, and in my case they would detect it twice. Does anybody have experience with this issue? Best, Holger
On-Page Optimization | | inlinear0 -
What Should I Do With Low Quality Content?
As my site has definitely been hit by Panda, I am in the process of cleaning my website of low-quality content. Needless to say, shoddy articles are being completely removed, but I think a lot of this content is now low quality because it is obsolete and dated. So what should I do with this content? Should I rewrite those articles as completely new posts and link from the old posts to the new ones? Or should I delete the old posts and do a 301 redirect to the new post? Or should I rewrite the content of these articles in place so I can keep the old URL and backlinks? One thing is that I've got a lot more followers than I used to, so publishing a new post gets a lot more views, likes, and shares from social networks.
On-Page Optimization | | sbrault741 -
Checking for content originality in a site
A two-part question on original content. How would you go about checking whether a site holds original content, other than searching for a long phrase in Google? And also, if I find many sites carrying my content and I am the original source, should I replace the content? Thanks
On-Page Optimization | | ciznerguy0 -
Geo-targeted content and SEO?
I am wondering what effect geo-targeted "cookie cutter" content has on SEO. For example, one might have a list of "Top US Comedians", which appears as "Top UK Comedians" for users from the United Kingdom. The data would be populated with information from a database in both cases, but would be completely different for each region, with the exception of a few words of surrounding copy. Is this essentially giving Google's (US-based) crawler different content from what users see? I know that plenty of sites do it, but is it legitimate? Would it be better to redirect to a unique page based on location, rather than change the content of one static page? I know what the logical SEO answer is here, but even some of the big players use the "wrong" tactic. I am very interested to hear your thoughts.
On-Page Optimization | | HalogenDigital0