How should I deal with "duplicate" content in an Equipment Database?
-
The Moz crawler is identifying hundreds of instances of duplicate content in our site's equipment database. The database is similar in functionality to a site like autotrader.com: we post equipment with pictures, and our customers can browse the listings and make purchasing decisions.
The problem is that, though each unit is unique, the listings often have similar or identical specs, which is why Moz (and presumably Google and Bing) flag the content as "duplicate". In many cases, the only differences between listings are the pictures and the mileage; the specifications and year are the same.
Ideally, we wouldn't want to exclude these pages from being indexed because they could have some long-tail search value. But, obviously, we don't want to hurt the overall SEO of the site.
Any advice would be appreciated.
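For what it's worth, here's a rough illustration of why the crawler sees these listings as near-duplicates: once you ignore the photos and the mileage, the remaining spec text is almost identical. The two listings below are made-up examples (not our real inventory), and this similarity ratio isn't how Moz or Google actually score duplication; it just shows how little text differs between two units.

```python
# Rough illustration only: the listings are invented examples, and this
# ratio is not the metric Moz or Google use to detect duplicates.
import difflib

listing_a = (
    "2018 Caterpillar 320 excavator. 158 hp, 22,000 kg operating weight, "
    "1.19 m3 bucket, air-conditioned cab. Hours: 3,100."
)
listing_b = (
    "2018 Caterpillar 320 excavator. 158 hp, 22,000 kg operating weight, "
    "1.19 m3 bucket, air-conditioned cab. Hours: 4,750."
)

similarity = difflib.SequenceMatcher(None, listing_a, listing_b).ratio()
print(f"Text similarity: {similarity:.0%}")  # prints a similarity well above 90%
```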
-
I think Tom lays this out quite well, and I would follow his advice.
-
I would leave it like this, especially if these pages generate long-tail search traffic. Having semi-duplicate pages isn't necessarily going to hurt you (see https://blog.kissmetrics.com/myths-about-duplicate-content/). See also this article (https://moz.com/blog/have-we-been-wrong-about-panda-all-along) and, finally, Google's own guidance (https://support.google.com/webmasters/answer/66359?hl=en):
"Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results. If your site suffers from duplicate content issues, and you don't follow the advice listed above, we do a good job of choosing a version of the content to show in our search results."
If your site has enough pages with rich content and these "thin" pages have value as landing pages for your visitors, don't start messing with it.
Dirk
-
"Ideally, we wouldn't want to exclude these pages from being indexed because they could have some long-tail search value. But, obviously, we don't want to hurt the overall SEO of the site."
You say that, but I'm not entirely sure it's true.
I understand the theory - if you have 20 Citroen C1s listed on the site, you could potentially have 20 pages of yours ranking for relevant terms, right?
Well, unique content on those pages or not, I think it would be extremely unlikely that Google would want to present all of those results to the user. Furthermore, if the pages expire or go "out of stock", as it were, when purchased, would Google want to rank them?
So I'm not convinced that having all those pages indexed and treated as unique (whether they are or not) would result in traffic (please prove me wrong, though: if you have lots of entrances to those pages via organic search, it'll show what I know!).
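If you do want to check that, one quick way is to pull a landing-page report filtered to organic search and count entrances to the individual listing URLs. The sketch below is just that, a sketch: the file name, the column names and the /equipment/ URL pattern are assumptions, so swap in whatever your analytics export actually uses.

```python
# Count organic-search entrances to individual listing pages from an
# analytics CSV export. The file name, column names ("landing_page",
# "channel", "sessions") and the /equipment/ URL prefix are assumptions.
import csv
from collections import Counter

organic_entrances = Counter()

with open("landing_pages_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row["channel"] != "Organic Search":
            continue
        if row["landing_page"].startswith("/equipment/"):
            organic_entrances[row["landing_page"]] += int(row["sessions"])

print(f"Organic entrances to listing pages: {sum(organic_entrances.values())}")
for url, sessions in organic_entrances.most_common(10):
    print(f"{sessions:>6}  {url}")
```

If those numbers are healthy, keep the pages indexed; if they're negligible, the hub-page approach below loses you very little.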
My preference, regardless of the above, would be to have a main page for your Citroen C1 products - a hub page - that then links to all the different products you have as and when they're available.
This has many advantages: you only need to focus on ranking one page per category instead of several, you can concentrate all the link equity you earn on one page, you can make sure the page is well optimised for search engines and users, and the page will be evergreen, meaning your links will be too.
The short version:
Homepage > Hub Page > Product variant 1, variant 2 etc
Rank the homepage and the hub page.
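To make that concrete, here's a minimal sketch of the hierarchy, with the URL patterns and field names as illustrative assumptions rather than anything taken from your site. The point is that the hub URL stays stable while individual units come and go.

```python
# Minimal sketch of the hub-and-spoke structure described above.
# URL patterns and field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Listing:
    make: str       # e.g. "citroen"
    model: str      # e.g. "c1"
    stock_id: str   # unique identifier for the individual unit
    in_stock: bool

def hub_url(listing: Listing) -> str:
    """Evergreen hub page for the make/model: the page you try to rank."""
    return f"/equipment/{listing.make}-{listing.model}/"

def listing_url(listing: Listing) -> str:
    """Individual unit page, linked from the hub while the unit is for sale."""
    return f"{hub_url(listing)}{listing.stock_id}/"

def breadcrumb(listing: Listing) -> list:
    """Homepage > hub page > product variant."""
    return ["/", hub_url(listing), listing_url(listing)]

unit = Listing(make="citroen", model="c1", stock_id="u1042", in_stock=True)
print(breadcrumb(unit))
# ['/', '/equipment/citroen-c1/', '/equipment/citroen-c1/u1042/']

# When a unit sells, it simply drops off the hub page (or redirects to it),
# so the hub stays evergreen and keeps collecting link equity.
```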
Hope this helps.