Multiple products with legitimate duplicate descriptions
-
We are redeveloping a website for a card company that has far too many products to write a unique description for each. Even if they could, I don't think it would benefit the user. However, they do have unique descriptions for each range, which is useful for users viewing an individual card.
Which is better practice:
a) Ignore the duplicate content issue and supply the user with information about the range, or
b) Provide clear, enticing links to find out more about the range, which will leave the individual card page somewhat devoid of content?
Many thanks
-
Many thanks, Alex. We're already on all of the points mentioned above, but it's always nice to get some validation for a plan of action... half the time that's more satisfying than coming up with it!
-
Also make sure all your page titles are unique. While descriptions don't matter that much, titles should be unique, or at the very least descriptive and targeted to your keywords.
-
Sounds like you have a good handle on it. As Mike recommends, focusing on the category-style pages is definitely more appealing to a user looking to browse the site. At the end of the day, beyond SEO, the end user is who you really care about. Another suggestion would be to make the single-card pages more like category pages: show cards from the same category, and suggest similar cards or categories. Related links are one of the best ways to promote pageviews and get both the end user and Googlebot interested in "crawling" more pages. That's a bonus on all fronts.
-
Sorry, I should have been clearer: it's not really the metadata I'm concerned about at this point, more the benefit of on-page content. I think the bigger issue is that, for organic search traffic, the card pages are almost redundant, and the focus, as Mike says, should be on the range pages, which should encourage browsing.
-
In addition, there is the issue of the card page lacking enticing content, which could reduce the chance of a conversion. Because this is a predominantly visual subject, perhaps I am just worrying that the lack of text fundamentally goes against usual SEO practice, rather than it actually harming usability.
-
Apologies, I should have said greetings cards, so this is indeed very helpful. Thank you!
-
Meta descriptions won't get you penalized for duplicate content, so there's no need to worry about that. The description is really just what you see in Google's search results. Users are more likely to click a link with a nice descriptive snippet that leads them into the content they're looking for. Of course, a custom description for each individual page/product is always best, but in some cases the time isn't worth it.
I'm not sure what you mean by clear links in option B, but option A is a perfectly fine solution if there are just too many pages. A good option might be to create a generic description template that uses the product name as a variable, e.g. "My (Product) is red", where (Product) is replaced with each card's name.
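A minimal sketch of that templating idea in Python, assuming a product list pulled from the catalogue (the card names and range names here are made up for illustration):

```python
# Hypothetical card records; in practice these would come from the product database.
cards = [
    {"name": "Grumpy Cat Birthday Card", "range": "Birthday"},
    {"name": "Watercolour Thank You Card", "range": "Thank You"},
]

def meta_description(card):
    """Build a generic meta description with the card name substituted in."""
    return (f"{card['name']} from our {card['range']} range of greetings cards. "
            "Browse the full range for more designs.")

for card in cards:
    print(meta_description(card))
```

Each page then gets a description that is at least distinct and keyword-bearing, without anyone hand-writing thousands of them.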
-
What sort of cards are we talking about? I immediately think "greeting cards" when you say that, but I don't want to just assume that's the case. If it is, then from a personal user-experience standpoint, I would be more likely to search for a specific range and gain more from finding a category page for a range of cards in the SERPs than from a page with an individual card on it. That is, I'm more likely to search "birthday cards" or "get well cards" or "thank you cards" than "that birthday card with a grumpy cat that has balloons and a smushed cake". In which case, I'd say go with robust category pages and, if possible, consider canonicalizing the individual cards to the category if you're also going to use the content of the parent category on the individual pages. If it's not greeting cards... well, then I wrote all of this for nothing. (Unless it's playing cards, in which case what I wrote might work as well.)
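That canonicalization suggestion could be sketched as follows; this is just an illustration of the tag each card page would emit, with a placeholder domain and URL structure:

```python
# Illustrative sketch: point each individual card page's canonical at its
# parent category page, so the duplicated range copy isn't indexed many times.
BASE_URL = "https://example.com"  # placeholder; use the site's real domain

def canonical_tag(category_slug):
    """Return the <link rel="canonical"> tag for a card in the given category."""
    return f'<link rel="canonical" href="{BASE_URL}/cards/{category_slug}/">'

# Every card in the "birthday" category emits the same canonical URL,
# consolidating signals onto the category page.
print(canonical_tag("birthday"))
```

Note this only makes sense if the card pages really are near-duplicates of the category page; if each card page carries its own valuable content, a self-referencing canonical is the safer choice.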