Duplicate Content - What's the best bad idea?
-
Hi all,
I have 1000s of products where the product description is very technical and extremely hard to rewrite or make unique.
I'll probably have to use the content provided by the brands, which can already be found on dozens of other sites.
My options are:
- Use the Google on/off tags ("don't index")
- Put the content in an image
Are there any other options?
We'd always write our own unique copy to go with the technical bit.
Cheers
-
This applies to the Google Mini or Search Appliance, which are custom search tools for an individual website.
They allow site owners to sculpt the indexing of their private setups.
AdWords also has a feature to help indicate the important content used to determine the page topic for related ads.
However, they don't apply to Googlebot spidering as mentioned above.
-
Hi - I got the Google on/off tags idea from https://developers.google.com/search-appliance/documentation/46/admin_crawl/Preparing which documents them like this:
| Tag value | Description | Example | Result |
| --- | --- | --- | --- |
| index | Words between the tags are not indexed as occurring on the current page. | fish `<!--googleoff: index-->` shark `<!--googleon: index-->` mackerel | The words fish and mackerel are indexed for this page, but the occurrence of shark is not indexed. This page could appear in search results for the term shark only if the word appears elsewhere on the page or in anchor text for links to the page. Hyperlinks that appear within these tags are followed. |
I agree with Takeshi, but would also like to add that so-called "Google on/off tags" are a myth. What you have typed out would be an HTML comment (they begin with `<!--` and end with `-->`).
-
If the descriptions are very technical, then there is likely a fair amount of repetition in the sentence patterns, diction, etc. I'd recommend playing with regex to help transform the content into something original.
For instance, you could search for industry abbreviations such as CW and replace them with long forms: **Clockwise (CW)**. Maybe they overuse an adjective that you could change to your own voice.
Also, perhaps the stock descriptions have blocks of useless content you could strip out in the meantime?
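A minimal sketch of the abbreviation-expansion idea, assuming the descriptions are exported and run through a small Python pass (the glossary here is a made-up placeholder, not anyone's real term list):

```python
import re

# Hypothetical glossary -- swap in your own industry's abbreviations.
ABBREVIATIONS = {
    "CW": "Clockwise (CW)",
    "CCW": "Counter-Clockwise (CCW)",
    "OD": "Outer Diameter (OD)",
}

def expand_abbreviations(text: str) -> str:
    """Expand bare abbreviations to their long forms.

    The lookarounds skip occurrences already wrapped in parentheses,
    so running the pass twice won't produce "Clockwise (Clockwise (CW))".
    """
    for abbr, long_form in ABBREVIATIONS.items():
        pattern = r"(?<!\()\b" + re.escape(abbr) + r"\b(?!\))"
        text = re.sub(pattern, long_form, text)
    return text

print(expand_abbreviations("Rotate the spindle CW, then CCW."))
# -> Rotate the spindle Clockwise (CW), then Counter-Clockwise (CCW).
```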
The DB probably has a few other fields (name, product attributes, etc.), so be sure to find a unique way of assembling the meta description, title and details.
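As a rough sketch, with hypothetical field names standing in for whatever your product table actually holds:

```python
# Placeholder record -- adjust keys to match your own product table.
product = {
    "name": "Example Widget 500",
    "brand": "ExampleBrand",
    "material": "stainless steel",
    "size": "500 mm",
}

# Titles and meta descriptions built from your own structured fields stay
# unique even when the long description is the stock manufacturer copy.
title = f"{product['brand']} {product['name']} - {product['size']}, {product['material']} | YourStore"
meta_description = (
    f"The {product['brand']} {product['name']} in {product['material']}, "
    f"{product['size']}. Full specs, reviews and fast delivery at YourStore."
)
print(title)
print(meta_description)
```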
If you find enough to change, I'd think having the description would be better than having a page that is too light on words.
Be sure to mark the content up with http://schema.org/Product so search engines understand the nature of the content.
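A minimal sketch of that markup, generated as JSON-LD from the same hypothetical product record (values are placeholders):

```python
import json

# Placeholder record; in practice this comes from the same product DB.
product = {"name": "Example Widget 500", "sku": "EW-500", "brand": "ExampleBrand"}

# schema.org/Product JSON-LD, ready to drop into a
# <script type="application/ld+json"> tag in the page template.
json_ld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": product["name"],
    "sku": product["sku"],
    "brand": {"@type": "Brand", "name": product["brand"]},
}
print(json.dumps(json_ld, indent=2))
```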
EDIT: I have used the regex technique to enhance the content of a database by adding inline tooltips, diagrams or figures, and glossary links. However, with Penguin, I would be careful with automated links. You would only want to create a handful using the same anchor text.
EDIT2: I forgot - MAKE FREQUENT BACKUPS. Regex is super powerful and can tank a database really fast. Make a backup of the original and of every successful iteration - it will take a little longer, but it will save your butt when things go bad.
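Even something this small before each pass would do, assuming a file-based database such as SQLite purely for illustration (for MySQL or Postgres, reach for mysqldump / pg_dump instead):

```python
import shutil
import time

def snapshot(db_path: str) -> str:
    """Copy the DB file aside with a timestamp before the next regex pass."""
    stamp = time.strftime("%Y%m%d-%H%M%S")
    backup_path = f"{db_path}.{stamp}.bak"
    shutil.copy2(db_path, backup_path)
    return backup_path

print(snapshot("products.db"))  # e.g. products.db.20240101-120000.bak
```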
-
I would say use the content as is (regular text) and work on adding additional content on top of that. Most marketplaces and e-tailers (including Amazon) use the descriptions provided by the brands. Google understands that. The idea is to provide additional value on top of that content with things like user reviews and additional features that make your site stand out.
-
Wow, a really tough problem.
I would definitely go for the image, and then customise the copy around the image so you can still rank for those pages. If you go for noindex tags, you lose all optimisation opportunities.
Or, could you host the product description on a single domain and then link to that from all your relevant pages?
Related Questions
-
What online tools are best to identify website duplicate content (plagiarism) issues?
I've discovered that one of the sites I am working on includes content which also appears on a number of other sites. I need to understand exactly how much of the content is duplicated so I can replace it with unique copy. To do this I have tried using tools such as plagspotter.com and copyscape.com with mixed results; nothing so far is able to give me a reliable picture of exactly how much of my existing website content is duplicated on third-party sites. Any advice welcome!
Technical SEO | HomeJames
-
Duplicate Content for Multiple Instances of the Same Product?
Hi again! We're set to launch a new inventory-based site for a chain of car dealers with various locations across the Midwest. Here's our issue: The different branches have overlap in the products that they sell, and each branch is adamant that their inventory comes up uniquely in site search. We don't want the site to get penalized for duplicate content; however, we don't want to implement a link rel=canonical because each product should carry the same weight in search. We've talked about having a basic URL for these product descriptions, and each instance of the inventory would be canonicalized to this main product, but it doesn't really make sense for the site structure to do this. Do you have any tips on how to ensure that these products (same description, new product from manufacturer) won't be penalized as duplicate content?
Technical SEO | newwhy
-
Should we use & or and in our URLs?
Example: /Zambia/kasanka-&-bangweulu or /Zambia/kasanka-and-bangweulu - which is the better URL from the search engines' point of view?
Technical SEO | tribes
-
Product landing page URLs for e-commerce sites - best practices?
Hi all. I have built many e-commerce websites over the years and with each one, I learn something new and apply it to the next site and so on. Let's call it continuous review and improvement! I have always structured my URLs to the product landing pages as such: mydomain.com/top-category => mydomain.com/top-category/sub-category => mydomain.com/top-category/sub-category/product-name. Now this has always worked fine for me, but I see more and more of the following happening: mydomain.com/top-category => mydomain.com/top-category/sub-category => mydomain.com/product-name. Now I have read many believe that the longer the URL, the less SEO impact it may have, and other comments saying it is better to have just the product URL on the final page and leave out the categories for one reason or another. I could probably spend days looking around the internet for people's opinions, so I thought I would ask on SEOmoz and see what other people tend to use and maybe establish the reasons for your choices. One of the main reasons I include the categories within my final URL to the product is simply to detect if a product name exists in multiple categories on the site - I need to show the correct product to the user. I have built sites which actually have the same product name (created by the author) in multiple areas of the site, but they are actually different products, not duplicate content. I therefore cannot see a way around not having the categories in the URL to help detect which product we want to show to the user. Any thoughts?
Technical SEO | yousayjump
-
Duplicate Page Content
I've got several pages of similar products that Google has listed as duplicate content. I have them all set up with rel="prev" and rel="next" tags telling Google that they are part of a group, but they've still got them listed as duplicates. Is there something else I should do for these pages, or is that just a shortcoming of Google's webmaster tools? One of the pages: http://www.jaaronwoodcountertops.com/wood-countertop-gallery/walnut-countertop-9.html
Technical SEO | JAARON
-
How critical are duplicate content warnings?
Hi, so I have created my first campaign here and I have to say the tools, user interface and the on-page optimization, everything is useful and I am happy with SEOMOZ. However, the crawl report returned thousands of errors and most of them are duplicate content warnings. As we use Drupal as our CMS, the duplicate content is caused by Drupal's pagination problems. Let's say there is a page called "/top5list"; the crawler decided "/top5list?page=1" to be a duplicate of "/top5list". There is no real solution for pagination problems in Drupal (as far as I know). I don't have any warnings in Google's webmaster tools regarding this, and the sitemap I submitted to Google doesn't include those problematic deep pages (that are detected as duplicate content by the SEOMOZ crawler). So my question is, should I be worried about the thousands of error messages in crawler diagnostics? Any ideas appreciated.
Technical SEO | Gamer07
-
Solution for duplicate content not working
I'm getting a duplicate content error for:
http://www.website.com
http://www.website.com/default.htm
I searched the Q&A for the solution and found: access the .htaccess file and add this line:
redirect 301 /default.htm http://www.website.com
I added the redirect to my .htaccess and then got the following error from Google when trying to access the http://www.website.com/default.htm page: "This webpage has a redirect loop. The webpage at http://www.webpage.com/ has resulted in too many redirects. Clearing your cookies for this site or allowing third-party cookies may fix the problem. If not, it is possibly a server configuration issue and not a problem with your computer." "Error 310 (net::ERR_TOO_MANY_REDIRECTS): There were too many redirects." How can I correct this? Thanks
Technical SEO | Joeuspe
-
URLs for news content
We have made modifications to the URL structure for a particular client who publishes news articles in various niche industries. In line with SEO best practice we removed the article ID from the URL - an example is below:
http://www.website.com/news/123/news-article-title
http://www.website.com/news/read/news-article-title
Since this has been done we have noticed a decline in traffic volumes (we have not as yet assessed the impact on the number of pages indexed). Google have suggested that we need to include unique numerical IDs in the URL somewhere to aid spidering. Firstly, is this policy for news submissions? Secondly (if the previous answer is yes), is this to overcome the obvious issue with the velocity and trend-based nature of news submissions resulting in false duplicate URL/title tag violations? Thirdly, do you have any advice on the way to go? Thanks
P.S. One final one (you can count this as two question credits if required): is it possible to check the volume of pages indexed at various points in the past? I.e. if you think that the number of pages being indexed may have declined, is there any way of confirming this after the event? Thanks again! Neil
Technical SEO | mccormackmorrison