Duplicate Content - What's the best bad idea?
-
Hi all,
I have 1000s of products where the product description is very technical and extremely hard to rewrite or create a unique version of.
I'll probably have to use the content provided by the brands, which can already be found on dozens of other sites.
My options are:
-
Use the Google on/off tags ("don't index")
-
Put the content in an image
Are there any other options?
We'd always write our own unique copy to go with the technical bit.
Cheers
-
-
This applies to the Google Mini or Search Appliance, which are custom search tools for an individual website.
They allow site owners to sculpt the indexing of their own private set-ups.
AdWords also has something to help indicate the important content when determining the page topic for related ads.
However, neither of these applies to Googlebot spidering, as mentioned above.
-
Hi - I got the google on/off tags idea from https://developers.google.com/search-appliance/documentation/46/admin_crawl/Preparing
| Tag | Description | Example | Result |
| --- | --- | --- | --- |
| index | Words between the tags are not indexed as occurring on the current page. | `fish <!--googleoff: index-->shark <!--googleon: index-->mackerel` | The words fish and mackerel are indexed for this page, but the occurrence of shark is not indexed. This page could appear in search results for the term shark only if the word appears elsewhere on the page or in anchor text for links to the page. Hyperlinks that appear within these tags are followed. |
-
I agree with Takeshi, but would also like to add that so-called "Google on/off tags" are a myth. What you have typed out would be an HTML comment (they begin with <!-- and end with -->).
-
If the descriptions are very technical, then there is likely a fair amount of repetition in the sentence patterns, diction, etc. I'd recommend playing with regex to help transform the content into something more original.
For instance, you could search for industry abbreviations such as CW and replace them with long forms such as Clockwise (CW). Maybe they overuse an adjective that you could change to your own voice.
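As a rough sketch of that idea in Python (the abbreviation map, example sentence and long forms are invented for illustration; you would build the map from the brand's own glossary):

```python
import re

# Hypothetical abbreviation map - in practice, build this from the brand's own glossary.
EXPANSIONS = {
    r"\bCW\b": "Clockwise (CW)",
    r"\bCCW\b": "Counter-clockwise (CCW)",
    r"\bOD\b": "Outside Diameter (OD)",
}

def rewrite_description(text: str) -> str:
    """Expand terse industry abbreviations into long forms to vary the stock copy."""
    for pattern, long_form in EXPANSIONS.items():
        text = re.sub(pattern, long_form, text)
    return text

print(rewrite_description("Rotate the spindle CW until the OD gauge reads zero."))
# -> Rotate the spindle Clockwise (CW) until the Outside Diameter (OD) gauge reads zero.
```

Run it over a copy of the data first; a careless pattern can mangle thousands of rows in a single pass.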
Also, perhaps the stock descriptions have blocks of useless content you could strip out in the meantime?
The DB probably has a few other fields (name, product attributes, etc.), so be sure to find a unique way of assembling the meta description, title and details.
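Something along these lines is one way to do it - a hedged sketch only, where the field names, store name and sample values are placeholders for whatever your own database holds:

```python
def build_meta(product: dict) -> dict:
    """Assemble a title and meta description from fields the DB already holds,
    so the snippet differs from every other site reselling the same copy."""
    title = f"{product['brand']} {product['name']} {product['size']} | Example Store"
    description = (
        f"{product['brand']} {product['name']} in {product['colour']}, {product['size']}. "
        f"{product['key_attribute']}."
    )
    # Trim to typical display limits for titles and meta descriptions.
    return {"title": title[:60], "meta_description": description[:155]}

print(build_meta({
    "brand": "Acme", "name": "Widget Pro", "size": "40mm",
    "colour": "blue", "key_attribute": "Rated IP67 for outdoor use",
}))
```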
If you find enough to change, I'd think having the description would be better than having a page that is too light on words.
Be sure to mark up with http://schema.org/Product so search engines understand the nature of the content.
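A minimal sketch of generating that Product markup as JSON-LD from the same hypothetical fields (the currency, availability and sample values here are assumptions, not requirements of the vocabulary):

```python
import json

def product_jsonld(product: dict) -> str:
    """Render schema.org/Product markup as a JSON-LD script tag."""
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": product["name"],
        "brand": {"@type": "Brand", "name": product["brand"]},
        "sku": product["sku"],
        "description": product["description"],
        "offers": {
            "@type": "Offer",
            "price": str(product["price"]),
            "priceCurrency": "GBP",
            "availability": "https://schema.org/InStock",
        },
    }
    return '<script type="application/ld+json">' + json.dumps(data) + "</script>"

print(product_jsonld({
    "name": "Widget Pro", "brand": "Acme", "sku": "WP-40",
    "description": "Technical description goes here.", "price": 19.99,
}))
```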
EDIT: I have used the regex technique to enhance the content of a database by adding inline tooltips, diagrams or figures, and glossary links. However, with Penguin, I would be careful with automated links. You would only want to create a handful using the same anchor text.
EDIT2: I forgot - MAKE FREQUENT BACKUPS. Regex is super powerful and can tank a database really fast. Make a backup of the original and of every successful iteration - it will take a little longer, but it will save your butt when things go bad.
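A tiny sketch of that habit, assuming the product copy sits in a local SQLite file (swap in mysqldump or whatever fits your actual stack):

```python
import shutil
import time

# Take a timestamped copy before every regex pass so a bad pattern
# can be rolled back instantly.
stamp = time.strftime("%Y%m%d-%H%M%S")
shutil.copy2("products.db", f"products-backup-{stamp}.db")
```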
-
I would say use the content as is (regular text) and work on adding additional content on top of that. Most marketplaces and e-tailers (including Amazon) use the descriptions provided by the brands. Google understands that. The idea is to provide additional value on top of that content with things like user reviews and additional features that make your site stand out.
-
Wow, a really tough problem.
I would definitely go for the image, and then customise the copy around the image so you can still rank for those pages. If you go for noindex tags, you lose all optimisation opportunities.
Or, could you host the product description on a single domain and then link to that from all your relevant pages?
Related Questions
-
No: 'noindex' detected in 'robots' meta tag
Pages on my site show "No: 'noindex' detected in 'robots' meta tag". However, when I inspect the pages' HTML, it does not show noindex. In fact, it shows index, follow. The majority of pages show the error and are not indexed by Google... Not sure why this is happening. The page below in Search Console shows the error above...
Technical SEO | Sean_White_Consult
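When the report and the visible HTML disagree, the directive may be arriving via an X-Robots-Tag response header rather than the markup, or be injected only for some requests. A rough sketch of a check using the requests library (the URL is a placeholder and the meta regex is deliberately simple):

```python
import re
import requests

def robots_signals(url: str) -> dict:
    """Show where a noindex could be coming from: the X-Robots-Tag
    response header or a robots meta tag in the served HTML."""
    resp = requests.get(url, timeout=10)
    meta = re.findall(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
        resp.text,
        flags=re.IGNORECASE,
    )
    return {
        "status": resp.status_code,
        "x_robots_tag": resp.headers.get("X-Robots-Tag"),
        "meta_robots": meta,
    }

print(robots_signals("https://www.example.com/some-page/"))
```
-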
Content in Accordion doesn't rank as well as Content in Text box?
Does content rank better in a full view text layout, rather than in a clickable accordion? I read somewhere that because users need to click into an accordion it may not rank as well, as it may be considered hidden on the page - is this true? Accordion example (see features): https://www.workday.com/en-us/applications/student.html
Technical SEO | DigitalCRO
-
Duplicate Content Issues on Product Pages
Hi guys, just keen to gauge your opinion on a quandary that has been bugging me for a while now. I work on an ecommerce website that sells around 20,000 products. A lot of the product SKUs are exactly the same in terms of how they work and what they offer the customer. Often it is one variable that changes. For example, the product may be available in 200 different sizes and 2 colours (therefore 400 SKUs available to purchase). These SKUs have been uploaded to the website as individual entries so that the customer can purchase them, with the only difference between the listings likely to be key signifiers such as colour, size, price, part number, etc. Moz has flagged these pages up as duplicate content. Now, I have worked on websites long enough to know that duplicate content is never good from an SEO perspective, but I am struggling to work out an effective way in which I can display such a large number of almost identical products without falling foul of the duplicate content issue. If you wouldn't mind sharing any ideas or approaches that have been taken by you guys, that would be great!
Technical SEO | DHS_SH
-
Deleting 30,000 pages all at once - good idea or bad idea?
We have 30,000 pages that we want to get rid of. Each product within our database has its own page, and these particular 30,000 products are not relevant anymore. They have very little content on them and are basically the same exact page but with a few title changes. We no longer want them weighing down our database, so we are going to delete them. My question is - should we get rid of them in smaller batches, like 2,000 pages at a time, or is it better to get rid of all of them in one fell swoop? Which is least likely to raise a flag to Google? Anyone have any experience with this?
Technical SEO | Viewpoints
-
Best Way to Break Down Paginated Content?
(Sorry for my English) I have lots of user reviews on my website and in some cases there are more than a thousand reviews for a single product/service. I am looking for the best way to break these reviews down into several sub-pages. Here are the options I thought of:
1. Break the reviews down into multiple pages/URLs (http://www.mysite.com/blue-widget-review-page1, http://www.mysite.com/blue-widget-review-page2, etc.). In this case, each page would be indexed by search engines. Pros: all the reviews are getting indexed. Cons: it will be harder to rank for "blue widget review" as there will be many similar pages.
2. Break the reviews down into multiple pages/URLs with noindex + a canonical tag (http://www.mysite.com/blue-widget-review-page1, http://www.mysite.com/blue-widget-review-page2, etc.). In this case, each page would be set to noindex and the canonical tag would point to the first review page. Pros: only one URL can potentially rank for "blue widget review". Cons: sub-pages are not indexed.
3. Load all the reviews into one page and handle pagination using Javascript. Each block of reviews ("reviews, reviews, reviews", "more reviews, more reviews, more reviews", etc.) would sit in its own section of the page and be shown or hidden using Javascript when browsing through the pages. Could that be considered cloaking?!? Pros: all the reviews are getting indexed. Cons: large page size (kb) - maybe too large for search engines?
4. Load only the first page and load sub-pages dynamically using AJAX. Display only the first review page on initial load, then use AJAX to load additional reviews, similar to blog commenting systems where you have to click on "Load more comments" to see all the comments. Pros: fast initial loading time + faster loading time for sub-pages = better user experience. Cons: only the first review page is indexed by search engines.
My main competitor, who's achieving great rankings (no black hat of course), is using technique #3. What's your opinion?
Technical SEO | sbrault74
-
How to avoid duplicate content penalty when our content is posted on other sites too?
For recruitment company sites, their job ads are posted multiple times on their own sites and even on other sites too. These are the same ads (the job description is the same) posted on different sites. How do we avoid a duplicate content penalty in this case?
Technical SEO | Personnel_Concept
-
Duplicate Content
Many of the pages on my site are similar in structure/content but not exactly the same. What amount of content should be unique for Google to not consider it duplicate? If it is something like 50% unique would it be preferable to choose one page as the canonical instead of keeping them both as separate pages?
Technical SEO | theLotter
-
Duplicate content
I have two sentences that I want to optimize two different pages for. Sentence number one is "travel to ibiza by boat"; sentence number two is "travel to ibiza by ferry". My question is, can I have the same content on both pages except for the keywords, or will Google treat that as duplicate content and punish me? And if yes, where is the limit/border for duplicate content?
Technical SEO | stlastla