Duplicate Content - What's the best bad idea?
-
Hi all,
I have 1000s of products where the product description is very technical and extremely hard to rewrite or create a unique version of.
I'll probably have to use the content provided by the brands, which can already be found on dozens of other sites.
My options are:
- Use the Google on/off "don't index" tags
- Put the content in an image
Are there any other options?
We'd always write our own unique copy to go with the technical bit.
Cheers
-
This applies to the Google Mini or Search Appliance, which are custom search tools for an individual website.
They allow site owners to sculpt the indexing of their own private set-ups.
AdWords also has something similar to help indicate which content matters when determining the page topic for matching ads.
However, none of these apply to Googlebot spidering, as mentioned above.
-
Hi - I got the Google on/off tags idea from https://developers.google.com/search-appliance/documentation/46/admin_crawl/Preparing
| Tag | Description | Example | Result |
| --- | --- | --- | --- |
| index | Words between the tags are not indexed as occurring on the current page. | fish `<!--googleoff: index-->` shark `<!--googleon: index-->` mackerel | The words fish and mackerel are indexed for this page, but the occurrence of shark is not indexed. This page could appear in search results for the term shark only if the word appears elsewhere on the page or in anchor text for links to the page. Hyperlinks that appear within these tags are followed. |
I agree with Takeshi, but would also like to add that so-called "Google on/off tags" are a myth. What you have typed out would be an HTML comment (they begin with `<!--` and end with `-->`), which regular Googlebot simply ignores.
-
If the descriptions are very technical, then there is likely a fair amount of repetition in the sentence patterns, diction, etc. I'd recommend playing with regex to help transform the content into something more original.
For instance, you could search for industry abbreviations such as CW and replace them with the long form, **Clockwise (CW)**. Maybe they overuse an adjective that you could change to your own voice.
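To make that concrete, here's a rough sketch of the kind of find-and-replace pass I mean (the abbreviation map and the example sentence are hypothetical, not from any real catalogue):

```python
import re

# Hypothetical map of industry abbreviations to long forms
ABBREVIATIONS = {
    "CW": "Clockwise (CW)",
    "CCW": "Counter-Clockwise (CCW)",
    "OD": "Outside Diameter (OD)",
}

def expand_abbreviations(description: str) -> str:
    """Replace standalone abbreviations with their long forms."""
    for abbr, long_form in ABBREVIATIONS.items():
        # \b word boundaries avoid touching abbreviations buried inside other words
        description = re.sub(rf"\b{re.escape(abbr)}\b", long_form, description)
    return description

print(expand_abbreviations("Rotate the spindle CW until the OD gauge reads zero."))
# -> "Rotate the spindle Clockwise (CW) until the Outside Diameter (OD) gauge reads zero."
```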
Also, perhaps the stock descriptions have blocks of useless content you could strip out in the meantime?
The DB probably has a few other fields (name, product attributes, etc.), so be sure to find a unique way of assembling the meta description, title and details.
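A minimal sketch of that assembly step, assuming hypothetical field names and a made-up selling point - the idea is just that every page's title and meta description get built from several fields rather than copied from the shared brand description:

```python
def build_meta(product: dict) -> dict:
    """Assemble a title and meta description from individual product fields."""
    title = f"{product['name']} - {product['brand']} {product['category']}"
    description = (
        f"{product['name']} by {product['brand']}: "
        f"{product['material']}, {product['size']}. "
        f"In stock and ready to ship."
    )
    # Trim to roughly what search engines display for a meta description
    return {"title": title, "meta_description": description[:160]}

example = {
    "name": "M8 Hex Bolt",
    "brand": "Acme",
    "category": "Fasteners",
    "material": "A2 stainless steel",
    "size": "8 mm x 40 mm",
}
print(build_meta(example))
```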
If you find enough to change, I'd think having the description would be better than having a page that is too light on words.
Be sure to mark the pages up with http://schema.org/Product so search engines understand the nature of the content.
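For example, a small sketch that renders schema.org/Product structured data as JSON-LD (the product values are placeholders; the real vocabulary also covers offers, reviews, images and more):

```python
import json

def product_jsonld(product: dict) -> str:
    """Render schema.org/Product structured data as a JSON-LD script tag."""
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": product["name"],
        "brand": {"@type": "Brand", "name": product["brand"]},
        "sku": product["sku"],
        "description": product["description"],
    }
    return f'<script type="application/ld+json">{json.dumps(data)}</script>'

print(product_jsonld({
    "name": "M8 Hex Bolt",
    "brand": "Acme",
    "sku": "ACM-M8-40",
    "description": "A2 stainless steel hex bolt, 8 mm x 40 mm.",
}))
```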
EDIT: I have used the regex technique to enhance the content of a database by adding inline tooltips, diagrams or figures, and glossary links. However, with Penguin, I would be careful with automated links - you would only want to create a handful using the same anchor text.
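Here's a sketch of that glossary-link idea with the caution built in - `count=1` links each term only once per description, and the glossary terms and URLs are made up for illustration:

```python
import re

# Hypothetical glossary terms and their (made-up) URLs
GLOSSARY = {
    "anodised": "/glossary/anodising",
    "torque": "/glossary/torque",
}

def add_glossary_links(description: str) -> str:
    """Link the first occurrence of each glossary term, and no more."""
    for term, url in GLOSSARY.items():
        description = re.sub(
            rf"\b({re.escape(term)})\b",
            rf'<a href="{url}">\1</a>',
            description,
            count=1,  # only one automated link per term, per description
            flags=re.IGNORECASE,
        )
    return description

print(add_glossary_links("An anodised finish resists wear; check the torque spec before fitting."))
```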
EDIT2: I forgot - MAKE FREQUENT BACKUPS. Regex is super powerful and can tank a database really fast. Make a backup of the original and of every successful iteration - it will take a little longer, but it will save your butt when things go bad.
-
I would say use the content as is (regular text) and work on adding additional content on top of that. Most marketplaces and e-tailers (including Amazon) use the descriptions provided by the brands, and Google understands that. The idea is to provide additional value on top of that content with things like user reviews and extra features that make your site stand out.
-
Wow, a really tough problem.
I would definitely go for the image, and then customise the copy around the image so you can still rank for those pages. If you go for noindex tags, you lose all optimisation opportunities.
Or, could you host the product description on a single domain and then link to that from all your relevant pages?