Will duplicate product information paragraphs negatively impact our site?
-
We are selling paint and have separate pages for different colour cans, each with their own unique description.
We would like to include a few additional paragraphs of product information below each description, but this content will be identical across all the products. Do you think this duplicate content will be a problem?
-
I wouldn't say there's a massive chance of a penalty here. That being said, it's an area where you could be 'adding value' and uniqueness to your pages and you're not doing it, so your pages may be 'less competitive' and you may be missing out. It's more of a missed competitive opportunity than an 'error' per se.
In reality you should have one product page per product, with 'product variants' for things like quantity, size, colour, etc. On the modern web people find this easier to navigate, and since many sites do offer it, those sites might seem like more competitive places to shop for paint cans than yours. Price does matter, but it's not the sole arbiter of how products are ranked in Google's search results; other stuff matters too. Unless you have a virtual monopoly on the product (only you can sell it, or only you can sell it at a greatly discounted price due to a special relationship with the supplier), I would look hard at the UX and design of your site. No one wants an 'arse-ache' of a browsing experience.
Many tools will flag what you are about to do as duplicate content, and they're technically right. But instead of going on some crazy copywriting crusade, think about the architecture of your site. You can still have separate URLs for different product variations if you want, even via URL parameters (though that's a bit of a 'basic' implementation). If you make it clear to Google through a new, more streamlined architecture that they're all actually the same product, the duplicate description(s) won't matter as much (though they'll still be a missed opportunity for more diverse rankings IMO).
You can make it even more apparent to Google that all the different variations are actually the 'same product' by using Product schema and some of the deeper stuff like ProductModel, which will bind it all together. Whatever you implement, test it with Google's structured data testing tool; if it throws errors and warnings, keep working away until they're all fixed.
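As a rough sketch of what that markup could look like (the product names, SKU, and URLs below are invented placeholders, not a definitive implementation), each colour page can declare itself a variant of one parent model via schema.org's isVariantOf property:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ProductModel",
  "name": "Example Emulsion Paint - Sage Green",
  "sku": "EX-EMUL-SAGE",
  "description": "The unique per-colour description goes here.",
  "isVariantOf": {
    "@type": "ProductModel",
    "name": "Example Emulsion Paint",
    "url": "https://www.example.com/paint/emulsion/"
  }
}
</script>

The point of the isVariantOf link is that every colour page names the same parent, so Google can group them as one product even though the shared paragraphs repeat.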
Canonical tags are another option, but they will shrink your ranking 'footprint', and in this case I wouldn't recommend them despite the 'slight' content duplication risks (which in reality are mostly negligible).
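For completeness, if you ever did go the canonical route, it's a single tag in the <head> of each colour variant pointing at whichever URL you'd want to consolidate rankings to (the URL here is a placeholder):

<link rel="canonical" href="https://www.example.com/paint/emulsion/" />

That consolidation is exactly why the footprint shrinks: only the canonical target tends to rank, not the individual colour pages.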
Final note: you say you have 'unique' descriptions, but remember if they're used elsewhere online they're not unique. If they're unique internally that's great, but if you got them all from a supplier then... obviously loads of other sites are probably using them, which could easily be a big issue for you
-
Hi Justin,
Great question. To help answer it, I'll use a quote from Google's support document on duplicate content:
https://support.google.com/webmasters/answer/66359?hl=en
"Examples of non-malicious duplicate content could include:
- Discussion forums that can generate both regular and stripped-down pages targeted at mobile devices
- Store items shown or linked via multiple distinct URLs
- Printer-only versions of web pages"
I think your situation would likely fall into the same 'acceptable' category as the store items example I highlighted. Keep in mind that although duplicate content should be avoided where possible, Google does NOT actually penalize sites for having it.
I would still try to keep the overall amount of duplicate content to a minimum, but it shouldn't be too big of an issue. Keep using the unique descriptions; in that case, you likely won't have to worry too much about the duplicate content.
I hope that helps!
Best,
Alex Ratynski
-
Hi Joe,
Thanks for your help. It would probably be about 50%, but we could look to make it more like 80% unique content if you think that will help.
-
Hello,
How much of the copy is unique per page?
With regard to content originality, the general rule I've worked to is 80% unique content per page.
Related Questions
-
Will 301s to Amazon Hurt Site?
We have 155 podcasts, and in many we have affiliate links to Amazon. But I recently found out that one of the two products we are promoting is no longer available, so I now have to fix many podcast descriptions. My thought is to build a link like financiallysimple.com/camera and 301 it to the Amazon product. That way, if the product changes, I simply change where the 301 points. Simple. BUT my question is: does bouncing people offsite immediately hurt us? Are there any other options that will accomplish the same goal?
Thanks!
Technical SEO | jgoethert
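A minimal .htaccess sketch of the idea described above, assuming an Apache host (the Amazon URL is a placeholder to swap for the live product link):

# One editable line per product; repoint it whenever the product changes
Redirect 301 /camera https://www.amazon.com/dp/PLACEHOLDER-ASIN
-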
I have a question about the impact of a root domain redirect on site-wide redirects and slugs.
I have a question about the impact (if any) of site-wide redirects for DNS/hosting change purposes. I am preparing to redirect the domain for a site I manage from https://siteImanage.com to https://www.siteImanage.com. Traffic to the site currently redirects in reverse, from https://www.siteImanage.com to https://siteImanage.com. Based on my research, I understand that making this change should not affect the site’s excellent SEO as long as my canonical tags are updated and a 301 redirect is in place. But I wanted to make sure there wasn’t a potential consequence of this switch I’m not considering. Because this redirect lives at the root of all the site’s slugs and existing redirects, will it technically produce a redirect chain or a redirect loop? If it does, is that problematic? Thanks for your input!
Technical SEO | mollykathariner_ms
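For what it's worth, a host-level rule like the .htaccess sketch below (using the question's placeholder domain, and assuming Apache) redirects every slug in a single hop, so no chain is created as long as the old www-to-bare rule is removed first:

RewriteEngine On
# Match any request arriving on the bare domain, case-insensitively
RewriteCond %{HTTP_HOST} ^siteimanage\.com$ [NC]
# 301 to the www host, keeping the requested path (query strings carry over by default)
RewriteRule ^(.*)$ https://www.siteImanage.com/$1 [R=301,L]
-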
Will redirecting a logged in user from a public page to an equivalent private page (not visible to google) impact SEO?
Hi, We have public pages that can obviously be visited by our registered members. When they visit these public pages while logged in to our site, we want to redirect them to the equivalent (richer) page on the private site, e.g. a logged-in user visiting /public/contentA will be redirected to /private/contentA. Note: our /public pages are indexed by Google whereas /private pages are excluded. a) Will this affect our SEO? b) If not, is 302 the best HTTP status code to use? Cheers
Technical SEO | bernienabo
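A rough sketch of the 302 approach at the .htaccess level, purely illustrative (the cookie name is an assumption, and most stacks would do this check in application code where the real session lives):

RewriteEngine On
# If a session cookie marks the visitor as logged in (hypothetical cookie name)...
RewriteCond %{HTTP_COOKIE} (^|;\s*)logged_in=1 [NC]
# ...send them from the public page to the private equivalent with a temporary 302
RewriteRule ^public/(.*)$ /private/$1 [R=302,L]
-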
Redirecting old html site to new wordpress site
Hi, I'm currently updating an old (8-year-old) HTML site to WordPress, and about a month ago I redirected some URLs to the new site (which is in a directory) like this:
Redirect 301 /article1.htm http://mysite.net/wordpress/article1/
Redirect 301 /article2.htm http://mysite.net/wordpress/article2/
Redirect 301 /article3.htm http://mysite.net/wordpress/article3/
Google has indexed these new URLs and they are showing in search results. I'm almost finished with the new version of the site, which currently lives in the /wordpress directory. I intend to move all the files from that directory to the root, so the new URL when this is done will be http://mysite.net/article1/ etc. My question is: what do I do about the redirects which are in place? Do I delete them and replace them with something like this?
Redirect 301 /wordpress/article1/ http://mysite.net/article1/
Redirect 301 /wordpress/article2/ http://mysite.net/article2/
Redirect 301 /wordpress/article3/ http://mysite.net/article3/
Appreciate any help with this.
Technical SEO | briandee
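One hedged suggestion along these lines: rather than stacking a second hop on top of the first, edit the original rules so each URL 301s straight to its final root-level home, keeping every path to a single hop, e.g.:

# Old .htm URLs now point straight at the final location (no chain)
Redirect 301 /article1.htm http://mysite.net/article1/
# The interim /wordpress/ URLs Google indexed get their own single hop
Redirect 301 /wordpress/article1/ http://mysite.net/article1/
-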
When will all of Google Maps be the same again?
As many of you are aware, the Pigeon update was only applied to the new Google Maps, resulting in very different search results for Google local businesses. When you search for a business on the old Google Maps you get totally different results vs the new Google Maps, and some businesses disappeared from the results completely. I have done my research and found out that it's because the new algorithm was only applied to the new Maps. The new algorithm also does not apply to other countries. The reason I posted this topic is that all the new Google Business listings I am verifying for my clients are being put under the old Google Maps and not the new ones: they come up fine when searching from the old Maps but not the new ones. I understand Google has not rolled out Pigeon on all data centers, but why? Will Google eventually roll out the update to the old Maps? And since Google is adding businesses to the old Google Maps, what's the point of even adding new listings?
Technical SEO | bajaseo
-
Content Duplication - Zencart
Hi guys! Based on crawler results, it shows that I have 188 duplicate content pages, and for some of them I am not able to understand where the duplication is. Each page created is unique, all the URLs are static, and all titles and meta tags are unique. How do I remove this duplication? I am using Zen Cart as a platform. Thanks in advance for the help! 🙂
Technical SEO | sidjain4you
-
"Fourth-level" subdomains. Any negative impact compared with regular "third-level" subdomains?
Hey Moz, A new client has a site that uses: subdomains ("third-level" stuff like location.business.com) and "fourth-level" subdomains (location.parent.business.com). Are these fourth-level addresses at risk of being treated differently than the other subdomains? Screaming Frog, for example, doesn't return these fourth-level addresses when doing a crawl for business.com, except in the External tab. But maybe I'm just configuring the crawls incorrectly. These addresses rank, but I'm worried that we're losing some link juice along the way. Any thoughts would be appreciated!
Technical SEO | jamesm5i
-
Duplication Penalty through Specs?
I am trying to figure out how to correct a recently incurred duplication penalty on a partner site. I didn't see any posts on this yet specific to my problem. The site used to be ranked on page 1 of Google for all important keywords, but now many pages have been bumped to position 100 or lower due to duplication issues. This is an aviation site discussing airplanes; each page covers a different model, and each page also lists the specs of the plane. While the data values differ for each plane, the specification terms are the same, see here:
Primary Function:
Crew:
Engine:
Thrust:
Weight Empty:
Max. Weight:
Length:
Wingspan:
Cruise Speed:
Max. Speed:
Climb:
Ceiling:
Range:
First Flight:
Year Deployed:
Is there an easy way to get Google to exclude these terms (not the data in the second column) from the page analysis, to prevent the duplication issues we are seeing? Thanks in advance!
Technical SEO | WizardHQ