Having a Size Chart and Personalization Descriptions on each page - Duplicate Content?
-
Hi everyone,
I'm currently coding a Shopify store theme, and we want to show customers size comparisons and personalization options for each product. It would be a great UX addition, since those are the two most common questions we get through customer support.
My only concern is that Google might flag it as duplicate content, since the same block will be visible on every product page.
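For context, this is roughly how I was planning to wire it into the theme - just a sketch, and the snippet names are placeholders, not our actual files:

```liquid
{% comment %}
  Rough sketch only - snippet names are placeholders.
  The shared size chart and personalization info live in reusable snippets
  that the product template renders on every product page.
{% endcomment %}
{% render 'size-chart', product: product %}
{% render 'personalization-options', product: product %}
```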
What are your thoughts and/or suggestions?
Thank you so much in advance.
-
This Matt Cutts video is from 2013, but I think it still applies today. I ran into the same issue once regarding terms & conditions, asked a question very similar to yours, and somebody on this message board shared it with me. Hope it helps. Long story short: as long as you're not being spammy about it, you should be fine. And since the content is actually useful to your users, I'd encourage you to include it!
-
Dirk,
Well, it's essentially a size chart with a description of the sizes below it, plus personalization examples followed by a description of the options.
Yeah, my plan was an "on tab click" handler that pulls the data into the tab, and that matches what I'd read about Googlebot having a good understanding of JavaScript. Thanks for the clarification!
I just didn't want it to hurt us, since we want to be as relevant as possible in search.
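For anyone curious, this is roughly the pattern I'm going with - a sketch with placeholder class names. The size chart markup is rendered in the page and hidden, and the click handler only toggles visibility, so the content stays in the DOM for Googlebot to render:

```javascript
// Rough sketch - class names and IDs are placeholders, not our real theme code.
// The size chart / personalization HTML is already in the page (hidden),
// so clicking a tab only toggles visibility instead of fetching new content.
document.querySelectorAll('.product-tab').forEach(function (tab) {
  tab.addEventListener('click', function () {
    var panel = document.getElementById(tab.dataset.panel);
    if (!panel) return;

    // Hide every panel, then reveal the one tied to the clicked tab.
    document.querySelectorAll('.product-tab__panel').forEach(function (p) {
      p.hidden = true;
    });
    panel.hidden = false;
  });
});
```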
-
Duplicate content is assessed at the page level - just adding a few elements that are common to every page will not hurt, especially if it's content your customers are looking for on the page (I assume the product description is present and has a certain "body" to it - not just a one-line description).
Putting it in JavaScript wouldn't make a big difference - think of Googlebot as a small version of Chrome that is able to render JavaScript (which is also the reason you shouldn't block JavaScript resources from Googlebot).
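As an illustration - the directory paths here are made up, yours will differ - this is the kind of robots.txt rule to avoid, because it keeps Googlebot from rendering the page the way a user sees it:

```
# Example only - paths are hypothetical.
# Rules like these stop Googlebot from fetching the theme's scripts and styles,
# so it can't render tabs or other JavaScript-driven content.
User-agent: Googlebot
Disallow: /assets/js/
Disallow: /assets/css/
```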
Apart from that, duplicate content is not a cause for punishment (except in extreme cases) - the main difficulty with duplicate content is that Google decides which version of the content to show, and that doesn't always correspond with your preferred version.
Dirk
-
Here's a scenario after some continued research: display the content only when these particular tabs are clicked. I've heard conflicting claims about whether Google can pull content rendered via JavaScript, but nothing set in stone.
If anyone can advise, it would be greatly appreciated.