How can I tell Google not to index a portion of a webpage?
-
I'm working with an ecommerce site that has many product descriptions for various brands that are important to have but are all straight duplicates. I'm looking for some type of tag that can be implemented to prevent Google from seeing these as duplicates while still allowing the page to rank in the index. I thought I had found it with the googleon/googleoff tags, but it appears these are only used by the Google Search Appliance hardware.
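For reference, those tags look like this (a minimal sketch; only the Google Search Appliance honors them, so they don't help with regular Google search):

<!--googleoff: index-->
<p>Duplicated product description that the appliance would leave out of its index.</p>
<!--googleon: index-->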
-
Correct, you should make sure /iframes/ is not used elsewhere.
But I can't refrain from stressing again that hiding the content is unlikely to be the best strategy.
-
So what should it look like in the code?
If my area to block was a product description, it might say:
"Product Description
bla bla bla bla bla bla bla bla bla bla bla bla bla bla bla bla bla bla bla"
Secondly, in robots.txt, if I disallow /iframes/, then I would need to make sure we are not using /iframes/ anywhere else?
-
As Chris Painter pointed out, you shouldn't worry too much about duplicate content if your site is legit (not an autoblog, for example). And if you really want to hide it from Google, iframes are the way to go.
-
Matt Cutts has said (source: http://youtu.be/Vi-wkEeOKxM) not to worry too much about duplicate content, especially given the sheer volume of it on the internet. You may end up looking like you are trying to cheat Google, which could cause you a bigger headache, and you may also slow your webpage down. Duplicate content isn't the worst enemy in SEO. If you are worried, put all the effort of trying to hide stuff from Google into making the product descriptions unique instead.
-
Hello Brad,
Just to get your question clear: am I correct that you want a method that doesn't let Google (and other search engines) see that a portion of your pages is duplicated, while still allowing both the duplicated pages and the original pages to rank in the SERPs?
If you could provide us with an example (link), that would help a great deal as well.
-
Hi Brad,
You can prevent Google from seeing portions of the page by putting those portions in iframes that are blocked by robots.txt:
Disallow: /iframes/
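A minimal sketch of the full setup, with hypothetical paths and file names. On the product page, the duplicated description moves into an iframe:

<iframe src="/iframes/product-description.html" title="Product description" width="600" height="250"></iframe>

The file /iframes/product-description.html holds the duplicated text as a plain HTML page:

<!DOCTYPE html>
<html>
<head><title>Product Description</title></head>
<body>
<p>bla bla bla bla bla bla bla bla bla bla bla bla</p>
</body>
</html>

And robots.txt blocks crawlers from fetching anything under /iframes/ (the Disallow line needs a User-agent group above it):

User-agent: *
Disallow: /iframes/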
Thanks
-
You can iframe those chunks of content and block them with robots.txt, or just put a noindex meta tag in the iframe source.
But I would not. If you can't build a plan to make the content unique, just canonicalize, or let Google choose which page to pick from the duplicate bunch.
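A sketch of those two alternatives (the URLs are placeholders):

In the iframe source page, a robots meta tag keeps that page out of the index:

<meta name="robots" content="noindex">

Or, on each duplicated page, a canonical tag tells Google which version to treat as the original:

<link rel="canonical" href="https://www.example.com/original-product-page" />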
-
Related Questions
-
Website homepage temporarily getting removed from Google index
Hi, website: www.snackmagic.com. The home page drops out of Google's index for some hours and then comes back. We are not sure why our home page is getting de-indexed temporarily. This doesn't happen with other pages on our website. It has been happening intermittently at intervals of 2-3 days. Any inputs will be very useful for us in debugging this issue. Thanks
Technical SEO | | manikbystadium0 -
Indexed pages
Just started a site audit and trying to determine the number of pages on a client site, and whether there are more pages being indexed than actually exist. I've used four tools and got four very different answers:
Google Search Console: 237 indexed pages
Google search using the site: command: 468 results
Moz Site Crawl: 1013 unique URLs
Screaming Frog: 183 page titles, 187 URIs (note this is a free licence, but it should cut off at 500)
Can anyone shed any light on why they differ so much? And where lies the truth?
Technical SEO | | muzzmoz1 -
Get List Of All Indexed Google Pages
I know how to run site:domain.com but I am looking for software that will put these results into a list and return server status (200, 404, etc). Anyone have any tips?
Technical SEO | | InfinityTechnologySolutions0 -
Index problems
The website http://www.vaneyckshutters.com/nl/ does not show in the index of Google (site:vaneyckshutters.com/nl/). This should be the homepage for the Netherlands. Previously, the page www.vaneyckshutters.com was redirected to /nl/. That page is now accessible again, with a canonical tag pointing to http://www.vaneyckshutters.com/nl/, in the hope of letting /nl/ be indexed. When we look at the SERPs for the keyword 'shutters', the page http://www.vaneyckshutters.com/ is shown in Google.nl at #32 and in Belgium at #3. Problem and question: why has /nl/ not been indexed properly, and why do we rank with http://www.vaneyckshutters.com for 'shutters' instead of the /nl/ page?
Technical SEO | | Happy-SEO1 -
Can I have an http AND an https site in Google Webmaster Tools?
My website is https, but the default property configured in Google WMT was http, and it wasn't showing me any information because of that. I added an https property for the site, but my question is: do I need to delete the original http property, or can I leave both?
Technical SEO | | Onboard.com0 -
What is the best practice to re-index de-indexed pages after a bad migration?
Dear Mozers, We have a Drupal site with more than 200K indexed URLs. Six months ago a bad website migration happened without proper SEO guidelines: all the high-authority URLs got rewritten by the client, and most of them have returned 404 or 302 for the last six months. Because of this, site traffic dropped more than 80%. I found today that around 40K old URLs with good PR and authority have been de-indexed from Google (most of them 404s and 302s). I need to pass all the value from the old URLs to the new URLs. Example URL structure:
Before migration (old): http://www.domain.com/2536987 (Page Authority: 65, HTTP status: 404, de-indexed from Google)
After migration (current): http://www.domain.com/new-indexed-and-live-url-version
Does creating mass 301 redirects help here without re-indexing the old URLs? (See the sketch after this question.) Please share your thoughts. Riyas
Technical SEO | | riyas_0 -
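For what those mass 301 redirects could look like, a minimal Apache .htaccess sketch, reusing the hypothetical URLs from the question (Drupal sites often manage this with the Redirect module instead):

# One permanent redirect per old URL, passing its value to the new location
Redirect 301 /2536987 http://www.domain.com/new-indexed-and-live-url-version

Each old URL that currently returns 404 or 302 gets its own 301 to the matching new URL; when Google re-crawls the old URL and sees the 301, it should consolidate the old URL's signals onto the new one.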
Can You Use More Than One Google Local Rich Snippet on a Single Site / on a Single Page?
I am currently working on a website for a business that has multiple office locations. As I am trying to target all four locations, I was wondering if it is okay to have more than one local rich snippet on a single page. (For example, they list all four locations and addresses within their footer, and I was wondering if I could mark these up as local rich snippets.) What about having more than one on a single website? For example, if a company has multiple offices located in several different cities and has set up individual contact pages for these cities, can each page have its own local rich snippet? Will Google look at these multiple local rich snippets as spamming, or will it recognize the multiple locations and count them towards their local SEO?
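For illustration, a minimal sketch of one location's markup using schema.org LocalBusiness in JSON-LD (the business details are made up); each office would get its own block with its own address:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Co (Barrie office)",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Barrie",
    "addressRegion": "ON",
    "addressCountry": "CA"
  },
  "telephone": "+1-705-555-0100"
}
</script>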
Technical SEO | | webdesignbarrie1 -
Google Search Parameters
Couple of quick questions. Is using the parameter pws=0 still useful for turning off personalization? Is there a way to set my location as a URL parameter as well? For instance, I want to set my location to the United States; can this be done with a URL param the same way as pws=0?
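For illustration, assuming pws=0 still works as it historically has and combining it with the gl parameter (which sets the country), such a query URL might look like:

https://www.google.com/search?q=plumbers&pws=0&gl=us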
Technical SEO | | nbyloff0