How can I tell Google not to index a portion of a webpage?
-
I'm working with an ecommerce site that has many product descriptions for various brands that are important to have but are all straight duplicates. I'm looking for some type of tag that can be implemented to prevent Google from seeing these as duplicates while still allowing the page to rank in the index. I thought I had found it with the googleon/googleoff tags, but it appears that these are only used with the Google Search Appliance hardware.
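For reference, the tags I found look roughly like this; as far as I can tell they are only honored by the Google Search Appliance crawler, not by regular Googlebot, and the surrounding markup here is just an illustration:

<p>Unique copy that should be indexed.</p>
<!--googleoff: index-->
<p>Boilerplate brand description I would like excluded.</p>
<!--googleon: index-->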
-
Correct, you should make sure /iframes/ is not used for anything else.
But I can't refrain from stressing again that hiding the content is unlikely to be the best strategy.
-
So what should it look like in the code?
If my area to block was a product description, it might say:
"Product Description
bla bla bla bla bla bla bla bla bla bla bla bla bla bla bla bla bla bla bla"
Secondly, if I disallow /iframes/ in robots.txt, then I would need to make sure we are not using that /iframes/ directory for anything else?
-
Just as Chris Painter pointed out, you shouldn't worry too much about duplicate content if your site is legit (not an autoblog, for example). If you really want to hide it from Google, iframes are the way to go.
-
Matt Cutts has said (source: http://youtu.be/Vi-wkEeOKxM) not to worry too much about duplicate content, especially given the sheer volume of it on the internet. You may end up looking more like you are trying to cheat Google, which could cause you a bigger headache, not to mention that you may slow your webpage down; duplicate content isn't the worst enemy for SEO. If you are worried, put the effort of trying to hide content from Google into making the product descriptions unique.
-
Hello Brad,
Just to make sure I understand your question:
Am I correct that you want a method that lets Google (and other search engines) know that a portion of your pages is duplicated, while you want both the duplicated pages and the original pages to rank in the SERPs?
If you could provide us with an example (link), that would help a great deal as well.
-
Hi Brad,
You can prevent Google from seeing portions of the page by putting those portions in iframes that are blocked by robots.txt:
Disallow: /iframes/
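For example (a rough sketch; the /iframes/ path and file name are just placeholders), the duplicated description lives in its own HTML file, the product page pulls it in with an iframe, and robots.txt blocks the whole directory:

<!-- /iframes/brand-product-description.html holds the duplicated copy -->
<p>Product Description</p>
<p>bla bla bla bla bla bla bla</p>

<!-- on the product page itself, embed the blocked file -->
<iframe src="/iframes/brand-product-description.html" title="Product description" width="100%" height="300"></iframe>

# robots.txt at the site root
User-agent: *
Disallow: /iframes/

Because Googlebot is not allowed to crawl anything under /iframes/, the description text is not read as part of the product page, while the rest of the page can still be crawled, indexed and ranked.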
Thanks
-
You can iframe those chunks of content and block them with robots.txt, or just add a noindex meta tag to the iframe source.
But I would not. If you can't build a plan to make the content unique, just canonicalize, or let Google choose which page to pick among the duplicate bunch.
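To illustrate the noindex alternative (again just a sketch, with a placeholder file name), the page loaded inside the iframe carries the robots meta tag instead of being blocked in robots.txt:

<!-- /iframes/brand-product-description.html -->
<!DOCTYPE html>
<html>
<head>
<meta name="robots" content="noindex">
<title>Product description</title>
</head>
<body>
<p>bla bla bla bla bla bla bla</p>
</body>
</html>

And canonicalizing is just a link element in the head of each duplicate page pointing at the version you want to rank (the URL here is hypothetical):

<link rel="canonical" href="https://www.example.com/brands/original-product-page">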