Is using JavaScript-injected text in line with best practice for making blocks of text non-crawlable?
-
I have an ecommerce website that has common text on all the product pages, e.g. delivery and returns information.
Is it OK to use non-crawlable, JavaScript-injected text to make this content invisible to search engines? Or is this method frowned upon by Google?
By way of background: I'm concerned about duplicate/thin content, so I want to tackle this both by reducing this 'common text' and by boosting the unique content on these pages.
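Roughly what I have in mind (a minimal sketch; the /snippets/ path is hypothetical, and the idea would be to disallow it in robots.txt):

    <div id="delivery-info"></div>
    <script>
      // Fetch the shared delivery/returns text after page load from a
      // path that robots.txt disallows, so it isn't part of the HTML
      // that search engines crawl.
      var xhr = new XMLHttpRequest();
      xhr.open('GET', '/snippets/delivery-info.html');
      xhr.onload = function () {
        document.getElementById('delivery-info').innerHTML = xhr.responseText;
      };
      xhr.send();
    </script>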
Any advice would be much appreciated.
-
I haven't found standard shipping-info text to be much of a problem in terms of duplicate content on product pages, but if you're concerned about it, I would consider an iframe.
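Something like this, where the shared text lives at its own URL and that path is disallowed in robots.txt (the /snippets/ path here is just illustrative):

    <!-- Shared delivery/returns text served from its own page; disallow
         /snippets/ in robots.txt so it isn't crawled with every product. -->
    <iframe src="/snippets/delivery-and-returns.html"
            title="Delivery and returns information"
            width="100%" height="250">
      <a href="/snippets/delivery-and-returns.html">Delivery and returns information</a>
    </iframe>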
While hiding the content in a non-crawlable .js file might work, it isn't advisable. First, Google PreviewBot can parse JavaScript and will see the content in many cases, since it has to render the page to generate the site preview and visual cache. Second, anything that keeps Google from properly rendering the page may be considered suspect.
Please watch this video by Matt Cutts: http://www.youtube.com/watch?v=B9BWbruCiDc
Essentially, he says: "Don't block Googlebot from CSS or JavaScript."
Related Questions
-
Using # in parameters?
I am trying to understand why a website would use # instead of ? for its parameters. I have put an example URL below:
http://www.warehousestationery.co.nz/office-supplies/adhesives-tapes-and-fastenings#prefn1=brand&prefn2=colour&prefv1=Command&prefv2=Clear
Any help would be much appreciated.
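For context, a minimal sketch of the difference (parameter names taken from the example URL): everything after ? is sent to the server and seen by crawlers, while everything after # stays in the browser.

    <script>
      // Query string (?key=value): sent to the server, visible to crawlers.
      var query = new URLSearchParams(window.location.search);
      // Fragment (#key=value): never sent to the server; only client-side
      // scripts, such as the store's filtering code, can read it.
      var params = new URLSearchParams(window.location.hash.slice(1));
      console.log(params.get('prefv1')); // "Command" on the example URL
    </script>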
Technical SEO | CaitlinDW
-
Best Practice for Out-of-Date Sales and Events
Hi everyone,
http://www.palaceresorts.com/cozumelpalace/en/cozumel-ironman-triathlon
I found an out-of-date event in the client's crawl report that returns a 404 Not Found status code. I remember reading an article advising webmasters never to remove such a landing page, but instead to note on it that the sale/event has expired and add some information about the upcoming event. Has anyone had this experience before? Could you share a real-world example? Thank you.
Technical SEO | Ariadna-US
-
Google not using redirect
We have a GEO-IP redirect in place for our domain, so that users are pointed to the subfolder relevant to their region. For example, visit example.com from the UK and you will be redirected to example.com/uk. This works fine when you manually type the domain into your browser; however, if you search for the site and come through to example.com, you end up at example.com. I didn't think this was too much of an issue, but our /uk and /au subfolders are not ranking at all in Google, even for branded keywords. I'm wondering if the fact that Google isn't picking up the redirect means that the pages aren't being indexed properly? Conversely, our US region (example.com/us) is ranking well. Has anyone encountered a similar issue?
Technical SEO | ahyde
-
Deindexed site - is it best to start over?
A potential client's website has been deindexed from Google. We'd be completely redesigning his site with all-new content. Would it be best to purchase a new URL and redirect the old deindexed site to the new one, or to stick with the old domain?
Technical SEO | WillWatrous
-
Using rich snippets with a CMS?
My site uses the webEdition CMS, which is split between templates and documents (pages); a template is responsible for a number of documents. If I wish to use rich snippets, how and where can I add the microdata, given that I can't do it on a page-by-page basis? My CMS documentation and its developer forum don't give any information on this, but I'm hoping it's a common problem for open-source CMSs and there is an easy fix. I live in hope! Iain
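For illustration, the microdata would normally go in the template itself, so every document generated from it inherits the markup. A sketch, with {{...}} as stand-in placeholders rather than webEdition's real tag syntax:

    <div itemscope itemtype="http://schema.org/Product">
      <h1 itemprop="name">{{product_name}}</h1>
      <p itemprop="description">{{product_description}}</p>
      <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
        <span itemprop="price">{{product_price}}</span>
      </div>
    </div>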
Technical SEO | iain
-
Block Quotes and Citations for duplicate content
I've been reading about the proper use of block quotes and citations lately, and wanted to see if I was interpreting it the right way. This is what I read: http://www.pitstopmedia.com/sem/blockquote-cite-q-tags-seo
So basically my question is: if I wanted to reference Amazon's or another store's product reviews, could I use the blockquote and citation tags around their content so it doesn't look like duplicate content? I think it would be great for my visitors, but also for the source, as I am giving them credit. It would also be a good source to link to on my product pages, as I am not competing with the manufacturer for sales. I could also do this for product information straight from the manufacturer.
I want to do this for a contact lens site. I'd like to use Acuvue's reviews from their website, as well as some of their product descriptions. Of course I have my own user reviews and content for each product on my website, but I think some official copy could do well. Would this be the best method? Is this how Rottentomatoes.com does it? On every movie page they have 2-3 sentences from 50 or so reviews, and not much unique content of their own.
Cheers, Vinnie
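For illustration, the markup pattern the linked article describes looks roughly like this (the quoted text and URLs are placeholders):

    <blockquote cite="http://www.acuvue.com/example-product">
      <p>These lenses stayed comfortable through a full day of wear.</p>
    </blockquote>
    <p>Source: <a href="http://www.acuvue.com/example-product">Acuvue product reviews</a></p>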
Technical SEO | vforvinnie
-
Our Development team is planning to make our website nearly 100% AJAX and JavaScript. My concern is crawlability or lack thereof. Their contention is that Google can read the pages using the new #! URL string. What do you recommend?
Discussion around AJAX implementations and whether anybody has achieved high rankings with a fully, or even partially, AJAX website.
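For reference, under Google's AJAX crawling scheme a #! URL is fetched by the crawler in a rewritten form, and the server is expected to return an HTML snapshot of the page. A sketch, with hypothetical URLs:

    <!-- URL the user's browser shows:
           http://www.example.com/catalog#!category=ales
         URL Googlebot requests instead, expecting an HTML snapshot:
           http://www.example.com/catalog?_escaped_fragment_=category=ales -->
    <script>
      // Client-side router: render whatever view the #! fragment names.
      var route = window.location.hash.replace(/^#!/, '');
      if (route) {
        console.log('render view for:', route);
      }
    </script>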
Technical SEO | DavidChase
-
Best blocking solution for Google
Posting this for Dave Sottimano. Here's the scenario: you've got a set of URLs indexed by Google, and you want them out quickly. Once you've managed to remove them, you want to block Googlebot from crawling them again, for whatever reason. Below is a sample of the URLs you want blocked, but you only want to block /beerbottles/ and anything past it:
www.example.com/beers/brandofbeer/beerbottles/1
www.example.com/beers/brandofbeer/beerbottles/2
www.example.com/beers/brandofbeer/beerbottles/3
etc.
To remove the pages from the index, should you:
1. Add a meta noindex,follow tag to each URL you want de-indexed
2. Use GWT to help remove the pages
3. Wait for Google to crawl again
If that's successful, to block Googlebot from crawling again, should you add this line to robots.txt:
Disallow: */beerbottles/
or this line:
Disallow: /beerbottles/
"To add the * or not to add the *, that is the question." Thanks! Dave
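For clarity, the meta tag in the first step looks like this; it goes in the <head> of each /beerbottles/ page, and those pages must stay crawlable (not blocked in robots.txt) until Google has re-crawled them and seen the tag:

    <meta name="robots" content="noindex,follow">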
Technical SEO | goodnewscowboy