Robots.txt - blocking JavaScript and CSS, best practice for Magento
-
Hi Mozzers,
I'm looking for some feedback regarding best practices for setting up a robots.txt file in Magento.
I'm concerned we are blocking bots from crawling essential information for page rank.
My main concern is with blocking JavaScript and CSS: are you supposed to block them or not?
You can view our robots.txt file here
Thanks,
Blake
-
As Joost said, you should not block access to files that help in the reading / rendering of the page.
Looking at your robots.txt file, I would check the following two exclusions. Do they block anything that runs on a live page that Google should be seeing?
Disallow: /includes/
Disallow: /scripts/
-Andy
-
Best practice is not to block access to JS / CSS anymore, so that Google can properly understand the website and determine mobile-friendliness.
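To make this concrete, here is a minimal sketch of what a Magento robots.txt can look like once JS and CSS are left crawlable. The directory names assume a typical Magento 1 install, so check them against your own store before using anything like this:

# Sketch only - directory names assume a standard Magento 1 layout
User-agent: *
# Keep application and system directories blocked
Disallow: /app/
Disallow: /downloader/
Disallow: /var/
# Leave JS, CSS and theme assets crawlable - do NOT disallow /js/, /skin/ or /media/
Allow: /js/
Allow: /skin/
Allow: /media/
Allow: /*.js$
Allow: /*.css$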
Related Questions
-
Application & understanding of robots.txt
Hello Moz World! I have been reading up on robots.txt files, and I understand the basics. I am looking for a deeper understanding of when to deploy particular tags, and when a page should be disallowed because it will affect SEO. I have been working with a software company that has a News & Events page which I don't think should be indexed. It changes every week, and is only relevant to potential customers who want to book a demo or attend an event, not so much to search engines. My initial thinking was that I should use a noindex/follow tag on that page, so the page would not be indexed but all the links would still be crawled. I decided to look at some of our competitors' robots.txt files: Smartbear (https://smartbear.com/robots.txt), b2wsoftware (http://www.b2wsoftware.com/robots.txt) & labtech (http://www.labtechsoftware.com/robots.txt). I am still confused about what type of tags I should use, and how to gauge which set of tags is best for certain pages. I figure a static page is pretty much always good to index and follow, as long as it's public, and I should always include a sitemap file. But what about a dynamic page? What about pages that are out of date? Will this help with soft 404s? This is a long one, but I appreciate all of the expert insight. Thanks ahead of time for all of the awesome responses. Best Regards, Will H.
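For reference, the noindex/follow approach mentioned above is a meta tag placed in the <head> of the page you want kept out of the index while its links still get crawled (sketch only):

<!-- Sketch: placed in the <head> of the News & Events page -->
<meta name="robots" content="noindex, follow">
<!-- Note: blocking the same URL in robots.txt would prevent Google from ever seeing this tag or the links -->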
Intermediate & Advanced SEO | | MarketingChimp100 -
What are the best practices for microdata?
Not too long ago, Dublin Core was all the rage. Then Open Graph data exploded, and Schema seems to be highly regarded. In a best-case scenario, on a site that's already got the basics like good content, clean URLs, rich and useful page titles and meta descriptions, well-named and alt-tagged images and document outlines, what are today's best practices for microdata? Should Open Graph information be added? Should the old Dublin Core be resurrected? I'm trying to find a way to keep markup light and minimal, but include enough microdata for crawlers to get a better sense of the content and its relationships to other subdomains and sites.
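As a point of reference, Open Graph and Schema.org can sit side by side and stay fairly light; a minimal sketch, with all URLs and values illustrative:

<!-- Open Graph: mainly read by social platforms when the page is shared -->
<meta property="og:title" content="Example Product">
<meta property="og:type" content="website">
<meta property="og:url" content="https://www.example.com/example-product">
<meta property="og:image" content="https://www.example.com/images/example-product.jpg">
<!-- Schema.org as JSON-LD: read by search engines for rich results -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "image": "https://www.example.com/images/example-product.jpg",
  "description": "Illustrative description only."
}
</script>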
Intermediate & Advanced SEO | | WebElaine0 -
Best practices for structuring an ecommerce site
I'm revamping my wife's ecommerce site. It is currently a very low traffic website that is not indexed very well in Google. So, my plan is to restructure it based on best practices that help me avoid duplicate content penalties and make the site easier to index. The store has about 7 types of products. Each product has approximately 30 different size variations that are sometimes specifically searched for, for example: 20x10x1 air filters, 20x10x2 air filters, 20x10x1 allergy reducing air filters, etc. So, is it best for me to create 7 different products with 30 size variations each (a size selector at the product level that changes the price), or is it better to create 210 different product pages, one for each style/size?
Intermediate & Advanced SEO | | pherbio0 -
How to make Google index your site? (Blocked with robots.txt for a long time)
The problem is that for a long time we had a website, m.imones.lt, but it was blocked with robots.txt. Now we want Google to index it. We unblocked it 1 week or 8 days ago, but Google still does not recognize it. When I type site:m.imones.lt it says the site is still blocked with robots.txt. What should be the process to make Google crawl this mobile version faster? Thanks!
Intermediate & Advanced SEO | | FCRMediaLietuva0 -
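For what it's worth, a sketch of the before/after (the sitemap path is illustrative). Google caches robots.txt and revisits it on its own schedule, so site: results can lag behind the change:

# While the site was blocked, the file effectively said:
User-agent: *
Disallow: /

# After unblocking, it should allow crawling and point Google at the sitemap:
User-agent: *
Disallow:
Sitemap: https://m.imones.lt/sitemap.xml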
Use Canonical or Robots.txt for Map View URL without Backlink Potential
I have a Page X with lots of unique content. This page has a "Map view" option, which displays some of the info from Page X, but a lot is omitted. Questions: Should I add a canonical tag even though the Map View URL does not display a lot of the info from Page X, or should I add it to robots.txt, or use noindex, follow? I don't see any backlinks coming to the Map View URL. Should the Map View page have a unique H1, title tag and meta description?
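For reference, the canonical option is a single tag in the <head> of the Map View page pointing at Page X, whereas a robots.txt block would stop Google from crawling the page at all (URLs illustrative):

<!-- On the Map View page, inside <head>; URLs are illustrative -->
<link rel="canonical" href="https://www.example.com/page-x">
<!-- Alternative: keep it crawlable but out of the index -->
<meta name="robots" content="noindex, follow">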
Intermediate & Advanced SEO | | khi50 -
Pagination, Canonical Tag & Best Practices
I have an eCommerce site that dynamically creates category pages, which produce canonical tags in the header. For multi-page categories, it adds the page number to the URL. For example, this category has 3 pages.... Because most categories have too many products, I can't follow Google's suggestion of creating a "view all" page. Furthermore, since all these pages use the same template, I'm unable to insert a NOINDEX tag on the pages after the first page. Also, in this scenario, I'm unable to insert the discrete code for Next/Previous, which is also suggested by Google. My only option for maintaining these dynamically generated category pages would be to hardcode the first canonical tag in the template, which would then be produced on all subsequent paginated pages. Consequently, every paginated page in this category would have the same canonical tag pointing to the first page. Would this incur the wrath of Google, and would I be better off leaving the pagination the way it is?
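For reference, the markup Google was suggesting for pagination looks roughly like this on page 2 of a category, with each page keeping its own canonical rather than all pointing at page 1 (URLs illustrative):

<!-- Sketch for page 2 of a paginated category -->
<link rel="canonical" href="https://www.example.com/category?p=2">
<link rel="prev" href="https://www.example.com/category">
<link rel="next" href="https://www.example.com/category?p=3">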
Intermediate & Advanced SEO | | alrockn0 -
How should I best structure my internal links?
I am new to SEO and looking to employ a logical but effective internal link strategy. Are there any easy ways to keep track of what page links to what page? I am also a little confused about anchor text and how I should use it. E.g. for a category page "Towels", I was going to link this to another page we want to build PA for, such as "Bath Sheets". What should I put in for anchor text? Keep it simple and just put "Bath Sheets", or make it more direct like "Buy Bath Sheets"? Should I also vary anchor text if I have another 10 pages internally linking to this, or keep it the same? Any advice would be really helpful. Thanks, Craig
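For illustration, descriptive internal links with slight anchor text variation might look like this (URLs are made up):

<!-- Plain, descriptive anchor text from one page -->
<a href="https://www.example.com/bath-sheets/">Bath Sheets</a>
<!-- A natural variation used from another page -->
<a href="https://www.example.com/bath-sheets/">luxury bath sheets</a>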
Intermediate & Advanced SEO | | Towelsrus0 -
Not using a robot command meta tag
Hi SEOmoz peeps. I was doing some research on robot commands and found a couple of major sites that are not using them. If you check out the code for these: http://www.amazon.com http://www.zappos.com http://www.zappos.com/product/7787787/color/92100 http://www.altrec.com/ you will not find a meta robots command line. Of course you need the line for any noindex, nofollow or noarchive pages. However, for pages you want crawled and indexed, is there any benefit to not having the line at all? Thanks!
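For reference, leaving the tag out is generally treated the same as declaring the defaults, so it only needs to appear when you want to restrict something (sketch):

<!-- Omitting the tag is equivalent to the default behaviour: -->
<meta name="robots" content="index, follow">
<!-- It only needs to be present when restricting, e.g.: -->
<meta name="robots" content="noindex, nofollow, noarchive">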
Intermediate & Advanced SEO | | STPseo0