Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
How do I block an entire category/directory with robots.txt?
-
Does anyone have any idea how to block an entire product category, including all the products in that category, using the robots.txt file? I'm using WooCommerce in WordPress and I'd like to prevent bots from crawling every single one of my product URLs for now.
The confusing part right now is that I have several different URL structures linking to every single one of my products, for example www.mystore.com/all-products, www.mystore.com/product-category, etc.
I'm not really sure how I'd write the rules in the robots.txt file, or where to place the file.
Any help would be appreciated. Thanks!
-
Thanks for the detailed answer, I will give it a try!
-
Hi
This should do it; place the robots.txt file in the root directory of your site.
User-agent: *
Disallow: /product-category/
You can check out some more examples here: http://www.seomoz.org/learn-seo/robotstxt
As for the multiple URLs linking to the same pages, you will just need to check all possible variants and make sure you have them covered in the robots.txt file.
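As a sketch, assuming the store's two listing paths are the hypothetical /product-category/ and /all-products/ mentioned in the question, a robots.txt covering both variants might look like this:

```text
# Block the category archive and every product URL nested under it
User-agent: *
Disallow: /product-category/

# Also block the alternative listing path that links to the same products
Disallow: /all-products/
```

Disallow rules are prefix matches, so /product-category/ also covers deeper URLs such as /product-category/some-product/; you only need one rule per distinct path prefix, not one per product.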
Google Webmaster Tools has a page you can use to check whether the robots.txt file is doing what you expect (under Health -> Blocked URLs).
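If you'd rather test locally before checking in Webmaster Tools, Python's standard urllib.robotparser module can parse a set of rules and report whether a given URL would be blocked. A minimal sketch, using hypothetical rules and URLs matching the example above:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules mirroring the robots.txt discussed above
rules = """
User-agent: *
Disallow: /product-category/
Disallow: /all-products/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A product URL under a blocked prefix, and an unrelated allowed page
print(parser.can_fetch("*", "http://www.mystore.com/product-category/some-product"))  # False
print(parser.can_fetch("*", "http://www.mystore.com/about"))  # True
```

This only checks the Disallow rules themselves; it doesn't confirm the live file is reachable at /robots.txt on the server, which the Webmaster Tools check does.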
It might be easier to block the pages with a meta tag, as described in the link above, if you are running a plugin that allows this; that should take care of all the different URL structures too.
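For reference, the meta tag route means emitting a noindex directive in the head of each product page; most SEO plugins can do this per page or per category. A sketch of the standard markup (the exact output a given plugin emits may differ):

```html
<!-- Placed in the <head> of each page that should stay out of the index -->
<meta name="robots" content="noindex, follow" />
```

Note that unlike a robots.txt Disallow, the noindex tag only works if the page remains crawlable, since bots have to fetch the page to see the tag.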
Hope that helps!
Related Questions
-
Value of using spaces or no spaces on product category page variant keywords
On-Page Optimization | JamesDavison
Hello, all fellow Mozzers,
I have taken over a project and this account, so I can't change the username according to Moz. 🙃 We run an eCommerce website, and to me some of the content is conflicting, as some pages have more informational content than I would put on a commerce page, but this is how the boss wants it to work; personally, I would separate the content out.
The page I'm working on:
https://www.longstonetyres.co.uk/tyres/205-70-14.html
and this is an example of the rest of these types of pages I will be tackling:
https://www.longstonetyres.co.uk/tyres/125-15.html
I was tasked to improve SEO ranking. Using the Moz page grader I had a score of 24 out of 27 (83% SEO score) with 3 page problems, and 7th position in Google for the search term 205/70 R14. As it is a generic product listing page, it was pointless to add to the URL, and I can't reduce the internal links as these are links to products, so I went on to reduce the keyword stuffing and make the page content more natural. This improved the page to 25 out of 27 (87% SEO score) with 2 page problems, and an improvement to 3rd position in Google, but he wants to chase 1st place to be above his competitors, which is fair enough. It turns out that in the past they have used this type of page to try to get a high ranking for several search terms, each a different variation on a tyre size. The terms are:
205/70 R14, 205/70R14, 205/70 R 14
205/70 X 14, 205/70X14, 205/70 X14
and so on for all the different ways you can search for this tyre size. He is also convinced Google will see these as different search terms, and while I agree to an extent, this causes keyword stuffing on the page, which in turn was harming the rankings. Each product listed on the page already has its own title (205/70 R14, 205/70 HR14 and so on), so my question is: what is the best practice for writing content on these types of pages to gain high rankings for several keywords, and what value does writing the same keyword with spaces and no spaces have? Any help or advice is welcome, so I have a better understanding of how to approach this for this page and the rest of the site.
Cheers, Mal
-
Meta Robots index & noindex Both Implemented on Website
On-Page Optimization | Exa
I don't want a few of the pages of the website to get indexed by Google, so I have implemented the meta robots noindex code on those specific pages. Due to some complications I am not able to remove meta robots index from the header of every page. Now, on specific pages I have both codes, 'index' and 'noindex', implemented. My question is: will Google crawl/index pages which have the noindex code along with the index code? Thanks!
-
Content hidden behind a 'read all/more' button
On-Page Optimization | Dan-Lawrence
Hi. Does anyone know the latest thinking on 'hidden content', such as body copy behind a 'read more' type button/link, in light of John Mueller's comments toward the end of last year (that they discount hidden copy) and the follow-up posts on Search Engine Roundtable and Moz? Lots of people were testing it and finding such content was still being crawled and indexed, so it was presumed not a big deal after all. But if Google said they discount it, surely we now want to reveal such body copy if it contains text important to the page's SEO efforts. Do you think it could be the case that Google is still crawling and indexing such content, but any contribution that copy may have made to the page's SEO efforts is now lost if hidden? So to get its contribution to SEO back, one needs to reveal it and have it fully displayed? Or is there no need to worry, and can one keep such copy behind a 'read more' button/link?
All the best, Dan
-
No-index all the posts of a category
On-Page Optimization | salvyy
Hi everyone! I would like to no-index all the posts of a specific category of my WordPress site. The problem is that the structure of my URL is composed without /category/: www.site-name.ext/date/post-name/ (so without /category-name/). Is it possible to disallow the indexing of all the posts of the category via robots.txt? Using the Yoast plugin I can apply the no-index for each post, but I would like to apply the no-index (or a Disallow rule) once for all the posts of the category. Thanks in advance for your help, and sorry for my English. Mike
-
H2s & H3s for Category Navigation
On-Page Optimization | kevinliao
Hi all. I am wondering how best to format a category navigation menu. Currently I don't think we're using H2s correctly on our website. Am I right to think that the top-level category, e.g. Games, should be formatted as an H2, and the sub-categories underneath it should be formatted as H3s (to show a hierarchy)? Is there a limit on how many H2s and H3s you should use? Obviously only one H1 per page. Thanks in advance, Paul
-
Same H1 tag in header across entire site
On-Page Optimization | TRICORSystems
Should I have the same H1 tag in my header throughout my entire site? Or is this considered self-cannibalization for my main keywords? For example, right now I have an H1 tag with my main targeted keywords on every page of my site, even if the page's content doesn't necessarily match the keywords in the H1 tag.
-
Any SEO effect(s) / impact of Meta No Cache?
On-Page Optimization | sjcbayona-412182
Hi SEOMoz guys, hope you are doing well. I've been searching online and bumped into this archived page (http://www.seomoz.org/qa/view/34982/meta-nocache-affect-ranking). I would like to get an updated take on this issue: whether the meta no-cache code on a page has a negative, positive, or no SEO impact.
<meta http-equiv="Pragma" content="no-cache" />
<meta http-equiv="Cache-Control" content="no-cache" />
Thanks! Steve
-
How do we handle sitemaps in robots.txt when multiple domains point to the same physical location?
On-Page Optimization | nordicnetproducts
We have www.mysite.net, www.mysite.se, www.mysite.fi and so on. All of these domains point to the same physical location on our web server, and we replace the text returned to the client depending on which domain he/she requested. My problem is this: how do I configure sitemaps in robots.txt when robots.txt is used by multiple domains? If I for instance put the rows
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
in robots.txt, would that result in some cross-submission error?