Does using robots.txt to block pages decrease search traffic?
-
I know you can use robots.txt to tell search engines not to spend their resources crawling certain pages.
So, if you have a section of your website that is good content but is never updated, and you want the search engines to index new content faster, would it work to block the good, unchanged content with robots.txt? Would this content lose any search traffic if it were blocked by robots.txt? Does anyone have any available case studies?
-
If you block the pages from being crawled, you are also telling the search engines not to index the pages (they generally won't index something they haven't been able to look at, although a blocked URL can still show up as a bare listing if other sites link to it). So yes, the traffic numbers from organic search will change if you block the pages in robots.txt.
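For concreteness, blocking a section takes only a couple of lines; "/archive/" here is a hypothetical directory name standing in for the unchanged content, not anything from the question:

```
User-agent: *
Disallow: /archive/
```

If the goal is to keep pages out of the index entirely, a noindex meta tag (which requires the page to stay crawlable) is the usual alternative.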
-
Agreed, that is a better solution, but I am still wondering: if you block something with robots.txt, will that lead to a decrease in traffic? What if we have some duplicate content that is highly trafficked? If we block it with robots.txt, will the traffic numbers change?
-
You certainly don't want to block this content!
One thing I'd consider is the If-Modified-Since header, or other caching headers. Here are two articles that explain more about the concept of using headers to tell the search engines "this hasn't changed, don't bother crawling it." I haven't personally used this, but I have read about it in many places.
http://www.feedthebot.com/ifmodified.html
http://searchengineland.com/how-to-improve-crawl-efficiency-with-cache-control-headers-88824
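To make the concept concrete: a conditional GET can be answered with 304 Not Modified when the page's Last-Modified date is no newer than the If-Modified-Since date the crawler sends, so the body never has to be resent. A minimal sketch of that decision; the function name and dates are mine, not from the articles above:

```python
from email.utils import parsedate_to_datetime

def should_return_304(if_modified_since: str, last_modified: str) -> bool:
    """True when the page hasn't changed since the crawler's cached copy,
    so the server can answer 304 Not Modified with no body."""
    try:
        cached = parsedate_to_datetime(if_modified_since)
        current = parsedate_to_datetime(last_modified)
    except (TypeError, ValueError):
        return False  # unparsable date: send the full 200 response instead
    return current <= cached

# A crawler revisits a page that hasn't changed since its last fetch:
print(should_return_304("Wed, 01 May 2024 10:00:00 GMT",
                        "Wed, 01 May 2024 10:00:00 GMT"))  # True -> reply 304
```

The idea is simply that an unchanged page costs the crawler one cheap header exchange instead of a full download.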
Related Questions
-
Should I use NoIndex on short-lived pages?
Hello, I have a large number of product pages on my site that are relatively short-lived: probably in the region of a million+ pages that are created and then removed within a 24-hour period. Previously these pages were being indexed by Google and did receive landings, but in recent times I've been applying a NoIndex tag to them.

I've been doing that as a way of managing our crawl budget, but also because the 410 pages that we serve when one of these product pages is gone are quite weak and deliver a relatively poor user experience. We're working to address the quality of those 410 pages, but my question is: should I be no-indexing these product pages in the first place? Any thoughts or comments would be welcome. Thanks.
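For reference, the NoIndex tag described above can be applied in the page markup or, when pages are generated in bulk, as a response header; both forms below are generic examples rather than the poster's setup:

```html
<!-- In the <head> of each short-lived product page: -->
<meta name="robots" content="noindex">
<!-- Equivalent HTTP response header: X-Robots-Tag: noindex -->
```

Note that Google must still crawl a page to see either form, so noindex by itself does not save crawl budget.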
Intermediate & Advanced SEO | PhilipHGray0
-
Category Page as Shopping Aggregator Page
Hi, I have been reviewing the info from Google on structured data for products and started to ponder:
https://developers.google.com/search/docs/data-types/products

Here is the scenario. You have a category page that lists 8 products, and each product shows an image, price, and review rating. As the individual product pages are already marked up, they display rich snippets in the SERPs. I wonder how we get rich snippets for the category page itself. Google suggests a markup for shopping aggregator pages that list a single product, along with information about different sellers offering that product, but nothing for categories. My question is this: can we use the shopping aggregator markup for category pages to achieve the coveted rich results (from and to price, average reviews)? Keen to hear from anyone who has thoughts on the matter or has already tried this.
Intermediate & Advanced SEO | Alexcox6
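For reference, the aggregator-style markup Google describes combines an AggregateOffer with an AggregateRating on a Product. The sketch below uses documented schema.org properties but an invented product name and numbers; whether Google will actually honor this on a category page is exactly the open question here:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "89"
  },
  "offers": {
    "@type": "AggregateOffer",
    "lowPrice": "119.99",
    "highPrice": "199.99",
    "priceCurrency": "USD",
    "offerCount": "8"
  }
}
```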
What to try when Google excludes your URL only from high-traffic search terms and results?
We have a high-authority blog post (high PA) that used to rank for several high-traffic terms. Right now the post continues to rank high for variations of the high-traffic terms (e.g. keyword + "free", keyword + "discussion"), but the URL has been completely excluded from the money terms, with alternative URLs of the domain ranking in positions 50+. There is no manual penalty in place or a DMCA exclusion. What are some of the things people would try here? Some of the things I can think of:

- Remove keyword terms in the article
- Change the URL and do a 301 redirect
- Duplicate the post under a new URL, 302 redirect from the old blog post, and repoint links as much as you have control over
- Refresh content, including timestamps
- Remove potentially bad-neighborhood links, etc.

Has anyone seen the behavior above for their articles? Are there any recommendations? /PP
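If the 301 option from the list above is chosen, the redirect itself is a one-liner. This is a hedged Apache mod_alias sketch with made-up slugs, not the poster's actual URLs:

```apache
# Hypothetical old and new slugs
Redirect 301 /blog/old-post-slug/ https://www.example.com/blog/new-post-slug/
```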
Intermediate & Advanced SEO | ppseo800
-
Duplicate Page getting indexed and not the main page!
Main Page: www.domain.com/service
Duplicate Page: www.domain.com/products-handler.php/?cat=service

1. My page was getting indexed properly in 2015 as: www.domain.com/service
2. After a redesign in Aug 2016, a new URL pattern surfaced for my pages with the parameter "products-handler"
3. One of my product landing pages got 301 (permanent) redirected to the "products-handler" page:
MAIN PAGE: www.domain.com/service GETTING REDIRECTED TO: www.domain.com/products-handler.php/?cat=service
4. This redirection was in place until Nov 2016.
5. I took over the website in 2017; the main page was getting indexed and deindexed on and off.
6. This June it suddenly started showing an index of this page: domain.com/products-handler.php/?cat=service
7. These "products-handler.php" pages were creating sitewide internal duplication, hence I blocked them in robots.txt.
8. Then my page (Main Page: www.domain.com/service) dropped entirely out of the Google index.

Q1) What could be the possible reasons for the creation of these pages?
Q2) How did a 301 get placed from the main URL to the duplicate URL?
Q3) When I have submitted my main URL multiple times in Search Console, why doesn't it get indexed?
Q4) How can I make Google understand that these URLs are not my preferred URLs?
Q5) How can I permanently remove these (products-handler.php) URLs?

All suggestions and discussions are welcome! Thanks in advance! 🙂
Intermediate & Advanced SEO | Ishrat-Khan
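On Q4: the usual signal is a rel=canonical tag on the duplicate pointing at the preferred URL, but Google has to crawl the duplicate to see the tag, so this only works if the robots.txt block on the products-handler.php pages is lifted (a noindex meta tag has the same constraint). A sketch using the URLs from the question:

```html
<!-- In the <head> of www.domain.com/products-handler.php/?cat=service -->
<link rel="canonical" href="https://www.domain.com/service">
```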
Large robots.txt file
We're looking at potentially creating a robots.txt with 1,450 lines in it. This will remove 100k+ pages from the crawl that are all old pages (I know, the ideal would be to delete/noindex them, but that's not viable unfortunately). The issue I'm thinking of is that a large robots.txt will either stop the robots.txt from being followed or will slow our crawl rate down. Does anybody have any experience with a robots.txt of that size?
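For scale: Google documents a 500 KiB size limit for robots.txt, beyond which further rules are ignored, and 1,450 short lines come nowhere near that. A quick sanity check (a sketch; the helper name and constant are mine, not an official tool):

```python
# Google documents a 500 KiB limit for robots.txt; rules beyond that
# point are ignored by its parser.
GOOGLE_ROBOTS_TXT_LIMIT = 500 * 1024  # bytes

def robots_txt_fits(content: str) -> bool:
    """True if the file stays under Google's documented size limit."""
    return len(content.encode("utf-8")) <= GOOGLE_ROBOTS_TXT_LIMIT

# 1,450 short Disallow lines come to roughly 35 KB, far below the cap:
rules = "\n".join(f"Disallow: /old-page-{i}/" for i in range(1450))
print(robots_txt_fits("User-agent: *\n" + rules))  # True
```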
Intermediate & Advanced SEO | ThomasHarvey0
-
Fixing A Page Google Omits In Search
Hi, I have two pages ranking for the same keyword phrase. Unfortunately, the wrong page is ranking higher, and the other page only ranks when you include the omitted results. When you have a page that only shows when it's omitted, is that because the content is too similar in Google's eyes? Could there be any other possible reason? The content really shouldn't be flagged as duplicate, but if this is the only reason, I can change it around some more. I'm just trying to figure out the root cause before I start messing with anything. Here are the two links, if that's necessary.
http://www.kempruge.com/personal-injury/
http://www.kempruge.com/location/tampa/tampa-personal-injury-legal-attorneys/
Best, Ruben
Intermediate & Advanced SEO | KempRugeLawGroup0
-
Local Search Results Tanked My 1st Page Ranking
My site was routinely ranking in the top 2-3 in Google for my relevant search terms. Then I started working on my local SEO. Now I'm in the map list at 1-2, but my site no longer shows up with the rest of the search results. I've heard that this has been happening to other local businesses with a big Google presence. I'm thinking that I should create some microsites for each location listing that give a location-specific intro and then link to my main site. Then I can sever my main site from Google Places. Here are my two questions: 1) Is this going to kill my placement in the map results? 2) How long will it take for my main site to get back to its 2-3 spot rankings in Google's regular results?
Intermediate & Advanced SEO | ToughTimesLawyer0
-
Using 2 wildcards in the robots.txt file
I have a URL string which I don't want to be indexed. It includes the characters _Q1 in the middle of the string. So in the robots.txt, can I use 2 wildcards in the string to take out all of the URLs with that in them? So something like /*_Q1*. Will that pick up and block every URL with those characters in the string? Also, this is not directly off the root, but in a secondary directory, so .com/.../_Q1. So do I have to format the robots.txt as /*/*_Q1* since it will be in the second folder, or will just using /*_Q1* pick up everything no matter what folder it is in? Thanks.
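Per Google's documented robots.txt handling, * matches any run of characters and rules are prefix matches against the path, so a single /*_Q1 already catches _Q1 at any depth; no second directory-level wildcard or trailing * is needed. A rough Python approximation of that matching (a sketch of the documented rules, not Google's actual matcher):

```python
import re

def robots_rule_matches(rule: str, path: str) -> bool:
    """Rough approximation of Google-style robots.txt matching:
    rules are prefix matches, '*' matches any run of characters,
    and '$' anchors the end of the path."""
    pattern = re.escape(rule).replace(r"\*", ".*").replace(r"\$", "$")
    return re.match(pattern, path) is not None

# One rule catches _Q1 at any depth; no trailing '*' is needed,
# because matching is prefix-based anyway:
print(robots_rule_matches("/*_Q1", "/shop/item_Q1-details"))  # True
print(robots_rule_matches("/*_Q1", "/shop/item"))             # False
```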
Intermediate & Advanced SEO | seo1234560