Massive URL blockage by robots.txt
-
Hello people,
In May there was a dramatic increase in the number of URLs blocked by robots.txt, even though we don't have that many URLs or crawl errors. You can view the attachment to see how the count went up. The thing is, the company hasn't touched the robots.txt file since 2012. What might be causing the problem? Can this result in any penalties? Can indexation be lowered because of this?
-
Even though there are fewer pages indexed compared to those that are blocked, you still have a significant increase in indexed pages as well. That is a good thing! You technically have more pages indexed than before. It looks like you possibly relaunched the site or something? More pages blocked could be an indexing problem, or it might be a good thing - it all depends on which pages are being blocked.
If you relaunched the site and used a great new whiz-bang CMS that created an online catalog giving your users 54 ways to sort your product catalog, then the number of "pages" could increase with each sort. Just imagine: sort your widgets by color, or by size, or by price, or by price and size, or by size and color, or by color and price - you get the idea. Very quickly you have a bunch of duplicate versions of a single page. If your SEO was on his or her toes, they would account for this with a canonical tag, a meta noindex, or rules in robots.txt. That would be good, as you are not going to confuse Google with all the different versions of the same page.
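For instance, a minimal sketch of those two options on a hypothetical sort-variant page (the URL and parameter name here are invented for illustration):

```html
<!-- On a sort variant like /widgets?sort=price, point back to the one page you want indexed -->
<link rel="canonical" href="https://www.example.com/widgets" />

<!-- Or keep the variant out of the index entirely while still letting its links be followed -->
<meta name="robots" content="noindex, follow" />
```

You would normally pick one approach or the other for a given page rather than combining them, since a noindexed page makes a poor canonical target.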
Ultimately, Shailendra has the approach that you need to take. Look in robots.txt, look at the code on your pages, and ask what happened around 5/26/2013. All those things need to be looked at to answer your question.
-
Le Fras,
You don't have to change the robots.txt file itself for Google to report that more URLs are being blocked by it. The robots.txt file tells search engines not to crawl the given URLs, but they may still keep those URLs in the index and display them in the search results.
So the search engines do know of the URLs that are being blocked, and they can report that more are being blocked as you add pages to your site that fall under the robots.txt restrictions.
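To make the distinction concrete, here is a sketch of a robots.txt file (the paths are hypothetical):

```text
# robots.txt - stops compliant crawlers from fetching these paths,
# but does not remove already-known URLs from the index
User-agent: *
Disallow: /search/
Disallow: /checkout/
```

If you actually want a blocked URL out of the index, the page generally has to be crawlable so the engine can see a noindex directive on it; a robots.txt Disallow alone only stops crawling.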
-
Check your robots.txt file. Are there entries that block crawling? If you can share the URL, that would be helpful.
Regards
Related Questions
-
How To Shorten Long URLs
Intermediate & Advanced SEO | Ann64
Hi, I want to shorten some URLs, if possible, that Moz is reporting as too long. They are all the same page but different categories - the page advertises jobs, but the client requires various links to types of jobs on the menu. So the menu will have: Job type 1, Job type 2, Job type 3. I'm getting the links by going to the page, clicking a dropdown to filter the job type, then copying the resulting URL from the address bar. But these are really long and cumbersome. I presume if I used a URL shortener, this would count as redirects and also not be good for SEO. Any thoughts? Thanks
How to make Google index your site? (Blocked with robots.txt for a long time)
Intermediate & Advanced SEO | FCRMediaLietuva
For a long time we had a website, m.imones.lt, but it was blocked with robots.txt. Now we want Google to index it, and we unblocked it a week or 8 days ago, but Google still does not recognize it. I type site:m.imones.lt and it says it is still blocked by robots.txt. What should the process be to make Google crawl this mobile version faster? Thanks!
URL rewrite traffic drop
Intermediate & Advanced SEO | Silviu
Hello, a while ago (Sep. 19, 2013) we rolled out a new URL structure for the product pages on our website (with all the needed 301 redirects in place, plus internal link and sitemap updates), but our new URLs lost the SERP positions of the old ones, and with that we experienced a big traffic drop (and since September I can't see any sign of recovery). Here are three examples of old and corresponding new URLs:
Old: http://www.nobelcom.com/phone-cards/calling-Mexico-from-United-States-1-182.html
New: http://www.nobelcom.com/Mexico-phone-cards-182.html
Old: http://www.nobelcom.com/es/phone-cards/calling-Mexico-from-United-States-1-182.html
New: http://www.nobelcom.com/es/Mexico-tarjetas-telefonicas-182.html
Old: http://www.nobelcom.com/phone-cards/calling-Angola-Cell-from-Canada-55-407.html
New: http://www.nobelcom.com/Angola-Cell-phone-cards/from-Canada-55-407.html
We followed every SEO/usability rule and have no clue why this happened. Any ideas? Cheers,
S.
Numbers (2432423) in URL
Intermediate & Advanced SEO | TommyTan
Hello all Mozzers, quick question on URLs. I know the URL is important and should include keywords and all that, but my question is: does including numbers (not dates or page numbers, but numbers for internal use) in the URL affect SEO? For example, is www.domain.com/screw-driver,12,1,23345.htm any better or worse than www.domain.com/screw-driver.htm? I understand that this is not user friendly, but from an SEO standpoint does it hurt ranking? What's your opinion on this? Thank you!
How long until my correct URL is in the SERPs?
Intermediate & Advanced SEO | EcommerceSite
We changed our website, including the URLs, and set up 301 redirects for our pages. Some of the pages show up with the old URL and some with the new URL. When does that change?
URL Structure for Directory Site
Intermediate & Advanced SEO | knowyourbank
We have a directory that we're building, and we're not sure if we should make each page an extension of the root domain or use sub-directories as users narrow down their selection. What is the best practice here for maximizing your SERP authority?
Choice #1 - Hyphenated architecture (no sub-folders): 1) State page /state/ 2) City page /city-state/ 3) Business page /business-city-state/ 4) Location page /locationname-city-state/
Choice #2 - Using sub-folders on drill-down: 1) State page /state/ 2) City page /state/city 3) Business page /state/city/business/ 4) Location page /locationname-city-state/
Again, just to clarify, I need help determining which methodology achieves the greatest SEO benefit. Just by looking, it would seem that choice #1 would work better because the URLs are very clear and SEF. But at the same time it may be less intuitive for search. I'm not sure. What do you think?
How do I make my URLs SEO friendly?
Intermediate & Advanced SEO | Paul53
Hi all, I am aware that overly-dynamic URLs hurt a website's SEO potential and I want to fix mine. At present they look like this: http://www.societyboardshop.co.uk/products.php?brand=Girl+Skateboards&BrandID=153. What do I need to do to fix them, please... do I add some code to the .htaccess file? Many thanks, much appreciated. Paul.
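For what it's worth, a rough sketch of the .htaccess approach for a URL like the one above. The clean URL pattern here is invented, the rule assumes products.php can look a brand up from its numeric ID alone, and the exact syntax depends on your server setup:

```apache
# Internally map a clean URL like /brand/girl-skateboards/153/
# back to the real dynamic one, keyed on the numeric BrandID
RewriteEngine On
RewriteRule ^brand/[a-z0-9-]+/([0-9]+)/?$ products.php?BrandID=$1 [L,QSA]
```

The sketch keys only on the ID because a lowercase slug won't reproduce a value like "Girl+Skateboards" exactly; you would also 301 redirect the old dynamic URLs to the new clean ones so only one version gets indexed.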
How to fix duplicated URLs
Intermediate & Advanced SEO | Melia
I have an issue with duplicated pages. Should I use the canonical tag and, if so, how? Or should I change the page titles? This is causing my pages to compete with each other in the SERPs. The title 'Paradisus All Inclusive Luxury Resorts - Book your stay at Paradisus Resorts' is used on http://www.paradisus.com/booking-template.php and also on:
http://www.paradisus.com/booking-template.php?codigoHotel=5889 (line 9)
http://www.paradisus.com/booking-template.php?codigoHotel=5891 (line 9)
http://www.paradisus.com/booking-template.php?codigoHotel=5910 (line 9)
http://www.paradisus.com/booking-template.php?codigoHotel=5911 (line 9)
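If the codigoHotel variants really are duplicates of the base booking template, a minimal sketch of the canonical fix would be a tag like this in the head of each variant:

```html
<!-- Emitted by booking-template.php on every ?codigoHotel=... variant -->
<link rel="canonical" href="http://www.paradisus.com/booking-template.php" />
```

If each codigoHotel page actually serves distinct content for a different hotel, then unique titles (and a self-referencing canonical on each variant) would be the safer fix than pointing them all at one URL.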