Get a list of URLs blocked by robots.txt and tell Google to crawl and index them.
-
Some of my key pages got blocked by my robots.txt file. I have made the required changes to the file, but how can I get the list of blocked URLs?
My Webmaster Tools page under Health > Blocked URLs shows only a number, not the blocked URLs themselves. My first question is: where can I fetch these blocked URLs, and how can I get them back into the search results?
One other interesting point: the blocked pages are still showing up in searches. The title appears fine, but the description says "blocked by robots.txt".
I need urgent recommendations, as I do not want to see my traffic drop any further.
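For illustration, the kind of rule that usually causes this is a Disallow that sweeps up more than intended. A hedged before/after sketch of a robots.txt, with /products/ as a purely hypothetical directory name:

```
# Before (the accident): this closes the whole directory to every crawler,
# including key pages that live under it -- /products/ is hypothetical
User-agent: *
Disallow: /products/

# After (the fix): block only the sub-path that should stay out of the index
User-agent: *
Disallow: /products/checkout/
```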
-
"changing the lastmod of those pages to today".
How can I make these changes?
Right now the news is that I resubmitted the sitemap, and there are no warnings this time.
-
I imagine that since you've got a robots.txt error, you've probably ended up closing a whole directory to bots that you wanted indexed. You can easily spot the directory and resubmit a sitemap to Google, changing the lastmod of those pages to today and the priority to 1, but only for those pages.
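A minimal sketch of what such a sitemap entry might look like; the URL is a placeholder and the date stands in for "today":

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/previously-blocked-page/</loc>
    <!-- set lastmod to today's date so Google treats the page as freshly changed -->
    <lastmod>2013-01-15</lastmod>
    <!-- top priority, but only for the pages that were wrongly blocked -->
    <priority>1.0</priority>
  </url>
</urlset>
```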
If you still receive warnings, it may be due to errors in your sitemap; you're probably including some directory you don't want. You can test this in GWT by putting the URL you want to keep in the index into the box at the bottom and then checking whether any URLs are being blocked by your robots.txt.
If you want, you can post your robots.txt here along with the URIs you want indexed, leaving out the domain so it won't be public. Hope this helps.
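You can also run the same kind of check outside GWT with Python's standard-library robots.txt parser. A minimal sketch, assuming example.com as a placeholder domain and a plain-text file key_pages.txt listing the URLs you care about:

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt (placeholder domain)
parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()

# Print every key URL that Googlebot would still be blocked from crawling
with open("key_pages.txt") as f:
    for line in f:
        url = line.strip()
        if url and not parser.can_fetch("Googlebot", url):
            print("Blocked:", url)
```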
-
OK, I resubmitted it, but even with the updated file it gives a lot of errors: 20,016 warnings. I think it just takes some time.
I have not added a noindex attribute in my header region; it was all messy stuff with the robots.txt file. Does that mean that, with the site still showing up in the SERPs, the rankings will probably stay the same, or has it been deranked?
-
Go into GWMT and resubmit your sitemap.xml files (with the URLs you want indexed) for recrawling, and Google will digest the sitemaps again. Instead of waiting for Googlebot to come around on its own, you are requesting that it come around. Also, reference those new sitemap files in your robots.txt file.
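On that last point, robots.txt supports a standard Sitemap directive, so the new files can be referenced there directly; a short sketch with placeholder URLs:

```
# robots.txt -- Sitemap lines point crawlers at your sitemap files
Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/sitemap-products.xml
```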
-
In Google Webmaster Tools, go to Health -> Fetch As Google. Then add the previously blocked URL and click Fetch. Once you've done that, refresh the page and click "Submit to index". That should get Google indexing those pages again.
Getting external links to your pages also helps get them crawled and indexed, so it may be worth submitting your pages to social bookmarking sites, or building other types of backlinks to your previously blocked pages if possible.
-
Since you fixed your robots.txt file, you should be good to go. It will probably take a few days for Google to recrawl your site and update the index with the URLs it is now allowed to crawl.
Blocked URLs can still show up in the SERPs if you haven't defined the noindex attribute in your <head> section.
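If you instead want certain pages kept out of the SERPs, the standard robots meta tag is the usual tool; note that Google can only see it on pages it is allowed to crawl, so it has no effect while robots.txt still blocks the URL:

```html
<!-- Placed inside the page's <head>; only visible to Google
     once robots.txt allows the page to be crawled -->
<meta name="robots" content="noindex">
```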