Get a list of robots.txt-blocked URLs and tell Google to crawl and index them.
-
Some of my key pages got blocked by my robots.txt file. I have made the required changes to the robots.txt file, but how can I get the list of blocked URLs?
My Webmaster Tools page (Health > Blocked URLs) shows only a count, not the blocked URLs themselves. My first question is: where can I fetch these blocked URLs, and how can I get them back into the search results?
One other interesting point is that the blocked pages are still showing up in searches. The title appears fine, but the description shows "blocked by robots.txt".
I need an urgent recommendation, as I do not want to see my traffic drop any further.
-
Regarding "changing the lastmod of those pages to today":
How can I make these changes?
The latest news is that I resubmitted the sitemap, and there were no warnings this time.
-
I imagine that since you've got a robots.txt error, you probably ended up closing a whole directory to bots that you wanted indexed. You can easily spot the directory and resubmit a sitemap to Google, changing the lastmod of those pages to today and the priority to 1, but only for those pages.
If you still receive warnings, it may be due to errors in your sitemap; you're probably including some directory you don't want. You can test this in GWT by putting the URL you want to keep in the index into the box at the bottom and checking whether any URLs are being blocked by your robots.txt.
If you want, you can post your robots.txt and the URIs you want indexed here without the domain, so they won't be public. Hope this helps.
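If you have many affected pages, the "lastmod to today, priority 1, but only for those pages" edit can be scripted rather than done by hand. Here is a minimal stdlib-only Python sketch; the function name and example URLs are illustrative, not from this thread:

```python
from datetime import date
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # keep the default sitemap namespace on output


def bump_urls(sitemap_xml: str, urls_to_bump: set) -> str:
    """Set <lastmod> to today and <priority> to 1.0, but only for the given URLs."""
    root = ET.fromstring(sitemap_xml)
    for url in root.findall(f"{{{NS}}}url"):
        loc = url.find(f"{{{NS}}}loc").text.strip()
        if loc not in urls_to_bump:
            continue  # leave every other entry untouched
        for tag, value in (("lastmod", date.today().isoformat()),
                           ("priority", "1.0")):
            el = url.find(f"{{{NS}}}{tag}")
            if el is None:  # add the element if the entry lacks it
                el = ET.SubElement(url, f"{{{NS}}}{tag}")
            el.text = value
    return ET.tostring(root, encoding="unicode")
```

Run it over your sitemap.xml with the set of previously blocked URLs, save the output, and resubmit the file in Webmaster Tools.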
-
OK, I resubmitted it, but even with the updated file it gives a lot of errors (20,016 warnings). I think it takes some time.
I have not added a noindex attribute in my header region; it was all a mess with the robots.txt file. Does that mean that, with the site still showing up in the SERPs, the rank will probably stay the same, or has it been deranked?
-
Go into GWMT and resubmit your sitemap.xml files (with the URLs you want indexed) for recrawling, and Google will digest the sitemaps again. Instead of waiting for Googlebot to come around on its own, you are requesting it to come around. Also reference those new sitemap files in your robots.txt file.
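A robots.txt that allows crawling and references the sitemaps might look like this (the domain and paths are placeholders, not from the thread):

```
# Allow all bots to crawl everything
User-agent: *
Disallow:

# Point crawlers at the resubmitted sitemap files
Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap-pages.xml
```

The `Sitemap:` lines are optional but give crawlers a second discovery path besides the Webmaster Tools submission.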
-
In Google Webmaster Tools, go to Health -> Fetch as Google. Then add a previously blocked URL and click Fetch. Once you've done that, refresh the page and click "Submit to index". That should get Google indexing those pages again.
Getting external links to your pages also helps get them crawled and indexed, so it may be worth submitting your pages to social bookmarking sites, or getting other types of backlinks to your previously blocked pages if possible.
-
Since you fixed your robots.txt file, you should be good to go. It will probably take a few days for Google to recrawl your site and update the index with the URLs it is now allowed to crawl.
Blocked URLs can still show up in SERPs if you haven't defined the noindex attribute in your <head> section.
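For reference, that noindex directive is a meta tag placed in the page's head (shown here on a generic page; note that Googlebot must be able to crawl the page to see it, so it does not combine with a robots.txt block):

```html
<head>
  <!-- Tells search engines not to index this page;
       robots.txt only blocks crawling, not indexing -->
  <meta name="robots" content="noindex">
</head>
```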