Get a list of URLs blocked by robots.txt and tell Google to crawl and index them.
-
Some of my key pages got blocked by the robots.txt file. I have made the required changes to the robots.txt file, but how can I get the list of blocked URLs?
My Webmaster Tools page under Health > Blocked URLs shows only a count, not the blocked URLs themselves. My first question is: where can I fetch the list of these blocked URLs, and how can I get them back into the search results?
One other interesting point is that the blocked pages are still showing up in searches. The title appears fine, but the description says the page is blocked by robots.txt.
I need an urgent recommendation, as I do not want to see any further drop in my traffic.
-
"changing the lastmod of those pages to today".
How can I make these changes?
Right now the news is that Resubmitted the Sitemap and no warnings this time.
-
I imagine that since you've got a robots.txt error, you probably ended up closing off a whole directory to bots that you wanted indexed. You can easily spot the directory and resubmit a sitemap to Google, changing the lastmod of those pages to today and the priority to 1, but only for those pages.
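A minimal sketch of what such an updated sitemap entry could look like (the URL and date are placeholders, not taken from your site):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per previously blocked page you want recrawled -->
      <url>
        <loc>https://www.example.com/blocked-directory/key-page/</loc>
        <lastmod>2013-02-18</lastmod> <!-- set this to today's date -->
        <priority>1.0</priority> <!-- raise priority only for these pages -->
      </url>
    </urlset>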
If you still receive warnings, it may be due to errors in your sitemap; you're probably including some directory you don't want. You can test it in GWT by putting the URL you want to keep in the index into the box at the bottom and seeing whether any URLs are being blocked by your robots.txt.
If you want, you can post your robots.txt here along with the URIs you want indexed, without giving the domain, so it won't be public. Hope this helps.
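Purely as an illustration (these paths are made up, not from your site), accidental blocking usually comes from a Disallow rule that is broader than intended:

    # Too broad: hides every URL under /products/ from all crawlers
    User-agent: *
    Disallow: /products/

    # Narrower alternative: block only the subfolder you actually want hidden
    User-agent: *
    Disallow: /products/private/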
-
OK, I resubmitted it, but even with the updated file it gives a lot of errors (20,016 warnings). I think it just takes some time.
I have not added a noindex attribute in my header region; the whole mess was with the robots.txt file. Does this mean that, with the site still showing up in the SERPs, the rank will probably stay the same, or has it been deranked?
-
Go into GWMT and resubmit your sitemap.xml files (containing the URLs you want indexed) for recrawling, and Google will digest the sitemaps again. Instead of waiting for Googlebot to come around on its own, you are requesting that it come around. Also reference those new sitemap files in your robots.txt file.
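The robots.txt reference is just one Sitemap line per file; a quick sketch with placeholder URLs:

    # Added to robots.txt alongside your existing rules (URLs are placeholders)
    Sitemap: https://www.example.com/sitemap.xml
    Sitemap: https://www.example.com/sitemap-pages.xml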
-
In Google Webmaster Tools, go to Health -> Fetch As Google. Then add the previously blocked URL and click Fetch. Once you've done that, refresh the page and click "Submit to index". That should get Google indexing those pages again.
Getting external links to your pages also helps get them crawled and indexed, so it may be worth submitting your pages to social bookmarking sites or getting other types of backlinks to your previously blocked pages if possible.
-
Since you fixed your robots.txt file, you should be good to go. It will probably take a few days for Google to recrawl your site and update the index with the URLs it is now allowed to crawl.
Blocked URLs can still show up in the SERPs if you haven't defined the noindex attribute in your <head> section.
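For reference, that noindex directive is a meta tag in the <head> of each page you want kept out of the index; a minimal example (only add it to pages you genuinely want removed from search results):

    <head>
      <!-- Tells search engines not to index this page; the crawler must be able to fetch the page to see it -->
      <meta name="robots" content="noindex">
    </head>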