Get a list of robots.txt-blocked URLs and tell Google to crawl and index them.
-
Some of my key pages got blocked by my robots.txt file. I have made the required changes to the robots.txt file, but how can I get the list of blocked URLs?
My Webmaster Tools page (Health > Blocked URLs) shows only a count, not the blocked URLs themselves. My first question is: where can I fetch these blocked URLs, and how can I get them back into the search results?
One other interesting point is that the blocked pages are still showing up in searches. The title appears fine, but the description shows "blocked by robots.txt".
I need an urgent recommendation, as I do not want to see my traffic drop any further.
-
"changing the lastmod of those pages to today".
How can I make these changes?
Update: I have resubmitted the sitemap, and there are no warnings this time.
-
I imagine that, since you've got a robots.txt error, you probably ended up closing a whole directory to bots that you wanted to be indexed. You can easily spot the directory and resubmit a sitemap to Google, changing the lastmod of those pages to today and the priority to 1, but only for those pages.
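Updating lastmod doesn't have to be done by hand. A rough sketch in Python, assuming a standard sitemaps.org-format sitemap where every `<url>` entry already has `lastmod` and `priority` elements (the URLs below are made up for illustration):

```python
from datetime import date
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # keep the default sitemap namespace unprefixed

def bump_lastmod(sitemap_xml, urls_to_refresh):
    """Set lastmod to today and priority to 1.0, but only for the given URLs."""
    root = ET.fromstring(sitemap_xml)
    for url in root.findall(f"{{{NS}}}url"):
        loc = url.find(f"{{{NS}}}loc").text
        if loc in urls_to_refresh:
            url.find(f"{{{NS}}}lastmod").text = date.today().isoformat()
            url.find(f"{{{NS}}}priority").text = "1.0"
    return ET.tostring(root, encoding="unicode")
```

Entries not listed in `urls_to_refresh` are left untouched, which matches the "only of those pages" advice above.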
If you still receive warnings, it may be due to errors in your sitemap; you're probably including some directory you don't want. You can test this in GWT by putting the URL you want to keep in the index into the box at the bottom and checking whether any URLs are being blocked by your robots.txt.
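You can also check which URLs a given robots.txt blocks without waiting on GWT. A quick sketch using Python's standard `urllib.robotparser` (the rules and URLs below are invented for illustration):

```python
from urllib import robotparser

# A hypothetical robots.txt that closes one directory to all bots.
rules = """
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Report which URLs Googlebot may fetch under these rules.
for url in ["http://example.com/", "http://example.com/private/page.html"]:
    status = "allowed" if rp.can_fetch("Googlebot", url) else "blocked"
    print(url, "->", status)
```

Running your real sitemap URLs through a check like this quickly shows which entries the robots.txt is still shutting out.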
If you want, you can post your robots.txt and the URIs you want indexed here without the domain, so they won't be public. Hope this helps.
-
OK, I resubmitted it, but even with the updated file it gives a lot of errors: 20,016 warnings. I think it takes some time.
I have not added a noindex attribute in my header region; it was all a mess with the robots.txt file. Does that mean that, with the site still showing up in the SERPs, the ranking will probably stay the same, or has it been deranked?
-
Go into GWMT and resubmit the sitemap.xml files (with the URLs you want indexed) for recrawling, and Google will digest the sitemaps again. Instead of waiting for Googlebot to come around on its own, you are requesting that it come around. Also, reference those new sitemap files in your robots.txt file.
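Referencing a sitemap from robots.txt is done with the `Sitemap` directive. A minimal example (the hostname and filename are placeholders):

```
# robots.txt
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```

The `Sitemap` line is independent of the `User-agent` groups, so it can go anywhere in the file; use the full absolute URL.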
-
In Google Webmaster Tools, go to Health -> Fetch As Google. Then add the previously blocked URL and click Fetch. Once you've done that, refresh the page and click "Submit to index". That should get Google indexing those pages again.
Getting external links to your pages also helps get them crawled and indexed, so it may be worth submitting your pages to social bookmarking sites, or building other types of backlinks to your previously blocked pages if possible.
-
Since you fixed your robots.txt file, you should be good to go. It will probably take a few days for Google to recrawl your site and update the index with the URLs it is now allowed to crawl.
Blocked URLs can still show up in SERPs if you haven't defined the noindex attribute in your <head> section.
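For reference, that noindex directive is a meta tag placed in the page's <head>:

```
<head>
  <meta name="robots" content="noindex">
</head>
```

Note that Googlebot can only see this tag on pages it is allowed to crawl, so it has no effect while the URL is still blocked by robots.txt.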