Get a list of robots.txt-blocked URLs and tell Google to crawl and index them.
-
Some of my key pages got blocked by my robots.txt file. I have since made the required changes to the file, but how can I get the list of blocked URLs?
My Webmaster Tools page (Health > Blocked URLs) shows only a count, not the blocked URLs themselves. So my first question is: where can I fetch these blocked URLs, and how can I get them back into the search results?
One other interesting point: the blocked pages are still showing up in searches. The title appears fine, but the description reads "blocked by robots.txt".
I need an urgent recommendation, as I do not want to see my traffic drop any further.
-
"changing the lastmod of those pages to today".
How can I make these changes?
The latest news: I resubmitted the sitemap, and there were no warnings this time.
-
I imagine that since you've got a robots.txt error, you probably ended up closing off a whole directory to bots that you wanted indexed. You can easily spot the directory and resubmit a sitemap to Google, changing the lastmod of those pages to today and the priority to 1, but only for those pages.
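For illustration, a sitemap entry for one of the affected pages might look like this (the domain, path, and date here are placeholders; use your own URLs and today's date):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/blocked-page/</loc>
    <lastmod>2012-11-20</lastmod>
    <priority>1.0</priority>
  </url>
</urlset>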
If you still receive warnings, it may be due to errors in your sitemap; you're probably including some directory you don't want. You can test this in GWT: put the URL you want to keep in the index into the box at the bottom of the blocked URLs page and see whether it is being blocked by your robots.txt.
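To illustrate the kind of over-broad rule that causes this (the directory names are hypothetical), a single disallow line can close off an entire section of the site:

User-agent: *
Disallow: /blog/

That blocks every URL under /blog/ for all crawlers; narrowing it to the subdirectory you actually want hidden, e.g. Disallow: /blog/drafts/, leaves the rest crawlable.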
If you want, you can post your robots.txt here along with the URIs you want indexed, leaving out the domain so it won't be public. Hope this helps.
-
OK, resubmitted it, but even with the updated file it gives a lot of errors: 20,016 warnings. I think it takes some time.
I have not added a noindex attribute in my header region; it was all down to the messy robots.txt file. Does that mean that, since the site is still showing up in the SERPs, the rankings will probably stay the same, or has it been deranked?
-
Go into GWMT and resubmit your sitemap.xml files (with the URLs you want indexed) for recrawling, and Google will digest the sitemaps again. Instead of waiting for Googlebot to come around on its own, you are requesting that it come around. Also, reference those new sitemap files in your robots.txt file.
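For reference, a sitemap can be declared in robots.txt with a single directive (the URL here is a placeholder for your own sitemap's address):

Sitemap: http://www.example.com/sitemap.xml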
-
In Google Webmaster Tools, go to Health > Fetch as Google. Then add the previously blocked URL and click Fetch. Once you've done that, refresh the page and click "Submit to index". That should get Google indexing those pages again.
Getting external links to your pages also helps get them crawled and indexed, so it may be worth submitting your pages to social bookmarking sites, or getting other types of backlinks to your previously blocked pages if possible.
-
Since you fixed your robots.txt file, you should be good to go. It will probably take a few days for Google to recrawl your site and update the index with the URLs it is now allowed to crawl.
Blocked URLs can still show up in SERPs if you haven't defined the noindex attribute in your <head> section.
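For reference, the noindex directive is a meta tag placed in the page's <head>:

<meta name="robots" content="noindex">

Note that Googlebot has to be able to crawl a page to see this tag, which is why a page blocked in robots.txt can still appear in the SERPs with a "blocked by robots.txt" description.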
Related Questions
-
Blocking Standard pages with Robots.txt (t&c's, shipping policy, pricing & privacy policies etc)
Hi, I've just had a best-practice site migration completed, moving my old e-commerce store into a Shopify environment, and I see that GSC is reporting my standard pages as blocked by robots.txt, such as the examples below. Surely I don't want these blocked? Is that likely due to my migrators or a default setting in Shopify, does anyone know? The pages: t&c's, shipping policy, pricing policy, privacy policy, etc. So in summary: shall I unblock these? And what caused it, Shopify default settings or more likely my migration team? All best, Dan
Reporting & Analytics | Dan-Lawrence
-
Tasks for Google Analytics training
Hi Mozzers, I'm delivering some Google Analytics (Fundamentals level) training and trying to make it as fun and as interesting as possible... which is quite a challenge when it comes to GA. I was just wondering whether you're aware of any training tasks, or interactions, I could bring into this kind of session? The group are particularly interested in user journeys and the effectiveness of content. Thanks!
Reporting & Analytics | A_Q
-
Buffer Link and Google Impressions
Afternoon. I noticed a spike in impressions over a couple of days in April, so I investigated Analytics to see where they were coming from. It appears the impressions were split between two URLs: one was a blog post; the other was the Buffer link to that blog post which we used on Twitter and Facebook. According to Analytics, this Buffer URL received 1,000 impressions over two days, with an average SERP position of 16. This surely can't be right, can it? Is this just another Analytics quirk? After two days of a decent number of impressions to this Buffer link, the impressions dropped to pretty much zero. I know tweets are now starting to rank, but that would be the Twitter URL, not the Buffer link to our blog post? Any ideas? Cheers, Lewis
Reporting & Analytics | PeaSoupDigital
-
I want to take down some pages; how do I inform Google?
Hey guys, I'm hoping someone can help. I'm in the midst of a site redesign, and while one of our biggest reasons for it was to create more space to write valuable, unique content, I have also been reading other posts on Moz about content auditing. I have come across a few articles on my own blog of around 250-300 words that seem similar to one another and get low traffic. I'm planning to consolidate these articles and create a fresh, more in-depth article for each entry. When I consolidate or delete the old pages, do I need to inform Google that these pages have been deleted? If so, using WordPress, what is the best way to do this? Cheers, I would appreciate some advice. Thanks
Reporting & Analytics | edward-may
-
Tracking Google Places (7-pack listing) traffic in Google Analytics
Is there a way to see Google Places traffic (traffic from users clicking through the 7-pack listings) segmented in Google Analytics, or is it normally just lumped together with the organic traffic? Can you see the search phrases used to find your site, or do they also show up under "not provided" when they come from Google Places? I'm aware I can see some limited data in the Google Places analytics, but that seems to be two days behind whenever I view it.
Reporting & Analytics | Sam-P
-
Does GWT's "Fetch as Googlebot" feature affect crawl rate?
Hello Mozzers, I have noticed many people saying that using GWT's Fetch as Googlebot can affect your crawl rate in future if used regularly, though I am not sure whether this is true or just another stale SEO myth, as GWT currently provides a limit of 500 URL fetches per month. I hope the Moz community experts will clear up my doubts. Thanks!
Reporting & Analytics | pushkar63
-
Run Crawl Diagnostics
Hi, I have fixed some of the errors from the error list. How can I immediately re-run Crawl Diagnostics to check for the errors again? Thanks
Reporting & Analytics | AlfredLim
-
Track banner ad with Google Analytics
Hi, we have a top banner ad on a renowned external site, purely to attract traffic. It does show up in Google Analytics, but I have seen some increase in (direct) / (none) traffic. Is there any way I can track it when the banner is shown from two separate URLs? http://www.tv2torget.no/bedrift/netthandel/ and http://www.tv2torget.no/bedrift/netthandel/barneklaer/ Thanks, Dan Lærum
Reporting & Analytics | danlae