Get a list of robots.txt-blocked URLs and tell Google to crawl and index them.
-
Some of my key pages were blocked by my robots.txt file. I have made the required changes to the robots.txt file, but how can I get the list of blocked URLs?
My Webmaster Tools page (Health > Blocked URLs) shows only a count, not the blocked URLs themselves. So my first question is: where can I fetch these blocked URLs, and how can I get them back into the search results?
One other interesting point: the blocked pages are still showing up in searches. The title appears fine, but the description shows "blocked by robots.txt".
I need an urgent recommendation, as I do not want to see my traffic drop any further.
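For context, this is the kind of mistake that typically causes it: a single overly broad Disallow rule takes a whole directory of key pages out of the crawl. The paths below are hypothetical, not taken from the actual site:

```
User-agent: *
# Too broad - this blocks every key page under /products/:
Disallow: /products/

# Narrower fix - only keep the genuinely private subfolder out:
Disallow: /products/private/
```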
-
"Changing the lastmod of those pages to today" — how can I make these changes?
Right now the news is that I resubmitted the sitemap, and there were no warnings this time.
-
I imagine that, since you've got a robots.txt error, you probably ended up closing a whole directory to bots that you wanted to be indexed. You can easily spot the directory and resubmit a sitemap to Google, changing the lastmod of those pages to today and the priority to 1, but only for those pages.
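As a sketch of that sitemap change (the domain, path, and date below are placeholders), each previously blocked page gets its lastmod set to today's date and its priority set to 1.0:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/previously-blocked-page/</loc>
    <lastmod>2013-01-15</lastmod>
    <priority>1.0</priority>
  </url>
</urlset>
```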
If you still receive warnings, it may be due to errors in your sitemap; you're probably including some directory you don't want. You can test this in GWT by putting a URL you want to keep in the index into the box at the bottom, and then checking whether any URLs are being blocked by your robots.txt.
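The same check can be reproduced locally with Python's standard-library robots.txt parser; the rules below are a hypothetical stand-in for your file, while the GWT tester runs the check against the file Google has actually fetched:

```python
from urllib import robotparser

# Parse hypothetical robots.txt rules directly, instead of fetching them.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Is Googlebot allowed to crawl these URLs?
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```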
If you want, you can post your robots.txt and the URIs you want indexed here without revealing the domain, so it won't be public. Hope this helps.
-
OK, resubmitted it, but even with the updated file it gives a lot of errors: 20,016 warnings. I think it takes some time.
I have not added a noindex attribute in my header region; it was all messy stuff with the robots.txt file. Does that mean that, with the site still showing up in the SERPs, the rank will probably stay the same, or has it been deranked?
-
Go into GWMT and resubmit your sitemap.xml files (with the URLs you want indexed) for recrawling, and Google will digest the sitemaps again. Instead of waiting for Googlebot to come around on its own, you are requesting that it come around. Also reference those new sitemap files in your robots.txt file.
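A minimal sketch of referencing sitemaps from robots.txt (example.com is a placeholder; the Sitemap directive takes a full URL and can appear more than once):

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap-products.xml
```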
-
In Google Webmaster Tools, go to Health -> Fetch as Google. Then add a previously blocked URL and click Fetch. Once you've done that, refresh the page and click "Submit to index". That should get Google indexing those pages again.
Getting external links to your pages also helps get them crawled and indexed, so it may be worth submitting your pages to social bookmarking sites, or getting other types of backlinks to your previously blocked pages if possible.
-
Since you fixed your robots.txt file, you should be good to go. It will probably take a few days for Google to recrawl your site and update the index with the URLs it is now allowed to crawl.
Blocked URLs can still show up in the SERPs if you haven't defined the noindex attribute in your <head> section.
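For reference, the noindex directive mentioned above is a meta tag placed inside the page's <head>. It keeps a page out of the index even when crawling is allowed, but note that Googlebot must be able to crawl the page in order to see the tag:

```html
<head>
  <!-- Keep this page out of Google's index -->
  <meta name="robots" content="noindex">
</head>
```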
Related Questions
-
Google Analytics, Wordpress, Sumo - who to trust
Something is wrong. I have a website that according to Google Analytics gets zero relevant clicks, with most of the clicks it does get coming from Russia. Wordpress says it gets a hundred or so visitors a month, with most traffic going to the pages that I know rank. Sumo says the heat map got 76 clicks. How would you figure out what's really going on?
Reporting & Analytics | julie-getonthemap0 -
Google Analytics for User Experience
Hi, in terms of looking at the overall user experience of a website, are there any particular areas of Google Analytics that you believe to be particularly useful for identifying areas of worry or opportunity?
Reporting & Analytics | TheZenAgency1 -
Has anybody else had unusual /feed crawl errors in GWT on normal URLs?
I'm getting crawl error notifications in Google Webmaster Tools for pages that do not exist on my sites?! Basically, normal URLs with /feed on the end: http://jobs-transport.co.uk/submit/feed/ http://jobs-transport.co.uk/login/feed Has anybody else experienced this problem? I have no idea why this is happening. Simon
Reporting & Analytics | simmo2350 -
Google Webmaster. Backlinks
GWMT only shows that there are 3 domains pointing to a site of mine (I'm looking under "Links to site"). But this can't be true, because the site is pretty old and I know there are hundreds of domains that point to it. What would explain this discrepancy? And is there some other free tool that will show all the backlinks? I've used Open Site Explorer, but that tool isn't close to as comprehensive as GWMT usually is (based on other sites I've analyzed).
Reporting & Analytics | priceseo0 -
Google SEO - Where have I disappeared to?
Okay, so first off, Google, I hate you. Before I signed up for SEOMoz, my website was hitting page 9 and page 10 for some ultra-difficult keywords. After spending a month using Hubspot and SEOMoz, I finally made it onto page 2 of Google for said 'impossible to rank high' keywords, which I was super happy with. But last week, I log in to find that I have disappeared off the top 100 pages of Google for about 100 of my top keywords!!!! What the hell did I do wrong? I tried to please Google, but my website is still indexed, just not ranked at all for any of my top keywords. The last thing I did before I disappeared overnight was add "follow me" buttons to all my pages and "share this" buttons to all my blogs. Could this be the problem? My website and main keyword is Process Server. Is there anyone who could help push me in the right direction? I have no idea what I did wrong. 😕 Martyn
Reporting & Analytics | spymore0 -
Google Analytics - multiple counters
Hey there Mozzers! One of our customers wants to separate one Google Analytics account into multiple accounts. The website is divided into three parts: Main: www.website.nl Sub1: www.website.nl/sub1 Sub2: http://www.website.nl/sub2 And they would like 4 different reports under one account. R1: Total count R2: Website.nl (without Sub1 & Sub2) R3: Sub1 R4: Sub2 I know multiple counters will conflict with each other, so I have to implement some filters. E.g.: we can configure a filter for R3 on "astmakids" in the URL. My question is: is there a safe way to implement multiple Analytics filters on one website? And how will R3 see visitors that come from the root domain astmafonds.nl? Are they referrals? Thanks a lot in advance!! Partouter
Reporting & Analytics | Partouter0 -
Google analytics advanced segments
Ok, I need help with a simple (although, for some reason, I'm having trouble with it) advanced segment. Dilemma: All of our techs have a backend cookie that they use to log into our website. I want a way to exclude all visits where the landing page contained this in the URL: /backend/cookie.php?username= Advanced segment: We have a lot of techs, and each one of them has a different "username=example". So how can I set up an advanced segment that will exclude any visit where the visitor came in on a landing page containing /backend/cookie.php?username=
Reporting & Analytics | NerdsOnCall0 -
How to change a url for google analytics account
We recently changed the URL of a client's website. Is there a way to change the URL on the GA account, instead of creating a new account, so that we don't lose comparative data? Thanks! Sorry, I know this is a novice question.
Reporting & Analytics | marketing12340