Get a list of robots.txt-blocked URLs and tell Google to crawl and index them.
-
Some of my key pages got blocked by my robots.txt file. I have made the required changes to robots.txt, but how can I get the list of blocked URLs? My Webmaster Tools page (Health > Blocked URLs) shows only a count, not the URLs themselves. My first question is: where can I fetch these blocked URLs, and how can I get them back into the search results?
One other interesting point: the blocked pages are still showing up in searches. The title appears fine, but the description says "blocked by robots.txt".
I need a recommendation urgently, as I do not want to see any further drop in my traffic.
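If you can't get the list out of Webmaster Tools, one way to reconstruct it yourself is to test your key URLs against the robots.txt rules with Python's standard-library `urllib.robotparser`. A minimal sketch (the rules and `example.com` URLs below are placeholders for your own site):

```python
# Sketch: check which of your key URLs the CURRENT robots.txt rules block.
# The Disallow rule and URL list are placeholders -- substitute your own.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

urls = [
    "https://www.example.com/private/page1",
    "https://www.example.com/products/widget",
]

for url in urls:
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "ALLOWED" if allowed else "BLOCKED")
```

Running this against your old robots.txt versus the fixed one shows exactly which pages were affected, without waiting for Webmaster Tools to update.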
-
"changing the lastmod of those pages to today".
How can I make these changes?
The latest news is that I resubmitted the sitemap, and there were no warnings this time.
-
I imagine that since you had a robots.txt error, you probably ended up closing a whole directory to bots that you wanted indexed. You can easily spot the directory and resubmit a sitemap to Google, changing the lastmod of those pages to today and the priority to 1, but only for those pages.
If you still receive warnings, it may be due to errors in your sitemap; you are probably including some directory you don't want. You can test this in GWT by putting the URL you want to keep in the index into the box at the bottom, then checking whether any URLs are being blocked by your robots.txt.
If you want, you can post your robots.txt and the URIs you want indexed here without revealing the domain, so they won't be public. Hope this helps.
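The "change lastmod to today and priority to 1 for only those pages" step can be scripted. A sketch using the standard-library XML parser, assuming a sitemap in the standard sitemaps.org format (the file contents and `example.com` URLs are made-up placeholders):

```python
# Sketch: bump <lastmod> to today and <priority> to 1.0 for ONLY the
# previously blocked URLs in an existing sitemap, leaving other entries alone.
import datetime
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)

# Placeholder sitemap; in practice you would read your real sitemap.xml.
sitemap_xml = f"""<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="{NS}">
  <url>
    <loc>https://www.example.com/blocked-page</loc>
    <lastmod>2012-01-01</lastmod>
    <priority>0.5</priority>
  </url>
  <url>
    <loc>https://www.example.com/other-page</loc>
    <lastmod>2012-01-01</lastmod>
    <priority>0.5</priority>
  </url>
</urlset>"""

previously_blocked = {"https://www.example.com/blocked-page"}
today = datetime.date.today().isoformat()

root = ET.fromstring(sitemap_xml)
for url in root.findall(f"{{{NS}}}url"):
    loc = url.find(f"{{{NS}}}loc").text
    if loc in previously_blocked:
        url.find(f"{{{NS}}}lastmod").text = today
        url.find(f"{{{NS}}}priority").text = "1.0"

updated = ET.tostring(root, encoding="unicode")
print(updated)
```

You would then upload the updated sitemap and resubmit it in Webmaster Tools.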
-
OK, resubmitted it, but even with the updated file it still gives a lot of errors: 20,016 warnings. I think it just takes some time.
I have not added a noindex attribute in my header region; it was all a mess with the robots.txt file. Does that mean that, with the site still showing up in the SERPs, the rankings will probably stay the same, or has it been deranked?
-
Go into GWMT and resubmit your sitemap.xml files (containing the URLs you want indexed) for recrawling. Google will digest the sitemaps again; instead of waiting for Googlebot to come around on its own, you are requesting that it come around. Also reference those new sitemap files in your robots.txt file.
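Referencing a sitemap from robots.txt is done with a `Sitemap:` directive. A minimal example (the domain and file name are placeholders for your own):

```
# Allow all crawlers everywhere (nothing disallowed)
User-agent: *
Disallow:

# Point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```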
-
In Google Webmaster Tools, go to Health -> Fetch as Google. Then add a previously blocked URL and click Fetch. Once you've done that, refresh the page and click "Submit to index". That should get Google indexing those pages again.
Getting external links to your pages also helps get them crawled and indexed, so it may be worth submitting your pages to social bookmarking sites, or getting other types of backlinks to your previously blocked pages if possible.
-
Since you fixed your robots.txt file, you should be good to go. It will probably take a few days for Google to recrawl your site and update the index with the URLs it is now allowed to crawl.
Blocked URLs can still show up in the SERPs if you haven't defined the noindex attribute in your <head> section.
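For reference, the noindex directive is a meta tag in the page's HTML head. Note that Google must be able to crawl the page to see it, so a page carrying this tag must not also be disallowed in robots.txt:

```
<head>
  <meta name="robots" content="noindex">
</head>
```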