Get a list of robots.txt-blocked URLs and tell Google to crawl and index them.
-
Some of my key pages got blocked by my robots.txt file. I have made the required changes to the robots.txt file, but how can I get the list of blocked URLs?
My Webmaster Tools page under Health > Blocked URLs shows only a count, not the blocked URLs themselves. My first question is: where can I fetch these blocked URLs, and how can I get them back into the search results?
One other interesting point: the blocked pages are still showing up in searches. The title appears fine, but the description shows that the page is blocked by the robots.txt file.
I need recommendations urgently, as I do not want to see my traffic drop any further.
-
"changing the lastmod of those pages to today".
How can I make these changes?
Right now the news is that Resubmitted the Sitemap and no warnings this time.
-
I imagine that, since you've got a robots.txt error, you probably ended up closing a whole directory to bots that you wanted indexed. You can easily spot the directory and resubmit a sitemap to Google, changing the lastmod of those pages to today and the priority to 1, but only for those pages.
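The lastmod/priority change described above amounts to editing the sitemap entries for the affected pages before resubmitting. A minimal sketch of one such entry, following the sitemaps.org protocol (the URL and date are hypothetical placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Hypothetical entry for a page that was wrongly blocked:
       set lastmod to today and priority to 1.0 so the
       resubmitted sitemap flags it for recrawling. -->
  <url>
    <loc>https://www.example.com/products/page-1</loc>
    <lastmod>2013-01-15</lastmod>
    <priority>1.0</priority>
  </url>
</urlset>
```

Only the previously blocked pages should get the bumped lastmod and priority; leave the other entries as they are.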
If you still receive warnings, it may be due to errors in your sitemap; you're probably including some directory you don't want. You can test this in GWT by putting the URL you want to keep in the index into the box at the bottom, and then checking whether any URLs are being blocked by your robots.txt.
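You can also run the same kind of check locally before touching GWT. A minimal sketch using Python's standard-library robots.txt parser; the rules and URLs below are hypothetical placeholders, not the asker's actual file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that accidentally blocks a directory.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# URLs you want indexed: check whether the rules above block them.
urls = [
    "https://www.example.com/products/widget",
    "https://www.example.com/private/draft",
]
for url in urls:
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "allowed" if allowed else "blocked")
```

Any URL reported as "blocked" here would also be refused by Googlebot, so this is a quick way to confirm the fixed file behaves as intended.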
If you want, you can post your robots.txt and the URIs you want indexed here without revealing the domain, so they won't be public. Hope this helps.
-
OK, resubmitted it. But even with the updated file it gives a lot of errors: 20,016 warnings. I think it takes some time.
I have not added a noindex attribute in my header region; it was all a mess with the robots.txt file. Does that mean that, with the site still showing up in the SERPs, the ranking will probably stay the same, or has it been deranked?
-
Go into GWMT and resubmit your sitemap.xml files (with the URLs you want indexed) for recrawling, and Google will digest the sitemaps again. Instead of waiting for Googlebot to come around on its own, you are requesting that it come around. Also reference those new sitemap files in your robots.txt file.
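Referencing a sitemap from robots.txt is done with a `Sitemap:` directive. A minimal sketch (the domain and filename are hypothetical placeholders):

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

The `Sitemap:` line can appear anywhere in the file and you can list several of them, one per sitemap file.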
-
In Google Webmaster Tools, go to Health -> Fetch As Google. Then add the previously blocked URL and click Fetch. Once you've done that, refresh the page and click "Submit to index". That should get Google indexing those pages again.
Getting external links to your pages also helps get them crawled and indexed, so it may be worth submitting your pages to social bookmarking sites, or getting other types of backlinks to your previously blocked pages if possible.
-
Since you fixed your robots.txt file, you should be good to go. It will probably take a few days for Google to recrawl your site and update the index with the URLs it is now allowed to crawl.
Blocked URLs can still show up in SERPs if you haven't defined the noindex attribute in your <head> section.
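That attribute is a robots meta tag. A minimal sketch of a hypothetical page head; note that Google can only act on it once the page is no longer blocked by robots.txt, since a blocked page is never crawled:

```html
<head>
  <!-- Tells crawlers not to index this page -->
  <meta name="robots" content="noindex">
</head>
```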