Get a list of robots.txt-blocked URLs and tell Google to crawl and index them
-
Some of my key pages got blocked by the robots.txt file. I have made the required changes to robots.txt, but how can I get the list of blocked URLs?
My Webmaster Tools page (Health > Blocked URLs) shows only a count, not the blocked URLs themselves. So my first question is: where can I fetch these blocked URLs, and how can I get them back into the search results?
One other interesting point: the blocked pages are still showing up in searches. The title appears fine, but the description says the page is blocked by the robots.txt file.
I need urgent recommendations, as I do not want to see my traffic drop any further.
-
"changing the lastmod of those pages to today".
How can I make these changes?
Right now the news is that I resubmitted the sitemap and there are no warnings this time.
-
I imagine that since you've got a robots.txt error, you've probably ended up closing a whole directory to bots that you wanted indexed. You can easily spot the directory and resubmit a sitemap to Google, changing the lastmod of those pages to today and the priority to 1, but only for those pages.
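For illustration, this is the kind of overly broad Disallow rule that causes it, next to a narrower fix (directory names here are hypothetical):

```
# Too broad: blocks every URL under /products/
User-agent: *
Disallow: /products/

# Narrower: blocks only the subdirectory you actually want hidden
User-agent: *
Disallow: /products/drafts/
```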
If you still receive warnings, it may be due to errors in your sitemap; you're probably including some directory you don't want. You can test this in GWT by putting the URL you want to keep in the index into the box at the bottom and seeing whether any URLs are being blocked by your robots.txt.
If you want, you can post your robots.txt and the URIs you want indexed here without the domain, so none of it will be public. Hope this helps.
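As for "changing the lastmod of those pages to today and the priority to 1", here is a sketch of what the updated entries could look like in the sitemap file, with a placeholder URL and date:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Repeat one <url> entry per previously blocked page -->
  <url>
    <loc>http://www.example.com/previously-blocked-page/</loc>
    <lastmod>2012-12-01</lastmod> <!-- set to today's date (placeholder shown) -->
    <priority>1.0</priority>      <!-- raised only for these pages -->
  </url>
</urlset>
```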
-
OK, resubmitted it. But even with the updated file it gives a lot of errors; I think it just takes some time. 20,016 warnings.
I have not added a noindex attribute in my header region; it was all messy stuff with the robots.txt file. Does that mean that, with the site still showing up in the SERPs, the rank will probably stay the same, or has it been deranked?
-
Go into GWMT and resubmit your sitemap.xml files (with the URLs you want indexed) for recrawling, and Google will digest the sitemaps again. Instead of waiting for Googlebot to come around on its own, you are requesting that it come around. Also reference those new sitemap files in your robots.txt file.
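For example, sitemaps can be referenced from robots.txt with Sitemap directives (file names here are placeholders):

```
# Sitemap directives can appear anywhere in robots.txt
Sitemap: http://www.example.com/sitemap.xml
Sitemap: http://www.example.com/sitemap-pages.xml
```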
-
In Google Webmaster Tools, go to Health -> Fetch As Google. Then add the previously blocked URL and click Fetch. Once you've done that, refresh the page and click "Submit to index". That should get Google indexing those pages again.
Getting external links to your pages also helps get them crawled and indexed, so it may be worth submitting your pages to social bookmarking sites, or getting other types of backlinks to your previously blocked pages if possible.
-
Since you fixed your robots.txt file, you should be good to go. It will probably take a few days for Google to recrawl your site and update the index with the URLs it is now allowed to crawl.
Blocked URLs can still show up in SERPs if you haven't defined the noindex attribute in your <head> section.
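For reference, a minimal example of what that noindex directive looks like in a page's <head>:

```html
<head>
  <!-- Tells search engines to drop this page from the index, even though they may crawl it -->
  <meta name="robots" content="noindex">
</head>
```

Note that crawlers can only see this tag on pages they are allowed to crawl, which is why pages blocked by robots.txt can still linger in the SERPs with a "blocked by robots.txt" description.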