Get a list of robots.txt-blocked URLs and tell Google to crawl and index them.
-
Some of my key pages got blocked by the robots.txt file. I have made the required changes to robots.txt, but how can I get a list of the blocked URLs?
My Webmaster Tools page under Health > Blocked URLs shows only a count, not the URLs themselves. My first question is: where can I fetch these blocked URLs, and how can I get the pages back into search results?
One other interesting point is that the blocked pages are still showing up in searches. The title appears fine, but the description reads "blocked by robots.txt".
I need a recommendation urgently, as I do not want my traffic to drop any further.
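For illustration, a robots.txt mistake like this usually comes from a Disallow rule that covers a whole directory of key pages. The fix is to remove or narrow the rule; the directory names below are hypothetical, not taken from the question:

```text
# Before: accidentally blocks every page under /products/
User-agent: *
Disallow: /products/

# After: key pages are crawlable again; only the admin area stays blocked
User-agent: *
Disallow: /admin/
```

After a change like this, the pages are crawlable again, but Google still has to recrawl them before the "blocked by robots.txt" descriptions disappear from the SERPs.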
-
"changing the lastmod of those pages to today".
How can I make these changes?
Right now the news is that Resubmitted the Sitemap and no warnings this time.
-
I imagine that since you've got a robots.txt error, you probably ended up blocking a whole directory that you wanted indexed. You can easily spot the directory and resubmit a sitemap to Google, changing the lastmod of those pages to today and the priority to 1, but only for those pages.
If you still receive warnings, it may be due to errors in your sitemap; you are probably including some directory you don't want. You can test this in GWT by entering the URL you want to keep in the index in the box at the bottom and checking whether it is being blocked by your robots.txt.
If you want, you can post your robots.txt and the URIs you want indexed here without the domain, so they won't be public. Hope this helps.
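As a sketch of the lastmod and priority changes described above, the relevant sitemap entries would look like this (the URL and date are hypothetical placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Previously blocked page: set lastmod to today and priority to 1.0 -->
  <url>
    <loc>https://www.example.com/products/widget</loc>
    <lastmod>2013-02-15</lastmod>
    <priority>1.0</priority>
  </url>
  <!-- Leave other pages' entries unchanged -->
</urlset>
```

Note that priority is a hint relative to other pages on the same site, and lastmod should only be bumped for pages that genuinely changed; search engines may ignore these fields if they are used indiscriminately.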
-
OK, I resubmitted it, but even with the updated file it gives a lot of errors (20,016 warnings). I think it takes some time.
I have not added a noindex attribute in my header region; it was all a mess with the robots.txt file. Does that mean that, with the site still showing up in the SERPs, the rankings will probably stay the same, or have the pages been deranked?
-
Go into GWMT and resubmit your sitemap.xml files (containing the URLs you want indexed) for recrawling. Google will digest the sitemaps again; instead of waiting for Googlebot to come around on its own, you are requesting that it come around. Also reference those sitemap files in your robots.txt file.
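Referencing a sitemap from robots.txt is done with a `Sitemap:` line, which can appear anywhere in the file; the domain here is a hypothetical placeholder:

```text
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

The sitemap URL must be absolute, and multiple `Sitemap:` lines may be listed if the site has more than one sitemap file.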
-
In Google Webmaster Tools, go to Health -> Fetch as Google. Then add a previously blocked URL and click Fetch. Once you've done that, refresh the page and click "Submit to index". That should get Google indexing those pages again.
Getting external links to your pages also helps get them crawled and indexed, so it may be worth submitting your pages to social bookmarking sites, or getting other types of backlinks to your previously blocked pages if possible.
-
Since you fixed your robots.txt file, you should be good to go. It will probably take a few days for Google to recrawl your site and update the index with the URLs it is now allowed to crawl.
Blocked URLs can still show up in SERPs if you haven't defined a noindex directive in your <head> section.
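For reference, the noindex directive mentioned above is a robots meta tag placed in the page's <head>. Note the interaction with robots.txt: Googlebot must be allowed to crawl the page in order to see the tag, so a page that is both disallowed in robots.txt and marked noindex can still appear in the SERPs as a bare URL.

```html
<head>
  <!-- Tells search engines not to index this page or show it in results -->
  <meta name="robots" content="noindex">
</head>
```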