Get a list of URLs blocked by robots.txt and tell Google to crawl and index them.
-
Some of my key pages got blocked by the robots.txt file. I have made the required changes to robots.txt, but how can I get the list of blocked URLs?
My Webmaster Tools page under Health > Blocked URLs shows only a count, not the blocked URLs themselves. My first question is: where can I fetch these blocked URLs, and how can I get them back into the search results?
One other interesting point is that the blocked pages are still showing up in searches. The title appears fine, but the description reads "blocked by robots.txt".
I need an urgent recommendation, as I do not want to see my traffic drop any further.
-
"changing the lastmod of those pages to today".
How can I make these changes?
Right now the news is that I resubmitted the sitemap and there were no warnings this time.
-
I imagine that since you've had a robots.txt error, you probably ended up closing off to bots a whole directory that you wanted indexed. You can easily spot the directory and resubmit a sitemap to Google, changing the lastmod of those pages to today and the priority to 1, but only for those pages.
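For example, an entry in that resubmitted sitemap for one of the previously blocked pages could look like the sketch below (the URL and date are placeholders, not your actual site):

```xml
<!-- one <url> entry per previously blocked page; loc and lastmod are examples only -->
<url>
  <loc>https://www.example.com/blocked-page/</loc>
  <lastmod>2013-02-04</lastmod> <!-- set this to today's date, W3C YYYY-MM-DD format -->
  <priority>1.0</priority>
</url>
```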
If you still receive warnings, it may be due to errors in your sitemap; you're probably including some directory you don't want. You can also test this in GWT by putting the URL you want to keep in the index into the box at the bottom and checking whether any URLs are being blocked by your robots.txt.
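If you would rather check in bulk than paste URLs into GWT one at a time, a short script along these lines can compare a sitemap against the live robots.txt and print whatever is still blocked. This is only a sketch: the domain and the /sitemap.xml location are assumptions you would need to adjust.

```python
# Sketch: list which sitemap URLs the live robots.txt still blocks for Googlebot.
import urllib.request
import xml.etree.ElementTree as ET
from urllib import robotparser

SITE = "https://www.example.com"  # placeholder domain

# Load and parse the live robots.txt
rp = robotparser.RobotFileParser()
rp.set_url(f"{SITE}/robots.txt")
rp.read()

# Pull every <loc> entry out of the XML sitemap (assumed to live at /sitemap.xml)
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
with urllib.request.urlopen(f"{SITE}/sitemap.xml") as resp:
    tree = ET.parse(resp)
urls = [loc.text.strip() for loc in tree.findall(".//sm:loc", ns)]

# Keep only the URLs Googlebot is not allowed to fetch
blocked = [u for u in urls if not rp.can_fetch("Googlebot", u)]
for u in blocked:
    print(u)
print(f"{len(blocked)} of {len(urls)} sitemap URLs are blocked")
```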
If you want, you can post your robots.txt here along with the URIs you want indexed, leaving out the domain so it won't be public. I hope this helps.
-
OK, I resubmitted it, but even with the updated file it gives a lot of errors: 20,016 warnings. I think it takes some time.
I have not added a noindex attribute in my header region; it was all a mess with the robots.txt file. Does that mean that, since the site is still showing up in the SERPs, the rankings will probably stay the same, or have the pages been deranked?
-
Go into GWMT and resubmit the sitemap.xml files (with the URLs you want indexed) for recrawling, and Google will digest the sitemaps again. Instead of waiting for Googlebot to come around on its own, you are requesting it to come around. Also reference those new sitemap files in your robots.txt file, as in the sketch below.
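A hypothetical robots.txt after the fix might look like this, with the key directory reopened and the resubmitted sitemaps referenced (the paths and filenames here are placeholders):

```
User-agent: *
# keep only the directories you genuinely want out of the index
Disallow: /admin/

# reference every resubmitted sitemap so crawlers pick them up on their next visit
Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap-posts.xml
```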
-
In Google Webmaster Tools, go to Health -> Fetch As Google. Then add the previously blocked URL and click Fetch. Once you've done that, refresh the page and click "Submit to index". That should get Google indexing those pages again.
Getting external links to your pages also helps get pages crawled and indexed, so it may be worth submitting your pages to social bookmarking sites, or getting other types of backlinks to your previously blocked pages if possible.
-
Since you fixed your robots.txt file, you should be good to go. It will probably take a few days for Google to recrawl your site and update the index with the URLs it is now allowed to crawl.
Blocked URLs can still show up in the SERPs if you haven't defined a noindex attribute in your <head> section (see the example below).
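For reference, the noindex directive mentioned above is a meta tag placed in each page's <head>; robots.txt only stops crawling, it does not remove a URL Google has already indexed:

```html
<!-- keeps a page out of the index; Googlebot must be able to crawl the page to see it -->
<meta name="robots" content="noindex">
```

Note that this only takes effect once the robots.txt block is lifted, because Googlebot has to fetch the page to read the tag.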