Get a list of robots.txt-blocked URLs and tell Google to crawl and index them.
-
Some of my key pages got blocked by my robots.txt file. I have made the required changes to the robots.txt file, but how can I get a list of the blocked URLs?
My Webmaster Tools page under Health > Blocked URLs shows only a number, not the blocked URLs themselves. My first question is where I can fetch these blocked URLs from, and how I can get them back into the search results.
One other interesting point: the blocked pages are still showing up in searches. The title appears fine, but the description says the page is blocked by robots.txt.
I need urgent recommendations, as I do not want to see any further drop in my traffic.
-
"Changing the lastmod of those pages to today" —
how can I make these changes?
The news right now is that I resubmitted the sitemap, and there were no warnings this time.
-
I imagine that since you've got a robots.txt error, you probably ended up closing a whole directory to bots that you wanted indexed. You can easily spot the directory and resubmit a sitemap to Google, changing the lastmod of those pages to today and the priority to 1, but only for those pages.
If you still receive warnings, it may be due to errors in your sitemap. You are probably including a directory you don't want. You can test this in GWT by putting the URL you want to keep in the index into the box at the bottom and checking whether any URLs are being blocked by your robots.txt.
If you want, you can post your robots.txt and the URIs you want indexed here without the domain, so they won't be public. Hope this helps.
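The lastmod/priority change described above would look something like this in a sitemap entry. The domain, path, and date here are placeholders, not taken from the original question:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Previously blocked page: set lastmod to today's date
       and raise priority to 1.0 for just these pages -->
  <url>
    <loc>https://www.example.com/blocked-directory/key-page/</loc>
    <lastmod>2013-10-28</lastmod>
    <priority>1.0</priority>
  </url>
</urlset>
```

Only the entries for the previously blocked pages need the updated lastmod; the rest of the sitemap can stay as it was.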
-
OK, resubmitted it. But even with the updated file it gives a lot of errors: 20,016 warnings. I think it takes some time.
I have not added a noindex attribute in my header region; it was all a mess with the robots.txt file. Does that mean that, with the site still showing up in the SERPs, the rankings will probably stay the same, or has the site been deranked?
-
Go into GWMT and resubmit the sitemap.xml files (with the URLs you want indexed) for recrawling, and Google will digest the sitemaps again. Instead of waiting for Googlebot to come around on its own, you are requesting that it come around. Also reference those new sitemap files in your robots.txt file.
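Referencing sitemaps from robots.txt, as suggested above, is done with the `Sitemap` directive. The domain and file names below are placeholders:

```
User-agent: *
Disallow:

# Point crawlers at the sitemaps containing the URLs to be indexed
Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap-news.xml
```

An empty `Disallow:` line allows crawling of the whole site; the `Sitemap` lines can appear anywhere in the file and may list more than one sitemap.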
-
In Google Webmaster Tools, go to Health -> Fetch As Google. Then add the previously blocked URL and click Fetch. Once you've done that, refresh the page and click "Submit to index". That should get Google indexing those pages again.
Getting external links to your pages also helps get them crawled and indexed, so it may be worth submitting your pages to social bookmarking sites, or getting other types of backlinks to your previously blocked pages if possible.
-
Since you've fixed your robots.txt file, you should be good to go. It will probably take a few days for Google to recrawl your site and update the index with the URLs it is now allowed to crawl.
Blocked URLs can still show up in the SERPs if you haven't added a noindex tag in your <head> section.
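The noindex tag mentioned above is a standard meta robots tag placed in the page's head. This is a generic example, not code from the poster's site:

```html
<head>
  <!-- Tell search engines not to index this page.
       Note: the crawler must be able to FETCH the page to see this tag,
       so the URL must not be blocked in robots.txt at the same time. -->
  <meta name="robots" content="noindex">
</head>
```

This is also why blocked pages can linger in the SERPs: if robots.txt blocks crawling, Google never sees a noindex tag and may keep the URL indexed based on external links alone, showing a "blocked by robots.txt" description.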