Robots.txt Question
-
In the past, I had blocked a section of my site (i.e. domain.com/store/) by placing the following in my robots.txt file: "Disallow: /store/". Now I would like the store to be indexed and included in the search results. I removed the "Disallow: /store/" line from the robots.txt file, but approximately one week later a Google search for the URL still produces the following meta description in the search results: "A description for this result is not available because of this site's robots.txt – learn more"
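For reference, here is a simplified sketch of the change I made (I'm assuming the rule sits under the catch-all User-agent block, and I've left out the other rules in my file):
Before:
User-agent: *
Disallow: /store/
After (the line is simply removed; an empty Disallow is equivalent and blocks nothing):
User-agent: *
Disallow: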
Is there anything else I need to do to speed up the process of getting this section of the site indexed?
-
Thanks for the "Good Answer" flag, David! I reformatted & added a little extra info to make the process a little clearer.
Paul
-
To help speed up the process of getting re-included, use the "Fetch as Googlebot" and "Fetch as Bingbot" tools in Webmaster Tools on a page in the previously blocked section - this significantly helps jumpstart indexing. Once you see a successful Fetch status, click "Submit to Index" and choose to submit the URL and all linked pages.
In addition:
- make certain your new pages are listed in your sitemap.xml file, and then resubmit the sitemap to the search engines using Google and Bing Webmaster Tools (see the quick sitemap sketch below)
- make sure your own internal pages (especially a few strong ones) link to the newly unblocked content
- see if you can get a couple of good new incoming links to some of the pages in the new section - even if they're nofollow, they can help guide the crawlers to the newly available pages
Essentially, you're trying to give the search engines as many hints as possible that there are new pages to crawl and, hopefully, index.
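As a rough illustration, the newly unblocked pages just need to appear as ordinary entries in that sitemap.xml - the URLs below are placeholders based on the /store/ example, not anything from the actual site:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://domain.com/store/</loc>
  </url>
  <url>
    <loc>http://domain.com/store/sample-product/</loc>
  </url>
</urlset>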
Paul
[edited for additional clarity]
-
Thanks. I figured this was the case, but was not sure if I was missing any "best practices" about getting the previously blocked URL included faster.
-
David, if I am correct, this is just an old message sitting in the index. Give it another week or so and I am sure this message will vanish. I had the same thing with one of my sites that I went live with but forgot to allow in the robots.txt file.
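In the meantime, if you want to double-check that the live robots.txt really does allow the URL again, a minimal sketch using Python's standard-library robotparser (assuming the store still lives at domain.com/store/) would be:
from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt from the site root
rp = RobotFileParser()
rp.set_url("http://domain.com/robots.txt")
rp.read()

# True means Googlebot is allowed to crawl the store section again
print(rp.can_fetch("Googlebot", "http://domain.com/store/"))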
shivun
Related Questions
-
How to stop robots.txt restricting access to sitemap?
I'm working on a site right now and having an issue with the robots.txt file restricting access to the sitemap - with no web dev to help, I'm wondering how I can fix the issue myself? The robots.txt page shows:
User-agent: *
Disallow: /
and then a Sitemap: line with the correct sitemap link.
Technical SEO | Ad-Rank
-
Google Webmaster Tools is saying "Sitemap contains urls which are blocked by robots.txt" after Https move...
Hi Everyone, I really don't see anything wrong with our robots.txt file after our https move that just happened, but Google says all URLs are blocked. The only change I know we need to make is changing the sitemap URL to https. Anything you all see wrong with this robots.txt file?
robots.txt:
# This file is to prevent the crawling and indexing of certain parts of your site
# by web crawlers and spiders run by sites like Yahoo! and Google. By telling these
# "robots" where not to go on your site, you save bandwidth and server resources.
# This file will be ignored unless it is at the root of your host:
# Used: http://example.com/robots.txt
# Ignored: http://example.com/site/robots.txt
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/wc/robots.html
# For syntax checking, see:
# http://www.sxw.org.uk/computing/robots/check.html

# Website Sitemap
Sitemap: http://www.bestpricenutrition.com/sitemap.xml

# Crawlers Setup
User-agent: *

# Allowable Index
Allow: /*?p=
Allow: /index.php/blog/
Allow: /catalog/seo_sitemap/category/

# Directories
Disallow: /404/
Disallow: /app/
Disallow: /cgi-bin/
Disallow: /downloader/
Disallow: /includes/
Disallow: /lib/
Disallow: /magento/
Disallow: /pkginfo/
Disallow: /report/
Disallow: /stats/
Disallow: /var/

# Paths (clean URLs)
Disallow: /index.php/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
Disallow: /catalogsearch/
Disallow: /checkout/
Disallow: /control/
Disallow: /contacts/
Disallow: /customer/
Disallow: /customize/
Disallow: /newsletter/
Disallow: /poll/
Disallow: /review/
Disallow: /sendfriend/
Disallow: /tag/
Disallow: /wishlist/
Disallow: /aitmanufacturers/index/view/
Disallow: /blog/tag/
Disallow: /advancedreviews/abuse/reportajax/
Disallow: /advancedreviews/ajaxproduct/
Disallow: /advancedreviews/proscons/checkbyproscons/
Disallow: /catalog/product/gallery/
Disallow: /productquestions/index/ajaxform/

# Files
Disallow: /cron.php
Disallow: /cron.sh
Disallow: /error_log
Disallow: /install.php
Disallow: /LICENSE.html
Disallow: /LICENSE.txt
Disallow: /LICENSE_AFL.txt
Disallow: /STATUS.txt

# Paths (no clean URLs)
Disallow: /.php$
Disallow: /?SID=
disallow: /?cat=
disallow: /?price=
disallow: /?flavor=
disallow: /?dir=
disallow: /?mode=
disallow: /?list=
disallow: /?limit=5
disallow: /?limit=10
disallow: /?limit=15
disallow: /?limit=20
disallow: /*?limit=250
Technical SEO | vetofunk
-
Migration to New Domain - 301 Redirect Questions
My client is migrating their site to a new domain. I just did a big redesign, including a URL structure change, with 301s from old URLs to new URLs. Now they want a new name, so we're moving forward with a new domain name. However, we're going to keep the site on the current domain while we ease customers into the new name. During that time, I'm going to be building links to the new domain name and 301 redirecting that new domain to the current domain name. Then, once we migrate the site to the new domain name, I'm going to redirect the current domain name to the new domain name. So, my questions are: Is the above process the best way to use 301 redirects to build links to the new domain while we transition everything? Should I (or can I) do 3 redirects from the oldest URLs, to the current URLs, then to the new URLs? General question... I can't seem to find this anywhere online, but what is the best practice for what order the redirect rules should appear in within the .htaccess file? Thanks!
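For what it's worth, a minimal sketch of the kind of domain-wide 301 I'm describing - with olddomain.com and newdomain.com as placeholders, not the client's actual domains - would be something like this in .htaccess (assuming Apache with mod_rewrite, since we're using .htaccess):
RewriteEngine On
# Send every request on the old host to the same path on the new host
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]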
Technical SEO | Kenny-King
-
Can I rely on just robots.txt
We have a test version of a client's web site on a separate server before it goes onto the live server. Some code from the test site has somehow managed to get Google to index the test site, which isn't great! Would simply adding a robots.txt file to the root of the test site blocking everything be good enough, or will I also have to put noindex and nofollow meta tags on all pages of the test site?
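To make it concrete, this is the kind of setup I mean - a sketch only, showing the two options I'm weighing up:
robots.txt at the root of the test site:
User-agent: *
Disallow: /
and/or the meta tag in the head of every test page:
<meta name="robots" content="noindex, nofollow">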
Technical SEO | spiralsites
-
Robots.txt issue - site resubmission needed?
We recently had an issue when a load of new files were transferred from our dev server to the live site, which unfortunately included the dev site's robots.txt file with a Disallow: / instruction. Bad! Luckily I spotted it quickly and the file has been replaced. The extent of the damage seems to be that some descriptions aren't displaying and we're getting a message about robots.txt in the SERPs for a few keywords. I've done a site: search and generally it seems to be OK for 99% of our pages. Our positions don't seem to be affected right now, but obviously it's not great for the CTRs on the keywords affected. My question is whether there is anything I can do to bring the updated robots.txt file to Google's attention, or should we just wait and sit it out? Thanks in advance for your answers!
Technical SEO | GBC
-
Robots.txt
Should I add anything else besides User-agent: * to my robots.txt file? http://melo4.melotec.com:4010/
Technical SEO | Romancing
-
Meta tags question - imagetoolbar
We inherited some sites from another vendor & they have these tags in the head of all pages. Are they of any value at all? Thanks for the help! Wick Smith
Technical SEO | wcksmith