Robots.txt Question
-
In the past, I had blocked a section of my site (domain.com/store/) by placing the following in my robots.txt file: "Disallow: /store/". Now I would like the store to be indexed and included in the search results. I have removed the "Disallow: /store/" line from the robots.txt file, but approximately one week later a Google search for the URL still produces the following meta description in the search results: "A description for this result is not available because of this site's robots.txt – learn more."
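For reference, the change described here amounts to removing a single line from robots.txt. A minimal sketch, assuming domain.com/store/ is the only blocked section:

    # Before - the store is blocked from crawling:
    User-agent: *
    Disallow: /store/

    # After - all crawlers may fetch everything:
    User-agent: *
    Disallow: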
Is there anything else I need to do to speed up the process of getting this section of the site indexed?
-
Thanks for the "Good Answer" flag, David! I reformatted and added a little extra info to make the process clearer.
Paul
-
To help speed up the process of getting re-included, use the "Fetch as Googlebot" and "Fetch as Bingbot" tools in Google and Bing Webmaster Tools for a page in the formerly blocked section - this significantly helps jumpstart indexing. Once you see a successful Fetch status, click "Submit to Index" and then choose to submit the URL and all linked pages.
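Before resubmitting anything, it's worth confirming that the live robots.txt really does allow crawling again. Here's a quick check, sketched with Python's standard-library robot-exclusion parser; the domain.com URLs are placeholders from the question above:

    from urllib.robotparser import RobotFileParser

    # Download and parse the site's live robots.txt
    rp = RobotFileParser()
    rp.set_url("http://domain.com/robots.txt")
    rp.read()

    # Prints True once the "Disallow: /store/" rule is gone
    print(rp.can_fetch("Googlebot", "http://domain.com/store/"))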
In addition
- make certain your new pages are listed in your sitemap.xml file (see the minimal example after this list), and then resubmit the sitemap to the search engines using Google and Bing Webmaster Tools
- make sure your own internal pages (especially a few strong ones) link to the newly unblocked content
- see if you can get a couple of good new incoming links to some of the pages in the new section - even if they're nofollow, they can help guide the crawlers to the newly available pages
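A minimal sketch of a sitemap.xml listing one of the newly unblocked pages - the URL and the lastmod date are placeholder values:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://domain.com/store/</loc>
        <lastmod>2013-01-15</lastmod>
      </url>
    </urlset>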
Essentially, you're trying to give the search engines as many hints as possible that there are new pages to crawl and, hopefully, index.
Paul
[edited for additional clarity]
-
Thanks. I figured this was the case, but was not sure if I was missing any "best practices" about getting the previously blocked URL included faster.
-
David, if I am correct, this is just an old snippet sitting in the index. Give it another week or so and I am sure this message will vanish. I had the same thing with one of my sites that I took live but forgot to unblock in the robots.txt file.
shivun