Robots.txt Question
-
In the past, I had blocked a section of my site (i.e. domain.com/store/) by placing the following in my robots.txt file: "Disallow: /store/". Now I would like the store to be indexed and included in the search results. I have removed the "Disallow: /store/" line from the robots.txt file, but approximately one week later a Google search for the URL still produces the following meta description in the search results: "A description for this result is not available because of this site's robots.txt – learn more"
Is there anything else I need to do to speed up the process of getting this section of the site indexed?
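As a quick sanity check before waiting on the crawlers, you can confirm the updated rules really do allow the section. A minimal sketch using Python's standard-library robots.txt parser (the domain and rule lines are placeholders matching the example above):

```python
# Verify whether a path is blocked under a given set of robots.txt rules,
# using Python's standard-library parser.
from urllib.robotparser import RobotFileParser

# The old rules, which blocked the store section:
old_rules = ["User-agent: *", "Disallow: /store/"]
# The updated rules, with the Disallow path removed:
new_rules = ["User-agent: *", "Disallow:"]

def is_allowed(rule_lines, url):
    parser = RobotFileParser()
    parser.parse(rule_lines)          # parse rules from a list of lines
    return parser.can_fetch("Googlebot", url)

print(is_allowed(old_rules, "https://domain.com/store/"))  # False
print(is_allowed(new_rules, "https://domain.com/store/"))  # True
```

This only checks the rules you feed it; make sure the live robots.txt file on the server actually matches what you edited locally.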
-
Thanks for the "Good Answer" flag, David! I reformatted & added a little extra info to make the process a little clearer.
Paul
-
To help speed up the process of getting re-included, use the "Fetch as Googlebot" and "Fetch as Bingbot" tools in Webmaster Tools on a page in the previously blocked section - this significantly helps jumpstart indexing of pages. Once you see a successful Fetch status, click Submit to Index, and then choose to submit the URL and all linked pages.
In addition
- make certain your new pages are listed in your sitemap.xml file, and then resubmit the sitemap to the search engines using Google and Bing Webmaster Tools
- make sure your own internal pages (especially a few strong ones) link to the newly unblocked content
- see if you can get a couple good new incoming links to some of the pages in the new section - even if they're no-follow, they can help guide the crawlers to the newly available pages
Essentially you're trying to give the SEs as many hints as possible that there are new pages to crawl and hopefully index.
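For the sitemap step above, a minimal sketch of what entries for the newly unblocked section might look like (the URLs and dates here are placeholders, not from the original question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Newly unblocked store pages -->
  <url>
    <loc>https://domain.com/store/</loc>
    <lastmod>2013-06-01</lastmod>
  </url>
  <url>
    <loc>https://domain.com/store/example-product</loc>
    <lastmod>2013-06-01</lastmod>
  </url>
</urlset>
```

Updating <lastmod> on the affected URLs before resubmitting gives the crawlers one more hint that those pages have changed.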
Paul
[edited for additional clarity]
-
Thanks. I figured this was the case, but was not sure if I was missing any "best practices" about getting the previously blocked URL included faster.
-
David, if I am correct, this is an old message sitting in the index. Give it another week or so and I am sure this message will vanish. I had the same thing with one of my sites that I went live with but forgot to allow in the robots.txt file.
shivun