The SEOmoz crawler is being blocked by robots.txt, need help
-
SEOmoz is showing me that robots.txt is blocking content on my site
-
Jason, if you can post the contents of your robots.txt file, or give us a link to the site in question, we can help you diagnose what is happening.
A second question: what type of content is being blocked? If it's a directory like /admin, then robots.txt is likely working as intended.
You can also verify your site in Google Webmaster Tools and check the crawling section, which will tell you which pages Googlebot hasn't been able to crawl. Google offers some help at http://googlewebmastercentral.blogspot.com/2008/03/speaking-language-of-robots.html.
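For reference, a robots.txt that blocks crawlers usually looks something like this. The paths here are hypothetical examples, not Jason's actual file, but note the second group: Moz's crawler identifies itself as rogerbot, so a `Disallow: /` under that user-agent would explain SEOmoz reporting blocked content even while Googlebot crawls the site fine:

```
User-agent: *
Disallow: /admin/
Disallow: /blog/tag/

User-agent: rogerbot
Disallow: /
```

Each `User-agent` group applies to crawlers matching that name, and each `Disallow` line blocks any URL path starting with that prefix.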
-
Hi Jason,
What's in your robots.txt file? It will be a text file in the root directory of your website. If you could share the contents we can help.
-
Another idea: open your robots.txt file directly and see what is in it.
You can use Google Webmaster Tools to help you create a proper robots.txt file.
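If you'd rather check programmatically, Python's standard library can parse a robots.txt and tell you whether a given crawler is allowed to fetch a URL. The file contents below are a made-up example for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents, as a list of lines
rules = [
    "User-agent: *",
    "Disallow: /admin/",
    "",
    "User-agent: rogerbot",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# Googlebot falls under the "*" group: regular pages allowed, /admin/ blocked
print(parser.can_fetch("Googlebot", "/index.html"))   # True
print(parser.can_fetch("Googlebot", "/admin/login"))  # False

# rogerbot (the SEOmoz crawler) is blocked from everything
print(parser.can_fetch("rogerbot", "/index.html"))    # False
```

To test your live file instead, point the parser at it with `parser.set_url("http://yoursite.com/robots.txt")` followed by `parser.read()`.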
Best of luck
-
Download your .htaccess file and add a .txt extension so you can open it in a text editor. Check whether it blocks certain robots from crawling your pages. If it does, remove those rules, upload the file back to your server, and remove the .txt extension.
-
What needs to be done in the .htaccess file? Can anyone give me a step-by-step process?
-
I would look at your .htaccess file.
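For what it's worth, .htaccess rules that block crawlers by user-agent usually look something like this (a hypothetical example, not something from your server). If you find rules like these naming a crawler you want to allow, deleting them restores access:

```
# Block a specific crawler by user-agent (hypothetical example)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} rogerbot [NC]
RewriteRule .* - [F,L]
```

Keep in mind that .htaccess blocks crawlers at the server level (they receive a 403 Forbidden), which is different from robots.txt, which merely asks crawlers not to fetch certain paths. If SEOmoz specifically says robots.txt is the blocker, .htaccess is probably not the cause.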