The SEOmoz crawler is being blocked by robots.txt - need help
-
SEOmoz is showing me that robots.txt is blocking content on my site.
-
Jason, if you can post the contents of your robots.txt file, or give us a link to the site in question, we can help you diagnose what is happening.
A second question is -- what type of content is being blocked? If it's a directory like /admin that is being blocked, the robots.txt is likely working as intended.
You can also verify your site in Google Webmaster Tools and look in there at the crawling section, as it will tell you what pages Googlebot hasn't been able to crawl. Google offers some help at http://googlewebmastercentral.blogspot.com/2008/03/speaking-language-of-robots.html.
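For reference, here is a purely hypothetical robots.txt showing the difference between the two situations; the paths are placeholders, not taken from your site:

# Working as intended: only the admin area is blocked
User-agent: *
Disallow: /admin/

# Common cause of the warning: a bare slash blocks the entire site
User-agent: *
Disallow: /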
-
Hi Jason,
What's in your robots.txt file? It will be a text file in the root directory of your website. If you could share the contents we can help.
-
Another idea: simply open your robots.txt file and see directly what is going on.
You can use Google Webmaster Tools to help you build a proper robots.txt file.
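If you want to test a specific page yourself, here is a rough sketch using Python's standard-library robots.txt parser. The domain and page below are placeholders, and I believe the SEOmoz campaign crawler identifies itself as rogerbot:

from urllib.robotparser import RobotFileParser

# Point the parser at your own robots.txt
rp = RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

# Check whether a given page is crawlable for Moz's crawler and for Googlebot
for agent in ("rogerbot", "Googlebot"):
    allowed = rp.can_fetch(agent, "http://www.example.com/some-page/")
    print(agent, "can fetch" if allowed else "is blocked from", "/some-page/")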
Best of luck
-
Download your .htaccess file, add a .txt extension so you can open it, and check whether it contains rules that block certain robots from crawling your pages. If it does, remove those rules, upload the file back to your server, and remove the .txt extension again.
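As a purely hypothetical illustration of the kind of rule to look for (the bot names are placeholders, not something known to be in your file), a user-agent block in .htaccess might look like this:

RewriteEngine On
# Returns 403 Forbidden to any crawler whose user agent matches the list
RewriteCond %{HTTP_USER_AGENT} (rogerbot|dotbot|examplebot) [NC]
RewriteRule .* - [F,L]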
-
What needs to be done in the .htaccess file? Can anyone give me a step-by-step process?
-
I would look at your .htaccess file.
Related Questions
-
The actual title tag is different from the one showing in SEOmoz
Hello all, I was looking at the crawling report for my project and noticed that there are many duplicate titles in the "Duplicate Page Title" report. I checked the title tag for one of the titles (for this page) manually in the page code, and it shows like this: <title></span><span>Interviews | TechSparks</span><span></title> whereas the "Duplicate Page Title" report shows it like this: Interviews | TechSparksEasyRotator Preview. Hovering over the tab in the browser shows the title as I mentioned first, and searching Google for this page shows the first title too. Do you have any clue why that is happening?
On-Page Optimization | MHD
-
Need help, I am lost
Hello all, I am new in this community. I had been on page 1 of Google for a while for the keyword "Wooden Signs" with my website CreateYourWoodSign.com. Since the Google update (April 24th, I think) I have completely disappeared from Google, and I have not been able to come back since. Any help would be highly appreciated. Thank you!
On-Page Optimization | manu45 (Emmanuel)
-
How to leverage user reviews & ratings to help in SEO rankings
Folks, we have a user-generated-content site in the travel domain, and we want to leverage reviews and ratings for SEO rankings/CTR. We are already getting microformat rich snippets in Google, but not in all cases. Any ideas or suggestions would be really helpful. Thanks in advance. -Amit
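For illustration only (the property names follow schema.org's AggregateRating vocabulary; the hotel name and numbers are invented placeholders), rating markup that search engines can pick up for rich snippets might look like this:

<div itemscope itemtype="http://schema.org/Hotel">
  <span itemprop="name">Example Beach Resort</span>
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.3</span> out of <span itemprop="bestRating">5</span>
    based on <span itemprop="reviewCount">87</span> reviews
  </div>
</div>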
On-Page Optimization | holidayiq
-
Can Cascading DropDowns HELP improve site architecture?
Our online store sells furnace filters: http://www.furnacefilterscanada.com We have 4 different qualities of furnace filter, and each of them is available in almost 50 different sizes, so my store is set up with almost 200 different products (4 filters x 50 sizes). None of them has variants. I'm looking for a solution to help customers find their furnace filter size quickly and easily on our HOME page. I found this cascading dropdown option http://www.asp.net/ajaxLibrary/AjaxControlToolkitSampleSite/CascadingDropDown/CascadingDropDown.aspx where I would set up 3 options to select: filter width, filter length, filter depth. Would that be the best option? Right now my store is set up with categories, and it is not the best solution: http://www.furnacefilterscanada.com It was a lot of work to create 200 different products. The reason I did this is that when searching in a search engine, visitors use keywords such as: 16x20x4 filter, 20x25x2 furnace filter, a/c filter 10x20x1, replacement filter online 20x20x1, 16x25x5 filter toronto, and so on... Most keyword searches will include the furnace filter size. Thank you for your help, BigBlaze
On-Page Optimization | BigBlaze205
-
Need YouTube Tips for ranking videos using keywords
Hi, I'm working on building explanation videos to add to our site. The videos will be published on a new YouTube channel I will create and then added to the website. How and where should I use keywords on YouTube to rank well? Are there tips I should know to bring traffic to the site using YouTube? Can anybody give some tips? Thank you, BigBlaze
On-Page Optimization | BigBlaze205
-
Blocking Google from seeing outbound links?
Apart from rewriting the outbound URL to look like a folder ('abc.co.uk/out/link1'), blocking the folder 'out' in the robots.txt file, and also nofollowing the links, is there anything else you can do?
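For readers unfamiliar with the setup described above, a rough sketch (using the '/out/' path from the question as a placeholder; the redirect script itself is not shown) would be, in robots.txt:

# keep crawlers out of the redirect folder
User-agent: *
Disallow: /out/

and in the page HTML:

<a href="/out/link1" rel="nofollow">Partner site</a>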
On-Page Optimization | activitysuper
-
New CMS system - 100,000 old URLs - use robots.txt to block?
Hello. My website has recently switched to a new CMS system. Over the last 10 years or so, we've used 3 different CMS systems on our current domain. As expected, this has resulted in lots of URLs. Up until this most recent iteration, we were unable to 301 redirect or use any page-level indexation techniques like rel="canonical". Using SEOmoz's tools and GWMT, I've been able to locate and redirect all pertinent, PageRank-bearing, "older" URLs to their new counterparts. However, according to Google Webmaster Tools' 'Not Found' report, there are literally over 100,000 additional URLs out there it's trying to find. My question is, is there an advantage to using robots.txt to stop search engines from looking for some of these older directories? Currently, we allow everything, only using page-level robots tags to disallow where necessary. Thanks!
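If you did go the robots.txt route, the rules being discussed would look something like this (the directory names below are invented examples, not the site's actual old CMS paths):

# Keep crawlers out of directories left over from the old CMS installs
User-agent: *
Disallow: /old-cms/
Disallow: /cgi-bin/store/
Disallow: /archive-2005/

One caveat: URLs blocked this way can still appear in the index as URL-only listings, because the crawler never fetches the page to see a 404 or a redirect.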
On-Page Optimization | Blenny
-
Does the SEOmoz crawler that crawls for the on-page reports have a set IP?
I would like to test my site, but it's not launched yet and I don't want anybody to see it. But I can allow myself and others to view the site if I have their IP addresses. So does the SEOmoz crawler have a static IP, or a range? James
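For what it's worth, the usual way to restrict a staging site by IP on an Apache server is an allow-list in .htaccess. A minimal sketch (the addresses below are documentation placeholders; you would substitute your own IP and the crawler's range once known):

# Apache 2.2 syntax: deny everyone, then allow specific visitors by IP
Order Deny,Allow
Deny from all
# your own IP address
Allow from 203.0.113.42
# an IP range, e.g. the crawler's, once you know it
Allow from 203.0.113.0/24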
On-Page Optimization | BarefootJames