The SEOmoz crawler is being blocked by robots.txt - need help
-
SEOmoz is showing me that my robots.txt file is blocking content on my site.
-
Jason, if you can post the contents of your robots.txt file, or give us a link to the site in question, we can help you diagnose what is happening.
A second question is -- what type of content is being blocked? If it's a directory like /admin that is being blocked, the robots.txt is likely working as intended.
You can also verify your site in Google Webmaster Tools and look in there at the crawling section, as it will tell you what pages Googlebot hasn't been able to crawl. Google offers some help at http://googlewebmastercentral.blogspot.com/2008/03/speaking-language-of-robots.html.
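To illustrate the difference between accidental and intended blocking, here are two hypothetical robots.txt files (the paths are made up for the example, so compare them against what is actually in your own file):

```
# Accidental: blocks ALL compliant crawlers from the ENTIRE site
# (often left over from a staging/development server)
User-agent: *
Disallow: /
```

```
# Intended: blocks only a private directory; everything else stays crawlable
User-agent: *
Disallow: /admin/
```

A single stray `Disallow: /` is enough to shut out both SEOmoz's crawler and Googlebot, so that is the first thing to look for.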
-
Hi Jason,
What's in your robots.txt file? It will be a text file in the root directory of your website. If you could share the contents we can help.
-
Another idea: open your robots.txt file directly and see what is going on.
You can use Google Webmaster Tools to help you build a proper robots.txt file.
Best of luck
-
Open your .htaccess file (renaming a copy with a .txt extension makes it easy to view) and check whether it blocks certain robots from crawling your pages. If it does, remove those rules, upload the file back to your server, and restore the original filename.
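For reference, when .htaccess files do block crawlers, it is usually done with mod_rewrite conditions on the User-Agent header. This is a hypothetical example of the kind of rule to look for and remove, not something to add:

```
# Hypothetical bot-blocking rules of the kind sometimes found in .htaccess
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} rogerbot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} dotbot [NC]
RewriteRule .* - [F,L]
```

rogerbot is the user agent SEOmoz's crawler uses to identify itself, so conditions matching it, combined with a blanket 403 like the RewriteRule above, would explain blocked crawls even when robots.txt looks fine.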
-
What needs to be done in the .htaccess file? Can anyone give me a step-by-step process?
-
I would look at your .htaccess file.
Related Questions
-
Meta Robots information
Hi, I have a question about the Meta Robots information. According to the MozBar, our page uses the meta robots noodp and noydir. Our competitor uses INDEX,FOLLOW. I read that noodp and noydir are dated and not used anymore. Is it wise to use INDEX,FOLLOW instead for better SEO? Thanks in advance!
On-Page Optimization | AdoBike
-
I have a question about having too much content on a single page. Please help :)
I am working on a music-related site. We are building a feature in our system to allow people to write information about songs on their playlist, so when a song is being played a user can read some cool facts or information about it. http://imgur.com/5jFumPW (screenshot). Some playlists have over 100 songs and could be completely random in genre and artist. If some of these playlists have over 5,000 words of content, is that going to hurt us? We will be very strict about making sure it's non-spammy, good content. Also, for the titles of the content, is it bad to have over 100 h3 tags on one page? Just want to make sure we are on the right track. Any advice is greatly appreciated.
On-Page Optimization | mikecrib1
-
Hit by Panda - Google Disavow Help
Hi, I hope you can help me. A website I manage has been hit hard by the Panda update. I am really struggling to understand what is seen as a spammy link. The website used to be on page 1 for "fancy dress"; now it isn't visible for that term at all, and the site has dropped for most other terms too. I have looked into what might have gone wrong and have removed several links, used the disavow tool 2-3 times, and submitted reconsideration requests, but each time Google informs me that they are still detecting unnatural links. Could somebody please take a look at our link profile (www.partydomain.co.uk), using "fancy dress" as an example, and show examples of links you would consider that Google might not like? It would also be good if anybody had any contacts in the UK that could help. Thanks, Adam
On-Page Optimization | AMG100
-
Can we list URLs on the website sitemap page which are blocked by robots.txt?
Hi, I need your help here. I have a website, and a few pages are created to be country-specific (e.g. www.example.com/uk). I have blocked many country-specific pages in my robots.txt file. Is it advisable to list those URLs (blocked by robots.txt) on my website's sitemap (the HTML sitemap page)? I really appreciate your help. Thanks, Nilay
On-Page Optimization | Internet-Marketing-Profs
-
Too many links hurting me even though they are helping users
I have a Scrabble-based site which functions as an anagram solver and Scrabble dictionary lookup, with tons of different word lists. In each of these word lists I link every word to my Scrabble dictionary. This has caused Google to index 10,018 pages total for my site; over 300 of them have well over 100 links, and many contain over 1,000 links. I know Google's and SEOmoz's stance that anything over 100 will hurt me. I have always seen the warnings in my dashboard about this, but I have simply ignored them. I have posted on this Q&A that I have this issue, but IMO these links benefit users in that they don't have to worry about copying the text and putting it in the search box; they can simply click the link. Some have said if it helps the users then I am good; others have said the opposite. I am thinking about removing these links from all these word lists to reduce the links per page. My questions are these. 1. If I remove the links from my pages, could this possibly help me? No harm in trying it out, so this is an easy question. 2. If I remove the links, I will have over 9,000 pages indexed with Google that no longer have a link pointing to them, other than the fact that they are still in Google's index. Is it going to hurt me if I remove these links and Google no longer sees them linked from my site or anywhere else?
On-Page Optimization | cbielich
-
Dropped rankings due to too many links, help needed
We have just redesigned our site and added a big drop-down navigation menu to help users get straight to the category/sub-category they are looking for; take a look at http://www.discountfiresupplies.co.uk to see what I mean. Since doing so our rankings have dropped, which we've been told may be because there are now so many links on (particularly) the home page, diluting the rankings of that page. We've been advised that if we hide the drop-down menus using style='display: none' until they are required, the search engines will ignore them, which we have now done. But is this correct, or will they still be indexed? And if so, do you have any other suggestions? Thanks, Tariq
On-Page Optimization | tjhossy
-
How do we handle sitemaps in robots.txt when multiple domains point to same physical location?
We have www.mysite.net, www.mysite.se, www.mysite.fi and so on. All of these domains point to the same physical location on our web server, and we replace the texts given back to the client depending on which domain he/she requested. My problem is this: how do I configure sitemaps in robots.txt when robots.txt is used by multiple domains? If I for instance put the rows Sitemap: http://www.mysite.net/sitemapNet.xml and Sitemap: http://www.mysite.net/sitemapSe.xml in robots.txt, would that result in some cross-submission error?
On-Page Optimization | nordicnetproducts
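One approach worth considering (a sketch, not a definitive answer): since the server already varies the returned text by requested domain, it can serve a different robots.txt per domain too, with each one's Sitemap: directive pointing at the sitemap for that host. The Sitemap: line takes a full absolute URL, and keeping each sitemap on the host it describes sidesteps the cross-submission question entirely. The domains and filenames below are the ones from the question:

```
# robots.txt as served when www.mysite.net is requested
User-agent: *
Disallow:

Sitemap: http://www.mysite.net/sitemapNet.xml
```

```
# robots.txt as served when www.mysite.se is requested
User-agent: *
Disallow:

Sitemap: http://www.mysite.se/sitemapSe.xml
```

Listing both sitemaps under the .net host, as in the question, is the case where cross-submission rules come into play, so serving per-host files like this is the more conservative setup.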