The SEOmoz crawler is being blocked by robots.txt -- need help
-
SEOmoz is showing me that robots.txt is blocking content on my site.
-
Jason, if you can post the contents of your robots.txt file, or give us a link to the site in question, we can help you diagnose what is happening.
A second question is -- what type of content is being blocked? If it's a directory like /admin that is being blocked, the robots.txt is likely working as intended.
You can also verify your site in Google Webmaster Tools and look in there at the crawling section, as it will tell you what pages Googlebot hasn't been able to crawl. Google offers some help at http://googlewebmastercentral.blogspot.com/2008/03/speaking-language-of-robots.html.
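If you'd rather check programmatically, Python's standard library can parse a robots.txt file and tell you whether a given user agent may fetch a URL. A minimal sketch, assuming a placeholder robots.txt and example.com standing in for your site; rogerbot is the user agent Moz's crawler identifies as:

```python
from urllib.robotparser import RobotFileParser

# Placeholder robots.txt contents -- substitute your site's actual file.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Check specific URLs against the rules:
print(rp.can_fetch("rogerbot", "https://www.example.com/admin/login"))  # -> False
print(rp.can_fetch("rogerbot", "https://www.example.com/blog/post"))    # -> True
```

Against a live site you would use `rp.set_url("https://yoursite.com/robots.txt")` followed by `rp.read()` instead of `parse()`, which fetches and parses the file over the network.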
-
Hi Jason,
What's in your robots.txt file? It will be a text file in the root directory of your website. If you could share the contents we can help.
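For reference, the difference between a harmless robots.txt and one that blocks your whole site can be a single character. Two hypothetical examples:

```
# Example A: blocks every compliant crawler from the ENTIRE site
# (often an accident left over from a development environment):
User-agent: *
Disallow: /

# Example B: blocks only the /admin/ directory, which is usually intentional:
User-agent: *
Disallow: /admin/
```

Note these are two alternative files, not one; repeating a `User-agent: *` group in a single file can behave inconsistently across crawlers.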
-
Another idea: open your robots.txt file directly and see what is going on.
You can use Google Webmaster Tools to help you build a proper robots.txt file.
Best of luck
-
Download your .htaccess file (rename it with a .txt extension so you can open it in a text editor) and check whether it contains rules that block certain robots from crawling your pages. If it does, remove those rules, rename the file back to .htaccess, and upload it to your server again.
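For illustration, hypothetical .htaccess rules of the kind to look for -- these would return 403 Forbidden to Moz's crawler (rogerbot) by matching its user agent:

```apache
RewriteEngine On
# Deny any request whose User-Agent contains "rogerbot" (case-insensitive):
RewriteCond %{HTTP_USER_AGENT} rogerbot [NC]
RewriteRule .* - [F,L]
```

If rules like these exist and you want the bot to crawl, delete them. Note that .htaccess blocking is separate from robots.txt: robots.txt asks crawlers not to fetch pages, while .htaccess refuses the requests outright.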
-
What needs to be done in the .htaccess file? Can anyone give me a step-by-step process?
-
I would look at your .htaccess file.
Related Questions
-
Need advice on the better URL structure to go with
I am rebuilding our existing website on a new platform and need advice on which URL structure would be the most ideal. The following examples are of a product that we have with a very long page title. Not all of our products have titles this long, but enough of them do to cause some concern. I was also wondering if I should end the URL with the file type .html or if leaving it out is better. Thanks in advance!

OPTION 1: uses just the root domain and the entire product title separated by dashes
http://ewheels.nextmp.net/staggered-full-set-br-2-20x9-ace-alloy-aff01-metallic-silver-machined-face-flow-formed-br-2-20x10-5-ace-alloy-aff01-metallic-silver-machined-face-flow-formed

OPTION 2: uses the crawl path as well as the entire product title
http://ewheels.nextmp.net/wheels/ace-alloy-wheels/ace-alloy-aff01-metallic-silver-machined-face-flow-formed/staggered-full-set-br-2-20x9-ace-alloy-aff01-metallic-silver-machined-face-flow-formed-br-2-20x10-5-ace-alloy-aff01-metallic-silver-machined-face-flow-formed

OPTION 3: uses the crawl path and just the part number at the end, since the folders already contain all the keywords necessary
http://ewheels.nextmp.net/wheels/ace-alloy-wheels/ace-alloy-aff01-metallic-silver-machined-face-flow-formed/ace-2090aff01silace-20105aff01sil
On-Page Optimization | elementmotor0
-
Does name of town in title tag help if queries don't include the town name?
Hi. If I'm targeting local traffic online and the search volume data for keywords in the area does not include the town names (according to Keyword Planner), does it still help to keep the town names in the title tag? Does Google deliver local results based on location names in the title tag even if the query didn't mention them?
On-Page Optimization | Morris770
-
Google ranking is HORRIBLE. Following SEOMoz suggestions and just can't climb.
First of all, the URL is stores.dhsequipment.com. In January, this online store switched from Homestead to Big Commerce. Since the store updated, we decided now is the time to update our product descriptions, URLs, title tags and meta descriptions. (For the first time, we had the ability to customize our URLs.) Product descriptions: I went through 2,500 products and updated each product description. I added an H1 and H2 to each description, and included pertinent information such as part numbers. Each product also received a new page title, meta description (which is usually the first line of the product description; I don't know if this is bad or not) and a new URL (which did redirect). Once I would complete a section, I would submit a new sitemap to Webmaster Tools. After a month and nothing happening, I started using SEOmoz, which helped me rebuild some of my more important pages, such as the home page and main category pages like http://stores.dhsequipmentparts.com/stihl-ts420-parts/ and http://stores.dhsequipmentparts.com/stihl-ts700-parts-stihl-ts800-parts/ I fetched these pages in Webmaster Tools after completion. However, it's been several weeks since and I'm still on page 4 or 5 in the SERPs. Just a little history on the store: this store has been in operation for more than 6 years. Previously, we ranked on page one for 75%+ of our products. My belief is that this was because our URLs had history, probably more so than our competitors'. I'm not sure what I should do. Business is super slow and we can't afford to wait much longer.
On-Page Optimization | pearldesign
-
How to exclude URL filter searches in robots.txt
When I look through my MOZ reports I can see it's included 'pages' which it shouldn't have included i.e. adding filtering rules such as this one http://www.mydomain.com/brands?color=364&manufacturer=505 How can I exclude all of these filters in the robots.txt? I think it'll be: Disallow: /*?color=$ Is that the correct syntax with the $ sign in it? Thanks!
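Patterns like the one above can be sanity-checked against Google's documented wildcard rules, where `*` matches any run of characters and `$` anchors the end of the URL. A sketch approximating those rules with a regex translation (not an official implementation; note that Python's built-in urllib.robotparser does not support these wildcards):

```python
import re

def google_style_match(pattern: str, path: str) -> bool:
    """Match a robots.txt Disallow pattern per Google's documented rules:
    '*' matches any sequence of characters, '$' anchors the end of the URL,
    and matching is anchored at the start of the path."""
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    return re.match(regex, path) is not None

# With the trailing $, the rule only matches URLs ending exactly in "?color=":
print(google_style_match("/*?color=$", "/brands?color=364&manufacturer=505"))  # -> False
# Without the $, it matches any URL whose path contains "?color=":
print(google_style_match("/*?color=", "/brands?color=364&manufacturer=505"))   # -> True
```

In other words, the `$` anchor makes a rule stricter, not broader, which is worth keeping in mind when the filter parameters carry values after the `=`.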
On-Page Optimization | neenor0
-
I have more pages in my site map being blocked by the robot file than I have being allowed to be crawled. Is Google going to hate me for this?
I'm using some rules to block all pages whose URLs start with "copy-of" on my website, because people have a bad habit of duplicating new product listings to create our refurbished, surplus, etc. listings for those products. To avoid Google seeing these as duplicate pages I've blocked them in robots.txt, but of course they are still automatically generated in our sitemap. How bad is this?
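The blocking rule described above presumably looks something like this (a sketch; the exact prefix is an assumption):

```
User-agent: *
Disallow: /copy-of
```

Since Disallow matches by URL prefix, this blocks every path beginning with /copy-of. Ideally the sitemap generator would also be configured to skip these URLs, since a sitemap is meant to list only pages you want crawled and indexed.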
On-Page Optimization | absoauto0
-
Rankings going down and down. Help!
I just joined a company as an in-house SEO. When I looked at their rankings I noticed a downward trend. How can I reverse it? I'm currently working on their on-site optimization, but is there anything more I can do?
On-Page Optimization | EcomLkwd0
-
Too Many On-Page Links Reported By SEOmoz
Hi, I recently ran a crawl report for my blog dapazze.com and found that SEOmoz is reporting that many pages on my blog have more than 100 internal links. I opened OSE and searched for one of my pages which was reported to contain more than 100 links, and found it to contain 464 internal links. Here is the link: http://www.opensiteexplorer.org/links?page=1&site=dapazze.com%2F2012%2F10%2Fwin-a-commentluv-premium-single-site-and-multi-site-license-worth-about-154-giveaway-of-october%2F&sort=page_authority&filter=&source=internal&target=page&group=0 Please have a look at it. I have chosen the Show "All" links from "only internal" pages to "this page" option in OSE, which reports this. I see almost every page in my blog linking to every other page, and this is not unique to me: I checked some popular blogs, like ProBlogger.net, ShoutMeLoud.com, HellBoundBloggers.com, etc., and all of them show the same pattern. Should I be worried about this? What exactly is the problem?
On-Page Optimization | rahulchowdhury0
-
Do Blog Comments On Your Site Help SEO?
There is a lot of debate as to whether having comments on your blog is helpful from an SEO perspective. Proponents believe that more comments (1) create more content, which search engines love, (2) create more relevant keywords that can be searched, and (3) help with the "freshness" of the site/content, leading to greater site authority. Others, like Joost de Valk, believe that comments can actually hurt SEO because keyword density cannot be controlled. He argues that his top SEO content consists of pages, not posts, for this very reason. What is your opinion?
On-Page Optimization | marcperry0