Search Engine Blocked by robots.txt warnings for filter search result pages: why?
-
Hi,
We're getting yellow 'Search Engine Blocked by robots.txt' warnings for URLs that are, in effect, product search filter result pages (see the example below) on our Magento ecommerce shop. To my mind our robots.txt file is set up correctly, i.e. we would not want Google to index these pages. So why does SEOmoz flag this type of page with a warning? Is there any implication for our ranking? Is there anything we need to do about it? Thanks.
Here is an example URL that SEOmoz reports the search engines can't see:
http://www.site.com/audio-books/audio-books-in-english?audiobook_genre=132
Below are the current entries in our robots.txt file.
User-agent: Googlebot
Disallow: /index.php/
Disallow: /?
Disallow: /.js$
Disallow: /.css$
Disallow: /checkout/
Disallow: /tag/
Disallow: /catalogsearch/
Disallow: /review/
Disallow: /app/
Disallow: /downloader/
Disallow: /js/
Disallow: /lib/
Disallow: /media/
Disallow: /.php$
Disallow: /pkginfo/
Disallow: /report/
Disallow: /skin/
Disallow: /utm
Disallow: /var/
Disallow: /catalog/
Disallow: /customer/
Sitemap: -
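As a quick sanity check on which URLs a file like this actually blocks, here is a minimal sketch using Python's standard urllib.robotparser, fed a representative subset of the rules above. The URLs are hypothetical, based on the example in the question, and this parser does plain prefix matching without the * and $ wildcards Googlebot supports, so treat the output as a rough guide rather than Google's exact behaviour.

```python
# Minimal sketch: check URLs against a robots.txt with Python's
# standard library. Caveats: urllib.robotparser does plain prefix
# matching, ignores the * and $ wildcards that Googlebot supports,
# and handles edge cases like "Disallow: /?" differently from other
# parsers, so treat the output as a rough guide only.
from urllib.robotparser import RobotFileParser

# Representative subset of the rules quoted above.
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /index.php/
Disallow: /?
Disallow: /catalogsearch/
Disallow: /catalog/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Hypothetical URLs based on the example in the question.
urls = [
    "http://www.site.com/audio-books/audio-books-in-english?audiobook_genre=132",
    "http://www.site.com/catalogsearch/result/?q=audio",
    "http://www.site.com/audio-books/audio-books-in-english",
]

for url in urls:
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{verdict:8} {url}")
```

When the output disagrees with what a crawler reports, the robots.txt testing tool in Google Webmaster Tools is the authoritative check for Googlebot's own interpretation.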
Thanks, Keri, for your advice.
-
Thanks, Rick, for your advice.
-
Like Rick said, it's just a "hey, make sure you really wanted to do this" type of warning, since it's easy to write a robots.txt that blocks things you didn't think would be blocked. Or someone else can modify the robots.txt without telling you, in which case the warning is your cue to track that person down and get it fixed.
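If the silent-edit scenario is a concern, a lightweight watchdog can flag changes before a crawler trips over them. A minimal sketch, assuming a hypothetical robots.txt URL and a locally saved known-good copy:

```python
# Minimal watchdog sketch: fetch the live robots.txt and compare it
# against a saved known-good copy, so silent edits get noticed.
# ROBOTS_URL and KNOWN_GOOD are hypothetical placeholders.
import hashlib
import urllib.request

ROBOTS_URL = "http://www.site.com/robots.txt"
KNOWN_GOOD = "robots_known_good.txt"

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

live = urllib.request.urlopen(ROBOTS_URL, timeout=10).read()
with open(KNOWN_GOOD, "rb") as f:
    baseline = f.read()

if digest(live) != digest(baseline):
    print("WARNING: live robots.txt differs from the known-good copy!")
else:
    print("robots.txt unchanged.")
```

Run it from a scheduler such as cron and any unannounced edit surfaces on the next run.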
-
So what you're saying is:
1. SEOmoz says these pages can't be crawled by search engines because of our robots.txt.
2. We don't want these pages indexed, and we blocked them using robots.txt.
My initial reaction is: no problem. SEOmoz is just showing you a 'confirmation warning' that these pages are blocked; since you did that on purpose, it's okay.
Hope this helps!