Need Help With Robots.txt on Magento eCommerce Site
-
Hello, I am having difficulty getting my robots.txt file configured properly. I am getting error emails from Google products stating they can't view our products because they are being blocked, and this past week, in my SEO dashboard, the URLs receiving search traffic dropped by almost 40%.
Is there anyone who can offer assistance with a good template robots.txt file for a Magento eCommerce website?
The one I am currently using was found here: e-commercewebdesign.co.uk/blog/magento-seo/magento-robots-txt-seo.php - however, I am now getting problems from Google because of it.
I searched and found this thread: http://www.magentocommerce.com/wiki/multi-store_set_up/multiple_website_setup_with_different_document_roots#the_root_folder_robots.txt_file - but I felt I should get some additional help on properly configuring robots.txt for a Magento site.
Thanks in advance for any help. Please let me know if you need more info to provide assistance.
-
You'd better back up your DB before doing that. Anyway, take a look at this Magento Connect extension: http://www.magentocommerce.com/magento-connect/MageWorx.com/extension/2852/seo-suite-enterprise#overview
or this one (it's by the same company):
http://www.mageworx.com/seo-suite-pro-magento-extension.html
-
Thank you very much. We'll give that a shot and see how it goes. What started us tinkering with the robots.txt file in the first place is that Bing Shopping told us it couldn't crawl our product images. Plus, our PDF files for product specs and manuals are all stored in the media folder. Do you have a suggestion for this? I would think we should get rid of "Disallow: /media/" and replace it with the following (what do you think?):
Disallow: /media/aitmanufacturers/
Disallow: /media/bigtom_media/
Disallow: /media/css/
Disallow: /media/downloadable/
Disallow: /media/easybanner/
Disallow: /media/geoip/
Disallow: /media/icons/
Disallow: /media/import/
Disallow: /media/js/
Disallow: /media/productsfeed/
Disallow: /media/sales/
Disallow: /media/tmp/
Disallow: /media/UPS/
-
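As a sanity check on the idea above, the intended behavior of those per-subfolder rules can be exercised with Python's standard `urllib.robotparser` (the subfolder names and example URLs below are illustrative, not a confirmed listing of the site's paths):

```python
from urllib import robotparser

# A hypothetical subset of the proposed rules: block utility
# subfolders of /media/ while leaving product images and spec
# PDFs elsewhere under /media/ crawlable.
rules = """\
User-agent: *
Disallow: /media/tmp/
Disallow: /media/import/
Disallow: /media/downloadable/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Product images and PDFs under other /media/ paths stay fetchable:
print(rp.can_fetch("*", "http://example.com/media/catalog/product/shoe.jpg"))  # True
print(rp.can_fetch("*", "http://example.com/media/specs/manual.pdf"))          # True
# The explicitly listed utility folders remain blocked:
print(rp.can_fetch("*", "http://example.com/media/tmp/cache.dat"))             # False
```

This is only a quick local check of prefix matching; Google Search Console's robots.txt tester is the authoritative way to verify what Googlebot will actually fetch.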
Hello,
Below is what I use. You need to have mod_rewrite enabled if you are going to disallow index.php, and even then it's still very risky; this may be part of the issue. Robots.txt is important, but you need to know what you are doing, especially when disallowing as much as that UK site does.
Tyler
User-agent: *
Disallow: /*?
Disallow: /*.js$
Disallow: /*.css$
Disallow: /checkout/
Disallow: /catalogsearch/
Disallow: /review/
Disallow: /app/
Disallow: /downloader/
Disallow: /images/
Disallow: /js/
Disallow: /lib/
Disallow: /media/
Disallow: /*.php$
Disallow: /pkginfo/
Disallow: /report/
Disallow: /skin/
Disallow: /var/
Disallow: /customer/
Disallow: /enable-cookies/
Sitemap: http://domain.com/sitemap.xml
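A note on the wildcard rules at the top of that file: `Disallow: /*?` and the `$`-anchored patterns rely on Google's extensions to robots.txt syntax, where `*` matches any run of characters and `$` anchors the match at the end of the URL. In particular, `/*?` blocks every URL containing a query string, including layered navigation and pagination, which can sharply reduce crawled URLs. A rough sketch of how that style of matching works (an illustration, not Google's actual implementation):

```python
import re

def google_rule_matches(pattern: str, path: str) -> bool:
    """Return True if a robots.txt pattern using Google's * and $
    extensions matches the given URL path (prefix-anchored)."""
    anchored_end = pattern.endswith("$")
    if anchored_end:
        pattern = pattern[:-1]
    # Translate: * becomes .*, everything else is matched literally.
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    if anchored_end:
        regex += "$"
    return re.match(regex, path) is not None

# "Disallow: /*?" blocks every URL with a query string:
print(google_rule_matches("/*?", "/shoes?color=red"))         # True
print(google_rule_matches("/*?", "/shoes"))                   # False
# "Disallow: /*.css$" blocks only paths ending in .css:
print(google_rule_matches("/*.css$", "/skin/style.css"))      # True
print(google_rule_matches("/*.css$", "/skin/style.css?v=2"))  # False
```

Be aware that Python's standard `urllib.robotparser` does plain prefix matching and does not honor these `*`/`$` extensions, so wildcard rules like these should be verified with Google Search Console's robots.txt tester rather than with the stdlib parser.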