How to exclude URL filter searches in robots.txt
-
When I look through my Moz reports I can see they include 'pages' that shouldn't be there, i.e. URLs generated by filtering rules, such as http://www.mydomain.com/brands?color=364&manufacturer=505
How can I exclude all of these filters in the robots.txt? I think it'll be:
Disallow: /*?color=$
Is that the correct syntax with the $ sign in it? Thanks!
-
Unless you're specifically calling out Bing or Baidu in your robots.txt file, they should follow the same directives as Google, so testing with Google's robots.txt tester should suffice for all of them.
-
Yes, but what about Bing and the rest of the search engines?
-
Adrian,
I agree that there certainly is a right answer to the question posted, as the question asks specifically about one way to manage the issue, namely blocking the filters in the robots.txt file. What I was getting at is that this may or may not be the "best" way, and that I'd need to look at your site and your unique situation to figure out which would be the best solution for your needs.
It is very likely that with these parameters a robots.txt block is the best approach, assuming the parameters aren't added by default into category page or category pagination navigational links, as that would affect the bot's ability to crawl the site. Also, if people are linking to those URLs (highly unlikely, though) you may consider a robots meta noindex,follow tag instead so the PageRank can flow to other pages.
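For what it's worth, a minimal sketch of that noindex,follow alternative (hypothetical markup, placed in the `<head>` of each filtered URL such as /brands?color=364):

```html
<!-- Keep the filtered page out of the index, but let bots follow its links -->
<meta name="robots" content="noindex,follow">
<!-- Or, to consolidate link signals instead, point at the unfiltered page -->
<link rel="canonical" href="http://www.mydomain.com/brands">
```

Which of the two you'd use depends on whether you want link equity consolidated (canonical) or the page simply dropped from the index (noindex).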
And I'm not entirely sure the code you provided above will work if the blocked parameter is the first one in the string (e.g. domain.com/category/?color=red) as there is the additional wildcard between the ? and the parameter. I would advise testing this in Google Webmaster Tools first.
- On the Webmaster Tools Home page, click the site you want.
- Under Crawl, click Blocked URLs.
- If it's not already selected, click the Test robots.txt tab.
- Copy the content of your robots.txt file, and paste it into the first box.
- In the URLs box, list the site to test against.
- In the User-agents list, select the user-agents you want (e.g. Googlebot)
-
There certainly is a right answer to my question - I already posted it here earlier today:
Disallow: /*?color=
Disallow: /*?*manufacturer=
Without the $ at the end, which would otherwise denote the end of the URL.
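Put together as a complete file, that would look something like the sketch below. This assumes the rules should apply to all crawlers, and uses the /*?*manufacturer= form so the rule also matches when manufacturer is not the first parameter; note that wildcard support is an extension honored by Google and Bing but not guaranteed for every crawler:

```text
User-agent: *
Disallow: /*?color=
Disallow: /*?*manufacturer=
```

The file must live at the root of the host (e.g. http://www.mydomain.com/robots.txt) to be picked up.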
-
Hello Adrian,
The Moz reports are meant to help you uncover issues like this. If you're seeing non-canonical URLs in the Moz report then there is a potential issue for Google, Bing and other search engines as well.
Google does respect wildcards (*) in the robots.txt file, though they can easily be done wrong. There is no right or wrong answer to the issue of using filters or faceted navigation, as each circumstance is going to be different. However, I hope some of these articles will help you identify the best approach for your needs:
(Note: faceted navigation is not exactly the same as category filters, but the issues and possible solutions are very similar.)
- Building Faceted Navigation That Doesn't Suck
- Faceted Navigation Whiteboard Friday
- Duplicate Content: Block, Redirect or Canonical
- Guide to eCommerce Facets, Filters and Categories
- Rel Canonical How To and Why Not
- Moz.com Guide to Duplicate Content
I don't know how your store handles these filters (e.g. does it add them automatically, or only when a user selects one?), so I can't give you the answer, but I promise that if you read the articles above you will have a very good understanding of all of the options and can choose which is best for you. That might end up being as simple as blocking the filters in your robots.txt file, or you may opt for rel canonical, a noindex meta tag, AJAX, Google parameter handling, etc.
Good luck!
-
It's not Google's index that I'm interested in in this case; it's the Moz reports. Moz was including over 10,000 'pages' because it was indexing these URLs. Now that I know how to edit the robots.txt, Moz will be prevented from indexing them again (we only have around 2,000 real pages, not 10,000).
-
I sought out the answer from a developer and got the following reply, so posting here in case it helps someone else:
To exclude pages with color or manufacturer in them you can use:
Disallow: /*?color=
Disallow: /*?*manufacturer=
The $ sign in your try should be omitted, as it denotes the end of the URL.
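To sanity-check which URLs a rule catches, here is a rough sketch of how the wildcard matching works (the helper names are hypothetical, and this is a simplification, not Google's actual implementation). It uses the /*?*manufacturer= variant so the rule matches the parameter even when it isn't the first one in the query string:

```python
import re

def robots_pattern_to_regex(pattern: str) -> "re.Pattern":
    """Convert a robots.txt Disallow pattern ('*' wildcard, optional
    trailing '$' end-anchor) into a compiled regular expression."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape everything literally except '*', which becomes '.*'
    body = ".*".join(re.escape(part) for part in pattern.split("*"))
    return re.compile(body + (r"\Z" if anchored else ""))

def is_disallowed(url_path: str, disallow_pattern: str) -> bool:
    # Disallow rules match from the start of the path-and-query string
    return robots_pattern_to_regex(disallow_pattern).match(url_path) is not None

# The filter URL from the question:
url = "/brands?color=364&manufacturer=505"
print(is_disallowed(url, "/*?color="))          # True: color is the first parameter
print(is_disallowed(url, "/*?*manufacturer="))  # True: the second '*' skips the parameters in between
print(is_disallowed(url, "/*?color=$"))         # False: '$' requires the URL to end right after 'color='
```

Note that the plain `/*?manufacturer=` form would not match this URL, because the `?` must then sit directly before `manufacturer=`, which is why the extra `*` matters.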
-
Hi
I would recommend excluding these in Google Webmaster Tools. Once logged in to your account, under the "Crawl" menu you will find "URL Parameters". Find the relevant parameter in the list on that page and you can tell Google not to index those pages.
Hope this helps.
Steve