Should I disallow all URL query strings/parameters in Robots.txt?
-
Webmaster Tools correctly identifies the query strings/parameters used in my URLs, but still reports duplicate title tags and meta descriptions for the original URL and the versions with parameters. For example, Webmaster Tools would report duplicates for the following URLs, despite it correctly identifying the "cat_id" and "kw" parameters:
/Mulligan-Practitioner-CD-ROM
/Mulligan-Practitioner-CD-ROM?cat_id=87
/Mulligan-Practitioner-CD-ROM?kw=CROM
Additionally, these pages have self-referential canonical tags, so I would think I'd be covered, but I recently read that another Mozzer saw a great improvement after disallowing all query/parameter URLs, despite Webmaster Tools not reporting any errors.
As I see it, I have two options:
- Manually tell Google that these parameters have no effect on page content via the URL Parameters section in Webmaster Tools (in case Google is unable to automatically detect this, and I am being penalized as a result).
- Add "Disallow: *?" to hide all query/parameter URLs from Google. My concern here is that most backlinks include the parameters, and in some cases these parameter URLs outrank the original.
Any thoughts?
-
Correct. They won't be indexed but are still followed.
-
The statement was in a response to a question I asked earlier.
"I was having an issue like this where moz was showing a lot more duplicate content than webmaster tools was, actually webmaster tools showed none, but I was being penalized. I realized this when I added an exclusion to robots.txt to exclude any query strings on my site. After I did this I saw my rankings shoot through the roof."
Thanks for the info. I did edit the settings in the URL parameters section to tell Google that these parameters do not change the page content, so it should now index only one representative URL. My only concern was that the kw (keyword) parameter does change page content for search result pages, but I just read that Matt Cutts encourages disallowing those pages anyway.
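If it turns out only the search result pages need blocking, I assume a narrower rule would work, something like the following (kw being the site-search parameter from my example URLs above):
User-agent: *
# Block only URLs where the kw (site search) parameter appears in the query string
Disallow: /*?kw=
Disallow: /*&kw=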
Just to verify, disallowing those pages with parameters won't affect the "link juice" passed from external links?
-
Hi there
I recently answered a similar question in the Q&A that references resources to help you help Google understand and categorize these parameters. You can read that here.
That being said, blocking these parameters in your robots.txt will not affect your rankings, especially if those parameter/query-string URLs are properly canonicalized to the correct product page.
Even so, I would make sure you understand the resources and options above, as you know your users and website better than anyone - test on a few pages to see what happens and go from there.
Hope this helps! Good luck!
-
"I recently read that another Mozzer saw a great improvement after disallowing all query/parameter URLs" - do you have a link for this?
Canonicals should be enough but Google does mess up and the more clues you can give them, the better.
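For example, the parameter versions would ideally carry a canonical pointing at the clean URL, so every version consolidates its signals on one page (the domain here is just a placeholder):
<link rel="canonical" href="https://www.example.com/Mulligan-Practitioner-CD-ROM" />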
You can also manually tell Google what the parameters mean (if you check your parameters page in Search Console now, you should see all of the parameters they've detected for you - you can just change their meaning).
I don't see any harm in disallowing parameters via robots.txt. They will still be crawled and internal links followed, just not indexed in the SERPs.
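If you want to sanity-check which URLs a wildcard rule would actually catch before rolling it out, something like this rough script approximates Google-style pattern matching (a simplified sketch with a made-up helper, not Google's official parser; the URLs are the ones from your question):
import re

def blocked_by_rule(path_and_query, disallow_pattern):
    # Build a regex from a robots.txt pattern: '*' matches any run of
    # characters, and a trailing '$' anchors the match at the end of the URL.
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in disallow_pattern)
    if disallow_pattern.endswith("$"):
        regex = regex[:-2] + "$"   # re.escape("$") is two characters
    return re.match(regex, path_and_query) is not None

# Example URLs checked against a blanket "Disallow: /*?" rule
print(blocked_by_rule("/Mulligan-Practitioner-CD-ROM", "/*?"))            # False
print(blocked_by_rule("/Mulligan-Practitioner-CD-ROM?cat_id=87", "/*?"))  # True
print(blocked_by_rule("/Mulligan-Practitioner-CD-ROM?kw=CROM", "/*?"))    # True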