URL Parameters
-
On our webshop we've added some URL parameters. We've set parameters like min_price, filter_cat, filter_color, etc. to "Don't crawl" in Google Search Console. We see that some parameters have 100,000+ URLs and some have 10,000+.
Is it better to add these parameters to the robots.txt file? And if so, how can we write them so the URLs will not be crawled?
Our robots.txt file currently shows:
# Added by SEO Ultimate's Link Mask Generator module
User-agent: *
Disallow: /go/
# End Link Mask Generator output

User-agent: *
Disallow: /wp-admin/
-
Hi,
You might want to read the article on faceted navigation on the Google Webmaster Central blog, which gives some good advice on how to handle the situation. What to use depends a bit on your actual circumstances.
Options include using nofollow links, using a separate subdomain, or blocking in robots.txt (using a separate folder). On Moz there is this article (see the part on faceting); it is mainly about listings sites, but the core problem is more or less the same.
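If you do decide on the robots.txt route, Google supports wildcards in Disallow patterns, so the parameters from your question could be blocked with something like this minimal sketch (using your parameter names; test it in Search Console's robots.txt testing tool before going live):

User-agent: *
# Block crawling of any URL whose query string contains these filter parameters
Disallow: /*min_price=
Disallow: /*filter_cat=
Disallow: /*filter_color=

One caveat: robots.txt only stops crawling, not indexing, and it also stops Google from seeing any canonical or noindex tags on those pages, so URLs that are already indexed may linger in the index for a while.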
Hope this helps,
Dirk
Related Questions
-
URL Indexing with Keyword
Hi, my webpage URL is indexed in Google but doesn't show when searching for the main keyword. How can I get it to rank for the keyword? It should show in the SERPs when the keyword is searched. Any suggestions?
Technical SEO | green.h1
-
Trailing Slashes on URLs
Hi everyone, I have a question on trailing slashes in URLs. The crux of it is this: is having both example.com/subdirectory/ and example.com/subdirectory on all of your subdirectories considered duplicate content by Google, or in some other way really bad? We have done a heck of a lot of research into this, and it would seem no one knows for sure (it is easy to get lost in a sea of Webmaster Tools forum threads from 2012). Google itself has both URLs for its subdirectories (try https://www.google.co.uk/maps and https://www.google.co.uk/maps/), as does Moz; and yet there are rumblings on the internet from people who think you must put a 'redirect' (although not really a redirect, as it isn't a 301) in your .htaccess file to one or the other (so example.com/subdirectory/ would 'forward' to example.com/subdirectory); this is what bbc.co.uk do. We tried putting this .htaccess 'forward' in as an experiment, but I noticed our site then stopped being fully crawled by Googlebot, so we reversed it. Can anyone shed any light?
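For what it's worth, the true-301 version of that .htaccess rule typically looks something like the sketch below (assuming Apache mod_rewrite; the -d condition leaves real directories alone so mod_dir doesn't fight the redirect):

RewriteEngine On
# Skip real directories, then 301 any URL ending in a slash to the slash-less version
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ /$1 [R=301,L]

Whether this is exactly what bbc.co.uk do is an assumption; the point is only that a genuine 301, not an internal forward, is the usual way to consolidate the two forms.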
Technical SEO | NickOrbital0
-
URL Format
Often we have web platforms with a default URL structure that looks something like this: www.widgetcompany.co.uk/widget-gallery/coloured-widgets/red-widgets. This format is quite well structured, but would it be more effective as www.widgetcompany.co.uk/red-widgets? I realise it may depend on a lot of factors, but generally, is it better to have the shorter URL if targeting the key phrase "red widgets"? One thing: it certainly looks a bit keyword-stuffy with all those "widgets".
Technical SEO | vital_hike0
-
Canonical URLs in an eCommerce site
We have a website with 4 product categories (1. ice cream parlors, 2. frozen yogurt shops, etc.). A few sub-categories (e.g. toppings, smoothies) and the products they contain are available in more than one product category (e.g. the smoothies are available in the "ice cream parlors" category, but also in the "frozen yogurt shops" category). My question: unfortunately the website has been designed in such a way that if a sub-category (e.g. smoothies) is available in more than one category, then the sub-category page itself plus all of its product pages are automatically visible under several different URLs. So now I have several URLs for one and the same product: www.example.com/strawberry-smoothie|SMOOTHIES|FROZEN-YOGURT-SHOPS-391-2-5 and http://www.example.com/strawberry-smoothie|SMOOTHIES|ICE-CREAM-PARLORS-391-1-5. And also several for one and the same sub-category (they all contain exactly the same set of products): http://www.example.com/SMOOTHIES-1-12-0-4 (the smoothies contained in the ice cream parlors category) and http://www.example.com/SMOOTHIES-2-12-0-4 (the same smoothies, contained in the frozen yogurt shops category). This is happening with around 100 pages. I would add canonical tags to the duplicates, but I'm afraid that by doing so, a category (frozen yogurt shops) that contains several non-canonical sub-categories (smoothies, toppings, etc.) might no longer show up in search results, or might become irrelevant to Google when someone searches, for example, for "products for frozen yogurt shops". Do you know if this would actually be the case? I hope I explained it well.
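For illustration, a canonical tag on one of the duplicate sub-category URLs could look like the sketch below, using the URLs from the question (which version to declare canonical is your choice):

<!-- On http://www.example.com/SMOOTHIES-2-12-0-4, pointing to the chosen canonical version -->
<link rel="canonical" href="http://www.example.com/SMOOTHIES-1-12-0-4" />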
Technical SEO | Gabriele_Layoutweb0
-
Do keywords in URL parameters count?
I have a client who is on an older ecommerce platform that does not allow URL rewrites in any way, and it would cost a ton of money to custom-develop a solution. Anyway, right now they have set up a parameter on their product URLs to at least get the keyword in there. My question is: will this keyword actually be counted, since it is in a parameter? An example URL is http://domain.com/Catalog.aspx?Level1=01&Level2=02&C=Product-name-here. Does 'product-name-here' count as having the keyword in the URL according to Google?
Technical SEO | webfeatseo0
-
Is there actual risk in having multiple URLs that frame in the main URL? Or is it just bad form and a waste of money?
Client has many URLs that just frame in the main site. It seems like a total waste of money, but if they are frames, is there an actual risk?
Technical SEO | gravityseo0
-
URL restructure and phasing out HTML sitemap
Hi SEOMozzies, Love the Q&A resource and have already found lots of useful stuff! I just started as an in-house SEO at a retailer, and my first main challenge is to tidy up the complex URL structures and remove the ugly sub-sitemap approach currently used. I have already found a number of suggestions, but it looks like I am dealing with several challenges that I need to resolve in a single release.

Here is the current setup: the website is an ecommerce site (department store) with around 30k products, using multi-select navigation (non-Ajax). The main website uses a third-party search engine to power the multi-select navigation, and that search engine has a very ugly URL structure. For example: www.domain.tld/browse?location=1001/brand=100/color=575&size=1 plus various other params, or for multi-select URLs www.domain.tld/browse?location=1001/brand=100,104,506/color=575&size=1 plus various other unused URL params. URLs are easily up to 200 characters long and not at all descriptive to our users. Many of these URLs are indexed by search engines (we currently have 1.2 million of them indexed, including session IDs and all the other nasty URL params).

Next to this, the site uses a "sub site" that is sort of optimized for SEO; I am not 100% sure this is cloaking, but it smells like it. It has a simplified navigation structure and a better URL structure for products. The layout is similar to our main site, but all complex HTML elements like multi-select, large top navigation menus, etc. are removed. Many of these links are indexed by search engines and rank higher than links from our main website. The URL structure is www.domain.tld/1/optimized-url; currently 64,000 of these URLs are indexed. We have links to this sub site in the footer of every page, but a normal customer would never reach it unless they come from organic search. Once a user lands on one of these pages we try to push them back to the main site as quickly as possible.

My planned approach to improve this:

1. Tidy up the URL structure in the main website (e.g. www.domain.tld/women/dresses and www.domain.tld/diesel-red-skirt-4563749). I plan to use solution 2 as described in http://www.seomoz.org/blog/building-faceted-navigation-that-doesnt-suck to block multi-select URLs from being indexed, and would like to use the URL param "location" as an indicator for search engines to ignore the link. A risk here is that all my currently indexed URLs (1.2 million) will be blocked immediately after I put this live. I cannot redirect those URLs to the optimized URLs, as the old URLs should still be accessible.

2. Remove the links to the sub site (www.domain.tld/1/optimized-url) from the footer and redirect (301) all those URLs to the newly created SEO-friendly product URLs (see the sketch below). URLs that cannot be matched, because there is no similar catalog location in the main website, will be redirected (301) to our homepage.

I wonder if this is the correct approach, and whether it would be better to do this in a phased way rather than the currently planned big bang? Any feedback would be highly appreciated; also let me know if anything is unclear. Thanks! Chris
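For illustration, step 2 of the plan could be sketched in Apache mod_rewrite as below. This assumes the sub-site slug matches the new main-site slug, which will not hold for every URL, so unmatched cases would need individual rules or a RewriteMap to send them to the homepage:

RewriteEngine On
# 301 the legacy SEO sub-site URLs (/1/optimized-url) to the matching main-site URL
RewriteRule ^1/(.+)$ /$1 [R=301,L]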
Technical SEO | eCommerceSEO0
-
Dynamic URLs via Refinements
What is the best way to handle large product pages with many different refinement possibilities (e.g. hard drive, 40 gigs, black case, etc.)? All of these refinements add to the length of the URL and potentially create crawling issues, as the URL becomes too dynamic. I have seen people canonicalize all refinements and pages to the main category page, and I have seen others nofollow certain refinements. Also, the SEOmoz crawling report tells me that more than two parameters is bad. What is the best way to handle this? Thanks
Technical SEO | Gordian0