Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Problems with WooCommerce Product Attribute Filter URLs
-
I am running a WordPress/WooCommerce site for a client, and Moz is picking up issues with the URLs generated by WooCommerce product attribute filters. For example:
..co.uk/womens-prescription-glasses/?filter_gender=mens&filter_style=full-rim&filter_shape=oval
How do I get Google to ignore these filters?
I am running Yoast Premium, but I'm not sure whether it can solve this issue. Product categories are canonicalised to the root category URL. Any suggestions gratefully appreciated.
Thanks
Bob
-
You can tell Google which URL parameters to ignore in Google Search Console, under Crawl > URL Parameters > Configure URL Parameters. Google does advise caution when changing how Googlebot handles parameters.
-
If you've got canonicals set up properly, there's not much else you can do. Google should recognize what you're trying to do, but it will still crawl the filtered URLs.
If you don't want the filtered pages indexed and want to minimize crawling of them, you can:
- nofollow the filter links
- block them in robots.txt using wildcards
- set noindex on all filtered URLs
Sketches of the robots.txt and noindex approaches are shown below.
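For illustration, a minimal robots.txt sketch that blocks crawling of the WooCommerce filter parameters. The pattern assumes the default filter_ prefix seen in the URL from the question; adjust it to your own attribute names, and keep in mind that pages blocked in robots.txt can't pass a noindex or canonical signal because Google never fetches them:

```
User-agent: *
# Block any URL whose query string contains a WooCommerce attribute filter (filter_*)
Disallow: /*?*filter_
```

And a hedged sketch of the noindex route as a small WordPress snippet. The hook and the filter_ prefix check are assumptions based on standard WooCommerce layered-nav parameters; if Yoast is active you would normally work through its own robots settings rather than printing a second meta tag, so treat this as a sketch of the idea, not a drop-in fix:

```php
<?php
/**
 * Illustrative only: emit a robots noindex tag on any request that
 * carries a WooCommerce attribute filter parameter (filter_*).
 */
add_action( 'wp_head', function () {
	foreach ( array_keys( $_GET ) as $param ) {
		if ( strpos( $param, 'filter_' ) === 0 ) {
			echo '<meta name="robots" content="noindex, follow">' . "\n";
			return;
		}
	}
}, 1 );
```

Pick one approach or the other: if the filtered URLs are blocked in robots.txt, Googlebot never sees the noindex tag.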
Related Questions
-
What's the best way to handle product filter URLs?
I've been researching and can't find a clear-cut answer. Imagine you have a product category page, e.g. domain/jeans. You've a lot of options as to how to filter the results: domain/jeans?=ladies,skinny,pink,10 or domain/jeans/ladies-skinny-pink-10 or domain/jeans/ladies/skinny?=pink,10. And in each case, how do you handle titles, breadcrumbs, etc.? Is there a way you prefer to handle filters, and why do you do it that way? I'm trying to make my mind up, as some very big names handle this differently, e.g. http://www.next.co.uk/shop/gender-women-category-jeans/colour-pink-fit-skinny-size-10r VS https://www.matalan.co.uk/womens/shop-by-category/jeans?utf8=✓&[facet_filter][meta.tertiary_category][Skinny]=on&[facet_filter][variants.meta.size][Size+10]=on&[facet_filter][meta.master_colour][Midwash]=on&[facet_filter][min_current_price][gte]=6.0&[facet_filter][min_current_price][lte]=18.0&per=36&sort=
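Purely as a hypothetical sketch of the second option (a clean path that maps internally onto the same filtered listing): the path and parameter name below are invented, but this is the usual way a "pretty" filter URL is served from one underlying filtered page rather than being a separate resource:

```
# Hypothetical Apache rewrite - path and parameter name are invented for illustration
RewriteEngine On
RewriteRule ^jeans/ladies-skinny-pink-10/?$ /jeans?filter=ladies,skinny,pink,10 [L]
```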
Technical SEO | RodneyRiley0 -
Assigning WooCommerce products to more than one category - Correct methodology?
I manage a store selling prescription glasses, many of which are unisex or apply to more than one category. I have already assigned the canonical URL for each category, but my question is: if a product appears in more than one category, do I need to set the canonical URL on each product to reflect the category I want it indexed under, so that any additional categories the product appears in simply refer link value back to the canonical URL? I note that in Yoast, under each product, there's a note in the canonical setting to leave it empty to default to the permalink, which has confused me a little. I'm just concerned that by assigning a product to multiple categories, it may be causing duplicate content, as I have a lot of duplicate issues which I'll raise in another question. Thanks!
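For illustration only (the product and category slugs below are invented), the approach being described would mean every category path a product appears under renders the same canonical tag, so only the preferred version is indexed:

```html
<!-- Rendered on BOTH /mens-glasses/aviator-100/ and /unisex-glasses/aviator-100/ -->
<!-- Each copy names the one URL that should be indexed -->
<link rel="canonical" href="https://example.co.uk/mens-glasses/aviator-100/">
```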
Technical SEO | SushiUK1 -
How to handle dynamic product URLs that change regularly
Hey Moz, it's actually my first post, although I look at the Q&As on a daily basis! I was hoping to get your opinions on how to handle dynamic product URLs that can change regularly. Before we start: our product page URLs are populated from the product titles. So the situation is this. Let's say we have a product URL: /product/12345-abcde-fghj/ Then the client decides to change the title a week later, so the URL changes with it to: /listing/12345-klm-qjk Another week later, the agent changes it to: /listing/12345-jkhfk-jhf-kjdhfkjdhf Note that the product ID will always remain the same. Naturally, 301 redirecting every time would cause a little page authority to be lost with each redirect, and potentially creating a few hundred new 301 redirects daily sounds totally mental (I have been informed by the dev that we expect a few hundred URLs to change daily). Although I understand there's no limit on how many 301s you can have on a single domain, this would look completely unnatural - really not ideal. So the potential solution we thought of was: keep the original URL, make sure that is the only URL that gets indexed (/product/12345-abcde-fghj/), and put a canonical tag on any of the new URLs pointing back to the original URL. The problem we will then have is that the most current URL may not exactly match the description of the product, which wouldn't be ideal for UX. Has anyone dealt with issues like this in the past? Would love to get your input! Many thanks
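As a purely illustrative sketch of that canonical approach, using the hypothetical URLs from the question (example.com stands in for the real domain): every renamed variant would carry the same canonical tag pointing at the one URL that should be indexed:

```html
<!-- Served on /listing/12345-klm-qjk, /listing/12345-jkhfk-jhf-kjdhfkjdhf, and any later rename -->
<!-- All variants point back to the original, indexable URL -->
<link rel="canonical" href="https://www.example.com/product/12345-abcde-fghj/">
```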
Technical SEO | MH-UK0 -
Duplicate content on Product pages for different product variations.
I have multiple colors of the same product, and as a result I'm getting duplicate content warnings. I want to keep these as separate products with their own pages, so that each color can be easily identified when browsing the category page. Any suggestions?
Technical SEO | bobjohn10 -
Why is Google's cache preview showing a different version of the webpage (i.e. not displaying content)?
My URL is: http://www.fslocal.com. Recently, we discovered Google's cached snapshots of our business listings look different from what's displayed to users. The main issue? Our content isn't displayed in cached results (while the content isn't visible on the front end of cached pages, the text can be found when you view the page source of that cached result). These listings are structured so everything is coded and contained within one page (e.g. http://www.fslocal.com/toronto/auto-vault-canada/). But even though the URL stays the same, we've created separate "pages" of content (e.g. "About," "Additional Info," "Contact," etc.) for each listing, and only one "page" of content will ever be displayed to the user at a time. This is controlled by JavaScript and by using display:none in CSS. Why do our cached results look different? Why would our content not show up in Google's cache preview, even though the text can be found in the page source? Does it have to do with the way we're using display:none? Are there negative SEO effects with regards to how we're using it (i.e. we're employing it strictly for aesthetics, but is it possible Google thinks we're trying to hide text)? Google's Technical Guidelines recommend against using "fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash." If we were to separate those business listing "pages" into actual separate URLs (e.g. http://www.fslocal.com/toronto/auto-vault-canada/contact/ would be the "Contact" page), and employ static HTML instead of complicated JavaScript, would that solve the problem? Any insight would be greatly appreciated. Thanks!
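For context, a stripped-down, hypothetical sketch of the kind of markup being described: every "page" of the listing is present in the HTML source of the single URL, but only one section is visible at a time, with a script (not shown) toggling the display:none styles.

```html
<!-- All sections exist in the source of the one listing URL. -->
<!-- A script (omitted here) switches which section is visible. -->
<div id="about">About the business...</div>
<div id="additional-info" style="display:none">Additional info...</div>
<div id="contact" style="display:none">Contact details...</div>
```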
Technical SEO | fslocal0 -
Structuring URLs for better SEO
Hello, we were rolling out fresh URLs for our new service website. Currently we have the structure www.practo.com/health/dental/clinic/bangalore and we would like to have it as www.practo.com/health/dental-clinic-bangalore. Can someone advise which of the two structures would work better and why? Should this be a focus of attention going forward, since this is essentially a search platform for patients looking for actual doctors? Thanks, Aditya
Technical SEO | shanky10 -
Should we use Google's crawl delay setting?
We've been noticing a huge uptick in Google's spidering lately, and along with it a notable worsening of render times. Yesterday, for example, Google spidered our site at a rate of 30:1 (Google spider vs. organic traffic) - in other words, for every organic page request, Google hits the site 30 times. Our render times have lengthened to an average of 2 seconds (and up to 2.5 seconds). Before this renewed interest Google has taken in us, we were seeing closer to one-second average render times, and often half of that. A year ago, the ratio of spider to organic was between 6:1 and 10:1. Is requesting a crawl-delay from Googlebot a viable option? Our goal would be only to reduce Googlebot traffic, and hopefully improve render times and organic traffic. Thanks, Trisha
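For reference, a crawl delay is just a robots.txt directive, as in the hypothetical snippet below. Note that Googlebot does not obey Crawl-delay, so for Google the crawl rate has to be limited via Search Console instead; the directive mainly affects other crawlers that honour it.

```
# Hypothetical robots.txt snippet - the number is illustrative only
User-agent: *
Crawl-delay: 10

# Googlebot ignores Crawl-delay; Google's crawl rate is managed in Search Console
```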
Technical SEO | lzhao0 -
A question about RSS feeds and nofollows
With the nofollow tag used very widely on the internet these days, I was wondering whether an RSS feed might help me find a way around it. Basically my question is this: I post a comment on a blog, it's approved, and my comment, together with my link (nofollow tag applied), is there. Now, when the blog's RSS feed updates, does this nofollow tag get applied to the feed? As far as I can tell it does not - but I'm not too clued up on how the feed is generated. Anyone want to help me understand how it works and whether what I'm suggesting would be 'a way around the nofollow tag'? Thanks 🙂
Technical SEO | DanHill0