Best way to handle page filters and sorts
-
Hello Mozzers, I have a question that has to do with the best way to handle filters and sorts with Googlebot.
I have a page that returns a list of widgets. I have a "root" page about widgets and then filter and sort functionality that shows basically the same content but adds parameters to the URL. For example, if you filter the page of 10 widgets by color, the page returns the 3 red widgets at the top and the 7 non-red widgets at the bottom. If you sort by size, the page shows the same 10 widgets sorted by size. We use traditional PHP URL parameters to pass filters and sorts, so Google obviously treats each of these as a separate URL.
Right now we don't do anything special for Google, but I have noticed in the SERPs that if I search for "Widgets", my "Widgets" page and my "Widgets - Blue" page sometimes rank close to each other, which tells me Google (rightly) treats these as essentially the same page about widgets. Ideally, though, I'd want only my "Widgets" root page to rank.
What is the best way to structure this setup for Googlebot? I think it's one or more of the following, but I'd love any advice:
- Put a rel=canonical tag on all of the pages with parameters, pointing to the "root" page
- Use the Google URL parameter tool and tell it not to crawl any URLs with my parameters
- Put a meta robots noindex tag on the parameter pages
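For context, here is roughly how I picture options 1 and 3 looking in the head of a filtered URL (just a sketch on my part, with made-up URLs):

```php
<?php
// Sketch only: the <head> of a filtered/sorted URL such as
// /widgets.php?color=red&sort=size (made-up URL).
// Option 1: point every parameter variation back at the root widgets page.
$canonical = 'https://www.example.com/widgets.php';
?>
<link rel="canonical" href="<?php echo htmlspecialchars($canonical); ?>" />

<?php // Option 3: alternatively, keep the parameter pages out of the index entirely. ?>
<meta name="robots" content="noindex, follow" />
```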
Thanks!
-
The only thing I might add is that, depending on the business, it might be worth building a "Red Widgets" category (as an example). However, you would treat this like a sub-category and write its own category description, giving it its own rel canonical tag and treating it as the root of the "Red Widgets" category.
Nine times out of ten it isn't necessary to give sorting and filtering options their own category page, though, and a rel canonical tag pointing to the canonical version of the page is the second-best option. The best option would be not to change the URL at all: only re-order the items, hiding some and featuring others. Most eCommerce platforms don't have this functionality at present, however; rel canonical was made to span the gap until they do.
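To make that concrete, the two cases would look something like this (hypothetical URLs, since I don't know your actual structure):

```php
<?php
// Hypothetical URLs. A true "Red Widgets" sub-category page, served at
// /red-widgets.php with its own description, gets a self-referencing canonical
// because it is the root of its own category:
?>
<link rel="canonical" href="https://www.example.com/red-widgets.php" />

<?php
// ...whereas a mere sort/filter view such as /widgets.php?sort=size simply
// canonicalises back to the unfiltered root page:
?>
<link rel="canonical" href="https://www.example.com/widgets.php" />
```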
-
I'd definitely go with option 1: canonicalise all the parameter variations to the root page. This is a textbook example of what the rel=canonical tag is designed for.
In addition, because you say that many of the variations are also ranking, the canonical will consolidate that ranking onto the root page, instead of throwing it away as would happen if you used Google Webmaster Tools to ignore the parameters.
Lastly, the canonical will be understood by most engines and only needs implementing once. If you go the GWT route, you'll have to do it manually in Bing Webmaster Tools as well, and then remember to update both each time new parameters are introduced. Even then, it won't help with secondary search engines, assuming they have any importance to your site.
I always think of the Webmaster Tools solution as the method of last resort if for some technical reason I am unable to implement correct canonicalisation/redirects. Consistency and lack of manual intervention are paramount for me in these situations.
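As a rough illustration of the "only needs implementing once" point, in PHP you could emit the canonical for every category page by simply dropping the query string, something along these lines (a sketch only, untested against your setup):

```php
<?php
// Sketch: output one canonical per category page, ignoring any filter/sort
// parameters, so newly added parameters never need separate handling.
$scheme    = (!empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off') ? 'https' : 'http';
$path      = strtok($_SERVER['REQUEST_URI'], '?'); // request path with the query string stripped
$canonical = $scheme . '://' . $_SERVER['HTTP_HOST'] . $path;

echo '<link rel="canonical" href="' . htmlspecialchars($canonical) . '" />';
```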
Hope that helps?
Paul
-
I'd go with the parameter option:
1) Go to Webmaster Tools > Crawl > URL Parameters > Configure URL Parameters and enter all of the sorting/filtering parameters there.
2A) If all of your items are on one page, you can set up a canonical URL for that page (which would ignore all sorting parameters)
2B) If your categories have multiple pages, be sure to use rel=next/prev for pagination
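For 2B, the pagination tags on, say, page 2 of a category would look roughly like this (example URLs only):

```php
<?php
// Example only: the <head> of page 2 of a paginated category.
// Page 1 omits rel="prev" and the final page omits rel="next".
?>
<link rel="prev" href="https://www.example.com/widgets?page=1" />
<link rel="next" href="https://www.example.com/widgets?page=3" />
```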
Related Questions
-
How long will old pages stay in Google's cache? We have a new site that is two months old, but we are still seeing old pages even though we used 301 redirects.
Two months ago we launched a new website (same domain) and implemented 301 redirects for all of the pages. Two months later we are still seeing old pages in Google's cache. So how long should I tell the client it will take for them all to be removed from search?
-
What is the best structure for paginating comment structures on pages to preserve the maximum SEO juice?
You have a full webpage with a great amount of content, images and media. This is a social blogging site where other members can leave their comments and reactions to the article. Over time there are, say, 1,000 comments on this page. Option one: we set the canonical URL and use rel=prev/next to tell the bots that each subsequent block of 100 comments is attributed to the primary URL. Or: we allow the newest 10 comments to exist on the primary URL, with a "see all comments" link that points to a new URL, and that is where the rest of the comments are paginated. Which option does the community feel would be most appropriate and adheres to best practices for managing this type of dynamic comment growth? Thanks
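(For clarity, the first option would mean markup roughly like this on each subsequent block of comments; the URLs are just placeholders:)

```php
<?php
// Placeholder URLs: the second block of comments, e.g. /article?comments=2,
// credits the primary article URL and links to its neighbouring pages.
?>
<link rel="canonical" href="https://www.example.com/article" />
<link rel="prev" href="https://www.example.com/article" />
<link rel="next" href="https://www.example.com/article?comments=3" />
```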
-
Splitting down pages
Hello everyone, I have a page on my directory, for example: https://ose.directory/topics/breathing-apparatus. The title on this page is small yet a bit unspecific: "Breathing Apparatus Companies, Suppliers and Manufacturers". On Webmaster Tools these terms hold different values for each category, so "topic name companies" sometimes has a lot more searches than "topic name suppliers". I was thinking, if I split the page into the following three separate pages, would that be better?
https://ose.directory/topics/breathing-apparatus (main - Title: Breathing Apparatus)
https://ose.directory/topics/breathing-apparatus/companies (Title: Breathing Apparatus Companies)
https://ose.directory/topics/breathing-apparatus/manufacturers (Title: Breathing Apparatus Manufacturers)
https://ose.directory/topics/breathing-apparatus/suppliers (Title: Breathing Apparatus Suppliers)
Two questions: Would this be more beneficial from an SEO perspective? Would Google penalise me for doing this, and if so, is there a way to do it properly? PS. The list of companies may be the same, but the page content would be ever so slightly different. I know this would not affect my users much because the terms I am using all mean pretty much the same thing. The companies do all three.
-
When Mobile and Desktop sites have the same page URLs, how should I handle the 'View Desktop Site' link on a mobile site to ensure a smooth crawl?
We're about to roll out a mobile site. The mobile and desktop URLs are the same. User Agent determines whether you see the desktop or mobile version of the site. At the bottom of the page is a 'View Desktop Site' link that will present the desktop version of the site to mobile user agents when clicked. I'm concerned that when the mobile crawler crawls our site it will crawl both our entire mobile site, then click 'View Desktop Site' and crawl our entire desktop site as well. Since mobile and desktop URLs are the same, the mobile crawler will end up crawling both mobile and desktop versions of each URL. Any tips on what we can do to make sure the mobile crawler either doesn't access the desktop site, or that we can let it know what is the mobile version of the page? We could simply not show the 'View Desktop Site' to the mobile crawler, but I'm interested to hear if others have encountered this issue and have any other recommended ways for handling it. Thanks!
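(To frame the setup: conceptually it is something like the sketch below, heavily simplified and with illustrative names, including the Vary: User-Agent header that is generally recommended when the same URL serves different markup by device:)

```php
<?php
// Heavily simplified sketch of the setup described above: one URL, with the
// template chosen by user agent. A cookie set by the "View Desktop Site"
// link overrides the detection. All names here are illustrative.
header('Vary: User-Agent'); // signal that the response varies by user agent

$ua           = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
$isMobile     = (bool) preg_match('/Mobile|Android|iPhone/i', $ua);
$forceDesktop = !empty($_COOKIE['desktop_site']); // set when "View Desktop Site" is clicked

if ($isMobile && !$forceDesktop) {
    include 'templates/mobile.php';   // illustrative path
} else {
    include 'templates/desktop.php';  // illustrative path
}
```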
-
Ranking Page - Category vs. Blog Post - What is best for CTR?
Hi, I am not sure whether I should rank with a category page or create a new post. Let me explain... If I google 'Basic SEO', I see an article from Rand with Authorship markup. That's cool, so I go straight to that result because I know there might be some good insight. BUT: 'Basic SEO' is also a category at Moz, and it is not ranking. On the other hand, if I google 'advanced SEO', then the Moz category for 'advanced SEO' is ranking. But there is no authorship image, so users are much less likely to click on that result. Now, I want to rank for a very important keyword for me (a content keyword, not transactional). Therefore, I have a category called 'yoga exercises'. But should I rather create a post about them only to increase CTR due to Google Authorship? I read in Google's guidelines that Authorship on homepage and category pages is not appreciated. Hope you have some insights that can help me out.
-
What's the best way to check Google search results for all pages NOT linking to a domain?
I need to do a bit of link reclamation for some brand terms. From the little bit of searching I've done, there appear to be several thousand pages that meet the criteria, but I can already tell it's going to be impossible or extremely inefficient to save them all manually. Ideally, I need an exported list of all the pages mentioning brand terms not linking to my domain, and then I'll import them into BuzzStream for a link campaign. Anybody have any ideas about how to do that? Thanks! Jon
-
Will pages irrelevant to a site's core content dilute SEO value of core pages?
We have a website with around 40 product pages. We also have around 300 pages with individual ingredients used for the products, and on top of that we have some 400 pages for individual retailers which stock the products. Ingredient pages have the same basic short info about the ingredients, and the retailer pages just have the retailer name, address and contact details. The question is, should I add noindex to all the ingredient and/or retailer pages so that the focus is entirely on the product pages? Thanks for your help!
-
What to call pages
I reckon I've bagged one of the most interesting SEO projects of the year. My new client is selling vibrators. The site is not even in development yet, but they want to make it fun and friendly and take away the stigma and "seediness" of the product. Anyway, the owner has presented a list of "places" within the site where the products are going to be showcased. These are along the lines of Royal Rabbits Palace, Clitoral Courtyard, Dungeon Dildos, Magical G-arden etc. (there is a bit of a Shrek/fairy tale thing going on). Clearly, these places add a lot to the look and feel of the site, but as URLs and titles they are clearly not optimal in an SEO sense. What is for the best: making sure we shift the owner back to SEO best practice, or hoping that these weird and wonderful page names add enough to the user experience to make it worthwhile to let them through? FYI, did you know you can get vibrators that you can plug an iPod into? Man, I've seen some weird things researching this client!