Best-practice URL structures with multiple filter combinations
-
Hello,
We're putting together a large piece of content that will have some interactive filtering elements. There are two types of filters: topics and object types.
The architecture under the hood constrains us so that everything needs to be in URL parameters. If someone selects a single filter, this can look pretty clean:
www.domain.com/project?topic=firstTopic
or
www.domain.com/project?object=typeOne
The problems arise when people select multiple topics, potentially across the two different filter types:
www.domain.com/project?topic=firstTopic-secondTopic-thirdTopic&object=typeOne-typeTwo
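To make the structure concrete, the filter state maps onto these URLs roughly like this (a hypothetical sketch, not our actual code; buildFilterUrl and FilterState are made-up names):

```typescript
// Hypothetical sketch of how the hyphen-delimited filter URLs are built;
// illustrative only, not the site's real implementation.
type FilterState = {
  topics: string[];   // e.g. ["firstTopic", "secondTopic", "thirdTopic"]
  objects: string[];  // e.g. ["typeOne", "typeTwo"]
};

function buildFilterUrl(base: string, filters: FilterState): string {
  const params = new URLSearchParams();
  if (filters.topics.length > 0) params.set("topic", filters.topics.join("-"));
  if (filters.objects.length > 0) params.set("object", filters.objects.join("-"));
  const query = params.toString();
  return query ? `${base}?${query}` : base;
}

// buildFilterUrl("www.domain.com/project", {
//   topics: ["firstTopic", "secondTopic", "thirdTopic"],
//   objects: ["typeOne", "typeTwo"],
// })
// => "www.domain.com/project?topic=firstTopic-secondTopic-thirdTopic&object=typeOne-typeTwo"
```

Every distinct selection produces a distinct URL, so the number of possible pages grows combinatorially with the number of filter values.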
I've raised concerns around the structure in general, but it seems to be too late at this point so now I'm scratching my head thinking of how best to get these indexed. I have two main concerns:
- A ton of near-duplicate content and hundreds of URLs being created and indexed with various filter combinations added
- Overreacting to the first point above and over-canonicalizing/noindexing combination pages to the detriment of the content as a whole
Would the best approach be to index each single-topic filter page individually and canonicalize any combinations to the 'view all' page? I don't have much experience with e-commerce SEO (which this problem seems to have the most in common with), so any advice is greatly appreciated.
Thanks!
-
Thanks for the detailed answer, Jonathan. What you suggested was definitely in line with my thinking: indexing just the single topics at most, and either noindexing or canonicalizing the thousands of possible variations. I agree that all those random combinations of topics/objects hold no real value and at best will eat up crawl budget unnecessarily.
I can tell Google how to handle these parameters via the URL Parameters tool in Search Console, since they're unique to this piece of content, and I think I can noindex all the random filter combinations (hopefully).
I'm still waiting to hear more from the dev team, but I have a feeling I won't be able to switch to subdirectories instead of differentiating everything with query parameters. Not the ideal situation, but I'll have to make do.
Anyways, thanks again for your thoughtful reply!
Josh
-
Google will often consolidate URLs that differ only in their query strings, but parameterized URLs do get indexed when the content on them looks different enough to Google. So I agree with your instinct to simplify these dynamic URLs.
From what I've read on similar scenarios, my first thought is: do these filtered views actually benefit searchers? Typically it benefits searchers to index down to the category level; in your case, that would be the single-topic pages. But once URLs start referencing the very specific combination one user filtered for, I would suggest a noindex meta tag. There should be a scalable way to do this so you don't have to go into every possible filtered page individually and add noindex to the head.
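For example, the rule can live in the page template rather than on individual pages. A minimal sketch, assuming your parameter names and a policy of indexing only the 'view all' page and single-topic pages (robotsMetaFor is an illustrative name, not a real API):

```typescript
// Sketch of a template-level robots rule: index the unfiltered page and
// single-topic pages, noindex every other filter combination.
function robotsMetaFor(query: URLSearchParams): string | null {
  const topics = (query.get("topic") ?? "").split("-").filter(Boolean);
  const objects = (query.get("object") ?? "").split("-").filter(Boolean);

  if (topics.length === 0 && objects.length === 0) return null; // "view all": indexable
  if (topics.length === 1 && objects.length === 0) return null; // single topic: indexable
  return "noindex, follow"; // any other combination stays out of the index
}

// In the template, emit the tag only when the rule returns a value:
// const robots = robotsMetaFor(new URL(requestUrl).searchParams);
// if (robots) head += `<meta name="robots" content="${robots}">`;
```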
If there is a specific filtered view you believe may benefit searchers, or that you have already seen demand for, I would suggest making it a real page using subfolders:
www.domain.com/project/firstTopic/typeOne
and noindexing all the crazy dynamically generated query-string URLs. This should allow you to seize opportunities where you see search demand and alleviate any duplicate-content risks.
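One way to implement that promotion is to 301 the query-string version of a chosen combination to its clean subfolder URL. A minimal sketch, assuming an Express-style server (the route and the promoted combination are illustrative, not from your site):

```typescript
import express from "express";

const app = express();

// Combinations that have earned their own indexable subfolder page.
const promoted = new Set(["firstTopic/typeOne"]);

app.get("/project", (req, res, next) => {
  const topic = String(req.query.topic ?? "");
  const object = String(req.query.object ?? "");
  if (promoted.has(`${topic}/${object}`)) {
    // Permanently redirect the parameterized URL to the subfolder version.
    return res.redirect(301, `/project/${topic}/${object}`);
  }
  next(); // everything else renders normally, with the noindex rule above
});
```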
If you don't want to noindex, I would gauge the quality of these nitty-gritty filtered pages; if you see value in them, I agree that canonicalizing to the preceding category page sounds like a good solution.
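In that case the canonical target can be derived the same way as the robots rule. A minimal sketch, again assuming your parameter scheme (canonicalFor is an illustrative name):

```typescript
// Sketch: combinations canonicalize to the first selected topic's category
// page; anything without a topic canonicalizes to the unfiltered page.
function canonicalFor(baseUrl: string, query: URLSearchParams): string {
  const topics = (query.get("topic") ?? "").split("-").filter(Boolean);
  return topics.length > 0 ? `${baseUrl}?topic=${topics[0]}` : baseUrl;
}

// The template would then emit something like:
// <link rel="canonical" href="...result of canonicalFor(...)...">
```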
I think this article does a good job of explaining this. It suggests that if your filters merely narrow the content on the page rather than changing it, you should noindex or canonicalize (although the author does remind you that canonicalization is only a suggestion to Google and is not followed 100% of the time in all scenarios).
I hope this helps, and if you don't see how these solutions would be implemented on your site, this issue may require some dev help.