Best-practice URL structures with multiple filter combinations
-
Hello,
We're putting together a large piece of content that will have some interactive filtering elements. There are two types of filters, topics and object types.
The architecture under the hood constrains us so that everything needs to be in URL parameters. If someone selects a single filter, this can look pretty clean:
www.domain.com/project?topic=firstTopic
or
www.domain.com/project?object=typeOne
The problems arise when people select multiple topics, potentially across two different filter types:
www.domain.com/project?topic=firstTopic-secondTopic-thirdTopic&object=typeOne-typeTwo
I've raised concerns around the structure in general, but it seems to be too late at this point so now I'm scratching my head thinking of how best to get these indexed. I have two main concerns:
- A ton of near-duplicate content and hundreds of URLs being created and indexed with various filter combinations added
- Over-reacting to the first point above and over-canonicalizing/no-indexing combination pages to the detriment of the content as a whole
Would the best approach be to index each single topic filter individually, and canonicalize any combinations to the 'view all' page? I don't have much experience with e-commerce SEO (which this problem seems to have the most in common with) so any advice is greatly appreciated.
Thanks!
-
Thanks for the detailed answer, Jonathan. What you suggested was definitely in line with my thinking: indexing just the single topics at most, and trying to either noindex or canonicalize all the thousands of possible variations. I definitely agree that all those random combinations of topics/objects hold no real value and at best will eat up crawl budget unnecessarily.
I can tell Google how to handle these parameters via Search Console, since they're unique to this piece of content; and I think I can noindex all the random combinations of filters (hopefully).
I'm still waiting to hear more from the dev team but I have a feeling that I won't be able to change the format to subdirectories instead of differentiating everything with query parameters - not the ideal situation but I'll have to make do.
Anyways, thanks again for your thoughtful reply!
Josh
-
Google treats each unique URL, including its query string, as a separate page, and parameterized URLs do get indexed when the content on them appears different enough to Google. So I would agree with your motive to try to get these dynamic URLs simplified.
From what I have read on similar scenarios, my first thought is: do these filtered view pages benefit searchers? Typically it benefits searchers to index pages at maybe the category level. In your instance, this may be the single topic. But once URLs start referencing very specific content that one user was filtering for, I would suggest a noindex meta tag. There should be a scalable solution to this so you don't have to individually go into every possible filtered page and add noindex to the head.
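A minimal sketch of such a scalable rule, assuming (as in the question) the filter parameters are named `topic` and `object` with hyphen-joined values. The template would compute the robots value per request and emit it as `<meta name="robots" content="...">` in the head:

```python
from urllib.parse import parse_qs, urlparse

def robots_meta(url: str) -> str:
    """Decide the robots meta value for a filtered URL: index the
    unfiltered page and single-topic views, noindex every other
    filter combination. Parameter names are illustrative."""
    params = parse_qs(urlparse(url).query)
    topics = params["topic"][0].split("-") if "topic" in params else []
    objects = params["object"][0].split("-") if "object" in params else []
    if not objects and len(topics) <= 1:
        return "index, follow"
    return "noindex, follow"
```

The exact indexing rule (which views deserve to be indexed) is the judgment call; the point is that one small function in the template covers every combination without touching individual pages.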
If there is a specific filtered view you believe may benefit searchers, or one you have already seen demand for, I would suggest making this a page using subfolders:
www.domain.com/project/firstTopic/typeOne
and noindexing all the crazy dynamically generated query-string URLs. This should allow you to seize opportunities where you see search demand and alleviate any duplicate-content risks.
If you don't want to noindex, I would gauge the quality of these nitty-gritty filtered pages, and if you see value in them, I agree that canonicalizing to the preceding category page sounds like a good solution.
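As a concrete illustration of that canonical route (URLs are illustrative, following the structure in the original question), a filtered combination page would carry a canonical link element in its head pointing at the preceding category-level page:

```html
<!-- In the head of www.domain.com/project?topic=firstTopic-secondTopic -->
<link rel="canonical" href="https://www.domain.com/project?topic=firstTopic" />
```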
I think this article does a good job of explaining this. It suggests that if your filters are just narrowing the content on the page rather than changing it, you should noindex or canonicalize. (The author does remind you, though, that canonicalization is only a suggestion to Google and is not followed 100% of the time in all scenarios.)
I hope this helps, and if you don't see how these solutions would be implemented on your site, this issue may require some dev help.
Related Questions
-
Best SEO Strategy
Hi fellow Mozers: I have a question about strategy. I have a client who is a major real estate developer in our region. They build and sell condominiums and also build and manage several major rental apartment properties. All rental properties have their own websites, and there is also a corporate website, which has been around for many years and has decent domain authority (+/- 40). The original intent of the corporate website was to communicate central brand positioning points, attract investors and offer individual profiles of all major properties.
My client is interested in developing an organic search strategy which will reach consumers looking to rent apartments. Typical search strings would include the family whose core string is 'apartments in Baltimore.' (Currently, the client runs PPC for each one of their properties. This is expensive and highly competitive.) In doing research, we've found that there are two local competitors who are able to break onto Page 1 and appear beside the national 'apartment search guides' who dominate the Page 1 SERPs (like apartments.com). The two local competitors have websites of either the same or lower authority than our client's; one has a better link profile, the other is comparable.
Here's our problem: our local competitors only build and manage apartments, so their home pages and all the content of their sites ONLY talk about apartment-rental-related information. Our client's apartment business is actually larger in scope than either local competitor's but is only one of their major real estate verticals. So my question is this: if we want to build out a bunch of content which will rank competitively with our local competition, are we better off creating a new area of the corporate site, creating targeted content and resources appropriate for apartment seekers, OR would we be better off creating an entirely new site devoted to the same?
I'm wondering if a new section will ever rank well against competitors whose root domains feature only rental-related content. Likewise, I'm wondering whether we'd be giving up too much, in terms of authority, by creating an entirely new site. I've also only found examples in the industry where an entirely new site was created, which makes me question the strategy of building out a rental-specific section of a site that also contains information about the condo business. For instance, Related Companies is a huge builder in the East; they have a corporate site and a site called https://relatedrentals.com. Any feedback would be greatly appreciated!
Intermediate & Advanced SEO | | Daaveey0 -
Best practice for disallowing URLS with Robots.txt
Hi Everybody, We are currently trying to tidy up the crawling errors which are appearing when we crawl the site. On first viewing, we were very worried to say the least: 17,000+. But after looking closer at the report, we found the majority of these errors were being caused by bad URLs featuring:
- Currency - for example: "directory/currency/switch/currency/GBP/uenc/aHR0cDovL2NlbnR1cnlzYWZldHkuY29tL3dvcmt3ZWFyP3ByaWNlPTUwLSZzdGFuZGFyZHM9NzEx/"
- Color - for example: ?color=91
- Price - for example: "?price=650-700"
- Order - for example: ?dir=desc&order=most_popular
- Page - for example: "?p=1&standards=704"
- Login - for example: "customer/account/login/referer/aHR0cDovL2NlbnR1cnlzYWZldHkuY29tL2NhdGFsb2cvcHJvZHVjdC92aWV3L2lkLzQ1ODczLyNyZXZpZXctZm9ybQ,,/"
My question now is, as a novice of working with Robots.txt, what would be the best practice for disallowing URLs featuring these from being crawled? Any advice would be appreciated!
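A hedged sketch of what such a robots.txt could look like, assuming you want to block crawling of these parameters site-wide (the patterns use the `*` wildcard, which Google and Bing honor but some other crawlers do not; test them against real URLs in a robots.txt tester before deploying):

```
User-agent: *
# Parameterized filter/sort/pagination URLs
Disallow: /*?*color=
Disallow: /*?*price=
Disallow: /*?*dir=
Disallow: /*?*order=
Disallow: /*?*p=
# Currency-switch and login-referer URLs
Disallow: /*/currency/switch/
Disallow: /customer/account/login/
```

One caveat: robots.txt only blocks crawling, not indexing. A disallowed URL can still end up indexed (without a snippet) if it is linked to, so use noindex instead where keeping pages out of the index is the actual goal.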
Intermediate & Advanced SEO | | centurysafety0 -
Best practice to consolidating authority of several SKU pages to one destination
I am looking for input on best practices for the following solution. Scenario: I have basic product A (e.g. Yamaha Keyboard Blast). There are 3 SKUs of product A that deserve their own page content (e.g. Yamaha Keyboard Blast 350, Yamaha Keyboard Blast 450, Yamaha Keyboard Blast 550). Objective: I want to consolidate the authority of potential links to the 3 SKU pages into one destination/URL. Possible solutions I can think of:
- Query parameters (e.g. /yamaha-keyboard-blast?SKU=550) - and tell Google to ignore SKU query parameters when indexing
- Canonical tag (set the canonical tag of the SKU pages all to one destination URL)
- Hash tag (e.g. /yamaha-keyboard-blast#SKU=550); load SKU-dependent content through JavaScript; Google only sees the URL without the hashtag
Am I missing solutions? Which solution makes the most sense and will allow me to consolidate authority? Thank you for your input.
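Of those, the canonical tag is the most widely used pattern for this: each SKU page keeps its own URL and content but declares the shared destination URL as canonical. A minimal sketch (the URLs are illustrative):

```html
<!-- In the head of each SKU page, e.g. /yamaha-keyboard-blast-550 -->
<link rel="canonical" href="https://www.example.com/yamaha-keyboard-blast" />
```

Keep in mind that Google treats rel="canonical" as a hint rather than a directive, and may ignore it if the SKU pages' content diverges substantially from the destination page.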
Intermediate & Advanced SEO | | french_soc0 -
Redirecting to Modal URLs
Hi everyone! Long time no chat - hope you're all well! I have a question that for some reason is causing me some trouble. I have a client that is creating a new website; the process was a mess and I am doing a last-minute redirect file for them (long story, for another time). They have different teams for different business categories, so there are multiple staff pages with a list of staffers and a link to each one's individual page. Currently they have a structure like this for their staff bios... www.example.com/category-staff/bob-johnson/ But now, to access a staffer's bio, a modal pops up. For instance... www.example.com/category-staff/#bob-johnson Should I redirect the current staffers' URLs to the staff category, or to the modal URL? Unfortunately, we are late in the game and this is the way the bio pages are set up. Would love thoughts, thanks so much guys!!
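One technical note: everything after `#` is a URL fragment, which is never sent to the server and which Google generally ignores for indexing, so from a pure SEO standpoint redirecting to the modal URL and redirecting to the category page are roughly equivalent; the fragment mainly helps visitors land with the right bio open. If you do want the fragment preserved, an Apache sketch (paths are illustrative) needs the NE flag so the `#` isn't escaped:

```apache
RewriteEngine On
# 301 the old bio URL to the staff page, keeping the fragment so the
# browser opens the right modal on arrival (NE = don't escape the #)
RewriteRule ^category-staff/bob-johnson/?$ /category-staff/#bob-johnson [R=301,NE,L]
```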
Intermediate & Advanced SEO | | PatrickDelehanty0 -
URL Structure Change - 301 Redirect - on large website
Hi Guys, I have a website which has approximately 15 million pages indexed. We are planning to change the URL structure of 99.99% of pages, but everything would remain on the same domain. E.g. older URL: xyz.com/nike-shoes; new URL: xyz.com/shopping/nike-shoes. A benefit we would get is adding a related and important keyword to the URL. We also achieve other technical benefits in identifying the page type beforehand and can reduce the time taken to serve the pages (as per our tech team). For the older URLs, we are planning to do a 301 redirect. While this seems to be the correct thing to do as per Google, we do see a very large number of cases where people have suffered significantly after doing something like this. Here are our questions:
- Will all PageRank value be passed to the new URLs? (i.e. will there be a 100% passing of PR/link juice to the new URLs?)
- Can it lower my rank for keywords? (Currently we have pretty good rankings (1-5) on many keywords.)
- If there is an impact on rankings, will it be only on specific keywords or will we see a sitewide impact?
- Assuming that we take a hit on traffic, how much time would it take to get the traffic back to normal? And if traffic goes down, by what percentage might it go down and for how long? (Best case, average case and worst case scenarios.)
- Is there anything I should keep in mind while doing this?
I understand that there are no clear answers that can be given to these questions, but we would like to evaluate worst-case and best-case situations. Just to give context: even a 10-day downtime in terms of drops in rankings is extremely detrimental to our business.
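On the mechanics: a pattern change like this can usually be handled with one rewrite rule rather than 15 million individual mappings. A hedged sketch for Apache (it assumes every old top-level slug moves under /shopping/; the 0.01% of unchanged pages, plus assets, would need exclusions):

```apache
RewriteEngine On
# Skip URLs that already live under the new prefix, to avoid loops
RewriteCond %{REQUEST_URI} !^/shopping/
# 301 everything else into the new structure in a single hop
RewriteRule ^(.+)$ /shopping/$1 [R=301,L]
```

Whatever the implementation, serve each redirect in a single hop (no chains) and update internal links and XML sitemaps to the new URLs at the same time.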
Intermediate & Advanced SEO | | Myntra0 -
Can multiple redirects from old URLs hurt SEO?
We have a client that had an existing site with existing rankings. We rebuilt the site using DNN 7 and created/tested 301 redirects from all the original URLs to the new DNN URLs, which are nasty: they contain /tabid/1234 and will not allow dashes (-). We have found a DNN module that will make the DNN 7 URLs search-friendly. However, that will cause us to 301 the current DNN URLs to the new URLs, so in fact the original will redirect to the DNN URL and the DNN URL will redirect to the rewritten SEO-friendly URL. What should we know here before proceeding?
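Chained redirects like this (original → DNN → SEO-friendly) still pass PageRank, but each extra hop adds latency and crawl cost, and long chains risk being abandoned by crawlers, so the usual advice is to point the original URLs directly at the final SEO-friendly URLs. If the redirect map is held as data, flattening the chains is mechanical; a minimal sketch (the URL paths are illustrative):

```python
def flatten_redirects(redirects: dict[str, str]) -> dict[str, str]:
    """Collapse redirect chains so each source URL maps straight to
    its final destination. The `seen` set guards against loops."""
    flat = {}
    for src in redirects:
        dest, seen = redirects[src], {src}
        while dest in redirects and dest not in seen:
            seen.add(dest)
            dest = redirects[dest]
        flat[src] = dest
    return flat
```

Run over the full map, this turns original → DNN → SEO-friendly into a single original → SEO-friendly hop for every URL.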
Intermediate & Advanced SEO | | tjkirgin0 -
URL - Keywords
My domain name contains my top two keywords. Am I penalized if I create another page where I add my domain keywords a second time after the domain name, along with a subcategory and the name of a state? I don't know what white hat and black hat are, so I want to make sure I stay white hat. Also, I didn't know it, but is it true that your title shows up in your domain name?
Intermediate & Advanced SEO | | Boodreaux0 -
New folder structure
We are in the process of relaunching one of our websites, which will use a totally new folder structure. Previously we used mydomain.com/content/country/region/city/district/hotel_name/. Now, since we are using a new CMS, we are changing the URL to be shorter and more precise: mydomain.com/gb_Hotel-Name/. My question: we currently have in the region of 10,000 pages indexed in Google, so we are going to have to create 301 permanent redirects from the old URLs to the new URLs. From your previous experience, is this the correct way of approaching the task?
Intermediate & Advanced SEO | | NeilTompkins0