How to handle lots of URL parameters
-
Howdy mozzers
I'm hoping you can lend some advice. I'm working on a site with loads of URL parameters: a vehicle dealership group that hosts its entire inventory, from multiple locations, on one page, sorted and filtered by parameters.
Example inventory URL: www.dealership.com/car-inventory.asp?pa=&ns=10&so=m&sor=DESC&ma=&mod=&mt=&yr=&bs=&pr=&t=used&ln=
Where pa (page no.); ns (number of vehicles shown); so (sort by condition); sor (sort order); ma (make); mod (model); yr (year); bs (body style); pr (price range); t (type - new, used, etc.); ln (location no.).
As you can imagine, this generates a gazillion URLs (or slightly fewer). Any thoughts on the best canonicalization options?
Thanks as always
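For illustration, one common canonicalization pattern for this kind of inventory page is to let the sort, page-size and pagination variants carry a rel="canonical" hint back to a preferred version of the URL. The sketch below is purely hypothetical: it assumes an Apache front end with mod_headers and Apache 2.4 expressions, whereas on a Classic ASP/IIS stack the same hint would normally be emitted as a <link rel="canonical"> element from the page template instead.

```apache
# Hypothetical sketch only: when car-inventory.asp is requested with
# sort/pagination parameters that don't change the content set, attach a
# rel="canonical" HTTP header (which Google treats like an in-page
# <link rel="canonical"> tag) pointing back at the bare inventory URL.
<If "%{REQUEST_URI} =~ m#/car-inventory\.asp$# && %{QUERY_STRING} =~ /(^|&)(pa|ns|so|sor)=[^&]+/">
    Header add Link "<http://www.dealership.com/car-inventory.asp>; rel=\"canonical\""
</If>
```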
-
Thanks a lot for this. That video couldn't have been more timely!
-
There's actually a tool inside Google Webmaster Tools that lets you specify how Google should handle crawling your URL parameters. It's under Configuration -> URL Parameters. It should also tell you there whether or not Google is having any trouble crawling your site based on the parameters in your URLs.
Google actually released a video about this topic just today, and it would probably be worth reviewing as well:
http://googlewebmastercentral.blogspot.com/2012/08/configuring-url-parameters-in-webmaster.html
Related Questions
-
Can you help by advising how to stop a URL from referring to another URL on my website with a 404 error, please?
How do I stop a URL from referring to another URL on my site? I'm getting a 404 error on a referred URL, (https://webwritinglab.com/know-exactly-what-your-ideal-clients-want-in-8-easy-steps/[null id=43484]), which is referred from the URL (https://webwritinglab.com/know-exactly-what-your-ideal-clients-want-in-8-easy-steps/). The referred URL is the page that I want, and I do not need it redirecting to the other URL, as that's presenting a 404 error. I have tried re-saving the permalinks in WordPress and recreating the .htaccess file, and the problem is still there. Can you advise how to fix this, please? Is it a case of removing the redirect? Is this advisable, and how do I do that? Thanks
Technical SEO | | Nichole.wynter20200 -
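For the 404 question above, if the broken link that produces the "[null id=43484]" URL can't be located and fixed in the page content, one hedged stop-gap is a single 301 in .htaccess that folds the malformed address back into the real post. A rough sketch, assuming a standard WordPress install on Apache (the rule would sit above the "# BEGIN WordPress" block):

```apache
# Hypothetical sketch: 301 the malformed "[null id=...]" variant back to the
# clean post URL so the 404 disappears for visitors and crawlers that follow
# the broken reference. Fixing the link at its source is still the real cure.
RewriteEngine On
RewriteRule ^know-exactly-what-your-ideal-clients-want-in-8-easy-steps/\[null(.*)$ https://webwritinglab.com/know-exactly-what-your-ideal-clients-want-in-8-easy-steps/ [R=301,L,NE]
```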
What is the best way to handle Product URLs which prepopulate options?
We are currently building a new site which has the ability to pre-populate product options based on parameters in the URL. We have done this so that we can send individual product URLs to Google Shopping. I don't want to create lots of duplicate pages, so I was wondering what you thought was the best way to handle this? My current thoughts are:
1. Sessions and parameters: On-site product page filters populate using sessions, so no parameters are required on-site, but options can still be pre-populated via parameters (product?colour=blue&size=100cm) if the user reaches the site via Google Shopping. We could also add "noindex, follow" to the pages with parameters and a canonical tag pointing to the page without parameters.
2. Text-based parameters: Make the parameters into text-based URLs (product/blue/100cm/), still use the "noindex, follow" meta tag, and add a canonical tag pointing to the page without parameters. I believe this is possibly the best solution, as it still allows users to link to and share pre-populated pages, but those pages won't get indexed and the link juice would still pass to the main product page.
3. Standard parameters: After thinking more today, I am considering that the best way may be the simplest: simply use standard parameters (product?colour=blue&size=100cm) so that I can tell Google what they do in Webmaster Tools, and also add "noindex, follow" to the pages with parameters along with the canonical tag pointing to the page without parameters.
What do you think the best way to handle this would be?
Technical SEO | | moturner0 -
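For what it's worth, the "noindex, follow" part of options 1 and 3 above can be prototyped at the web-server level. Below is a minimal, hypothetical .htaccess sketch, assuming Apache 2.4+ with mod_headers and the illustrative colour/size parameter names from the question; the per-product canonical tag is easiest to emit from the product template itself, since its target differs for every product.

```apache
# Hypothetical sketch: send "noindex, follow" as an X-Robots-Tag header for any
# product URL that carries the pre-population parameters, while the clean,
# parameter-free product URL stays indexable. The matching
# <meta name="robots" content="noindex, follow"> tag in the page <head> is the
# more common place to do this, alongside a <link rel="canonical"> that points
# at the parameter-free product URL.
<If "%{QUERY_STRING} =~ /(^|&)(colour|size)=[^&]+/">
    Header set X-Robots-Tag "noindex, follow"
</If>
```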
Migration to new URL structure
Hi guys, Just wondering what your processes are when moving a large site to a completely new URL structure on the same domain. Do you 301 everything from the old page to the new page, or are you more selective, e.g. only 301 pages that have a certain page authority? Thanks!
Technical SEO | | A_Q0 -
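As a rough illustration for the migration question above, bulk 301s for a structure change are usually expressed as pattern rules rather than one line per page. A hypothetical .htaccess sketch with made-up paths, assuming Apache with mod_rewrite:

```apache
# Hypothetical sketch: map an old URL structure onto a new one in bulk.
RewriteEngine On

# Pattern-based mapping for sections that translate predictably,
# e.g. /old-category/anything -> /new-category/anything
RewriteRule ^old-category/(.*)$ /new-category/$1 [R=301,L]

# One-off redirect for a page that doesn't follow a pattern
RewriteRule ^old-page\.html$ /new-section/new-page/ [R=301,L]
```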
Removing URL Parentheses in HTACCESS
I'm reworking a website for a client, and their current URLs have parentheses. I'd like to get rid of these, but individual 301 redirects in htaccess are not practical, since the parentheses appear in many URLs. Does anyone know an HTACCESS rule that will simply remove URL parentheses as a 301 redirect?
Technical SEO | | JaredMumford0 -
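For the parentheses question above, a blanket mod_rewrite rule can strip them without listing every URL individually. A hypothetical sketch, assuming Apache with mod_rewrite and that each affected URL contains a single (...) pair; URLs with several pairs would chain through one 301 hop per pair:

```apache
# Hypothetical sketch: 301 any URL containing a "(...)" pair to the same URL
# with the parentheses removed. Because the rule strips one pair per request,
# a URL with multiple pairs is redirected repeatedly until none remain.
RewriteEngine On
RewriteRule ^(.*)\((.*)\)(.*)$ /$1$2$3 [R=301,L,NE]
```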
Landing Page URL Structure
We are finally setting up landing pages to support our PPC campaigns. There has been some debate internally about the URL structure. Originally we were planning on URLs like: domain.com/california, domain.com/florida, domain.com/ny. I would prefer to have the URLs for each state inside a "state" folder, like: domain.com/state/california, domain.com/state/florida, domain.com/state/ny. I like having the folders and pages for each state under a parent folder to keep the root folder as clean as possible; having a folder or file for each state in the root will be very messy. Before you scream URL rewriting :-), our current site is still running under Classic ASP, which doesn't support URL rewriting. We have tried to use HeliconTech's ISAPI Rewrite module for IIS but had to remove it because of too many configuration issues. Next year, when our re-coding to MVC is complete, we will use URL rewriting. So the question for now: is there any advantage or disadvantage to one URL structure over the other?
Technical SEO | | briankb0 -
How to handle temporary campaign URLs
Hi, We have just run a yearly recurring commercial campaign for which we created optimized URLs (e.g. www.domain.tld/campaign, plus the category and brand names after the campaign, such as www.domain.tld/campaign/womens). This has resulted in 4,500+ URLs that include the campaign name being indexed in Google. Now that the campaign is over, these URLs do not exist anymore. How should we handle those URLs?
1.) 301 them to the correct category without the campaign name
2.) Create a static page at www.domain.tld/campaign to which we 301 all URLs that have the campaign name in them
Do you have any other suggestions on what the best approach would be? This is a yearly commercial campaign, so in a year's time we will have the same URLs again. Thanks, Chris
Technical SEO | | eCommerceSEO0 -
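Option 1 from the campaign question above can often be done with a single pattern rather than 4,500 individual redirects, as long as the campaign paths mirror the permanent category paths. A hypothetical .htaccess sketch, assuming Apache with mod_rewrite and that /campaign/womens maps to /womens and so on:

```apache
# Hypothetical sketch: drop the campaign segment and 301 to the matching
# permanent category/brand URL, e.g. /campaign/womens -> /womens.
RewriteEngine On
RewriteRule ^campaign/(.+)$ /$1 [R=301,L]

# The bare campaign URL itself can fall back to the homepage (or to a static
# landing page that is reused when the campaign returns next year).
RewriteRule ^campaign/?$ / [R=301,L]
```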
How to handle large numbers of comments?
First, the good news: one site that I've been working on has seen an increase in traffic from 2k/month to 80k! As well as lots of visitors, the site is also getting lots of comments, with one page getting more than 70 comments/day and showing no sign of a slowdown! Approximately 3,000 comments in total and growing! What is the best approach for handling this? I'm not talking about the review/approval/response process, but just the way these comments are presented on the website, taking both SEO and usability into account. Does anyone have any particular recommendations? Options I've considered are:
1. Just show the most recent x comments and ignore the rest (nobody is going to read 3,000 comments!)
2. Paginate comments (risk of duplicate content? Using Ajax could hide long-tail phrases in the comments?)
3. Show all comments (page load speed is suffering, and this is likely to be causing problems for mobile visitors)
Also, how do active comments on a page contribute to an article's freshness? Any thoughts would be greatly appreciated.
Technical SEO | | DougRoberts2 -
Which one is best? Parameters or Meta
I have an issue with duplicate pages on my website, as follows: http://www.vistastores.com/review.html?pr_page_id=90344 and http://www.vistastores.com/review.html?pr_page_id=90345. I checked my Google Webmaster Tools and found that Google has already set a parameter for pr_page_id. So, what does this mean? Will Google index all of those pages? Can I use the following meta tag to block indexing? Which one is best?
Technical SEO | | CommercePundit0
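As a purely illustrative sketch of the meta-tag route for the review question above: the paginated review URLs can carry a "noindex, follow" signal while /review.html itself stays indexable. The fragment below assumes Apache 2.4+ with mod_headers in front of the site; an equivalent <meta name="robots" content="noindex, follow"> tag in the <head> of those pr_page_id variants does the same job from within the page.

```apache
# Hypothetical sketch: mark paginated review URLs (review.html?pr_page_id=...)
# as "noindex, follow" via the X-Robots-Tag header, so Google keeps crawling
# through them but drops them from the index. The base /review.html URL,
# which has no pr_page_id parameter, is left untouched.
<If "%{QUERY_STRING} =~ /(^|&)pr_page_id=/">
    Header set X-Robots-Tag "noindex, follow"
</If>
```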