How to handle lots of URL parameters
-
Howdy mozzers
I'm hoping you can lend some advice. I'm dealing with a site that has loads of URL parameters. It's a vehicle dealership group which hosts its entire inventory from multiple locations on one page, sorted by parameters.
Example inventory URL: www.dealership.com/car-inventory.asp?pa=&ns=10&so=m&sor=DESC&ma=&mod=&mt=&yr=&bs=&pr=&t=used&ln=
Where pa (page no.); ns (number of vehicles shown); so (sort by condition); sor (sort order); ma (make); mod (model); yr (year); bs (body style); pr (price range); t (type - new, used, etc.); ln (location no.).
As you can imagine, this generates a gazillion URLs (or slightly fewer). Any thoughts on the best canonicalization options?
Thanks as always
-
Thanks a lot for this. That video couldn't have been more timely!
-
There's actually a tool inside Google Webmaster Tools that lets you specify how Google should handle crawling your URL parameters. It's under 'Configuration -> URL Parameters'. It should tell you there whether Google is having any trouble crawling your site, based on the parameters in your URLs.
Google has actually just released a video about this topic today - and it would probably be worth reviewing as well:
http://googlewebmastercentral.blogspot.com/2012/08/configuring-url-parameters-in-webmaster.html
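If you'd also like to consolidate signals on the page itself, the usual companion to those settings is a rel="canonical" tag built server-side, so every sorted/paginated variation points at one clean URL. A rough sketch of the idea in PHP (the site above runs classic ASP, so treat this as pseudocode for the same logic; which parameters to keep versus strip is an assumption you'd tune to your inventory):

<?php
// Keep only the parameters that define distinct inventory (make,
// model, year, body style, price, type, location) and drop pure
// view parameters (page number, sort, results-per-page) so all
// sorted/paginated variants share one canonical URL.
$keep = ['ma', 'mod', 'yr', 'bs', 'pr', 't', 'ln'];

$params = array_intersect_key($_GET, array_flip($keep));
$params = array_filter($params, fn ($v) => is_string($v) && $v !== '');

$canonical = 'http://www.dealership.com/car-inventory.asp';
if ($params) {
    $canonical .= '?' . http_build_query($params);
}

echo '<link rel="canonical" href="'
   . htmlspecialchars($canonical, ENT_QUOTES) . '">';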
Related Questions
-
Any idea why ?ref=wookmark is being appended to a URL?
We have an https site and have been checking our 301 redirects from the old http pages. All seem fine except one... and it is ONLY weird in Firefox (it works OK in Chrome and IE). The http version of that one URL redirects to the correct https URL, but with ?ref=wookmark appended to the end. Why? And only in the Firefox browser... http://www.easydigging.com/broadfork(dot)html 301 redirects to https://www.easydigging.com/broadfork(dot)html?ref=wookmark From the research I did, Wookmark seems to be a jQuery plugin, but we do not use it (as far as I know). And even if we did, it probably should not pop up during a 301 redirect. I did try clearing my cache a few times, with no change in the problem. Any help is appreciated 🙂
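One quick way to narrow this down is to fetch the redirect outside of any browser, which rules a Firefox add-on (a plausible culprit for a wookmark referral tag) in or out. A hedged PHP sketch, with the "(dot)" in the URL above expanded back to a real dot:

<?php
// Ask the server directly for the redirect's Location header; if
// ?ref=wookmark does not show up here, something client-side (e.g.
// a Firefox extension) is appending it, not the 301 itself.
$headers = get_headers('http://www.easydigging.com/broadfork.html', true);
var_dump($headers['Location'] ?? null);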
Technical SEO | GregB123
-
Changing URLs
As of right now we are using Yahoo Small Business. When creating a product you have to declare an ID, and when we created the site we were not aware that the ID can never be changed, or that the ID is also used as the URL. We have a couple thousand products whose URLs we will need to update. What would be the best way to fix this without losing much juice from our current pages? I was also thinking that changing them all within a couple of weeks would hurt us a lot, and that the best course of action would be a slow rollout of the URL changes. Any help is appreciated. Thank you!
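For what it's worth, the standard mechanism here is a permanent (301) redirect from each old ID-based URL to its new address, which passes most of the link equity; rolling the redirects out in batches, as suggested above, is a reasonable precaution. A hedged PHP sketch, assuming the platform allows server-side code at all (Yahoo Small Business may not), with invented placeholder paths:

<?php
// Hypothetical old-ID -> new-URL map; in practice this would be
// generated from a product catalog export rather than typed in.
$redirects = [
    '/product/12345.html' => '/mens-leather-wallet.html',
    '/product/67890.html' => '/canvas-messenger-bag.html',
];

$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

if (isset($redirects[$path])) {
    header('Location: ' . $redirects[$path], true, 301); // permanent redirect
    exit;
}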
Technical SEO | TITOJAX
-
URL rewrite subfolder
Hi, how can I rewrite example.com/example1/example2/example3 to example.com/example3? And are there tools or software that can generate URL rewrites... (not a plugin)? Thanks!
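Apache's mod_rewrite (via an .htaccess file) is the usual tool for mapping a short URL onto a deeper path. The same idea can be sketched as a tiny PHP front controller; every path and file name below is a placeholder taken from the question, not a real implementation:

<?php
// index.php acting as a minimal front controller: requests for the
// short URL are served from the real, deeper path without the
// visitor ever seeing the long URL.
$routes = [
    '/example3' => '/example1/example2/example3.php', // assumed .php file
];

$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

if (isset($routes[$path])) {
    require __DIR__ . $routes[$path];
    exit;
}

http_response_code(404);
echo 'Not found';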
Technical SEO | bigrat95
-
How to handle pagination for a large website?
I am currently doing a site audit on a large website that just went through a redesign. Looking through their Webmaster Tools, they have about 3,000 duplicate title tags, due to the way pagination is set up on the site. For example: domain.com/books-in-english?page=1 vs. domain.com/books-in-english?page=4. What is the best way to handle these? According to Google Webmaster Tools, a viable solution is to do nothing, because Google is good at distinguishing these. That said, it seems like there could be a better solution to help prevent duplicate content issues. Any advice would be much welcomed. 🙂
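One low-risk option is simply making each paginated title unique by appending the page number (rel="prev"/"next" link tags were Google's other recommendation for pagination at the time). A hedged PHP sketch; the parameter name comes from the example URLs above, while the base title is invented:

<?php
// De-duplicate paginated <title> tags by appending the page number;
// page 1 keeps the base title unchanged.
$page  = max(1, (int) ($_GET['page'] ?? 1));
$title = 'Books in English';

if ($page > 1) {
    $title .= ' - Page ' . $page;
}

echo '<title>' . htmlspecialchars($title) . '</title>';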
Technical SEO | J-Banz
-
Should we handle this redirect differently?
So our question is: should we handle page redirection/rewriting in PHP or in .htaccess (with the specific problem we are running into outlined below)? We have an ecommerce store in a subfolder of our site (example.com/store/). In the next folder down we have a group of widgets (www.example.com/store/widget-group1). Recently we put a .htaccess redirect in the top-level folder (example.com/store/.htaccess) in order to rewrite some URLs and also 301 one page to another. This seems to be negatively affecting our /widget-group1/ subfolder, however: organic traffic to example.com/store/widget-group1 took a nosedive 3 days after we put the .htaccess redirect in place on the /store/ folder, and it has not recovered 8 days later.
*Nothing appears outwardly wrong with the current setup when viewing the pages or requesting them as Googlebot (the only issue being the nosedive in organic traffic, lol).
*Both subfolders are set up in the Apache config file to allow local overrides of .htaccess, as follows:

<Directory "/store/widget-group1">
    Options -Indexes FollowSymLinks -MultiViews
    AllowOverride All
    Order allow,deny
    Allow from all
</Directory>

<Directory "/store">
    Options -Indexes FollowSymLinks -MultiViews
    AllowOverride All
    Order allow,deny
    Allow from all
</Directory>
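Worth noting on the PHP-versus-.htaccess part of the question: rules in /store/.htaccess also apply to everything under /store/widget-group1/, whereas a redirect written in PHP lives in a single script and cannot leak into sibling folders. A hedged sketch, with a made-up target URL:

<?php
// Placed at the very top of the one old page that needs to move:
// the 301 affects only this script, so nothing else in /store/ or
// /store/widget-group1/ inherits it.
header('Location: http://www.example.com/store/new-page', true, 301);
exit;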
Technical SEO | altecdesign
-
Why is Google Webmaster Tools ignoring my URL parameter settings?
I have set up several URL parameters in Webmaster Tools that do things like select a specific product's colour or size. I have set each parameter in Google to 'narrows' the page and selected 'crawl no URLs', but in the duplicate content section each of these is still shown as two pages with the same content. Is this just normal, i.e. showing me that they are the same anyway, or is Google deliberately ignoring my settings (which I assume it does when they are sure they know better or think I have made a mistake)?
Technical SEO | mark_baird
-
URL Structure for Deal Aggregator
I have a website that aggregates deals from various daily-deals sites. I originally had all the deals on one page, /deals; however, I thought it might be more useful to have several pages, e.g. /beautydeals or /hoteldeals. But if I give every section its own page, that means I have either no current deals on the main /deals page or duplicate content. I'm wondering what might be the best approach here? A few of the options that come to mind are:
1. Return to having all the deals on one page, /deals, and link internally to content within that page.
2. Have both a main /deals page with all of the deals plus other pages such as /beautydeals, but add rel="canonical" to point to the main /deals page.
3. Create new content for the /deals page... however, I think people will probably want to see at least some deals straight away, rather than having to click through to another page.
4. Display some sub-categories on the main /deals page, but have separate URLs for other, more popular sub-categories, e.g. /beautydeals (this is how it works at the moment).
I should probably point out that the site also has other content, such as events and a directory. Any suggestions on how best to approach this are much appreciated! Cheers, Andy
Technical SEO | andywozhere
-
Should I use www. or not in my main URL?
I have backlinks coming into my homepage, which has both a www URL and one that's merely http://mysite.com. Which is the preferred URL for the best search engine optimization, and how do I find this out?
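Either version can rank; what matters is picking one and 301-redirecting the other, so backlinks to both consolidate on a single host (Webmaster Tools also lets you declare a preferred domain). A hedged PHP sketch using the placeholder domain from the question; the same thing is more commonly done with a mod_rewrite rule in .htaccess:

<?php
// Consolidate on the www host with a permanent redirect; invert the
// comparison if you prefer the bare domain.
if ($_SERVER['HTTP_HOST'] === 'mysite.com') {
    header('Location: http://www.mysite.com' . $_SERVER['REQUEST_URI'], true, 301);
    exit;
}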
Technical SEO | NetPicks