How can I best handle parameters?
-
Thank you in advance for your help! I've read a ton of posts on this forum about this subject, and while they've been super helpful, I still don't feel entirely confident about which approach I should take. Forgive my very obvious noob questions - I'm still learning!
The problem: I am launching a site (coursereport.com) that will feature a directory of schools. The URL for the schools directory will be coursereport.com/schools, and the directory can be filtered by a handful of fields listed here:
- Focus (ex: “Data Science”)
- Cost (ex: “$<5000”)
- City (ex: “Chicago”)
- State/Province (ex: “Illinois”)
- Country (ex: “Canada”)
When a filter is applied to the directory page, the CMS produces a new page with URLs like these:
- coursereport.com/schools?focus=datascience&cost=$<5000&city=chicago
- coursereport.com/schools?cost=$>5000&city=buffalo&state=newyork
My questions:
1) Is the above parameter-based approach appropriate? I’ve seen other directory sites take a different approach (below) that would transform my examples into more “normal” URLs.
coursereport.com/schools?focus=datascience&cost=$<5000&city=chicago
VERSUS
coursereport.com/schools/focus/datascience/cost/$<5000/city/chicago (no params at all)
2) Assuming I use either approach above, isn't it likely that I will have duplicate content issues? Each filter does change the on-page content, but there could be instances where two different URLs with different filters applied produce identical content (ex: focus=datascience&city=chicago OR focus=datascience&state=illinois). Do I need to specify a canonical URL to solve for that case? I understand at a high level how rel=canonical works, but I am having a hard time wrapping my head around which versions of the filtered results ought to be specified as the preferred versions. For example, would I just take all of the /schools?focus=X combinations and call those the canonical versions within any filtered page that contained additional parameters like cost or city?
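In other words, would a filtered page like /schools?focus=datascience&cost=$<5000&city=chicago carry something like the tag below in its <head>? (That is just my guess at which version would be the preferred one.)

    <link rel="canonical" href="http://coursereport.com/schools?focus=datascience" />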
-
Should I be changing page titles for the unique filtered URLs?
-
I read through a few Google resources to try to better understand how to best configure URL parameters via Webmaster Tools. Is my best bet just to follow the advice in the article below, define the rules for each parameter there, and not worry about using rel=canonical?
https://support.google.com/webmasters/answer/1235687
An assortment of the other stuff I’ve read for reference:
http://www.wordtracker.com/academy/seo-clean-urls
http://www.practicalecommerce.com/articles/3857-SEO-When-Product-Facets-and-Filters-Fail
http://www.searchenginejournal.com/five-steps-to-seo-friendly-site-url-structure/59813/
http://googlewebmastercentral.blogspot.com/2011/07/improved-handling-of-urls-with.html
-
I think you have your answer then on how you want to focus your URLs and your site!
-
Absolutely helpful. I really appreciate it. I think one real use case that I may want to solve for is the "focus" plus "city" combo, i.e. "data science schools in chicago". Based on the research I've done thus far, I think that may be the only permutation really worth worrying about. Again - thanks a lot!
-
I am not going to be very helpful here.
Looking at those parameters and all the options you would have for URLs, yes, you are ripe for duplicate content issues and a whole mess of search engine problems/confusion.
I read this the other day in the Q&A forum here at Moz (I wish I could remember who said it so I could give them credit): "Don't submit search results to the search engines" - so true, so true...
Why? You end up with an almost infinite number of thin, duplicate pages, and Google does not know which ones to rank. Even if you put all the parameters into a static URL, you still have the same problem.
I think you need to step back a sec
Are people searching for "data science schools in Chicago Illinois that cost less than $5000"?
Why would you even want to attempt to set up pages that could potentially rank for those terms based on the URL?
Launch the search function on the site, but hide all the search URLs behind robots.txt
Just set up things like
/search/?focus=datascience&cost=$<5000&city=chicago
/search/focus/datascience/cost/$<5000/city/chicago
Put /search/ in robots.txt and you are set.
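For example, a minimal robots.txt along those lines (assuming every filtered/search URL lives under /search/) would be:

    User-agent: *
    Disallow: /search/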
Another option (from one of my favorite Whiteboard Fridays: http://moz.com/blog/whiteboard-friday-using-the-hash):
Hide all the parameters behind the hash and they stay hidden from the search engines
/schools#?focus=datascience&cost=$<5000&city=chicago
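As a rough sketch of how that works (my own illustration, not from the Whiteboard Friday): everything after the # never reaches the server and is ignored by crawlers, so the page has to read the filters out of the hash with JavaScript and apply them client-side. Something along these lines, assuming your filter names stay the same:

    // Parse "#?focus=datascience&cost=$<5000&city=chicago"
    // into { focus: "datascience", cost: "$<5000", city: "chicago" }
    function readFiltersFromHash(): Record<string, string> {
      const hash = window.location.hash.replace(/^#\??/, "");
      const filters: Record<string, string> = {};
      for (const pair of hash.split("&")) {
        if (!pair) continue;
        const [key, value] = pair.split("=");
        filters[decodeURIComponent(key)] = decodeURIComponent(value ?? "");
      }
      return filters;
    }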
Then go back, do your keyword research, build helpful static URL pages around what your users are actually searching for, and get those pages to rank. If that ultimately is the type of page above, I would bet you $3,141 plus an apple pie that you need to set up a simpler organization of pages and URLs around location, say /il/chicago/school-name, or type, say /data-science/school-name, and hide all the other iterations behind a hash etc.
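Purely as an illustration (hypothetical, since I don't know your CMS or server), one way to put clean URLs on a small, curated set of those pages would be a couple of rewrite rules that map them onto your existing filtered view, e.g. in an Apache .htaccess; you would still need to make the pages themselves genuinely useful and unique:

    RewriteEngine On
    # Static-looking landing pages served by the existing filtered view
    RewriteRule ^il/chicago/?$    /schools?city=chicago&state=illinois  [L]
    RewriteRule ^data-science/?$  /schools?focus=datascience            [L]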
Maybe this did help - I hope so.