How can I best handle parameters?
-
Thank you for your help in advance! I've read a ton of posts on this forum on this subject, and while they've been super helpful, I still don't feel entirely confident about the right approach to take. Forgive my very obvious noob questions - I'm still learning!
The problem: I am launching a site (coursereport.com) which will feature a directory of schools. The URL for the directory will be coursereport.com/schools, and it can be filtered by a handful of fields listed here:
- Focus (ex: “Data Science”)
- Cost (ex: “$<5000”)
- City (ex: “Chicago”)
- State/Province (ex: “Illinois”)
- Country (ex: “Canada”)
When a filter is applied to the directory page, the CMS produces a new page with URLs like these:
- coursereport.com/schools?focus=datascience&cost=$<5000&city=chicago
- coursereport.com/schools?cost=$>5000&city=buffalo&state=newyork
My questions:
1) Is the above parameter-based approach appropriate? I’ve seen other directory sites take a different approach (below) that would transform my examples into more “normal” URLs:
coursereport.com/schools?focus=datascience&cost=$<5000&city=chicago
VERSUS
coursereport.com/schools/focus/datascience/cost/$<5000/city/chicago (no params at all)
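From what I've read, the second style is often just an internal rewrite of the first, i.e. the server maps the path segments back onto the same parameters with something like this (my rough, hypothetical Apache example):
# Hypothetical .htaccess sketch: the "pretty" path is rewritten
# internally to the same parameterized page
RewriteEngine On
RewriteRule ^schools/focus/([^/]+)/cost/([^/]+)/city/([^/]+)$ /schools?focus=$1&cost=$2&city=$3 [L]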
2) Assuming I use either approach above, isn't it likely that I will have duplicate content issues? Each filter does change the on-page content, but there could be instances where two different URLs with different filters applied produce identical content (ex: focus=datascience&city=chicago OR focus=datascience&state=illinois). Do I need to specify a canonical URL to solve for that case? I understand at a high level how rel=canonical works, but I am having a hard time wrapping my head around which versions of the filtered results ought to be specified as the preferred ones. For example, would I just take all of the /schools?focus=X combinations and call those the canonical versions within any filtered page that contains additional parameters like cost or city?
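For instance, would the focus-plus-cost-plus-city page just point back at the focus-only URL with something like this in its head? (My rough guess at the markup:)
<link rel="canonical" href="http://coursereport.com/schools?focus=datascience" />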
-
Should I be changing page titles for the unique filtered URLs?
-
I read through a few Google resources to try to better understand how to best configure URL parameters via Webmaster Tools. Is my best bet just to follow the advice in the article below, defining the rules for each parameter there, and not worry about using rel=canonical?
https://support.google.com/webmasters/answer/1235687
An assortment of the other stuff I’ve read for reference:
http://www.wordtracker.com/academy/seo-clean-urls
http://www.practicalecommerce.com/articles/3857-SEO-When-Product-Facets-and-Filters-Fail
http://www.searchenginejournal.com/five-steps-to-seo-friendly-site-url-structure/59813/
http://googlewebmastercentral.blogspot.com/2011/07/improved-handling-of-urls-with.html
-
I think you have your answer then on how you want to focus your URLs and your site!
-
Absolutely helpful. I really appreciate it. I think one real use case I may want to solve for is the "focus" plus "city" combo, i.e. "data science schools in chicago". Based on the research I've done so far, I think that may be the only permutation really worth worrying about. Again - thanks a lot!
-
I am not going to be very helpful here.
Looking at those parameters and all the options you would have for URLs, yes, you are ripe for duplicate content issues and a whole mess of search engine problems/confusion.
I read this the other day in the Q&A forum here at Moz, and I wish I could remember who said it so I could give them credit: "Don't submit search results to the search engines" - so true, so true...
Why? You end up with an almost infinite number of thin, duplicate pages, and Google then doesn't know which ones to rank. Even if you put all the parameters into a static URL, you still have the same problem.
I think you need to step back a sec.
Are people actually searching for "data science schools in Chicago Illinois that cost less than $5000"?
Why would you even want to attempt to set up pages that could potentially rank for those terms based on the URL?
Launch the search function on the site, but hide all the search URLs behind robots.txt.
Just set up things like:
/search/?focus=datascience&cost=$<5000&city=chicago
/search/focus/datascience/cost/$<5000/city/chicago
Put /search/ in robots.txt and you are set.
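A minimal robots.txt sketch (assuming /search/ is the only path you need to keep crawlers out of):
# Keep every filtered search URL out of the crawl
User-agent: *
Disallow: /search/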
Another option (from one of my favorite Whiteboard Fridays: http://moz.com/blog/whiteboard-friday-using-the-hash):
Hide all the parameters behind the hash and they stay hidden from the search engines:
/schools#?focus=datascience&cost=$<5000&city=chicago
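The fragment never gets sent to the server, so the crawler only ever sees /schools itself. Client-side you'd read the filters back out of the hash with something like this rough sketch (hypothetical code, not a drop-in):
// Hypothetical sketch: parse "#?focus=datascience&cost=$<5000&city=chicago"
function filtersFromHash(): Record<string, string> {
  const query = window.location.hash.replace(/^#\??/, "");
  const filters: Record<string, string> = {};
  new URLSearchParams(query).forEach((value, key) => {
    filters[key] = value;
  });
  return filters;
}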
Then go back, do your keyword research, and build helpful static URL pages around what your users are actually searching for, and get those pages to rank. If that ultimately is the type of page above, I would bet you $3,141 plus an apple pie that you need to set up a simpler organization of pages and URLs around location, say /il/chicago/school-name, or type, say /data-science/school-name, and hide all the other iterations behind the hash, etc.
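If it helps, here's a purely hypothetical Express-style sketch of that split (the framework and route names are just placeholders, your CMS will differ):
// Hypothetical: a few clean, indexable landing pages, with the
// filtered search kept behind /search/ (blocked in robots.txt)
import express from "express";

const app = express();

// Static-looking pages built around real keyword research,
// e.g. /il/chicago/school-name
app.get("/:state/:city/:school", (req, res) => {
  res.send(`School page for ${req.params.school}`);
});

// All the filter combinations live here, out of the crawl
app.get("/search", (req, res) => {
  res.send(`Results for ${JSON.stringify(req.query)}`);
});

app.listen(3000);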
Maybe this did help - I hope so.