How can I best handle parameters?
-
Thank you in advance for your help! I've read a ton of posts on this forum on the subject, and while they've been super helpful, I still don't feel entirely confident about the right approach to take. Forgive my very obvious noob questions - I'm still learning!
The problem: I am launching a site (coursereport.com) that will feature a directory of schools at coursereport.com/schools. The directory can be filtered by a handful of fields:
- Focus (ex: “Data Science”)
- Cost (ex: “$<5000”)
- City (ex: “Chicago”)
- State/Province (ex: “Illinois”)
- Country (ex: “Canada”)
When a filter is applied to the directory page, the CMS produces URLs like these:
- coursereport.com/schools?focus=datascience&cost=$<5000&city=chicago
- coursereport.com/schools?cost=$>5000&city=buffalo&state=newyork
My questions:
1) Is the above parameter-based approach appropriate? I've seen other directory sites take a different approach (below) that would transform my examples into more "normal" URLs:
coursereport.com/schools?focus=datascience&cost=$<5000&city=chicago
VERSUS
coursereport.com/schools/focus/datascience/cost/$<5000/city/chicago (no params at all)
2) Assuming I use either approach above, isn't it likely that I will have duplicate content issues? Each filter does change on-page content, but there could be instances where two different URLs with different filters applied produce identical content (e.g., focus=datascience&city=chicago vs. focus=datascience&state=illinois). Do I need to specify a canonical URL to solve that case? I understand at a high level how rel=canonical works, but I'm having a hard time wrapping my head around which versions of the filtered results ought to be specified as the preferred versions. For example, would I just take all of the /schools?focus=X combinations and declare those the canonical versions within any filtered page that contains additional parameters like cost or city?
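To make the "pick a preferred version" idea concrete, here is a minimal sketch (not from this thread - the parameter choice is hypothetical) of how a CMS might decide which URL to emit in a filtered page's rel=canonical tag: keep only the filter(s) you have decided deserve to be indexable (here, just focus) and drop everything else, so every cost/city/state variation points back at the same preferred page.

```python
from urllib.parse import urlencode

# Filters that define a page's canonical identity.
# Everything else (cost, city, state, ...) is dropped from the
# canonical URL. Treating "focus" as the only canonical filter
# is an assumption for illustration - adjust to your own research.
CANONICAL_PARAMS = ["focus"]

def canonical_url(base, filters):
    """Build the canonical URL for a filtered directory page."""
    kept = {k: filters[k] for k in CANONICAL_PARAMS if k in filters}
    if not kept:
        return base  # the unfiltered directory is its own canonical
    return base + "?" + urlencode(kept)

# Two differently filtered pages collapse to the same canonical:
print(canonical_url("https://coursereport.com/schools",
                    {"focus": "datascience", "city": "chicago"}))
print(canonical_url("https://coursereport.com/schools",
                    {"focus": "datascience", "cost": "<5000"}))
# both print https://coursereport.com/schools?focus=datascience
```

The returned URL would then be rendered into the page head as `<link rel="canonical" href="...">`.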
-
Should I be changing page titles for the unique filtered URLs?
-
I read through a few Google resources to try to better understand how to best configure URL parameters via Webmaster Tools. Is my best bet just to follow the advice in the article below, define the rules for each parameter there, and not worry about using rel=canonical?
https://support.google.com/webmasters/answer/1235687
An assortment of the other stuff I’ve read for reference:
http://www.wordtracker.com/academy/seo-clean-urls
http://www.practicalecommerce.com/articles/3857-SEO-When-Product-Facets-and-Filters-Fail
http://www.searchenginejournal.com/five-steps-to-seo-friendly-site-url-structure/59813/
http://googlewebmastercentral.blogspot.com/2011/07/improved-handling-of-urls-with.html
-
I think you have your answer then on how you want to focus your URLs and your site!
-
Absolutely helpful. I really appreciate it. I think one real use case that I may want to solve for is the "focus" plus "city" combo, i.e. "data science schools in chicago". Based on the research I've done thus far, I think that may be the only permutation really worth worrying about. Again - thanks a lot!
-
I am not going to be very helpful here.
Looking at those parameters and all the options you would have for URLs, yes, you are ripe for duplicate content issues and a whole mess of search engine problems and confusion.
I read this the other day in the Q&A forum here at Moz, and I wish I could remember who said it so I could give them credit: "Don't submit search results to the search engines" - so true, so true...
Why? You end up with an almost infinite number of thin, duplicate pages, and Google then does not know which ones to rank. Even if you put all the parameters into a static URL, you still have the same problem.
I think you need to step back a sec.
Are people searching for "data science schools in Chicago Illinois that cost less than $5000"?
Why would you even want to attempt to set up pages that could potentially rank for those terms based on the URL?
Launch the search function on the site, but hide all the search URLs behind robots.txt.
Just set up things like:
/search/?focus=datascience&cost=$<5000&city=chicago
/search/focus/datascience/cost/$<5000/city/chicago
Put /search/ in robots.txt and you are set.
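For reference, the robots.txt rule for that setup is a two-liner (assuming the search URLs all live under /search/, as in the examples above):

```
User-agent: *
Disallow: /search/
```

Any URL whose path starts with /search/ - parameters and all - is then off-limits to compliant crawlers, so the filter permutations never enter the index in the first place.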
Another option (from one of my favorite WBF http://moz.com/blog/whiteboard-friday-using-the-hash)
Hide all the parameters behind the hash and they stay hidden from the search engines
/schools#?focus=datascience&cost=$<5000&city=chicago
Then go back, do your keyword research, and build helpful static URL pages around what your users are searching for, and then get those pages to rank. If that ultimately is the type of page above, I would bet you $3,141 plus an apple pie that you need to set up a simpler organization of pages and URLs around location, say /il/chicago/school-name, or type, say /data-science/school-name, and then hide all the other iterations behind a hash, etc.
Maybe this did help - I hope so.