Should we use URL parameters or plain URLs?
-
Hi,
The development team and I are having a heated discussion about one of the more important things in life: the URL structure of our site.
Let's say we are creating an Airbnb clone, and we want to be found when people search for
apartments new york.
As we have both houses and apartments in every city in the U.S., it would make sense for our URLs to at least include these, e.g.
clone.com/Apartments/New-York
but users are also able to filter on price and size. That isn't really relevant for Google, and we all agree that clone.com/Apartments/New-York should be the canonical URL for all apartment/New York searches. But what should the URL look like for someone filtering on a max price of $300 and 100 sq ft?
clone.com/Apartments/New-York?price=30&size=100
or (we are using Node.js, so no problem either way)
clone.com/Apartments/New-York/Price/30/Size/100
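Just to illustrate why routing is a non-issue for us either way, here is a rough Express sketch (handler and variable names are purely illustrative, assuming Express on Node.js) where both URL forms resolve to the same listing code and emit the canonical we agreed on:

```javascript
// Rough Express sketch (illustrative names only): both URL styles hit the
// same listing logic, and every variant points rel=canonical at the
// unfiltered category/city page.
const express = require('express');
const app = express();

function renderListings(res, { category, city, price, size }) {
  const canonical = `https://clone.com/${category}/${city}`;
  res.send(`<!doctype html>
<html>
  <head><link rel="canonical" href="${canonical}"></head>
  <body>${category} in ${city} (max price: ${price || 'any'}, min size: ${size || 'any'})</body>
</html>`);
}

// Option 1: clone.com/Apartments/New-York?price=30&size=100
app.get('/:category/:city', (req, res) => {
  renderListings(res, {
    category: req.params.category,
    city: req.params.city,
    price: req.query.price,
    size: req.query.size,
  });
});

// Option 2: clone.com/Apartments/New-York/Price/30/Size/100
app.get('/:category/:city/Price/:price/Size/:size', (req, res) => {
  renderListings(res, { ...req.params });
});

app.listen(3000);
```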
The developers hate URL parameters with a vengeance. They think the latter version is preferable and more user-readable, and they say that as long as we point the canonical on everything to clone.com/Apartments/New-York, it won't matter to good old Google.
I think URL parameters are the way to go, for two reasons: Google may figure out on its own that the price parameter doesn't matter (https://support.google.com/webmasters/answer/1235687?hl=en), and it is also possible in Webmaster Tools to explicitly tell Google not to worry about a parameter.
We have agreed to disagree on this point and to let the wisdom of Moz decide what we ought to do. What do you all think?
-
Personally, I would agree with you and opt for the following option:
clone.com/Apartments/New-York?price=30&size=100
I don't think it matters whether that section of the URL is readable to everyone. I would actually say that anyone with a technical background would find the URL above easier to change than the other one, as slashes in a URL usually signal different directories rather than parameters (that's how I would generally interpret them, anyway).
In the grand scheme of things, I think it's going to make little difference, as you don't want the additional sections to actually be indexed by the search engines. As Gary correctly pointed out, you can set up 'URL Parameters' in GWT, and I think that's your best option. There's more information about that here: http://googlewebmastercentral.blogspot.co.uk/2011/07/improved-handling-of-urls-with.html
You could also use robots.txt to block the parameterized URLs, but this depends on whether the search engine crawling your website chooses to respect it.
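For example, rules along these lines would do it (just a sketch; Googlebot understands the * wildcard, but keep in mind that Disallow only stops crawling, it doesn't guarantee already-indexed URLs drop out):

```
# Sketch: stop crawlers from fetching the filtered variants.
User-agent: *
Disallow: /*price=
Disallow: /*size=
```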
Hope this helps!
Lewis -
Good example of a site that does show up in the SERPs for all things related.
-
OK, not to sit on the fence here, but both are good options.
However, when it comes to URL parameters, there is a section in Webmaster Tools where you can tell Google to ignore certain parameters. So that's always an option.
I like to look at sites like Oodle in cases like this.
Here is an example
They spent a lot of time working out the best process, and they use the node-type URL.
However, Google has been said to prefer shorter URLs recently.
Hope my sitting on the fence did not make things worse, LOL
-
Personally, I would just $_POST price and size and be done with it (as opposed to $_GET, which shows the parameters in the URL). No need to overthink creating more URLs and complicating life.
If anything, you can define in WMT what price and size are, but just keep it clean. Also, remember that # fragments in the URL don't get crawled by Google, so clone.com/Apartments/New-York#price=30&size=100 could work too.
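One caveat on the # approach: the fragment is never sent to the server, so the filtering has to happen client-side, something along these lines (a rough sketch, and the data attributes are an assumption about your markup):

```javascript
// Rough client-side sketch: the fragment never reaches the server, so the
// page itself has to read it and filter the already-loaded listings.
// Assumes each listing element carries data-price and data-size attributes.
function applyFiltersFromHash() {
  // e.g. clone.com/Apartments/New-York#price=30&size=100
  const params = new URLSearchParams(window.location.hash.slice(1));
  const maxPrice = Number(params.get('price')) || Infinity;
  const minSize = Number(params.get('size')) || 0;
  document.querySelectorAll('.listing').forEach((el) => {
    const show = Number(el.dataset.price) <= maxPrice &&
                 Number(el.dataset.size) >= minSize;
    el.style.display = show ? '' : 'none';
  });
}

window.addEventListener('hashchange', applyFiltersFromHash);
applyFiltersFromHash();
```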