Should we use URL parameters or plain URLs?
-
Hi,
The development team and I are having a heated discussion about one of the more important things in life, i.e. URL structures on our site.
Let's say we are creating an Airbnb clone, and we want to be found when people search for
apartments new york.
As we have both houses and apartments in every city in the U.S., it would make sense for our URL to at least include these, so
clone.com/Apartments/New-York
but users are also able to filter on price and size. This isn't really relevant for Google, and we all agree that clone.com/Apartments/New-York should be canonical for all apartment/New York searches. But how should the URL look for people filtering on a max price of $300 and 100 sq ft?
clone.com/Apartments/New-York?price=30&size=100
or (we're using Node.js, so this is no problem)
clone.com/Apartments/New-York/Price/30/Size/100
The developers hate URL parameters with a vengeance and think the second version is preferable and the most user-readable. They say that as long as we canonicalize everything to clone.com/Apartments/New-York, it won't matter to good old Google.
I think URL parameters are the way to go, for two reasons. One is that Google may figure out on its own that the price parameter doesn't matter (https://support.google.com/webmasters/answer/1235687?hl=en), and the other is that it's possible in Webmaster Tools to explicitly tell Google not to worry about a parameter.
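For what it's worth, since we're on Node.js both URL shapes can be normalised to the same canonical target on the server. Here's a minimal sketch (the function name and URL shapes are illustrative, not our actual routing code) that parses either style into the same structure:

```javascript
// Sketch: normalize either URL style to { category, city, filters } and
// derive the canonical URL that both variants should declare.
function parseListingUrl(href) {
  const url = new URL(href);
  const segments = url.pathname.split('/').filter(Boolean);
  const [category, city, ...rest] = segments;
  const filters = {};

  // Query-string style: /Apartments/New-York?price=30&size=100
  for (const [key, value] of url.searchParams) filters[key] = value;

  // Path-segment style: /Apartments/New-York/Price/30/Size/100
  // (trailing segments read as key/value pairs)
  for (let i = 0; i + 1 < rest.length; i += 2) {
    filters[rest[i].toLowerCase()] = rest[i + 1];
  }

  return {
    category,
    city,
    filters,
    canonical: `${url.origin}/${category}/${city}`,
  };
}
```

Either way, both variants end up declaring the same rel="canonical", which is the one thing both sides of the argument agree on.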
We have agreed to disagree on this point and will let the wisdom of Moz decide what we ought to do. What do you all think?
-
Personally, I would agree with you and opt for the following option:
clone.com/Apartments/New-York?price=30&size=100
I don't think it matters whether that section of the URL is readable to everyone. I would actually say that anyone with a technical background would find the URL above easier to change than the other one, as slashes in a URL almost symbolise different directories rather than parameters (that's how I would generally interpret it, anyway).
In the grand scheme of things, it's going to make little difference, as you don't want the additional filtered sections to actually be indexed by the search engines. As Gary correctly pointed out, you can set up 'URL Parameters' in GWT, and I think that's your best option. There's more information about that here - http://googlewebmastercentral.blogspot.co.uk/2011/07/improved-handling-of-urls-with.html
You could also use robots.txt to block the parameterised URLs, though whether that is honoured depends on the search engine crawling your website.
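For reference, a robots.txt rule along these lines could keep the parameterised variants out of the crawl (wildcard support varies by crawler; Googlebot honours `*`):

```
User-agent: *
Disallow: /*?price=
Disallow: /*?*price=
```

One caveat: if a crawler can't fetch those URLs at all, it also can't see the rel="canonical" on them, so the canonical-tag approach and the robots.txt approach work against each other to some extent.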
Hope this helps!
Lewis -
Good example of a site that does show up in the SERPs for all things related
-
OK, not to sit on the fence here, but both are good options.
However, when it comes to URL parameters, there is a section in Webmaster Tools where you can tell Google to ignore certain parameters. So that's always an option.
I like to look at sites like Oodle in cases like this.
Here is an example:
they spent a lot of time working out the best process, and they use the path-segment type of URL.
However, Google has been said to prefer shorter URLs recently.
Hope my sitting on the fence did not make things worse, LOL.
-
Personally, I would just send price and size via POST and be done with it (as opposed to GET, which shows the parameters in the URL). No need to overthink it by creating more URLs and complicating life.
If anything, you can define in WMT what price and size are, but keep it clean. Also, remember that fragment identifiers (#) in the URL don't get followed by Google, so clone.com/Apartments/New-York#price=30&size=100 could work too.
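On the fragment idea: the part after the # is never sent to the server, so the filters would have to be applied client-side. A hypothetical browser-side parser for that shape might look like this (function name is illustrative):

```javascript
// Sketch: parse filters out of a fragment like "#price=30&size=100".
// In a browser this would be called with window.location.hash; the
// fragment never reaches the server, so filtering must happen client-side.
function parseHashFilters(hash) {
  const params = new URLSearchParams(hash.replace(/^#/, ''));
  return Object.fromEntries(params);
}
```

That keeps the filter state out of what Google crawls, but it also means the server always renders the unfiltered canonical page first.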