Pretty URLs... do they matter?
-
Given the following URLs:
example.com/warriors/ninjas/cid=WRS-NIN01
Is there any difference from an SEO perspective? Aesthetically the second one bugs me, but that's not a statistical difference.
Thank you
-
Tom, there is undoubtedly a difference, speaking broadly about the use of query strings, both at the SEO level and at the usability / customer-retention level.
URLs that are easy to read are easier to remember and easier to copy and paste, which makes them more robust: less likely to break or get corrupted when run through text parsers.
Google is explicit about their preference for clean URLs, and a clean URL structure for your site as a whole. I'm not sure if this is relevant to where you're at with your particular project, but I always try to build a site with a URL schema that exposes the information architecture and content priority as much as possible, usually with important pages close to the site root.
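To make the contrast concrete, here's a minimal sketch (TypeScript with Express, purely illustrative - the /products route, the cid value and port 3000 are assumptions, not your actual setup) of a clean hierarchical route next to a legacy query-string URL that 301s to it:

```typescript
import express from "express";

const app = express();

// Clean, hierarchical route: the path itself exposes the information
// architecture (category -> sub-category), no query string needed.
app.get("/warriors/ninjas", (_req, res) => {
  res.send("Ninja warriors category page");
});

// Legacy query-string lookup for the same content. If links like
// /products?cid=WRS-NIN01 already exist in the wild, a 301 to the clean
// URL keeps all signals consolidated on one address.
app.get("/products", (req, res) => {
  if (req.query.cid === "WRS-NIN01") {
    res.redirect(301, "/warriors/ninjas");
  } else {
    res.status(404).send("Unknown product id");
  }
});

app.listen(3000);
```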
If you have to use query strings - and of course they are sometimes unavoidable, or actually just the best tool for the job at hand - Google Webmaster Tools allows you to provide explicit classifications for each parameter. Personally I thought this was a great addition to their suite of tools.
-
I read it, and the most relevant bit seems to be:
"Not ideally use parameters. If they need to be used the amount of parameters should be limited to two or fewer."
Is there any research that supports that 0 params is better than 1, and 1 is better than 2, etc.?
-
Related Questions
-
How to switch from URL based navigation to Ajax, 1000's of URLs gone
Hi everyone, We have thousands of URLs generated by numerous product filters on our ecommerce site, e.g. /category1/category11/brand/color-red/size-xl+xxl/price-cheap/in-stock/. We are thinking of moving these filters to Ajax in order to offer a better user experience and get rid of these useless URLs. In your opinion, what is the best way to deal with this huge move?
1. Leave the existing URLs responding as before: as they will disappear from our sitemap (they won't be linked anymore), I imagine robots will someday consider them obsolete.
2. Redirect them permanently (301) to the closest existing URL.
3. Mark them as gone (4xx).
I'd vote for option 2. Bots will suddenly see thousands of 301s, but this reflects what is really happening, right? Do you think this could result in some penalty? Thank you very much for your help. Jeremy
Intermediate & Advanced SEO | | JeremyICC0 -
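If option 2 is the route taken, here is a hedged sketch of how the 301s could be generated without maintaining thousands of individual rules, assuming the filter segments follow a recognisable pattern like the example path above (the Express middleware and the FILTER_SEGMENT regex are illustrative, not Jeremy's actual stack):

```typescript
import express from "express";

const app = express();

// Illustrative pattern for filter segments, based on the example path
// /category1/category11/brand/color-red/size-xl+xxl/price-cheap/in-stock/
// Everything from the first filter segment onwards is a retired filter URL,
// so the "closest existing URL" is simply the category prefix before it.
const FILTER_SEGMENT = /^(brand|color-|size-|price-|in-stock)/;

app.use((req, res, next) => {
  const segments = req.path.split("/").filter(Boolean);
  const firstFilter = segments.findIndex((s) => FILTER_SEGMENT.test(s));

  if (firstFilter > 0) {
    // Permanently redirect the retired filter URL to its category page.
    const target = "/" + segments.slice(0, firstFilter).join("/") + "/";
    return res.redirect(301, target);
  }
  next();
});

app.listen(3000);
```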
Confused: URL Restructure
Hello, We're giving our website a bit of a spring clean in terms of SEO. The site is doing OK, but after the time invested in SEO, content and last year's migration of multiple sites into one, we're not seeing the increase in traffic we had hoped for. Our current URLs look something like this: /a-cake-company/cup-cakes/strawberry. We have the company name as the first level because we migrated many companies into one site. What we're considering is testing some pages with a structure like this: /cup-cakes/cup-cake-company-strawberry. So we'll lose a level and focus more on the category of the product rather than the brand. What are your thoughts on this? We weren't going to do a mass change yet, just a test, but is this something we should be focusing on? In terms of organisation our current URL structure is perfect, but what about from an SEO point of view? In terms of keywords, customers are looking for both options. Thanks!
Intermediate & Advanced SEO | | HB170 -
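For the test described above, the old brand-first URLs would typically 301 to their new category-first counterparts so the two versions don't compete. A minimal sketch (TypeScript/Express, with a one-entry redirect map that is purely illustrative) of how that might look:

```typescript
import express from "express";

const app = express();

// Illustrative one-entry map for the test pages:
// old brand-first URL -> new category-first URL.
const testRedirects: Record<string, string> = {
  "/a-cake-company/cup-cakes/strawberry": "/cup-cakes/cup-cake-company-strawberry",
};

app.use((req, res, next) => {
  const target = testRedirects[req.path];
  if (target) {
    // 301 so the old URL's equity consolidates on the new test URL.
    return res.redirect(301, target);
  }
  next();
});

app.listen(3000);
```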
Cached Alternate URL appearing as base page
Hi there, I'm currently targeting Australia and the US for one of my web pages. One of my web pages begins with a subdomain (au.site.com) and the other is just the root domain (site.com). After searching for the website on Australian Google and checking the description and title, it keeps the US ones (i.e. the root domain's), and after checking the cached copy, it was cached earlier today but is displayed exactly as the American website when it is supposed to be the Australian one. In the URL for the caching it appears as au.site.com while displaying the American page's content. Any ideas why? Thanks, Oliver
Intermediate & Advanced SEO | | oliverkuchies0 -
SEO benefit of tracked URLs
I've found a lot of mixed info on this topic, so I thought I'd ask the experts (Moz community). If I'm adding tracking parameters to URLs to monitor organic traffic, will this affect the rank/value of the original clean URL? If so, would best practice be to 301 redirect the tracked URL to the original, i.e. redirect www.example.com/category/?DZID=Organic_G_NP/SQ&utm_source=Organic&utm_medium=Google TO www.example.com/category? Thanks for your help! -Reed
Intermediate & Advanced SEO | | IceIcebaby0 -
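A sketch of the redirect Reed describes, stripping the tracking parameters and 301ing to the clean URL (TypeScript/Express; the DZID and utm_* parameter names are taken from the example above, everything else is assumed). Note that a server-side 301 like this fires before page-level analytics scripts run, so this is meant to illustrate the mechanics rather than recommend the approach:

```typescript
import express from "express";

const app = express();

// Parameters that exist only for analytics, not for content. utm_* are the
// standard Google Analytics tags; DZID is taken from the example URL above.
const TRACKING_PARAMS = [
  "DZID",
  "utm_source",
  "utm_medium",
  "utm_campaign",
  "utm_term",
  "utm_content",
];

app.use((req, res, next) => {
  const url = new URL(req.originalUrl, "http://www.example.com");
  const tracked = TRACKING_PARAMS.some((p) => url.searchParams.has(p));

  if (tracked) {
    TRACKING_PARAMS.forEach((p) => url.searchParams.delete(p));
    // 301 to the clean URL so only one version accumulates ranking signals.
    return res.redirect(301, url.pathname + url.search);
  }
  next();
});

app.listen(3000);
```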
Are slugs in the URL now a good thing?
Hi, Until now I've advised a lot of web shops to avoid long URL structures for their categories and products (i.e. remove the useless slugs). Recently I discovered that Google has started rolling out more and more results that look like these screenshots: http://filer.crenia.no/McDn & http://filer.crenia.no/McYO (look at the URL in the SERP). I'm assuming the slugs are a vital part of creating these SERP results. Personally, I also think they look better and favour them compared to the old SERPs. Does anyone have any experience with these, what impact they have, or any reason not to add slugs to URLs again?
Intermediate & Advanced SEO | | Inevo0 -
Best URL structure
I am making a new site for a company that services many cities. I was thinking of a URL structure like this: website.com/keyword1-keyword2-keyword3/cityname1-cityname2-cityname3-cityname4-cityname5. Will this be the best approach to optimize the site for the keyword plus 5 different cities, as long as I keep the total URL characters under the SEOmoz-recommended 115 characters? Or would it be better to build separate pages for each city, trying to reword the main services to avoid duplicate content?
Intermediate & Advanced SEO | | jlane90 -
How to deal with old, indexed hashbang URLs?
I inherited a site that used to be in Flash and used hashbang URLs (i.e. www.example.com/#!page-name-here). We're now off of Flash and have a "normal" URL structure that looks something like this: www.example.com/page-name-here
Here's the problem: Google still has thousands of the old hashbang (#!) URLs in its index. These URLs still work because the web server doesn't actually read anything that comes after the hash. So, when the web server sees this URL, www.example.com/#!page-name-here, it basically renders this page, www.example.com/#, while keeping the full URL structure intact (www.example.com/#!page-name-here). Hopefully, that makes sense. So, in Google you'll see this URL indexed (www.example.com/#!page-name-here), but if you click it you essentially are taken to our homepage content (even though the URL isn't exactly the canonical homepage URL... which should be www.example.com/).
My big fear here is a duplicate content penalty for our homepage. Essentially, I'm afraid that Google is seeing thousands of versions of our homepage. Even though the hashbang URLs are different, the content (i.e. title, meta description, page content) is exactly the same for all of them. Obviously, this is a typical SEO no-no. And I've recently seen the homepage drop like a rock for a search of our brand name, which has ranked #1 for months. Now, admittedly we've made a bunch of changes during this whole site migration, but this #! URL problem just bothers me. I think it could be a major cause of our homepage tanking for brand queries.
So, why not just 301 redirect all of the #! URLs? Well, the server won't accept traditional 301s for the #! URLs because the # seems to screw everything up (the server doesn't acknowledge what comes after the #). I "think" our only option here is to try and add some 301 redirects via JavaScript. Yeah, I know that spiders have a love/hate (well, mostly hate) relationship with JavaScript, but I think that's our only resort... unless someone here has a better way? If you've dealt with hashbang URLs before, I'd LOVE to hear your advice on how to deal with this issue. Best, -G
Intermediate & Advanced SEO | | Celts180 -
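For the hashbang cleanup described above, a client-side redirect is usually the only option since the fragment never reaches the server. A minimal sketch in TypeScript, assuming the new clean paths exactly mirror the old #! slugs:

```typescript
// Runs in the browser on the homepage (the fragment never reaches the
// server, so the mapping has to happen client-side). Assumes the new clean
// paths mirror the old #! slugs exactly.
// Note: a JavaScript redirect cannot return a true HTTP 301 status; Google
// generally treats location.replace() as a strong redirect signal, but it
// is not equivalent to a server-side 301.
if (window.location.hash.startsWith("#!")) {
  // "#!page-name-here" -> "/page-name-here"
  const cleanPath = "/" + window.location.hash.slice(2);
  window.location.replace(cleanPath);
}
```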
Service Keyword in URL - too much?
We're working on revamping the URL structure for a site from the ground up. This firm provides a service and has a library of case studies to back up their work. Here are some options for the URL structure:
1. /cases/[industry keyword]-[service keyword] (for instance: /cases/retail-pest-control). There is some search traffic for the industry/service combination, so that would be the benefit of using both in the URL. But we'd end up with about 70 pages with the same service keyword at the end.
2. /cases/[industry keyword] (/cases/retail). Shorter, with less spam potential, but we'd have to optimize for the service keyword - the primary - in another way.
3. /cases/clientname (/cases/wehaveants). No real keyword potential but better usability.
We also want the service keyword to rank on its own on another page (so, a separate "pest control" page). So we don't want to dilute that page's value even after we chase some of the long-tail traffic. Any thoughts on the best course of action? Thanks!
Intermediate & Advanced SEO | | kdcomms1