Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
How to Structure URLs for Multiple Locations
-
We are currently undergoing a site redesign and are trying to figure out the best way to structure the URLs and breadcrumbs for our many locations.
We currently have 60 locations nationwide and our URL structure is as follows:
www.mydomain.com/locations/{location}
Where {location} is the specific street the location is on or the neighborhood the location is in (e.g. www.mydomain.com/locations/waterford-lakes).
The issue is that {location} is usually too specific and not a broad enough keyword. The location "Waterford Lakes" is in Orlando, and "Orlando" is the important keyword, not "Waterford Lakes".
To address this, we want to introduce state and city pages. Each state and city page would link to each location within that state or city (e.g. an Orlando page with links to "Waterford Lakes", "Lake Nona", "South Orlando", etc.). The question is how to structure this.
Option 1
Use our existing URL and breadcrumb structure (www.mydomain.com/locations/{location}) and add state and city pages outside the URL path.
Option 2
Build the city and state pages into the URL and breadcrumb path:
www.mydomain.com/locations/{state}/{area}/{location}
(e.g. www.mydomain.com/locations/fl/orlando/waterford-lakes)
Any insight is much appreciated. Thanks!
-
Hi David,
Typically, your main landing pages are going to be those that represent the city of the location (e.g. a page for Orlando, one for each other city, etc.).
What I'm trying to understand is whether you have more than one office within a single city (as in Orlando office A, Orlando office B, Orlando office C) and are trying to work out how to distinguish these same-city offices from one another. Is this the scenario, or am I not getting it? Please feel free to provide further details.
-
David -
It looks like there are two main options for you:
Keep the same URL structure (Option 1) and create state- or area-based category pages, each with a short description of every location in that geographic area and a link to its location page.
This is typically how it might be done with an eCommerce site, where you'd have a parent category (e.g. shoes) and then a sub-category (e.g. running shoes).
The downside to this is that you risk having duplicate content on these category pages.
Option #2 would be my recommendation, because it includes the area/state information in the URL.
One company that does not do this well is Noodles & Company. Their location URL looks like this:
http://www.noodles.com/locations/150/
... where "150" is a store ID in a database. Easy to pull out of a database table, but less helpful to the end user, who doesn't know that store ID 150 is the one closest to them.
It would be much better to have it listed like:
http://www.noodles.com/locations/Colorado/Boulder/2602-Baseline/
You don't want to go much beyond four layers, but this is a better way of indicating the location tree to Google and other search engines.
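To reinforce that same hierarchy on the page, the breadcrumb trail itself can be marked up as well. Here is a minimal sketch assuming schema.org's BreadcrumbList vocabulary in RDFa, using the hypothetical URLs from the question above (the exact URLs and page names are illustrative):

```html
<!-- Illustrative only: breadcrumb markup mirroring the proposed
     /locations/{state}/{city}/{location} hierarchy from the question -->
<ol vocab="https://schema.org/" typeof="BreadcrumbList">
  <li property="itemListElement" typeof="ListItem">
    <a property="item" typeof="WebPage"
       href="https://www.mydomain.com/locations/fl/">
      <span property="name">Florida</span></a>
    <meta property="position" content="1">
  </li>
  <li property="itemListElement" typeof="ListItem">
    <a property="item" typeof="WebPage"
       href="https://www.mydomain.com/locations/fl/orlando/">
      <span property="name">Orlando</span></a>
    <meta property="position" content="2">
  </li>
  <li property="itemListElement" typeof="ListItem">
    <!-- current page: no link needed -->
    <span property="name">Waterford Lakes</span>
    <meta property="position" content="3">
  </li>
</ol>
```

This way the URL path, the visible breadcrumb, and the markup all tell search engines the same state → city → location story.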
Also, I'd highly recommend using a structured-data format for marking up the location information.
For example, on the Customer Paradigm site, we use the RDFa system for tagging the location properly:
Customer Paradigm
5353 Manhattan Circle
Suite 103
Boulder, CO 80303
303.473.4400
... and then Google doesn't have to guess what the location's address and phone number actually are.
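As a rough sketch, the address above could be tagged along these lines, assuming schema.org's LocalBusiness and PostalAddress types expressed in RDFa (the original Customer Paradigm markup may differ in its details):

```html
<!-- Sketch of RDFa tagging for the address shown above -->
<div vocab="https://schema.org/" typeof="LocalBusiness">
  <span property="name">Customer Paradigm</span>
  <div property="address" typeof="PostalAddress">
    <span property="streetAddress">5353 Manhattan Circle, Suite 103</span>
    <span property="addressLocality">Boulder</span>,
    <span property="addressRegion">CO</span>
    <span property="postalCode">80303</span>
  </div>
  <span property="telephone">303.473.4400</span>
</div>
```

Each `property` attribute maps a visible piece of text to an unambiguous field, which is exactly why the search engine no longer has to guess.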