URL and title strategy for multiple location pages in the same city
-
Hi,
I have a customer who is opening additional branches in cities where, until now, he has had only one branch.
My question is: once we create the new store pages, what is the best strategy for these local store pages in terms of URL and title?
So far I've seen some different strategies for URL structure:
Some use [URL]/locations/cityname-1/2/3 etc.
while others use [URL]/locations/cityname-zip code/
I've even seen [URL]/locations/street address-cityname (that's what Starbucks does).
There are also different strategies for the title of the branch page.
Some use [city name] [state] [zip code] | [Company name]
Others use [Full address] | [Company name]
Or [City name] [US state] [1/2/3] | [Company name]
Or [City name] [District / Neighborhood] [Zip Code] | [Company name]
What is the preferred strategy for getting the best results? On the one hand, I wish to differentiate the store pages from one another and gain as much local coverage as possible; on the other hand, I wish to create consistency and establish a long-term strategy, taking into consideration that many more branches will be opened in the near future.
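To make the comparison concrete, here's a rough sketch (Python purely for illustration; the company, addresses, and neighborhoods are made up) of how each title-tag pattern renders for two branches in the same city:

```python
# Illustration only: how each candidate title-tag pattern renders for two
# made-up branches of a made-up company in the same city.
branches = [
    {"company": "Acme", "city": "Chicago", "state": "IL", "zip": "60601",
     "street": "32 Center St", "hood": "The Loop", "n": 1},
    {"company": "Acme", "city": "Chicago", "state": "IL", "zip": "60614",
     "street": "950 W Armitage Ave", "hood": "Lincoln Park", "n": 2},
]

patterns = {
    "city-state-zip":        "{city} {state} {zip} | {company}",
    "full-address":          "{street}, {city}, {state} {zip} | {company}",
    "city-state-number":     "{city} {state} {n} | {company}",
    "city-neighborhood-zip": "{city} {hood} {zip} | {company}",
}

for name, template in patterns.items():
    print(name)
    for branch in branches:
        print("   " + template.format(**branch))
```

Seen side by side, the numbered variant keeps the pages distinct without adding any locally meaningful signal, while the zip and neighborhood variants do both, which is part of what I'm weighing.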
-
My pleasure, Gal
-
Thanks!
-
Hey Gal,
I'm personally a fan of title tags like this one on Phil Rozek's homepage:
http://www.localvisibilitysystem.com/
I love it when they contain a mix of sentiment and keywords, because this makes them stand out in the SERPs.
It's fine to have your brand name in most/all of your title tags, and I think it's REALLY important to have it on the homepage, the local landing page, the about page, and the contact page. Beyond that, with both title and H tags, I recommend being creative.
-
I'll follow up on this question:
In the title tag, we'd like to keep the brand name. We have one product, which is the main keyword.
What would you suggest about the title tag and the H1 in these city pages?
-
Hi Gal,
My suggestion would be to go forward like this:
URLs:
domain.com/chicago/32-center-st
Title Tags:
Brand City Street
Others in our community may have different suggestions, but the above seems simple and consistent to me.
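If it helps to see the scheme end to end, here's a minimal sketch of generating those URLs and titles from location data. The brand name and the pipe separators below are placeholder choices, not a requirement:

```python
import re

def slugify(text: str) -> str:
    """Lowercase, drop punctuation, hyphenate: '32 Center St.' -> '32-center-st'."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def location_url(domain: str, city: str, street: str) -> str:
    """Build the suggested domain.com/city/street URL."""
    return f"https://{domain}/{slugify(city)}/{slugify(street)}"

def title_tag(brand: str, city: str, street: str) -> str:
    """Build the suggested Brand / City / Street title tag."""
    return f"{brand} | {city} | {street}"

print(location_url("domain.com", "Chicago", "32 Center St."))
# -> https://domain.com/chicago/32-center-st
print(title_tag("Acme Storage", "Chicago", "32 Center St."))
# -> Acme Storage | Chicago | 32 Center St.
```

The nice thing about generating both from the same location record is that every future branch gets a consistent URL and title automatically.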
Related Questions
-
Remove URLs from App
Hi all, our tech team inherited a bit of an SEO pickle. I manage a freemium React JS app built for 80k unique markets worldwide (with an associated dedicated URL schema). Ex: https://www.airdna.co/vacation-rental-data/app/us/california/santa-monica/overview
Mistake: The app, in its entirety, was indexed by Google in July 2018, which basically resulted in duplicate content penalties because the unique on-page content wasn't readable.
Partial solution: We noindexed all app pages until we were able to implement a pre-render / HTML-readable solution, with associated dynamic metadata, for the Overview page in each market. We are now selectively reindexing only the free Overview pages that have unique data (with a nofollow on all other page links), but want to persist a noindex on all other pages because the data is not uniquely readable before subscribing. We have the technical server-side rules in place and working to ensure this selective indexing.
Question: How can we force Google to abandon the >300k cached URLs from the summer's failed deploy? Ex: https://screencast.com/t/xPLR78IbOEao would lead you to a live URL such as https://www.airdna.co/vacation-rental-data/app/us/arizona/phoenix/revenue, which has limited value to the user. (Note that Google's cached SERPs also show an old URL structure, which we have since 301ed, because we also updated the page structure in October.) Those pages are currently noindexed and will remain so for the foreseeable future. Our sitemap and robots.txt file are up to date, but the old Search Console only offers temporary removals on a one-by-one basis. Is there a way to write a rule-based page removal? Or do we simply render these pages in HTML and remove the nofollow on the links from the Overview page, so a bot can reach them, see the noindex, and drop them from the SERPs? Thanks for your help and advice!
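In case it clarifies what we mean by "rule-based": something like the following sketch, where a URL pattern decides which responses carry a crawler-visible noindex. Flask and the path rules here are hypothetical stand-ins for our real stack, not what we actually run:

```python
# Sketch of a rule-based, crawler-visible noindex via the X-Robots-Tag
# response header. One rule covers every matching URL, instead of
# one-by-one removals in Search Console. Flask and the patterns below
# are illustrative assumptions only.
import re
from flask import Flask, request

app = Flask(__name__)

APP_PREFIX = "/vacation-rental-data/app/"
# Hypothetical rule: only the free market Overview pages stay indexable.
INDEXABLE = re.compile(r"^/vacation-rental-data/app/.+/overview$")

@app.after_request
def add_robots_header(response):
    # App pages that aren't a free Overview get a noindex header.
    if request.path.startswith(APP_PREFIX) and not INDEXABLE.match(request.path):
        response.headers["X-Robots-Tag"] = "noindex"
    return response
```

Of course, a crawler still has to be able to reach those URLs to see the header, which is essentially the same constraint as the "remove the nofollow so the bot can see the noindex" option above.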
Local Website Optimization | Airbnb_data_geek
-
Difficulty Ranking Two Locations in the Same City
We are in the self-storage business and have locations throughout the Pacific Northwest. As we grow, there are cities where we've added multiple (2-3) locations. But we're discovering that we're having a great deal of difficulty ranking for all of these. For instance, we have two locations in Vancouver, WA. One is West Coast Self-Storage Vancouver, and the other is West Coast Self-Storage Padden Parkway. Both are in Vancouver, WA, but for the most part, only West Coast Self-Storage Vancouver is getting ranked. In fact, on those searches where Vancouver ranks, Padden Parkway doesn't show up anywhere. Not in the top 10 pages anyway.
Each location has an outer landing page and an inner details page. On each page, we've placed unique, city-optimized keywords in the URL, page title, h1s, and content. Of course, each location has a separate NAP. Each location also has its own GMB page. Each location has a decent amount of reviews across multiple sites (Google, Yelp, GetFiveStars).
Both locations were previously on their own domain until a year ago, when they were redirected to their current URLs. Both of those original domains were close to the same age. With the Padden Parkway location, we've tried to be even more hyper-local by including the address in the URLs and in the h1 of the outer page. We've also created an h2 that references local neighborhoods around the business.
We're also running into this situation in at least one other city, so I'm wondering if this has something to do with our URL structure. Other businesses in our space use the URL structure of domain.com/state/city/location. We only go down to the state level. What are we missing?
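As a sketch of one more thing we could try, per-location LocalBusiness structured data would at least make the two branches machine-distinguishable when the city is identical. The street addresses and phone numbers below are invented placeholders, not our real NAP:

```python
import json

def local_business_jsonld(name, street, city, region, postal, phone):
    """Build a schema.org LocalBusiness (SelfStorage) block for one branch page."""
    return {
        "@context": "https://schema.org",
        "@type": "SelfStorage",
        "name": name,
        "telephone": phone,
        "address": {
            "@type": "PostalAddress",
            "streetAddress": street,
            "addressLocality": city,
            "addressRegion": region,
            "postalCode": postal,
        },
    }

# Invented details for the two Vancouver, WA branches described above.
for branch in [
    local_business_jsonld("West Coast Self-Storage Vancouver",
                          "123 Example St", "Vancouver", "WA", "98660",
                          "+1-360-555-0100"),
    local_business_jsonld("West Coast Self-Storage Padden Parkway",
                          "456 Example Ave", "Vancouver", "WA", "98662",
                          "+1-360-555-0101"),
]:
    print(json.dumps(branch, indent=2))
```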
Local Website Optimization | misterfla
-
What Is The Best Strategy For Writing Image Alt Text For SEO?
Curious on this topic, as websites that are image-heavy but have little written content can depend on alt text for "readable content". I am aware the "best practice" is to write it as if you were describing the image to a blind person, but are there any SEO strategies that people have seen good results with? Some examples I've heard are:
"unique keyword phrase"
"unique keyword phrase + brand name"
"unique keyword phrase + LSI keyword"
Interested to hear feedback from the Moz community! And thanks in advance for sharing your insight.
Local Website Optimization | LureCreative
-
Google plus page multiple domains
Hi, I have had a .com domain for many years, linked to my Google+ page and local-verified to my UK office address. This site sells and advertises my products; some of them are UK-only, like the school and computers I sell, and the rest are digital and worldwide. I decided to start a .co.uk domain to be more targeted to the UK, advertising only the school and computers that I sell to the UK, and just linking to the .com for digital products. I want the .com domain to attract worldwide customers and the .co.uk to attract UK customers.
What do I do? Does it make sense to connect my Google+ business page to the .co.uk site? Should I still have a Google+ page for the .com site? I only have one office, and that's in the UK. Not sure what to do here. I don't want to lose rankings or do anything negative. Thoughts? Thanks.
Local Website Optimization | theindic
-
What to do with localised landing pages on listings website - Canonical question
Hi, we run a pet listings website, and we had tonnes of duplicate content that we have resolved. But we're not sure what to do with the localised landing pages. We have everything pointing back to the main listings URL, http://www.dogscatsandpets.co.uk/for-sale-stud-and-adoption/, but haven't pointed the URLs that show pets for specific towns and cities, e.g. http://www.dogscatsandpets.co.uk/for-sale/dogs-and-puppies/in-city-of-london/, back to the main URL. Obviously this is giving us duplicate content issues, but these pages do rank in local search and drive traffic into the site. So my question is: should we canonicalise the local pages back to the main URL, and if we do, will this mean our local landing pages will no longer rank? Are there any alternatives?
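To make the trade-off concrete, here's a tiny sketch of the two canonical choices in question (Python only to render the tags). Canonicalising to the main URL asks Google to consolidate, which typically means the local pages stop ranking; a self-referencing canonical keeps them eligible, but then each page needs enough unique content to avoid the duplicate issues:

```python
def canonical_tag(href: str) -> str:
    """Render the rel=canonical link element for a page's <head>."""
    return f'<link rel="canonical" href="{href}" />'

main_url = "http://www.dogscatsandpets.co.uk/for-sale-stud-and-adoption/"
city_url = "http://www.dogscatsandpets.co.uk/for-sale/dogs-and-puppies/in-city-of-london/"

# Option A: consolidate the city page into the main URL
# (the city page will likely drop out of the rankings).
print(canonical_tag(main_url))
# Option B: self-referencing canonical on the city page
# (it stays eligible to rank, but needs unique content of its own).
print(canonical_tag(city_url))
```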
Local Website Optimization | dogscatsandpets
-
Do you need exact match geographically targeted keywords for ranking within a specified city limit?
For example, I have a personal injury law firm in Sheboygan, Wisconsin. I only care about potential clients searching within the city limits of Sheboygan (and not within the state of Wisconsin or on a national level). Do the following elements need to contain an exact-match geographically targeted keyword if I only care about ranking locally in Sheboygan, Wisconsin? (The type of keyword phrase I'm referring to would be Sheboygan Personal Injury Lawyers, Sheboygan Car Accident Lawyers, etc.)
Title tag
Meta description
Main headline
Body content
Or should I leave exact-match geographically targeted keywords out of my content and trust that Google can make the association with where I'm located from other factors on the website? Website factors:
Google local business page is set up, linking to my website
Other local listings have been claimed and set up properly
My contact page contains our full address and phone number
My footer contains our full address and phone number on every page
Local Website Optimization | peteboyd
-
One location performing worse than the rest despite no major difference in SEO strategy
Hi all, I'm flummoxed. I'm dealing with a business that has 15 or so offices in three cities, and one city is performing horribly (this includes every office therein). The other two cities have shown consistently stellar results with massive traffic increases month over month for the past year; the city in question dropped unexpectedly in June and hasn't ever recovered.
We didn't perform any major website changes during or immediately prior to that time period, and the website in general hasn't been negatively affected by Hummingbird. All locations for the business are optimized in the same way and according to best practices; there's no significant difference in the number of local listings, reviews, G+ fans, social signals, etc. across locations. All meta data and content is optimized, NAPs are all consistent, and we've built links wherever we can: the SEO for every location has been by the book.
We've run a competitor audit in this particular city that included pulling our top competitors and exploring their domain authority, meta data, on-page keyword grade for the term we're trying to rank for, number and type of inbound links, social signals, and more; and we didn't spot any patterns or any websites that were significantly outperforming us in any area (besides actual rankings).
It's frustrating because the client is expecting a fix for this city and I can't find anything that needs to be fixed! Have any multi-local SEOs out there run into a similar problem? What did you do about it?
Local Website Optimization | ApogeeResults
-
Single Site For Multiple Locations Or Multiple Sites?
Hi, sorry if this rambles on. There are a few details that kind of convolute this issue, so I'll try to be as clear as possible.
The site in question has been online for roughly 5 years. It's established with many local citations, does well in local SERPs (we're working on organic results currently), and represents a business with two locations in the same county. The domain is structured as location1brandname.com. The site was recently upgraded from a 6-10 page static HTML site with loads of duplicate content and poor structure to a nice, clean WordPress layout. Again, Google is cool with it, everything was 301'd properly, and our rankings haven't dropped (some have improved).
Here's the tricky part: to properly optimize this site for our second location, I am basically building a second website within the original, but customized for our second location. It will be location1brandname.com/secondcity, and the menu will be unique to second-city service pages, with a unique NAP in the footer, etc. I will then update our local citations with this new URL, and hopefully we'll start appearing higher in local SERPs for the second-city keywords that our main URL isn't currently optimized for.
The issue I have is that our root domain has our first city's name in it, and this might have some negative effect on ranking for the second URL. Conversely, starting on a brand new domain (secondcitybrandname.com) requires building an entire new site and starting from scratch. My hunch is that we'll be fine making root.com/secondcity that location's homepage, and that a new domain, while cleaner and completely separate from our other location, is too much work for not enough benefit. It seems like if they're the same company/brand, they should be on the same site, and we can use the root domain's juice to help. Thoughts?
Local Website Optimization | kirmeliux