URL and title strategy for multiple location pages in the same city
-
Hi,
I have a client who is opening additional branches in cities where, until now, they had only one branch.
My question is: Once we open new store pages, what is the best strategy for the local store pages in terms of URL and title?
So far I've seen some different strategies for URL structure:
Some use [URL]/locations/cityname-1/2/3 etc.
while others use [URL]/locations/cityname-zip code/
I've even seen [URL]/locations/street address-cityname (that's what Starbucks does).
There are also different strategies for the title of the branch page.
Some use [city name] [state] [zip code] | [Company name]
Others use [Full address] | [Company name]
Or [City name] [US state] [1/2/3] | [Company name]
Or [City name] [District / Neighborhood] [Zip Code] | [Company name]
What is the preferred strategy for getting the best results? On the one hand, I want to differentiate the store pages from one another and gain as much local coverage as possible; on the other hand, I want to create consistency and establish a long-term strategy, given that many more branches will be opened in the near future.
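To make the trade-offs concrete, here is a minimal sketch of the three URL patterns mentioned above, built from the same branch record. The field names and sample values are purely illustrative, not taken from any real site:

```python
import re

def slugify(text):
    """Lowercase a string and replace runs of non-alphanumerics with hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", str(text).lower()).strip("-")

# Hypothetical branch record; the field names are illustrative.
branch = {"city": "Chicago", "zip": "60601",
          "street": "32 Center St", "index": 2}

# The three URL patterns described above, side by side:
numbered = f"/locations/{slugify(branch['city'])}-{branch['index']}"
zip_based = f"/locations/{slugify(branch['city'])}-{branch['zip']}"
street_based = f"/locations/{slugify(branch['street'])}-{slugify(branch['city'])}"

print(numbered)      # /locations/chicago-2
print(zip_based)     # /locations/chicago-60601
print(street_based)  # /locations/32-center-st-chicago
```

Note that only the street-based pattern stays stable and self-describing as branches are added; the numbered pattern depends on the order in which stores open.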
-
My pleasure, Gal
-
Thanks!
-
Hey Gal,
I'm personally a fan of title tags like this one on Phil Rozek's homepage:
http://www.localvisibilitysystem.com/
I love it when they contain a mix of sentiment and keywords, because this makes them stand out in the SERPs.
It's fine to have your brand name in most/all of your title tags, and I think it's REALLY important to have it on the homepage, the local landing page, the about page, and the contact page. Beyond that, with both title and H tags, I recommend being creative.
-
I'd like to follow up on this question.
In the title tag, we'd like to keep the brand name. We have 1 product which is the main keyword.
What would you suggest about the title tag and the H1 in these city pages?
-
Hi Gal,
My suggestion would be to go forward like this:
URLs:
domain.com/chicago/32-center-st
Title Tags:
Brand City Street
Others in our community may have different suggestions, but the above seems simple and consistent to me.
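As a rough sketch, the suggested scheme could be generated programmatically once and reused for every new branch, which helps keep the structure consistent as the client expands. The brand name "Acme Coffee" and the slugify helper below are illustrative assumptions, not from the thread:

```python
import re

def slugify(text):
    """Lowercase and hyphenate a string for use in a URL path segment."""
    return re.sub(r"[^a-z0-9]+", "-", str(text).lower()).strip("-")

def location_url(domain, city, street):
    """Build a city/street URL like domain.com/chicago/32-center-st."""
    return f"https://{domain}/{slugify(city)}/{slugify(street)}"

def title_tag(brand, city, street):
    """Assemble a 'Brand City Street' title tag."""
    return f"{brand} {city} {street}"

print(location_url("domain.com", "Chicago", "32 Center St"))
# https://domain.com/chicago/32-center-st
print(title_tag("Acme Coffee", "Chicago", "32 Center St"))
# Acme Coffee Chicago 32 Center St
```

Driving both the URL and the title tag from the same city/street fields means every future branch page follows the pattern automatically.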