Help choosing ideal URL structure
-
Hi All,
We are considering changing the URL structure for the website of a large restaurant group, which represents about 100 restaurants in the USA. While I have some opinions of my own, I'd very much welcome the opinions of other seasoned SEOs as well.
There are two options on the table for the URL structure, which you can see below. The question concerns restaurants with multiple locations and how we structure those URLs. The main difference is whether we include "/location/" in the URL, or whether that is overkill. I suppose it could have some value if someone searches a term like "Bub City location", with "location" right in the query; otherwise it just adds to the length of the URL, and I'm not sure it brings any extra value.
In this example, "bub-city" is the restaurant name, and "mb-financial-park" is one of the locations.
Option A
http://leye.local/restaurant/bub-city
http://leye.local/restaurant/bub-city/location/mb-financial-park/

Option B
http://leye.local/restaurant/bub-city
http://leye.local/restaurant/bub-city/mb-financial-park/

Thoughts?
-
Hi There,
Both options can work; base the choice on the following factors:
- It's good to include keywords in the URL as long as the URL stays around 50-70 characters. If it's stretching toward 100, cut back to a shorter structure.
- Does http://leye.local/restaurant/bub-city/location/ resolve to a real page? If it does, it's worth keeping as part of the structure; if not, it just adds characters without being a meaningful part of the site's architecture. The fewer the folders the better, unless they serve a navigational purpose in the structure of the website.
- Stuffing too much into the URL will make it look spammy; everything in it should be relevant and important to the structure of the website.
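To make the character guideline concrete, here's a quick way to check both proposed URLs against it. This is just an illustration; the 70-character soft limit is a rule of thumb, not a hard cutoff, and the helper function is made up for this example:

```python
# Compare the two proposed URL structures against a soft length guideline.
from urllib.parse import urlparse

def url_length_report(url: str, soft_limit: int = 70) -> tuple[int, bool]:
    """Return the URL's character count and whether it stays under the soft limit."""
    return len(url), len(url) <= soft_limit

option_a = "http://leye.local/restaurant/bub-city/location/mb-financial-park/"
option_b = "http://leye.local/restaurant/bub-city/mb-financial-park/"

for url in (option_a, option_b):
    length, ok = url_length_report(url)
    print(urlparse(url).path, length, "OK" if ok else "over limit")
```

Here both options fit comfortably under the limit, so length alone won't decide it; the question of whether /location/ earns its place in the structure matters more.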
I hope this helps.
Regards,
Vijay
Related Questions
-
Do old backlinks still help with new URL with 301 redirect? Also I added the www. How does this affect it all?
I changed my URL from exampledetailing.com to exampleautodetailing.com. It is redirected with a 301. Also, it is on Squarespace AND I opted to add the www. So will the old backlinks to exampledetailing.com still help the new URL exampleautodetailing.com, or do I need to try to update all the links? Also, for future links, do I need to include the www, just the root domain exampleautodetailing.com, or the whole https://www.exampleautodetailing.com? I believe the www is considered a subdomain and a new entity to Google, so I am not sure how that works. Thank you!
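Not an authoritative answer, but one offline sanity check worth doing once the 301s are live is confirming that the www and non-www host variants normalize to the same canonical host. A minimal sketch (the domains are the ones from the question; the helper is hypothetical, and in practice the actual redirect responses should also be verified):

```python
# Normalize URL hosts so www/non-www variants can be compared.
from urllib.parse import urlparse

def canonical_host(url: str) -> str:
    """Lowercase the host and strip a leading 'www.' so variants compare equal."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

# All of these should point at the same site once the redirects are in place.
variants = [
    "https://www.exampleautodetailing.com",
    "https://exampleautodetailing.com",
    "http://WWW.exampleautodetailing.com",
]
print({canonical_host(u) for u in variants})  # a single-element set means they agree
```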
Local Website Optimization | | Rmarkjr810 -
Local SEO - Multiple stores on same URL
Hello guys, I'm working on a local SEO plan for a client that manages over 50 local stores. At the moment all the stores share the same URL, and I wanted to ask whether it's better to build a unique page for each store or fine to keep them all on the same URL. What do you think? What's the best way, and why? Thank you in advance.
Local Website Optimization | | Noriel0 -
Structuring URLs of profile pages
First of all, I want to thank everyone for the feedback that I received on the first question. My next question has to do with the URL structure of personal trainer profiles pages on www.rightfitpersonaltraining.com. Currently, the structure of each trainer profile page is "www.rightfitpersonaltraining.com/personal-trainers/trainer/" and at the end I manually add the trainer's "city-firstname-lastinitial". Would it be to my benefit to have the developers change the structure so that the trainer profile URLs are "www.rightfitpersonaltraining.com/city-personal-trainers/trainername"? That way, each trainer profile would link directly to the trainer's city page as opposed to the general "personal-trainers" page. I don't mind paying a little extra to go back into the site to make these changes, as I think they would benefit the search ranking for each city page.
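If the restructure goes ahead, the main technical cost is generating the new URLs and a one-to-one 301 map from the old ones. A hedged sketch of how the proposed URL could be built from existing trainer data (the field names and slug rules here are assumptions for illustration, not how the site actually stores trainers):

```python
# Build the proposed /<city>-personal-trainers/<name> URL from trainer fields.
def new_profile_url(city: str, first: str, last_initial: str) -> str:
    """Return the restructured trainer-profile URL for the given fields."""
    base = "https://www.rightfitpersonaltraining.com"
    city_slug = city.lower().replace(" ", "-")
    name_slug = f"{first.lower()}-{last_initial.lower()}"
    return f"{base}/{city_slug}-personal-trainers/{name_slug}"

print(new_profile_url("Chicago", "Jane", "D"))
# https://www.rightfitpersonaltraining.com/chicago-personal-trainers/jane-d
```

Generating the mapping programmatically like this also gives you the exact old-URL/new-URL pairs to hand the developers for the 301 redirects.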
Local Website Optimization | | mkornbl20 -
Country/Language combination in subdirectory URL
Hello, We are a multi-country, multilingual (English, Arabic) website. We follow a subdirectory structure to separate and geotarget the country/language combinations. Currently our English and Arabic URLs are the same: for the UAE, example.com/ae (English site); for Saudi Arabia, example.com/sa. We want to separate the English and Arabic language URLs, and I wanted to know if there is any preference as to which URL structure we should go with: example.com/ae-en (country-language), example.com/en-ae (language-country), or example.com/ae/en (country/language). Is there any logic to deciding how to structure the language/country combinations, or is it entirely a matter of personal preference? Thanks!
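Whichever order is chosen, being consistent and pairing each variant with the matching hreflang annotation matters more than the order itself. A small sketch that builds the subdirectory and hreflang value for the country/language pairs in the question, assuming the country-language option is picked (the helper and output format are illustrative, not a recommendation):

```python
# Map a country/language pair to a subdirectory and an hreflang value.
def locale_path(country: str, language: str) -> tuple[str, str]:
    """Return (subdirectory, hreflang value) for a country/language pair."""
    subdir = f"/{country.lower()}-{language.lower()}"
    hreflang = f"{language.lower()}-{country.upper()}"  # e.g. en-AE
    return subdir, hreflang

for country, language in [("ae", "en"), ("ae", "ar"), ("sa", "en"), ("sa", "ar")]:
    subdir, hreflang = locale_path(country, language)
    print(f'<link rel="alternate" hreflang="{hreflang}" href="https://example.com{subdir}/" />')
```

Note that hreflang values themselves are always language-region (en-AE, ar-SA), regardless of which order the URL path uses.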
Local Website Optimization | | EcommRulz0 -
Title Tag, URL Structure & H1 for Localization
I am working with a local service company. They have one location but offer a number of different services to both residential and commercial verticals. What I have been reading suggests that I put the location in URLs, title tags, and H1s. Isn't it kind of spammy, and possibly an annoying user experience, to see the location on every page? "Portland ME Residential House Painting", "Portland ME Commercial Painting", "Portland Maine commercial sealcoating", "Portland Maine residential sealcoating", etc. This strikes me as an old-school approach. Isn't Google adept enough at recognizing location that I don't need to paste it into H1s all over the site? Thanks in advance. Patrick
Local Website Optimization | | hopkinspat0 -
RE: Keep Losing Keyword Ranking Position for Targeted Keyword Terms Can't Figure It Out, Please Help!!!
Hey Mozzers, I am pulling my hair out trying to figure out why one of my clients keeps losing rankings for their targeted keyword terms. We're actively pursuing local citations, making sure their NAP is consistent across the board, and refining on-page content to maximize opportunities. The only thing I've found is a 4xx error that Moz's crawl diagnostics keep returning, yet when I check Google Webmaster Tools it reports no errors. Is this 4xx error the culprit? Are there any suggestions you could give me to help improve the rankings for my targeted keyword terms? Any and all insight helps; I'm at my wits' end. Thanks for reading and for all of your help!
Local Website Optimization | | maxcarnage0 -
Local SEO HELP for Franchise SAB Business
This all began when I was asked to develop experiment parameters for our content protocol & strategy. It should be simple, right? I've reviewed A/B testing tips for days now, from Moz and other sources, and I'm amped and ready to begin testing in Google Analytics.

Background: say we have a restoration service franchise with over 40 franchises we perform SEO for, all over the US. Every franchise has its own local website, for example restorationcompanylosangeles.com, and purchases territories in which it wants to rank; some service over 100 cities. Most franchises also run PPC campaigns, and as part of our strategy we use location-reach data from AdWords to focus on their high-reach locations first. We have 'power pages' covering 5 high-reach branch-preference locations (areas the owners prefer to target) and 5 non-preference high-reach locations. We are working heavily on our national brand presence, working with PR and local news companies to build relationships for natural backlinks, and developing a social media strategy for both national and local outlets. We use major aggregators to distribute local citations for our branch offices and make sure all NAP is consistent across them. As Google partners, we work with Google to create listings (My Business & G+) for new branches, and we use local business schema markup on all pages. Our content protocol covers the needed on-site optimization tactics: meta, titles, schema, keyword placement, semantic Q&A, internal linking strategies, etc. Our leads are calls and form submissions; we use several call-tracking services to monitor calls and caller location, and we are testing CallRail to start tying leads to landing pages and keywords.

Parts that I want to change: some of the local sites have over 100 pages targeted at 'water damage + city', what Moz would call doorway pages. These pages have 600-1000 words, all talking about the services we provide. Our four writers vary them so they aren't literal duplicates, but the only unique variable is about 100 words about the city. We pump out about 10 new local pages a month per site, so yes, over 300 local pages a month, and traffic to the local sites is very scarce. The content protocol/strategy is tested on ranking alone: we have a tool that monitors rankings on all domains, which accounts for neither mobile, local, nor user-based preference searching like Google Now. My team is deeply attached to basing our metrics solely on ranking. The logic is that if no local city page exists for a targeted location, there is less likelihood of ranking for it, and if you are not seen you will get neither traffic nor leads. Ranking for power locations is poor, while less competitive low-reach locations rank OK. We update the content protocol by tweaking small things (multiple variants at a time), then check rankings every day for about a week to determine whether the experiment was a success.

What I need: an internal duplicate-content analyzer, to prove that writing over 400 pages a month about 'water damage + city' IS duplicate content; unique content for the power pages (I know from dozens of chats here in the community and in Moz blogs that we can only truly create quality content for 5-10 pages, meaning we need to narrow down which locations matter most and beef them up); blog content for non-power locations; a new experiment protocol based on metrics like traffic, impressions, bounce rate, landing-page analysis, domain authority, etc.; and a deeper dig into call metrics and their sources.

I am now at a roadblock because I cannot develop valid content-experiment parameters based on ranking. A/B testing requires two pages that are the same except for one variable, and we'd either noindex or canonicalize one of them; neither is compatible with testing rankings for the same term.

Questions: Are all these local pages duplicate content? Is there such a thing as a content experiment based solely on ranking? Any other suggestions for this scenario?
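On the "internal duplicate content analyzer" item: one simple, common way to quantify near-duplication is word-shingle Jaccard similarity between page bodies; pages that share their boilerplate and differ only in ~100 city-specific words score high. A minimal sketch (the sample texts are made up for illustration, and any flagging threshold would be a judgment call, not an industry standard):

```python
# Measure near-duplication between two page texts via word-shingle Jaccard similarity.
def shingles(text: str, k: int = 3) -> set[tuple[str, ...]]:
    """Split text into overlapping k-word shingles."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: str, b: str, k: int = 3) -> float:
    """Jaccard similarity of two texts' shingle sets (1.0 = identical)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = "We repair water damage fast. Call our certified team in Los Angeles today."
page_b = "We repair water damage fast. Call our certified team in San Diego today."
# Pages sharing most of their text score well above unrelated pages.
print(round(jaccard(page_a, page_b), 2))
```

Run across all city-page pairs on a site, this would give concrete numbers to show the team, instead of arguing about whether a 100-word city paragraph makes a page "unique".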
Local Website Optimization | | MilestoneSEO_LA1 -
What's the best way to add phrase keywords to the URL?
Hi, Our keywords are all our service + a list of towns (for example, "carpet cleaning St. Louis"). The issue I'm having is that one particular site could be targeting "carpet cleaning St. Louis", "carpet cleaning Manchester", "carpet cleaning Ballwin", "carpet cleaning Kirkwood", etc. etc. etc... up to maybe 15 different towns. Is there a way to effectively add these keywords into the URL without making it look spammy? I'm having the same issue with adding the exact keywords to the page title, img alt tag, etc. Thanks for any advice/input!
Local Website Optimization | | nataliefwc0