Best practice for local keyword ranking in URLs
-
Hi, I have a large artificial grass website with many franchise location landing pages. At the moment I have most of the landing page URLs structured like this: www.domainname.com/uk/city/
My domain name does not contain the keyword "artificial grass", so should I follow the location with the keyword, i.e. /city-artificial-grass/, or is Google savvy enough these days to know that I am an artificial grass company?
I'm after the best recommendations for this if possible.
Thanks
-
Hey Alex,
I want to be sure I'm understanding this fully. Some questions:
-
Is artificial grass the main product your business sells?
-
If that's right, are you saying that your domain name is something like Alexs.com instead of AlexsArtificialGrass.com?
-
And if that's right, are you asking if your landing pages should look like alexs.com/country/city/artificialgrass instead of just alexs.com/country/city? Or something else?
-
And, finally, I'm curious about the use of a country name in your URLs. Do you have offices in more than one nation?
-
Related Questions
-
Do old backlinks still help a new URL after a 301 redirect? Also, I added the www. How does this affect it all?
I changed my URL from exampledetailing.com to exampleautodetailing.com. It is redirected with a 301. Also, it is on Squarespace, and I opted to add the www. So will the old backlinks to exampledetailing.com still help the new URL exampleautodetailing.com, or do I need to try and update all the links? Also, for future links, do I need to include the www, just the root domain exampleautodetailing.com, or even the whole https://www.exampleautodetailing.com? I believe the www is considered a subdomain and a new entity to Google, so I am not sure how that works. Thank you!
Local Website Optimization | Rmarkjr81
-
Should I avoid duplicate URL keywords?
I'm curious to know: can having a keyword repeat in the URL cause any penalties? For example: xyzroofing.com, xyzroofing.com/commercial-roofing, xyzroofing.com/roofing-repairs. My competitors with the highest rankings seem to be doing it without any trouble, but I'm wondering if there is a better way. Also, one of the problems I've noticed is that my /commercial-roofing page outranks my homepage for both residential and commercial search inquiries. How can this be straightened out?
Local Website Optimization | Lyontups
-
Hreflang "no return tags" errors in sitemap.xml, and local search landing page showing the wrong language
Really need help. When our website is searched in Google (US), it serves the global page instead of the local one (keywords: asus / asus zenfone3), and Search Console also returns "no return tags" errors. Another weird thing: when Googlebot crawls our sitemap.xml, it cannot finish the file, getting through less than a quarter of it. Can you please advise on what needs to be edited or changed to make sure my implementation is correct and not returning errors? (A sketch of a reciprocal hreflang pair follows below.)
Local Website Optimization | June0127
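For context, a "no return tags" error means that page A points at page B with an hreflang annotation but page B does not point back at A. A minimal sketch of a reciprocal pair in a sitemap.xml, using hypothetical URLs for the two regional product pages, might look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <!-- Every <url> entry must list ALL language variants, itself included. -->
  <url>
    <loc>https://www.example.com/us/zenfone3/</loc>
    <xhtml:link rel="alternate" hreflang="en-us"
                href="https://www.example.com/us/zenfone3/" />
    <xhtml:link rel="alternate" hreflang="en-gb"
                href="https://www.example.com/uk/zenfone3/" />
  </url>
  <url>
    <loc>https://www.example.com/uk/zenfone3/</loc>
    <!-- These are the "return tags": the UK entry must reference the US page back. -->
    <xhtml:link rel="alternate" hreflang="en-us"
                href="https://www.example.com/us/zenfone3/" />
    <xhtml:link rel="alternate" hreflang="en-gb"
                href="https://www.example.com/uk/zenfone3/" />
  </url>
</urlset>
```

If either entry omits the other side, Search Console reports "no return tags" for the pair.
-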
What is the best way for a UK company to source SEO support to boost SERPs in USA Google?
We are a niche web retailer with a world-leading product, and as such are probably the best option for USA customers (even though we are based in the UK). Up to 18 months ago Google agreed, placed us high for USA searches, and we had good business as a result. However, since Penguin (or around that time anyway), Google increased our SERPs for more local markets (UK and Europe) and decreased our ranks for the USA, with a consequent reduction in our USA sales. We want to improve our rank in the USA again (and Canada, Australia, and Russia) but need specialist help. What's the best way to source that (short of someone saying they know exactly how to do it)? Any recommendation most gratefully received. Tom
Local Website Optimization | tomnivore
-
Ranking a Website that Services Multiple Cities
We have a website that offers services to various cities in a state. However, since we don't want to do keyword stuffing, how do we rank this website for all of these cities when it comes to the title tags? For example, how do we optimize the homepage title tag? Obviously I know we can't put all the cities into it, so how do we choose which city to use? I know we can add city/local pages and optimize them for those locations, but I'm referring specifically to the homepage and other main pages of the website. How do you determine which cities to use in those title tags? (An illustrative pattern is sketched below.)
Local Website Optimization | SEOhughesm
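As an aside on the question above, one common pattern (purely illustrative, the brand and locations are made up) is to target the widest true service area in the homepage title tag and reserve individual cities for their own local pages:

```html
<!-- Homepage: target the state or region plus the brand, not any one city. -->
<title>Commercial & Residential Roofing in Arizona | XYZ Roofing</title>

<!-- City landing page: target exactly one city each. -->
<title>Roof Repair in Tempe, AZ | XYZ Roofing</title>
```
-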
Local SEO HELP for Franchise SAB Business
This all began when I was asked to develop experiment parameters for our content protocol & strategy. It should be simple, right? I've reviewed A/B testing tips for days now, from Moz and other sources. I'm totally amped and ready to begin testing in Google Analytics.

Say we have a restoration service franchise with over 40 franchises we perform SEO for, all over the US. Every franchise has its own local website, e.g. restorationcompanylosangeles.com. Every franchise purchases territories in which it wants to rank; some service over 100 cities. Most franchises also have PPC campaigns. As part of our strategy we incorporate the location reach data from AdWords to focus on their high-reach locations first. We have 'power pages' which include 5 high-reach branch preferences (areas the owners prefer to target) and 5 non-branch-preference high-reach locations. We are working heavily on our national brand presence and working with PR and local news companies to build relationships for natural backlinks. We are developing a social media strategy for national brand outlets and local outlets. We use major aggregators to distribute local citations for our branch offices and make sure all NAP is consistent across all citations. We are partners with Google, so we work with them on new branches to create their Google listings (My Business & G+). We use local business schema markup on all pages. Our content protocol encompasses all the needed on-site optimization tactics: meta, titles, schema, keyword placement, semantic Q&A, internal linking strategies, etc. Our leads are calls and form submissions. We use several call tracking services to monitor calls, the caller's location, etc. We are testing CallRail to start monitoring the landing pages and keywords that generate our leads.

Parts that I want to change: some of the local sites have over 100 pages targeted for 'water damage + city', aka what Moz would call "doorway pages." These pages have 600-1000 words, all talking about services we provide. Our writers (4 of them) manipulate them so that they aren't duplicate pages, but the roughly 100 words they add about the city location are the only unique variable. We pump out about 10 new local pages a month per site, so yes, over 300 local pages a month. Traffic to the local sites is very scarce. Content protocol / strategy is tested based only on ranking! We have a tool that monitors ranking on all domains, which does not account for mobile, local, or user-preference-based searching like Google Now. My team is deeply attached to basing our metrics solely on ranking. The logic is that if no local city page exists for a targeted location, there is less likelihood of ranking for that location, and if you are not seen you will not get traffic or leads. Ranking for power locations is poor, while less competitive low-reach locations rank OK. We are updating the content protocol by tweaking small things (multiple variants at a time), then checking ranking every day for about a week to determine whether the experiment was a success.

What I need: an internal duplicate content analyzer, to prove that writing over 400 pages a month about water damage + city IS duplicate content. Unique content for 'power pages': I know from dozens of chats here in the community and in Moz blogs that we can only truly create quality content for 5-10 pages, meaning we need to narrow down which locations are most important to us and beef them up. Blog content for non-'power' locations. A new experiment protocol based on metrics like traffic, impressions, bounce rate, landing page analysis, domain authority, etc. A deeper dig into call metrics and their sources.

Now I am at a roadblock, because I cannot develop valid content experiment parameters based on ranking. I know that A/B testing requires two pages that are the same except for one variable. We'd either noindex the variants or canonicalize them, and both work against testing ranking for the same term (a sketch of both tags is below). Questions: Are all these local pages duplicate content? Is there such a thing as content experiments based solely on ranking? Any other suggestions for this scenario?
Local Website Optimization | MilestoneSEO_LA
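For reference, the two de-duplication options mentioned at the end of the question would look like this in the head of a test variant page (URLs are hypothetical). Either one tells Google to disregard the variant in its own right, which is exactly why neither is compatible with measuring the variant's ranking:

```html
<!-- Option 1: canonicalize the variant to the original page. Ranking
     signals consolidate to the canonical URL, so the variant itself
     does not rank independently. -->
<link rel="canonical" href="https://www.example.com/water-damage-los-angeles/" />

<!-- Option 2: keep the variant out of the index entirely. -->
<meta name="robots" content="noindex, follow" />
```
-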
Local Business Schema Markup on every page?
Hello, I have two questions. If someone could shed some light on the topic, I would be so very grateful!

1. I am still making my way through how schema is employed, and as far as I can tell, it is much more specific (and therefore relevant) in its details than using the Data Highlighter tool. Is this true?

2. Most of my clients' sites have a footer with the local business info included on every page of the site (address and phone). This said, I have been using the Structured Data Markup Helper to add local business schema to the home page, and then including the footer markup in the footer file so that every page benefits from the local business markup. Is it incorrect to use it on every page? Also, I noticed that by just using the footer markup for the rest of the pages on the site, I am missing data that was included when I manually went through the index page (i.e. image, URL, name of business). Could someone tell me if it is advisable and worth it to manually mark up every page for the local business schema, or if it should just be used for certain pages such as location, contact us, and/or the index? (A minimal markup sketch is included below.) Any tips or help would be greatly appreciated!!! Thanks
Local Website Optimization | lfrazer
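For illustration, a minimal JSON-LD LocalBusiness block, the kind of thing that could live in a sitewide footer include, might look like this (business name, address, and URLs are hypothetical):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co.",
  "url": "https://www.example.com/",
  "image": "https://www.example.com/images/storefront.jpg",
  "telephone": "+1-555-555-0199",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701",
    "addressCountry": "US"
  }
}
</script>
```

Because JSON-LD is a self-contained block rather than attributes woven into the visible HTML (as with microdata or the Data Highlighter), a footer include like this gives every page the same complete data, so the image, URL, and business name are not lost on inner pages.
-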
Bing ranking a weak local branch office site of our 200-unit franchise higher than the brand page - throughout the USA!?
We have a brand with a major website at ourbrand.com (I'm using stand-ins for the actual brand name). The brand is a unique term, has 200 local offices with sites at ourbrand.com/locations/locationname, is structured with best practices, and has a well-built sitemap.xml. The link profile is diverse and solid. There are very few crawl errors and no warnings in Google Webmaster Central. Each location has schema.org markup that has been checked with markup validation tools. No matter what tool you use and how you look at it, it's obvious this is the brand site: DA 51/100, PA 59/100. A rogue franchisee has broken their agreement and made their own site for one city on a different domain name, ourbrandseattle.com. The site is clearly optimized for that city and has a weak inbound link profile: DA 18/100, PA 21/100. The link profile has low diversity and is generally weak. They have no social media activity. They have not linked to ourbrand.com (my leading theory). The problem is that this rogue site is OUTRANKING the brand site all over the USA on Bing, even where it makes no sense at all. We are using whitespark.ca to check our ranking remotely in other cities and try to remove the effects of local personalization. What should we do? What have I missed?
Local Website Optimization | scottclark