What Are Your Thoughts On Location-Targeted Pages?
-
I have a client who wants to rank for a bunch of locations around his primary location, say 30 minutes away. So we created a bunch of pages for cities around his location, and so far it seems to be working pretty well. That said, I heard from someone else that Google really doesn't like these types of pages anymore and that we would be better off with just one location page that lists the areas we serve.
What are your thoughts and experiences?
-
Thanks for the responses, guys. Sounds like you both think using location pages is still a good way to go, as long as you are not trying to fake having a location where you don't actually have one.
-
"I would first offer a simple suggestion whenever someone says, Google doesn't or Google does... Ask them where they read it or saw it."
I agree with Robert here. Chances are the person you heard it from tried it, didn't implement it correctly, and assumed that because it didn't work for them, it won't work for you. Lol.
"City" pages are one of the most effective ways to drive traffic for additional service areas. We have used these multiple times with great success. Having individual location pages with a exclusive URL, page title, content, meta description and links can help you rank in areas outside your major competitive areas. For example:
A lawn care guy serves St. Louis. We set up pages for the surrounding areas. The main site content focuses on St. Louis while mentioning the other areas; each city page focuses on that one area only. All content is unique for each page (important), and we look for ways to include details that are specific to that one area. Images, links, and alt text are all named appropriately for that location. You can always include wiki info, maps, images, or other media to help a user (and search engine) immediately understand what the page is about.

Set up a variety of tracking methods, as Robert stated, to further help users decide that the service provider is "local". People will be more likely to call a number showing their own area code than an 800 or 855 number. Tossable Digits has them for cheap, or you can use Google Voice. As stated before, DO NOT USE VIRTUAL ADDRESSES or a UPS suite. You want to add to the value of your site, not hurt it.
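To make the "unique everything" point concrete, here is a minimal sketch of how you might template per-city metadata so no two pages share a title, description, URL, or alt text. Everything in it (the business name, the cities, the URL pattern) is hypothetical, not from this thread:

```python
# Hypothetical sketch: unique URL, title, meta description, and image alt
# text for each city page. All names and cities are illustrative.
CITIES = {
    "chesterfield": "Chesterfield, MO",
    "kirkwood": "Kirkwood, MO",
    "florissant": "Florissant, MO",
}

def city_page_meta(slug: str) -> dict:
    """Build per-city metadata so every location page is distinct."""
    city = CITIES[slug]
    return {
        "url": f"/lawn-care-{slug}/",
        "title": f"Lawn Care in {city} | Example Lawn Co",
        "meta_description": (
            f"Weekly mowing, fertilization, and weed control for homes in {city}. "
            "Locally owned, free estimates."
        ),
        "img_alt": f"Freshly mowed lawn at a home in {city}",
    }

for slug in CITIES:
    print(city_page_meta(slug)["title"])
```

Templated titles and descriptions are only scaffolding; the body copy of each page still has to be genuinely unique, as stressed above.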
-
Netviper
I would first offer a simple suggestion whenever someone says 'Google doesn't' or 'Google does'... Ask them where they read it or saw it. (Obviously, hearing it secondhand is not very beneficial.) That said, I politely disagree with whoever said that.
We have clients that are service-area businesses, and they need to be present when someone searches, for example, for a plumber: Houston plumber, plus neighborhoods A, B, C, D... What you must do is be careful regarding local listings and how you set them up. Do not go out and get UPS addresses, virtual addresses, etc. for these communities. You should have one location for each real business address that your client has; if they only have one in the city, that is it. Make sure that Name, Address, and Phone (NAP) are always the same. If you want to use a different phone number for a given community (typically for tracking purposes), make sure you present it as an image on the web page: no HTML text and no alt text containing the number. Forever (or until things change) you will have the same NAP for the entire city, unless it is a tracking number that is not crawled.
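A concrete way to keep the crawled NAP identical across every community page is to generate it from one source of truth, for example as schema.org LocalBusiness structured data. A minimal sketch, with every value a placeholder (the schema.org Plumber type is real; the business is not):

```python
import json

# Hypothetical sketch: one canonical NAP record, emitted as schema.org
# JSON-LD and reused verbatim on every community page, so the crawled
# Name/Address/Phone never varies. All values are placeholders.
NAP = {
    "@context": "https://schema.org",
    "@type": "Plumber",
    "name": "Example Plumbing Co",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Houston",
        "addressRegion": "TX",
        "postalCode": "77002",
    },
    # The one real number; per-community tracking numbers live only in
    # images, per the advice above, so crawlers never see them as text.
    "telephone": "+1-713-555-0100",
}

def nap_script_tag() -> str:
    """Render the JSON-LD block to paste into every community page."""
    return '<script type="application/ld+json">%s</script>' % json.dumps(NAP, indent=2)

print(nap_script_tag())
```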
For every page you create for a community, I suggest the following: go shoot some photos (or have them shot) using a camera or phone with GPS, and use one or two of those on your page. Talk about that area and the business, and as much as possible, try not to make every page the same. Find ways to add a paragraph or two that are real differentiators for that locale.
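If you want to verify that a photo actually carries GPS data before putting it on a community page, here is a quick sketch using the Pillow imaging library (an assumption on my part; any EXIF reader works, and the filename is hypothetical):

```python
from PIL import Image  # assumes Pillow 8+ is installed

def gps_coords(path: str):
    """Return (lat, lon) from a photo's EXIF GPS tags, or None if absent."""
    exif = Image.open(path).getexif()
    gps = exif.get_ifd(0x8825)  # 0x8825 is the standard GPS IFD tag
    if not gps:
        return None

    def to_degrees(dms, ref):
        d, m, s = (float(v) for v in dms)
        deg = d + m / 60 + s / 3600
        return -deg if ref in ("S", "W") else deg

    # GPS IFD keys: 1 = LatitudeRef, 2 = Latitude, 3 = LongitudeRef, 4 = Longitude
    return (to_degrees(gps[2], gps[1]), to_degrees(gps[4], gps[3]))

print(gps_coords("community_photo.jpg"))  # hypothetical filename
```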
Doing this, you should have some success with meeting the needs of your client.
Best
Robert
-
Related Questions
-
Google webcache of product page redirects back to product page
Hi all, I've legitimately never seen this before, in any circumstance. I just went to check the Google webcache of a product page on our site (I was just grabbing the last indexation date) and was immediately redirected away from Google's cached version BACK to the site's standard product page. I ran a status check on the product page itself and it returned 200, then ran a status check on the webcache version, and sure enough, it registered as redirected. It looks like this is happening for ALL indexed product pages across the site (several thousand), and though organic traffic has not been affected, it is starting to worry me a little bit. Has anyone ever encountered this situation before? Why would a Google webcache possibly have any reason to redirect? Is there anything to be done on our side? Thanks as always for the help and opinions, y'all!
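For anyone who wants to reproduce the status check described above, a rough sketch in Python with the requests library (the example URLs are placeholders; webcache.googleusercontent.com is the usual cache host):

```python
import requests

def status_of(url: str) -> None:
    # allow_redirects=False exposes the raw 30x instead of silently following it
    r = requests.get(url, allow_redirects=False, timeout=10)
    print(r.status_code, url, "->", r.headers.get("Location", "(no redirect)"))

page = "https://www.example.com/products/widget"  # placeholder product URL
cache = "https://webcache.googleusercontent.com/search?q=cache:" + page

status_of(page)   # expect 200
status_of(cache)  # a 301/302 here means the cache itself is redirecting
```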
Intermediate & Advanced SEO | | TukTown1 -
Duplicate Page getting indexed and not the main page!
Main Page: www.domain.com/service
Duplicate Page: www.domain.com/products-handler.php/?cat=service

1. My page was getting indexed properly in 2015 as: www.domain.com/service
2. After a redesign in Aug 2016, a new URL pattern surfaced for my pages, with the parameter "products-handler".
3. One of my product landing pages got 301 (permanent) redirected to the "products-handler" page. MAIN PAGE: www.domain.com/service GETTING REDIRECTED TO: www.domain.com/products-handler.php/?cat=service
4. This redirection was appearing until Nov 2016.
5. I took over the website in 2017; the main page was getting indexed and deindexed on and off.
6. This June it suddenly started showing an index of this page: domain.com/products-handler.php/?cat=service
7. These "products-handler.php" pages were creating sitewide internal duplication, hence I blocked them in robots.txt.
8. Then my page (Main Page: www.domain.com/service) dropped out of the Google index entirely.

Q1) What could be the possible reasons for the creation of these pages?
Q2) How did a 301 get placed from the main URL to the duplicate URL?
Q3) When I have submitted my main URL multiple times in Search Console, why doesn't it get indexed?
Q4) How can I make Google understand that these URLs are not my preferred URLs?
Q5) How can I permanently remove these (products-handler.php) URLs?

All suggestions and discussions are welcome! Thanks in advance! 🙂
Intermediate & Advanced SEO | Ishrat-Khan
Google is indexing the wrong page for search terms not on that page
I’m having a problem … the wrong page is indexing with Google, for search phrases “not on that page”. Explained …

On a website I developed, I have four products. For example’s sake, we’ll say these four products are:
Sneakers (search phrase: sneakers)
Boots (search phrase: boots)
Sandals (search phrase: sandals)
High heels (search phrase: high heels)

Error: What is going “wrong” is … when the search phrase “high heels” is indexed by Google, my “Sneakers” page is being indexed instead (and ranking very well, like #2). The page that SHOULD be indexing is the “High heels” page (not the Sneakers page – this is the wrong search phrase, and it’s not even on that product page – not in the URL, not in H1 tags, not in the title, not in the page text – nowhere, except for in the top navigation link).

Clue #1 … this same error is ALSO happening for my other search phrases, in exactly the same manner. I.e. … the search phrase “sandals” is ALSO resulting in my “Sneakers” page being indexed by Google.

Clue #2 … this error is NOT happening with Bing (the proper pages are correctly indexing with the proper search phrases in Bing).

Note 1: Moz has given all my product pages an “A” ranking for optimization.
Note 2: This is a WordPress website.
Note 3: I recently migrated (3 months ago) most of this new website’s page content (but not the “Sneakers” page – this page is new) from an old, existing website (not mine), which had been indexing OK for these search phrases.
Note 4: 301 redirects were used for all of the OLD website pages, to the new website.

I have tried everything I can think of to fix this, over a period of more than 30 days. Nothing has worked. I think the “clues” (it indexes properly in Bing) are useful, but I need help. Thoughts?
Intermediate & Advanced SEO | | MG_Lomb_SEO0 -
If Penguin 2.0 targets specific pages and keywords, should I spend less SEO effort on them, since they will be harder to optimize? Penalty repair is only starting at the end of the year.
I’m working with a company that got hit by Penguin 2.0. They’re going to switch to white-hat only for a few months and review analytics before considering repairing the penalty. In the meantime, would it make sense to focus less SEO effort (on-site optimization, link building, etc.) on any pages or keywords that were penalized or hit hardest? Or are those the pages we should work on the most? Thanks for reading!
Intermediate & Advanced SEO | | DA20130 -
Page Titles of Blog
Hi, should all the page titles of our blog include keyword(s) and/or our website name?
Intermediate & Advanced SEO | | Studio330 -
Page not appearing in SERPs
I have a regional site that does fairly well for most towns in the area (top 10-20). However, one place that has always done OK and has great content is not anywhere within the first 200. Everything looks OK: the canonical link is correct, I can find the page if I search for exact text, and there aren't any higher-ranking duplicate pages. Any ideas what may have happened, and how I can confirm a penalty, for example? TIA,
Chris
Intermediate & Advanced SEO | Cornwall
-
Blocking Pages via Robots.txt: Can Images on Those Pages Be Included in Image Search?
Hi! I have pages within my forum where visitors can upload photos. When they upload photos, they provide a simple statement about the photo but no real information about the image, definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us. The URL structure is like this: domain.com/community/photos/~username~/picture111111.aspx

I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like this:

User-agent: Googlebot
Disallow: /community/photos/

Can I disallow Googlebot specifically, rather than just using User-agent: *, so that Googlebot-Image can still pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages while still getting the images picked up... is this possible? Thanks! Leona
Intermediate & Advanced SEO | | HD_Leona0 -
Should I index tag pages?
Should I exclude the tag pages? Or should I go ahead and keep them indexed? Is there a general opinion on this topic?
Intermediate & Advanced SEO | | NikkiGaul0