Location Pages on Website vs. Landing Pages
-
We have been having a terrible time in the local search results for 20+ locations. I have Google Places set up and all, but we decided to create location pages on our site for each location: a brief description and content optimized for our main service. The path would be something like .com/location/example.
One option that has come up is to create landing pages / "mini websites" that would probably live at location-example.url.com.
I believe the latter option, mini sites for each location, would be a bad idea, as those kinds of tactics have been considered spammy in the past.
What are your thoughts, and what resources can I use to convince my team of the best practice?
-
Hi KJ,
Agree with the consensus here that building mini sites is not the right approach. Take whatever energy you would have put into developing these and channel it into making the landing pages for your locations the best in their industry/towns. I was just watching a great little video by Darren Shaw in which this is one of the things he covers. Might be worth sharing with your team:
http://www.whitespark.ca/blog/post/70-website-optimization-basics-for-local-seo
And earlier this year, Phil Rozek penned some pretty fine tips on making your pages strong:
I am curious about one element of your original post. You mention, "We have been having a terrible time in the local search results for 20+ locations." I wasn't sure whether you were saying that you've never done well in them, that you were doing well in them until something changed (such as the universal rollout of Local Stacks), or something else. If it's the second scenario, I would guess that a huge number of businesses are now struggling to cope with the fact that there are only 3 local spots to rank in for any keyword, necessitating greater focus on lower-volume keywords/categories, organic results, and paid results. Everybody but the top 3 businesses is now in this boat. Very tough.
-
Hi KJ,
First things first: do you have a physical address for each location, and are these set up in Google My Business? I doubt you have premises in every location, so ranking for all of those areas is going to be an uphill task.
Google is smart and knows whether you have physical premises in the targeted location; after all, it's all about delivering highly relevant results to its users. Let's say, for example, you're an electrician and a user searches for "Electrician in Sheffield". Realistically, if you only have premises in Leeds, it's going to be difficult to rank above a company that is actually located in Sheffield.
I would first target 2-3 of your primary locations and focus on building 10x content. I would aim to write 1,000+ words of completely unique content for each page, focusing on your target keywords, but keep it natural and don't keyword stuff. Put reviews from customers in that specific area on the landing page, and build citations from local directories.
Again, you can't build citations unless you have physical premises in the location. Trust me, I've done this for years for a roofing company, and it's taken some time to see results. He's #1 for the city he is located in, but ranking in other cities is a very difficult task. Writing about the same service for each location is a daunting task too; you could consider a service like Great Content to outsource the content if you're stuck for ideas. It's a low-budget solution and will save you mountains of time.
I would also use folders, not subdomains. Build a 'service areas' page; example URLs for the roofing company are below.
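To make the folders-not-subdomains point concrete, here is a minimal sketch of the two URL structures. The domain and city names are hypothetical placeholders, not the actual roofing company's URLs:

```python
# Hypothetical illustration of folder-style location URLs (recommended)
# versus subdomain-style URLs (not recommended). The domain and cities
# below are placeholders.
DOMAIN = "example-roofing.com"
CITIES = ["leeds", "sheffield", "york"]

# Recommended: one domain, with a 'service areas' folder per city,
# so every location page builds authority for the same domain.
folder_urls = [f"https://{DOMAIN}/service-areas/{city}/" for city in CITIES]

# Not recommended: a separate subdomain per city splits link equity
# across what search engines may treat as separate sites.
subdomain_urls = [f"https://{city}.{DOMAIN}/" for city in CITIES]

for url in folder_urls:
    print(url)
```
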
-
Hello KJ,
You absolutely don't want to begin creating subdomains for different locations. That will split your link equity across multiple domains rather than consolidating it within a single domain.
It sounds like you are attempting a silo structure for your website (multiple location pages targeting the same keyword), but this can look like keyword stuffing if done incorrectly. Using multiple pages to rank for a single keyword is problematic, as it raises both Panda and Penguin red flags. What you want to do is target different keywords, or at least ensure that the content on each of these location pages is unique and sufficiently long (500+ words) to avoid arousing suspicion.
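If you want a rough way to sanity-check that your location pages are unique enough and long enough before publishing, a minimal sketch is below. It assumes you have already extracted each page's body text into a string, and the thresholds are illustrative, not Google's actual cutoffs:

```python
from difflib import SequenceMatcher
from itertools import combinations

def audit_location_pages(pages, min_words=500, max_similarity=0.8):
    """Flag location pages that are too thin or too similar to each other.

    `pages` maps a URL path to that page's body text.
    Returns (thin_pages, near_duplicate_pairs). The thresholds are
    illustrative defaults, not search-engine rules. Pairwise comparison
    is O(n^2), so this is only practical for a modest page count.
    """
    thin = [path for path, text in pages.items()
            if len(text.split()) < min_words]
    dupes = []
    for (path_a, text_a), (path_b, text_b) in combinations(pages.items(), 2):
        # autojunk=False so repeated characters aren't discarded,
        # which would understate similarity on long texts.
        ratio = SequenceMatcher(None, text_a, text_b, autojunk=False).ratio()
        if ratio > max_similarity:
            dupes.append((path_a, path_b, round(ratio, 2)))
    return thin, dupes
```
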
Your site structure sounds like it is okay. For example, a silo we put in place for one of our clients followed the following pattern:
domain.com/country/region/city/service
We hit about 15 cities using this tactic, and they have been sitting on the first page for the last year or so. We also built sufficient links to the home page and other relevant pages and ensured that our technical SEO was spotless, so perhaps these are the areas you might engage your team to move forward on.
If you want to know more about our process, feel free to touch base and I will provide what advice I can.
Hope this helps and best of luck moving forward!
Rob
-
Right. You will not beat the other folks with the subdomain approach. You are getting beaten because your competitors are taking the time to make better content in your niche. Find a way to get better content onto those pages and mark them up with schema to make the info more readable to search engines and possibly earn an enhanced listing in the SERPs.
We went through a site relaunch and the review schema on our location pages got messed up. It did not impact our rankings, but it did impact click-through from the search engines: none of the stars were showing up in the SERPs due to the schema goof-up. Once we got the schema fixed, traffic was back up.
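For anyone wanting to see what that kind of markup looks like, here is a minimal sketch of LocalBusiness JSON-LD with an AggregateRating (the part that drives the review stars). All business details below are hypothetical placeholders, and you should validate real markup with Google's structured-data testing tools:

```python
import json

# Hypothetical LocalBusiness JSON-LD for one location page, including an
# AggregateRating block for review stars. Every value here is a placeholder.
location_schema = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Roofing - Sheffield",
    "url": "https://www.example.com/locations/sheffield/",
    "telephone": "+44-114-000-0000",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Sheffield",
        "addressCountry": "GB",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.8",
        "reviewCount": "31",
    },
}

# Embed the output in the page inside:
#   <script type="application/ld+json"> ... </script>
print(json.dumps(location_schema, indent=2))
```
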
This link will point you toward the relevant Moz resources:
https://moz.com/community/q/moz-s-official-stance-on-subdomain-vs-subfolder-does-it-need-updating
If you are happy with my response, please feel free to mark as a "Good Answer" thanks!
-
I agree with you. Some of our marketing people believe the reason we cannot beat out smaller companies is that we are too diverse in our services. We do great with niche keywords and markets, but we are being beaten by companies who focus on only one of our key services. That is why they thought subdomains would do better. I remember Rand posting something on subdomains vs. subfolders, but I cannot find the original source.
Thanks for your answer...
-
This is similar to the question of whether a blog should be on a subdomain (blog.website.com) or in a folder (website.com/blog).
Most people agree that the folder is the better option: with every blog post that earns links, you are building your domain authority, and generally speaking, a rising tide lifts all boats.
You would run into the same issue with your plan to set up subdomains for each location, and you would also end up having to deal with a separate webmaster account for each. I don't think the subdomain is the solution. I run a site with thousands of locations using a folder structure, and the business pages rank well for a given location when you search on the name of that location, so I know it works and can be managed at scale.
I would get back to looking at any technical issues you have and your on-page options for these pages. Is there anything more you can do to make these pages 10x better than any other page on the net for those locations?
Good luck!