Location Pages on Website vs. Landing Pages
-
We have been having a terrible time in the local search results for 20+ locations. I have Places set up and all, but we decided to create location pages on our sites for each location - a brief description and content optimized for our main service. The path would be something like .com/location/example.
One option that has come up is to create landing pages / "mini websites" that would probably be something like location-example.url.com.
I believe that the latter option, mini sites for each location, would be a bad idea, as those kinds of tactics have been considered spammy in the past.
What are your thoughts, and what resources can I use to convince my team of the best practice?
-
Hi KJ,
Agree with the consensus here that building mini sites is not the right approach. Take whatever energy you would have put into developing these and channel it into making the landing pages for your locations the best in their industry/towns. I was just watching a great little video by Darren Shaw in which this is one of the things he covers. Might be worth sharing with your team:
http://www.whitespark.ca/blog/post/70-website-optimization-basics-for-local-seo
And earlier this year, Phil Rozek penned some pretty fine tips on making your pages strong:
I am curious about one element of your original post. You mention, "We have been having a terrible time in the local search results for 20+ locations." I wasn't sure whether you were saying that you've never done well in them, were doing well in them until something changed (such as the universal rollout of Local Stacks), or something else. If it's the latter scenario, I would guess that a huge number of businesses are now struggling to cope with the fact that there are only 3 spots to rank for any keyword, necessitating greater focus on lower-volume keywords/categories, organic results, and paid results. Everybody but the top 3 businesses is now in this boat. Very tough.
-
Hi KJ,
First things first: do you have a physical address for each location, and are these set up in Google My Business? I doubt you have premises in each location, so ranking for all the areas is going to be an uphill task.
Google is smart and knows whether you have physical premises in the targeted location; after all, it's all about delivering highly relevant results to its users. Let's say, for example, you're an electrician and a user searches for "Electrician in Sheffield" - realistically, if you only have premises in Leeds, it's going to be difficult to rank above the company that is actually located in Sheffield.
I would first target 2-3 of your primary locations and focus on building 10x content. I would aim to write 1,000+ words for each page (completely unique content) whilst focusing on your set keywords, but be natural and don't keyword stuff. Put reviews from customers in that specific area on the landing page and build citations from local directories.
Again, you can't build citations unless you have physical premises in the location. Trust me, I've done this for years for a roofing company, and it's taken some time to see the results. He's #1 for the city he is located in, but for other cities it's a very difficult task. Writing about the same service for each location is a daunting task too; you could consider Great Content for outsourcing the content if you're stuck for ideas. It's a low-budget solution and will save you mountains of time.
I would also use folders and not subdomains. Build a 'service areas' page; examples of URLs for the roofing company are below.
-
Hello KJ,
You absolutely don't want to begin creating subdomains for different locations. That will split your link flow across multiple domains (rather than consolidating it within a single domain).
It sounds like you are attempting a silo structure for your website (multiple locations targeting the same keyword), but this can be seen as keyword stuffing if done incorrectly. Using multiple pages to rank for a single keyword is problematic, as it raises both Panda and Penguin red flags. What you want to do is begin ranking for different keywords, or at least ensure that the content for each of these location pages is unique and sufficiently long (500+ words) to avoid arousing suspicion.
Your site structure sounds like it is okay. For example, a silo we put in place for one of our clients followed the following pattern:
domain.com/country/region/city/service
We hit about 15 cities using this tactic, and they have been sitting on the first page for the last year or so. We also built sufficient links to the home page and relevant pages, and ensured that our technical SEO was spotless, so perhaps these are the areas where you might engage your team moving forward.
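To make that pattern concrete, here is a rough sketch of how you might generate the folder-based URLs from a simple location list so the structure stays consistent as you add cities. The domain, location slugs, and service names below are all hypothetical, not our client's:

```python
from urllib.parse import quote

DOMAIN = "https://www.example.com"  # hypothetical domain

# Hypothetical locations as (country, region, city) slugs.
LOCATIONS = [
    ("uk", "yorkshire", "leeds"),
    ("uk", "yorkshire", "sheffield"),
    ("uk", "greater-manchester", "manchester"),
]

# Hypothetical service slugs.
SERVICES = ["roof-repair", "roof-replacement"]

def silo_url(country, region, city, service):
    """Build a folder-based URL following domain.com/country/region/city/service."""
    path = "/".join(quote(part) for part in (country, region, city, service))
    return f"{DOMAIN}/{path}/"

for country, region, city in LOCATIONS:
    for service in SERVICES:
        print(silo_url(country, region, city, service))
# e.g. https://www.example.com/uk/yorkshire/leeds/roof-repair/
```

The point is that every page lives in a folder on the one domain, so any links a location page earns strengthen the whole site rather than an isolated subdomain.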
If you want to know more about our process, feel free to touch base and I will provide what advice I can.
Hope this helps and best of luck moving forward!
Rob
-
Right. You will not beat the other folks with the subdomain approach. You are getting beaten because your competitors are taking the time to make better content in a niche. Find a way to get better content onto those pages and mark them up with schema to make the info more readable to the search engines and possibly earn an enhanced listing in the SERPs.
We went through a site relaunch and the review schema on our location pages got messed up. It did not impact our rankings, but it did impact click-through from the search engines. None of the stars were showing up in the SERPs due to the schema goof-up. Once we got the schema fixed, traffic was back up.
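If it helps to picture what that markup looks like, below is a minimal sketch of LocalBusiness JSON-LD with an aggregateRating block for one location page. The business name, address, URL, and rating numbers are made up; when this kind of structured data is present and valid, review stars become eligible to show in the SERPs:

```python
import json

# Hypothetical business details -- swap in the real NAP data for each location page.
location_schema = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Roofing - Sheffield",
    "url": "https://www.example.com/locations/sheffield/",
    "telephone": "+44 114 000 0000",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Sheffield",
        "addressRegion": "South Yorkshire",
        "postalCode": "S1 1AA",
        "addressCountry": "GB",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.8",
        "reviewCount": "27",
    },
}

# Print the <script> tag to drop into the location page's HTML.
print('<script type="application/ld+json">')
print(json.dumps(location_schema, indent=2))
print("</script>")
```

Running each template through a structured-data testing tool after any relaunch is the quickest way to catch the kind of breakage I described before it dents your click-through.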
This link will point you toward the relevant Moz resources:
https://moz.com/community/q/moz-s-official-stance-on-subdomain-vs-subfolder-does-it-need-updating
If you are happy with my response, please feel free to mark it as a "Good Answer". Thanks!
-
I agree with you. Some of our marketing people believe that the reason we cannot beat out smaller companies is that we are too diverse in our services. We do great with niche keywords and markets, but we are being beaten by companies who focus on only one of our key services. That is why they thought subdomains would do better. I remember Rand posting something on subdomains vs. subfolders, but I cannot find the original source.
Thanks for your answer...
-
This is similar to the question of whether a blog should be on a subdomain (blog.website.com) or in a folder (website.com/blog).
Most people agree that the folder is the better option: with every blog post that earns links, you are building your domain authority, and generally speaking, a rising tide lifts all ships.
You would run into the same issue with your option to set up subdomains for each location, and you would also end up having to deal with separate webmaster accounts for each, etc. I don't think the subdomain is the solution. I run a site with thousands of locations using a folder structure, and the business pages rank well for a given location if you search on the name of that location, so I know the approach works and can be managed at scale.
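One practical note on managing that many locations under one domain: I'd generate the XML sitemap for the location folders programmatically rather than by hand. A minimal sketch, with a hypothetical domain and paths standing in for a real locations database:

```python
from xml.sax.saxutils import escape

DOMAIN = "https://www.example.com"  # hypothetical domain

# In practice this list would come straight from your locations database.
location_paths = ["/locations/leeds/", "/locations/sheffield/", "/locations/manchester/"]

def build_sitemap(paths):
    """Return a simple XML sitemap covering the folder-based location pages."""
    entries = "\n".join(
        f"  <url><loc>{escape(DOMAIN + p)}</loc></url>" for p in paths
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

print(build_sitemap(location_paths))
```

Everything then reports into the one webmaster property, whereas subdomains would each need their own sitemap and account - exactly the extra overhead mentioned above.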
I would get back to looking at any technical issues you have and at your on-page options for those pages. Is there anything more you can do to make these pages 10x better than any other page on the net for those locations?
Good luck!
Related Questions
-
Do 403 Forbidden errors from website pages hurt rankings?
-
Why does Google rank a product page rather than a category page?
-
On one of our sites we have our company name in the H1, on the other the page title - what is the best information to have in the H1, H2, and page title?
-
Is there value in linking to PPC landing pages and using rel="canonical"?
-
Subdomains vs directories on existing website with good search traffic
-
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs
-
Effect of Removing Footer Links on All Pages Except the Home Page
-
Canonical vs Rel=Next & Rel=Prev for Paginated Pages