Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Targeting local areas without creating landing pages for each town
-
I have a large ecommerce website structured very much for SEO as it existed a few years ago, with a landing page for every product/town combination nationwide (it's a lot of pages).
Then along came Panda...
I began shrinking the site in Feb last year in an effort to tackle duplicate content. We had initially used a template only changing product/town name.
My first change was to cut the number of pages in half by merging the top two categories, as they are semantically similar enough not to need their own pages. This worked a treat: traffic didn't drop at all, and the remaining pages are bringing in the desired search terms for both products.
Next I have rewritten the content for every product to ensure they are now as individual as possible.
However, with 46 products, each generating a product/area page, we still have a heap of duplicate content. Now I want to reduce the town pages. I have already started writing content for my most important areas, again to make these pages as individual as possible.
The problem I have is that nobody can write enough unique content to target every town in the UK via an individual page (multiplied by 46 products), so I want to reduce these too.
QUESTION: If I have a single page for "Croydon", will mentioning other local surrounding areas on this page, such as Mitcham, be enough to rank this page for both towns?
I have approx 25 Google local place/map listings and growing, and am working outwards from these areas. I want to bring the site right down to about 150 main area pages to tackle all the duplicate content, but obviously don't want to lose my traffic for so many areas at once.
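One way to shrink to ~150 main area pages without simply deleting the old town pages is to 301-redirect each retired town URL to its nearest surviving area page, so existing links and long-tail traffic have somewhere to land. A minimal sketch of generating those rules; the URL patterns and town-to-area mapping below are hypothetical, not taken from the actual site:

```python
# Sketch: generate 301 redirect rules when consolidating town pages
# into a smaller set of main area pages. Town names, URL patterns,
# and the town->area mapping are illustrative examples only.

# Each retired town is mapped to the surviving "main area" page
# that should absorb its traffic.
TOWN_TO_AREA = {
    "mitcham": "croydon",
    "thornton-heath": "croydon",
    "purley": "croydon",
}

def redirect_rule(product, town):
    """Return an Apache-style 301 rule for an old product/town URL,
    or None if the town keeps its own page."""
    area = TOWN_TO_AREA.get(town)
    if area is None:
        return None  # town is one of the kept main-area pages
    return f"Redirect 301 /{product}/{town} /{product}/{area}"

print(redirect_rule("carpet-cleaning", "mitcham"))
# -> Redirect 301 /carpet-cleaning/mitcham /carpet-cleaning/croydon
```

Looping this over all 46 products and every retired town would produce a complete redirect map that can be dropped into the server config.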
Any examples of big sites that have reduced in size since Panda would be great.
I have a headache... Thanks community.
-
My pleasure, Silkstream. I can understand how what you are doing feels risky, but in fact, you are likely preventing fallout from worse risks in the future. SEO is a process, always evolving, and helping your client change with the times is a good thing to do! Good luck with the work.
-
Thank you Miriam. I appreciate you sharing with me the broad idea of the type of structure that you feel a site should have in this instance (if starting from scratch).
You have pretty much echoed my proposal for a new site structure, built for how Google works nowadays rather than 2-3 years ago. We are currently reducing the size of the current site to bring it as close to this model as possible. However, the site would need a complete redesign to make this structure viable.
I guess what I've been looking for is some kind of reassurance that we are moving in the right direction! It's a scary prospect, reducing such a huge number of pages down to a compact, targeted set. With the prospect of losing so much long-tail traffic, it can make us a little hesitant.
However, the on-site changes we have made so far seem to be having a positive effect. And thank you for giving me some ideas about content creation for each town. I really like this as an idea to move forward with after the changes are complete, which will hopefully be by the new year!
-
Hi Silkstream,
Thank you so much for clarifying this! I understand now.
If I were starting with a client like this, from scratch, this would be the approach I would take:
-
View content development as two types of pages. One set would be the landing pages for each physical location, optimized for each city, with unique content. The other set would be service pages, optimized for the services, but not for a particular city.
-
Create a Google+ Local page for each of the physical locations, linked to its respective landing page on the website. So, let's say you now have 25 city pages and 46 service pages. That's a fairly tall order, but certainly do-able.
-
Build structured citations for each location on third-party local business directories. Given the number of locations, this would be an enormous job.
-
Build an onsite blog and designate company bloggers, ideally one in each physical office. The job of these bloggers would be to each create one blog post per month about a project accomplished in their city. In this way, the company could begin developing content under its own steam that showcases a given service in a given city. Over time, this body of content would grow the pool of queries for which they have answers.
-
Create a social outreach strategy, likely designating brand representatives within the company who could be active on various platforms.
-
Likely need to develop a link earning strategy tied in with steps 4 and 5.
-
Consider video marketing. A good video or two for each physical location could work wonders.
I'm painting in broad strokes here, but this is likely what the overall strategy would look like. You've come into the scenario midway and don't have the luxury of starting from scratch. You are absolutely right to be cleaning up duplicate content and taking other measures to reduce the spaminess and improve the usefulness of the site. Once you've got your cleanup complete, I think the steps I've outlined would be the direction to go in. Hope this helps.
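For the city landing pages in step 1, it can also help to mark each page up with schema.org LocalBusiness structured data, so search engines can tie the page to its physical office. A minimal sketch; the business name, address, and URL below are hypothetical placeholders, not details of the site under discussion:

```python
import json

def local_business_jsonld(name, street, locality, postcode, url):
    """Build a minimal schema.org LocalBusiness JSON-LD blob for a
    location landing page. All field values are caller-supplied."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": name,
        "address": {
            "@type": "PostalAddress",
            "streetAddress": street,
            "addressLocality": locality,
            "postalCode": postcode,
            "addressCountry": "GB",
        },
        "url": url,
    }, indent=2)

# Hypothetical Croydon office page:
print(local_business_jsonld(
    "Example Services Ltd - Croydon",
    "1 High Street", "Croydon", "CR0 1AA",
    "https://www.example.co.uk/croydon",
))
```

The resulting JSON would be embedded in a `<script type="application/ld+json">` tag on the matching city page, one per physical location.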
-
-
Hi Miriam,
Thanks for jumping in.
The business model is service-based. So when I refer to "46 products", they are actually 46 different types of service available.
The customer will typically book and pay online, through the website, and they are then served at their location, which is most often either their home or place of work. They actually have far more than the 25 locations; much closer to 120, I believe. However, I only began their SEO in February, AFTER they were hit by Panda. So building up their local listings is taking time, as the duplicate content issue seems far more urgent. I'm trying to strike a balance and fix this all slowly over time, to lay a solid foundation for inbound marketing, as it's being diluted by the poor site structure.
Does this help? Am I doing the right things here?
-
Hi Silkstream,
I think we need to clarify what your business model is. You say you have a physical location in each of your 25 towns. So far, so good, but are you saying that your business has in-person transactions with its customers at each of the 25 locations? The confusion here is arising from the fact that e-commerce companies are typically virtual, meaning that they do not have in-person transactions with their customers. The Google Places Quality Guidelines state:
Only businesses that make in-person contact with customers qualify for a Google Places listing.
Hence my wanting to be sure that your business model is actually eligible, given that you've described it as an e-commerce business, which would be ineligible. If you can clarify your business model, I think it will help you to receive the most helpful answers from the community.
-
You scared me then Chris!
-
Of course, if you've got the physical locations, you're in good shape there.
-
"It sounds like you're saying that your one ecommerce company has 25 Google local business listings--and growing?! It's very possible that could come back to haunt you in the form of merging or penalization."
Why? The business has a physical location in every town, so why should they not have a page for every location? This is what we were advised to do?
"If there was no other competition, you would almost certainly rank for your keywords along with the town name"
I have used this tactic before for another nationwide business, but on a smaller scale, and it worked. I.e. they ranked (middle of page 1), but for non-competitive keywords, and the page had strong backlinks. With this site, the competition is stronger and the pages will not have a strong backlink profile at first.
My biggest worry is cutting all the existing pages and losing the 80% long-tail traffic the site currently pulls in. But what other way is there to tackle so much duplicate content?
-
It sounds like you're saying that your one ecommerce company has 25 Google local business listings--and growing?! It's very possible that could come back to haunt you in the form of merging or penalization. If not that, it's likely to stop being worth the time as a visibility tactic.
As for whether mentioning local surrounding towns in your page copy will be enough to get you to rank for them, it would depend on competition. If there was no other competition, you would almost certainly rank for your keywords along with the town name, but with competition, all the local ranking factors start coming into play, and your ability to rank for each one will depend on a combination of all of them.
Intermediate & Advanced SEO | | digisavvy0