Local Listings vs. Spreading Too Thin
-
Hello SEO Community,
I'm trying to find the right balance between adapting to Google's move toward local listings and not spreading my site too thin.
We provide our services nationally and currently have local city listing pages (e.g. http://www.cleanedison.com/courses/city/IL-Chicago), but these do not show up in the SERPs for individual product + city queries (e.g. "Building Analyst Chicago").
So I could make individual pages for each product in each city, but that would dramatically increase the number of URLs on the site and would probably leave me with a lot of duplicate content.
Is there a better way I could take advantage of local listings without creating all the duplicate content and other problems that would arise with individual URLs?
Thanks
-
Here is a great article from Miriam Ellis, one of the moderators on SEOmoz:
http://www.solaswebdesign.net/wordpress/?p=1403
It explains, in depth, many of the strategies Karl has mentioned here.
Good response, Karl.
-
I'd create pages for the top cities you feel are important; however, don't create pages just to target local search terms. If each course in each city is slightly different and you can tailor your content to match each city, then it would be worth doing.
What I'd do is concentrate on building links to the generic versions of the keywords, and you should see the localised versions appearing in the SERPs.
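The "only build pages you can tailor" advice above can be sketched in code. This is a hypothetical illustration, not anything from the thread: the URL pattern, course names, and word-count threshold are all assumptions; the point is simply to gate page creation on having enough unique, city-specific copy.

```python
# Hypothetical sketch: decide which course/city pages are worth creating.
# Only combinations with enough unique, city-specific copy get their own URL;
# the rest fall back to the generic course page, avoiding thin duplicates.

MIN_UNIQUE_WORDS = 150  # assumed threshold, tune to taste

def page_worth_creating(unique_copy: str, min_words: int = MIN_UNIQUE_WORDS) -> bool:
    """Return True if the tailored copy is substantial enough for its own URL."""
    return len(unique_copy.split()) >= min_words

def build_urls(courses, cities, unique_copy):
    """unique_copy maps (course, city) -> tailored text; missing pairs get no page."""
    urls = []
    for course in courses:
        for city in cities:
            copy = unique_copy.get((course, city), "")
            if page_worth_creating(copy):
                urls.append(f"/courses/{city}/{course}")
    return urls

courses = ["building-analyst", "envelope-professional"]
cities = ["IL-Chicago", "NY-NewYork"]
copy = {("building-analyst", "IL-Chicago"): "word " * 200}
print(build_urls(courses, cities, copy))  # only the Chicago page qualifies
```

The same gate can run against a CMS export before any URLs go live, so the site never accumulates pages it would later need to noindex.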
Related Questions
-
HTTP vs HTTPS Related Rankings Drop?
I've noticed in a number of keyword ranking tools (Moz included) that our rankings have dropped substantially for a number of our top-performing keywords, exactly 7 days ago. When you view the attached screenshot you'll see there was a drastic drop in overall organic impressions as well as a drop in keyword rankings. I also noticed that all the keywords which have dropped in rank now show the https version of our home page URL. I've read up on this and believe it should not cause a drop in rankings, but we have even added the https version as a domain in Webmaster Tools with no improvement. Quite simply, has Google de-indexed our http home page URL, which was previously tied to our higher rankings for our core keywords? How can we get this back without "disavowing" the https version of the site? We're not doing anything to game search results, so I don't think we're being penalized; there simply seems to be some sort of technical glitch between recognizing the HTTP and HTTPS versions of our site. Our home page is goo.gl/qVPRwf and an example keyword is "wedding ring sets his and hers". Can anyone recommend further debugging steps, or have an understanding of what can be done at this point? Also, if it helps, I have studied the Help Center, read the FAQs, and searched for similar questions with no success.
Algorithm Updates | | punitshah0 -
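A common cause of this symptom is the http and https versions of a page being treated as two competing addresses. A minimal sketch of the usual fix, normalizing every URL to a single canonical https form that a site-wide 301 would then enforce (the example URLs are hypothetical, not the poster's):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_https(url: str) -> str:
    """Normalize a URL to its canonical https form: force the https scheme,
    lowercase the host, and default an empty path to '/', so only one
    variant of each page is ever linked, redirected to, or submitted."""
    parts = urlsplit(url)
    return urlunsplit(("https", parts.netloc.lower(), parts.path or "/",
                       parts.query, ""))

# Both protocol variants collapse to one canonical address:
print(canonical_https("http://Example.com/rings?sort=price"))
print(canonical_https("https://example.com/rings?sort=price"))
```

In practice the normalization lives in the server config as a 301 redirect (plus a matching rel="canonical" tag), with the https property verified in Webmaster Tools; the sketch just shows the single-canonical-URL idea.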
Local Data Aggregators For Canada
Hi Mozzers, I've seen David Mihm's list of data aggregators for local search in the US (Infogroup, Localeze, Acxiom), but I'm in Canada. Does anyone know if an equivalent list has been compiled for Canada?
Algorithm Updates | | waynekolenchuk1 -
Getting squeezed out of SERP by local results
Hi All, I wanted to get some opinions on a phenomenon that I know others are dealing with... We have a client who is an online-only business (though they do have an office/warehouse location). The on-page optimization is great, and the site has good domain authority for its niche. The issue is that Google is localizing most of their search terms, and our client is getting squeezed down and out of the SERPs by the local listings. How is everyone dealing with this issue? It seems like we'll never get the site to outrank the local listings in a given geo. Thanks, Lee
Algorithm Updates | | vectormedia0 -
Does the Search Algorithm vary considerably locally?
Hey, I am from India, and I just noticed that most of our search results are extremely different from those on google.com. Not just some searches; I mean entire layouts. For instance, for a long time there were no Google Places results in India, and there was hardly any integration with G+ long after it launched, even though a large share of the G+ population was Indian. I got thinking along these lines. Any pointers?
Algorithm Updates | | rahul.bitmesra0 -
Choosing domain name - ccTLD vs Vanity URL
I have to choose between a country-specific domain name that is long and difficult to remember, and a .me domain which is short and contains the exact keywords I'm optimising for. The challenge is that I'm only targeting local search traffic for the service I am advertising. Does a country-specific domain name have any benefit in terms of ranking weight when I'm only interested in traffic from that country?
Algorithm Updates | | flashie0 -
PR Directory Vs Non PR Directory
Hello Guys, I'm thinking of doing some directory submissions for my website, but I'm a little confused. I have a list of PageRank directories and non-PR directories. Can anyone suggest which one I should prefer? Getting approved by a PR directory takes a long time, so should I prefer non-PR directories to get my listings approved faster? Please suggest which one to prefer, and if anyone has a list of fast-approval PR directories, please share. Thanks in advance.
Algorithm Updates | | sumit600 -
How do I separate 2 Google+ business listings?
Ever since Google Places started merging with Google+, my client's business listing has been showing up in local search results incorrectly under another business name that shares the same address. Has anyone else encountered this problem, or found a way to correct it?
Algorithm Updates | | TheeDigital0 -
Hyphens vs Underscores
I am optimizing a site which uses underscores rather than hyphens as word separators (such_as_this.php vs. such-as-this.php). Most of these pages have been around since 2007, and I am hesitant to redirect to new pages because I am worried it will cause the rankings to slip. Would you recommend changing the file names to the hyphenated format and placing 301 redirects on the pages with underscores, or sticking with the existing pages? Is there anything else that would work better? Thanks!
Algorithm Updates | | BluespaceCreative1
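If the site does move to hyphens, the mechanics are a one-to-one 301 map so each old underscore URL passes its equity to exactly one new page. A minimal sketch, with hypothetical paths (only the `such_as_this.php` example comes from the question):

```python
def hyphenate(path: str) -> str:
    """Map an underscore-separated path to its hyphenated equivalent."""
    return path.replace("_", "-")

def redirect_map(old_paths):
    """Return (old, new) 301 pairs, skipping paths that wouldn't change."""
    return [(p, hyphenate(p)) for p in old_paths if "_" in p]

old = ["/such_as_this.php", "/another_old_page.php", "/already-fine.php"]
for src, dst in redirect_map(old):
    # Emit an Apache-style rule for illustration:
    print(f"Redirect 301 {src} {dst}")
```

Generating the map from a full URL list (rather than a regex rewrite) makes it easy to spot-check every pair before deploying, which matters when the old pages have years of accumulated links.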