How local is local SEO?
-
If I manage to get a client ranked for a localised organic search term at the county level (for example, "keyword West Midlands" or "keyword Hertfordshire"),
How high will the website rank for all the cities and districts within that county?
I am going to give this a go, but I was wondering whether anyone else has had any experience with this.
-
Hi Adnan,
True local results revolve around cities, not counties, and inclusion in the local pack of results typically requires having a physical location in the city being searched. Beyond this, you can develop content for additional cities where you're not physically located, and for counties too if research indicates county-level terms are important in your niche. The goal of that work, though, is typically organic visibility rather than visibility in Google's local pack. Hope this helps!
-
I have done this many times before and had some great success.
However, Panda targets low-quality, thin content. If you are going to go down this route, do NOT simply churn out a page for every city and town in the county with only a token amount of content on each.
Take the time to build out each area one by one for your niche and grow the site bit by bit. Low-quality content in even a few areas of your site can cause the whole site to struggle.
Done wrong, this can damage the site by diluting the focus and theme of the main services you provide.
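One way to sanity-check that your area pages aren't just find-and-replace copies of each other is a quick text-similarity test before publishing. A minimal sketch in Python; the page copy below is invented purely for illustration:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Rough 0-1 similarity between two page texts, compared word by word."""
    return SequenceMatcher(None, a.lower().split(), b.lower().split()).ratio()

# Hypothetical body copy for three location pages on the same site.
birmingham = ("We offer boiler repair across Birmingham. Our engineers "
              "cover the whole city and respond within 24 hours.")
coventry = ("We offer boiler repair across Coventry. Our engineers "
            "cover the whole city and respond within 24 hours.")
wolverhampton = ("Wolverhampton landlords: annual gas safety checks, boiler "
                 "servicing and emergency call-outs from our local depot.")

print(similarity(birmingham, coventry))       # very high: a find-and-replace page
print(similarity(birmingham, wolverhampton))  # much lower: genuinely distinct copy
```

Pages scoring near 1.0 against their siblings are exactly the thin, templated content this thread warns about.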
-
I can see what you are trying to do, but in my experience, especially in competitive markets, you might rank in the organic results; I doubt it will be in the first set of local results.
-
Hi Adnan
My experience is that more people search with a town or city qualifier than at the county level. That is because they are really searching for a service, or maybe a shop, close to where they live, and with that intent they won't tend to use a county.
As for how high a website optimised for a county will rank for searches against its cities and districts, that depends on how competitive the search is. If few businesses offer the service in a specific city, Google will likely cast its net wider and bring in results for other locations in the same area, but by distance, not by county boundary.
I suggest you do some keyword research on searches with city/town qualifiers, compare them to the county terms before going any further, and base your decision on what to target on that.
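Peter's comparison can be made concrete with a few lines of Python that total search volume by qualifier type. The keywords and volumes here are made-up placeholders; substitute real rows exported from your keyword research tool:

```python
from collections import defaultdict

# Hypothetical (keyword, monthly search volume, qualifier type) rows,
# as you might export them from a keyword research tool.
rows = [
    ("event catering birmingham", 880, "city"),
    ("event catering coventry", 320, "city"),
    ("event catering solihull", 170, "city"),
    ("event catering west midlands", 210, "county"),
]

# Sum the volume for each qualifier type.
totals = defaultdict(int)
for keyword, volume, qualifier in rows:
    totals[qualifier] += volume

# Print qualifier types from highest to lowest total volume.
for qualifier, volume in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{qualifier:>6}: {volume}")
```

If the city rows dominate the county rows this decisively in your real data, that answers the targeting question on its own.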
I hope that helps,
Peter
-
You can be sneaky: browse through a proxy located in those cities and take a look at the results yourself.
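If you'd rather not hunt down proxies, many SEO tools instead simulate the searcher's location with Google's `uule` URL parameter. Note this is an undocumented, community-reverse-engineered encoding, not an official API, and it could change or stop working at any time; the sketch below follows the commonly described "w+" variant, where a key character is chosen by the length of the canonical location name:

```python
from urllib.parse import urlencode

# Key-character lookup used by the community-documented "w+" uule variant:
# the character is picked by the length of the canonical location name.
_KEY = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-_"

def uule(canonical_name: str) -> str:
    """Build a uule value from a canonical location name, e.g.
    'Birmingham,England,United Kingdom' (names as listed in Google's
    geotargets data)."""
    return "w+CAIQICI" + _KEY[len(canonical_name)] + canonical_name

def serp_url(query: str, location: str) -> str:
    """Assemble a location-simulated Google search URL."""
    params = urlencode({"q": query, "gl": "uk", "hl": "en",
                        "uule": uule(location)})
    return "https://www.google.com/search?" + params

print(serp_url("event catering", "Birmingham,England,United Kingdom"))
```

Opening that URL (ideally logged out and in a clean browser profile) approximates what a searcher in that city would see, without needing a proxy there.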