Localised results are always returned for a query: how do you handle this?
-
I've got an interesting issue relating to geo-location and I'm not sure how to go about solving it. The site: http://www.onlinecoal.co.uk according to the Moz rank tracker is currently ranking 12th for the term "coal merchants" but has been as high as 5th in recent weeks.
However, I've tried the search out in a number of locations (cleared caches, not logged in, different devices, etc.) and it always seems to return results with a bias towards local businesses. The only way I can reproduce the results Moz reports is by using this string: https://www.google.com/?q=coal&pws=0#pws=0&q=coal+merchants&safe=off&start=10
I know from the visits report in Analytics that my experience is typical of what potential visitors are finding: Google always returns localised results for the term "coal merchants".
My question is two pronged:
1. What causes Google to decide that a general search term is best served with localised results?
2. What is my best strategy to deal with this?
-
Thank you everyone, lots of interesting information. We'll take the approach suggested by Miriam of building up organic authority slowly, and we're already on with AdWords (this site is only 5 months old, so it was the obvious choice over the winter). I'm also going to follow Rob's advice and focus on a few local areas based on Analytics to try and get a bit more visibility until we build up some domain authority.
Thanks again,
Rodney
-
Sorry, had to go away a bit.
The Ormskirk results are being returned due to proximity. You are near there, so you get to see Ormskirk. If you went into Google Search settings and set the location to London, you would see more London results. So someone in Liverpool won't be seeing a bunch of Ormskirk listings.
Your question was: how do you handle localised results being returned for a query? You don't. In the eyes of search engines (Google in this case), search results are shaped by the searcher making the queries. If you and I are in the same room and we both search on "home delivered coal", we will likely get somewhat different results (if you search on that type of term more often and click on specific results, that modifies what you see). If we are both using non-personalised search in the same room, 99% of the time we will get the same SERP.
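To compare what you see against a less personalised SERP, you can build the query URL by hand, as Rodney did above. A minimal sketch, assuming the `pws=0` parameter (which historically disabled personalised search) and the `gl` country hint; Google is not obliged to honour either, so treat the result as indicative only:

```python
from urllib.parse import urlencode

def depersonalised_search_url(query, country="GB"):
    """Build a Google search URL that requests non-personalised,
    country-level results. pws=0 historically switched off personalised
    search; gl hints the searcher's country. Google may ignore both."""
    params = {"q": query, "pws": "0", "gl": country}
    return "https://www.google.com/search?" + urlencode(params)

print(depersonalised_search_url("coal merchants"))
# https://www.google.com/search?q=coal+merchants&pws=0&gl=GB
```

Comparing this against a normal logged-in search shows how much of the localisation comes from personalisation versus Google's baseline local intent for the term.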
So, we probably got off track a bit with the way we all answered. If not, then Miriam's direction is as solid as can be. She knows local inside and out. You can either grow slowly or advertise for a faster result.
The only way YOU will impact Local terms is to use them. Now, DO NOT GO OUT AND CREATE A BUNCH OF LOCAL LISTINGS. Not yelling, really emphasizing. You could have a page for London coal and other larger cities, etc. but take care with duplicate content.
To learn where the traffic is coming from, I would do this: go into GA > Audience > Visitors Flow and isolate the traffic by city. In my screenshot you will see what I am speaking of. This will show you which cities the traffic is coming from. (You need to first click on the UK and "View only this segment".) Now you can see where your city traffic is coming from, and that can show you where to improve or change to meet your business goals.
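If you'd rather work outside the GA interface, you can export the geography report and total sessions per city yourself. A minimal sketch; the column names (`Country`, `City`, `Sessions`) are an assumption and should be matched to whatever your own export actually contains:

```python
import csv
from collections import Counter
from io import StringIO

# Hypothetical GA geography export (column names are an assumption).
sample = """Country,City,Sessions
United Kingdom,Ormskirk,42
United Kingdom,London,17
United Kingdom,Liverpool,9
Ireland,Dublin,3
"""

def sessions_by_city(csv_text, country="United Kingdom"):
    """Total sessions per city for one country, busiest cities first."""
    totals = Counter()
    for row in csv.DictReader(StringIO(csv_text)):
        if row["Country"] == country:
            totals[row["City"]] += int(row["Sessions"])
    return totals.most_common()

print(sessions_by_city(sample))
# [('Ormskirk', 42), ('London', 17), ('Liverpool', 9)]
```

The ranked list makes it easy to spot the handful of cities worth targeting with dedicated pages, per Robert's suggestion.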
Let us know if this helps,
Robert
-
Hi Rodney,
What you are experiencing is pretty much what all non-local businesses have been experiencing since Google began displaying local packs for queries even when they don't include geo-terms. There is nothing you can do to influence whether Google feels your search terms have a local intent. If Google has decided that they do, then the local pack is probably here to stay for your core search terms. As you are not a local business, your best hope lies in the following:
-
Building up enough organic authority so that you are ranking alongside the local pack - but not in it
-
Paying for visibility via AdWords
Option one will likely take a great deal of time and effort. Option two can be instantaneous, but will require an outlay of money. Hopefully you can find a feasible strategy that combines both of these efforts and gets you as much visibility as you can achieve without being a truly local business model.
-
-
Hi Robert,
Thank you for your detailed reply. The results are focused on towns and cities rather than broader surrounding areas. I've uploaded an image of an example result here: http://www.onlinecoal.co.uk/images/coal-merchants---Google-Search.jpg with the 7-pack above and the SERPs highlighted in red below, showing lots of local elements being returned, e.g. Ormskirk (which is a very small town, 3 miles from where I currently am). These localised SERPs are my real problem, because Google is returning them in all areas and never appears to show a UK-wide SERP. I know from my AdWords campaigns the power of the phrase "coal merchants" as a traffic driver, but if Google always returns localised SERPs for this term, I've got an issue.
My traffic is coming from a very broad range of locations (12.5k locations are identified in Google Analytics, most with only a few visits).
Regards,
Rodney
-
Rodney,
My first question: when you say local, are you speaking of a city, a region around a city, or a "neighborhood" as we would call it in the States? (I am trying to differentiate between local meaning the UK or Great Britain, with you wanting to go beyond those borders.)
If so, I would suggest a couple of things that could assist you. First, understand that in local (with very rare exception) you will not have a page that is in the 7 pack also show in the organic for that SERP. You need to think about that if Local is important to you. So, you want the Local traffic, but you also want to broaden your market beyond "London" for example. You will need to be sure that you have a page that is "set up to rank for Local" (this is a loose phrase) and one that is more for the organic. For example on a service business you might have a contact page that is resolving in the 7 pack and a services or homepage that resolves in the organic. I have had clients with three organic pages (all different) and a listing in the Local 7 pack (also different from the other three) a couple of times - not by our design; it just happened they had strong pages.
If you look at the landing pages you are seeing traffic to, if you look at visitor flow in analytics, etc. there will be clues as to what is happening. Have you looked at where your traffic is coming from in terms of geography?
LMK if you have other questions,
Best,
Robert
-
In that case your best bet is organic rankings. To rank in local SEO you need a local presence in the area, which would not work in your case unless you open local business locations to represent your company.
-
Hi Vadim,
Thank you for your quick response. How would you go about "local seo" for a site which supplies nationally from one point of distribution?
Thanks,
Rodney
-
Hi Rodney,
-
This is up to Google. It is open about some categories, such as web design, but for most queries the decision is Google's, and it can vary for different people and different locations. This brings me to my main response:
-
Your best bet is to rank for both local results and organic results (general search, as I assume you are referring to). This way you give Google two options to serve your page, whichever it feels fits best.
Hope this helps!
-