Has anyone managed to change 'At a glance:' in local search results?
-
On Google's local search results, i.e. when the 'Google Places' data is displayed along with the map on the right-hand side of the search results, there is also an 'At a glance:' element.
The data being displayed is from some years ago, and the client would, if possible, like it to reflect their current services, which they have been providing for some five years. According to Google support here - http://support.google.com/maps/bin/answer.py?hl=en&answer=1344353 - this cannot be changed. They say:
'Can I edit a listing's descriptive terms or suggest a new one?
No; the terms are not reviewed, curated, or edited. They come from an algorithm, and we do not help that algorithm figure it out.'
My question is: has anyone successfully influenced this data, and if so, how?
-
You are very welcome. My pleasure to help!
-
Hi Miriam
Thank you for your comprehensive answer. I will look at the resources that you have provided and perhaps conduct some tests of my own.
Once again, thank you.
-
Hi CodingStuff,
Good question, and I empathize with your frustration on this. The At A Glance feature often surfaces the most absurd snippets. You can report the issue through this form:
http://support.google.com/places/bin/static.py?hl=en&ts=1386120&page=ts.cs
...but Google will apparently only consider changing it if it contains totally wrong information (like jewelry appearing on a restaurant listing), and even then, it's unlikely you'll see action on their part. So, this leaves us with trying to understand the cause/source of these snippet sentiments. Mike Blumenthal wrote a great post on this in 2011, when these first appeared:
http://blumenthals.com/blog/2011/08/30/google-places-descriptor-snippets/
In that post, he points to a Bill Slawski piece on a patent that appears to relate to this sentiment display:
http://www.seobythesea.com/2011/08/google-boost-search-rankings-category/
And here's a good discussion on this topic in Google's forum:
http://productforums.google.com/forum/#!topic/business/wDRAXAl1gyA
The snippets appear to stem mainly from reviews, but also from links and website content. Usually, it's pretty easy to trace language in the snippets to things that have been said in reviews, at least in my experience. So, here we come to another hands-are-a-bit-tied situation, because you aren't able to control what people say in their reviews. Even one bad review can end up tacking an awful sentiment onto your profile in the At A Glance section.
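Since the snippet language usually traces back to reviews, a crude way to check which reviews a given At A Glance phrase most likely came from is to fuzzy-match the phrase against every same-length word window in each review. This is just an illustrative sketch (the `likely_sources` helper and the 0.6 threshold are my own assumptions, not anything Google exposes):

```python
import difflib

def likely_sources(snippet_phrases, reviews, threshold=0.6):
    """For each At A Glance phrase, return the reviews containing a word
    window that fuzzily matches the phrase (simple heuristic)."""
    matches = {}
    for phrase in snippet_phrases:
        n = len(phrase.split())
        hits = []
        for review in reviews:
            words = review.lower().split()
            best = 0.0
            # slide a window of the same word length across the review
            for i in range(max(1, len(words) - n + 1)):
                window = " ".join(words[i:i + n])
                ratio = difflib.SequenceMatcher(None, phrase.lower(), window).ratio()
                best = max(best, ratio)
            if best >= threshold:
                hits.append(review)
        matches[phrase] = hits
    return matches

reviews = [
    "Great food but the service was slow and the room was noisy.",
    "Lovely quiet patio, friendly staff, would come again.",
]
print(likely_sources(["slow service"], reviews))
```

Running this against your own review corpus at least tells you which reviews to counterbalance with fresh positive ones.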
I have never seen research on how often Google internally refreshes At A Glance sentiments, but it stands to reason that acquiring a gentle but ongoing stream of positive reviews for the business would be the strongest action you can take in hopes of seeing a change in the sentiments. Without seeing your actual profile, I have to speak broadly, but that would be my basic advice. You can make an effort to report the issue to Google, and there is a slim chance you might get somewhere with that, but in most cases you'll simply have to work at earning reviews in hopes of an eventual change to the snippets.
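Since there's no published data on refresh frequency, one option is to log the snippets you observe over time and note when they change. A minimal sketch, assuming you record the snippets by hand (Google offers no API for them) and the file name is arbitrary:

```python
import datetime
import json
import pathlib

LOG = pathlib.Path("at_a_glance_log.jsonl")

def record_snapshot(snippets, log_path=LOG, today=None):
    """Append today's observed snippet list to a JSON-lines log and
    return True if it changed since the previous snapshot."""
    today = today or datetime.date.today().isoformat()
    previous = None
    if log_path.exists():
        lines = log_path.read_text().splitlines()
        if lines:
            previous = json.loads(lines[-1])["snippets"]
    with log_path.open("a") as f:
        f.write(json.dumps({"date": today, "snippets": snippets}) + "\n")
    return previous is not None and previous != snippets

# Example: note down whatever At A Glance currently shows for the listing
changed = record_snapshot(["slow service", "noisy"])
```

A few months of snapshots would give you at least anecdotal evidence of how often the snippets turn over for your listing.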
For good recent reading about the topic of getting Google-based reviews, here's another piece by Mike Blumenthal that you may find helpful:
http://blumenthals.com/blog/2012/09/24/asking-for-reviews-post-apocalypse/
I hope these resources help you feel up-to-speed on this sometimes frustrating topic. Good luck!