Avoiding the "sorry we have no imagery here" G-maps error
-
Hi there, we recently redesigned a big site and added Google Maps to almost every page, since we're in real estate: listings, detail pages, and search results all have an embedded map.
While looking at GWT (Google Webmaster Tools) I found that the top keywords on our site (which is in Spanish) are the following:
- have
- here
- imagery
- sorry
After a quick search I found out this is a Google Maps bug: when Googlebot accesses the pages, the map throws an error with this text repeated several times. If you search for "sorry we have no imagery here" you will see lots of sites with the same issue.
My question is: is this affecting our overall SEO, since bots are actually crawling and indexing this text (hence it being reported in GWT)? Should I cloak it from robots? Has anyone noticed this or managed to fix it?
Thanks in advance!
-
Hmmm... this seems to be a very common issue.
How about a script that, by default, loads a static map image into the div instead of the iframe? Then a simple function can swap that image for the map iframe when the visitor interacts with it. That should take care of the "sorry we have no imagery here" problem.
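A minimal sketch of that idea, assuming the Google Static Maps endpoint; the coordinates, API key, and embed URL you would pass in are placeholders, not anything specific to this site:

```javascript
// Build a Google Static Maps URL for the "first" (non-interactive) image.
// The key and coordinates are placeholders you would supply yourself.
function staticMapUrl({ lat, lng, zoom = 15, width = 600, height = 400, key }) {
  const params = new URLSearchParams({
    center: `${lat},${lng}`,
    zoom: String(zoom),
    size: `${width}x${height}`,
    key: key,
  });
  return `https://maps.googleapis.com/maps/api/staticmap?${params}`;
}

// Show the static image by default; swap in the live map iframe only on a
// click, so a crawler that never interacts never triggers the Maps error text.
function lazyMap(container, embedUrl, opts) {
  const img = document.createElement('img');
  img.src = staticMapUrl(opts);
  img.alt = 'Map (click to interact)';
  img.style.cursor = 'pointer';
  img.addEventListener('click', () => {
    const iframe = document.createElement('iframe');
    iframe.src = embedUrl; // your normal Maps embed URL
    iframe.width = String(opts.width || 600);
    iframe.height = String(opts.height || 400);
    container.replaceChild(iframe, img);
  });
  container.appendChild(img);
}
```

You could just as well trigger the swap on scroll-into-view instead of click; the point is that the iframe (and its error text) only exists after a real user interaction.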
If it doesn't, you could try some kind of internal caching: fetch the static image, save it on your server, and serve that as the "first" image; you can then load the iframe afterwards.
Hope that makes sense.
Related Questions
-
Why are "outdated" or "frowned upon" tactics still dominating?
Hey, my first post here. I recently picked up a new real estate client in a highly competitive market. One trend I'm noticing with all the top sites: they are using old tactics such as:
-Paid directories
-Terrible/spam directories
-Overuse of exact-match keywords, for example: city name + real estate
-Blogrolls/link exchanges
-Tons of meta keywords
-B.S. press releases
-Blog commenting with a keyword as the name
Out of all the competition, there is only one guy who is following today's rules. One thing I'm noticing is that nobody is doing legit guest blogging, has a great social presence, has awesome on-page SEO, etc. It's pretty frustrating to try to follow the rules while seeing these guys kill it by doing "bad SEO". Anybody else find themselves in this situation? I know I'm probably beating a dead horse, but I needed to vent about this 😉
White Hat / Black Hat SEO | Jay328
What are your views on recent statements regarding "advertorial" content?
Hi, recently there's been a lot said and written about how Google is going to come down hard on 'advertorial' content. Many B2B publishers provide exposure to their clients by creating and publishing content about them, based on information obtained from the clients (for example, in the form of press releases) or compiled by the publisher. From a target-audience/user perspective, this is useful information that the publication is bringing to its audience. Also, let's say the publishers don't link directly to client websites. In such a case, how do you think Google is likely to look at publisher websites in the context of the recent statements about 'advertorial'-type content? Looking forward to the views of the Moz community. Thanks, Manoj
White Hat / Black Hat SEO | ontarget-media
Does linking to the original content avoid duplicate content risk?
A site I work with allows registered users to post blog posts (longer articles). Often, the posts have been published earlier on the writer's own blog. Is posting a link to the original source a sufficient preventative measure against getting dinged for duplicate content? Thanks!
White Hat / Black Hat SEO | 94501
Best way to handle an SEO error: linking from one site to another on the same IP
We committed an SEO sin and created a site with links back to our primary website. Although it doesn't matter now, the site was not created for that purpose; it is actually a "directory" with categorized links to thousands of culinary sites, and ours are among them. This occurred back in May 2010. Starting in April 2011 we saw a large drop in page views, and it dropped again in October 2011; at this point our traffic is down over 40%. Although we don't know for sure whether this has anything to do with it, we know it is best to remove the links. The question is, given that it's a bad practice, what is the best fix? Should we redirect the second domain to the main one, or just take it down? The second domain does not have much PageRank, and I don't think it has many (if any) backlinks of its own. Will it hurt us more to lose the 1,600 or so backlinks? I would think keeping the links is a bad idea. Thanks for your advice!
White Hat / Black Hat SEO | foodsleuth
What to do about "Penguin" penalties?
What are your suggestions for moving forward with sites hit by the "Penguin" penalties?
- Wait it out and see if the penalty goes away
- Try to remove spammy backlinks and resubmit (is this worth the time and effort?)
- Build quality backlinks to offset (will this even work if they have thousands of spammy links?)
- Blog more (I think this is probably a no-brainer)
- Scrap the site and start from scratch (this is a last resort and I don't want to do it if at all possible)
Or any other ideas are greatly appreciated.
White Hat / Black Hat SEO | RonMedlin
Any hope after a G slap?
I was hit very hard in the last Google update. Yes, I was building links. Before I started doing that, I was earning $1 a day despite having a very good blog. I saw my income rise to $3,000 a month over the last year; now I'm back to $1 a day. I had suspected it was coming for the last two months because of the things Matt Cutts has been saying. I worked hard to avoid it, but not hard enough. I stopped using blog networks after Build My Rank got hit. I've been using another tool known as ABC3000, or Automatic Backlink Creator. It's not automatic. 😉 That's the tool that got me the rankings, and that's the tool that took them away. 🙂 I've removed all links from the tool now. Yeah, I know it's a bad sign to lose 10,000+ links all at once, but Google already knows, so the best thing to do is get rid of them. ALL of my sites were hit, though not deindexed; I'm seeing minus-30 and minus-100 penalties. One of the websites has 10,000+ pages. I've got one on-page issue that revolves around my forum: vBulletin is causing a massive amount of duplicate page titles. I don't know how to fix that; it looks like each post creates a new page with the title of the thread. I plan to move it to another domain, but I'm fearful of doing that since I'm not sure how. I will research it. The Google slap is massive, though, and I'm really wondering if I have any hope of restoring a site with good content, good social signals, and good engagement. My TRUE bounce rate is under 30%; it was even lower before the slap, at about 23%. I average 4 to 5 minutes per visitor. All my stats look great except one: SERP results as of April 24th, also known as my personal doomsday. Haha. Trying to keep my head up, but it is starting to get hard now. Thanks for the help, Rusty
White Hat / Black Hat SEO | RustyF
Best Link Building Practices to Avoid Over Optimizing
With all the recent talk about over-optimization, one of the things mentioned is having the same anchor text linking to a page over and over without variation. Is there a good rule of thumb for how many external inbound keyword links should be exact match versus variations? Also, keeping the value of the linking pages in mind, would it be best to use the [exact] phrase for the higher-PR sites or the more relevant, higher-traffic sites, and save the long-tail or keyword-variation text for the lesser-valued sites? When to use the exact phrase and when to use long tail is my question/discussion. I always stay relevant in my link building, and all my links are placed within context, because I know relevancy has been an important factor. After watching this video from Matt Cutts http://youtu.be/KyCYyoGusqs I assume relevancy is becoming an even more important factor.
White Hat / Black Hat SEO | SEODinosaur