Avoiding the "sorry we have no imagery here" G-maps error
-
Hi there, we recently did a redesign on a big site and added Google Maps embeds to almost every page. Since we're in real estate, our listings, detail pages, and search results all have a map embedded.
While looking at GWT I found that the top keywords on our site (which is in Spanish) are the following:
- have
- here
- imagery
- sorry
After a quick search I found out this is a Google Maps bug: when Googlebot accesses the pages, the embed throws an error with this text repeated several times. If you do a search for "sorry we have no imagery here" you will see lots of sites with this issue.
My question is: is this affecting our overall SEO, since bots are actually crawling and indexing this text (hence it being reported by GWT)? Should I cloak this from robots? Has anyone noticed this or been able to fix it?
Thanks in advance!
-
Hmmm.. seems to be a very common issue.
How about creating a script that fires the map load on a div that shows a static map image instead of the iframe by default? Then, with a simple function, swap that image for the map iframe. That should do it for the "sorry we have no imagery here" problem.
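Roughly something like this (just a sketch, not tested against your setup; the embed URL, static image path, and element id below are placeholders, not your actual values):

```typescript
// Sketch: show a static map image by default and only inject the Google Maps
// iframe after the visitor clicks, so crawlers never see the embed's
// "sorry we have no imagery here" fallback text.
// EMBED_SRC and STATIC_SRC are placeholders for your own embed URL and image.
const EMBED_SRC = "https://www.google.com/maps/embed?pb=YOUR_EMBED_CODE";
const STATIC_SRC = "/images/map-placeholder.png";

function initLazyMap(container: HTMLElement): void {
  const img = document.createElement("img");
  img.src = STATIC_SRC;
  img.alt = "Map of the property location";
  img.style.cursor = "pointer";
  container.appendChild(img);

  // Swap the static image for the live map only on user interaction.
  img.addEventListener("click", () => {
    const iframe = document.createElement("iframe");
    iframe.src = EMBED_SRC;
    iframe.width = "600";
    iframe.height = "400";
    iframe.style.border = "0";
    container.replaceChild(iframe, img);
  });
}

// Usage: a plain <div id="map"></div> in the listing template.
const mapDiv = document.getElementById("map");
if (mapDiv) {
  initLazyMap(mapDiv);
}
```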
If it doesn't, you could try some kind of internal caching: fetch the static map image, save it on your server, and serve that as the "1st" image; you can then load the iframe on demand.
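Something along these lines on the server side, assuming a Node backend with a global fetch and the Google Static Maps API (the URL parameters, file paths, and key handling are assumptions to adapt to your own setup):

```typescript
// Sketch: cache the static map image on your own server so listing pages can
// serve a local image first and load the live iframe later.
import { writeFile } from "node:fs/promises";
import { existsSync } from "node:fs";

async function cacheStaticMap(lat: number, lng: number, listingId: string): Promise<string> {
  const path = `./public/maps/${listingId}.png`;
  if (existsSync(path)) return path; // already cached, reuse it

  // Static Maps request; parameters and key are placeholders.
  const url =
    "https://maps.googleapis.com/maps/api/staticmap" +
    `?center=${lat},${lng}&zoom=15&size=600x400&key=${process.env.MAPS_KEY}`;

  const res = await fetch(url);
  if (!res.ok) throw new Error(`Static map request failed: ${res.status}`);

  await writeFile(path, Buffer.from(await res.arrayBuffer()));
  return path; // serve this file as the "1st" image on the page
}
```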
Hope that makes sense
Related Questions
-
Content optimized for old keywords and G Updates
Hi, We've got some old content, about 50 pages worth on an Ecommerce site, that is optimized for keywords that aren't the subject of the page - these keywords occur about 8 times (2 keywords per page) in the old content. We are going through these 50 pages and changing the title, H1, and meta description tag to match the exact subject of the page, so that we will increase in rankings again - the updates have been lowering our rankings. Do we need to completely rewrite the content for these 50 pages, or can we just sprinkle in any needed additions of the one keyword that is the subject of the page? The reason I'm asking is that our rankings keep dropping and these 50 pages seem to be part of the problem. We're in the process of updating these 50 pages. Thanks.
White Hat / Black Hat SEO | BobGW
-
Can I Point Multiple Exact Match Domains to a Primary Domain? (Avoiding Duplicate Content)
For example, let's say I have these 3 domains: product1.com, product2.com, and product.com. The first 2 domains will have very similar text content, with different products. The product.com domain will have similar content, with all of the products in one place. Transactions would be handled through the primary domain (product.com). The purpose of this would be to capitalize on the exact match domain opportunities. I found this seemingly old article: http://www.thesitewizard.com/domain/point-multiple-domains-one-website.shtml The article states that you can avoid duplicate content issues, and have all links attributed to the Primary domain. What do you guys think about this? Is it possible? Is there a better way of approaching this while still taking advantage of the EMD?
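One common way to point multiple domains at a single site without serving duplicate content is a host-based 301 redirect to the primary domain. A minimal sketch, assuming a Node server and using the question's hypothetical domain names:

```typescript
// Sketch: permanently redirect the extra exact-match domains to the primary
// domain so link equity and indexing consolidate on product.com.
import http from "node:http";

const PRIMARY = "product.com"; // hypothetical primary domain from the question

const server = http.createServer((req, res) => {
  const host = (req.headers.host ?? "").replace(/:\d+$/, "");
  if (host !== PRIMARY && host !== `www.${PRIMARY}`) {
    // 301 keeps the requested path, e.g. product1.com/widget -> product.com/widget
    res.writeHead(301, { Location: `https://${PRIMARY}${req.url ?? "/"}` });
    res.end();
    return;
  }
  // ...serve the real site here...
  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end("primary site");
});

server.listen(8080);
```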
White Hat / Black Hat SEO | ClearVisionDesign
-
Is there any value in "starting from scratch" on a new domain?
Hi, our ecommerce store has had some duplicate content issues and they have been corrected, but of course, Google takes time to pick up on these. Our link profile is very poor, so we won't lose a lot by going to a new domain in that sense. My question is, in what instances is it worthwhile starting under a new domain? And in which is it not? Presumably you can also 301 the whole site; when is it worth doing this or not? Thanks, Ben
White Hat / Black Hat SEO | bjs2010
-
Competitor using "unnatural inbound links" not penalized??!
Since Google's latest updates, I think it would be safe to say that building links is harder. But I also read that Google applies their latest guidelines retroactively. In other words, if you have built your linking profile on a lot of unnatural links, with spammy anchor text, you will get noticed and penalized. In the past, I used to use SEO-friendly directories and "suggest URL" forms to build backlinks, with keyword/phrase anchor text. But I thought that this technique was frowned upon by Google these days. So, what is safe to do? Why is Google not penalizing the competitor? And, bottom line, what is considered to be "unnatural link building"?
White Hat / Black Hat SEO | bjs2010
-
Best way to handle an SEO error: linking from one site to another on the same IP
We committed an SEO sin and created a site with links back to our primary website. Although it does not matter, the site was not created for that purpose; it is actually a "directory" with categorized links to thousands of culinary sites, and ours are some of the links. This occurred back in May 2010. Starting April 2011 we started seeing a large drop in page views. It dropped again in October 2011. At this point our traffic is down over 40%. Although we don't know for sure if this has anything to do with it, we know it is best to remove the links. The question is, given it's a bad practice, what is the best fix? Should we redirect the 2nd domain to the main one or just take it down? The 2nd domain does not have much PageRank, and I don't think it has many, if any, backlinks. Will it hurt us more to lose the 1,600 or so backlinks? I would think keeping the links is a bad idea. Thanks for your advice!
White Hat / Black Hat SEO | foodsleuth
-
Any hope after a G slap?
I was hit very hard in the last Google update. Yes, I was building links. Before I started doing that, I was earning $1 a day despite having a very good blog. I saw my income rise to $3,000 a month over the last year. Now I'm back to $1 a day. I had suspected it was coming for the last two months because of the things Matt Cutts has been saying. I worked hard to avoid it, but not hard enough. I stopped using blog networks after Build My Rank got hit. I've been using another tool known as ABC3000, or Automatic Backlink Creator. It's not automatic. 😉 That's the tool that got me the rankings and that's the tool that took them away. 🙂 I've removed all links from the tool now. Yeah, I know it's a bad sign to lose 10,000 links all at once, but Google already knows, so the best thing to do is get rid of them. ALL of my sites were hit, not deindexed. I'm seeing minus 30 and minus 100 penalties. One of the websites has 10,000+ pages. I've got one on-page issue that revolves around my forum: vBulletin is causing a massive amount of duplicate page titles. I don't know how to fix that. It looks like each post creates a new page with the title of the thread. I plan to move it to another domain but I'm fearful of doing that. I'm not sure how. I will research it. But the Google slap is massive and I'm really wondering if I have any hope of restoring a site with good content, good social signals, and good engagement. My true bounce rate is less than 30% now; before the slap it was about 23%. I have an average of 4 to 5 minutes per visitor. All my stats look great except one: SERP results as of April 24th, also known as my personal doomsday. haha Trying to keep my head up, but it is starting to get hard now. Thanks for the help, Rusty
White Hat / Black Hat SEO | RustyF
-
Any recent discoveries or observations on the "Official Line" of incoming link penalization?
I know this is always a contentious issue and that the official, or shall we say semi-official, line is that you can't be penalized for incoming links, as you can't control who links to you (aside of course from link buying, and other stuff that Google feels it can work out). I was wondering if anyone had any recent discoveries or observations on this? Obviously there's the problem that is usually brought up where you could damage a competitor by building spammy links to them, etc... hence the half-denial of it being an issue... but has anyone seen or heard anything on it recently, or experienced something relevant?
White Hat / Black Hat SEO | SteveOllington