Does having an embedded Google Map still count as a positive SEO signal?
-
I know this was true a few years ago, but is there still an advantage to an embedded map over a pop-up map in 2017?
-
Hey BigChad,
Embedding a Google Map wasn't among the top 50 local or organic ranking factors in this year's Local Search Ranking Factors survey (https://moz.com/local-search-ranking-factors). That said, one thing that could help is driving users to Google Maps to get driving directions to your business, as that behavior may influence rankings.
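If you want to encourage that behavior, a simple option is a "Get directions" link that opens Google Maps directly. This is just a sketch using the Google Maps URLs scheme - the destination value is a placeholder, so swap in your own business name and address:

```html
<!-- "Get directions" link via the Google Maps URLs scheme.
     The destination below is a placeholder - replace it with
     your own business name/address, URL-encoded. -->
<a href="https://www.google.com/maps/dir/?api=1&amp;destination=Your+Business+Name%2C+Your+City"
   target="_blank" rel="noopener">Get directions</a>
```

Opening in a new tab keeps visitors on your site while still sending that directions request to Google Maps.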
-
Hi there,
I've heard it used to help back in 2014, but I haven't heard about it for ages. I doubt a little thing like this still moves the needle - everyone would do it if it did! I agree with Don: it can be a valuable thing to have, especially if you have a physical location clients need to visit. It's certainly a better experience than making people copy your postcode and paste it into Google to find the map.
So if I were you, I would add the map so you have it, but I wouldn't expect too much from it SEO-wise.
I hope this helps.
Let us know & Good luck!
Katarina
-
BigChad2,
In my experience, embedding a Google map doesn't make much of a difference in rankings. What it does do, however, is improve the user experience on your site, especially if you have a location people travel to.
Thanks,
Don
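For what it's worth, if you do add one, a responsive embed is straightforward. This is just a sketch - the src is a placeholder, so copy the real iframe code from the map's "Share > Embed a map" option:

```html
<!-- Responsive wrapper for a Google Maps embed. The src below is a
     placeholder - paste the real iframe from "Share > Embed a map". -->
<div style="position:relative; padding-bottom:56.25%; height:0; overflow:hidden;">
  <iframe src="https://www.google.com/maps/embed?pb=PLACEHOLDER"
          style="position:absolute; top:0; left:0; width:100%; height:100%; border:0;"
          loading="lazy" allowfullscreen></iframe>
</div>
```

The `loading="lazy"` attribute keeps the map from slowing down the initial page load, which matters if the embed sits below the fold.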
Related Questions
-
What's your proudest accomplishment in regards to SEO?
After many years in the industry, you come to realize a few things. One of the biggest pain points for us at Web Daytona was giving clients a quick keyword ranking cost estimate. After plenty of trial and error, and relying on API data from one of the most reliable SEO software providers in our industry, we developed an SEO tool that lets us quickly and accurately estimate the cost of ranking for a given keyword (or set of keywords) using multiple variables.
Most agencies can relate to that story. It's something my colleagues and I at Web Daytona have been through before. Finding the cost and amount of time needed to rank for a keyword is a time-consuming process. That's why it's common practice to sell SEO packages of 5-10 keywords for about $1,000-2,000 per month. The problem is that not all keywords are equally valuable, and most clients know this. We constantly get questions from clients asking: "How much to rank for this specific keyword?" It's difficult to answer that question with a pricing model that treats the cost of ranking every keyword equally.
So is the answer to spend a lot more time doing tedious, in-depth keyword research? If we did, we could give our clients more precise estimates. But given that a decent proposal can take 2-5 hours to put together, and agency life isn't exactly full of free time, that wouldn't be ideal.
That's when we asked a question: what if we could automate the research needed to find the cost of ranking keywords? We looked around for a tool that did this, but couldn't find one. So we decided to build it ourselves. It wasn't going to be easy, but after running an SEO agency for over a decade, we knew we had the expertise to create a tool that wouldn't just be fast and reliable - it would also be precise.
Fast forward to today, and we're proud to announce that the Keyword Cost Estimator is finally done. Now we're releasing it to the public so other agencies and businesses can use it too. You can see it for yourself here.
Local Website Optimization | WebDaytona
International SEO - How do I show correct SERP results in the UK and US?
Hi, Moz community. I hope you're all OK and keeping busy during this difficult period. I have a few questions about international SEO, specifically when it comes to ranking pages in the UK and the US simultaneously. We currently have two websites set up, each aimed at its respective country: a ".com" and a ".com/us". If anybody could help with the issues below, I would be very grateful. Thank you all.
Issues
- When looking at US Google search through a VPN, the title tag for our UK page appears in the SERP, e.g. I will see: UK [Product Name] | [Brand]
- When checking the Google cache, the UK page version also appears
- This is a problem especially when I am creating title tags and meta descriptions that are meant to be unique from the UK versions
- However, when clicking through from the SERP link, the US page appears as it should. I find it very bizarre that you land on the US page when you click through, but see the UK version in the SERP itself.
Current Set-Up
- Our UK and US page content is often very similar across our ".com" and ".com/us" websites, and our US pages are canonicalised to their UK page versions to avoid potential penalisation
- We have also added hreflang to our UK and US pages
Query
- How do I show our US SERP, rather than the UK version, in US Google search?
My Theories/Answers
- US page versions have to be completely unique, with content related to US search intent, and be indexed separately - therefore no longer canonicalised to the UK version
- Ensure hreflang is in place to point Google to the correct local page versions
- Ensure local backlinks point to localised pages
If anyone can help, it will be much appreciated. Many thanks all.
Local Website Optimization | Katarina-Borovska
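The first theory in that question - dropping the cross-country canonical and letting hreflang pair the pages - would look roughly like this in the UK page's head. The URLs here are hypothetical; the point is that each version keeps a self-referencing canonical, since canonicalising the US page to the UK version tells Google to index only the UK URL:

```html
<!-- Hypothetical <head> markup for the UK page: self-referencing
     canonical plus hreflang annotations. The US page would mirror
     this with its own self-referencing canonical. -->
<link rel="canonical" href="https://example.com/product" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/product" />
<link rel="alternate" hreflang="en-us" href="https://example.com/us/product" />
<link rel="alternate" hreflang="x-default" href="https://example.com/product" />
```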
I've submitted my site to google search console, and only 6 images of 89 images have been indexed in 2 weeks. Should I be worried?
I've submitted my site to google search console, and only 6 images of 89 images have been indexed in 2 weeks. Should I be worried? My site is http://bayareahomebirth.org Images are a pretty big part of this site's content and SEO value. Thanks for your help!
Local Website Optimization | mattchew
Local SEO: thoughts on driving users to a homepage or to a local landing page?
I work with a client who is about to launch a local landing page for one of their locations. They're worried that the new local landing page will cannibalize some of the keyword rankings for the homepage. Any advice on how to have a local presence but still drive people to the more valuable homepage?
Local Website Optimization | jrridley
Is there an SEO benefit to using tags in WordPress for my blog posts?
We have locations across the US and are trying to develop content so that we rank well for specific keywords on a local level. For instance, "long tail keyword search in state" or "long tail keyword search near 76244", etc. The goal is to develop those content pages via blogs to rank for those keywords. We are using Yoast and will be optimizing each post using that tool. My questions are:
1. Are there any benefits to adding a long list of tags to each post?
2. If yes, do I need to limit the number of tags?
3. Do we need to block indexing of those Yoast tag and category pages to avoid duplicate content issues?
Any insight on the best way to optimize these blog posts with the use of tags or other avenues would be greatly appreciated.
Local Website Optimization | Smart_Start
Killing it in Yahoo/Bing...Sucking it in Google. What gives?
Our website http://www.survive-a-storm.com has historically performed well in Google for the search terms "storm shelters" and "tornado shelters." Our geographic focus is nationwide, but we are particularly interested in ranking up for Oklahoma. Right now we are hovering at about the third position in Yahoo/Bing, and in some geographic areas (i.e., as selected in Google's search settings) we are doing reasonably to quite well for these terms in Google (i.e., first page). In Oklahoma, though, we are holding steady around positions 20-25. We have just changed the title tag on our home page, cleaned up a bit of on-page optimization, and are going to work on getting some more optimized content on the page. We are outperforming the competition on Domain Authority (38) and Page Authority (46), and as far as I can tell, other key metrics are respectable. Our social isn't bad, but could always use improvement--which we are working on. Any idea why we might be lagging so badly in Google? Any help would be appreciated!
Local Website Optimization | Survive-a-Storm
Does Google play fair? Is 'relevant content' and 'usability' enough?
It seems there are two opposing views, and as a newbie this is very confusing. One view is that as long as your site pages have relevant content and are easy for the user, Google will rank you fairly. The other view is that Google has 'rules' you must follow, and even if the site is relevant and user-friendly, if you don't play by the rules your site may never rank well. Which is closer to the truth? No one wants a great website that won't rank because Google wasn't sophisticated enough to see that they weren't being unfair.
Here's an example to illustrate one related concern I have. I've read that Google doesn't like duplicated content, but here are two cases in which it is more 'relevant' and 'usable' to the user to have duplicate content. Say a website helps you find restaurants in a city. Restaurants may be listed by city region and by type of restaurant. The home page may have links to 30 city regions, and also links for 20 types of restaurants, so the user has a choice.
Say the user chooses a region. The resulting page may still be relevant and usable by listing ALL 30 regions, because the user may want to choose a different region. Alternatively, say the user chooses a restaurant type for the whole city. The resulting page may still be relevant and usable by giving the user the ability to choose another type OR another city region.
In other words, there may be a 'mega-menu' at the top of the page which is duplicated on every page of the site but is very helpful. Instead of requiring the user to go back to the home page to click a new region or a new type, the user can do it on any page. That's duplicate content in the form of a mega-menu, but it is very relevant and usable. YET, my sense is that Google MAY penalize the site, even though arguably this is the most relevant and usable approach for someone who may or may not have a specific region or restaurant type in mind. Thoughts?
Local Website Optimization | couponguy
How slow can a website be, but still be ok for visitors and seo?
Hello to all, my site http://www.allspecialtybuildings.com is a barn construction site. Our visitors are usually local. I am worried about page speed. I have been using Google PageSpeed Insights and GTmetrix. Although I can't figure out the "leverage browser caching" recommendation, I have a 79/93 Google score and a 98/87 GTmetrix score. Load times vary between 2.13 and 2.54 seconds. What is acceptable? I want to make sure I get Google love for decent page speed, and to me these times seem great; bad times are like 7 seconds and higher. I have thought about a CDN, yet I have read horror stories too. I have ZERO idea of how to use a CDN, or whether I need one. I just want a fast site that is both user and Google speed friendly. So my question is: what counts as a slow speed for a website? Is under 3 seconds considered OK, or bad for SEO? Any advice is greatly appreciated.
Local Website Optimization | asbchris