Google My Business - Image sizes
-
I have scoured the web trying to find a guide that gives the ideal dimensions for the images that populate a Google My Business page... in vain.
Google itself is very vague about it, as indicated below:
- Format: JPG, PNG, TIFF, BMP
- Size: Between 10 KB and 5 MB
- Minimum resolution: 250px tall, 250px wide
Does anyone know of a guide with optimum recommendations for each photo type (profile, cover photo, business-specific photos...), or alternatively can you recommend the exact sizes needed?
Thanks
-
Thanks, Dimitri
-
Hey Neil,
The problem here is that Google has rolled out both a new local and a new maps interface in the past couple of months, and I'm not sure what this has done to photo requirements. Here are the most recent things I could find for you, but I am not positive the numbers are still accurate. You might need to experiment a bit:
http://blumenthals.com/blog/2015/02/24/google-my-business-upgrades-business-photos/
http://localu.org/blog/your-google-my-business-profile-image-your-most-important-image/
Hope this helps a bit!
-
Hi there.
I think people are forgetting that GMB and G+ are pretty much the same. As far as I understand, Google is actually moving from personal G+ to business G+ and changing the name of the network. So, assuming my thinking is correct, I look at this: https://support.google.com/plus/answer/1057172?hl=en
And it says:
Tip: We recommend that you choose a photo that's 1080 x 608 pixels. The smallest photo you can choose is 480 x 270 pixels, and the largest photo you can choose is 2120 x 1192 pixels.
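If it helps, here is a rough Python sketch (using the Pillow library; the function name and the sample filename are just made up for illustration) that checks an image against those recommended dimensions plus the 10 KB - 5 MB file size range and formats quoted in the question above:

```python
# Rough sketch: check an image against the dimensions quoted above
# (480x270 minimum, 2120x1192 maximum) and the 10 KB - 5 MB file size
# range and formats from Google's own guidance.
# Requires Pillow: pip install Pillow
import os
from PIL import Image

MIN_W, MIN_H = 480, 270
MAX_W, MAX_H = 2120, 1192
MIN_BYTES, MAX_BYTES = 10 * 1024, 5 * 1024 * 1024

def check_gmb_image(path):
    """Return a list of problems found with the image at `path`."""
    problems = []
    size_bytes = os.path.getsize(path)
    if not MIN_BYTES <= size_bytes <= MAX_BYTES:
        problems.append(f"file size {size_bytes} bytes is outside 10 KB - 5 MB")
    with Image.open(path) as img:
        w, h = img.size
        if w < MIN_W or h < MIN_H:
            problems.append(f"{w}x{h} is below the {MIN_W}x{MIN_H} minimum")
        if w > MAX_W or h > MAX_H:
            problems.append(f"{w}x{h} is above the {MAX_W}x{MAX_H} maximum")
        if img.format not in ("JPEG", "PNG", "TIFF", "BMP"):
            problems.append(f"format {img.format} is not JPG/PNG/TIFF/BMP")
    return problems

# Example (hypothetical file): print(check_gmb_image("cover-photo.jpg"))
```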
Hope this works.
-
Related Questions
-
I have followed all the steps Google PageSpeed gives for increasing the speed of my website http://briefwatch.com/ but with no good result
My website http://briefwatch.com/ has a very low score on Google PageSpeed. I followed all the steps given to me, yet my website's speed doesn't increase.
-
Are local business directories worth the effort? E.g. White Pages, Yell.com, Local.com?
Hi guys, I'm new to Moz and very keen to do SEO right without upsetting Mr. Google too much. Are local business directories worth the effort? It's a laborious job, but I'm happy to do it if it's effective and won't be considered spammy by Google. Thanks
-
Google still indexing home page even after a 301 - Ecommerce Website
Hi all,
We have a 301 redirect problem: Google seems to continue indexing our old, 301-redirected home page, even after months. We have a multi-language domain with subfolders:
- www.example.com (the old page, now with a redirect to the right locale in the right country)
- www.example.com/it/home (canonical)
- www.example.com/en/home (canonical)
- www.example.com/es/home (canonical)
- www.example.com/fr/home (canonical)
- www.example.com/de/home (canonical)
We still see the old page (www.example.com) in Google results with its old metadata in English, and only in some countries (e.g. France) do we see the correct result, the "new" homepage www.example.com/fr/home, in first position.
The real problem is that Google is still indexing and showing www.example.com as the "real" and "trusted" URL, even though we have set:
- a 301 redirect
- the right language for every locale in Google Search Console
- a canonical tag pointing to the locale URL
- an hreflang tag inside the code
- a specific sitemap with the hreflang tags specified for the new homepages
Now our redirect process is the following (Italy example):
www.example.com --> 301
www.example.com/en/home (default version) --> 301
www.example.com/it/home --> 200
Every online tool, from Moz to bot simulators, sees that there is a 301, so that part is correct. Google Search Console says that:
- on www.example.com there is a 301 (correct)
- in the internal links section of Search Console, www.example.com is still in first position with 34k links; many of these links are coming from property subdomains. Should we change those links inside those third-level domains from www.example.com to www.example.com/LOCALE/home?
- the www.example.com/LOCALE/home pages are the real home pages, and they return a 200 code
Do you know if there is a way to remove the old home page from Google results, since it is a 301? Do you think that, even after a 301 redirect, Google decides to ignore the 301 if it sees too many internal links pointing to the old URL? Thanks for your help!
Davide
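One way to sanity-check what Google is actually being served: a short Python sketch like the one below (requests only; the example.com URL is the same placeholder used above, and the hreflang check is just a naive string search) prints each hop in the redirect chain and then looks for canonical and hreflang tags in the final HTML.

```python
# Sketch: follow the redirect chain hop by hop (no automatic redirects)
# and check the final HTML for canonical/hreflang tags.
# Requires: pip install requests
import requests

def trace_redirects(url):
    """Print each hop's status code and return the final (non-redirect) response."""
    while True:
        resp = requests.get(url, allow_redirects=False, timeout=10)
        print(resp.status_code, url)
        if resp.status_code in (301, 302, 307, 308):
            # Resolve relative Location headers against the current URL
            url = requests.compat.urljoin(url, resp.headers["Location"])
        else:
            return resp

final = trace_redirects("https://www.example.com/")
html = final.text
print("canonical tag present:", 'rel="canonical"' in html)
print("hreflang tags present:", 'hreflang=' in html)
```
-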
Our website is showing in 11th place on Google Maps
Hello, we are a photo studio in New York City. Our website is http://yourhollywoodportrait.com/ and our Google Plus page is https://plus.google.com/+YourHollywoodPortraitStudioNewYork
When doing a search in Maps for "Boudoir Photography New York City" we don't appear in the first 10 results; there is even a studio from New Jersey appearing before us. We have only 5* reviews and we did a bunch of local citations, and still we are not on the first page of Maps. Would you have any suggestions as to what we are doing wrong or should be doing? Thanks a lot for your help!
Michael
-
What's with Google? All metrics in my favor, yet local competitors win.
In regards to local search for the most relevant keyword, I can't seem to get ahead of the competition. I've been going through a number of analytics reports, and in analyzing our trophy keyword (which is also the most relevant to our service and site), our domain has consistently been better on a number of factors. There is not a Moz report I can find that doesn't present us as the winner. Of course I know Moz Analytics and Google Analytics are different, but I'm certain we have them beat in both. When all metrics seem to be in our favor, why might other competitors continue to have better success? We should be dominating this niche industry. Instead, I see a company using blackhat SEO, another with just a Facebook page only, and several others that just don't manage their site or ever add unique, helpful content. What does it take to get ahead? I'm pretty certain I've been doing everything right, and doing everything better than our local competitors. I think Google just has a very imperfect algorithm, and the answer is "a tremendous amount of patience" until they manage to get things right.
-
Local SEO HELP for Franchise SAB Business
This all began when I was asked to develop experiment parameters for our content protocol and strategy. It should be simple, right? I've reviewed A/B testing tips for days now, from Moz and other sources, and I'm totally amped and ready to begin testing in Google Analytics.
Say we have a restoration service franchise with over 40 franchises we perform SEO for, all over the US. Every franchise has its own local website, for example restorationcompanylosangeles.com. Every franchise purchases territories in which it wants to rank; some service over 100 cities. Most franchises also have PPC campaigns. As part of our strategy we incorporate location reach data from AdWords to focus on their high-reach locations first. We have 'power pages' which include 5 high-reach branch preferences (areas the owners prefer to target) and 5 non-branch-preference high-reach locations.
We are working heavily on our national brand presence and working with PR and local news companies to build relationships for natural backlinks. We are developing a social media strategy for national brand outlets and local outlets. We use major aggregators to distribute our local citations for our branch offices and make sure all NAP is consistent across all citations. We are partners with Google, so we work with them on newly developing branches to create their Google listings (My Business & G+). We use local business schema markup for all pages. Our content protocol encompasses all the needed on-site optimization tactics: meta, titles, schema, placement of keywords, semantic Q&A, internal linking strategies, etc. Our leads are calls and form submissions. We use several call tracking services to monitor calls, the caller's location, etc., and we are testing CallRail to start monitoring the landing pages and keywords that are generating our leads.
Parts that I want to change:
- Some of the local sites have over 100 pages targeted for 'water damage + city', aka what Moz would call "doorway pages." These pages have 600-1000 words all talking about the services we provide. Our writers (4 of them) manipulate them so that they aren't duplicate pages, but they add only about 100 words about the city location; this is the only unique variable. We pump out about 10 new local pages a month per site - so yes, over 300 local pages a month. Traffic to the local sites is very scarce.
- Content protocol / strategy is only tested based on ranking! We have a tool that monitors ranking on all domains. This does not account for mobile, local, or user-preference-based searching like Google Now. My team is deeply attached to basing our metrics solely on ranking. The logic is that if no local city page exists for a targeted location, there is less likelihood of ranking for that location; if you are not seen, you will not get traffic or leads. Ranking for power locations is poor, while less competitive low-reach locations rank OK.
- We are updating the content protocol by tweaking small things (multiple variants at a time). They check ranking every day for about a week to determine whether that experiment was a success or not.
What I need:
- An internal duplicate content analyzer, to prove that writing over 400 pages a month about water damage + city IS duplicate content (a rough sketch of this check follows below).
- Unique content for 'power pages'. I know, based on dozens of chats here in the community and in Moz blogs, that we can only truly create quality content for 5-10 pages, meaning we need to narrow down which locations are most important to us and beef them up.
- Blog content for non-'power' locations.
- A new experiment protocol based on metrics like traffic, impressions, bounce rate, landing page analysis, domain authority, etc.
- A deeper dig into call metrics and their sources.
Now I am at a roadblock, because I cannot develop valid content experiment parameters based on ranking. I know that A/B testing requires testing two pages that are the same except for one variable. We'd either noindex these or canonicalize them; neither is in favor of testing ranking for the same term.
Questions: Are all these local pages duplicate content? Is there such a thing as content experiments based solely on ranking? Any other suggestions for this scenario?
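A rough sketch of the duplicate-content check mentioned under "What I need": this uses Python's difflib, which is a crude similarity measure, and the three URLs are hypothetical paths on the example franchise domain mentioned above. Pages that differ only by ~100 words of city copy should score very high.

```python
# Sketch: rough pairwise similarity between local landing pages.
# difflib's SequenceMatcher is crude, but it is enough to show that
# pages differing only in the city section are near-duplicates.
# Requires: pip install requests beautifulsoup4
from difflib import SequenceMatcher
from itertools import combinations

import requests
from bs4 import BeautifulSoup

# Hypothetical URLs, following the 'water damage + city' pattern described above.
urls = [
    "https://restorationcompanylosangeles.com/water-damage-santa-monica",
    "https://restorationcompanylosangeles.com/water-damage-pasadena",
    "https://restorationcompanylosangeles.com/water-damage-burbank",
]

def page_text(url):
    """Fetch a page and return its visible text, stripped of markup."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    return " ".join(soup.get_text().split())

texts = {url: page_text(url) for url in urls}
for a, b in combinations(urls, 2):
    ratio = SequenceMatcher(None, texts[a], texts[b]).ratio()
    print(f"{ratio:.0%} similar: {a} vs {b}")
```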
-
2 clients. 2 websites. Same City. Both bankruptcy attorneys. How to make sure Google doesn't penalize...
Hi Moz'ers! I am creating 2 new websites for 2 different bankruptcy attorneys in the same city. I plan to use different templates, but from the same template provider, and to host with the same hosting company (unless someone here advises me not to). The content will be custom, but similar, as they both practice bankruptcy law. They have different addresses, as they are different law firms. My concern is that Google will penalize for duplicate content because they both practice the same area of law in the same city, with the same hosting and the same template maker, and neither will rank. What should I do to make sure that doesn't happen? Will it be enough that they have different business names, addresses, and phone numbers? Thanks for any help!!
-
Do more page links work against a Google SEO ranking when there is only 1 URL that other sites will link to?
Say I have a coupon site in a major city, and assume there are 20 main location regions (suburb cities) in that city. Assume that all external links to my site will be to the home page only, www.site.com. Assume also that my website business has no physical location. Which scenario is better?
1. One home page that serves up dynamic results based on the user's cookie location, but mentions all 20 locations in the content. Google indexes 1 page only, and all external links are to it.
2. One home page that redirects to the user's region (one of 20 pages), so there are 20 pages, one for each region, each optimized for that region. Google indexes 20 pages and there are internal links to the other 19 pages, BUT all external links are still only to the main home page.
Thanks.