KML File vs. KMZ File
-
When should you use a KMZ file? What are the benefits of using a KMZ file as opposed to a standalone KML file?
-
Sorry, I meant to respond to this days ago. I started a reply but didn't finish, and it slipped my mind.
It sounds to me like you want to place a Google map on your site to show the location of the business. If that's correct, I wouldn't use KML or KMZ. I would just use a JavaScript point marker representing the one location via the Google Maps API. You can see what I mean, as I have several points on one site I'm finishing up at http://www.snoshield.com/contest/christmas2012
You can also show a service area like this: http://www.kcdirt.com/contact
Both of those maps use strictly JavaScript; there is no KML or KMZ involved.
-
Thanks for the feedback! Are you saying that the KMZ file is for when you have more than one location for a particular business/domain? If you only have one business location, would a KML be ideal as it relates to local SEO exposure? We work with independent hotels, so locally speaking, what do you feel is the best strategy regarding sitemaps?
-
I'm not sure if you are referencing SEO in this post, but here's the breakdown from my experience (several years in the GIS realm):
Use KMZ when you have a lot of markers on the map, making the file size large, and you know the end user will open it with a Google product.
Use KML when the file needs to be read by a program outside of Google, since the KMZ format has compatibility issues in some other programs.
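For context on why the two formats differ: a KMZ is simply a ZIP archive containing a KML document (conventionally named doc.kml), which is why it shrinks marker-heavy files so well, and also why non-Google programs that expect plain XML sometimes choke on it. Here's a minimal sketch in Python using only the standard library; the placemark name and coordinates are placeholders, not real data:

```python
import zipfile

# A minimal KML document with a single placemark.
# KML coordinates are longitude,latitude,altitude (note the order).
# The name and coordinates below are placeholder values.
kml = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Example Hotel</name>
    <Point>
      <coordinates>-111.891,40.761,0</coordinates>
    </Point>
  </Placemark>
</kml>
"""

# Standalone KML: plain XML, readable by most GIS programs.
with open("places.kml", "w", encoding="utf-8") as f:
    f.write(kml)

# KMZ: the same document zipped up, with the main entry
# conventionally named doc.kml inside the archive.
with zipfile.ZipFile("places.kmz", "w", zipfile.ZIP_DEFLATED) as kmz:
    kmz.writestr("doc.kml", kml)
```

If you ever need to hand a KMZ to a program that only accepts KML, unzipping it and pulling out doc.kml gets you back to the plain-XML form.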