Localization without proper address?
-
Hi Mozzers,
Recently I received a project to promote a hotel website in a third-world country. They have no street names, no landline phones, no zip codes.
So far I have tried to give a good address description on all social networks and on the homepage (footer), and I have signed the hotel up to hotel directories.
Suddenly a new website for another hotel came up on Google and made it to number 1. They put a fake telephone number (a landline) on the website. Is that a good way of localizing a business? Do you have recommendations for how I can improve?
Thanks
-
Hi Miriam,
To get a grey pin, you only need to mark a place somewhere in the world on Google Maps. Example: I have a website for a hotel on a river in Nicaragua. The next village is 20 miles away, but Google still localized the place, and I can attach a mobile phone number to it. Another hotel is located on a deserted beach. I put the marker on the position and describe it in the address fields as well as possible (e.g. street name: Playa del Sol, city: island name, zip code: a random figure).
In my specific case, we are talking about the listings for "Hotel Little Corn Island" and "Hotel Bellavista Corn Island". Thanks for the help...
-
Hi Falk, I'm stumped. I don't understand how you are seeing your hotel in the main results with a true grey/pinned local result if you have no address or phone number. Unfortunately, without being able to actually look at the listing, I can't get any further with this. What you are describing is not something I've ever seen before, and I have to wonder if Google is handling things very differently than one would expect, given the remoteness/other factors about your location. If you can share the listings, I'm happy to look at it. If not, I can't really provide any further insight.
-
First of all, thanks for your quick answers.
The localization works. There is a grey pin next to the hotel, thanks to Google Maps and/or Panoramio. When most people look for a hotel, they search for "city name/area + hotel", and that's how it works on the island here too. None of our island hotels had a telephone number until this one, which reached number 1 on Google within days. Can a telephone number make such a difference (even when the number is not valid on the island)?
My competitor's page has nearly no content! Mine is full of content about the hotel and the destination. He hardly has any backlinks... and so far, neither do I, because my site is new.
Thanks for further advice.
-
Hi Falk,
I agree with much of the advice offered by EEE3. Unfortunately, your client is not eligible for inclusion in Google's local products if they lack a physical street address and a local phone number. The competitor's use of a fake phone number is not advisable... he is misdirecting his own potential guests, and there is a good chance Google will see through it.
So, local inclusion just isn't appropriate for your client, meaning you will have to rely on Organic SEO rather than Local SEO to gain visibility for the hotel. I am presuming that if no one in this region of the world has a street address, Google isn't showing any truly local results for the area (no results with the grey, lettered pins on them). So, make the website as strongly optimized as you can for the town and region where the hotel exists and rely on traditional SEO techniques for gaining high organic visibility for the client. That would be my best advice.
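One thing that can still reinforce the location signal in organic results, even without a street address, is structured data with geographic coordinates on the hotel's own pages. A hedged sketch using schema.org markup (the coordinates, region code, and description are placeholders for illustration, not verified details of the real property):

```json
{
  "@context": "https://schema.org",
  "@type": "Hotel",
  "name": "Hotel Little Corn Island",
  "description": "Beachfront hotel on Little Corn Island, Nicaragua.",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Little Corn Island",
    "addressCountry": "NI"
  },
  "geo": {
    "@type": "GeoCoordinates",
    "latitude": 12.2947,
    "longitude": -82.9786
  }
}
```

Embedded in the page (for example in a script tag of type application/ld+json), this tells search engines exactly where the property sits even when no addressable street exists.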
-
Okay, so this may not be the answer you're looking for, but maybe another tactic would better serve this hotel?
What about a marketing campaign something like "So off the grid even Google can't find it"? There are lots of adventurous people on this planet--and though the true cause behind the lack of street name, landline phone or zip code may be due to poor infrastructure and not because it's in the middle of a jungle reached only by canoe--you have an audience there.
As far as tackling the local issue, Mike Blumenthal and David Mihm might have some resources for you on their websites and blogs.
http://www.davidmihm.com/blog/
Best of luck to you.
P.S. A fake phone number is not a great idea. If you do go that route, please make sure someone familiar with the hotel is able to answer it. At Local U, I heard stories of Google calling phone numbers to check on locations and make sure they were accurate.
Related Questions
-
Google Search Console 'Change of Address': are 301s on the source domain enough?
Hi all. New here, so please be gentle. 🙂 I've developed a new site, where my client also wanted to rebrand from .co.nz to .nz. On the source (.co.nz) domain, I've set up a load of 301 redirects to the relevant new pages on the new domain (the URL structure is changing as well).
Technical SEO | WebGuyNZ
E.g., on the old domain: https://www.mysite.co.nz/myonlinestore/t-shirt.html
In the .htaccess on the old/source domain, I've set up 301s (using RewriteRule).
So that when https://www.mysite.co.nz/myonlinestore/t-shirt.html is accessed, it does a 301 to https://mysite.nz/shop/clothes/t-shirt. All these 301s are working fine; I've checked in dev tools and a 301 is being returned. My question is: are the 301s on the source domain alone enough to start a 'Change of Address' in Google's Search Console? Their wording indicates it's enough, but I'm concerned that maybe I also need redirects on the target domain as well. I.e., does the Search Console Change of Address process work this way?
It looks at the source domain URL (that's already in Google's index), sees the 301, then updates the index (and hopefully passes the link juice) to the new URL. Also, I've set up both the source and target Search Console properties as Domain properties. Does that mean I no longer need to specify whether the source and target properties are HTTP or HTTPS? I couldn't see that option when I created the properties. Thanks!
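For reference, a minimal sketch of the kind of rule described above, in the old .co.nz domain's .htaccess (URLs taken from the question's examples; treat this as an illustration, not the poster's actual file):

```apache
# .htaccess on the old source domain (www.mysite.co.nz)
RewriteEngine On
# 301 the old store URL to its new home on the .nz domain
RewriteRule ^myonlinestore/t-shirt\.html$ https://mysite.nz/shop/clothes/t-shirt [R=301,L]
```

As I understand Google's documentation, the Change of Address tool verifies redirects on the source domain only (it checks that source URLs 301 to the new site), so nothing extra is needed on the target domain.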
Utilising WordPress Attachment Pages Without Getting Duplicate Content Warnings
I have a WordPress site that relies heavily on images and their usefulness. Each post links to larger sizes of the images, with links back to the post and the "gallery" of all images uploaded to the post. Unfortunately, this goes against the "rules", and our attachment pages show as duplicate content in Google (even though the image titles are different). There must be a way to utilise and make the most of attachment pages without getting duplicate content warnings?
Technical SEO | DotP
How to use rel="alternate" properly for mobile directory.
Hey everyone, for the URL http://www.absoluteautomation.ca/dakota-alert-dcpa-p/dkdcpa2500.htm I have the following tags in the header:
<link rel="canonical" href="http://www.absoluteautomation.ca/dakota-alert-dcpa-p/dkdcpa2500.htm" />
<link rel="alternate" media="only screen and (max-width: 640px)" href="http://www.absoluteautomation.ca/mobile/Product.aspx?id=37564" />
Yet Google WMT is reading these as duplicate pages with duplicate titles, meta descriptions, etc. How can I fix this? Thanks!
Technical SEO | absoauto
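The usual pattern for separate mobile URLs is bidirectional: the desktop page carries the rel="alternate" annotation pointing at the mobile URL (as above), and the mobile page carries a rel="canonical" pointing back to the desktop URL. A sketch using the question's URLs (assuming the mobile page currently lacks the canonical, which would explain the duplicate warnings):

```html
<!-- On the desktop page: /dakota-alert-dcpa-p/dkdcpa2500.htm -->
<link rel="canonical" href="http://www.absoluteautomation.ca/dakota-alert-dcpa-p/dkdcpa2500.htm" />
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://www.absoluteautomation.ca/mobile/Product.aspx?id=37564" />

<!-- On the mobile page: /mobile/Product.aspx?id=37564 -->
<link rel="canonical" href="http://www.absoluteautomation.ca/dakota-alert-dcpa-p/dkdcpa2500.htm" />
```

With the canonical on the mobile side, Google consolidates signals to the desktop URL instead of treating the pair as duplicates.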
Product Descriptions for Localized eCommerce Store
Hi, we currently have an eCommerce store that only sells in one country. We are going to open separate stores that will target different countries. As far as product descriptions go, my initial plan was to use the same descriptions but block the product pages from being indexed. However, this has also ended up blocking us from being able to use Google's Product Listing Ads (it rejects the data feed because it can't index the products). Is there a way to copy the descriptions and avoid duplicate content issues without blocking our product pages from being indexed? Thanks for your help!
Technical SEO | pgicom
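One alternative to blocking indexing (not mentioned in the question, but the standard approach for country-targeted duplicates) is hreflang annotations, which let the same descriptions stay indexable on every store while telling Google which version serves which country. A sketch with hypothetical domains:

```html
<!-- In the <head> of each country's product page (same set on all versions) -->
<link rel="alternate" hreflang="en-us" href="https://example.com/product" />
<link rel="alternate" hreflang="en-gb" href="https://example.co.uk/product" />
<link rel="alternate" hreflang="en-au" href="https://example.com.au/product" />
```

Because nothing is blocked from indexing, the product feed for Product Listing Ads remains usable.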
Are robots.txt wildcards still valid? If so, what is the proper syntax for setting this up?
I've got several URLs that I need to disallow in my robots.txt file. For example, I've got several documents that I don't want indexed and filters that are getting flagged as duplicate content. Rather than typing in thousands of URLs, I was hoping that wildcards were still valid.
Technical SEO | mkhGT
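For what it's worth, Google and Bing do still honor the `*` wildcard and the `$` end-anchor in robots.txt (they are extensions, not part of the original robots.txt standard, so smaller crawlers may ignore them). A sketch matching the two cases described above, with hypothetical paths:

```
User-agent: *
# Block all PDF documents, wherever they live on the site
Disallow: /*.pdf$
# Block any URL containing a filter parameter
Disallow: /*?filter=
```

The `$` pins the match to the end of the URL; without it, the pattern matches any URL that merely contains the string.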
What is the best strategy for franchise companies when building local sites?
Hi, if you represent a national franchise: I have noticed that Domino's and others do NOT use local websites for local SEO; rather, they use their own mammoth sites with a store locator for the local stores, plus a few non-keyword-rich pages with very basic information. However, for local SEO, I have been thinking that using e.g. Hyperfranchise.com for the main domain and then e.g. buckhead.hyperfranchise.com or buckheadhyperfranchise.com would be better for local SEO, including Yelp, Foursquare, and more. It will take time to rank all the local sites, but is that not better in the end than having e.g. 6 pages of "local" content on the main site? However, I have not seen any of the big ones do that, though that might be because they are so entrenched in their own old systems, which might be ranking well anyway for their local franchisees. Any comments, ideas, suggestions?
Technical SEO | yvonneq
Best local listings submission service
I'm building several local websites and looking for the fastest, most cost-effective service out there to list them in all the major local directories, like Yelp etc. I've been using UBL, but three months later I don't see my listings in many of the directories they claim to submit to... Thanks!
Technical SEO | atohad
How do I use the Robots.txt "disallow" command properly for folders I don't want indexed?
Today's sitemap webinar made me think about the disallow feature. It seems like the opposite of sitemaps, but it also seems both are somewhat ignored, in varying ways, by the engines. I don't need help semantically; I got that part. I just can't seem to find a contemporary answer about what should be blocked using the robots.txt file. For example, I have folders containing site comps for clients that I really don't want showing up in the SERPs. Is it better not to have these folders on the domain at all? There are also security issues I've heard of that make sense: simply look at a site's robots file to see what they are hiding. It makes it easier to hunt for files when they know the directory the files are contained in. Do I concern myself with this? Another example is a folder I have for my XML sitemap generator. I imagine Google isn't going to try to index this or count it as content, so do I need to add folders like this to the disallow list?
Technical SEO | SpringMountain
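For directories like the ones described above, the syntax is one trailing-slash path per folder (the directory names here are hypothetical):

```
User-agent: *
# Client site comps that shouldn't appear in the SERPs
Disallow: /client-comps/
# Sitemap generator tool folder
Disallow: /sitemap-tools/
```

Keep in mind that robots.txt is publicly readable and only a request, not an access control: for genuinely sensitive client comps, password-protecting the directory (or keeping it off the domain) is safer than a Disallow line.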