Google places: 7 weeks and still not verified. Reasons?
-
I've created Google Places entries for the business's 25 locations in 8 countries. It's been 7 weeks and Google hasn't verified my bulk upload yet.
Which of the following would you say might do the trick?
-
URL: I have pointed to http://site.com/country/city for every location. Should I just point to http://site.com instead?
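For a multi-location feed, the usual intent of the website field is a distinct landing page per location. As a rough sketch of how those per-location URLs could be generated for a bulk-upload spreadsheet (the column names, URL pattern, and domain here are illustrative assumptions, not the actual bulk-upload schema):

```python
# Sketch: generate location-specific website URLs for a bulk-upload CSV.
# The column names ("store_code", "website") and the /country/city URL
# pattern are assumptions for illustration, not Google's exact schema.
import csv
import io

BASE_URL = "http://site.com"  # placeholder domain from the question

locations = [
    {"store_code": "ES-MAD-01", "country": "spain", "city": "madrid"},
    {"store_code": "FR-PAR-01", "country": "france", "city": "paris"},
]

def website_url(loc):
    """Build the per-location landing page URL (/country/city)."""
    return f"{BASE_URL}/{loc['country']}/{loc['city']}"

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["store_code", "website"])
writer.writeheader()
for loc in locations:
    writer.writerow({"store_code": loc["store_code"],
                     "website": website_url(loc)})

print(buf.getvalue())
```

Generating the URLs from the same data that builds the feed keeps each row's website consistent with its address, which is one less thing the reviewer can flag.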
-
I've created the categories in English, but since this is about local search, I guess I should have written them in the local language of each country...
-
Should I cancel the bulk upload and submit one upload per country? My list of countries includes Latin America, Europe, Africa, and Asia, so I don't know whether manual verification is handled centrally by Google; if so, that might be a problem.
Otherwise, the entries are (IMHO) well written and detailed. But I'm getting desperate now because it's been 7 weeks already...
Thanks for your help
-
-
In the past, I too have had this issue.
I've found repeatedly that the quickest way to get verified is to have the Google Places/Local department call you so you can speak to a real live person. That way, you can verify instantly over the phone and not worry about postcard PINs.
Furthermore, I expect you'd be able to actually verify all your business locations during the same call, though it might take a while.
Directions for reaching Google are as follows:
What you want to do is to verify ownership of your business over the phone. In order to do this, you actually need to call Google. Visit this address to get the process started: http://support.google.com/places/bin/static.py?hl=en&ts=1399021&page=ts.cs
Under "Which verification method did you try?" select "I tried PIN verification for a single listing". At the next prompt, check "The status is NOT needs action". At the verification method prompt, check "Postcard", and finally check "Yes" when asked "Have you waited 15 days?".
You will be given a "call us" link and a contact form. Never mind the form; click the call link. You'll enter your name and number, and a real live Google rep will call you within a minute. Simply explain that you are trying to verify your Local+ or Places page, and ask if you can do it by phone.
You should be all set within two weeks, tops.
-
Hey Javier
It sounds like a standard verification to ensure you are not spamming the index. Probably one you have to wait out unfortunately.
Historically, when I have solved issues with Google Places uploads, the pending message has given me a clue about the problem, but in this instance "needs verification" seems to mean just that: it needs verification from Google Places staff.
Probably your only options are to upload the listings individually from accounts local to the countries in question, or to wait it out. Sorry buddy, it's just a patience game.
Cheers
Marcus
-
Thanks Marcus. All 26 listings are classified as "Needs Action" and all are flagged with "Needs Action: Unverified bulk upload. May not appear on Google."
Since all of them had the same basic structure in their details, I've tried different approaches: with/without categories, with data in Spanish (for some Spain listings) or English (for all countries), with the link pointing to mysite.com or mysite.com/country/city, etc.
But no luck so far.
-
Hey
If you look at the listings, what does it state about each one? Sometimes the listings themselves get flagged due to the content or some other reason and I have previously got pending listings instantly live by removing certain keywords from the services section.
Let me know the pending message, as it may give us some clues; this has certainly been the case previously.
Of course, it could just be that it's a bulk upload and they want to verify it is not a spam attempt!
Cheers
Marcus
-
Thanks PSV. That's what I've read elsewhere too: that it might take much longer than the official 4 weeks.
Nevertheless, any idea how I should proceed with the three items mentioned above?
Thanks
J
-
Hi,
Google can sometimes take a very long time to verify the bulk upload stream; I have had to wait 6 weeks in the past for my feed to be accepted.
They must have a large backlog, because it seems to be taking more and more time.
Also, make sure that in your titles and field content you do not add keywords that are not part of your business name. That will slow things down even further and force you to go through the process again.
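One way to sanity-check a feed for this before resubmitting is to flag any listing title containing words that aren't part of the registered business name. A minimal sketch (the simple word-set rule here is my own assumption for illustration, not Google's actual policy check):

```python
def extra_keywords(business_name, listing_title):
    """Return words in the listing title that are not part of the business name."""
    name_words = set(business_name.lower().split())
    return [w for w in listing_title.lower().split() if w not in name_words]

# A clean title produces no flags; a keyword-stuffed one gets caught.
print(extra_keywords("Acme Plumbing", "Acme Plumbing"))
# -> []
print(extra_keywords("Acme Plumbing", "Acme Plumbing cheap plumber madrid"))
# -> ['cheap', 'plumber', 'madrid']
```

Running something like this over all 25 rows before upload is much cheaper than waiting weeks only to have to redo the process.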
Regards.