Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
How do I get rid of rogue listings?
-
Unfortunately, Google has taken bits and pieces of my business and combined them with non-existent businesses and other rogue information. So now my business has 3 locations.
One proper listing that I created and manage.
One that uses my website address but nothing else is correct in the listing.
One that contains my name (incorrectly), but the address and everything else about it is incorrect.
I have reported these places many times but they continue to hang around and I am lost/confused on what to do next.
Please advise.
-
Hi Dignan,
The appropriate thing to do in cases like these is to go through the REPORT A PROBLEM link at the bottom of each problematic Place Page. It's a good idea to be signed into your account while doing this. Describe the situation and link to the correct listing for your business in the wizard. State that you have only one address - the one on your authoritative listing. Ask that these two other listings be removed.
Wait 4-6 weeks to hear back from Google (could take considerably less time these days, actually). If you do not see resolution, then take the matter to the new Google Places Help Forum. The new forum will be here:
http://groups.google.com/a/googleproductforums.com/forum/#!forum/maps
Explain the steps you have taken and ask if a Top Contributor can please help you obtain resolution.
In your shoes, I would also do some sleuthing to figure out where the other data is coming from. It's coming from somewhere, and discovering the origin may help you work out what is going on.
Hope this helps and good luck!
Miriam
-
Since you already have your own listing, I would suggest you continue trying to get those rogue listings removed by reporting them to Google.
One way to demonstrate that your verified listing is the "credible" and only one is through local citation link building and encouraging customers to leave reviews on your listing.
The added activity will hopefully push the other two down in the SERPs until they are eventually removed from Google.
Good luck,
Vahe
-
Hi Brent,
Thanks for the reply. As of last week the phone numbers on the rogue listings were different, but I just checked, and guess what: both now have MY phone number.
So it looks like I could claim them. However, I was advised that the best way to deal with these listings would be to report them to Google, as opposed to claiming them.
They mentioned it was against the rules for me to have more than one listing (even if my intention was to nix the two rogue listings).
Care to share your input?
-
This may be obvious but you haven't mentioned it. Did you try to claim the listings? Or are they already verified and claimed by somebody else?
Related Questions
-
If I get spammy backlinks removed is it still necessary to disavow?
There are some conflicting beliefs here and I want to know what you think. If I got a high-spam website to remove my backlink, is a disavow through Search Console still necessary? Keep in mind, if it helps even in the slightest to improve rankings, I'm for it!
Technical SEO | Colemckeon
-
What are best options for website built with navigation drop-down menus in JavaScript, to get those menus indexed by Google?
This concerns f5.com, a large website with navigation menus that drop down when hovered over. The sub nav items (example: “DDoS Protection”) are not cached by Google and therefore do not distribute internal links properly to help those sub-pages rank well. Best option naturally is to change the nav menus from JS to CSS but barring that, is there another option? Will Schema SiteNavigationElement work as an alternate?
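For reference, the SiteNavigationElement markup itself is easy to produce, though whether Google uses it to discover or weight nav links is unconfirmed, so treat this as a sketch rather than a fix. The nav labels and URLs below are hypothetical, and the JSON-LD is built with Python's json module as it might be server-side:

```python
import json

# Hypothetical sub-nav items; replace with the site's real labels and URLs.
nav_items = [
    {"name": "DDoS Protection", "url": "https://www.example.com/ddos-protection"},
    {"name": "Load Balancing", "url": "https://www.example.com/load-balancing"},
]

# Build one schema.org SiteNavigationElement per nav item, grouped in @graph.
markup = {
    "@context": "https://schema.org",
    "@graph": [
        {"@type": "SiteNavigationElement", "name": item["name"], "url": item["url"]}
        for item in nav_items
    ],
}

# Embed the output in a <script type="application/ld+json"> tag in the page head.
print(json.dumps(markup, indent=2))
```

The caveat is that structured data does not make Google crawl JavaScript links; plain `<a href>` links present in the HTML (as a CSS menu would have) are what actually distribute internal link equity.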
Technical SEO | CarlLarson
-
Schema Markup for property listings (estate agent)
Hello, I've been looking online for some help with this. An estate agent has a page of properties for sale. Is it possible to mark these individual properties up, and if so, would they appear as rich snippets in the SERPs? I've never seen anything like this for properties for sale, so I just wondered.
Technical SEO | AL123al
-
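Schema.org does have property-relevant types, though Google shows no dedicated rich result for estate-agent listings, so the following is a hedged sketch only, not a guarantee of snippets. One common approach is an Offer whose itemOffered is an Accommodation subtype such as SingleFamilyResidence; the listing data below is hypothetical:

```python
import json

# Hypothetical listing pulled from the estate agent's database.
listing = {
    "name": "Three-bedroom semi-detached house, 12 Example Road",
    "price": 250000,
    "currency": "GBP",
    "url": "https://www.example-agent.co.uk/properties/12-example-road",
}

# One schema.org Offer per property; itemOffered describes the property itself.
markup = {
    "@context": "https://schema.org",
    "@type": "Offer",
    "name": listing["name"],
    "url": listing["url"],
    "price": str(listing["price"]),
    "priceCurrency": listing["currency"],
    "itemOffered": {
        "@type": "SingleFamilyResidence",
        "name": listing["name"],
    },
}

print(json.dumps(markup, indent=2))
```

Each property on the page would get its own block like this; validate the result in Google's structured data testing tool before assuming it is eligible for any display treatment.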
Site indexed by Google, but (almost) never gets impressions
Hi there, I have a question that I haven't been able to answer reasonably yet, so I'm going to trust all of you. Basically, a site has all its pages indexed by Google (I verified with site:sitename.com) and it also has great and unique content. All on-page grades are A with absolutely no negative factors at all. However, its pages get almost no impressions. Of course I didn't expect it to be on page 1 since it was launched on Dec 1st, but it looks like Google is ignoring it (or giving it bad scores) for some reason. The only things that could contribute to that are: domain privacy on the domain, the redirect from www to the subdomain we use (we did this because it will be a multi-language site, so we'll assign each country a subdomain), and recency (it has been online since Dec 1st and the domain is just a couple of months old). Or maybe because we blocked crawlers for a few days before the launch? Exactly a few days before Dec 1st. What do you think? What could be the reason for that? Thanks guys!
Technical SEO | ruggero
-
Unnecessary pages getting indexed in Google for my blog
I have a blog, dapazze.com, and I have been suffering from a problem for a long time. I found out that Google has indexed hundreds of replytocom links and image attachment pages for my blog. I had to remove these pages manually using the URL removal tool. I had used "Disallow: ?replytocom" in my robots.txt, but Google disobeyed it. After that, I removed the parameter from my blog completely using the SEO by Yoast plugin. But now I see that Google has again started indexing these links even though they are no longer present on my blog (I use #comment). Google has also indexed many of my admin and plugin pages, even though they are disallowed in my robots.txt file. Have a look at my robots.txt file here: http://dapazze.com/robots.txt. Can you please help me solve this problem permanently?
Technical SEO | rahulchowdhury
-
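One likely reason the original rule was ignored rather than "disobeyed": robots.txt Disallow patterns are matched from the start of the URL path, so a rule that does not begin with `/` (or a `*` wildcard) never matches anything. A sketch of rules that would match these URLs (paths here are the usual WordPress defaults, not taken from the actual file):

```
User-agent: *
# The leading * lets this match /any-post/?replytocom=123
Disallow: /*?replytocom
Disallow: /wp-admin/
```

Two caveats: robots.txt blocks crawling, not indexing, so already-indexed URLs stay until removed or noindexed; and a page blocked by robots.txt can never have its noindex tag seen, so for the attachment pages a meta robots noindex (with crawling allowed) is usually the more reliable route.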
How do I get out of google bomb?
Hi all, I have a website named bijouxroom.com; I was on the 7th page for the search term takı and the 2nd page for online takı in Google. Now I see that in one day my results have dropped to the 13th and 10th pages in Google respectively. I built too many anchor-text links for takı and online takı. What shall I do to regain my positions? Thanks in advance. Regards,
Technical SEO | ozererim
-
What tools produce a complete list of all URLs for 301 redirects?
I am project managing the rebuild of a major corporate website and need to set up 301 redirects from the old pages to the new ones. The problem is that the old site sits on multiple CMS platforms so there is no way I can get a list of pages from the old CMS. Is there a good tool out there that will crawl through all the sites and produce a nice spreadsheet with all the URLs on it? Somebody mentioned Xenu but I have never used it. Any recommendations? Thanks -Adrian
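Whichever crawler produces the list (Xenu, a Screaming Frog export, or server logs), the final mapping step is mechanical. A minimal sketch (hypothetical URLs, Apache mod_alias syntax) that turns an old-to-new spreadsheet into Redirect 301 directives:

```python
from urllib.parse import urlparse

# Hypothetical old -> new URL mapping assembled from the crawl spreadsheet.
redirect_map = {
    "https://www.example.com/old-about.html": "https://www.example.com/about/",
    "https://www.example.com/old-products/widgets.html": "https://www.example.com/products/widgets/",
}

def htaccess_lines(mapping):
    """Emit one 'Redirect 301' directive per old URL (path only on the left)."""
    lines = []
    for old, new in mapping.items():
        old_path = urlparse(old).path  # Apache matches on the path, not the full URL
        lines.append(f"Redirect 301 {old_path} {new}")
    return lines

for line in htaccess_lines(redirect_map):
    print(line)
```

Note that `Redirect` matches path prefixes and ignores query strings, so dynamic old URLs like `/page.asp?id=7` need mod_rewrite rules with a RewriteCond on `%{QUERY_STRING}` instead; spot-check each rule before deploying.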
Technical SEO | Adrian_Kingwell
-
What is the best method to block a sub-domain, e.g. staging.domain.com/ from getting indexed?
Now that Google considers subdomains part of the TLD, I'm a little leery of testing a robots.txt on staging.domain.com with something like:
User-agent: *
Disallow: /
for fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
Technical SEO | fthead9
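One point worth adding here, as a general property of the protocol rather than anything site-specific: crawlers fetch robots.txt separately for every hostname, so a file served at staging.domain.com/robots.txt applies only to staging.domain.com, and www.domain.com is governed solely by its own www.domain.com/robots.txt. A blanket block on the staging host would be:

```
# Served at staging.domain.com/robots.txt only; www has its own file
User-agent: *
Disallow: /
```

Bear in mind robots.txt prevents crawling, not indexing: staging URLs linked from elsewhere can still appear in results, so HTTP authentication on the staging host is the more airtight option if the content must stay out of Google entirely.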