Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
How do I get rid of rogue listings?
-
Unfortunately, Google has taken bits and pieces of my business and combined it with non-existent businesses and other rogue information. So now my business has 3 locations.
One proper listing that I created and manage.
One that uses my website address but nothing else is correct in the listing.
One that contains my name (incorrectly), but the address and everything else about it is wrong.
I have reported these places many times but they continue to hang around and I am lost/confused on what to do next.
Please advise.
-
Hi Dignan,
The appropriate thing to do in cases like these is to go through the REPORT A PROBLEM link at the bottom of each problematic Place Page. It's a good idea to be signed into your account while doing this. Describe the situation and link to the correct listing for your business in the wizard. State that you have only one address - the one on your authoritative listing. Ask that these 2 other listings be removed.
Wait 4-6 weeks to hear back from Google (could take considerably less time these days, actually). If you do not see resolution, then take the matter to the new Google Places Help Forum. The new forum will be here:
http://groups.google.com/a/googleproductforums.com/forum/#!forum/maps
Explain the steps you have taken and ask if a Top Contributor can please help you obtain resolution.
*In your shoes, I would also do some sleuthing to try to figure out where the other data is coming from...it's coming from somewhere and discovering the origin may help you to surmise what is going on.
Hope this helps and good luck!
Miriam
-
As you have your own listing, I would suggest that you continue to try to get those listings removed by reporting them to Google.
One way you can demonstrate that your verified listing is the "credible" and only one is through local citation link building and getting people to place reviews on your listing.
The added activity will hopefully push the other two down in the SERPs until they are eventually removed from Google.
Good luck,
Vahe
-
Hi Brent,
Thanks for the reply. As of last week the phone numbers were different on the rogue listings, but I just checked...and guess what...both now have MY phone number.
So it looks like I could claim them. It was advised that the best way to deal with these listings would be to just report them to Google, as opposed to claiming them.
They mentioned it was against the rules for me to have more than one listing (even if my intention was to nix the two rogue listings).
Care to share your input?
-
This may be obvious but you haven't mentioned it. Did you try to claim the listings? Or are they already verified and claimed by somebody else?
Related Questions
-
If I get spammy backlinks removed is it still necessary to disavow?
Now there are some conflicting beliefs here and I want to know what you think. If I got a high-spam website to remove my backlink, is a disavow through Search Console still necessary? Keep in mind, if it helps even in the slightest to improve rankings, I'm for it!
Technical SEO | | Colemckeon1 -
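For anyone weighing the disavow route: the file Search Console accepts is just a plain-text list, one entry per line, mixing whole domains and individual URLs. A minimal sketch (the domains below are placeholders):

```text
# Lines starting with # are comments.
# Disavow every link from an entire domain:
domain:spammydirectory.example
# Or disavow a single linking page:
http://spam.example/links/page.html
```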
If I'm using a compressed sitemap (sitemap.xml.gz) that's the URL that gets submitted to webmaster tools, correct?
I just want to verify that if a compressed sitemap file is being used, then the URL that gets submitted to Google, Bing, etc and the URL that's used in the robots.txt indicates that it's a compressed file. For example, "sitemap.xml.gz" -- thanks!
Technical SEO | | jgresalfi0 -
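For reference, search engines treat the compressed file itself as the sitemap, so the .gz URL is both what you submit in webmaster tools and what you reference in robots.txt. A sketch, using a placeholder domain:

```text
# In robots.txt at https://www.example.com/robots.txt
Sitemap: https://www.example.com/sitemap.xml.gz
```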
Nofollow/Noindex Category Listing Pages with Filters
Our e-commerce site currently has thousands of duplicate pages indexed because category listing pages with all the different filters selected are indexed. So, for example, you would see indexed:
example.com/boots
example.com/boots/black
example.com/boots/black-size-small
etc. There is a logic in place that when more than one filter is selected all the links on the page are nofollowed, but Googlebot is still getting to them, and the variations are being indexed. At this point I'd like to add 'noindex' or canonical tags to the filtered versions of the category pages, but many of these filtered pages are driving traffic. Any suggestions? Thanks!
Technical SEO | | fayfr0 -
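For reference, the two tags being discussed would sit in the head of each filtered page; a minimal sketch, assuming the filtered URL should be kept out of the index or consolidated to the unfiltered category (URLs are placeholders):

```html
<!-- On example.com/boots/black-size-small -->
<!-- Option 1: keep this variation out of the index, but let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
<!-- Option 2: consolidate ranking signals to the main category page instead -->
<link rel="canonical" href="https://example.com/boots">
```

Note that a noindex plus a canonical pointing elsewhere send mixed signals, so it's generally better to pick one approach per page.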
How is my preproduction website getting indexed in Google?
Hi team, Can anybody please help me figure out how my preproduction website and URLs are getting indexed in Google?
Technical SEO | | nlogix0 -
What tools produce a complete list of all URLs for 301 redirects?
I am project managing the rebuild of a major corporate website and need to set up 301 redirects from the old pages to the new ones. The problem is that the old site sits on multiple CMS platforms so there is no way I can get a list of pages from the old CMS. Is there a good tool out there that will crawl through all the sites and produce a nice spreadsheet with all the URLs on it? Somebody mentioned Xenu but I have never used it. Any recommendations? Thanks -Adrian
Technical SEO | | Adrian_Kingwell0 -
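The core of what crawlers like Xenu do is simple: fetch each page, extract its internal links, and feed them back into a queue until the site is exhausted, then export the URL list. A minimal sketch of the link-extraction step using only the Python standard library (the URLs are hypothetical placeholders):

```python
# Sketch of the link-extraction step behind site crawlers: collect every
# internal link on a page so the URLs can be queued for crawling and later
# exported as a 301 redirect-mapping spreadsheet.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Collects href targets from <a> tags, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def internal_links(html, base_url):
    """Return only links on the same host as base_url (301-mapping candidates)."""
    parser = LinkCollector(base_url)
    parser.feed(html)
    host = urlparse(base_url).netloc
    return [u for u in parser.links if urlparse(u).netloc == host]

html = '<a href="/about">About</a> <a href="https://other.com/x">Ext</a>'
print(internal_links(html, "https://www.example.com/"))
# → ['https://www.example.com/about']
```

A real tool would also fetch each discovered URL, deduplicate, and respect robots.txt; this only shows the parsing core.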
Universal Business Listing?
Can anyone recommend the best or a better alternative to submitting a clients site to multiple directories than Universal Business Listing? Is UBL the best or is there something better and/or less expensive out there? https://ubl.org Thanks
Technical SEO | | fun52dig0 -
What is the best method to block a sub-domain, e.g. staging.domain.com/ from getting indexed?
Now that Google considers subdomains as part of the TLD, I'm a little leery of testing robots.txt on staging.domain.com with something like:
User-agent: *
Disallow: /
for fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
Technical SEO | | fthead90 -
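One point worth knowing here: robots.txt is fetched and applied per host, so a file served only from the staging subdomain cannot affect the www host. A sketch:

```text
# Served at https://staging.domain.com/robots.txt only.
# robots.txt is per-host: www.domain.com has its own separate
# robots.txt and is unaffected by these rules.
User-agent: *
Disallow: /
```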
How do I get Google to display categories instead of the URL in results?
I've seen that for some domains Google will show a nice clickable site hierarchy in place of the actual URL of a search result. See attached for an example. How do I go about achieving this type of result? categorized.png
Technical SEO | | Carlito-2569610
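Those clickable hierarchies are breadcrumbs, which Google can derive from your site's URL structure or from breadcrumb structured data on the page. A minimal JSON-LD sketch of schema.org's BreadcrumbList (names and URLs are placeholders; check Google's current structured-data documentation, as the supported formats have changed over the years):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Boots",
      "item": "https://example.com/boots" }
  ]
}
</script>
```

Displaying the breadcrumb trail remains at Google's discretion; the markup only makes the hierarchy explicit.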