Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
How do I get rid of rogue listings?
-
Unfortunately, Google has taken bits and pieces of my business and combined them with non-existent businesses and other rogue information. So now my business has 3 locations:
One proper listing that I created and manage.
One that uses my website address, but nothing else in the listing is correct.
One that contains my name (incorrectly), but the address and everything else about it is wrong.
I have reported these places many times, but they continue to hang around, and I am lost as to what to do next.
Please advise.
-
Hi Dignan,
The appropriate thing to do in cases like these is to go through the REPORT A PROBLEM link at the bottom of each problematic Place Page. It's a good idea to be signed into your account while doing this. Describe the situation in the wizard and link to the correct listing for your business. State that you have only one address - the one on your authoritative listing - and ask that these 2 other listings be removed.
Wait 4-6 weeks to hear back from Google (could take considerably less time these days, actually). If you do not see resolution, then take the matter to the new Google Places Help Forum. The new forum will be here:
http://groups.google.com/a/googleproductforums.com/forum/#!forum/maps
Explain the steps you have taken and ask if a Top Contributor can please help you obtain resolution.
In your shoes, I would also do some sleuthing to try to figure out where the other data is coming from; it's coming from somewhere, and discovering the origin may help you surmise what is going on.
Hope this helps and good luck!
Miriam
-
As you have claimed your own listing, I would suggest that you continue trying to get those rogue listings removed by reporting them to Google.
One way to demonstrate that your verified listing is the "credible" and only one is through local citation building and getting people to leave reviews on your listing.
That extra activity will hopefully push the other two down in the SERPs until they are eventually removed from Google.
Good luck,
Vahe
-
Hi Brent,
Thanks for the reply. As of last week the phone numbers were different on the rogue listings, but I just checked... and guess what... both now have MY phone number.
So it looks like I could claim them. However, it was advised that the best way to deal with these listings would be to just report them to Google, as opposed to claiming them.
They mentioned it was against the rules for me to have more than one listing (even if my intention was to nix the two rogue listings).
Care to share your input?
-
This may be obvious but you haven't mentioned it. Did you try to claim the listings? Or are they already verified and claimed by somebody else?
Related Questions
-
How do I get coupon information like retailmenot has on the SERPs?
Hello, can anyone tell me how I can implement the same tactic that RetailMeNot is using to populate coupon information in the search results? Below their meta description they have 4 fields labeled: Coupon Codes: 38, Free Shipping Deals: 6, Best Discount: 20% off, & Total Offers: 49. Is there some schema markup here? Or is this only allowed for RMN? I have not seen it elsewhere, but I want my website's coupons page to compete with them in the SERPs. Appreciate your help!
Technical SEO | Serenawong
-
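Those fields look like structured data. A hedged sketch using schema.org's AggregateOffer type (the product name and values below are placeholders, and whether Google chooses to display such counts for any given site is entirely up to Google):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Store coupons",
  "offers": {
    "@type": "AggregateOffer",
    "offerCount": "49",
    "lowPrice": "10.00",
    "highPrice": "99.00",
    "priceCurrency": "USD"
  }
}
</script>
```

Valid markup only makes a site eligible for an enhanced display; it never guarantees one.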
Why images are not getting indexed and showing in Google webmaster
Hi, I would like to ask why our website's images are not being indexed by Google. I have shared the following screenshot of the Search Console: https://www.screencast.com/t/yKoCBT6Q8Upw Last week (Friday 14 Sept 2018) it was showing 23.5K out of 31K submitted images indexed by Google. But now, it is showing only 1K 😞 Can you please let me know why this might happen, and why the images are not getting indexed or shown in Google Webmaster Tools?
Technical SEO | 21centuryweb
-
If I'm using a compressed sitemap (sitemap.xml.gz) that's the URL that gets submitted to webmaster tools, correct?
I just want to verify that if a compressed sitemap file is being used, then the URL that gets submitted to Google, Bing, etc and the URL that's used in the robots.txt indicates that it's a compressed file. For example, "sitemap.xml.gz" -- thanks!
Technical SEO | jgresalfi
-
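For reference: the sitemaps protocol allows the compressed file to be referenced directly, so the .gz URL is indeed what goes in robots.txt and what gets submitted to Google and Bing. A minimal robots.txt sketch (the domain is a placeholder):

```
Sitemap: https://www.example.com/sitemap.xml.gz
```

Search engines fetch and decompress the gzip file themselves; the only requirement is that the uncompressed XML respects the usual size limits.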
Do URLs with canonical tags get indexed by Google?
Hi, we re-branded and launched a new website in February 2016. In June we saw a steep drop in the number of URLs indexed, and there have continued to be smaller dips since. We started an account with Moz and found several thousand high priority crawl errors for duplicate pages and have since fixed those with canonical tags. However, we are still seeing the number of URLs indexed drop. Do URLs with canonical tags get indexed by Google? I can't seem to find a definitive answer on this. A good portion of our URLs have canonical tags because they are just events with different dates, but otherwise the content of the page is the same.
Technical SEO | zasite
-
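On whether canonicalized URLs get indexed: Google treats rel=canonical as a strong hint, and duplicate URLs pointing at a canonical are generally still crawled but consolidated, so only the canonical URL appears in the index. A drop in indexed-URL counts after adding the tags is therefore often the tags working as intended. A sketch of the tag on a dated event page (URLs hypothetical):

```html
<!-- On https://www.example.com/events/gala-2016-06-12 -->
<link rel="canonical" href="https://www.example.com/events/gala" />
```

If each dated event genuinely has unique content worth ranking, self-referencing canonicals per date would be the alternative.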
Unnecessary pages getting indexed in Google for my blog
I have a blog, dapazze.com, and I have been struggling with a problem for a long time. I found out that Google has indexed hundreds of replytocom links and image attachment pages for my blog. I had to remove these pages manually using the URL removal tool. I had used "Disallow: ?replytocom" in my robots.txt, but Google disobeyed it. After that, I removed the parameter from my blog completely using the SEO by Yoast plugin. But now I see that Google has again started indexing these links even though they are no longer present on my blog (I use #comment). Google has also indexed many of my admin and plugin pages, even though they are disallowed in my robots.txt file. Have a look at my robots.txt file here: http://dapazze.com/robots.txt Can you please help me solve this problem permanently?
Technical SEO | rahulchowdhury
-
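One likely reason "Disallow: ?replytocom" was ignored: robots.txt rules are matched from the start of the URL path, so a rule that doesn't begin with / (or a * wildcard) matches nothing. A sketch using Google's wildcard support, assuming WordPress's default parameter shape:

```
User-agent: *
Disallow: /*?replytocom=
Disallow: /wp-admin/
```

Also note that robots.txt blocks crawling, not indexing: URLs Google already knows about can remain indexed (which is also why disallowed admin and plugin pages can show up in results), so a noindex meta tag or the URL removal tool is the surer fix for pages already in the index.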
Google Showing Multiple Listings For Same Site?
I've been optimizing a small static HTML site and have been working to increase the keyword rankings, yet have always ranked #1 for the company name. But I've now noticed the company name is taking more than just the first position - the site is now appearing in 1st, 2nd, and 3rd position (each position referencing a different page of the site). Great... who doesn't want to dominate a page of Google! But it looks kind of untidy, and not usually how links from the same site are displayed. Is this normal? I'm used to seeing results from the same site grouped under the primary result, but not like this. Any info appreciated 🙂
Technical SEO | GregDixson
-
What tools produce a complete list of all URLs for 301 redirects?
I am project managing the rebuild of a major corporate website and need to set up 301 redirects from the old pages to the new ones. The problem is that the old site sits on multiple CMS platforms so there is no way I can get a list of pages from the old CMS. Is there a good tool out there that will crawl through all the sites and produce a nice spreadsheet with all the URLs on it? Somebody mentioned Xenu but I have never used it. Any recommendations? Thanks -Adrian
Technical SEO | Adrian_Kingwell
-
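Xenu and similar desktop crawlers work well for this. Failing those, a rough same-site crawler can be sketched with Python's standard library (the start URL, page limit, and error handling here are simplified assumptions, and like any crawler it will miss orphan pages that nothing links to):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkParser(HTMLParser):
    """Collects the href value of every <a> tag in a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, limit=500):
    """Breadth-first crawl of a single host; returns every URL discovered."""
    host = urlparse(start_url).netloc
    seen, queue = {start_url}, [start_url]
    while queue and len(seen) < limit:
        url = queue.pop(0)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except Exception:
            continue  # skip pages that error out or time out
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link).split("#")[0]  # drop fragments
            if urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return seen
```

Run `sorted(crawl("https://www.oldsite.example/"))` (placeholder URL) against each old CMS's hostname and paste the results into a spreadsheet as the "from" column of the 301 map; cross-checking against server logs or analytics catches pages the crawl misses.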
What is the best method to block a sub-domain, e.g. staging.domain.com/ from getting indexed?
Now that Google considers subdomains as part of the TLD, I'm a little leery of testing robots.txt on staging.domain.com with something like:
User-agent: *
Disallow: /
in fear it might get the www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
Technical SEO | fthead9
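Robots.txt is fetched per host: a crawler visiting staging.domain.com requests staging.domain.com/robots.txt, and www.domain.com/robots.txt is a completely separate file, so a disallow-all served only on the staging host cannot block the www site. A sketch of the staging-only file:

```
# Served only at https://staging.domain.com/robots.txt
User-agent: *
Disallow: /
```

Since robots.txt prevents crawling but not indexing of URLs already known from links, a server-wide `X-Robots-Tag: noindex` response header on the staging host (or HTTP authentication) is the stronger option, and avoids the per-page meta tag work.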