Google Places Duplicate Listings
-
Hey Mozzers-
I know the basic process for handling duplicate listings, but I want to double-check because this one is a little sensitive.
I have a client with a claimed and verified Places listing, which is here:
http://maps.google.com/maps/place?q=chambers+and+associates&hl=en&cid=9065936543314453461
There is also another listing (which I have not claimed yet) here:
http://maps.google.com/maps/place?q=dr.+george+chambers&hl=en&cid=14758636806656154330
The first listing has 0 reviews, whereas the second, unverified listing has 12 fantastic five-star reviews. We can all agree that if I can get these two listings merged, his listing will perform much better than it already does (the first listing gets about 200 actions per month).
So, what is the best way to merge these two without losing any reviews and without getting my Places account suspended? Thanks in advance!
Ian
-
I'm going to go this route... thanks a lot for your help!
-
True, it is technically against Google's TOS, but I think that relates more to people who are doing it maliciously. Everything we have explained above is legit, even if he temporarily holds two listings for the same location in his Places account.
If Google actually contacted him to complain about it, it would probably work in his favor: he could explain his situation and they would resolve the issue more quickly.
But it's not like that would ever happen. Google staff are way too preoccupied to approach businesses on a case-by-case basis like that.
-
It is against Google's TOS to have more than one Places listing. If (as is the case in some countries) there is no Report a Problem link on the Place page, you may, as a Google Places rep has suggested (I have the link somewhere), claim the duplicate and then delete the undesirable one: follow the steps Storwell Self Storage has laid out.
If the Places page(s) is in the USA, you'll want to follow Jacob's suggestions, as well as #5 from Storwell Self Storage: wait a couple of days, and if that listing is still showing up in the search results, go to its Place page, click the Report a Problem link, and select "Place has another listing". If you can, include a link to your claimed Place page in the comments section.
Here is the link to the Google Places help page for duplicate issues:
http://www.google.com/support/places/bin/answer.py?hl=en&answer=183009
-
1. Claim the second listing, the good one.
2. Fill out the profile to 100% complete (5 videos, 10 pictures, categories, the works).
3. Make sure that this new listing is now set to active in your Places account (you should now have two listings in your Places account).
4. Find the listing with no reviews inside your Places account and click "Remove this listing from my Google Places account".
5. Wait a couple of days; if that listing is still showing up in the search results, go to its Place page, click the Report a Problem link, and select "Place has another listing". If you can, include a link to your claimed Place page in the comments section (the sketch below shows one way to check this without watching the SERPs by hand).
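For step 5, rather than eyeballing the search results every day, you could poll for the business name programmatically. Here is a minimal Python sketch against the Places API Text Search endpoint; the API key is a placeholder, `find_listings` is a hypothetical helper name, and it assumes both listings surface for the same query — none of this is part of the steps above.

```python
import requests

# Placeholders: substitute your own Places API key and query.
API_KEY = "YOUR_API_KEY"
SEARCH_ENDPOINT = "https://maps.googleapis.com/maps/api/place/textsearch/json"

def find_listings(business_name):
    """Return the (name, address) pairs Google currently returns for the query."""
    resp = requests.get(
        SEARCH_ENDPOINT,
        params={"query": business_name, "key": API_KEY},
        timeout=10,
    )
    resp.raise_for_status()
    results = resp.json().get("results", [])
    # More than one result at the same address suggests the duplicate
    # listing is still live and worth reporting via Report a Problem.
    return [(r.get("name"), r.get("formatted_address")) for r in results]

if __name__ == "__main__":
    for name, address in find_listings("Chambers and Associates"):
        print(f"{name} - {address}")
```

Once the output drops to a single result for the address, the duplicate has fallen out of search and there's nothing left to report.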
-
I actually just got done with a similar issue.
The best way to take care of this, in my humble opinion, is to delete the claimed (and inferior) listing. Do this first.
http://www.google.com/support/places/bin/answer.py?hl=en&answer=154102
Then, claim the better listing like normal.