Would duplicate listings affect a client's ranking if they used the same address?
-
There's a lot of duplication in directory listings using the same or a similar address, just different company names: for example, so-and-so carpet cleaning on one listing and so-and-so janitorial services on another.
Now my client went from ranking around #3-4 to not even in the top 50 within a week.
Would duplication cause this sudden drop?
There's not a lot of competition for the client's keyword (janitorial services nh).
Would a competitor that recently optimized a site cause this sudden drop?
The client does need to optimize for this keyword, and they do need to clean up this duplication. (Unfortunately, the drop happened at the first of March; I provided the audit and recommendations, and I'm still awaiting the thumbs up to continue with implementation.)
Did Google make a change, possibly find these discrepancies within the listings, and suddenly drop this client's ranking?
And then there's Google Places:
The client usually ranks #1 in Google Places with up to 12 excellent reviews, so they are still getting a good spot on the first page. The very odd thing, though, is that Google is still saying they need to re-verify their Google Places listing.
For my own knowledge, I'd really like to understand how a Google Places account could still need verification and yet rank so well in the Google Places results on the page. Is it because of the great reviews? Any ideas here, too?
_Cindy
-
Glad to be of help, Cindy. Good luck!
Miriam
-
Miriam,
Thank you very much. I believe I'm starting to see the whole picture of possibilities for this drop in rank. And that's excellent info about Google Places.
I do need to visit the Places help forum and add it to the list of resources!
Much appreciated.
_Cindy
-
Hi Cindy,
I would definitely suspect either a penalty or a bug of some sort regarding the drop from #3-4 to not even in the top 50.
What happens when you do a direct business name or phone number search within maps.google.com for the business? Are you able to find their Place Page, or do you get a 'Do Not Support' message?
Check out this thread at the Google Places Help Forum to see if you recognize your problem in it:
https://productforums.google.com/forum/#!category-topic/business/technical-issue/9JszqkewMVU
Regardless of this, you have clearly identified problems with the business's data that need to be fixed ASAP. There is no question in my mind that the data confusion could be precisely what has caused the ranking drop. Data consistency is the number one requirement of a good Places record, but a business can get by for months or years with high rankings and bad data. Then, one day, it all goes away.

This is something I see reported constantly in the Google Places Help Forum, and the major lesson is that if you find problems, make every effort to clean them up as swiftly as possible. Then, once you've cleaned up the record to the best of your ability, you need to wait for the effects of what you've done to settle in. So, it becomes about patience.
Hopefully, the client will give you the go-ahead to get cracking on this. Good luck!
-
Perfect, thanks, Cody; that helps a lot.
And thank you for the advice on matching the contact page and Google Places!
_Cindy
-
In the local search realm, you want every directory listing to appear exactly as it does on your contact page. This lets search engines associate those listings with your business without a doubt. As good practice, I'd go ahead and make sure they all match, especially Google Places versus your client's Contact Us page on their website.
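If it helps to check this systematically, here's a minimal Python sketch of a NAP (name, address, phone) consistency check. Everything in it is hypothetical: the business data, the directory names, and the normalization rules are made up for illustration, not pulled from any real citation tool.

```python
import re

def normalize(value: str) -> str:
    """Lowercase, strip punctuation, and collapse whitespace so cosmetic
    differences ('St.' vs 'St') don't mask a real match."""
    value = re.sub(r"[^\w\s]", "", value.lower())
    return re.sub(r"\s+", " ", value).strip()

def normalize_phone(value: str) -> str:
    """Compare digits only, so '(603) 555-0123' equals '603-555-0123'."""
    return re.sub(r"\D", "", value)

# The client's Contact Us page is the source of truth (hypothetical data).
contact_page = {
    "name": "So-and-So Janitorial Services",
    "address": "123 Main St., Manchester, NH 03101",
    "phone": "(603) 555-0123",
}

# Directory listings gathered by hand or from a citation export (hypothetical).
listings = [
    {"source": "Directory A",
     "name": "So-and-So Carpet Cleaning",
     "address": "123 Main Street, Manchester, NH 03101",
     "phone": "603-555-0123"},
    {"source": "Directory B",
     "name": "So-and-So Janitorial Services",
     "address": "123 Main St, Manchester NH 03101",
     "phone": "(603) 555-0123"},
]

normalizers = {"name": normalize, "address": normalize, "phone": normalize_phone}

for listing in listings:
    for field, norm in normalizers.items():
        if norm(listing[field]) != norm(contact_page[field]):
            print(f"{listing['source']}: {field} mismatch -> {listing[field]!r}")
```

Run against the sample data above, it flags Directory A's divergent name and the "Street" vs "St" address variant; a real audit would also want to expand common abbreviations before comparing, which this sketch deliberately leaves out.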
Also, Penguin hit directories pretty hard. There is no penalization going on from what I have read, but there has been some major devaluation of links. It's likely that your client is not getting nearly as much weight from the directory links as they once did. Check out the link profile and see how many of these there are; I would imagine more reputable links now need to be acquired.
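To get a rough count, here's a minimal sketch assuming you've exported the link profile to a CSV with a `source_url` column. The column name and the directory domain list are assumptions for illustration, not any particular tool's real export format.

```python
import csv
from urllib.parse import urlparse

# Hypothetical set of directory domains to tally in the export.
DIRECTORY_DOMAINS = {"yellowpages.com", "manta.com", "citysearch.com", "superpages.com"}

def count_directory_links(csv_path: str) -> dict[str, int]:
    """Tally backlinks whose source domain is a known directory."""
    counts: dict[str, int] = {}
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            domain = urlparse(row["source_url"]).netloc.lower()
            domain = domain.removeprefix("www.")
            if domain in DIRECTORY_DOMAINS:
                counts[domain] = counts.get(domain, 0) + 1
    return counts

if __name__ == "__main__":
    print(count_directory_links("backlinks.csv"))
```

If directory domains dominate the tally, that supports the devaluation theory and tells you where link-building effort should shift.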
About the Places listing, I'm not sure why that would occur, but it really doesn't matter for the end result here. If the Big G says you need to re-verify, I would do it. It's a quick thing to do, and the unverified state can't be helping their rankings.