Client's site dropped completely for all keywords, but not for the brand name - no manual penalty... help!
-
We just picked up a new search client a few weeks ago. They've been a customer (we're an automotive dealer website provider) since October of 2011. Their content was very generic (came from the previous provider), so we did a quick once-over as soon as he signed up. Beefed up his page content, made it more unique and relevant... tweaked title tags... wrote meta descriptions (he had none).
In just over a week, he went from ranking on page 4 or 5 for his terms to ranking on page 2 or 3. My team was working on getting his social media set up, set up his blog, started competitor research...
And then this last weekend, something happened and he dropped completely from the rankings... He still shows up if you do a site: search, or if you search his exact business name, but for everything else, he's nowhere to be found.
His URL is www.ohioautowarehouse.com, business name is "Ohio Auto Warehouse"
We filed a reconsideration request on Monday, and just got a reply today that there was no manual penalty. They suggested we check our content, but we know we didn't do anything spammy or blackhat.
We hadn't even fully optimized his site yet - we were just finishing up his competitor research and were planning on a full site optimization next week... so we're at a complete loss as to what happened.
Also, he's not ranking for any of the vehicles in his inventory. Our vehicle pages always rank on page 1 or 2, depending on how big the city is... you can always search "year make model city" and see our customers' sites (whether they're doing SEO or not). This guy's cars aren't showing up... so we know something is going on...
Any help would be a lifesaver. We've been doing this for quite some time now, and we've never had a site get penalized. Since the reconsideration request didn't help, we're not sure what to do...
-
(rolling eyes)
I need a big "I heart google" shirt...
so we filed a reconsideration request the week that this happened, then got a reply that it wasn't a manual penalty and that we should look at the content.
so we filed another one, with a very in-depth explanation that we hadn't changed much content, we weren't violating any Google guidelines, and we hadn't done anything to trigger an algorithmic penalty...
since this started, we didn't do anything on his site - zero changes to content (wanted to wait until we heard back from the 2nd request) - we added a single link from a car dealer directory site (that you have to call to be listed in) the day we realized he had dropped from the SERPs, but nothing since then...
the ONLY thing we did was finally sort out his +Local listing - the address was displaying as 916 but is really 915 - but since he didn't know who registered his Places page, we had to re-verify... so there were 2 verified Google accounts with access to the Places dashboard, which apparently causes issues... we got that sorted out so that we're the only account with access, and we updated the address...
then we got a reply Monday from our 2nd reconsideration request that was word-for-word the same as the first one - no manual penalty, check your content, blah blah blah...
and as of Tuesday of this week, BAM - he's back in the SERPs... he's around page 3 or 4 for most of his targeted terms, so the little bit of initial optimization we did has him ranking higher than he was before he signed up with us...
so - don't know if the Places/+ Local address being off by a few numbers was the problem (probably not, cause that problem was there before we started) or if the second reconsideration request got someone to look and fix something... or if it was just a really random algorithmic hiccup...
-
Hey Greg, any resolution on this?
-
I doubt it, buddy! No one said this was fair, and I would certainly try to make as much of the content as unique as possible to get out of this hole!
-
Thanks Marcus -
as far as the Craigslist stuff - it's duplicated because he lists his inventory on Craigslist. We see this with all of our dealers who list inventory on CL, but it's never penalized anyone.
Also, inventory is fed out to hundreds of classified sites, so you'll see dupe content there as well - but it's never hurt any of our dealers (and it's extremely common in the auto industry)
As far as the other websites that come up, they're other customers - 99% of our websites have the same menu buttons, as they're pretty standardized in the auto industry... and for those customers who don't do SEO, there's really generic basic text which leads to similar/duplicate content issues. But again, it's never caused a penalty before...
can't do anything about dupe content on the vehicle side of things, but we'll definitely update the content on his custom pages... and obviously we'll start linkbuilding...
hopefully that will be enough to get him back in...
-
Yeah - the system shows the home page whenever a sold vehicle page is requested (I know, not the right way to do it at all - fighting that battle with our dev team already)
but the same thing happens with all of our customers' sites (over 1000), and no one else is penalized for it...
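The fix being fought for with the dev team could look something like the sketch below: serve a real HTTP 410 (Gone) for sold vehicles instead of mirroring the homepage, so crawlers drop the URL cleanly rather than flagging it as a soft 404. The URL scheme, inventory store, and function names here are hypothetical, not the actual dealer platform's code.

```python
# Hypothetical sketch: return a real 410 Gone for sold vehicle URLs
# instead of serving homepage content (a "soft 404" to search engines).
# The inventory set and URL pattern are illustrative assumptions.

SOLD_STATUS = 410  # "Gone" - the page was removed deliberately

def respond_to_vehicle_request(path, active_inventory):
    """Return (status_code, body) for a vehicle-detail URL."""
    vehicle_id = path.rstrip("/").split("/")[-1]
    if vehicle_id in active_inventory:
        return 200, f"Vehicle detail page for {vehicle_id}"
    # Sold or unknown vehicle: do NOT mirror the homepage here.
    return SOLD_STATUS, "This vehicle has been sold. Browse our current inventory."

inventory = {"2012-honda-civic-canton", "2011-ford-f150-canton"}

print(respond_to_vehicle_request("/inventory/2012-honda-civic-canton", inventory))
print(respond_to_vehicle_request("/inventory/2009-toyota-camry-canton", inventory))
```

A 301 to the relevant inventory category page would also be defensible; the key point is that the response should not be a 200 with homepage content.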
-
Just one more, you have some weird internal duplication going on too. The following URLs are indexed but render the homepage:
I bet there are more of these too. Maybe these pages are supposed to 404/redirect and instead are mirroring the homepage? That could be causing the issue.
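One way to hunt down the rest of these internal mirrors is to compare each indexed URL's response body against the homepage. A minimal sketch (assuming the HTML bodies have already been fetched; the URLs and markup are made up for illustration):

```python
# Hypothetical sketch: flag indexed URLs whose response body is just the
# homepage served again under a different URL. In practice the bodies would
# come from crawling the site; here they are passed in directly.
import hashlib

def normalize(html):
    """Collapse whitespace and lowercase so trivial differences don't hide duplicates."""
    return " ".join(html.lower().split())

def find_homepage_mirrors(homepage_html, pages):
    """pages: dict of url -> html. Returns URLs whose content matches the homepage."""
    home_hash = hashlib.sha256(normalize(homepage_html).encode()).hexdigest()
    return [
        url for url, html in pages.items()
        if hashlib.sha256(normalize(html).encode()).hexdigest() == home_hash
    ]

home = "<html><body>Welcome to Ohio Auto Warehouse</body></html>"
crawled = {
    "/used-cars/some-old-listing": "<html><body>Welcome to  Ohio Auto Warehouse</body></html>",
    "/about-us": "<html><body>About our dealership</body></html>",
}
print(find_homepage_mirrors(home, crawled))  # ['/used-cars/some-old-listing']
```

Any URL this flags should be returning a 404/410 or a redirect, not a 200 with homepage content.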
-
Hey Greg, this sounds like an algorithmic penalty. It could be down to the site itself or, more likely, to external factors that are or were beyond your control.
Google has replied and told you to check your content so this certainly needs some investigation for starters.
1. External Duplication
I did a quick Copyscape search for the homepage and it came back with 26 other sites with the same or similar content (mostly Craigslist subdomains).
I did another Copyscape search for one product page and it came back with 80 results of sites with the same or similar content.
You have external duplication issues.
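For a rough sense of what a tool like Copyscape is measuring, you can compute word-shingle Jaccard similarity between two texts yourself. This is a simplified sketch, not Copyscape's actual algorithm; the shingle size and sample strings are illustrative:

```python
# Hypothetical sketch of near-duplicate detection: Jaccard similarity over
# word shingles. Shingle size k=4 and the example texts are assumptions.

def shingles(text, k=4):
    """All k-word sequences in the text, lowercased."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(max(0, len(words) - k + 1))}

def similarity(a, b, k=4):
    """Jaccard similarity of the two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "quality used cars and trucks at great prices in canton ohio"
scraped = "quality used cars and trucks at great prices in canton ohio call today"
unrelated = "we sell handmade furniture and home decor across the midwest"

print(similarity(original, scraped))    # 0.8 - near-duplicate
print(similarity(original, unrelated))  # 0.0 - unrelated
```

Scores near 1.0 against many external URLs are the pattern you want to eliminate by making the on-site copy unique.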
2. Link Profile
There is no link profile to speak of so that is not the problem - 16 links from 6 sites. Nothing of any quality.
So, without getting all CSI on this, you have a site with no links and a whole bucket load of content duplication on other sites. Whether they are cutting and pasting this content and dropping it on other sites or they are taking it from other places, you need to A) sort this out & B) educate them going forward.
Get the duplication sorted, get a few decent links, add a bit of unique content (try moving the blog onto the main domain instead of keeping it on the blog. subdomain) and you will bounce back for these long-tail searches at least.
Seriously though, I have seen this a bunch of times. I have even seen older static sites have their content so pillaged and scraped that they got penalised, but it was easily fixed by making the content unique and leaving the old duplicated stuff to the haters and scrapers. You probably can't control all these external sites as easily as you can edit the site itself, so get onto that quick smart and tell your client to stop dropping this content on every single website imaginable.
As google hinted - look at your content.
Hope this helps! Shout if I can help!
P.S. Oh, and the URLs are horrible, something a little shorter and less obviously dynamic would be better for users.