Is this a possible Google penalty scenario?
-
In January we were banned from Google for duplicate websites because of a server configuration error by our previous webmaster. During a server migration, around 100 of our previously inactive domain names were defaulted to the directory of our company website, showing the exact same site 100 times over... obviously Google was not happy and banned us.
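For reference, the usual fix for that kind of parked-domain duplication is a catch-all virtual host that 301-redirects every extra domain to the canonical site instead of serving a copy of it. A minimal Apache sketch (the domain names here are placeholders, and your server setup may differ):

```apache
# Hypothetical catch-all vhost: any parked domain not matched by an
# earlier, more specific vhost lands here and is 301-redirected to the
# main site, so it never serves a duplicate copy of the content.
<VirtualHost *:80>
    ServerName parked.example
    ServerAlias *
    Redirect 301 / http://www.example.com/
</VirtualHost>
```

With mod_alias's `Redirect`, the rest of the requested path is appended automatically, so deep links on the parked domains also land on the matching page of the main site.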
At the end of February we were allowed back into the SERPs after fixing the issue, and we have since steadily regained rankings for long-tail keyword phrases, but in Google we are still missing our main keyword phrase. This phrase brings in the bulk of our best traffic, so obviously it's an issue.
We've been unable to get above position 21 for this keyword, but in Yahoo, Bing, and Yandex (the Russian search engine) we're at positions 3, 3, and 7 respectively. It seems to me there has to be a penalty in effect, as this keyword gets between 10 and 100 times as much traffic in Google as any of the ones we're ranked for. What do you think?
EDIT: I should mention that in the 4-5 years prior to the ban we had ranked between 4th and 15th in Google, on the first page 80% of the time.
-
Yeah! That was actually the first time I had even heard of Yandex. Of course, the issue is that I haven't searched for the Russian version of our keyword on the .ru search engine, as I'm not sure about linguistic differences that Google Translate wouldn't account for (it's a financial industry term).
I'm sure we're not ranked highly in the Russian version of the SE though, as we don't have a Russian version of the site! haha
-
Did you see Rand in the latest Whiteboard Friday? http://www.seomoz.org/blog/smx-advanced-andy-atkins-kruger-talks-international-seo-and-yandex-whiteboard-friday
Interesting stuff about Yandex and their Algo.
-
You're definitely right about the discrepancies. I pay so much attention to Google that I had forgotten the other SEs show large discrepancies for many of our key phrases/terms.
My thought on the penalty for a certain phrase was that it had to do with the search volume of that phrase, but that's just speculation. I suppose only time and further SEO work will tell.
-
When did Google state they didn't penalize your site in rankings?
My reply was that a penalty is not for a specific term. If your site received a penalty, your entire site is affected (i.e. all terms). If I were to stretch for a corner case, it would be where your site has a page using a trademarked term such as "herbal viagra", where Google drops "viagra" from the keyword phrase due to Pfizer's trademark.
You can have large discrepancies between Google, Yahoo, and Bing. Each company is independent and uses its own systems. This is even true between Bing and Yahoo, which share a lot of information.
From my own keyword reports I have a site with the following ranks for the same term:
Google - 9, Bing - 14, Yahoo - 3
Google - 20, Bing - not in top 50, Yahoo - 49
There can definitely be major discrepancies between SERPs.
-
When did Google state they didn't penalize your site in rankings? I thought that's been the trend for years now: when you break the webmaster guidelines in a way that isn't severe enough to get banned, but is enough to warrant a penalty.
Sharing the URL of websites I work on seems like a bad idea to me, so I'll keep it to myself for now. The keyword is rated 53% ("highly competitive") by SEOmoz's tool, and our page has been optimized for this keyword for 5+ years now - hence why we were on page 1 from before 2005 through 2011.
Only since we were banned from Google have we been unable to move above the first spot on page 3, despite regaining #3 rankings in Yahoo and Bing. I don't usually see discrepancies that large between the SEs - am I wrong?
My one thought is that this also coincides with the release of Panda, though our site has no duplicate content issues.
-
Google does not penalize a site by devaluing your ranking on a given term. They either remove you from their index completely, or they may devalue links to your site which would affect your domain as a whole.
I would recommend taking a very close look at how competitive this keyword is, how well your target page is optimized for it, and make comparisons with the competing sites who outrank you.
For any further details you would need to share the URL and keyword.
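To make that competitor comparison less subjective, a quick on-page sanity check can be scripted. A rough sketch in Python (the signals and function name are my own invention, not any SEOmoz or Google metric) that checks whether a keyword shows up in the places that usually matter:

```python
import re

def onpage_signals(html: str, keyword: str) -> dict:
    """Crude on-page check: is the keyword in the <title>, in an <h1>,
    and roughly how often does it appear in the page text?"""
    kw = keyword.lower()
    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    h1s = re.findall(r"<h1[^>]*>(.*?)</h1>", html, re.I | re.S)
    # Strip all tags to approximate the visible text of the page.
    text = re.sub(r"<[^>]+>", " ", html).lower()
    return {
        "in_title": bool(title) and kw in title.group(1).lower(),
        "in_h1": any(kw in h.lower() for h in h1s),
        "mentions": text.count(kw),
    }
```

Run it on your target page and on the pages outranking you; this obviously isn't how Google scores anything, but it gives you a like-for-like baseline before digging into links and competitiveness.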
-
We are facing a similar problem. Our pages are regularly crawled by Google, but Google doesn't show them on the first page; some are buried more than 100 pages deep.