How Do You Know or Find Out if You've Been Hit by a Google Penalty?
-
Hi Moz Community,
How do you find out if you have been hit with a Google Penalty?
Thanks,
Gary
-
Hi there,
For a manual penalty, check Google Search Console under the Manual Actions tab.
For an algorithmic penalty, you may notice a huge drop in traffic and rankings, and you may even stop ranking for your own brand name.
-
Hi Kirsten,
Thanks for sharing the three points. I'll have a look.
Have a great day.
G
-
Hi Deacyde,
Thanks for dropping in. I'll take a look at both sites and see what I can come up with.
It's funny: the more you work in digital marketing, the more you realize how little you know...
Fun game, just want to stay above board.
Gary
-
Not to barge in, but I recently used these two sites, coupled with analytics data, to see if a drop correlated with an algo update.
I used http://feinternational.com/website-penalty-indicator/
This tool overlays estimated search traffic (going back as far as 2012) with Google algorithm updates, color-coded by type of algorithm, so you can see which drop lines up with which update.
I also used http://barracuda.digital/panguin-tool/
This one asks to link to your analytics account (read-only); you select the account and the view, and it does the same as above, overlaying the Google algorithm updates to help you figure out which algorithm may have hit you. It also gives further info about what each algorithm targeted and where to read more about it.
If you don't want to use those kinds of sites, your analytics data will be your best resource. A big drop after a steady flatline or increase can be an indication of a penalty, but don't just assume: find out as much as you can about the algo update you think coincides with your drop, and check whether your site really is weak in the area that update targeted.
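If you'd rather run the correlation yourself, here's a minimal sketch of that idea in Python. It assumes you've exported daily sessions to a CSV with date and sessions columns; the two update dates listed are just well-known Penguin releases, and in practice you'd swap in a fuller change history:

```python
import csv
from datetime import date

# Known algorithm update dates to check against; extend this from a
# maintained change-history list.
ALGO_UPDATES = {
    date(2012, 4, 24): "Penguin 1.0",
    date(2013, 5, 22): "Penguin 2.0",
}

def load_daily_sessions(path):
    """Read a CSV exported from analytics with 'date' and 'sessions' columns."""
    sessions = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            sessions[date.fromisoformat(row["date"])] = int(row["sessions"])
    return sessions

def flag_drops(sessions, window=7, threshold=0.3):
    """Flag days where the trailing 7-day average fell more than 30%
    versus the preceding 7-day average, and note nearby algo updates."""
    days = sorted(sessions)
    for i in range(2 * window, len(days)):
        prev = sum(sessions[d] for d in days[i - 2 * window:i - window]) / window
        curr = sum(sessions[d] for d in days[i - window:i]) / window
        if prev > 0 and (prev - curr) / prev > threshold:
            day = days[i]
            nearby = [name for d, name in ALGO_UPDATES.items()
                      if abs((day - d).days) <= window]
            print(f"{day}: dropped {100 * (prev - curr) / prev:.0f}%"
                  + (f" near {', '.join(nearby)}" if nearby else ""))

if __name__ == "__main__":
    flag_drops(load_daily_sessions("sessions.csv"))
```

A drop that lines up with an update date is only a hint, not proof; it still needs the manual digging described above.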
Hope this helps!
-
Thanks, Kristen. I appreciate the feedback. What are the top three steps I should take to check the site for an algorithmic penalty?
Thanks,
G
-
A manual penalty will be listed under Manual Actions in Google Search Console (formerly Webmaster Tools).
Related Questions
-
Trying to escape from a Google algorithmic ranking drop
In 2010 our website ranked number 1 for many keywords. A few years ago we suddenly saw this crash, and we have since identified that we were hit by many shades of the Panda and Penguin updates, mainly due to low-quality backlinks and poor content (some duplicates).

Since then we have done a major overhaul of our backlink profile. Rankings that fell from number 1 to positions 60-70 for many keywords have recovered to around 11-18. We have also addressed our duplicate content issues: we removed all duplicate content, introduced a blog with fresh twice-daily updates in an attempt to gain traffic, and amalgamated many small, low-quality pages into larger, higher-quality content pages. We are now mobile-friendly with a dynamic site, our site speed is good (around 80), we have switched to HTTPS, and we have also upgraded the website for better conversions. We have looked at the technical side of the site and don't have many major issues, although we do see 404s in Google Webmaster Tools for old pages we removed because of duplicate content. We are link building at a pace of around 40 mentions a month: some nofollow, some dofollow, and some unlinked mentions. We are diversifying links to include branding in addition to target keywords.

We have pretty much exhausted every avenue we can think of, but we cannot jump over to page 1 for any significant keywords we are targeting. Our competitors' websites are not that powerful, and their metrics are similar to ours, if not lower.

1. Can you advise on anything else we should look at?
2. We are even considering moving to a new domain and 301'ing all pages to it in an attempt to shake off the algorithmic filter (penalties). Has anyone done this? How long should we expect before the new domain reaches at least our current rankings if we 301 all URLs to it? Do you think it's worth it? We know the risks of doing this, which is why we wanted to seek some advice.
3. On the other hand, we have considered that disavowing so many links (70%) could itself be the cause of the page-two problem; however, we are link building according to Moz and Majestic metric standards with no benefit. Do you think we should increase link building?

Advice is appreciated!
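On point 2 above, the catch-all 301 would normally live in the web server (Apache/nginx) config, but as a rough sketch of the intended behavior, here is a minimal Python/Flask illustration; the new domain name is a hypothetical placeholder:

```python
from flask import Flask, redirect, request

app = Flask(__name__)

# Hypothetical new domain; in practice this rule usually belongs in the
# web server configuration rather than in application code.
NEW_DOMAIN = "https://new-domain.example"

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def catch_all_301(path):
    """Permanently redirect every old URL to the same path on the new
    domain, preserving the query string so deep links keep working."""
    target = f"{NEW_DOMAIN}/{path}"
    if request.query_string:
        target += "?" + request.query_string.decode("utf-8")
    return redirect(target, code=301)
```

The details that matter are the 301 status (a permanent move that passes link equity) and the page-to-page mapping: redirecting each old URL to its equivalent path, rather than everything to the homepage, is what gives the new domain a chance to inherit individual page rankings.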
White Hat / Black Hat SEO | Direct_Ram
-
Controlling crawl speed/delay through dynamic server-code and 503's
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst offender (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the performance of the others.

The problem is that I want 1) a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes total server load into account instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for a single bot. IMO, user traffic should always be prioritized above bot traffic.

I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) or 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. So no solution covers all three of my problems.

Now I've come up with a custom-coded solution that dynamically serves 503 HTTP status codes to a certain portion of the bot traffic. What portion, and for which bots, can be calculated dynamically (at runtime) from the total server load at that moment. So if a bot makes too many requests within a certain period (or breaks whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200.

The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will add indexing latency, but slow server response times also have a negative impact on rankings, which is even worse than indexing latency. I'm curious about your expert opinions...
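As a rough illustration of the kind of logic described above (not the poster's actual implementation), here is a minimal Flask sketch that serves 503s to known crawlers when the one-minute load average crosses a threshold; the bot list and threshold are assumptions to tune for your own server:

```python
import os

from flask import Flask, request

app = Flask(__name__)

# Assumed crawler user-agent substrings and load threshold.
BOT_SUBSTRINGS = ("bingbot", "ahrefsbot", "googlebot")
LOAD_THRESHOLD = 4.0  # 1-minute load average above which bots are throttled

def is_bot(user_agent: str) -> bool:
    """Crude user-agent sniffing; real crawlers can also be verified via reverse DNS."""
    ua = user_agent.lower()
    return any(bot in ua for bot in BOT_SUBSTRINGS)

@app.before_request
def throttle_bots_under_load():
    # os.getloadavg() is available on Unix-like systems only.
    load_1min, _, _ = os.getloadavg()
    ua = request.headers.get("User-Agent", "")
    if is_bot(ua) and load_1min > LOAD_THRESHOLD:
        # 503 + Retry-After asks well-behaved crawlers to come back later;
        # human visitors never hit this branch, so they keep priority.
        return ("Service temporarily unavailable", 503, {"Retry-After": "120"})

@app.route("/")
def index():
    return "Normal content for users and unthrottled crawlers."
```

Note that Google treats a 503 as a temporary condition: well-behaved crawlers back off and retry, but URLs that return 503 for an extended period can eventually be dropped from the index, so a throttle like this should only trigger during genuine load spikes.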
White Hat / Black Hat SEO | internetwerkNU
-
OSE report doesn't quite reflect the facts for me?
Hope someone could give me some insight if possible. We do purely white-hat SEO, and for a popular keyword that we have worked hard on for years we now rank 10th. I compared us on OSE with a few competitors who rank better (1st and 3rd) and found things confusing. We are way ahead of them on the following metrics:

- Domain Authority
- Page Authority
- Just-Discovered links
- Root domains
- Total links
- Social likes/shares

All of our scores on the above are substantially higher than the competitors'. One of the competitors has only one thing better than us: Internal Equity-Passing Links. On top of that, both competitors have lots of low-quality links, such as:

- forum signature anchor-text links from accounts that contribute nothing to the forum
- low-authority directory links, many of them overseas and not industry-specific
- links from article sites
- links from sites in totally different industries

We have only very few or none of the above. If the OSE metrics don't count, what else should I be looking at? Any advice? Please forgive me if I chose the wrong support question type.
White Hat / Black Hat SEO | LauraHT
-
How will Google deal with the crosslinks for my multiple-domain site?
Hi, I can't find any good answer to this question, so I thought: why not ask Moz.com? I have a site, let's call it webshop.xx, for a few languages/markets: German, Dutch & Belgian, English, and French. I use a different TLD with a different IP for each of these languages, so I'll end up with webshop.de, webshop.nl, webshop.be, webshop.co.uk, webshop.com & webshop.fr.

They all link to each other, and every subpage that is translated from another site gets a link from the other languages as well, so webshop.com/stuff links to webshop.de/stuff. My main website, webshop.com, gets links from every other of these domains, which Open Site Explorer as well as Majestic SEO sees as external links (this is happening).

My question: how will Google deal with the crosslinks coming from these domains in the long run? Some guesses I made:

- I get full external-link juice (the content is translated, so unique?)
- I get a bit of the juice of an external link
- they are actually seen as internal links
- I'll get a penalty

Thanks in advance, guys!
White Hat / Black Hat SEO | pimarketing
-
Who's still being outranked by spam?
Over the past few months, through Google Alerts, I've been watching one of our competitors kick out crap press releases, and links to their site have been popping up all over blog networks with exact-match anchor text. They now outrank us for that anchor text. Why is this still happening? Three Penguin updates later, and this still happens. I'm trying so hard to do #RCS and acquire links that will ensure our site's long-term health in the SERPs. Is anyone else still struggling with this crap?
White Hat / Black Hat SEO | UnderRugSwept
-
Will my association's network of sites get penalized for link farming?
Before beginning, I found these similar topics here:
http://www.seomoz.org/q/multiple-domains-on-same-ip-address-same-niche-but-different-locations
http://www.seomoz.org/q/multiple-domains-on-1-ip-address

We manage over two dozen dental sites that are individually owned throughout the US. All these dentists belong to a dental association which we also run and are featured on (http://www.acedentalresource.com/). Part of the association's core mission is sharing information to make them better dentists and to help their patients, which, in addition to their education, is why they are considered some of the best dentists in the world. As such, we build links between the sites from what we consider to be valuable content. Some sites are on different IPs and C-blocks; some are not. Given that each site promotes only the dentist at that brick-and-mortar location but also has followed links to other dentists' content in the network, we fear we are in the grey area of link-building practices.

Our questions are:

- Is there an effective way to utilize the power of the network if quality content is being shared?
- What risks are we facing given our network?
- Should each site be on a different IP?
- Would having some of our sites on different servers make our backlinks more valuable than having all of our sites on the same server?
- If unique IPs turn out to be best practice, would it be obvious that we made the switch?

Keep in mind that ALL sites are involved in the association, so naturally they would link to each other and to the main resource website mentioned above. Thanks for your input!
White Hat / Black Hat SEO | DigitalElevator
-
Is this Penguin or a Manual Penalty?
I have a client whose traffic dropped off on April 10th. They did get a message in GWT on March 21st. The April 10th date leads me to believe that it is a manual penalty and couldn't be Penguin, since Penguin wasn't released until April 24th. I guess either way the backlinks need to be cleaned up.
White Hat / Black Hat SEO | RonMedlin
-
Why is Google not punishing paid links as it says it will?
I've recently started working with a travel company, and I'm finding the general link-building side of the business quite difficult. I had a call from an SEO firm the other day offering their services and stating that they had worked with a competitor of ours and delivered some very good results. I checked the competitor's rankings, PR, and link profile, and indeed the results were quite impressive. However, the link profile pointed to one thing that was incredibly obvious: they had purchased a large number of sidebar text links from powerful blogs in the travel sector. It's painfully obvious what has happened, yet they still rank very highly for a lot of key terms. Why doesn't Google do something about this? They aren't the only company in this sector doing it, but it seems pointless for white hats trying to do things properly when those with dollars in their pockets just buy success in the SERPs. Thanks
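As an aside, the exact-match pattern described above is easy to spot programmatically. Below is a minimal sketch, assuming a backlink export in CSV form; the source_domain/anchor_text column names and the 20% threshold are assumptions, not any particular tool's format:

```python
import csv
from collections import Counter

def anchor_distribution(path):
    """Count how often each anchor text appears across distinct linking domains."""
    anchors = Counter()
    seen = set()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = (row["source_domain"], row["anchor_text"].strip().lower())
            if key not in seen:  # count each domain/anchor pair only once
                seen.add(key)
                anchors[key[1]] += 1
    return anchors

def flag_exact_match(anchors, share=0.2):
    """Flag anchors holding an unnaturally large share of the link profile."""
    total = sum(anchors.values())
    for anchor, count in anchors.most_common(10):
        if total and count / total > share:
            print(f"suspicious: '{anchor}' in {count}/{total} domain/anchor pairs")

if __name__ == "__main__":
    flag_exact_match(anchor_distribution("backlinks.csv"))
```

A natural profile is dominated by branded and URL anchors, so a single commercial phrase holding a large share is the kind of footprint described in the post.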
White Hat / Black Hat SEO | neilpage123