How Do You Know or Find Out if You've Been Hit by a Google Penalty?
-
Hi Moz Community,
How do you find out if you have been hit with a Google Penalty?
Thanks,
Gary
-
Hi there,
For a manual penalty, check Search Console under the Manual Actions tab.
For an algorithmic penalty, you may notice a large drop in traffic and rankings, and you may no longer rank even for your brand name.
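If you want to pull the raw traffic data yourself, here's a rough sketch (assuming the google-api-python-client library, existing OAuth credentials, and a verified property; the site URL and dates below are placeholders) that queries the Search Console Search Analytics API for daily clicks so you can see where a drop starts:

```python
# Rough sketch: pull daily clicks from the Search Console Search Analytics API.
# Assumes google-api-python-client is installed and `creds` holds valid OAuth
# credentials for a verified property; the site URL and dates are placeholders.
from googleapiclient.discovery import build

SITE_URL = "https://www.example.com/"  # hypothetical property

def daily_clicks(creds, start_date="2016-01-01", end_date="2016-06-30"):
    service = build("searchconsole", "v1", credentials=creds)
    response = service.searchanalytics().query(
        siteUrl=SITE_URL,
        body={
            "startDate": start_date,
            "endDate": end_date,
            "dimensions": ["date"],
            "rowLimit": 1000,
        },
    ).execute()
    # Each row carries keys=["<date>"] plus clicks, impressions, ctr, position.
    return {row["keys"][0]: row["clicks"] for row in response.get("rows", [])}
```

Plotting that series next to known update dates usually makes an algorithmic drop fairly easy to spot.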
-
Hi Kirsten,
Thanks for sharing the three points. I'll have a look.
Have a great day.
G
-
Hi Deacyde,
Thanks for dropping in. I'll take a look at both sites and see what I can come up with.
It's funny: the more you work in digital marketing, the more you realize how little you know...
It's a fun game, I just want to stay above board.
Gary
-
Not to barge in, but I recently used these two sites, coupled with analytics data, to see if a drop correlated with an algorithm update.
I used http://feinternational.com/website-penalty-indicator/
which overlays Google algorithm updates (color-coded by type) on estimated search traffic going back as far as 2012, so you can see which drop lines up with which update.
I also used http://barracuda.digital/panguin-tool/
which asks for read-only access to your Analytics account; you select the account and the view, and it does the same as above, overlaying the Google algorithm updates to help you figure out which one you were possibly hit by (it also gives further information about what each update was and where to read more about it).
If you'd rather not use those kinds of sites, your analytics data will be your best resource. A big drop after a steady flatline or climb can be an indication of a penalty, but don't just assume: find out as much as you can about the algorithm update you think lines up with your drop, and check whether your site really falls short in the area that update targeted.
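To do that correlation straight from an Analytics export, a minimal sketch (the CSV path, column names, threshold, and update dates are all placeholders you'd swap for your own export and a maintained list of confirmed updates) could look like this:

```python
# Minimal sketch: flag sharp week-over-week traffic drops and note any known
# algorithm update nearby. The CSV path, column names, and update dates are
# placeholders; use your own Analytics export and a maintained update list.
import pandas as pd

ALGO_UPDATES = {              # hypothetical example dates
    "Penguin 2.1": "2013-10-04",
    "Panda 4.1": "2014-09-23",
}

def find_drops(csv_path, drop_threshold=0.30, window_days=7):
    df = pd.read_csv(csv_path, parse_dates=["date"]).set_index("date").sort_index()
    weekly = df["sessions"].resample("W").sum()
    change = weekly.pct_change()
    drops = change[change < -drop_threshold]      # weeks that fell more than 30%
    for week, pct in drops.items():
        nearby = [
            name for name, d in ALGO_UPDATES.items()
            if abs((week - pd.Timestamp(d)).days) <= window_days
        ]
        print(f"{week.date()}: {pct:.0%} drop; possible updates nearby: {nearby or 'none'}")
```

Even then, treat a match as a hypothesis to investigate, not proof of which update hit you.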
Hope this helps!
-
Thanks, Kristen, I appreciate the feedback. What are the top three steps I should take to check the site for an algorithmic penalty?
Thanks,
G
-
It will be listed under Manual Actions in Search Console (Webmaster Tools).
-
Related Questions
-
Ranking Fluctuation on "Canvas Prints" keyword in google.co.uk
Hello Moz, we have been struggling with our "canvas prints" ranking in google.co.uk for the last 2 years. Every time, the page that appears in the SERP changes. I want to rank this URL for this particular keyword: "canvas prints". Can you tell me why my ranking page keeps fluctuating in the SERPs?
-
I have plenty of backlinks but the site does not seem to come up on Google's first page.
My site has been jumping up and down for many months now, but it never stays on Google's first page. I have plenty of backlinks and have shared content on social media. What could I be doing wrong? Any help will be appreciated. The content is legit. I have recently added some internal links; might this be the cause? Please help.
-
Does Google crawl and index dynamic pages?
I've linked a category page (static) to my homepage and linked a product page (dynamic) to the category page. I tried crawling my website from the homepage URL with Screaming Frog, using Googlebot 2.1 as the user agent, and based on the results it can crawl the product page, which is dynamic. Here's a sample product page, which is a dynamic page (we're using product IDs instead of keyword-rich URLs for consistency): http://domain.com/AB1234567. Here's a sample category page: http://domain.com/city/area. Here's my full question: does the spider result (from Screaming Frog) mean Google will properly crawl and index the product pages even though they are dynamic?
-
Partial match penalty & Penguin 2.1 smack
Our site is large and allows business owners to post their inventory for sale. We also make websites for those businesses that post their inventory, and we link back to the home page of our site from each of those business websites using our domain name as the anchor text. Last summer we got a partial match penalty from Google: "Unnatural links to your site—impacts links. Google has detected a pattern of unnatural, artificial, deceptive, or manipulative links pointing to pages on this site. Some links may be outside of the webmaster's control, so for this incident we are taking targeted action on the unnatural links instead of on the site's ranking as a whole."
We investigated and noticed a large number of links from spammy sites, forum signatures, blog comments, etc. We think we were hit by a negative SEO campaign. We started cleaning up the backlinks and disavowing them. Every reconsideration request since has been denied, with more examples of these horrid links. The final reconsideration request gave, as examples of how we're violating Google's link quality guidelines, our own sites that we make for businesses: "Google has received a reconsideration request from a site owner for domainname.com. We've reviewed the links to your site and we still believe that some of them are outside our quality guidelines."
So here's the issue I need your advice on. We have tens of thousands of business websites linking back to our main site using our domain name, and we're assuming this is why Google gave them as examples of violating link quality guidelines. How can we fix this without losing traffic from removing all those backlinks, or making our traffic tank worse than it already has? Can we replace the domain name with our logo image and still link? Can we nofollow all those links? Can we link not to the home page but to internal pages or sections, with no more than 10% of the links pointing to each section? Should we just remove the links and cry?
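As a side note on the disavow step mentioned above, a minimal sketch like the following (the domain list is a made-up placeholder; in practice it comes from a backlink audit) can turn an audited list of spammy referring domains into a disavow file in the format Google expects (one domain: or URL entry per line, with # for comments):

```python
# Minimal sketch: write an audited list of spammy referring domains to a
# disavow file. The domain list is a placeholder; in practice it would come
# from a backlink audit (Search Console link export, Moz Link Explorer, etc.).
spammy_domains = ["spam-forum.example", "link-farm.example"]  # hypothetical

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Domains identified as unnatural in the last backlink audit\n")
    for domain in sorted(set(spammy_domains)):
        f.write(f"domain:{domain}\n")
```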
-
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi all, I'll premise this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :). We are an IT and management training course provider with 34 locations across the US, and each location offers the same courses. Each location has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings and dates for each individual topic and city. Right now, our pages are dynamic and are being crawled and ranking well within Google. We ran a very small-scale test of this in our Washington DC and New York areas with our SharePoint course offerings and it was a great success: we are ranking well on "sharepoint training in new york/dc" etc. for two custom pages. So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain, a lot more than just the two we tested.
Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., but with some varying components. This is from our engineer specifically: "Regarding pages with the specific topic areas, do you have a specific format for the meta description and the custom paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use. For example, we could make the paragraph 'Our [Topic Area] training is easy to find in the [City, State] area,' followed by other content specific to the location. As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build it based on that code, so they aren't technically individual pages even though they appear that way on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics, which could mean over a thousand records to maintain depending on what you want customized. Another option is to have several standardized paragraphs, such as 'Our [Topic Area] training is easy to find in the [City, State] area,' or 'Find your [Topic Area] training course in [City, State] with ease,' each followed by other content specific to the location, and then randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats or text for individual pages."
So, mozzers, my question to you all is: can we standardize with slight variations specific to the location and topic area without getting dinged for spam or duplicate content? Often I ask myself, "If Matt Cutts was standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you!
Pedram
-
How does Google decide what content is "similar" or "duplicate"?
Hello all, I have a massive duplicate content issue at the moment with a load of old employer detail pages on my site. We have 18,000 pages that look like this: http://www.eteach.com/Employer.aspx?EmpNo=26626 and http://www.eteach.com/Employer.aspx?EmpNo=36986, and Google is classing all of these pages as similar content, which may result in a bunch of them being de-indexed. Now, although they all look rubbish, some of them are ranking in search engines, and looking at the traffic on a couple of these, it's clear that people who find them want more information on the school (because everyone seems to click on the local information tab on the page). So I don't want to just get rid of these pages; I want to add content to them. My question is: if I were to make up, say, 5 templates of generic content with different fields replaced by the school's name, location, and headteacher's name so that they vary from other pages, would this be enough for Google to realise that they are not similar pages and stop classing them as duplicates? e.g. "[School name] is a busy and dynamic school led by [headteacher's name], achieving excellence every year from Ofsted. Located in [location], [school name] offers a wide range of experiences both in the classroom and through extra-curricular activities, and we encourage all of our pupils to 'Aim Higher'. We value all our teachers and support staff and work hard to keep [school name]'s reputation to the highest standards." Something like that... Does anyone know if Google would slap me if I did that across 18,000 pages (with 4 other templates to choose from)?
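For illustration, a minimal sketch of the templated-but-varied copy being described (the templates and the sample school record are made-up placeholders, with the template picked deterministically per school so each page keeps stable copy between builds) might look like:

```python
# Minimal sketch: fill one of several copy templates per school so pages vary.
# The templates and the sample school record are made-up placeholders.
import hashlib

TEMPLATES = [
    "{name} is a busy and dynamic school led by {head}, located in {location}.",
    "Located in {location}, {name} is led by {head} and encourages every pupil to aim higher.",
    "{name}, based in {location} and led by {head}, offers a wide range of classroom and extra-curricular experiences.",
]

def school_copy(school):
    # Choose a template deterministically so a page's copy is stable across builds.
    digest = hashlib.md5(school["name"].encode("utf-8")).hexdigest()
    template = TEMPLATES[int(digest, 16) % len(TEMPLATES)]
    return template.format(**school)

print(school_copy({"name": "Example Primary", "head": "Ms. Smith", "location": "Leeds"}))
```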
-
How to recover my site from a -50 penalty
One of my sites was hit after Google confirmed its Panda 3.2 update. The site ranked very well for many high-traffic keywords in my niche, but all of a sudden 80% of the keywords that previously ranked high dropped 50 places in the SERPs. I know it is a -50 penalty, but I do not know how to recover from it. The link-building campaign is almost the same as before, and all of the articles are unique. BTW, I have two image ads in the sidebar and 7 affiliate links at the bottom of the page. Any input will be greatly appreciated!
-
Is it possible that, since the Google Farmer's Update, people practicing Google Bowling can negatively affect your site?
We have hundreds of random bad links, which nobody in our company paid for, that have been added to our sites across the board. Two of our domains have been penalized, and three of our sites have pages that have been penalized. Our sites are established, with quality content; one was built in 2007, the other in 2008, and we pay writers to contribute quality, unique content. We just can't figure out a) why the sites were suddenly pulled out of Google's index after operating well for years, and b) where the spike in links came from. Thanks