Determining the Cause of a Penalty
-
I received a link removal request from a site whose owner said they had been penalized. I confirmed that they had been #1 for the competitive keyword phrase that is also their domain name, and they are now #10.
Here are some things I noticed about the site:
- Over 2,500 linking domains.
- Dozens of high-quality linking domains like Huffington Post and Mashable.
- Some off-topic guest post links, e.g., on an SEO site.
- Guest post anchor text was usually their site name, which is an exact-match domain.
- Lots of "top 100" resource pages that earned good organic links.
- Infographics with links using their domain name as the anchor text.
- Relatively few spammy links according to Open Site Explorer.
Overall, their site's links were engineered, but using tactics that most would consider "white hat." I don't think they violated any Google Webmaster Guidelines. Why were they penalized?
What do you think?
-
Hi Project Labs,
In order to better understand the scenario, we would want to know whether they received a link warning within Google Webmaster Tools or just "assessed" themselves as having a penalty based on steep ranking drops.
The easiest way to answer this is to comment on your points:
-
Over 2,500 linking domains.
It is very difficult for even medium-sized businesses to attract 2,500 unique linking domains, so this number alone might have raised a flag with Google, especially if it deviates from the aggregate "norm" for this niche / keyword set.
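As a quick sanity check, you could compare their linking-domain count against the sites ranking for the same keyword set. Here is a minimal Python sketch, assuming you have pulled linking-domain totals for the top competitors from a tool like Open Site Explorer (the counts below are hypothetical placeholders):

```python
# Compare a site's linking-domain count against the niche "norm"
# using a simple z-score. All counts are hypothetical placeholders.
from statistics import mean, stdev

competitor_domain_counts = [310, 540, 420, 880, 600, 450, 720, 390]
site_domain_count = 2500

avg = mean(competitor_domain_counts)
sd = stdev(competitor_domain_counts)
z_score = (site_domain_count - avg) / sd

print(f"Niche average: {avg:.0f} linking domains (std dev {sd:.0f})")
print(f"This site: {site_domain_count} (z-score {z_score:.1f})")
if z_score > 2:
    print("Well outside the niche norm -- worth a closer look.")
```

A high z-score does not prove a penalty, of course; it just means the profile is an outlier for the niche.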
-
Dozens of high-quality linking domains like Huffington Post and Mashable.
This helps authority, but it can quickly be trumped by negative factors.
-
Some off-topic guest post links, e.g., on an SEO site.
I would recommend they contact those webmasters and have these links removed.
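If any of those webmasters do not respond, the leftover links can be disavowed instead. A disavow file is just plain text, one URL or domain per line, with "#" comments; for example (domains hypothetical):

```text
# Off-topic guest post links we could not get removed
domain:some-seo-guestpost-site.example
https://another-seo-blog.example/off-topic-guest-post/
```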
-
Guest post anchor text was usually their site name, which is an exact-match domain.
EMD is a more difficult issue now, since exact-match domains no longer rank as well by default.
Because people like to link to a site by name, an EMD means even natural links carry exact-match anchor text, which can become a serious problem. Is this website owned by a business that has a unique name?
Example: Cheesepizza.com, owned by Cheesy Pizza Dynasty.
If this is the case, the owner should switch to building links with the unique brand name. If they do not currently have one, there is real value in creating a unique name and building brand signals for the business.
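One way to see how skewed the profile is: bucket every backlink's anchor text into exact-match, brand, URL, and other. A rough sketch, assuming a CSV backlink export with an "anchor" column, e.g., from Open Site Explorer (the file name, keyword, and brand names are hypothetical):

```python
# Bucket backlink anchor text into exact-match, brand, URL, and other.
# The domain, keyword, and brand below are hypothetical examples.
import csv
from collections import Counter

DOMAIN = "cheesepizza.com"            # the exact-match domain
EXACT_MATCH = "cheese pizza"          # the EMD keyword phrase
BRAND = "cheesy pizza dynasty"        # the unique business name

def classify(anchor: str) -> str:
    a = anchor.lower().strip()
    if DOMAIN in a:
        return "url"
    if BRAND in a:
        return "brand"
    if EXACT_MATCH in a:
        return "exact-match"
    return "other"

counts = Counter()
with open("backlinks.csv", newline="") as f:  # hypothetical export
    for row in csv.DictReader(f):
        counts[classify(row["anchor"])] += 1

total = sum(counts.values())
for bucket, n in counts.most_common():
    print(f"{bucket}: {n} ({n / total:.0%})")
```

If exact-match anchors dominate, that supports shifting new links toward the brand name.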
-
Lots of "top 100" resource pages that earned good organic links.
Excellent. They would want to check those pages for relevancy and co-citation (the other websites those pages link to, and those sites' main topics).
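A co-citation check can be scripted: fetch each page that links to you and count which other domains it also links to. A minimal sketch, assuming you have the source URLs of your backlinks from an export (requires the requests and beautifulsoup4 packages; the URLs and domain are hypothetical):

```python
# Co-citation check: which other domains do the pages linking
# to you also link to? URLs and domain are hypothetical.
from collections import Counter
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

MY_DOMAIN = "cheesepizza.com"
linking_pages = [
    "https://example-blog.com/top-100-pizza-resources",
    "https://another-example.org/best-food-sites",
]

def host(url: str) -> str:
    """Bare hostname with any leading www dropped."""
    return urlparse(url).netloc.lower().removeprefix("www.")

co_cited = Counter()
for page in linking_pages:
    try:
        html = requests.get(page, timeout=10).text
    except requests.RequestException:
        continue  # skip pages we cannot fetch
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        domain = host(a["href"])
        # Count only external domains other than the site itself.
        if domain and domain not in (MY_DOMAIN, host(page)):
            co_cited[domain] += 1

print("Domains most often co-cited alongside yours:")
for domain, n in co_cited.most_common(10):
    print(f"  {domain}: {n}")
```

If the most co-cited domains are topically relevant, those resource pages are probably helping rather than hurting.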
-
Infographics with links using their domain name as the anchor text.
I would replace those EMD anchor texts with URL-only anchors (i.e., Alloneword.com) or with the unique brand / business name option discussed above.
-
Relatively few spammy links according to Open Site Explorer.
Overall, their site's links were engineered, but using tactics that most would consider "white hat." I don't think they violated any Google Webmaster Guidelines. Why were they penalized?
Anything engineered will be, or already has been, penalized or demoted, and I wouldn't expect this to change.
Internal links and other areas of SEO / trust signals aside, I would recommend studying the top 20 sites in this niche and thoroughly analyzing their backlinks:
1. Rate of link acquisition.
2. Exact-match vs. commercial anchors vs. brand name vs. branded variations vs. other.
3. Percentage of "engineered" links vs. natural, editorial links.
4. Union of links the competitors in the top 5 Google positions share (a rough sketch of this check follows below).

Hope this helps!
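For point 4, here is a minimal sketch, assuming one CSV export of linking domains per top-5 competitor (the file names and "domain" column are assumptions):

```python
# Find the linking domains shared by the competitors holding the
# top 5 positions. File names and column name are hypothetical.
import csv

def linking_domains(path: str) -> set[str]:
    with open(path, newline="") as f:
        return {row["domain"].lower() for row in csv.DictReader(f)}

competitor_files = [f"competitor_{i}.csv" for i in range(1, 6)]
link_sets = [linking_domains(p) for p in competitor_files]

shared_by_all = set.intersection(*link_sets)  # domains every top-5 site has
union_of_all = set.union(*link_sets)          # the full top-5 link pool

print(f"{len(shared_by_all)} domains link to all five competitors:")
for domain in sorted(shared_by_all)[:20]:
    print(f"  {domain}")
print(f"Total unique domains across the top 5: {len(union_of_all)}")
```

The domains that link to all five tend to define the "norm" for the niche and make good outreach targets.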