Interesting case of IP-wide Google Penalty, what is the most likely cause?
-
Dear SEOMOZ Community,
Our portfolio of around 15 internationalized websites received a significant, seemingly IP-wide, Google penalty starting in November 2010 and has yet to recover from it. We have undertaken many measures to lift the penalty, including reconsideration requests without luck, and I am now hoping the SEOmoz community can give us some further tips.
We are very interested in the community's help and judgment on what else we can try to get the penalty lifted.
As quick background information:
-
The sites in question offer sports results data and are translated into several languages.
-
Each market (i.e., each language) has its own TLD using the central keyword, e.g. <keyword_spanish>.es, <keyword_german>.de, <keyword_us>.com
-
The content is highly targeted to each market, which means there are no duplicate content pages across the domains: all copy is translated, content is reprioritized, etc. However, the core results content in the body of the pages obviously needs to stay about 80% the same.
-
An SEO agency of ours used semi-automated link-building tools in mid-2010 to acquire link partnerships.
-
There are some promotional one-way links to sports-betting and casino sites placed on the pages.
-
The external linking structure of the pages is very keyword- and main-page-focused, i.e. 90% of the external links point to the front page using one particular keyword as anchor text.
-
All sites have strong domain authority and have been run by the same owner for over 5 years.
As mentioned, we have experienced dramatic ranking losses across all our properties starting in November 2010. The applied penalties are indisputable given that rankings for the main keywords in local Google search engines dropped from position 3 to position 350 after the sites had ranked in the top 10 for over 5 years. A screenshot of the ranking history for one particular domain is attached. The same behavior can be observed across domains.
Our questions are:
-
Is there something like an IP-specific Google penalty that can apply to web properties across an entire IP, or can we assume Google just picked all sites registered in Google Webmaster Tools?
-
What is the most likely cause of our penalty given the background information? Given that the drops started as early as November 2010, we doubt that the Panda updates have any correlation to this issue.
-
What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records, etc. Our actions so far have been reducing external links, on-page links, and C-class internal links.
-
Are there any other factors/metrics we should look at to help troubleshoot the penalties?
-
After all this time without resolution, should we move on to new domains and forward all content via 301s to the new pages? Or are there things we need to try first?
Any help is greatly appreciated.
SEOMoz rocks. /T
-
-
Thanks tomypro.
-
Thanks again for your thoughts. This is actually a topic I am very involved with. I work as a Technical Director at a large digital agency, and our SEO team just recommended that a large Fortune 100 customer break their web property out of market folders (.com/de, .com/es, etc.) on the same top-level root domain into separate ccTLDs (.com, .es, .de). According to our SEO team, DA is essentially shared if the same root domain is used. However, local ccTLDs will obviously give you better rankings in the local Google engines.
My thought is that the right approach probably depends on the size of your brand. If it's easy for you to build up DA quickly, local ccTLDs are preferred. If you are a smaller player, you might do better consolidating everything under one umbrella to share DA.
I am actually running an experiment for one of my projects where I am doing the ccTLD breakout for one domain to compare organic search traffic. The benefit of local ccTLDs is that eventually you can tie them to market-local servers, which again boosts SEO in the local markets. This isn't possible with directories.
Do you share my thoughts, Ryan? As I said, this is a very hot topic for me at the moment.
P.S. I will definitely reach out for recommendations - thank you.
-
Atul,
What I mean by it is that all domains hosted under the same server IP (a dedicated root server) have experienced significant ranking drops that seem tied to a global penalty.
However, it is questionable whether Google would consider this a valid approach, given the probability that other domains not associated with the to-be-penalized URL could be hosted under the same IP.
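For what it's worth, a quick sketch like the following (the domain names are placeholders, not our actual portfolio) is how the "same IP" assumption can be double-checked before drawing conclusions:

```python
# Rough sketch: group domains by the server IP they currently resolve to.
# Domain names are placeholders standing in for the real portfolio.
import socket

domains = [
    "keyword1.es",   # placeholder
    "keyword2.de",   # placeholder
    "keyword3.com",  # placeholder
]

ip_map = {}
for domain in domains:
    try:
        ip = socket.gethostbyname(domain)
    except socket.gaierror:
        ip = "unresolved"
    ip_map.setdefault(ip, []).append(domain)

for ip, hosts in ip_map.items():
    print(f"{ip}: {', '.join(hosts)}")
```

If every penalized domain lands in the same bucket, and non-penalized domains on other IPs do not, that at least supports the IP-wide reading.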
-
Ryan, I would like to know what is meant by an IP-specific penalty?
-
Due to the penalties we have been considering moving everything under one umbrella and managing the local sites in directories, e.g. .com/es/keyword1, .com/de/keyword2 - however, until the penalties hit, the separate-domain approach had worked very well for us. Any thoughts?
Generally speaking, you will achieve the best results by consolidating your sites under one domain with a dedicated folder for each country as you described. I would recommend delaying the move until you are sure your sites are not under any penalty.
The advantage you will receive with a single root domain is the consolidation of your Domain Authority. It sounds like your sites were doing well before the penalty. The higher DA can help even further.
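If you do eventually make that move, the mechanics boil down to a one-to-one 301 map from each old ccTLD URL to its folder on the consolidated domain. A minimal sketch of that mapping (the domains, folders, and target domain below are hypothetical examples, not specific recommendations):

```python
# Sketch of a ccTLD-to-folder 301 redirect map for a consolidation move.
# Domains, folder names, and the target domain are hypothetical examples.
from urllib.parse import urlsplit

FOLDER_FOR_DOMAIN = {
    "keyword1.es": "/es",
    "keyword2.de": "/de",
}
TARGET_DOMAIN = "https://www.example.com"  # hypothetical consolidated domain

def redirect_target(old_url: str) -> str:
    """Return the URL an old page should 301 to on the consolidated domain."""
    parts = urlsplit(old_url)
    folder = FOLDER_FOR_DOMAIN.get(parts.netloc, "")
    return f"{TARGET_DOMAIN}{folder}{parts.path or '/'}"

print(redirect_target("http://keyword1.es/results/la-liga"))
# -> https://www.example.com/es/results/la-liga
```

Keeping the map page-to-page, rather than pointing everything at the new home page, preserves as much of the existing equity as possible.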
The internationalized sites are each hosted on a different root domain (keyword1.es, keyword2.de) - are you still confirming that this should not be causing duplicate content penalties?
Correct, as long as the sites are properly set up to target their respective countries. Sites which are dedicated to a specific locale and language would not normally compete in SERPs with other sites that offer similar content in another country and language.
Does your company have that experience and do you provide such services?
While I appreciate the inquiry, my resources have already been dedicated for the remainder of this month. You could take a look at the SEOmoz directory. Please note that anyone can list their company in the directory. A listing is not an endorsement.
If you desire a further recommendation you can send me a message on SEOmoz and I will respond. I can share a few names of SEOs whom I have confidence in based on their Q&A responses, blogs and reputation if that would be helpful.
-
Ryan,
Thank you for your thoughtful answers. A couple of clarifications:
The internationalized sites are each hosted on a different root domain (keyword1.es, keyword2.de) - are you still confirming that this should not be causing duplicate content penalties? Due to the penalties we have been considering moving everything under one umbrella and managing the local sites in directories, e.g. .com/es/keyword1, .com/de/keyword2 - however, until the penalties hit, the separate-domain approach had worked very well for us. Any thoughts?
I should clarify the comment on auto-linkbuilding. The company used LinkAssistant to research potential partners, i.e. a lot of link solicitation emails were sent, but the actual link building was still performed manually and only with legitimate, content-relevant partners.
We are not working with our old SEO agency any longer and have been reaching out to a couple of external SEO resources/experts but have not been presented with a conclusive, convincing concept to resolve the issues. I guess it takes a resource with experience in handling Google penalties to do the job. Does your company have that experience and do you provide such services?
-
Is there something like an IP-specific Google penalty that can apply to web properties across an entire IP, or can we assume Google just picked all sites registered in Google Webmaster Tools?
Think of Google as an intelligent business. They have processes which algorithmically penalize websites. They also have systems which flag sites for manual review. When a penalty is deemed appropriate, it is possible for it to be applied based on any number of factors, such as an IP address, a Google account, a domain, etc. It depends on how widespread a violation has occurred.
What is the most likely cause of our penalty given the background information? Given that the drops started as early as November 2010, we doubt that the Panda updates have any correlation to this issue.
You mentioned a few points which can potentially lead to a penalty. I am not clear from your post, but it sounds like you may be linking to casino and gambling sites. While those sites may be legitimate, many have a reputation for using black hat SEO techniques.
If you want to remove a penalty, be certain that you do not provide a followed link to any questionable site. When you provide a followed link to a site, you are basically saying "I trust this site. It is a good site and I endorse it". If you are found to offer a link to a "bad" site, your site can be penalized.
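One practical way to audit this is to crawl your own templates and list every external link that is not nofollowed. A minimal sketch, assuming the requests and BeautifulSoup libraries are available and with a placeholder page URL:

```python
# Sketch: list followed (non-nofollow) external links on one of your pages.
# The page URL is a placeholder; run it against each template of the site.
from urllib.parse import urljoin, urlsplit

import requests
from bs4 import BeautifulSoup

PAGE = "http://keyword1.es/"  # placeholder

html = requests.get(PAGE, timeout=10).text
soup = BeautifulSoup(html, "html.parser")
own_host = urlsplit(PAGE).netloc

for a in soup.find_all("a", href=True):
    rel = [r.lower() for r in (a.get("rel") or [])]
    target = urljoin(PAGE, a["href"])
    host = urlsplit(target).netloc
    if host and host != own_host and "nofollow" not in rel:
        print(f"Followed external link: {target} (anchor: {a.get_text(strip=True)!r})")
```

Any questionable site that shows up in that list should either be removed or nofollowed before the next reconsideration request.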
What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records, etc. Our actions so far have been reducing external links, on-page links, and C-class internal links.
Hire a professional SEO to review your site. You want to review every page to ensure your site is within Google's guidelines. I am highly concerned about your site's links to external sites. I am also concerned about the automated link building that your current SEO has been doing. A professional SEO company should not lead your site to incur a penalty. I am having difficulty understanding how this happened in the first place, how it has not been fixed in almost a year, and how this SEO company is building links for you. Frankly, it's time to consider a new SEO company.
Translating content to other languages is fine. You can take the exact same article and offer a translated version for each language, and even country. For example, you can offer a Spanish version for your Spain site and a different Spanish version for your Mexico site. As long as these sites are targeting specific countries, there are no duplicate content issues.
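As a side note, one way of explicitly signaling those country/language alternates to Google is hreflang annotations on each version. A tiny sketch (the domains and locale codes are placeholders) that prints the tags each version could carry:

```python
# Sketch: print hreflang alternate tags so each country/language version
# references its siblings. Domains and locale codes are placeholder examples.
ALTERNATES = {
    "es-es": "http://keyword1.es/",
    "de-de": "http://keyword2.de/",
    "en-us": "http://keyword3.com/",
}

for locale, url in ALTERNATES.items():
    print(f'<link rel="alternate" hreflang="{locale}" href="{url}" />')
```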
After all this time without resolution, should we move on to new domains and forward all content via 301s to the new pages? Or are there things we need to try first?
The penalty would follow you to your new domain.
The external linking structure of the pages is very keyword- and main-page-focused, i.e. 90% of the external links point to the front page using one particular keyword as anchor text.
Not good at all.
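To put a number on that concentration, a rough sketch over whatever backlink export you have on hand could tally anchor texts and link targets (the CSV filename and column names below are assumptions about the export format):

```python
# Sketch: measure how concentrated anchor text and link targets are in a
# backlink export. The CSV filename and column names are assumptions.
import csv
from collections import Counter

anchors = Counter()
targets = Counter()

with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchors[row["anchor_text"].strip().lower()] += 1
        targets[row["target_url"].strip()] += 1

total = sum(anchors.values()) or 1
print("Top anchor texts:")
for anchor, count in anchors.most_common(5):
    print(f"  {anchor!r}: {count} ({count / total:.0%})")
print("Top link targets:")
for url, count in targets.most_common(5):
    print(f"  {url}: {count}")
```

A natural profile shows a spread of branded, URL, and generic anchors; if one commercial keyword pointing at the home page dominates the output, that is exactly the footprint you want to dilute.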
Summary: your site needs a careful review by an SEO professional who adheres to white hat techniques. Every day your site is penalized you are losing traffic and money. The cost you pay to fix this issue may be extremely small in comparison to the amount of revenue you have lost.