Interesting case of IP-wide Google Penalty, what is the most likely cause?
-
Dear SEOMOZ Community,
Our portfolio of around 15 internationalized web sites received a significant, seemingly IP-wide, Google penalty starting in November 2010 and has yet to recover from it. We have undertaken many measures to lift the penalty, including reconsideration requests, without luck, and I am now hoping the SEOmoz community can give us some further tips.
We would greatly appreciate the community's help and judgment on what else we can try to lift the penalty.
As quick background information:
- The sites in question offer sports results data and are translated into several languages.
- Each market (i.e. language) has its own ccTLD domain using the central keyword, e.g. <keyword_spanish>.es, <keyword_german>.de, <keyword_us>.com.
- The content is highly targeted to each market, which means there are no duplicate content pages across the domains; all copy is translated, content is reprioritized, etc. However, the core results content in the body of the pages obviously needs to stay about 80% the same.
- An SEO agency of ours used semi-automated link-building tools in mid-2010 to acquire link partnerships.
- There are some promotional one-way links to sports-betting and casino sites positioned on the pages.
- The external linking structure of the sites is very keyword- and main-page-focused, i.e. 90% of the external links point to the front page with one particular keyword.
- All sites have strong domain authority and have been running under the same owner for over 5 years.
As mentioned, we have experienced dramatic ranking losses across all our properties starting in November 2010. The applied penalties are indisputable given that rankings for the main keywords in local Google search engines dropped from position 3 to position 350 after the sites had been ranked in the top 10 for over 5 years. A screenshot of the ranking history for one particular domain is attached. The same behavior can be observed across domains.
Our questions are:
- Is there something like an IP-specific Google penalty that can apply to web properties across an IP, or can we assume Google just picked all pages registered in Google Webmaster Tools?
- What is the most likely cause for our penalty given the background information? Given that the drops started as early as November 2010, we doubt that the Panda updates had any correlation to this issue.
- What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records. Our actions so far were reducing external links, on-page links, and C-class internal links.
- Are there any other factors/metrics we should look at to help troubleshoot the penalties?
- After all this time without resolution, should we move on to new domains and forward all content via 301s to the new pages? Are there things we need to try first?
Any help is greatly appreciated.
SEOMoz rocks. /T
-
-
Thanks tomypro.
-
Generally speaking, you will achieve the best results by consolidating your sites under one domain with a dedicated folder for each country as you described. I would recommend delaying the move until you are sure your sites are not under any penalty.
The advantage you will receive with a single root domain is the consolidation of your Domain Authority. It sounds like your sites were doing well before the penalty. The higher DA can help even further.
Thanks again for your thoughts. This is actually a topic I am very involved with. I work as a Technical Director at a large digital agency, and our SEO team just recommended that a large Fortune 100 customer break their web property out of market subdirectories on the same top-level root domain (.com/de, .com/es, etc.) into ccTLDs (.com, .es, .de). According to our SEO team, DA is in effect shared if the same root domain is used. However, local ccTLDs will obviously give you better rankings in local Google engines.
My thoughts are that the right approach probably depends on the size of your brand. If it's easy for you to quickly build up DA, local ccTLDs are preferred. If you are a smaller player, you might do better consolidating everything under one umbrella to share DA.
I am actually running an experiment for one of my projects where I am doing the ccTLD breakout for one domain to compare organic search traffic. The benefit with local ccTLDs is that eventually you can tie them to market-local servers, which again boosts SEO in local markets. This isn't possible with directories.
Do you share my thoughts, Ryan? As said, this is a very hot topic for me at the moment.
P.S. I will definitely reach out for recommendations - thank you.
-
Atul,
What I mean by it is that all domains hosted under the same server IP (=dedicated root server) have experienced significant ranking drops that seem tied to a global penalty.
However it is questionable if Google would be considering this a valid approach given the probability that other domains could be hosted under the same IP that are not associated with the to-be-penalized URL.
-
Ryan, I would like to know what is meant by an IP-specific penalty?
-
Due to the penalties we have been considering moving everything under one umbrella and managing the local sites in directories, e.g. .com/es/keyword1, .com/de/keyword2 - however, until the penalties hit, the current URL approach had worked very well for us. Any thoughts?
Generally speaking, you will achieve the best results by consolidating your sites under one domain with a dedicated folder for each country as you described. I would recommend delaying the move until you are sure your sites are not under any penalty.
The advantage you will receive with a single root domain is the consolidation of your Domain Authority. It sounds like your sites were doing well before the penalty. The higher DA can help even further.
The internationalized sites are each hosted on a different root domain (keyword1.es, keyword2.de) - are you still confirming that this should not be causing duplicate content penalties?
Correct, as long as the sites are properly set up to target their target countries. Sites which are dedicated to a specific locale and language would not normally compete in SERPs with other sites that offer similar content in another country and language.
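One concrete way to declare the country/language targeting Ryan describes is a rel="alternate" hreflang annotation for each localized edition of a page (this mechanism is not mentioned in the thread itself, and the domains below are placeholders). A minimal sketch that generates the tags for a page's head section:

```python
# Illustrative sketch: build rel="alternate" hreflang <link> tags that tell
# search engines which locale each edition of a page targets.
# All domain names here are made-up placeholders.

EDITIONS = {
    "es-ES": "http://keyword-spanish.example.es/",
    "de-DE": "http://keyword-german.example.de/",
    "en-US": "http://keyword-us.example.com/",
}

def hreflang_tags(editions):
    """Return one <link> tag per locale edition, sorted for stable output."""
    return "\n".join(
        '<link rel="alternate" hreflang="%s" href="%s" />' % (locale, url)
        for locale, url in sorted(editions.items())
    )

print(hreflang_tags(EDITIONS))
```

Each edition's head would carry the full set of tags, so the alternates reference one another symmetrically.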
Does your company have that experience and do you provide such services?
While I appreciate the inquiry, my resources are already dedicated for the remainder of this month. You could take a look at the SEOmoz directory. Please note that anyone can list their company in the directory. A listing is not an endorsement.
If you desire a further recommendation you can send me a message on SEOmoz and I will respond. I can share a few names of SEOs whom I have confidence in based on their Q&A responses, blogs and reputation if that would be helpful.
-
Ryan,
Thank you for your thoughtful answers. Couple of clarifications:
The internationalized sites are each hosted on a different root domain (keyword1.es, keyword2.de) - are you still confirming that this should not be causing duplicate content penalties? Due to the penalties we have been considering moving everything under one umbrella and managing the local sites in directories, e.g. .com/es/keyword1, .com/de/keyword2 - however, until the penalties hit, the current URL approach had worked very well for us. Any thoughts?
I should clarify the comment on auto link building. The company used LinkAssistant to research potential partners, i.e. a lot of link-solicitation emails were sent, but the actual link building was still performed manually and only with legitimate, content-relevant partners.
We are not working with our old SEO agency any longer and have been reaching out to a couple of external SEO resources/experts but have not been presented with a conclusive, convincing concept to resolve the issues. I guess it takes a resource with experience in handling Google penalties to do the job. Does your company have that experience and do you provide such services?
-
Is there something like an IP-specific Google penalty that can apply to web properties across an IP, or can we assume Google just picked all pages registered in Google Webmaster Tools?
Think of Google as an intelligent business. They have processes which algorithmically penalize websites. They also have systems which flag sites for manual review. When a penalty is deemed appropriate it is possible for it to be applied on any number of factors such as an IP address, a Google account, a domain, etc. It depends on how widespread of a violation has occurred.
What is the most likely cause for our penalty given the background information? Given that the drops started as early as November 2010, we doubt that the Panda updates had any correlation to this issue.
You mentioned a few points which can potentially lead to a penalty. I am not clear from your post, but it sounds like you may be linking to casino and gambling sites. While those sites may be legitimate, many have a reputation for using black hat SEO techniques.
If you want to remove a penalty, be certain that you do not provide a followed link to any questionable site. When you provide a followed link to a site, you are basically saying "I trust this site. It is a good site and I endorse it". If you are found to offer a link to a "bad" site, your site can be penalized.
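As a quick way to audit this, a page's outbound links can be scanned for anchors that point off-site and lack rel="nofollow". A minimal sketch using only the Python standard library (the class name, sample markup, and domains are invented for illustration):

```python
# Hypothetical sketch: flag followed external links on a page.
# Followed = no rel="nofollow", so link equity (and trust) is passed on.
from html.parser import HTMLParser
from urllib.parse import urlparse

class FollowedLinkAuditor(HTMLParser):
    """Collects href targets of <a> tags that are external and not nofollowed."""
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.followed_external = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        rels = (attrs.get("rel") or "").lower().split()
        host = urlparse(href).netloc
        # External target with no nofollow: this link endorses the other site.
        if host and host != self.own_host and "nofollow" not in rels:
            self.followed_external.append(href)

page = '''
<a href="http://example-casino.test/">play now</a>
<a rel="nofollow" href="http://example-betting.test/">odds</a>
<a href="/results/today">results</a>
'''
auditor = FollowedLinkAuditor("www.example-sports.test")
auditor.feed(page)
print(auditor.followed_external)  # only the casino link is external and followed
```

Running something like this against every template on the penalized sites would surface exactly the promotional casino/betting links Ryan is concerned about.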
What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records. Our actions so far were reducing external links, on-page links, and C-class internal links.
Hire a professional SEO to review your site. You want to review every page to ensure your site is within Google's guidelines. I am highly concerned about your site's links to external sites. I am also concerned about the automated link building that your current SEO has been doing. A professional SEO company should not lead your site to incur a penalty. I am having difficulty understanding how this happened in the first place, how it has not been fixed in almost a year, and how this SEO company is building links for you. Frankly, it's time to consider a new SEO company.
Translating content to other languages is fine. You can take the exact same article and offer a translated version for each language, and even each country. For example, you can offer one Spanish version for your Spain site and a different Spanish version for your Mexico site. As long as these sites are targeting specific countries, there are no duplicate content issues.
After all this time without resolution, should we move on to new domains and forward all content via 301s to the new pages? Are there things we need to try first?
The penalty would follow to your new domain.
The external linking structure of the sites is very keyword- and main-page-focused, i.e. 90% of the external links point to the front page with one particular keyword.
Not good at all.
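The problem here is measurable: a natural backlink profile spreads anchor text and targets around, while 90% of links carrying one keyword to one page is a classic over-optimization footprint. A small sketch of how to quantify that concentration from exported backlink data (the helper name and sample data are made up for illustration):

```python
# Hypothetical sketch: measure anchor-text concentration in a backlink profile.
from collections import Counter

def anchor_concentration(backlinks):
    """Return the most common (anchor text, target URL) pair and its share
    of the total link count."""
    counts = Counter((b["anchor"], b["target"]) for b in backlinks)
    pair, top = counts.most_common(1)[0]
    return pair, top / len(backlinks)

# Invented sample profile: 9 of 10 links use the same anchor to the front page,
# mirroring the 90% figure described in the question.
backlinks = (
    [{"anchor": "sports results", "target": "/"}] * 9
    + [{"anchor": "match schedule", "target": "/schedule"}] * 1
)
pair, share = anchor_concentration(backlinks)
print(pair, share)  # ('sports results', '/') 0.9
```

A share anywhere near this high for a single commercial keyword is the kind of pattern a link audit should flag first.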
Summary: your site needs a careful review by an SEO professional who adheres to white hat techniques. Every day your site is penalized you are losing traffic and money. The cost you pay to fix this issue may be extremely small in comparison to the amount of revenue you have lost.