Interesting case of an IP-wide Google penalty: what is the most likely cause?
-
Dear SEOMOZ Community,
Our portfolio of around 15 internationalized websites received a significant, seemingly IP-wide, Google penalty starting in November 2010 and has yet to recover from it. We have undertaken many measures to lift the penalty, including reconsideration requests, without luck, and I am now hoping the SEOmoz community can give us some further tips.
We would greatly appreciate the community's help and judgement on what else we can try to get the penalty lifted.
As quick background information:
-
The sites in question offer sports results data and are translated into several languages.
-
Each market (i.e. each language) has its own TLD domain using the central keyword, e.g. <keyword_spanish>.es, <keyword_german>.de, <keyword_us>.com
-
The content is highly targeted to each market, which means there are no duplicate content pages across the domains: all copy is translated, content is reprioritized, etc. However, the core results content in the body of the pages obviously needs to stay about 80% the same.
-
An SEO agency of ours used semi-automated link-building tools in mid-2010 to acquire link partnerships.
-
There are some promotional one-way links to sports-betting and casino sites placed on the pages.
-
The external linking structure of the pages is very keyword- and main-page-focused, i.e. 90% of the external links point to the front page with one particular keyword as the anchor text.
-
All sites have a strong domain authority and have been running under the same owner for over 5 years
As mentioned, we have experienced dramatic ranking losses across all our properties starting in November 2010. The applied penalties are indisputable given that rankings for the main keywords in the local Google search engines dropped from position 3 to position 350 after the sites had been ranked in the top 10 for over 5 years. A screenshot of the ranking history for one particular domain is attached. The same behavior can be observed across domains.
Our questions are:
-
Is there something like an IP-specific Google penalty that can apply to web properties across an IP, or can we assume Google just picked all pages registered in Google Webmaster Tools?
-
What is the most likely cause of our penalty given the background information? Given that the drops started as early as November 2010, we doubt that the Panda updates had any correlation to this issue.
-
What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records. Our actions so far have been reducing external links, on-page links, and C-class internal links.
-
Are there any other factors/metrics we should look at to help troubleshoot the penalties?
-
After all this time without resolution, should we be moving on to new domains and forwarding all content via 301s to the new pages? Are there things we need to try first?
Any help is greatly appreciated.
SEOMoz rocks. /T
-
Thanks again for your thoughts. This is actually a topic I am very involved with. I work as a Technical Director at a large digital agency, and our SEO team just recommended that a large Fortune 100 customer break their web property out into market TLDs, from .com/de, .com/es, etc. into .com, .es, .de, using the same top-level root domain name. According to our SEO team, DA is sort of shared if the same root domain is used. However, local TLDs will obviously give you better rankings in local Google engines.
My thoughts are that the right approach probably depends on the size of your brand. If it's easy for you to quickly build up DA, local TLDs are preferred. If you are a smaller player, you might do better consolidating everything under one umbrella to share DA.
I am actually running an experiment for one of my projects where I am doing the TLD breakout for one domain to compare organic search traffic. The benefit with local TLDs is that eventually you can tie them to market-local servers, which again boosts SEO in local markets. This isn't possible with directories.
Do you share my thoughts, Ryan? As I said, this is a very hot topic for me at the moment.
P.S. I will definitely reach out for recommendations - thank you.
-
Atul,
What I mean by that is that all domains hosted under the same server IP (i.e. a dedicated root server) have experienced significant ranking drops that seem tied to a global penalty.
However, it is questionable whether Google would consider this a valid approach, given the probability that other domains not associated with the to-be-penalized URL could be hosted under the same IP.
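For anyone wanting to sanity-check this, a quick first step is to resolve every domain in the portfolio and group them by IP address to confirm which ones actually share a server. A minimal sketch; the domain names below are placeholders standing in for the real properties:

```python
import socket
from collections import defaultdict

# Placeholder domains; substitute the real portfolio here.
domains = ["keyword1.es", "keyword2.de", "keyword3.com"]

ip_groups = defaultdict(list)
for domain in domains:
    try:
        ip = socket.gethostbyname(domain)  # resolve the domain's A record
    except socket.gaierror:
        ip = "unresolved"
    ip_groups[ip].append(domain)

# Domains that resolve to the same address are hosted on the same IP.
for ip, hosts in ip_groups.items():
    print(ip, "->", ", ".join(hosts))
```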
-
Ryan, I would like to know what is meant by an IP-specific penalty?
-
Due to the penalties we have been considering moving everything under one umbrella and managing the local sites in directories, e.g. .com/es/keyword1, .com/de/keyword2. However, until the penalties hit, the current URL approach had worked very well for us. Any thoughts?
Generally speaking, you will achieve the best results by consolidating your sites under one domain with a dedicated folder for each country as you described. I would recommend delaying the move until you are sure your sites are not under any penalty.
The advantage you will receive with a single root domain is the consolidation of your Domain Authority. It sounds like your sites were doing well before the penalty. The higher DA can help even further.
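If you do eventually consolidate, it is also worth scripting a check that each old ccTLD front page answers with a single 301 hop to the matching country folder. A rough sketch, assuming a hypothetical mapping of old domains to new folders (all URLs below are placeholders):

```python
import http.client
from urllib.parse import urlparse

# Hypothetical mapping: old ccTLD front pages -> intended folder on the consolidated domain.
redirect_map = {
    "http://keyword1.es/": "http://www.example.com/es/",
    "http://keyword2.de/": "http://www.example.com/de/",
}

for old_url, expected in redirect_map.items():
    parts = urlparse(old_url)
    conn = http.client.HTTPConnection(parts.netloc, timeout=10)
    conn.request("GET", parts.path or "/")   # http.client does not follow redirects
    resp = conn.getresponse()
    location = resp.getheader("Location", "")
    ok = resp.status == 301 and location.rstrip("/") == expected.rstrip("/")
    print(f"{old_url} -> {resp.status} {location} {'OK' if ok else 'CHECK'}")
    conn.close()
```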
The internationalized sites are each hosted on a different root domain (keyword1.es, keyword2.de). Are you still confirming that this should not be causing duplicate content penalties?
Correct, as long as the sites are properly set up to target their respective countries. Sites which are dedicated to a specific locale and language would not normally compete in the SERPs with other sites that offer similar content in another country and language.
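One concrete way to make that country and language targeting explicit, in addition to setting geographic targets in Google Webmaster Tools, is rel="alternate" hreflang annotations linking the equivalent pages across the domains. A small generator sketch; the locale-to-URL mapping below is a placeholder:

```python
# Placeholder mapping of locale codes to the equivalent front pages
# on each country domain; substitute the real URLs.
alternates = {
    "es-ES": "http://keyword1.es/",
    "de-DE": "http://keyword2.de/",
    "en-US": "http://keyword3.com/",
}

def hreflang_tags(alternates):
    """Build the <link rel="alternate" hreflang="..."> tags that every
    version of the page would carry in its <head>."""
    return "\n".join(
        '<link rel="alternate" hreflang="%s" href="%s" />' % (lang, url)
        for lang, url in sorted(alternates.items())
    )

print(hreflang_tags(alternates))
```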
Does your company have that experience and do you provide such services?
While I appreciate the inquiry, my resources are already committed for the remainder of this month. You could take a look at the SEOmoz directory. Please note that anyone can list their company in the directory; a listing is not an endorsement.
If you desire a further recommendation you can send me a message on SEOmoz and I will respond. I can share a few names of SEOs whom I have confidence in based on their Q&A responses, blogs and reputation if that would be helpful.
-
Ryan,
Thank you for your thoughtful answers. A couple of clarifications:
The internationalized sites are each hosted on a different root domain (keyword1.es, keyword2.de). Are you still confirming that this should not be causing duplicate content penalties? Due to the penalties we have been considering moving everything under one umbrella and managing the local sites in directories, e.g. .com/es/keyword1, .com/de/keyword2. However, until the penalties hit, the current URL approach had worked very well for us. Any thoughts?
I should clarify the comment on auto-link-building. The agency used LinkAssistant to research potential partners, i.e. a lot of link solicitation emails were sent, but the actual link building was still performed manually and only with legitimate, content-relevant partners.
We are no longer working with our old SEO agency and have been reaching out to a couple of external SEO resources/experts, but we have not been presented with a conclusive, convincing concept for resolving the issues. I guess it takes someone with experience in handling Google penalties to do the job. Does your company have that experience, and do you provide such services?
-
Is there something like an IP-specific Google penalty that can apply to web properties across an IP, or can we assume Google just picked all pages registered in Google Webmaster Tools?
Think of Google as an intelligent business. They have processes which algorithmically penalize websites. They also have systems which flag sites for manual review. When a penalty is deemed appropriate it is possible for it to be applied on any number of factors such as an IP address, a Google account, a domain, etc. It depends on how widespread of a violation has occurred.
What is the most likely cause of our penalty given the background information? Given that the drops started as early as November 2010, we doubt that the Panda updates had any correlation to this issue.
You mentioned a few points which can potentially lead to a penalty. I am not clear from your post, but it sounds like you may be linking to casino and gambling sites. While those sites may be legitimate, many have a reputation for using black-hat SEO techniques.
If you want to remove a penalty, be certain that you do not provide a followed link to any questionable site. When you provide a followed link to a site, you are basically saying "I trust this site. It is a good site and I endorse it". If you are found to offer a link to a "bad" site, your site can be penalized.
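A quick way to audit this is to crawl your own pages and list every outbound link that is not nofollowed, then review where those links point. A rough stdlib-only sketch; the page list and the "watchlist" of questionable domains are placeholders:

```python
from html.parser import HTMLParser
from urllib.request import urlopen
from urllib.parse import urlparse

PAGES = ["http://keyword1.es/"]  # your own pages to audit (placeholder)
WATCHLIST = {"somecasino.example", "somebets.example"}  # domains you consider questionable (placeholder)

class OutboundLinkParser(HTMLParser):
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.followed_external = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "") or ""
        rel = (attrs.get("rel") or "").lower()
        host = urlparse(href).netloc
        # An external link without rel="nofollow" passes link equity to its target.
        if host and host != self.own_host and "nofollow" not in rel:
            self.followed_external.append(href)

for page in PAGES:
    parser = OutboundLinkParser(urlparse(page).netloc)
    parser.feed(urlopen(page).read().decode("utf-8", errors="replace"))
    for href in parser.followed_external:
        flag = " <-- on watchlist" if urlparse(href).netloc in WATCHLIST else ""
        print(f"{page}: followed external link to {href}{flag}")
```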
What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records. Our actions so far have been reducing external links, on-page links, and C-class internal links.
Hire a professional SEO to review your site. You want to review every page to ensure your site is within Google's guidelines. I am highly concerned about your site's links to external sites. I am also concerned about the automated link building that your current SEO has been doing. A professional SEO company should not lead your site to incur a penalty. I am having difficulty understanding how this happened in the first place, how it has not been fixed in almost a year, and how this SEO company is building links for you. Frankly, it's time to consider a new SEO company.
Translating content into other languages is fine. You can take the exact same article and offer a translated version for each language, and even each country. For example, you can offer a Spanish version for your Spain site and a different Spanish version for your Mexico site. As long as these sites are targeting specific countries, there are no duplicate content issues.
After all this time without resolution, should we be moving on to new domains and forwarding all content via 301s to the new pages? Are there things we need to try first?
The penalty would follow to your new domain.
The external linking structure of the pages is very keyword- and main-page-focused, i.e. 90% of the external links point to the front page with one particular keyword as the anchor text.
Not good at all.
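To quantify how lopsided the link profile is, you can tally anchor texts and link targets from whatever backlink export you have (for example, the tracking records you mentioned). A sketch assuming a hypothetical CSV with "anchor_text" and "target_url" columns:

```python
import csv
from collections import Counter

anchors = Counter()
targets = Counter()

# Hypothetical export format with 'anchor_text' and 'target_url' columns.
with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchors[row["anchor_text"].strip().lower()] += 1
        targets[row["target_url"].strip()] += 1

total = sum(anchors.values())
print(f"{total} links in export")
print("Top anchor texts (share of all links):")
for anchor, count in anchors.most_common(5):
    print(f"  {anchor!r}: {count} ({count / total:.0%})")

print("Top link targets:")
for url, count in targets.most_common(5):
    print(f"  {url}: {count}")
```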
Summary: your site needs a careful review by an SEO professional who adheres to white-hat techniques. Every day your site is penalized you are losing traffic and money. The cost you pay to fix this issue may be extremely small in comparison to the amount of revenue you have lost.