Does an IP blacklist cause SEO issues?
-
Hi,
Our IP was recently blacklisted - we had a malicious script sending out bulk mail in a Joomla installation.
Does it hurt our SEO if we have a domain hosted on that IP?
Any solid evidence?
Thanks.
-
It's not related to Gmail. The server itself was sending out email spam (Joomla is a CMS program used to manage websites). I bet he means he got listed on Spamhaus.
First off, web spam and email spam are two entirely separate things. So you can be blacklisted by any email blacklist operator and not have it affect your SEO.
Second, I've heard the "blacklisted IP" theory regurgitated for nearly 10 years now and nobody has ever proven that a specific IP was the reason for a site losing ranking. So you could, in theory, share an IP with an entire link farm and not lose any ranking (consider how many blogs share an IP under Wordpress.com or Blogspot). Google surfs the web just like everyone else (using DNS lookups) and they rank domains, not IPs (which are subject to change). The only way I could see an IP getting you in trouble is if your server got hacked and the hacker was using it to proxy attacks against Google (as in DDoS attacks, not spam). Then you might have some issues with SEO but your server being hacked would be a far more serious problem at that point.
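To check whether a server's IP actually appears on an email blacklist such as Spamhaus, you can query the DNSBL directly: reverse the IPv4 octets and look the resulting name up under the list's zone. A minimal sketch in Python (the zone name and sample IP are illustrative; public Spamhaus mirrors may rate-limit or refuse queries from shared resolvers):

```python
import socket

def dnsbl_query_name(ip: str, zone: str = "zen.spamhaus.org") -> str:
    """Build the DNSBL query hostname: reverse the IPv4 octets
    and append the blacklist zone, per the DNSBL convention."""
    octets = ip.split(".")
    if len(octets) != 4:
        raise ValueError(f"expected an IPv4 address, got {ip!r}")
    return ".".join(reversed(octets)) + "." + zone

def is_listed(ip: str, zone: str = "zen.spamhaus.org") -> bool:
    """Return True if the DNSBL answers for this IP (i.e. it is listed).
    A name-resolution failure (NXDOMAIN) means the IP is not on the list."""
    try:
        socket.gethostbyname(dnsbl_query_name(ip, zone))
        return True
    except socket.gaierror:
        return False

if __name__ == "__main__":
    # 203.0.113.7 is a documentation address; substitute your server's IP.
    print(is_listed("203.0.113.7"))
```

Either way, per the answer above, a listing here affects email deliverability, not search rankings.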
-
Who blacklisted the IP address? Was it the Joomla forums or a Gmail account?
However, this article on how Google identifies spam touches on spam emails: http://moz.com/ugc/the-penguin-update-how-google-identifies-spam
"So what about websites? Wouldn't that knowledge of identifying and classifying spam be shared with the search and webspam teams? Don't you think Matt Cutts has access to Gmail's spam detection data? I bet he does, and I bet some of it is being seen in this Penguin update.
Gmail is a pretty well documented product. You can read up quite a bit on spam filters and how they work. I would recommend this to everyone as we can then get a better idea of what Google sees as spam content. For me the biggest takeaway is that Gmail openly admits to using user data and feedback in classifying and identifying spam. This should be a huge indicator to us all that user data is playing a role in how Google classifies and identifies spam on the web. The trick now is to figure out exactly what user data/feedback is being used."
Related Questions
-
"Google chose different canonical than user" issue: can anyone help?
Some pages on our site, https://www.travelyaari.com/, are showing the error "Google chose different canonical than user" in Google webmasters, with the status message "Excluded from search results". It mainly affects our route page URLs, e.g. https://www.travelyaari.com/popular-routes-listing. Our canonical tags are fine and our rel alternate tags are fine. Can anyone help us understand why this is happening?
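One quick sanity check for the question above is confirming that each affected URL emits exactly one canonical tag; duplicate or conflicting declarations are a common trigger for this status. A minimal sketch using only the Python standard library (the sample markup is illustrative, not the site's actual source):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect every <link rel="canonical"> href seen in a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            a = dict(attrs)
            if (a.get("rel") or "").lower() == "canonical" and a.get("href"):
                self.canonicals.append(a["href"])

def find_canonicals(html: str) -> list:
    """Return all canonical hrefs declared in the given HTML."""
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonicals

# A page should declare exactly one canonical; anything else invites
# Google to pick its own.
page = '<html><head><link rel="canonical" href="https://www.travelyaari.com/popular-routes-listing"></head></html>'
```

Running `find_canonicals` over the rendered (not just raw) HTML of each affected URL and asserting the list has length one would quickly rule out duplicate tags as the cause.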
White Hat / Black Hat SEO | RobinJA
-
New software requires us to redirect a subdomain to another IP address.
I operate a local print and direct mail company in Houston called Catdi Printing (www.catdi.com). We do very well with our local rankings and rank 1 or 2 for our main keywords (direct mail Houston and eddm Houston). We are looking to upgrade our online quoting and ordering system. The software is very expensive, and the only way we can incorporate this new system is to create a new subdomain on our end (printing.catdi.com) and point it at an IP on their server. Their server is located in California and might even be hosted by Google, but I'm not certain on this point. Our current host is HostGator, based in Houston, so I'm not sure this provides any benefit. I guess my main question is: will Google look at this negatively? Would this change our SERPs organically, and what about how Google indexes pages on the subdomain? I'm also concerned that the load times will be off and make the user experience awkward. Any feedback is greatly appreciated!
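What the question describes is a DNS change rather than an HTTP redirect: the subdomain gets its own A record pointing at the vendor's server while the rest of the zone stays on the current host. A hypothetical BIND-style zone entry (the IP shown is a documentation placeholder, not the vendor's real address):

```
; zone file for catdi.com (illustrative)
printing.catdi.com.   3600   IN   A   203.0.113.50
```

At the DNS level this is transparent to Google, which would simply crawl printing.catdi.com as content served from that IP; load times would then depend on the vendor's server rather than HostGator.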
White Hat / Black Hat SEO | ChopperCharlie
-
Help identifying cause for total rank loss
Hello, last week I noticed one of my pages decreased in rank for a particular query from #8 to #13. Although I had recently made a few minor edits to the page (added an introductory paragraph and a left-column promo to increase word count), I thought the reason for the decrease was a few newly ranked pages that I hadn't seen before. In an attempt to regain my original position, I tried to optimize the meta title for the singular form of the word. After making this change, I fetched and rendered the page as Google (status = partial) and submitted the page for indexing (URL only, not including on-page links). Almost immediately after submitting, the page dropped from #13 out of the top 50. I've since changed the meta title back to what it was originally and let Google crawl and index the page on its own, but the page is still not in the top 50. Could the addition of the page description and left-column promos have tipped the scales toward keyword stuffing? If I change everything back to the way it was originally, is it reasonable to think I should regain my original position below the new pages? Any insights would be greatly appreciated!
White Hat / Black Hat SEO | jmorehouse
-
Bay Area SEO Agency
Hi, can anyone recommend good SEO agencies based in the Bay Area with some history of working with gaming or adult brands that have been badly hit by ranking falls in the past 12 months, which we suspect is due to Penguin? Thanks.
White Hat / Black Hat SEO | BetAmerica
-
Subdomain and root domain effects on SEO
I have a domain, let's say mydomain.com, which already hosts my web app. I want to create a sub-product for my company; its concept is a bit different from my original web app on mydomain.com, and I am planning to host it on mynewapp.mydomain.com. I am having doubts about whether using a subdomain will have an impact on my existing or new web app. Can anyone give me any pointers on this? As much as I wanted to use a directory, mydomain.com/mynewapp, this is not possible because it would just confuse existing users of the new product/web app. I've heard that subdomains are essentially treated as a new site; is this true? If it is, then I am fine with this, but is it also true that it's harder for a subdomain to reach the top rank than for a root domain?
White Hat / Black Hat SEO | herlamba
-
Which SEO companies offer Penalty analysis?
I'm having a hard time finding a (good) SEO company that specializes in penalty analysis. Any recommendations? I have only found Bruce Clay, but they charge $8,000 :)...
White Hat / Black Hat SEO | wellnesswooz
-
Vendor Descriptions for SEO... Troublesome?
Howdy! I have been tossing this idea around in my head over the weekend and I cannot decide which answer is correct, so here I am! We are a retailer of products and are currently in the midst of redesigning our site, not only the design but also the content. The issue we are facing is with product descriptions from our vendors. We are able to take the product descriptions/specs from their websites and use them on ours, but my worry is that we will get flagged for duplicate content. Other retailers (as well as the vendors) are using this content too, so I don't want it to have an adverse effect on our ranking. There are so many products that it would be a large feat to rewrite unique content, not to mention that the majority of the rhetoric would be extremely similar. What have you seen in similar situations? Is it bad to use the descriptions? Or do we need to bite the bullet and do our best to rewrite hundreds of product descriptions? Or is there a way to use the descriptions and tag them in a way that won't have Google penalize us? I originally thought that if we had enough other unique content on our site, it shouldn't be as big of a deal, but then I realized how much of our site's structure is our actual products. Thanks in advance!
White Hat / Black Hat SEO | jpretz
-
Interesting case of IP-wide Google Penalty, what is the most likely cause?
Dear SEOmoz community, our portfolio of around 15 internationalized websites received a significant, seemingly IP-wide, Google penalty starting November 2010 and has yet to recover from it. We have undertaken many measures to lift the penalty, including reconsideration requests, without luck, and I am now hoping the SEOmoz community can give us some further tips. We are very interested in the community's help and judgment on what else we can try.
As quick background information:
- The sites in question offer sports results data and are translated into several languages. Each market (i.e. language) has its own TLD domain using the central keyword, e.g. keyword_spanish.es, keyword_german.de, keyword_us.com.
- The content is highly targeted to each market, which means there are no duplicate content pages across the domains: all copy is translated, content is reprioritized, etc. However, the core results content in the body of the pages obviously needs to stay about 80% the same.
- An SEO agency of ours used semi-automated link-building tools in mid-2010 to acquire link partnerships.
- There are some promotional one-way links to sports betting and casino sites positioned on the pages.
- The external linking structure of the pages is very keyword- and main-page-focused, i.e. 90% of the external links point to the front page with one particular keyword.
- All sites have strong domain authority and have been running under the same owner for over 5 years.
As mentioned, we have experienced dramatic ranking losses across all our properties starting in November 2010. The applied penalties are indisputable given that rankings for the main keywords in local Google search engines dropped from position 3 to position 350 after the sites had been ranked in the top 10 for over 5 years. A screenshot of the ranking history for one particular domain is attached; the same behavior can be observed across domains.
Our questions are:
1. Is there something like an IP-specific Google penalty that can apply to web properties across an IP, or can we assume Google just picked all the pages registered in Google Webmaster Tools?
2. What is the most likely cause of our penalty given the background information? Given that the drops started in November 2010, we doubt the Panda updates had any correlation to this issue.
3. What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records. Our actions so far have been reducing external links, on-page links, and C-class internal links.
4. Are there any other factors/metrics we should look at to help troubleshoot the penalties?
5. After all this time without resolution, should we be moving to two new domains and forwarding all content via 301s to the new pages? Are there things we need to try first?
Any help is greatly appreciated. SEOmoz rocks. /T
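On the last question above: if a move to new domains were attempted, the standard mechanism is a per-URL 301 from the old domain so each page passes to its counterpart. A minimal Apache sketch, assuming mod_rewrite is available (both domain names are placeholders, not the poster's real sites):

```apache
# .htaccess on the old domain: 301 every path to the same path
# on the new domain, preserving query strings.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.example$ [NC]
RewriteRule ^(.*)$ https://newdomain.example/$1 [R=301,L]
```

Whether a 301 migration escapes a penalty is a separate judgment call; the redirect itself only forwards users and crawlers, and is generally understood to carry link signals with it.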
White Hat / Black Hat SEO | tomypro