Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Does IP Blacklist cause SEO issues?
-
Hi,
Our IP was recently blacklisted - we had a malicious script sending out bulk mail in a Joomla installation.
Does it hurt our SEO if we have a domain hosted on that IP?
Any solid evidence?
Thanks.
-
It's not related to Gmail. The server itself was sending out email spam (Joomla is a CMS program used to manage websites). I bet he means he got listed on Spamhaus.
First off, web spam and email spam are two entirely separate things. So you can be blacklisted with anyone in the email realm and not have it affect your SEO.
Second, I've heard the "blacklisted IP" theory regurgitated for nearly 10 years now, and nobody has ever proven that a specific IP was the reason a site lost rankings. So you could, in theory, share an IP with an entire link farm and not lose any ranking (consider how many blogs share an IP under WordPress.com or Blogspot). Google surfs the web just like everyone else (using DNS lookups), and it ranks domains, not IPs (which are subject to change).
The only way I could see an IP getting you in trouble is if your server got hacked and the hacker was using it to proxy attacks against Google (as in DDoS attacks, not spam). Then you might have some SEO issues, but your server being hacked would be a far more serious problem at that point.
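The "they rank domains, not IPs" point is easy to verify for yourself: on shared hosting, many unrelated domains resolve to one address, and a per-IP penalty would sink all of them at once. A minimal sketch of that check (the domain names in the comment are placeholders, not sites from this thread):

```python
import socket

def shared_ip(domain_a: str, domain_b: str) -> bool:
    """Return True if two hostnames currently resolve to the same IPv4 address."""
    return socket.gethostbyname(domain_a) == socket.gethostbyname(domain_b)

# e.g. two unrelated blogs on the same shared host would return True:
# shared_ip("blog-one.example.com", "blog-two.example.com")
```

If Google punished by IP, every innocent site for which this returns True against a spammy neighbour would lose rankings - which is exactly what has never been demonstrated.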
-
Who has blacklisted the IP address? Is it the Joomla forums or a Gmail account?
However, as mentioned in this article about spam emails: http://moz.com/ugc/the-penguin-update-how-google-identifies-spam
"So what about websites? Wouldn't that knowledge of identifying and classifying spam be shared with the search and webspam teams? Don't you think Matt Cutts has access to Gmail's spam detection data? I bet he does, and I bet some of it is being seen in this Penguin update.
Gmail is a pretty well documented product. You can read up quite a bit on spam filters and how they work. I would recommend this to everyone as we can then get a better idea of what Google sees as spam content. For me the biggest takeaway is that Gmail openly admits to using user data and feedback in classifying and identifying spam. This should be a huge indicator to us all that user data is playing a role in how Google classifies and identifies spam on the web. The trick now is to figure out exactly what user data/feedback is being used."
Related Questions
-
What is the proper URL length in SEO?
I learned that having 50 to 60 characters in a URL is OK, and that shorter URLs are preferred by Google. But I am going to include keywords in my URLs, and I am afraid that will increase the length. Is that going to hurt me slightly? My competitors have an 8-character domain with keywords 13 characters long, while my site has a 15-character domain with keywords 13 characters long. Which one will Google prefer?
White Hat / Black Hat SEO | calvinkj
Robots.txt file in Shopify - Collection and Product Page Crawling Issue
Hi, I am working on a big eCommerce store with more than 1,000 products. We just moved the platform from WordPress to Shopify and are getting a noindex issue. When I checked robots.txt I found the rules below, which are confusing to me. I don't understand what these rules mean:

Disallow: /collections/*+*
Disallow: /collections/*%2B*
Disallow: /collections/*%2b*
Disallow: /blogs/*+*
Disallow: /blogs/*%2B*
Disallow: /blogs/*%2b*

I understand that my robots.txt disallows search engines from crawling and indexing all my product pages (collections/*+*). Are these the rules that are affecting the indexing of product pages? Please explain how this robots.txt works in Shopify, and what the Disallow rules do once my pages are crawled and indexed by Google. Thanks.
White Hat / Black Hat SEO | HuptechWebseo
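The asterisks in those rules are Google-style wildcards, where `*` matches any run of characters, so `/collections/*+*` blocks any collection URL whose path contains a literal `+` (Shopify uses `+` to join tags in filtered collection views) - not ordinary collection or product pages. A rough sketch of how a crawler evaluates such a rule (simplified: real matchers also handle percent-encoding and longest-match precedence; the paths are made-up examples):

```python
import re

def rule_matches(rule: str, path: str) -> bool:
    """Test a URL path against a Google-style robots.txt rule,
    where '*' matches any sequence of characters and a trailing
    '$' anchors the rule to the end of the path."""
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, path) is not None

# A plain collection page is not blocked by /collections/*+* ...
print(rule_matches("/collections/*+*", "/collections/shoes"))           # False
# ... but a tag-filtered view containing '+' is:
print(rule_matches("/collections/*+*", "/collections/shoes/red+blue"))  # True
```

So these rules only keep crawlers out of the near-duplicate filtered views; if entire product pages are noindexed, the cause is likely elsewhere (meta robots tags or a different Disallow rule).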
Different domains on the same IP address ranking for the same keywords - is it possible?
Hello, I want to ask: if two domains are hosted on the same server and share the same IP (as usually happens with shared hosting) and both try to rank for the same keywords in Google, does the shared IP affect them or not?
White Hat / Black Hat SEO | RizwanAkbar
How to make a second site in the same niche and do white hat SEO
Hello, As much as we would like otherwise, there's a possibility that our site will never recover from its Google penalties. Our team has decided to launch a new site in the same niche. What do we need to do so that Google will not mind us having two sites in the same niche? (Menu differences, coding differences, content differences, etc.) We won't have duplicate content, but it's hard to make the sites not similar. Thanks
White Hat / Black Hat SEO | BobGW
The use of a ghost site for SEO purposes
Hi Guys, We have just taken on a new client (.co.uk domain), and during our research we identified that they also have a .com domain which is a replica of the existing site, but all of its links lead to the .co.uk domain. As a result, the .com replica is pushing 5,000,000+ links to the .co.uk site. After speaking to the client, it appears they were approached by a company who said they could get the .com site ranking for local search queries and then push all that traffic to the .co.uk. From analytics we can see that very little referrer traffic is coming from the .com. It sounds remarkably dodgy to us - the duplicate site is surely an issue anyway for obvious reasons, and these links could also be deemed to have been created for SEO gain. Does anyone have any experience of this as a tactic? Thanks, Dan
White Hat / Black Hat SEO | SEOBirmingham81
Sitelinks Search Box impact on SEO
I am wondering how the relatively new sitelinks search box impacts SEO rankings for a specific site or keyword combination - do you have any experience or benchmarks on this? Obviously it should help you get more real estate on the SERP (by adding the search box), but do you also get extra goodwill and an improved SERP position from adding it? Also, is the impact different for different types of terms - say, a single brand or category term such as "Bestbuy" (or "coupon") versus a combination term like "Bestbuy Apple" (or "Dixons coupon")? Thanks in advance!
White Hat / Black Hat SEO | tjr
Why do expired domains still work for SEO?
Hi everyone, I've been running an experiment for more than a year to see whether buying expired domains still works. I know it's considered black hat, but like I said, I wanted to experiment - that is what SEO is about. What I did was buy domains that had just expired, immediately set up WordPress with content relevant to the expired domain, and then start building links from these domains to other relevant sites. (Here is a pretty good post on how to do it, and I did it in a similar way: http://searchenginewatch.com/article/2297718/How-to-Build-Links-Using-Expired-Domains ) This is nothing new, and SEOs have been doing it for a long time. There are a lot of rumors around the SEO world that domains become worthless after they expire. But after trying it out for more than a year with about 50 different expired domains, I can conclude that it DOES work, 100% of the time. Some of the domains are of course better than others, but I cannot see any sign that the expired domains or the sites I link to have been punished by Google. The sites I'm linking to rank great with ONLY those links 🙂 So to the question: WHY does Google allow this? They should be able to see that a domain has expired, right? And if it's expired, why don't they just "delete" all the links to that domain after the expiry date? Google is well aware of this problem, so what is stopping them? Is there anyone here who knows how this works technically?
White Hat / Black Hat SEO | Sir
Black Hat SEO Case Study - Private Link Network - How is this still working?
I have been studying my competitors' link building strategies, and one guy (an affiliate) in particular really caught my attention. He has been using a strategy that has worked really well for the past six months or so. How well? He owns about 80% of search results for highly competitive keywords, in multiple industries, that add up to about 200,000 searches per month in total. As far as I can tell, it's a private link network. Using Ahrefs and Open Site Explorer, I found out that he owns thousands of bought domains, all linking to his sites. Recently, all he's been doing is essentially buying high-PR domains, redesigning the sites, and adding new content to rank for his keywords. I reported his link-wheel scheme to Google and posted a message on the webmaster forum - no luck there. So I'm wondering: how is he getting away with this? Isn't Google's algorithm sophisticated enough to catch something as obvious as this? Everyone preaches White Hat SEO, but how can honest marketers/SEOs compete with guys like him? Any thoughts would be very helpful. I can include some of the reports I've gathered if anyone is interested in studying this further. Thanks!
White Hat / Black Hat SEO | howardd