Can a Self-Hosted Ping Tool Hurt Your IP?
-
Confusing title, I know, but let me explain.
We are in the middle of programming a lot of SEO "action" tools for our site. These will be available for users to help better optimize their sites in SERPs. We were thinking about adding a "Ping" tool based in PHP so users can ping their domain and hopefully get some extra attention/speed up indexing of updates.
This would be hosted on a subdomain of our site. My question is: if we get enough users using the product, could that potentially get us blacklisted with Google, Bing, etc.? Technically it needs to send out the ping request, and that would come from the same IP address our main site is hosted on. If we end up with over 1,000 users all trying to send ping requests, I don't want to jeopardize our IP.
Thoughts?
-
We are not using WordPress for any of the tools; everything will be handled with PHP. The tools live in a separate directory from our main site, used strictly for that purpose. Our main site does not use WordPress either.
The server resources are not an issue at this time, as we have a very powerful setup.
I am not worried about how many times a subscribed user wants to ping their site. I am more concerned about where the ping is being sent out from and how many times.
-
First of all, there is no certainty that pinging a domain helps get it indexed; submitting a sitemap in Search Console seems like the appropriate way to get that done.
I understand that pinging your site when you update content can let many sites, RSS feeds, and search engines know about it, but if you ping too much you risk getting blacklisted.
Second, using your server to send out many pings may slow down response time and therefore slow page load speed for your site, which definitely has a negative effect on SEO.
Third, if you can host the service on a separate IP, that seems like the best course of action: if it gets blacklisted, you can just start using a different one. Don't risk getting your domain's IP blacklisted.
Maybe I'm missing something here, but if you are using WordPress, doesn't it automatically create an auto-updating /feed/ URL for your site?
The following is from https://en.support.wordpress.com/comments/pingbacks/
Granted, I am using WordPress, so that is mostly what I focus on. Are you using a different CMS?
How do I send out update pings?
Many services like Technorati, Feedster, Icerocket, Google Blog Search, and others want a “ping” from you to know you’ve updated so they can index your content. WordPress.com handles it all for you. When you post, we send a ping using Ping-o-Matic!, a service that pings several different search providers at once, including Technorati, My Yahoo!, and Google Blog Search.
Pings are automatically sent if you have a public blog. If your blog is private or if you block search engines, pings will not be sent.
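For reference, the "update ping" the WordPress docs describe is a tiny XML-RPC call defined by the weblogUpdates convention: one method, two string arguments (site name and URL), which Ping-o-Matic exposes at its public endpoint and fans out to the services it supports. Here is a minimal sketch in Python (the thread's tool would be PHP, but the request body is identical either way; the endpoint URL and function names here are illustrative):

```python
import xmlrpc.client

def build_ping_payload(site_name: str, site_url: str) -> str:
    """Serialize a weblogUpdates.ping call to its XML-RPC request body."""
    # dumps() takes the params as a tuple and wraps them in <methodCall> XML.
    return xmlrpc.client.dumps((site_name, site_url),
                               methodname="weblogUpdates.ping")

def send_ping(site_name: str, site_url: str,
              endpoint: str = "http://rpc.pingomatic.com/"):
    """POST the ping and return the server's response struct."""
    server = xmlrpc.client.ServerProxy(endpoint)
    return server.weblogUpdates.ping(site_name, site_url)
```

The same call can be made from PHP with any XML-RPC client; the payload is just a two-parameter `<methodCall>` document, which is why these pings are so cheap to send in bulk (and so easy to rate-limit on the receiving end).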
-
Yup!
Use JavaScript on the client side to do the pinging. Or a Java applet running in the browser. Or Flash.
There are two major problems: JavaScript doesn't support cross-origin POSTs without hacks, and not all computers come with Java installed. The same goes for Flash.
-
Thank you for your response. As to the IP getting blacklisted: since we have full server control, we could always assign another dedicated IP address to the site. The issue is that we would not know if and when it happened in order to take such action. Obviously we don't want to have to do this, and a blacklisted main-site IP could create headaches for our search position until we got it resolved.
We are also planning to add website submission limits. For example, you could only submit mysitehere.com up to 3 times per month per subscriber account. The only way to spam the system would be to create another account and sign up all over again. I doubt anyone would go through that much effort, but I could be wrong.
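That per-account cap is simple to enforce. A minimal sketch (names and in-memory storage are hypothetical; a real version would key off the accounts database so a month boundary and the 3-submission cap survive restarts):

```python
from collections import defaultdict
from datetime import datetime

MAX_PINGS_PER_MONTH = 3  # the cap discussed above

# Counts keyed by (account, domain, year, month); rolling to a new month
# produces a fresh key, so the quota resets automatically.
_usage = defaultdict(int)

def try_ping(account_id: str, domain: str, now=None) -> bool:
    """Record one submission; return False once the monthly cap is hit."""
    now = now or datetime.utcnow()
    key = (account_id, domain, now.year, now.month)
    if _usage[key] >= MAX_PINGS_PER_MONTH:
        return False
    _usage[key] += 1
    return True
```

Note this only limits each subscriber; it does nothing about the total volume of pings leaving the shared IP, which is the actual blacklisting risk raised above.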
Thoughts?
-
TL;DR - YES
Long story: I'm the author of a similar desktop tool called SEOPingler:
http://www.mobiliodevelopment.com/seopingler/
Anyone can use it to ping anything, and bots arrive within a second or two; it works perfectly. The problem is when you use it to ping many URLs (like 10k-20k). At some point it stops working: the ping API endpoint still receives your requests, but I can't see any bots coming. That means there is some per-IP threshold, and if you pass it you are temporarily blacklisted. I have also heard (but can't confirm) that the duration may vary depending on previous usage. For a desktop tool this isn't a problem, because users only blacklist their own IPs, and they can switch to hotspot Wi-Fi or a VPN to continue pinging.
But on a server this will be a HUGE problem, because you can't switch IPs on the fly, and no one can guarantee how long your IP will stay blacklisted.
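Since the per-IP threshold is unknown and apparently varies, the safest server-side design is a single global rate limiter that spaces out all outgoing pings regardless of which subscriber requested them. A sliding-window limiter is one simple way to sketch that (the window and cap below are guesses, not known ping-service limits):

```python
from collections import deque
import time

class PingRateLimiter:
    """Cap total outgoing pings across all users in a sliding time window."""

    def __init__(self, max_pings: int = 60, window_seconds: float = 3600.0):
        self.max_pings = max_pings
        self.window = window_seconds
        self.sent = deque()  # timestamps of recent pings, oldest first

    def allow(self, now=None) -> bool:
        """True if another ping may go out now; records it if so."""
        now = time.monotonic() if now is None else now
        # Drop timestamps that have aged out of the window.
        while self.sent and now - self.sent[0] > self.window:
            self.sent.popleft()
        if len(self.sent) >= self.max_pings:
            return False  # queue or reject the ping instead of sending
        self.sent.append(now)
        return True
```

Pings that don't fit in the window would go into a queue rather than being dropped, so subscribers still get their pings sent, just spread out enough to stay under whatever threshold the ping services enforce.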