Can a Self-Hosted Ping Tool Hurt Your IP?
-
Confusing title I know, but let me explain.
We are in the middle of programming a number of SEO "action" tools for our site. These will be available to users to help them better optimize their sites in the SERPs. We are thinking about adding a PHP-based "Ping" tool so users can ping their domain and hopefully get some extra attention and speed up indexing of updates.
This would be hosted on a subdomain of our site. My question is: if we get enough users using the product, could that potentially get us blacklisted with Google, Bing, etc.? Technically the tool needs to send out the ping request, and that request would come from the same IP address our main site is hosted on. If we end up with over 1,000 users all sending ping requests, I don't want to jeopardize our IP.
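For context, the "ping" in question is conventionally a weblogUpdates.ping XML-RPC call: an HTTP POST whose body names the site and its URL. A minimal sketch of what our tool would send (shown in Python's standard library for illustration; the actual tool is PHP, and the site name/URL are hypothetical):

```python
import xmlrpc.client

# The classic update ping is a weblogUpdates.ping XML-RPC call with two
# string parameters: the site's name and its URL. Build the request body
# offline so we can inspect exactly what would leave the server.
site_name = "Example Site"            # hypothetical values for illustration
site_url = "https://example.com/"

payload = xmlrpc.client.dumps((site_name, site_url),
                              methodname="weblogUpdates.ping")

# The XML body contains <methodName>weblogUpdates.ping</methodName> and
# both parameters; sending it is a plain HTTP POST to a ping endpoint
# (e.g. Ping-o-Matic's), and that POST originates from whatever IP the
# server process runs on — which is exactly the concern above.
print(payload)
```

Whichever language sends it, the receiving service only ever sees the server's outbound IP, not the subscriber's.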
Thoughts?
-
We are not using WordPress for any of the tools; everything will be handled in PHP. The tools live in a separate directory from our main site, used strictly for that purpose. Our main site does not use WordPress either.
The server resources are not an issue at this time, as we have a very powerful setup.
I am not worried about how many times a subscribed user wants to ping their site. I am more concerned about where the ping is being sent out from and how many times.
-
First, there is no certainty that pinging a domain helps get it indexed; submitting a sitemap in Search Console seems like the appropriate way to get that done.
I understand that pinging your site when you update content can let many sites, RSS feeds, and search engines know about it, but if you ping too much you risk getting blacklisted.
Second, using your server to send out many pings may slow down response time and therefore slow page load speed for your site, which definitely has a negative effect on SEO.
Third, if you can host the service on a separate IP, that seems like the best course of action: if it gets blacklisted, you can just start using a different one. Don't risk getting your domain's IP blacklisted.
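On the second point, the load and response-time concern can be mitigated by sending pings asynchronously rather than inside the web request. A minimal sketch, in Python for brevity (the real tools are PHP, where a job queue or cron worker plays the same role; `send_ping` is a stub standing in for the real outbound call):

```python
import queue
import threading

def send_ping(url: str) -> None:
    # Placeholder for the real outbound XML-RPC POST; stubbed out so the
    # sketch runs offline.
    print(f"pinging {url}")

# Hand each ping off to a background worker so the user's page request
# returns immediately instead of waiting on slow remote ping endpoints.
ping_queue: queue.Queue = queue.Queue()

def worker() -> None:
    while True:
        url = ping_queue.get()
        if url is None:              # sentinel tells the worker to stop
            ping_queue.task_done()
            break
        send_ping(url)               # the slow network call happens off-thread
        ping_queue.task_done()

threading.Thread(target=worker, daemon=True).start()
ping_queue.put("https://example.com/")   # returns to the caller instantly
ping_queue.join()                        # demo only: wait for the queue to drain
```

The same decoupling also gives you a natural place to throttle outbound pings without touching the user-facing pages.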
Maybe I'm missing something here, but if you are using WordPress, doesn't it automatically create an auto-updating /feed/ URL for your site?
The following is from - https://en.support.wordpress.com/comments/pingbacks/
Granted I am using WordPress so that is mostly what I focus on. Are you using a different CMS?
How do I send out update pings?
Many services like Technorati, Feedster, Icerocket, Google Blog Search, and others want a “ping” from you to know you’ve updated so they can index your content. WordPress.com handles it all for you. When you post, we send a ping using Ping-o-Matic!, a service that pings several different search providers at once, including Technorati, My Yahoo!, and Google Blog Search.
Pings are automatically sent if you have a public blog. If your blog is private or if you block search engines, pings will not be sent.
-
Yup!
Use JavaScript on the client side to do the pinging. Or a Java app running in the browser as an applet. Or Flash.
There are two major problems: JavaScript doesn't support cross-origin POST requests without hacks, and not all computers come with Java installed. The same goes for Flash.
-
Thank you for your response. As to the IP getting blacklisted: since we have full server control, we could always assign another dedicated IP address to the site. The issue is that we would not know if or when the blacklisting happened, so we couldn't take that action in time. Obviously we don't want to have to do this, and a blacklisted main-site IP could create headaches for our search position until we get it resolved.
We are also planning on adding in website submission limits. For example, you could only submit mysitehere.com up to 3 times per month per subscriber account. The only way they could spam the system is to create another account and sign up all over again. I doubt anyone would go through that much effort, but I could be wrong.
Thoughts?
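The per-account limit described above is straightforward to enforce. A minimal sketch, in Python for illustration (the real tool is PHP, and in production the counter would live in a database rather than memory; `try_submit` and its key layout are assumptions for the example):

```python
from collections import defaultdict
from datetime import datetime

MAX_SUBMISSIONS_PER_MONTH = 3   # the limit proposed above

# In-memory counter keyed by (account, domain, month); enough to show the
# check. A real deployment would back this with a database table.
_counts: defaultdict = defaultdict(int)

def try_submit(account_id: str, domain: str, now: datetime) -> bool:
    """Record and allow the submission if the account is under its monthly
    limit for this domain; refuse it otherwise."""
    key = (account_id, domain, now.strftime("%Y-%m"))
    if _counts[key] >= MAX_SUBMISSIONS_PER_MONTH:
        return False
    _counts[key] += 1
    return True
```

The limit rolls over automatically because the key changes each month. Note, though, that this only limits each account; it does nothing about the re-signup workaround you mention, and nothing about the aggregate volume leaving the shared IP.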
-
TL;DR - YES
Long story: I'm the author of a similar desktop tool called SEOPingler:
http://www.mobiliodevelopment.com/seopingler/
Anyone can use it to ping anything, and the bots arrive within a second or two; it works perfectly. The problem starts when you use it to ping many URLs (say 10k-20k). At some point it stops working: the ping API endpoint still receives your requests, but I can't see the bots coming anymore. That means there is some per-IP threshold, and if you pass it you are temporarily blacklisted. I have also heard (though I can't confirm it) that the length of this temporary blacklisting may vary with previous usage. For my users this isn't a problem, because they only blacklist their own IPs, and they can switch to hotspot Wi-Fi or a VPN to continue pinging.
But on a server this will be a HUGE problem, because you can't switch IPs on the fly, and no one can guarantee how long your IP will stay blacklisted.
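Given an unknown per-IP threshold like the one described above, the defensive option for a server-hosted tool is a global throttle on the shared IP, on top of any per-account limits. A minimal token-bucket sketch in Python (the capacity and refill numbers are illustrative guesses, not documented thresholds):

```python
import time

class TokenBucket:
    """Global cap on outbound pings from the shared server IP: each ping
    spends a token, and tokens refill at a steady rate. When the bucket is
    empty, pings are refused (or queued) instead of sent."""

    def __init__(self, capacity: int, refill_per_sec: float) -> None:
        self.capacity = capacity
        self.tokens = float(capacity)        # start full
        self.refill_per_sec = refill_per_sec
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_sec)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

# Illustrative guess: at most ~1,000 pings/day site-wide, smoothed across
# the day, with short bursts of up to 50 allowed.
daily_limit = TokenBucket(capacity=50, refill_per_sec=1000 / 86400)
```

Since the safe threshold is unknown and possibly usage-dependent, a conservative cap plus monitoring of whether bots still arrive after pings is about the best a shared-IP service can do.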