Can a Self-Hosted Ping Tool Hurt Your IP?
-
Confusing title, I know, but let me explain.
We are in the middle of building a number of SEO "action" tools for our site. These will be available to users to help them better optimize their sites in the SERPs. We are considering adding a "Ping" tool, written in PHP, so users can ping their domain and hopefully get some extra attention and speed up indexing of updates.
This would be hosted on a subdomain of our site. My question is: if we get enough users using the product, could that potentially get us blacklisted by Google, Bing, etc.? The tool has to send out the ping request, and that request would come from the same IP address our main site is hosted on. If we end up with over 1,000 users all sending ping requests, I don't want to jeopardize our IP.
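For context, these ping services typically accept an XML-RPC `weblogUpdates.ping` call. A minimal Python sketch of what each request looks like (Ping-O-Matic's public endpoint is shown; the real tool would be PHP, and the function names here are illustrative, not an actual implementation):

```python
import xmlrpc.client

PING_ENDPOINT = "http://rpc.pingomatic.com/"  # Ping-O-Matic's public XML-RPC endpoint

def build_ping_payload(site_name: str, site_url: str) -> str:
    """Build the XML-RPC request body for a weblogUpdates.ping call."""
    return xmlrpc.client.dumps((site_name, site_url),
                               methodname="weblogUpdates.ping")

def send_ping(site_name: str, site_url: str):
    """Fire the ping (network call; shown for completeness).

    Whichever machine runs this is the one whose IP the ping service
    sees -- which is exactly why routing every user's pings through the
    main site's server IP is the concern.
    """
    proxy = xmlrpc.client.ServerProxy(PING_ENDPOINT)
    return proxy.weblogUpdates.ping(site_name, site_url)
```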
Thoughts?
-
We are not using WordPress for any of the tools; everything is handled in PHP, in a separate directory from our main site used strictly for the tools. Our main site does not use WordPress either.
The server resources are not an issue at this time, as we have a very powerful setup.
I am not worried about how many times a subscribed user wants to ping their site. I am more concerned about where the pings are sent from, and how many are sent.
-
First of all, there is no certainty that pinging a domain helps get it indexed; submitting a sitemap in Search Console seems like the appropriate way to get that done.
I understand that pinging your site when you update content can let many sites, RSS feeds, and search engines know about it, but if you ping too much you risk getting blacklisted.
Second, using your server to send out many pings may slow its response time and therefore slow page load speed for your site, which definitely has a negative effect on SEO.
Third, if you can host the service on a separate IP, that seems like the best course of action: if that IP gets blacklisted, you can simply switch to a different one. Don't risk getting your domain's IP blacklisted.
Maybe I'm missing something here, but if you are using WordPress, doesn't it automatically create an auto-updating /feed/ URL for your site?
The following is from https://en.support.wordpress.com/comments/pingbacks/
Granted I am using WordPress so that is mostly what I focus on. Are you using a different CMS?
How do I send out update pings?
Many services like Technorati, Feedster, Icerocket, Google Blog Search, and others want a “ping” from you to know you’ve updated so they can index your content. WordPress.com handles it all for you. When you post, we send a ping using Ping-o-Matic!, a service that pings several different search providers all at once, including Technorati, My Yahoo!, and Google Blog Search.
Pings are automatically sent if you have a public blog. If your blog is private or if you block search engines, pings will not be sent.
-
Yup!
Use "javascript" on client site to do pinging. Or Java app running from web as applet. Or Flash.
There are two major problems - javascript doesn't support cross-platform post without hacks. And not all computers comes with Java. Same is with Flash.
-
Thank you for your response. As for the IP getting blacklisted: since we have full server control, we could always assign another dedicated IP address to the site. The issue is that we would not know if and when it happened, so we couldn't take such action promptly. Obviously we don't want to have to do this, and if the main site's IP were blacklisted it could hurt our search position until we got it resolved.
We are also planning on adding website submission limits. For example, a subscriber could only submit mysitehere.com up to 3 times per month. The only way to spam the system would be to create another account and sign up all over again. I doubt anyone would go to that much effort, but I could be wrong.
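A minimal sketch of how such a per-account quota might work (Python for illustration; the real tool is PHP and would persist counts in a database, so the names here are hypothetical):

```python
from collections import defaultdict
from datetime import date

MONTHLY_LIMIT = 3  # pings allowed per domain, per subscriber, per month

class PingQuota:
    """In-memory sketch of the per-account limit described above."""

    def __init__(self, limit: int = MONTHLY_LIMIT):
        self.limit = limit
        self.counts = defaultdict(int)  # (account, domain, year, month) -> count

    def try_ping(self, account_id: str, domain: str, today: date = None) -> bool:
        """Record a ping if the account is under quota; refuse otherwise."""
        today = today or date.today()
        key = (account_id, domain, today.year, today.month)
        if self.counts[key] >= self.limit:
            return False  # over quota: refuse rather than risk the shared IP
        self.counts[key] += 1
        return True
```

As noted in the thread, this only limits honest accounts; someone determined to spam could still register again, so a server-side cap on total outgoing pings would still be needed.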
Thoughts?
-
TL;DR - YES
Long story: I'm the author of a similar desktop tool called SEOPingler:
http://www.mobiliodevelopment.com/seopingler/
which anyone can use to ping anything, and the bots usually arrive within a second or two. It works perfectly. The problem starts when you use it to ping many URLs (say 10k-20k). At some point it stops working: the ping API endpoint still receives your requests, but I can no longer see bots coming. That means there is some per-IP threshold, and if you pass it you are temporarily blacklisted. I have also heard (though I can't confirm it) that the length of that temporary blacklisting may vary with previous usage. For my desktop tool this isn't a problem, because users can only blacklist their own IPs, and they can switch to hotspot Wi-Fi or a VPN to continue pinging.
But on a server this would be a HUGE problem, because you can't switch IPs on the fly, and no one can guarantee how long your IP will stay blacklisted.
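Since the exact threshold is unknown, a server-hosted tool would probably want a conservative global throttle across all users, because the limit applies to the IP rather than to any one account. A rough Python sketch (the cap and window below are assumed safe values, not documented limits):

```python
import time
from collections import deque

class GlobalThrottle:
    """Cap outgoing pings per rolling window across ALL users.

    Blacklisting appears to be per-IP, so a single shared counter
    protects the server's address no matter which account is pinging.
    """

    def __init__(self, max_pings: int = 100, window_seconds: float = 3600.0):
        self.max_pings = max_pings
        self.window = window_seconds
        self.sent = deque()  # timestamps of recent outgoing pings

    def allow(self, now: float = None) -> bool:
        """Return True and record the ping if under the cap, else False."""
        now = time.monotonic() if now is None else now
        while self.sent and now - self.sent[0] > self.window:
            self.sent.popleft()  # drop pings that fell out of the window
        if len(self.sent) >= self.max_pings:
            return False
        self.sent.append(now)
        return True
```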