Do I need to use a proxy when I ping my backlinks?
-
I just created 50+ backlinks, and I would like to know: when I ping those, do I need to use a proxy?
Thank you so much
-
This is not my strong suit; however, I do know Google is coming down hard on people who ping all their links at once. I would use a proxy if I were doing it. If I were you, I would release the links over the course of the next 2 to 3 weeks; if Google sees a spike, it will assume that's not natural. Here's some info below; I hope I was of some help to you.
Sincerely,
Thomas Zickell
"I think that starting about 6 months to a year ago, mass pinging tons of backlinks stopped working... has anyone noticed that bulk pinging thousands of backlinks that all point to your website through mass ping servers doesn't work any more?
It seems like Google has implemented a filter to strip away ping spam. If you find that pinging a lot of backlinks in a short time doesn't work, how about spreading them out, 100 sites per ping, across fewer ping servers; would that work?
Yes, it does, and my recommendation is: ping to fewer than 10 ping servers at a time, and ping only 100 backlinks or fewer (that link to the same money site) at a time; that works much better! Has anyone had a similar experience?"
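A minimal sketch of the batching schedule described above: at most 10 ping servers and 100 backlinks per batch, with long random pauses so the links trickle out over weeks instead of one spike. The `ping_fn` argument is a hypothetical stand-in for whatever ping client you actually use; it is not a real API.

```python
import time
import random

def batched(items, size):
    """Yield successive batches of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def drip_ping(backlinks, ping_servers, ping_fn,
              batch_size=100, max_servers=10, delay_hours=(24, 72)):
    """Ping backlinks in small batches to a few servers, pausing between batches."""
    servers = ping_servers[:max_servers]  # cap at ~10 ping servers per run
    for batch in batched(backlinks, batch_size):
        for url in batch:
            for server in servers:
                ping_fn(server, url)  # placeholder for a real ping call
        # Random one-to-three-day pause between batches so the release
        # spreads over 2-3 weeks rather than appearing as an unnatural spike.
        time.sleep(random.uniform(*delay_hours) * 3600)
```

With 50-odd links and the default batch size, everything goes out in a single batch; larger campaigns get spread out automatically.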
Related Questions
-
The use of Markup language
Hi, we were thinking of adding markup language to our site. We have been reading about it to understand the actual benefits of doing so (we have seen that many brands are not using it, including moz.com). So I have two questions: 1) Would you recommend using it for our site, www.memoq.com? 2) If yes, would it be better to create a snippet of code for our home page as an "organization" and then different snippets for our product pages as "products"? Looking forward to your comments.
Technical SEO | | Kilgray0 -
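For reference, the kind of snippet this question describes is usually emitted as JSON-LD. A minimal sketch in Python; the name and URL values are assumptions taken from the question itself, not verified data.

```python
import json

# Hypothetical "Organization" snippet for a home page; the name/url
# values come from the question above, not from verified data.
organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "memoQ",
    "url": "https://www.memoq.com",
}

# A product page would instead use "@type": "Product" with its own
# fields (name, description, offers, ...), one snippet per page.
print(json.dumps(organization, indent=2))
```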
Using rel=canonical
I have a set of static pages which were created to target long-tail keywords. That has resulted in some Domain Authority dilution. I am now in the process of creating one page which will serve the same results, but only after the user selects the fields in a drop-down. I am planning to use rel=canonical on the multiple pages, pointing back to the new page. Will it serve the purpose?
Technical SEO | | glitterbug0 -
Anyone have experience with GrowTeam or the platform they use?
We are being pitched by GrowTeam to improve our keyword rankings. They tell us they work with an SEO Technology company that does A/B testing of title tags on an engine that mimics Google's algorithms. Is this possible? I am not familiar with any platform where someone could do A/B testing on meta titles.
Technical SEO | | MikeAA0 -
Does using cufon for H-tags etc hurt SEO?
Does the use of cufon for H-tags et al affect SEO/how Google views your website?
Technical SEO | | Alligator0 -
RegEx help needed for robots.txt potential conflict
I've created a robots.txt file for a new Magento install and used an existing sitemap that was on the Magento help forums, but the trouble is I can't decipher something. It seems that I am allowing and disallowing access to the same expression for pagination. My robots.txt file (and a lot of other Magento sitemaps, it seems) includes both: Allow: /*?p= and Disallow: /*?p=*& I've searched for help on RegEx and I can't see what "&" does, but it seems to me that I'm allowing crawler access to all pagination URLs, but then possibly disallowing access to all pagination URLs that include anything other than just the page number? I've looked at several resources and there is practically no reference to what "&" does... Can anyone shed any light on this, to ensure I am allowing suitable access to the shop? Thanks in advance for any assistance
Technical SEO | | MSTJames0 -
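Worth noting here: robots.txt patterns are not full regular expressions. Only "*" (match any characters) and "$" (end anchor) are special; "&" is just a literal ampersand. A sketch of Google-style matching, assuming the standard Magento pair Allow: /*?p= and Disallow: /*?p=*& (Google picks the longest matching pattern, and Allow wins ties):

```python
import re

def pattern_to_regex(pattern):
    """Translate a robots.txt path pattern ('*' wildcard, '$' end anchor) into a regex."""
    parts = []
    for ch in pattern:
        if ch == "*":
            parts.append(".*")
        elif ch == "$":
            parts.append("$")
        else:
            parts.append(re.escape(ch))
    return re.compile("".join(parts))

def is_allowed(url_path, rules):
    """Longest matching pattern wins; on a tie, Allow beats Disallow."""
    best = None  # (pattern length, directive)
    for directive, pattern in rules:
        if pattern_to_regex(pattern).match(url_path):
            if (best is None or len(pattern) > best[0]
                    or (len(pattern) == best[0] and directive == "Allow")):
                best = (len(pattern), directive)
    return best is None or best[1] == "Allow"

rules = [("Allow", "/*?p="), ("Disallow", "/*?p=*&")]
print(is_allowed("/shoes?p=2", rules))          # plain pagination: allowed
print(is_allowed("/shoes?p=2&dir=asc", rules))  # extra parameter after p=: blocked
```

So the pair is not a conflict: plain pagination URLs stay crawlable, while pagination URLs carrying an additional query parameter (the "&") are blocked.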
My backlinks do not register in your software but are registered in my Google Webmaster Tools
Why are my backlinks not being recognized by any software other than Google Webmaster Tools?
Technical SEO | | SteveK640 -
Need help with Joomla duplicate content issues
One of my campaigns is for a Joomla site (http://genesisstudios.com), and when my full crawl was done and I reviewed the report, I had significant duplicate content issues. They seem to come from the automatic creation of /rss pages. For example: http://www.genesisstudios.com/loose is the page, but the duplicate content shows up as http://www.genesisstudios.com/loose/rss. It appears that Joomla creates feeds for every page automatically, and I'm not sure how to address the problem they create. I have been chasing down duplicate content issues for some time and thought they were gone, but now I have about 40 more instances of this type. It also appears that even though there is a canonicalization plugin present and enabled, the crawl report shows 'false' for rel=canonical tags. Anyone got any ideas? Thanks so much... Scott
Technical SEO | | sdennison0 -
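A quick way to confirm the pattern from a crawl export is to pair each /rss URL with the page it duplicates. This is a hypothetical sketch over a list of crawled URLs, not a Joomla API:

```python
def find_feed_duplicates(urls):
    """Return (feed_url, page_url) pairs where a '/rss' URL shadows a crawled page."""
    url_set = set(urls)
    pairs = []
    for url in urls:
        if url.endswith("/rss"):
            parent = url[: -len("/rss")]  # strip the trailing '/rss'
            if parent in url_set:
                pairs.append((url, parent))
    return pairs

crawl = [
    "http://www.genesisstudios.com/loose",
    "http://www.genesisstudios.com/loose/rss",
]
print(find_feed_duplicates(crawl))
```

Each pair found is a candidate for a rel=canonical pointing at the parent page, or for switching off Joomla's automatic per-page feed links.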
Should we use Google's crawl delay setting?
We've been noticing a huge uptick in Google's spidering lately, and along with it a notable worsening of render times. Yesterday, for example, Google spidered our site at a rate of 30:1 (Google spider vs. organic traffic). In other words, for every organic page request, Google hits the site 30 times. Our render times have lengthened to an average of 2 seconds (and up to 2.5 seconds). Before this renewed interest Google has taken in us, we were seeing closer to one-second average render times, and often half of that. A year ago, the ratio of spider to organic was between 6:1 and 10:1. Is requesting a crawl-delay from Googlebot a viable option? Our goal would be only to reduce Googlebot traffic, and hopefully improve render times and organic traffic. Thanks, Trisha
Technical SEO | | lzhao0
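For what it's worth, a Crawl-delay directive is easy to set and to verify with Python's standard robots parser, but Google has said Googlebot does not obey Crawl-delay (crawl rate for Google is adjusted in Search Console instead); Bing and several other crawlers do honor it. A minimal check:

```python
from urllib import robotparser

# Parse a robots.txt fragment containing a Crawl-delay directive.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Crawl-delay: 10",
    "Disallow: /private/",
])

print(rp.crawl_delay("*"))            # seconds between requests, for bots that comply
print(rp.can_fetch("*", "/private/page"))
```

So the directive is only a viable lever for the crawlers that respect it; for Googlebot specifically, the crawl-rate setting is the supported route.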