IP blocked by Google
-
Our office has a number of people performing analysis and research on keyword positions, volume, competition, etc.
We have one external static IP address, which we installed so we can filter our own visits out of Google Analytics. However, by 10 AM we get impossible CAPTCHAs or are blocked by Google entirely.
Do you have any experience with such an issue? Any solutions you can recommend?
Any help would be appreciated!
-
Yup, I know that problem: too many requests to Google from the same IP. The IPs from which you query Google should change frequently, otherwise you'll get blocked.
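A rough sketch of that idea, not a drop-in tool: rotate queries across a pool of proxies so no single IP carries all the traffic. The proxy addresses and queries below are placeholders.

```python
# Sketch: round-robin queries across a proxy pool so no single IP
# sends all the requests. Proxy addresses here are placeholders.
import itertools
import requests

PROXY_POOL = [
    "http://proxy1.example.com:8080",
    "http://proxy2.example.com:8080",
    "http://proxy3.example.com:8080",
]

def fetch_serp(query, proxy):
    """Send one search query through the given proxy."""
    return requests.get(
        "https://www.google.com/search",
        params={"q": query},
        proxies={"http": proxy, "https": proxy},
        headers={"User-Agent": "Mozilla/5.0"},
        timeout=10,
    )

queries = ["keyword one", "keyword two", "keyword three"]
for query, proxy in zip(queries, itertools.cycle(PROXY_POOL)):
    response = fetch_serp(query, proxy)
    print(query, response.status_code)
```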
-
Speed (even when set to slow) is not always the problem; the real tell is regular, evenly spaced query timing, which is unlike humans, who search at random intervals.
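A sketch of what "human-like" pacing means in practice: draw each delay from a random range instead of firing queries on a fixed interval. The delay bounds are arbitrary examples, not known-safe values, and run_query is a hypothetical stand-in.

```python
# Sketch: randomized pacing. A fixed interval is a bot signature;
# a random delay drawn from a range looks closer to human behaviour.
import random
import time

def run_query(query):
    ...  # hypothetical stand-in for whatever actually sends the query

def human_like_pause(min_seconds=20.0, max_seconds=90.0):
    # Bounds are arbitrary examples, not known-safe values.
    time.sleep(random.uniform(min_seconds, max_seconds))

for query in ["keyword one", "keyword two", "keyword three"]:
    run_query(query)
    human_like_pause()
```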
-
Thanks for the reply. Yes, we do use a rank tracker (seo powersuite). We've put it at its lowest crawl speed. We're 3 people using the tool, not even simultaneously.
Great visual, btw. Guess I'll join you in that category (proudly though).
-
I start at 8 AM, and two hours later the CAPTCHAs reach a ridiculous level. The IP doesn't seem to be on a blacklist, and no, the IP never changes.
I get the CAPTCHAs as a result of using Rank Tracker by SEO PowerSuite, even though it is at its lowest crawl setting. But you can imagine that a team of several people tracking ranks raises flags at Google. What would you suggest we do as a team to avoid this?
-
You are likely using tools like SEO Elite, etc., which send rapid queries. This type of software can usually be set to a slower hit rate. I doubt you guys search fast enough to get booted without software being involved. Check your plugins and software - that would be my first recommendation.
I do get this from time to time when I try to hack URLs to death... but it only happens occasionally
-
Is this happening every day at 10AM? What happens between midnight and 10AM? Before midnight? Is this static IP on a blacklist? Does this static IP change over night? Where are you getting these 'impossible CAPTCHAs'? Where is 'Google blocking you'?
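To answer those timing questions systematically, here is a hedged diagnostic sketch: probe Google once an hour from the office IP and log when responses start to look like a block (HTTP 429, or a redirect to Google's /sorry/ interstitial). This helps pin down when the limit trips and whether it resets overnight.

```python
# Sketch: hourly probe from the office IP, logging when Google's
# responses start looking like a block.
import datetime
import time
import requests

def probe():
    resp = requests.get(
        "https://www.google.com/search",
        params={"q": "test"},
        headers={"User-Agent": "Mozilla/5.0"},
        allow_redirects=False,
        timeout=10,
    )
    blocked = resp.status_code == 429 or "/sorry/" in resp.headers.get("Location", "")
    print(f"{datetime.datetime.now().isoformat()} status={resp.status_code} blocked={blocked}")

while True:
    probe()
    time.sleep(3600)  # hourly probes won't meaningfully add to the query load
```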
-
Thank you. Very helpful
-
I have never seen this before.
Related Questions
-
Google Search Console Block
I'm new to SEO. My client's site was completed using Yoast Premium, and I then used Google Search Console to initiate the crawl. Initially I set up an http:// property and all seemed good. Then I removed that under Search Console and created an https:// property, did the render, and it appears Google has put a block in place and applied its own robots.txt file, which has basically rendered the site useless. Feedback most appreciated.
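One quick hedged check worth doing: fetch the robots.txt that is actually being served. Google never places a robots.txt on a site; the file always comes from your own host, so whatever this prints is what your server is sending. The domain below is a placeholder.

```python
# Sketch: inspect the live robots.txt to see what the server really serves.
import requests

resp = requests.get("https://www.example.com/robots.txt", timeout=10)
print(resp.status_code)
print(resp.text)
```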
Web Design | BoostMyBiz
-
Block parent folder in robots.txt, but not children
Example: I want to block this URL (which shows up in Webmaster Tools as an error): http://www.siteurl.com/news/events-calendar/usa - but not this: http://www.siteurl.com/news/events-calendar/usa/event-name
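A sketch of one way to do this, relying on the $ end-of-URL anchor that Googlebot supports (not all crawlers honor it), using the URLs from the question:

```
User-agent: *
# The $ anchors the rule to the end of the URL, so only the exact parent
# page is blocked; /news/events-calendar/usa/event-name stays crawlable.
Disallow: /news/events-calendar/usa$
```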
Web Design | Zuken
-
How does Google look at strings added to a URL
For example: http://localhost:3000/en-US/app/a-knsmtrhqrqs/personal, where knsmtrhqrqs is a string. Can Google tell this is a string, and what's their policy? Will it hurt rankings? Thank you.
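Not a definitive answer, but one common safeguard for generated path strings is a canonical link pointing at the preferred, string-free URL. A minimal sketch, assuming a clean version of the page exists; the href is a hypothetical example:

```html
<!-- Hypothetical: tells Google which URL to treat as the one to index,
     regardless of the generated string in the path. -->
<link rel="canonical" href="https://www.example.com/en-US/app/personal" />
```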
Web Design | RoxBrock
-
No-cache meta tags - do they help Google come back and reindex faster?
I saw these meta tags on a site and am trying to figure out their benefit. These meta tags are on the home page, product pages - every page of the site. Will they cause search engine bots to come back and index pages faster? Will they cause slower page loading in browsers if nothing is cached?
<meta http-equiv="pragma" content="no-cache"/>
<meta http-equiv="cache-control" content="no-cache,no-store,must-revalidate"/>
<meta http-equiv="expires" content="0"/>
Web Design | CFSSEO
-
Is it necessary to remove 301 redirects from WordPress after removing the 404 URLs from Google Webmaster?
Google Webmaster found many 404 URLs on my site. I've redirected these URLs to the relevant URLs with 301 redirects in WordPress. After that, I removed these 404 URLs from Google's index through Webmaster. Should I clean up these 301 redirects from WordPress or not? Help needed.
Web Design | SangeetaC
-
Google Tag Manager
I recently discovered Google Tag Manager and I am in the process of updating many of my websites with this feature. I am using Tag Manager to manage Google Analytics, Google Remarketing, Alive Chat, Woopra, etc. I have one question about how Tag Manager actually works. As best I can tell, the Tag Manager code snippet that I insert into my web pages is the same for all my websites and does not include a unique ID. If that is the case, then Tag Manager must search all the URLs in the TM database to find a match. What is to stop someone else from adding some rules for my URLs to their containers? I expect Google has a method to ensure proper matching, but I'm not clear on how that is enforced. Best,
Christopher
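For reference, the standard Tag Manager snippet does carry a unique container ID, which is how Google matches a page to a container. A simplified sketch of what the minified loader does; GTM-XXXXXXX is a placeholder, and this is not the verbatim official code:

```html
<script>
  // Simplified illustration, not the official minified snippet: the loader
  // requests gtm.js with a container ID that is unique to your account.
  var gtmScript = document.createElement('script');
  gtmScript.async = true;
  gtmScript.src = 'https://www.googletagmanager.com/gtm.js?id=GTM-XXXXXXX'; // unique per container
  document.head.appendChild(gtmScript);
</script>
```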
Web Design | ChristopherGlaeser
-
Do I need to redirect soft 404s that I got from Google Webmaster Tools?
Hi guys, I got 1,000+ soft 404s from GWT. All of the soft 404s return a 200 HTTP status code, but the URLs look like the following:
http://www.example.com/search/house-for-rent (query used: house for rent)
http://www.example.com/search/-----------rent (query used: -------rent)
There are no listings that match these queries, and an advanced search form is visible on these pages. Here are my questions:
1. Do I need to redirect each page to its appropriate landing page?
2. Do I need to add a user sitemap or a list of URLs where visitors can search for other properties?
Any suggestions would help. 🙂
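One common fix for soft 404s on empty search results is to return a real 404 status instead of a 200. A minimal sketch, assuming a Flask-style app; find_listings and render_results are hypothetical stand-ins for the real lookup and template:

```python
# Sketch: answer empty searches with a real 404 status so Google
# stops flagging them as soft 404s.
from flask import Flask, abort

app = Flask(__name__)

def find_listings(query):
    # Hypothetical stand-in for the real property lookup.
    return []

def render_results(listings):
    # Hypothetical stand-in for the real results template.
    return "results page"

@app.route("/search/<path:query>")
def search(query):
    listings = find_listings(query)
    if not listings:
        abort(404)  # real 404 instead of a 200 page with no results
    return render_results(listings)
```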
Web Design | esiow2013
-
Do Pages That Rearrange Set Off Any Red Flags for Google?
We have a broad content site that includes crowdsourced lists of items. A lot of the pages allow voting, which causes the content on the pages (sometimes up to 10 pages deep) to completely rearrange, and therefore spread out and shift constantly among the (up to 10) pages of content. Could this be causing any duplicate-content issues or raising any other red flags for Google? I know that the more a page changes the better, but if it's all the same content being moved up and down constantly, could Google think we're pulling some kind of "making it look like we have new content" scheme and ding us for these pages? If so, what would anyone recommend we do? Let's take an example of a list of companies with bad customer service. We let the internet vote them up and down all the time, and the order changes depending on the votes in real time. Is that page doomed, or does Google see it and love it?
Web Design | BG1985