How to avoid automated queries to Google
-
We have a search engine marketing team working on different projects, and we all share the same IP address. We check our rankings manually on Google. Does this count as sending automated queries to Google? Can it affect our sites? What do you suggest for checking rankings without sending multiple queries to Google?
-
That assumes they have a fixed IP though, doesn't it?
-
You should have your IP addresses excluded via Google Analytics; there is a filter option to do that.
-
Your post says you check rankings 'manually'. Do you do this by putting the query into Google through a browser? If so, the only time I've seen this have any effect is when you get a CAPTCHA to check you're still human.
We've noticed this happens less often when you're logged into Google. It's rare in any case: you'd have to submit queries and record rankings faster than any human could for Google to suspect you're automating queries, so I'm not sure you need to worry.
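If you do want to script rank checks without hammering google.com, one option is to go through an official endpoint instead of scraping result pages. Below is a minimal Python sketch against the Custom Search JSON API; the API key, engine ID, query, and domain are placeholders, quotas apply, and Programmable Search results don't always match what you'd see on google.com, so treat it as a trend indicator rather than ground truth.

```python
import time
import requests

API_KEY = "YOUR_API_KEY"    # placeholder: an API key from the Google Cloud console
ENGINE_ID = "YOUR_CX_ID"    # placeholder: a Programmable Search Engine ID

def rank_for(query: str, domain: str, pages: int = 3) -> int | None:
    """Return the 1-based position of `domain` for `query`, or None if it
    is absent from the first `pages` * 10 results."""
    for page in range(pages):
        resp = requests.get(
            "https://www.googleapis.com/customsearch/v1",
            params={"key": API_KEY, "cx": ENGINE_ID, "q": query,
                    "start": page * 10 + 1},
            timeout=10,
        )
        resp.raise_for_status()
        for i, item in enumerate(resp.json().get("items", [])):
            if domain in item.get("link", ""):
                return page * 10 + i + 1
        time.sleep(1)  # be polite and stay under the per-second quota
    return None

print(rank_for("best running shoes", "example.com"))  # hypothetical query/domain
```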
Related Questions
-
Are there ways to avoid false-positive "soft 404s" from Google?
Sometimes I get alerts from Google Search Console that it has detected soft 404s on different websites. Since I take great care never to have true soft 404s, these are always false positives. Today I got one on a website with pages promoting events. The page for one sold-out event says that "tickets are no longer available," which seems to have tripped Google into treating the page as a soft 404. It's kind of incredible to me that in the current era, with things like ChatGPT, Google doesn't seem to understand natural language. That has me thinking: are there strategies or best practices for how we write copy on the page so Google doesn't flag it as a soft 404? It seems like anything that tells a user an item isn't available could trip it into thinking the page is a 404. In the case of my page, it's important to tell the public that the event has sold out, but also to use their interest in that event to promote other events, so I don't want the page deindexed or ranking poorly!
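One approach that may help is to make the sold-out state machine-readable instead of leaving it to Google's reading of the copy. Here is a minimal Flask sketch (the route, event name, and values are hypothetical) that serves the page with an explicit 200 and schema.org Event markup using offers.availability of SoldOut; no guarantee this prevents the flag, but it removes the ambiguity.

```python
import json
from flask import Flask, Response

app = Flask(__name__)

# Hypothetical event record; on a real site this would come from a database.
EVENT = {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Spring Gala",
    "startDate": "2024-05-18T19:00",
    "eventStatus": "https://schema.org/EventScheduled",  # still happening
    "offers": {
        "@type": "Offer",
        "price": "45.00",
        "priceCurrency": "USD",
        "availability": "https://schema.org/SoldOut",    # sold out, not gone
    },
}

@app.route("/events/spring-gala")
def sold_out_event():
    html = (
        "<html><head><title>Spring Gala - Sold Out</title>"
        f'<script type="application/ld+json">{json.dumps(EVENT)}</script>'
        "</head><body><h1>Spring Gala</h1>"
        "<p>This event has sold out. See our other upcoming events below.</p>"
        "</body></html>"
    )
    # A deliberate 200 with substantive, interlinked content: the page is
    # alive and useful, and the sold-out state is declared in structured data.
    return Response(html, status=200, mimetype="text/html")
```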
Technical SEO | IrvCo_Interactive -
My Website stopped being in the Google Index
Hi there, my website is two weeks old. I published it and it was ranking at about page 10 or 11 for a week, maybe a bit longer. The last few days it dropped out of the rankings, which I assumed was the Google algorithm doing its thing, but when I checked Google Search Console it says my domain is not in the index: 'This page is not in the index, but not because of an error. See the details below to learn why it wasn't indexed.' I click Request Indexing, and after a bit it goes green, saying it was successfully indexed. Then when I recheck the page it gives me the same message again. Not sure why it says this; any ideas or help are appreciated, cheers.
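Before anything else, it's worth ruling out the usual self-inflicted causes: a stray noindex (meta tag or X-Robots-Tag header) or a robots.txt block. A quick diagnostic sketch in Python, with requests as the only dependency; the checks are deliberately rough string matches, and the URL is a placeholder.

```python
import requests
from urllib.parse import urlsplit

def index_diagnostics(url: str) -> None:
    """Rough checks for common 'not indexed' culprits: bad status codes,
    noindex directives, and robots.txt blocks."""
    resp = requests.get(url, timeout=10, allow_redirects=True)
    print("final URL:", resp.url)                   # watch for redirect chains
    print("status:", resp.status_code)              # should be 200
    print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag"))  # must not say noindex
    # Crude string match; a real check would parse the <meta name="robots"> tag.
    print("possible meta noindex:", "noindex" in resp.text.lower())

    parts = urlsplit(url)
    robots = requests.get(f"{parts.scheme}://{parts.netloc}/robots.txt", timeout=10)
    print("robots.txt starts:", robots.text[:200])

index_diagnostics("https://example.com/")  # placeholder URL
```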
Technical SEO | sydneygardening -
Fetch as Google temporarily lifting a penalty?
Hi, I was wondering if anyone has seen this behaviour before? I haven't! We have around 20 sites, and since the Medic update each one has lost all of its rankings (not in the index at all), except when a location is appended to a keyword.

I set to work trying to identify a common issue on each site, and began by improving speed issues flagged in PageSpeed Insights. On one site I realised that after I had improved the speed score and then clicked "Fetch as Google", the rankings for that site all returned within seconds. I did the same for a different site, with exactly the same result. Cue me jumping around the office in delight! The pressure is off, people's jobs are safe, have a cup of tea and relax. Unfortunately this relief only lasted between 6 and 12 hours, and then the rankings went again.

To me it seems like the sites are all suffering from some kind of on-page penalty which is lifted until the page can be assessed again, at which point the penalty is reapplied. Not one to give up, I set about methodically making changes until I found the issue. So far I have completely rewritten one site, reduced overuse of keywords, and added over 2,000 words to the homepage. I clicked Fetch as Google and the site came back, for six hours... So then I gave the site a completely fresh redesign and again clicked Fetch as Google, with the same result. Since doing all that, I have swapped over to HTTPS, 301 redirected, etc., and now the site is completely gone and won't come back after fetching as Google. Ugh! So before I dig myself even deeper, has anyone any ideas? Thanks.
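Since re-checking speed scores by hand across ~20 sites is tedious, here is a small Python sketch against the public PageSpeed Insights v5 API to keep a baseline while experimenting; the site URLs are placeholders, and an API key isn't required for occasional use but raises the quota.

```python
import requests

def pagespeed_score(url: str, strategy: str = "mobile") -> float:
    """Return the Lighthouse performance score (0.0-1.0) for `url` from the
    public PageSpeed Insights v5 API."""
    resp = requests.get(
        "https://www.googleapis.com/pagespeedonline/v5/runPagespeed",
        params={"url": url, "strategy": strategy},
        timeout=60,  # the audit itself can take a while
    )
    resp.raise_for_status()
    data = resp.json()
    return data["lighthouseResult"]["categories"]["performance"]["score"]

# Placeholder URLs standing in for the ~20 sites.
for site in ("https://example.com", "https://example.org"):
    print(site, pagespeed_score(site))
```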
Technical SEO | semcheck1 -
Is using Google AdWords good?
I heard that if you use AdWords, Google drops your ranking a little bit, because you already pay money for results. I think that would be reasonable. Is it true?
Technical SEO | | umutege0 -
Google authorship syntax, plus nofollow
I have seen two forms of rel=author syntax. Are they both valid? (1) (2) Second, does nofollow take away authorship? Is there any point in using rel=author on a link that is rel=nofollow? Like this:
Technical SEO | | scanlin0 -
Does Google Always Tell You About Penalties?
Hi, one of my clients has recently suffered a big decline in rankings on some of their keywords, going from positions in the top 10 to below 50 for some keywords, but not all. Their website is still indexed and there is no link warning from Google. Some of their links from a previous link builder are dodgy, and we have already tried to contact the other company for removal, but to no avail. Any ideas where I start? So far I'm planning to increase branded anchor text links and add some content, as the site is fairly light on content. Is it worth filing a reconsideration request and mentioning the bad links in case they are the problem? Has anyone seen any particularly good resources on what to do when it appears you have a penalty? None of my other clients have been affected by Panda or Penguin, and although I have done some general reading, I need to know what action to take and get up to speed FAST! Thanks in advance. Kind regards, Karen
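If the link builder's sources won't respond to removal requests, the usual fallback is Google's disavow file, uploaded via Search Console, before or alongside a reconsideration request. A minimal sketch of the format, with placeholder domains; one entry per line, and lines starting with # are comments:

```
# Disavow file: plain text, UTF-8, uploaded via Search Console.
# Contacted the site owner twice, no response.
domain:spammy-directory.example
http://low-quality-site.example/links/page-42.html
```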
Technical SEO | Karen_Dauncey -
Should I delete Google Places and start over?
My Google Places page is stagnant. It never changes rank, doesn't pick up corresponding reviews from directory sites, disappears altogether after a few days for certain keywords, and ranks poorly overall. We have had a horrible time getting our web presence to be uniform: name, address, phone, etc. Our business previously operated under a different name, so all those old listings were still active; we have had different doctors over the years who were associated with the business name in different listings; we used a referral service that sponsored listings using a different phone number; and we changed our web URL in the past year, so some sites indicated the wrong address. HUGE HEADACHE. Are there any positives/negatives to deleting the Places page and starting over? Here is the page: http://maps.google.com/maps/place?hl=en&georestrict=input_srcid%3Ab1706925095a8afa Thank you for your help!
Technical SEO | PMC-312087 -
Does Google support ETag?
Hello, people! I have a question regarding ETags. I know Google supports the If-Modified-Since HTTP header, aka the Last-Modified mechanism. I used an ETag instead of a Last-Modified header, and it seems like Google does support this, yet here is my question. code.google suggests the following:

GData-Version: 2.0
ETag: "C0QBRXcycSp7ImA9WxRVFUk."

but I used an ETag like this:

ETag: "10cd712-eaae-b279a480"

I didn't include "GData-Version: 2.0". Does this mean Google may not support my ETag?
Technical SEO | Artience