I need to whitelist the Google Analytics servers by IP address. Anyone know their server IPs?
-
Due to a walled-garden situation, I need to whitelist, by IP address, every server we should have access to. We need to track the way our users browse within the garden, so I require all the IP addresses that Google Analytics would use. I tried googling it but was not able to find any definitive answers.
Thank you in advance,
Heather
-
Indeed, if you want to whitelist Google's IPs, we'd be talking about a TON of them. To my knowledge, there is no public list anywhere.
I'm afraid there is no good answer for your question.
Your best bet might be experimenting with a self-hosted analytics package like Urchin.
-
Hi,
The reason we require the IP addresses is that we have all internet access completely shut down, so there is no way for our site to send a call to Google Analytics each time a user loads a page unless the Google servers' IP addresses have been added to our whitelist, allowing data out from our network to theirs.
I did a bunch of research after initially asking the question back in November, and apparently there are no IP ranges dedicated to the Google Analytics servers (they are just part of the overall server farm). We would need to whitelist well over 100,000 IP addresses based on the ranges I found.
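For reference, Google publishes some of its network ranges as SPF TXT records under `_spf.google.com` (via `include:` entries such as `_netblocks.google.com`); those ranges describe Google's infrastructure broadly rather than Analytics specifically, which matches the finding above. Here is a minimal offline sketch of pulling the CIDR blocks out of such a record. The sample record below uses representative values, not a live lookup:

```python
import re

def extract_networks(spf_record: str) -> list[str]:
    """Pull the ip4:/ip6: CIDR blocks out of an SPF TXT record string."""
    return [m.group(2) for m in re.finditer(r"\b(ip4|ip6):(\S+)", spf_record)]

# Representative SPF-style record; query _spf.google.com yourself for live data.
record = "v=spf1 ip4:216.239.32.0/19 ip4:64.233.160.0/19 ip6:2001:4860:4000::/36 ~all"
print(extract_networks(record))
# ['216.239.32.0/19', '64.233.160.0/19', '2001:4860:4000::/36']
```

In practice you would resolve the TXT records yourself (e.g. with dig or nslookup) and feed the returned strings to a parser like this.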
Thank you,
Heather
-
Yes, I think we need this clarified, as there are two things to consider with Analytics: depending on which method you use to verify your site, you may need Googlebot to have access. There is another option in that case, though - you could simply turn off the firewall, verify the site, and then turn the firewall back on.
Of course, Keri is quite right that there is no need for crawlers to access the site for Analytics to work once verification has been completed, as it uses JavaScript to send information from the browser.
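To illustrate that point: classic Google Analytics works by having the visitor's browser request a tiny tracking image from Google's servers, with the page data packed into the query string, so it is the browser - not your web server, and not a crawler - that needs a route out through the firewall. A rough sketch of building that kind of beacon URL follows; the parameter names and endpoint are illustrative of the classic `__utm.gif` pattern, not an exact reproduction of GA's payload:

```python
from urllib.parse import urlencode

def build_beacon(host: str, page: str, visitor_id: str) -> str:
    """Assemble a GA-style tracking-pixel URL (illustrative parameters)."""
    params = urlencode({"utmhn": host, "utmp": page, "utmcc": visitor_id})
    return f"https://www.google-analytics.com/__utm.gif?{params}"

print(build_beacon("example.com", "/pricing", "12345"))
```

The request goes out from whatever network the visitor's browser is on, which is why a server-side whitelist alone cannot see or block it in the usual case - here the browsers themselves sit behind the walled garden.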
The other question to be addressed, though, is "do you want any of the pages behind the firewall to be indexed?" If the answer is no, then the previous comments have answered the question.
If the answer is yes, then you would need to whitelist the entire range to give Googlebot ongoing access.
Sha
-
Can you clarify if you're talking about Google Analytics, or about Googlebot crawling the site itself?
-
I don't believe this is generally a problem, because traditionally Googlebot doesn't execute the JavaScript that would trigger GA in the first place.
-
Hi Heather,
I think the closest you would get would be to run a whois lookup for google.com, which will show you a series of IP address ranges associated with the domain.
While that won't give you addresses specific to Analytics, whitelisting those ranges would also ensure you are not blocking any of the various Googlebots responsible for crawling, whatever job they do.
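If you do collect ranges from a whois lookup, checking whether a given address falls inside them is straightforward with Python's standard `ipaddress` module. The ranges below are hypothetical stand-ins for whatever your lookup returns, not an authoritative Google list:

```python
import ipaddress

# Hypothetical ranges from a whois lookup for google.com;
# verify against live whois data before relying on them.
google_ranges = [ipaddress.ip_network(c) for c in ("66.249.64.0/19", "216.239.32.0/19")]

def is_whitelisted(ip: str) -> bool:
    """True if the address falls inside any whitelisted network."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in google_ranges)

print(is_whitelisted("66.249.66.1"))   # inside the first range -> True
print(is_whitelisted("203.0.113.7"))   # documentation range -> False
```

The same membership test works for IPv6 networks, so one whitelist structure can cover both families.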
Hope that helps,
Sha