What IP Address does Googlebot use to read your site when coming from an external backlink?
-
Hi All,
I'm trying to find more information on what IP address Googlebot uses when it arrives to crawl your site via an external backlink.
I'm under the impression that Googlebot uses international signals to determine the best IP address to use when crawling (US / non-US), and that it then carries on with that IP when it arrives at your website.
E.g. - Googlebot finds www.example.co.uk. Due to the ccTLD, it decides to crawl the site with a UK IP address rather than a US one. As it crawls this UK site, it finds a subdirectory backlink to your website and continues to crawl your website with the aforementioned UK IP address.
Is this a correct assumption, or does Googlebot look at altering the IP address as it enters a backlink / new domain?
Also, are ccTLDs the main signal Google uses when deciding whether to crawl with an international IP address rather than the standard US one? Am I right in saying that hreflang tags don't apply here at all, as their purpose is to help Google determine which page to serve to users in the SERPs based on their IP, etc.?
If anyone has any insight, that would be great.
-
There are a few things you need to marry up if you want to do this. You need the referring page or domain / hostname (to validate that the session came from a backlink you know about). Once you have filtered the data down like that, you just need to filter by user-agent ("googlebot", or any user-agent string which contains "googlebot"). Then look at the IP address field in the tabular data and you have your answer!
Here's the problem: most IP-level data is contained within basic server-side analysis packages (like AWStats, which is installed on most sites within cPanel), or alternatively you can go to the log files for much of the same data. Most referrer-level data (the stuff that deals with attribution) is contained within analytics suites like Adobe Omniture or Google Analytics.
In GA, you can't usually get at 'individual' IP-level data. There used to be a URL hack to force it to render, but it was killed off (and many people who used it were banned by Google). The reason is that Google doesn't want too much personally identifiable information (PII) harvested through its tool. It creates too many legal issues for Google (and also for whoever is leveraging that data for potentially nefarious marketing purposes).
Since you won't get enough IP-level data from GA, you're going to have to go to log files and log analysis tools instead. Hopefully they will contain at least some referral-level data. The issue is getting all the pieces you want to align, in a legally compliant way.
Obviously you have your reasons for looking. I'd check whether you can find anything in AWStats via your cPanel (if that's installed), or get the log files and analyse them with something like the Screaming Frog Log File Analyser (or a quick script along the lines sketched below).
I can't promise this will return the data you want, but it's probably your only hope.
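To make that concrete, here is a minimal sketch of the filtering described above, written in Python. It assumes an Apache/Nginx "combined" log format, and the log path and referring domain are hypothetical placeholders you would swap for your own. Note also that crawlers often send no referrer header at all, so the referrer filter may leave you with few or no matches.

```python
import re
from collections import Counter

# Hypothetical values -- adjust to your own setup.
LOG_FILE = "access.log"            # assumed: Apache/Nginx "combined" log format
KNOWN_REFERRER = "example.co.uk"   # assumed: the domain carrying the backlink

# Combined log format:
# IP - - [date] "request" status bytes "referrer" "user-agent"
LINE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d+ \S+ '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

ips = Counter()
with open(LOG_FILE, encoding="utf-8", errors="replace") as log:
    for raw in log:
        match = LINE.match(raw)
        if not match:
            continue
        # Keep hits that identify as Googlebot AND arrived via the known backlink.
        if ("googlebot" in match["agent"].lower()
                and KNOWN_REFERRER in match["referrer"].lower()):
            ips[match["ip"]] += 1

for ip, hits in ips.most_common():
    print(f"{ip}\t{hits} hits")
```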
-
Hi,
First of all: "Google crawls from many IPs and they have confirmed that they do periodically add new ones. And there are also various Googlebot user agents, not just the regular one. This is why Google doesn't publish a list of all the IPs, because there are so many of them and they can change."
You can see the full conversation here: https://productforums.google.com/forum/#!msg/webmasters/4fKthSy7oFQ/GgslLXJnDQAJ
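On that first point: since you can't rely on an exhaustive IP list, a common way to check whether a specific IP from your logs really is Googlebot is a reverse DNS lookup followed by a forward-confirming lookup. A minimal sketch using only Python's standard library (the example IP is purely illustrative):

```python
import socket

def is_real_googlebot(ip: str) -> bool:
    """Reverse-resolve the IP, check the hostname, then forward-confirm it."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)   # e.g. crawl-66-249-66-1.googlebot.com
    except OSError:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        forward_ips = socket.gethostbyname_ex(host)[2]  # forward-confirm the hostname
    except OSError:
        return False
    return ip in forward_ips

# Example usage with an IP pulled from your log files (illustrative value):
print(is_real_googlebot("66.249.66.1"))
```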
Secondly, just today Google said that "IP Addresses Don't Matter For Backlinks & Search Rankings":
https://www.seroundtable.com/google-ip-addresses-backlinks-rankings-26561.html
Hope this helps
Thanks
Related Questions
-
Bit.ly backlinks
Hi all, what experience do you have with Bit.ly links? Can I use it for backlink management?
Intermediate & Advanced SEO | | Tormar3
-
My site shows a 503 error to Googlebot, but I can see the site fine. Not indexing in Google. Help
Hi, this site is not indexed on Google at all: http://www.thethreehorseshoespub.co.uk. Looking into it, it seems to be giving a 503 error to the Google bot, but I can see the site fine. I have checked the source code and robots, and I did have a sitemap param but removed it for testing. GWMT is showing 'unreachable' if I submit a sitemap or fetch. Any ideas on how to remove this error? Many thanks in advance. (A quick diagnostic sketch follows below.)
Intermediate & Advanced SEO | | SolveWebMedia0
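For a question like the one above, a quick first check is to request the page with a Googlebot-style user-agent string and compare the status code with a plain request. Here is a minimal sketch using the third-party requests library; note this only exposes user-agent based differences, so if the server keys off IP ranges or reverse DNS it won't reproduce what the real Googlebot sees.

```python
import requests

URL = "http://www.thethreehorseshoespub.co.uk"  # the site from the question above
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

# Fetch once with the default user-agent and once claiming to be Googlebot,
# to see whether the server treats the two differently (e.g. returns a 503).
for label, headers in [("default", {}), ("googlebot UA", {"User-Agent": GOOGLEBOT_UA})]:
    response = requests.get(URL, headers=headers, timeout=10)
    print(f"{label}: {response.status_code}")
```
-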
Using a US CDN (Cloudflare) for a UK site. Should I use a UK-based CDN, as it says my server is based in the USA?
Hi All, We are a UK company with UK customers only and we use the Cloudflare CDN. Our site is hosted by a UK company with servers here, but from looking online and checking where my site is hosted, some tools are telling me the name of our UK hosting company and other tools are telling me my site is hosted in San Francisco (USA), where I presume Cloudflare is based. I know Cloudflare has a couple of servers in the UK it uses, but given that all my customers are UK based, I don't want this to affect rankings, as I thought it was a ranking benefit to be hosted in the country you are based in. Is there any issue with this and should I change, or is Google clever enough to know, so I shouldn't worry? (A DNS lookup sketch follows below.) Thanks, Pete
Intermediate & Advanced SEO | | PeteC120
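On the hosting-location point above: most "where is this site hosted?" checkers simply resolve the hostname and geolocate whichever IP comes back. With Cloudflare in front, that is a Cloudflare anycast edge address rather than the UK origin server, which is why some tools report a US location. A minimal sketch of that lookup (the hostname is a hypothetical placeholder):

```python
import socket

HOSTNAME = "www.example.co.uk"  # hypothetical: replace with your own domain

# With Cloudflare enabled, these resolve to Cloudflare edge IPs,
# not the IP of the UK origin server.
name, aliases, addresses = socket.gethostbyname_ex(HOSTNAME)
print(name, addresses)
```
-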
Moving to a new site while keeping old site live
For reasons I won't get into here, I need to move most of my site to a new domain (DOMAIN B) while keeping every single current detail on the old domain (DOMAIN A) as it is. Meaning, there will be 2 live websites that have mostly the same content, but I want the content to appear to search engines as though it now belongs to DOMAIN B. Weird situation. I know. I've run around in circles trying to figure out the best course of action. What do you think is the best way of going about this? Do I simply point DOMAIN A's canonical tags to the copied content on DOMAIN B and call it good? Should I ask sites that link to DOMAIN A to change their links to DOMAIN B, or start fresh and cut my losses? Should I still file a change of address with GWT, even though I'm not going to 301 redirect anything?
Intermediate & Advanced SEO | | kdaniels0
-
The benefits of having a dedicated IP
Is this true? A claim by SiteGround: "Having a dedicated IP for each website is considered by some experts as an advantage for search engine optimization. There is a common belief that sites with dedicated IP addresses do better in the search engine results than those on shared IPs. Such sites do not share the risk of being banned for sharing the same IP in case another website hosted on the same server gets banned by a search engine."
Intermediate & Advanced SEO | | JordanBrown0
-
Using a canonical URL to point to an external page
I was wondering if I can use a canonical URL that points to a page residing on an external site? So a page like www.site1.com/whatever.html will have a canonical link in its header to www.site2.com/whatever.html. Thanks.
Intermediate & Advanced SEO | | llamb0
-
Multiple sites in the same niche
Hi All, a question regarding multiple sites in the same niche... If I have, say, 10 sites all targeting the same niche, yet all on different C-class IPs with different hosts, registrars, whois data and ages, can I use the same template, or will Google discern a pattern? Basically I have developed a WordPress template which I want to use on the sites, albeit with different logos / brand colours. NB: All of the 10 sites will have unique, original content and they will NOT be interlinked.
Intermediate & Advanced SEO | | danielparry1
-
How do you implement dynamic SEO-friendly URLs using Ajax without using hashbangs?
We're building a new website platform and are using Ajax as the method for allowing users to select from filters. We want to dynamically insert elements into the URL as the filters are selected so that search engines will index multiple combinations of filters. We're struggling to see how this is possible using the Symfony framework. We've used www.gizmodo.com as an example of how to achieve SEO- and user-friendly URLs, but this is only an example of achieving this for static content. We would prefer to go down a route that didn't involve hashbangs if possible. Does anyone have any experience using hashbangs, and how did it affect their site? Any advice on the above would be gratefully received.
Intermediate & Advanced SEO | | Sayers1