How do I run a reverse DNS check when doing log file analysis?
-
I'm doing some log file analysis and need to run a reverse DNS check to ensure that I'm analysing logs from Google and not any imposters. Is there a command I can use in the terminal to do this?
If not, what's the best way to verify Googlebot?
Thanks
-
That's awesome! Glad to know there's a bulk tool out there!
-
Hi Tyler,
Thanks for your reply. I managed to get down to 98 unique IPs and ran a bulk reverse DNS/IP lookup using this tool:
https://www.infobyip.com/ipbulklookup.php
Thanks for your help though!
-
Hey Daniel,
If you want to verify that a user-agent is actually Googlebot, you'll want to use a log file analysis tool to aggregate all of the IP addresses associated with Googlebot. Once you have that list, you can perform a reverse DNS lookup on each IP to verify whether it's actually associated with Google or not.
If you're on Windows, these steps should work:
https://www.serverintellect.com/support/dns/reverse-dns/
If you're on a Mac, try these steps:
1. Open Terminal.
2. Type "host" followed by the IP address, for example: "host 66.249.66.1"
3. Hit Enter.
4. View the results, for example: "1.66.249.66.in-addr.arpa domain name pointer crawl-66-249-66-1.googlebot.com"
If the returned hostname is on google.com or googlebot.com, you can be sure it's actually Google crawling your site. Unfortunately, I don't know of a faster way to achieve these results. I'm sure there's a tool out there, I just haven't found it yet.
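That said, if you're comfortable with a little scripting, the same checks can be run over a whole list of IPs at once. Here's a minimal sketch, assuming a file called ips.txt with one IP per line and the standard host utility; it also forward-confirms each hostname (resolving it back to make sure it returns the original IP), which is the extra step Google recommends:

```bash
#!/usr/bin/env bash
# Bulk-verify Googlebot IPs: reverse lookup, domain check, forward confirm.
# Assumes ips.txt contains one IP address per line.
while read -r ip; do
  # Reverse lookup: pull the hostname out of the "host" output
  name=$(host "$ip" | awk '/domain name pointer/ {print $NF}')
  name=${name%.}  # strip the trailing dot, if present
  case "$name" in
    *.googlebot.com|*.google.com)
      # Forward confirm: the hostname must resolve back to the same IP
      if host "$name" | grep -q "address ${ip}$"; then
        echo "$ip verified ($name)"
      else
        echo "$ip FAILED forward confirmation ($name)"
      fi
      ;;
    *)
      echo "$ip is NOT Googlebot (${name:-no PTR record})"
      ;;
  esac
done < ips.txt
```

It's a rough check (the match on the forward lookup is deliberately loose), but for spot-checking a few hundred IPs pulled from your logs it does the job.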
This might also be a good resource for you: https://support.google.com/webmasters/answer/80553?hl=en
Good luck!
-Tyler
Related Questions
-
CSS and JavaScript files - website redesign project
UPDATED: We ran a crawl of the old website and have a list of CSS and JavaScript links that were part of the old website content. As the website has been redesigned from scratch, I don't think these old CSS and JavaScript files are being used for anything on the new site. I've read elsewhere online that you should redirect "all" content files when launching/migrating to a new site. We are debating whether this is needed for CSS and JavaScript files. Examples: (A) http://website.com/wp-content/themes/style.css (B) http://website.com/wp-includes/js/wp-embed.min.js?ver=4.8.1
Technical SEO | CREW-MARKETING
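One practical way to settle the debate is to check whether the old asset URLs still receive any requests at all. A rough sketch, assuming an Apache-style access log; the log path and patterns below are illustrative, not the asker's actual setup:

```bash
# Count log lines still requesting the old theme CSS or wp-embed script.
# Adjust the log path and patterns to match your server.
grep -cE "wp-content/themes/style\.css|wp-includes/js/wp-embed\.min\.js" \
  /var/log/apache2/access.log
```
-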
Does uploading a new disavow file wipe out the original?
Hi guys, Just struggling to get a definitive answer on this one. If, say, I disavow 55 domains, then upload a brand new disavow file with only 35 domains in it, does this mean the original disavow file will be overwritten and those original domains will be forgotten about? Kind regards!
Technical SEO | WCR
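For reference, a disavow file is a plain text list, and each upload replaces the previous one entirely, so a new file needs to carry over every domain you still want disavowed. A minimal illustrative example (the domains are placeholders):

```
# The upload replaces the previous file completely: only the
# domains and URLs listed here remain disavowed after submission.
domain:spammy-directory-example.com
domain:paid-links-example.net
https://another-example.org/specific-bad-page.html
```
-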
301 redirect: keep HTML files on the server?
Hello, just one quick question which came up in the discussion here: http://moz.com/community/q/take-a-good-amount-of-existing-landing-pages-offline-because-of-low-traffic-cannibalism-and-thin-content When I do a 301 redirect where I merge content from two pages, should I keep the redirecting page/HTML file on the server? Or should I delete it? Or does it make no difference at all?
Technical SEO | _Heiko_
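Once the redirect is configured at the server level (e.g., an .htaccess rule), the old HTML file is never served, so keeping or deleting it is mostly housekeeping; just confirm the redirect still answers correctly after cleanup. A quick check with curl, using a placeholder URL:

```bash
# Expect a "301 Moved Permanently" status and a Location header
# pointing at the merged page; the URL below is a placeholder.
curl -sI http://example.com/old-merged-page.html | head -n 5
```
-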
Forward DNS Please HEEEELLLLPPPPPPP!!!
Sorry if I'm asking a rookie question here, I'm a hobbyist, not a pro. I purchased a domain from GoDaddy and it's hosted on Network Solutions. We are now adding a shopping cart and need to create the subdomain shop.mydomain.com and point its DNS to Volusion. I assumed this is done at GoDaddy, where I purchased the domain. I called and they told me that Network Solutions needs to do it. I called NS and they told me GD had to do it. I called GD again hoping to get a different answer, but NOPE... guess what??? "everyone knows that it's done by whoever is hosting the site." UUUUGGGG So PLEASE let me know WHO has this responsibility???? Hopefully everyone has a common answer. 🙂 I still believe it's done at GD. Thanks Everyone
Technical SEO | dmac
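For what it's worth, subdomain records are managed wherever the domain's authoritative nameservers are, which may be the registrar or the host. dig can settle the argument quickly (mydomain.com is a placeholder):

```bash
# Whoever runs these nameservers is where the shop record gets added,
# usually as a CNAME pointing at Volusion.
dig +short NS mydomain.com

# After the record is added, confirm it resolves:
dig +short CNAME shop.mydomain.com
```
-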
A week ago I asked how to remove duplicate files and duplicate titles
Three weeks ago we had a very large number of site errors revealed by crawl diagnostics. These errors related purely to the presence of both http://domain name and http://www.domain name. We used the rel canonical tag in the head of our index page to direct everything to the www. preference, and we saw no improvement. Matters got worse two weeks ago when I checked with Google Webmaster Tools and found that Google had somehow lost our preference choice. A week ago I asked how to overcome this problem and received good advice about how to re-enter our www preference with Google. This we did and it was accepted. We also submitted a new sitemap.xml, which was also acceptable to Google. Today, a week later, we find that we have even more duplicate content (over 10,000 duplicate errors) showing up in the latest diagnostic crawl. Does anyone have any ideas? (Getting a bit desperate.)
Technical SEO | FFTCOUK
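One thing worth ruling out here: rel=canonical is only a hint, while a server-level 301 from the non-www host to the www host removes the duplication outright. A quick way to see what the bare domain currently returns (domain.com is a placeholder):

```bash
# A 301 with a Location header pointing at the www host is the goal;
# a 200 here means both hostnames are serving duplicate content.
curl -sI http://domain.com/ | grep -iE "^HTTP|^location"
```
-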
How to rewrite WordPress permalinks for reverse proxy?
Our main site, www.domain.com, is on an IIS 6 server. When we started our blog, we wanted to put it in a subdirectory (domain.com/blog), but we couldn't because our IT people refused to support it. Instead, we built it on a third-party Apache server and configured it to open under blog.domain.com. However, I came across this SEOmoz post about the glories of reverse proxies, so I've persuaded our IT people to take a swing at it. We got it to work on a staging server, but the permalinks won't change (still appear as blog.domain.com/slug). The IT guys say it's due to a configuration problem with WordPress. Can somebody out there point me in the right direction as far as working out the URL issues with this?
Technical SEO | ufmedia
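WordPress generates permalinks from its stored home and site URLs, not from whatever hostname the proxy presents, so those usually have to be updated on the WordPress side as well. A sketch using WP-CLI, assuming the blog should surface at www.domain.com/blog (the URLs are placeholders):

```bash
# Point WordPress at the public URL served through the reverse proxy
wp option update home 'http://www.domain.com/blog'
wp option update siteurl 'http://www.domain.com/blog'

# Rewrite the old hostname inside stored content; preview with --dry-run
wp search-replace 'blog.domain.com' 'www.domain.com/blog' --dry-run
```
-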
Robots.txt file getting a 500 error - is this a problem?
Hello all! While doing some routine health checks on a few of our client sites, I spotted that a new client of ours - whose website was not designed or built by us - is returning a 500 internal server error when I try to look at the robots.txt file. As we don't host/maintain their site, I would have to go through their head office to get this changed, which isn't a problem, but I just wanted to check whether this error will actually be having a negative effect on their site / whether there's a benefit to getting this changed? Thanks in advance!
Technical SEO | themegroup
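To answer the underlying worry: yes, it's worth fixing, since Google generally stops crawling a site while robots.txt returns a server error. A quick check (the hostname is a placeholder):

```bash
# Healthy is "200 OK"; a 404 is also fine (Google treats it as
# "no restrictions"), but a 500 can throttle or halt crawling.
curl -sI https://client-site.example/robots.txt | head -n 1
```
-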
Does Google index XML files?
Does Google or other search engines include XML files in their index? More specifically, I am wondering how Google knows the difference between a generic XML file and an RSS feed.
Technical SEO | nicole.healthline
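Google does index raw XML files, and it tells a generic XML document apart from a feed largely by the Content-Type header and the root element (<rss> or <feed> for feeds). You can see what a given URL declares with curl (the URL is a placeholder):

```bash
# Feeds are typically served as application/rss+xml or
# application/atom+xml; generic XML as text/xml or application/xml.
curl -sI https://example.com/feed.xml | grep -i "^content-type"
```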