Fetch as Googlebot
-
"With Fetch as Googlebot you can see exactly how a page appears to Google"
I have verified the site and clicked the Fetch button, but how can I "see exactly how a page appears to Google"?
Thanks
-
Hi Atul,
Here's Google's comprehensive explanation about Fetch as Googlebot, and what it can do for you.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=158587
-
"The title and description of the site you mentioned are displayed properly in Google Search."
Will they be displayed in Google Webmaster Tools? That is what I would like to know.
-
What is the exact issue you are encountering? The title and description of the site you mentioned are displayed properly in Google Search.
"Fetch as Googlebot" shows you the source code of the page as Google fetched it; whatever appears there gets crawled by Google, though you need to ensure the syntax is correct.
-
Instant Previews are page snapshots that are displayed in search results.
It's not displaying the title or description. Only the site image is being displayed.
The site in question is http://bit.ly/xu2mGi
-
"Fetch as Googlebot" gives you exactly what Google fetches from your source code (no AJAX-generated content will appear, for instance) along with the server response code.
If you want to see how your page will show in the SERP preview, you need to use the "Instant Previews" tool under the Labs section of Google Webmaster Tools.
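For readers who want to approximate this outside of Webmaster Tools: below is a minimal Python sketch that requests a page with Googlebot's user-agent string and extracts the <title> from the raw source. This is only an approximation; the real tool fetches from Google's own infrastructure, does no special rendering here, and servers may treat Google's actual IPs differently.

```python
import re
import urllib.request

# Googlebot's documented desktop user-agent string.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch_as_googlebot(url):
    """Request a page with Googlebot's UA; returns (status, raw_html).
    Approximation only: no JavaScript is executed, so this shows the
    unrendered source, much like the Fetch as Googlebot output."""
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    with urllib.request.urlopen(req) as resp:
        return resp.status, resp.read().decode("utf-8", errors="replace")

def extract_title(html):
    """Pull the <title> text out of raw HTML, as it appears in source."""
    m = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    return m.group(1).strip() if m else None
```

If `extract_title` returns None on the fetched source, the title Google sees in the raw HTML is missing, regardless of what renders in a browser.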
Related Questions
-
If Fetch As Google can render the website, should it appear in the SERP?
Hello everyone and thank you in advance for helping me. I have a React.js application built with Create React App (zero configuration), and I connect it to a CodeIgniter (PHP) API using Axios. Before moving to React, this website was at the top of Google's SERPs for specific keywords. After switching to React, and after some URL changes with no redirects in .htaccess or elsewhere, I lost my search engine visibility! I suspect Google penalties. I tried "react-snap", "react-snapshot" and similar tools for prerendering, but there were many problems with them. I also tried Prerender.io, but unfortunately my hosting provider wouldn't help me configure it on the shared host. Finally I found a great article, and my website now displays in the Rendering box of Fetch As Google. The dynamic content still doesn't appear in the Fetching box, but I can see my entire website in both "This is how Googlebot saw the page" and "This is how a visitor to your website would have seen the page" for all pages without any problem. If Fetch As Google can render the entire website, is it possible that my pages will be indexed after a while and appear in Google's SERPs? mokaab_serp.png
Intermediate & Advanced SEO | | hamoz10 -
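Prerendering services like the Prerender.io setup mentioned above work by detecting crawler user-agents and serving a static HTML snapshot instead of the JavaScript app. A minimal Python sketch of just the detection step (the bot signature list is illustrative, not Prerender.io's actual list):

```python
# Illustrative crawler UA substrings; real middleware keeps longer lists.
BOT_SIGNATURES = ("googlebot", "bingbot", "yandex", "baiduspider",
                  "facebookexternalhit", "twitterbot")

def wants_prerendered_page(user_agent):
    """Decide whether a request should get the static HTML snapshot
    rather than the client-side JavaScript app, the way prerender
    middleware does: case-insensitive substring match on the UA."""
    ua = (user_agent or "").lower()
    return any(sig in ua for sig in BOT_SIGNATURES)
```

In a real deployment this check sits in front of the app (in .htaccess rewrite conditions, an Nginx map, or server middleware) and routes matching requests to pre-generated snapshots.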
Googlebot on steroids... Why?
We launched a new website (www.gelderlandgroep.com). The site contains 500 pages, but some pages (like https://www.gelderlandgroep.com/collectie/) contain filters, so there are a lot of possible URL parameter combinations. Last week we noticed a tremendous amount of traffic (25 GB!) and CPU usage on the server:

2017-12-04 16:11:57 W3SVC66 IIS14 83.219.93.171 GET /collectie model=6511,6901,7780,7830,2105-illusion&ontwerper=henk-vos,foklab 443 - 66.249.76.153 HTTP/1.1 Mozilla/5.0+(Linux;+Android+6.0.1;+Nexus+5X+Build/MMB29P)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/41.0.2272.96+Mobile+Safari/537.36+(compatible;+Googlebot/2.1;++http://www.google.com/bot.html) - - www.gelderlandgroep.com 200 0 0 9445 501 312

We found that "Googlebot" was firing many, many requests. First we did an nslookup on the IP address, and it actually does seem to be Googlebot. Then we visited Google Search Console, and I was really surprised... Googlebot on steroids? Googlebot requested 922,565 different URLs, trying combinations of every filter/parameter on the site. Why? The sitemap.xml contains 500 URLs. The authority of the site isn't very high, and there is no other signal that this is a special website, so why so many Google resources? Of course we will exclude the parameters in Search Console, but I have never seen this much Googlebot activity on a small website before! Does anybody have a clue? Regards, Olaf searchconsole.png nslookup.png
Intermediate & Advanced SEO | | Olaf0 -
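The parameter explosion described in the question above is combinatorial: when each filter option can independently be included or excluded from the query string, the number of distinct crawlable URLs multiplies quickly. A rough Python sketch of the count (the filter names and option counts here are made up, not the site's real facets):

```python
def count_filter_urls(filter_options):
    """Count the distinct URLs a faceted filter can generate when every
    non-empty subset of options can appear in the query string.
    filter_options maps filter name -> number of available options."""
    total = 1
    for n in filter_options.values():
        total *= 2 ** n          # each option is independently on or off
    return total - 1             # exclude the all-empty combination

# Hypothetical facets: just 22 options across 3 filters already
# yields millions of URL variants for a 500-page site.
print(count_filter_urls({"model": 10, "ontwerper": 8, "kleur": 4}))
```

This is why faceted navigation needs parameter handling (Search Console URL parameter settings, rel="canonical", or robots.txt rules): the crawlable URL space is exponential in the number of filter options, not proportional to the page count.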
When to Fetch?
If I'm about to submit a new sitemap for Google to crawl, is there any need to use the Fetch tool?
Intermediate & Advanced SEO | | muzzmoz0 -
Ajax Module Crawlability vs. WMT Fetch & Render
Recently a module was built into the homepage to pull in content from an outside source via AJAX, and I'm curious about the overall crawlability of that content. In WMT, if I fetch & render, the content displays correctly, but if I view source, all I see is the empty container. Should I take additional steps so that the actual AJAX content appears in my source code, or am I "good" since the content displays correctly when I fetch & render?
Intermediate & Advanced SEO | | RosemarieReed0 -
Use of ajax to fetch data of a section
Hi, Is it OK to fetch a section of a page using AJAX? Will it be crawlable by Google? I have already seen Google's directions for getting a complete AJAX-fetched page crawled by Google. Is there a way to get a particular section of a page, fetched through AJAX, indexed by Google? Regards
Intermediate & Advanced SEO | | vivekrathore0 -
Best way to view Global Navigation bar from Googlebot's perspective
Hi, Links in the global navigation bar of our website do not show up when we look at the Google cache --> text-only version of the page. These links use style="display:none;" when we look at the HTML source. But if I use the "User Agent Switcher" add-on in Firefox and set it to Googlebot, the links in the global nav are displayed. I am wondering what is the best way to find out whether Google can or cannot see the links. Thanks for the help! Supriya.
Intermediate & Advanced SEO | | SShiyekar0 -
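One way to audit the situation described above locally is to scan the raw HTML for anchors hidden with inline display:none, since those are what the text-only cache drops. A crude regex-based Python sketch (a real audit should use a DOM parser and also account for CSS classes and external stylesheets):

```python
import re

def hidden_links(html):
    """Return anchor tags whose inline style hides them with
    display:none. Regex scan for illustration only; it misses links
    hidden via CSS classes or external stylesheets."""
    pattern = re.compile(
        r'<a\b[^>]*style\s*=\s*"[^"]*display\s*:\s*none[^"]*"[^>]*>',
        re.I)
    return pattern.findall(html)
```

Any links this flags exist in the source Google fetches, but hiding navigation links from users while keeping them in the HTML is risky from a guidelines perspective, so the better fix is usually to stop hiding them rather than to confirm Google can still read them.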
Manipulate Googlebot
Problem: I have found something weird in the server log, shown below. Googlebot visits folders and files which do not exist at all. There is no photo folder on the server, but Googlebot requests files inside the photo folder and gets 404 errors. I wonder if these are SEO hacking attempts, and how someone could manage to manipulate Googlebot.
==================================================
66.249.71.200 - - [22/Aug/2012:02:31:53 -0400] "GET /robots.txt HTTP/1.0" 200 2255 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.71.25 - - [22/Aug/2012:02:36:55 -0400] "GET /photo/pic24.html HTTP/1.1" 404 - "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.71.26 - - [22/Aug/2012:02:37:03 -0400] "GET /photo/pic20.html HTTP/1.1" 404 - "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.71.200 - - [22/Aug/2012:02:37:11 -0400] "GET /photo/pic22.html HTTP/1.1" 404 - "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.71.200 - - [22/Aug/2012:02:37:28 -0400] "GET /photo/pic19.html HTTP/1.1" 404 - "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.71.26 - - [22/Aug/2012:02:37:36 -0400] "GET /photo/pic17.html HTTP/1.1" 404 - "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.71.200 - - [22/Aug/2012:02:37:44 -0400] "GET /photo/pic21.html HTTP/1.1" 404 - "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
Intermediate & Advanced SEO | | semer1 -
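Whether requests like those in the log above really come from Google can be checked the way Google documents it: reverse-DNS the IP, confirm the hostname is under googlebot.com or google.com, then forward-resolve that hostname and confirm it maps back to the same IP. A Python sketch:

```python
import socket

GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def looks_like_google_host(hostname):
    """Pure check: is a reverse-DNS hostname under a Google domain?"""
    return hostname.endswith(GOOGLE_SUFFIXES)

def is_real_googlebot(ip):
    """Google's documented verification: reverse DNS, domain check,
    then forward DNS back to the original IP. Anyone can fake the
    user-agent string, but not this round trip."""
    try:
        host = socket.gethostbyaddr(ip)[0]
    except (socket.herror, socket.gaierror):
        return False
    if not looks_like_google_host(host):
        return False
    try:
        return ip in socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False
```

If the 66.249.71.x addresses pass this check, the requests are genuine Googlebot, most likely following links to /photo/ pages that exist (or once existed) somewhere on the web, not a manipulation of Googlebot itself.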
How to find what Googlebot actually sees on a page?
1. When I disable JavaScript in Firefox and load our home page, it is missing the entire middle section.
2. Also, the global nav dropdown menu does not display at all (with JavaScript disabled). I believe this is not good.
3. But when I type the website name into Google search, click on the cached version of the home page, and then click on the text-only version, it displays the global nav links fine.
4. When I switch the user agent to Googlebot (using the Firefox plugin "User Agent Switcher"), the home page and global nav display fine.
Should I be worried about #1 and #2 then? How can I find what Googlebot actually sees on a page? (I have tried "Fetch as Googlebot" from GWT. It displays source code.) Thanks for the help! Supriya.
Intermediate & Advanced SEO | | Amjath0