When do you use 'Fetch as Google' in Google Webmaster Tools?
-
Hi,
I was wondering when and how often you use 'Fetch as Google' in Google Webmaster Tools, and whether you submit individual pages or only the main URL.
I've googled it, but I only got more confused. I'd appreciate it if you could help.
Thanks
-
I'd hazard a guess that if the new product had been in the sitemap, it would also have appeared in the SERPs. We submit sitemaps every day, and products are in the index within hours.
I guess the GWMT manual submission is okay if you need to manually fix some pages, but then it raises the question of why your SEO efforts (link structure or sitemaps) couldn't make those pages visible to the bots in the first place.
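For what it's worth, a sitemap entry for a new product is only a few lines of XML - the URL and date below are made-up placeholders, not from our actual feed:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; lastmod tells the bot the page is fresh -->
  <url>
    <loc>https://www.example.com/products/new-widget</loc>
    <lastmod>2014-05-20</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```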
-
Thanks Gerd, it's a bit clearer now. Appreciate your help.
-
Thanks Frank, appreciate your help.
-
Thank you so much for your reply. I'm a bit clearer now on what to do. Appreciate your help.
-
Sida, what I meant is that I use the Google Webmaster Tools function "Fetch as Google" only as a diagnostic, to see how Googlebot handles a request to my website.
It seems that people fetch URLs via the GWMT "Fetch as Google" and then use the function to submit them to the index. I don't think that's a good idea, as any new content should either be discoverable (via SEO) or be submitted to Google automatically via a sitemap (hinted at in robots.txt).
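For reference, that robots.txt hint is a single Sitemap line - the domain below is just a placeholder:

```
# robots.txt - the Sitemap directive tells crawlers where to find your sitemap
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```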
-
Thanks Gerd. Would you mind clarifying a bit more what you mean by a 'diagnostic tool'? If you could recommend one as well, that would be fantastic.
-
Use it as a "diagnostic tool" to check how content or error pages are retrieved by the bot. I specifically look at it from a content and HTTP-status perspective.
I would not use it to submit URLs - for that you should use a sitemap file instead. Think of "Fetch as Google" as a troubleshooting tool, not as a way to submit pages to the index.
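If you want a rough local approximation of that check before going into GWMT, a sketch like the one below works - the URL is a placeholder, and of course only the real tool shows you exactly what Googlebot itself receives:

```python
# Rough "Fetch as Google"-style check: request a page with a Googlebot-like
# user agent and inspect the HTTP status code and the start of the body.
import urllib.error
import urllib.request

url = "https://www.example.com/some-product-page"  # placeholder URL
req = urllib.request.Request(
    url,
    headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"},
)

try:
    with urllib.request.urlopen(req) as resp:
        print("HTTP status:", resp.status)               # e.g. 200 or 301
        print(resp.read(500).decode("utf-8", errors="replace"))
except urllib.error.HTTPError as err:
    # 4xx/5xx responses land here; the status code is still the useful part
    print("HTTP status:", err.code)
```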
-
Here's an oh-by-the-way.
One of our manufacturers came out with a product via a slow roll-out literally within the last 5 days. They have not announced the release to the retailers. I happened to stumble on it while visiting their site to update products.
I did a search for the term and found I wasn't the only one unaware of it, so I scrambled to add the product to the site, promote it, and submit it to the index late Tuesday.
It's Thursday and it's showing in the SERPs.
Would it have appeared that quickly if I hadn't submitted it via Fetch? I don't know for sure, but I'm inclined to think not. Call me superstitious.
Someone debunk the myth if you can. One less thing for me to do.
-
If I add a lot of products/articles I just re-submit the sitemap, but if I only add one product or article I just wait until the bots crawl to that link. It usually takes a couple of days before it gets indexed. I never really use Fetch as Google unless I've made changes to the structure of the website.
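If you re-submit sitemaps often, you can also ping Google directly instead of going through the GWMT interface each time - the sitemap URL below is just a placeholder:

```python
# Notify Google that the sitemap has changed, rather than waiting for the
# next scheduled crawl. A 200 response only means the ping was received,
# not that the new pages are already indexed.
import urllib.parse
import urllib.request

sitemap_url = "https://www.example.com/sitemap.xml"  # placeholder sitemap URL
ping_url = "http://www.google.com/ping?sitemap=" + urllib.parse.quote_plus(sitemap_url)

with urllib.request.urlopen(ping_url) as resp:
    print("Ping response:", resp.status)
```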
Hope this helps.
-
I submit every product and category I add.
Do I have to? No. Is it necessary? No - we have an XML sitemap generator. Google is like Big Brother - he will find you. Fetch is a tool that you can use or not use.
Will Google find it faster and will you show up more quickly in search results if you submit it? I don't know.
-
Thank you AWC. I've read that article already, but I'm still not sure how often this feature should be used. I think I should be more specific: if you have an e-commerce website and add a product every 2-3 days, would you submit the link every time you add a new item? When you publish a blog article on your website, would you submit it immediately?
-
I think GWT explains it very well.
https://support.google.com/webmasters/answer/158587?hl=en
I typically use it to submit new pages to the index, although it's probably not necessary if you have an XML sitemap. Not certain on that one.
More tech-savvy folks probably also use it to check the crawlability and "health" of pages.