When do you use 'Fetch as Google' in Google Webmaster Tools?
-
Hi,
I was wondering when and how often you use 'Fetch as Google' in Google Webmaster Tools, and whether you submit individual pages or only the main URL?
I've googled it, but that only confused me more. I'd appreciate it if you could help.
Thanks
-
I'd hazard to say that if the new product had been in the sitemap, it would also have appeared in the SERPs. We submit sitemaps every day, and products are in the index within hours.
I guess the GWMT manual submission is okay if you need to manually fix some pages, but then it begs the question: why couldn't your SEO efforts make those pages visible to bots (via link structure or sitemaps)?
-
Thanks Gerd, it's a bit clearer now. Appreciate your help.
-
Thanks Frank, appreciate your help
-
Thank you so much for your reply. I'm a bit clearer now on what to do. Appreciate your help.
-
Sida, what I meant is that I use the Google Webmaster Tools function "Fetch as Google" only as a diagnostic, to see how Googlebot's request to my website is handled.
It seems that people fetch URLs via the GWMT "Fetch as Google" and then use the function to submit them to the index. I don't think that's a good idea, as any new content should either be discoverable (via SEO) or submitted to Google automatically via a sitemap (hinted at in robots.txt).
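For reference, that robots.txt hint is just a single Sitemap line; a minimal sketch (yourdomain.com is a placeholder, not a real site):

    User-agent: *
    Disallow:

    Sitemap: http://www.yourdomain.com/sitemap.xml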
-
Thanks Gerd. Would you mind clarifying a bit more what you mean by 'diagnostic tool'? And if you can recommend one by name as well, that would be fantastic.
-
Use it as a "diagnostic tool" to check how content or error pages are retrieved by the bot. I look at it specifically from a content and HTTP-status perspective.
I would not use it to submit URLs; for that you should use a sitemap file instead. Think of "Fetch as Google" as a troubleshooting tool, not as something for submitting pages to an index.
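If it helps, here's a rough home-grown sketch of that kind of check in Python (the URL is a placeholder, and this only approximates what the tool shows; unlike Fetch as Google, it doesn't render anything):

    import requests

    # Placeholder URL; the user-agent string is Googlebot's published one
    url = "http://www.yourdomain.com/some-page"
    headers = {"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"}

    # allow_redirects=False so we see the raw HTTP status, not the final page
    response = requests.get(url, headers=headers, allow_redirects=False)
    print(response.status_code)   # 200 is good; 301/302/404/500 are worth a closer look
    print(response.text[:500])    # the start of the HTML the bot would receive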
-
Here's an oh-by-the-way.
One of our manufacturers came out with a product via a slow roll literally within the last five days. They have not announced its release to the retailers. I happened to stumble on it while visiting their site to update products.
I did a search on the term and found I wasn't the only one unaware of it, so I scrambled to add the product to the site, promote it, and submit it to the index late Tuesday.
It's Thursday and it's showing in the SERPs.
Would it have appeared that quickly if I hadn't submitted it via Fetch? I don't know for sure, but I'm inclined to think not. Call me superstitious.
Someone debunk the myth if you can. One less thing for me to do.
-
If I add a lot of products/articles, I just do a sitemap re-submit, but if I only add one product or article, I just wait until the bots crawl to that link. It usually takes a couple of days before it gets indexed. I never really used Fetch as Google unless I made changes to the structure of the website.
Hope this helps.
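Incidentally, the re-submit doesn't have to happen inside GWT; you can also ping Google with your sitemap URL directly (yourdomain.com is a placeholder):

    http://www.google.com/ping?sitemap=http://www.yourdomain.com/sitemap.xml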
-
I submit every product and category I add.
Do I have to? No. Is it necessary? No; we have an XML sitemap generator. Google is like Big Brother: he will find you. Fetch is a tool that you can use or not use.
Will Google find it faster, and will you show up more quickly in search results, if you submit it? I don't know.
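For what it's worth, each page the generator adds is just one <url> entry in the XML file; a minimal sketch (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.yourdomain.com/new-product</loc>
        <lastmod>2014-06-01</lastmod>
      </url>
    </urlset>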
-
Thank you AWC, I've read that article already, but I'm not quite sure how often this feature should be used. I think I should be more specific: if you have an ecommerce website and add a product every 2-3 days, would you submit the link every time you add a new item? When you publish a blog article on your website, would you submit it immediately?
-
I think GWT explains it very well.
https://support.google.com/webmasters/answer/158587?hl=en
I typically use it to submit new pages to the index, although it's probably not necessary if you have an XML sitemap. Not certain on that one.
More tech-savvy folks probably also use it to check the crawlability and "health" of pages.