When do you use 'Fetch as Google' in Google Webmaster Tools?
-
Hi,
I was wondering when and how often you use 'Fetch as Google' in Google Webmaster Tools, and whether you submit individual pages or only the main URL?
I've googled it but only got more confused. I'd appreciate it if you could help.
Thanks
-
I'd hazard to say that if the new product had been in the sitemap, it would also have appeared in the SERPs. We submit sitemaps every day and products are in the index within hours.
I guess the GWMT manual submission is okay if you need to manually fix some pages, but then it begs the question: how did your SEO efforts not make those pages visible to bots (via link structure or sitemaps)?
-
Thanks Gerd, it's a bit more clear now. Appreciate your help.
-
Thanks Frank, appreciate your help
-
Thank you so much for your reply. It's a bit clearer now what to do. Appreciate your help.
-
Sida, what I meant is that I use the Google Webmaster Tools function "Fetch as Google" only as a diagnostic function, to see how Googlebot receives a request from my website.
It seems that people fetch URLs via the GWMT "Fetch as Google" and then use the function to submit them to the index. I don't think that's a good idea, as any new content should either be discoverable (via SEO) or should be submitted to Google automatically via a sitemap (hinted at in robots.txt).
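For anyone unfamiliar with the "hinted in robots.txt" part: the sitemaps protocol lets you point crawlers at your sitemap with a `Sitemap:` line in robots.txt. A minimal sketch (the domain is just a placeholder):

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```

With that line in place, Googlebot discovers the sitemap on its own, without you submitting anything manually.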
-
Thanks Gerd. Would you mind clarifying a bit more what you mean by 'diagnostic tool'? And if you could recommend one as well, that would be fantastic.
-
Use it as a "diagnostic tool" to check how content or error pages are retrieved via the bot. I specifically look at it from a content and HTTP-status perspective.
I would not use it to submit URLs - for that you should rather use a sitemap file. Think of "Fetch as Google" as a troubleshooting tool and not something to submit pages to an index.
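To make the "content and HTTP-status" idea concrete, here is a rough local approximation of that diagnostic check: request a URL with a Googlebot user agent and report the status code and content type. This is not what GWMT does internally, just a sketch, and the URL you pass in would be your own.

```python
# Sketch: approximate the "content and HTTP-status" diagnostic locally by
# requesting a page the way a bot would and bucketing the response code.
import urllib.request

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def classify_status(code):
    """Bucket an HTTP status code the way you'd read it in a crawl report."""
    if 200 <= code < 300:
        return "OK"
    if 300 <= code < 400:
        return "redirect"
    if code == 404:
        return "not found"
    return "error"

def fetch_as_bot(url):
    """Fetch a URL with a Googlebot user agent; return (status, content type)."""
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    with urllib.request.urlopen(req) as resp:
        return resp.status, resp.headers.get("Content-Type")
```

Running `fetch_as_bot` against a page that redirects or 404s for bots (but not for browsers) is exactly the kind of mismatch this sort of diagnostic surfaces.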
-
Here's an oh-by-the-way.
One of our manufacturers came out with a product via a slow roll literally within the last 5 days. They have not announced the release to retailers. I happened to stumble on it while visiting their site to update products.
I did a search of the term and found I wasn't the only one unaware of it, so I scrambled to add the product to the site, promote it, and submit it to the index late Tuesday.
It's Thursday and it's showing in SERPs.
Would it have appeared that quickly if I hadn't submitted it via fetch? I don't know for sure, but I'm inclined to think not. Call me superstitious.
Someone debunk the myth if you can. One less thing for me to do.
-
If I add a lot of products/articles I just do a sitemap re-submit, but if I only add one product or article I just wait until the bots crawl to those links. It usually takes a couple of days before it gets indexed. I never really used Fetch as Google unless I made changes to the structure of the website.
Hope this helps.
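On the "sitemap re-submit" step: at the time of this thread, Google accepted a simple HTTP "ping" carrying your sitemap URL as a query parameter, which is what most re-submit scripts did under the hood. A minimal sketch of building that ping URL (the example.com sitemap is hypothetical):

```python
# Sketch: build the sitemap "ping" URL that tells Google to re-fetch a sitemap.
from urllib.parse import urlencode

def sitemap_ping_url(sitemap_url):
    """Return the Google ping endpoint URL for the given sitemap."""
    return "http://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

print(sitemap_ping_url("http://www.example.com/sitemap.xml"))
```

A plain GET to that URL (from a script, cron job, or your CMS) is the whole re-submit; there is nothing to log into.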
-
I submit every product and category I add.
Do I have to? No. Is it necessary? No - we have an xml sitemap generator. Google is like Big Brother - he will find you. Fetch is a tool that you can use or not use.
Will Google find it faster and will you show up more quickly in search results if you submit it? I don't know.
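For context, the XML sitemap an auto-generator produces is just a list of `<url>` entries like the one below; the URL and date here are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/products/new-widget</loc>
    <lastmod>2014-05-01</lastmod>
  </url>
</urlset>
```

Each new product or category page the generator picks up becomes another `<url>` block, which is why manual submission becomes redundant.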
-
Thank you AWC, I've read that article already, but I'm not quite sure how often this feature should be used. I should be more specific: if you have an ecommerce website and add a product every 2-3 days, would you submit the link every time you add a new item? When you publish a blog article on your website, would you submit it immediately?
-
I think GWT explains it very well.
https://support.google.com/webmasters/answer/158587?hl=en
I typically use it to submit new pages to the index, although it's probably not necessary if you have an XML sitemap. Not certain on that one.
More tech savvy folks probably use it to also check the crawlability and "health" of pages.