Is it dangerous to use "Fetch as Google" too much in Webmaster Tools?
-
I saw some people freaking out about this on some forums and thought I would ask.
Are you aware of any downside to using "Fetch as Google" often? Is it a bad thing to do when you create a new page or blog post, for example?
-
Hi Keri
I did, yes. I stumbled upon it and thought I'd give my two pennies' worth as an SEO!
I certainly wasn't looking for a backlink, as it would be pretty irrelevant for our industry, and I would never expect a dofollow link from a comments section anyway.
Thanks to you as well for your feedback.
Cheers!
-
Welcome, LoveSavings. Just wanted to make sure you knew this post is a year old, and that all of the links in Q&A are automatically nofollowed. Thanks for the thoughtful answer!
-
Having done lots of tests on this, I would say that fetching as Google is the best way forward.
Although the steps listed above are all excellent ways of boosting the speed at which Google will index your page, none of them seems to be as effective as fetching in Webmaster Tools. You get a few hundred of these a month, so you shouldn't run out unless you are publishing immense amounts of content, in which case Google is likely to be indexing your content very quickly anyway.
www.loveenergysavings.com is still relatively small, although we publish excellent, thought-leadership-style content. So, to ensure that our posts are indexed as quickly as possible (as we are competing with some massive sites), we always fetch our posts in Google Webmaster Tools. This is always quicker than tweeting, Google+, etc. We also have an XML sitemap which automatically adds our posts, though this doesn't guarantee rapid indexing.
Having messed around with all of these methods, fetching as G-bot is always the quickest and most effective option. As danatanseo says, it's there to be utilised by SEOs, so why not take full advantage? I can't see why Google would ever look unfavourably on a site for wanting its content to be available to the public as quickly as possible.
-
I would say it is not the preferred way to alert Google when you have a new page, and it is pretty limited. What is better, and frankly more effective, is to do things like:
- Add the page to your XML sitemap (make sure the sitemap is submitted to Google)
- Add the page to your RSS feeds (make sure your RSS is submitted to Google)
- Add a link to the page on your home page or another "important" page on your site
- Tweet about your new page
- Post a status update on Facebook about your new page
- Share your new page on Google+
- Feature your new page in your email newsletter
Obviously, depending on the page you may not be able to do all of these, but normally, Google will pick up new pages in your sitemap. I find that G hits my sitemaps almost daily (your mileage may vary).
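If you want to automate that first sitemap step, here is a rough sketch in Python; the domain, file path, and post URL are just placeholders (not anything from this thread), and it assumes a standard sitemaps.org-format file. It appends a <url> entry for the new page and then pings Google's sitemap endpoint so the updated file gets picked up on the next pass.

```python
# Minimal sketch: append a new post to an XML sitemap and ping Google.
# The domain, file path, and post URL below are placeholders.
import datetime
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_PATH = "sitemap.xml"                             # assumed local sitemap file
SITEMAP_URL = "https://www.example.com/sitemap.xml"      # assumed public sitemap URL
NEW_POST_URL = "https://www.example.com/blog/new-post/"  # assumed new page

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)

# Add a <url> entry for the new page with today's date as <lastmod>.
tree = ET.parse(SITEMAP_PATH)
urlset = tree.getroot()
url_el = ET.SubElement(urlset, f"{{{NS}}}url")
ET.SubElement(url_el, f"{{{NS}}}loc").text = NEW_POST_URL
ET.SubElement(url_el, f"{{{NS}}}lastmod").text = datetime.date.today().isoformat()
tree.write(SITEMAP_PATH, encoding="utf-8", xml_declaration=True)

# Let Google know the sitemap changed via the sitemap ping endpoint
# (submitting/resubmitting the sitemap in Webmaster Tools works too).
ping = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(SITEMAP_URL, safe="")
with urllib.request.urlopen(ping) as resp:
    print("Ping response:", resp.status)
```

None of this replaces submitting the sitemap in Webmaster Tools; it just keeps the file fresh so Google's regular sitemap checks find the new page.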
I only use Fetch if I am trying to diagnose a problem on a specific page, and even then, I may just fetch but not submit. I have only submitted when there was some major issue with a page that I could not wait for Google to fix as part of its regular crawl of my site. As an example, we had a release go out with a new section, and that section was blocked by our robots.txt. I went ahead and submitted the robots.txt to encourage Google to update it sooner, so that our new section would be "live" to Google sooner, as G does not hit our robots.txt as often. Otherwise, for 99.5% of the other pages on my sites, the options above work well.
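As a quick sanity check before (or instead of) fetching, you can test locally whether a URL is actually blocked by your robots.txt. Here is a minimal sketch using Python's standard urllib.robotparser, with placeholder URLs:

```python
# Minimal sketch: check whether Googlebot is blocked from a URL by robots.txt.
# The domain and path are placeholders, not from this thread.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser()
robots.set_url("https://www.example.com/robots.txt")
robots.read()

url = "https://www.example.com/new-section/"
if robots.can_fetch("Googlebot", url):
    print("Googlebot is allowed to crawl", url)
else:
    print("Blocked by robots.txt:", url)
```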
The other thing is that you get very few fetches a month, so you are still very limited in what you can do. Your sitemaps can include thousands of pages each, while Fetch is capped, which is another reason I reserve it for time-sensitive emergencies.
-
https://support.google.com/webmasters/answer/158587?hl=en#158587
I just double-checked, David, and it looks like the allocation may not be different for different sites. According to Google, you get 500 fetches and 10 URL + Linked pages submissions every week.
-
You are welcome David, and no this isn't a lifetime limit at all. I believe it resets at least once every 30 days, maybe more often than that. I manage four different sites, some large, some small and I've never run out of fetches yet.
-
Thanks Dana. Is it possible to get more fetches? Presumably it's not a lifetime limit, right?
-
No, I wouldn't worry about this at all. This is why Google has already allocated a finite number of "Fetches" and URL + Linked pages submissions to your account. These numbers are based on the size of your site. Larger sites are allocated more and smaller sites less. [Please see my revised statement above regarding Google's "Fetch" limits - it isn't based on site size.] I don't think enough webmasters take advantage of Fetch as often as they should.
Hope that helps!
Dana