Is it dangerous to use "Fetch as Google" too much in Webmaster Tools?
-
I saw some people freaking out about this on some forums and thought I would ask.
Are you aware of there being any downside to using "Fetch as Google" often? Is it a bad thing to do when you create a new page or blog post, for example?
-
Hi Keri
I did, yes. I stumbled upon it and thought I'd give my two pennies' worth as an SEO!
Certainly wasn't looking for a backlink, as it would be pretty irrelevant for our industry, and I would never expect a dofollow link from a comments section anyway.
Thanks to you also for your feedback.
Cheers!
-
Welcome, LoveSavings. Just wanted to make sure you knew this post is a year old, and that all of the links in Q&A are automatically nofollowed. Thanks for the thoughtful answer!
-
Having done lots of tests on this, I would say that fetching as Google is the best way forward.
Although the steps listed above are all excellent ways of boosting the speed at which Google will index your page, none of them seems to be as effective as fetching in Webmaster Tools. You get a few hundred of these fetches a month, so you shouldn't run out unless you are publishing immense amounts of content, in which case Google is likely to be indexing your content very quickly anyway.
www.loveenergysavings.com is still relatively small, although we publish excellent, thought-leadership-style content. So, to ensure that our posts are indexed as quickly as possible (as we are competing with some massive sites), we always fetch our posts in Google Webmaster Tools. This is always quicker than tweeting, Google+, etc. We also have an XML sitemap which automatically adds our posts, though this doesn't guarantee rapid indexing.
Having messed around with all of these methods, fetching as Googlebot is always the quickest and most effective option. As danatanseo says, it's there to be utilised by SEOs, so why not take full advantage? I can't see why Google would ever look unfavourably on a site for wanting its content to be available to the public as quickly as possible.
-
I would say it is not a preferred way to alert Google when you have a new page, and it is pretty limited. What is better, and frankly more effective, is to do things like:
- Add the page to your XML sitemap (make sure the sitemap is submitted to Google)
- Add the page to your RSS feeds (make sure your RSS is submitted to Google)
- Add a link to the page on your home page or other "important" page on your site
- Tweet about your new page
- Post a status update on Facebook about your new page
- Share your new page on Google+
- Feature your new page in your email newsletter
Obviously, depending on the page you may not be able to do all of these, but normally, Google will pick up new pages in your sitemap. I find that G hits my sitemaps almost daily (your mileage may vary).
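To make the sitemap step concrete, here is a minimal Python sketch (not from Google's docs or my own setup; the sitemap path, domain, and page URL are all placeholders) of appending a new page to an existing sitemap.xml and pinging Google to re-read it:

```python
# Append a new <url> entry to an existing sitemap.xml and ask Google to re-fetch it.
# SITEMAP_PATH, NEW_PAGE, and the sitemap URL below are hypothetical placeholders.
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_PATH = "sitemap.xml"                          # local sitemap file (placeholder)
NEW_PAGE = "https://www.example.com/new-blog-post/"   # new page URL (placeholder)
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

ET.register_namespace("", NS)          # keep the default sitemap namespace on save
tree = ET.parse(SITEMAP_PATH)
root = tree.getroot()

url_el = ET.SubElement(root, f"{{{NS}}}url")   # <url><loc>...</loc></url>
loc_el = ET.SubElement(url_el, f"{{{NS}}}loc")
loc_el.text = NEW_PAGE
tree.write(SITEMAP_PATH, encoding="utf-8", xml_declaration=True)

# Ping the sitemap endpoint Google documented at the time so it re-reads the file.
ping = "https://www.google.com/ping?sitemap=" + urllib.parse.quote(
    "https://www.example.com/sitemap.xml", safe="")
urllib.request.urlopen(ping)
```

In practice a CMS or sitemap plugin does the appending for you; the point is simply that a submitted, up-to-date sitemap lets Google pick up new pages on its own without spending a fetch.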
I only use fetch if I am trying to diagnose a problem on a specific page, and even then, I may just fetch but not submit. I have only submitted when there was some major issue with a page that I could not wait for Google to update as part of its regular crawl of my site. As an example, we had a release go out with a new section, and that section was blocked by our robots.txt. I went ahead and submitted the robots.txt to encourage Google to update it sooner, so that our new section would be "live" to Google sooner, as G does not hit our robots.txt as often (a quick pre-release check for that kind of block is sketched below). Otherwise, for 99.5% of the other pages on my sites, the options above work well.
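As a side note on that robots.txt incident, a check along these lines would catch a blocked section before it ships. This is just a sketch with made-up URLs, not our actual build process; it only uses Python's standard-library robots.txt parser:

```python
# Pre-release sanity check: confirm a new section is not disallowed for Googlebot.
# The domain and section path are hypothetical placeholders.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

new_section = "https://www.example.com/new-section/"
if not rp.can_fetch("Googlebot", new_section):
    print(f"WARNING: {new_section} is disallowed for Googlebot - fix robots.txt before launch")
else:
    print("Googlebot is allowed to crawl the new section")
```

If that check fails, no amount of fetching or submitting will get the section crawled until the robots.txt rule is fixed.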
The other thing is that you get very few fetches a month, so you are still very limited in what you can do. Your sitemaps can include thousands of pages each, while Google fetch is limited, which is another reason I reserve it for my time-sensitive emergencies.
-
https://support.google.com/webmasters/answer/158587?hl=en#158587
I just double-checked, David, and it looks like the allocation may not be different for different sites. According to Google, you get 500 fetches and 10 URL + Linked pages submissions every week.
-
You are welcome David, and no this isn't a lifetime limit at all. I believe it resets at least once every 30 days, maybe more often than that. I manage four different sites, some large, some small and I've never run out of fetches yet.
-
Thanks Dana. Is it possible to get more fetches? Presumably it's not a lifetime limit, right?
-
No, I wouldn't worry about this at all. This is why Google has already allocated a finite number of "Fetches" and URL + Links submissions to your account. These numbers are based on the size of your site. Larger sites are allocated more, and smaller sites less. [Please see my revised statement below regarding Google's "Fetch" limits - it isn't based on site size] I don't think enough webmasters take advantage of Fetch as often as they should.
Hope that helps!
Dana