Is it dangerous to use "Fetch as Google" too much in Webmaster Tools?
-
I saw some people freaking out about this on some forums and thought I would ask.
Are you aware of any downside to using "Fetch as Google" often? Is it a bad thing to do when you create a new page or blog post, for example?
-
Hi Keri,
I did, yes. I stumbled upon it and thought I'd give my two pennies' worth as an SEO!
I certainly wasn't looking for a backlink, as it would be pretty irrelevant for our industry, and I would never expect a dofollow link from a comments section anyway.
Thanks to you also for your feedback.
Cheers!
-
Welcome, LoveSavings. Just wanted to make sure you knew this post is a year old, and that all of the links in Q&A are automatically nofollowed. Thanks for the thoughtful answer!
-
Having done lots of tests on this, I would say that fetching as Google is the best way forward.
Although the steps listed in the answer below are all excellent ways of boosting the speed at which Google will index your page, none of them seems to be as effective as fetching in Webmaster Tools. You get a few hundred of these a month, so you shouldn't run out unless you are publishing immense amounts of content - in which case Google is likely to be indexing your content very quickly anyway.
www.loveenergysavings.com is still relatively small, although we publish excellent, thought-leadership-style content. So, to ensure that our posts are indexed as quickly as possible (as we are competing with some massive sites), we always fetch our posts in Google Webmaster Tools. This is always quicker than tweeting, Google+, etc. We also have an XML sitemap which automatically adds our posts, though this doesn't guarantee rapid indexing.
Having messed around with all of these methods, fetching as G-bot is always the quickest and most effective option. As danatanseo says, it's there to be utilised by SEOs, so why not take full advantage? I can't see why Google would ever look unfavourably on a site for wanting its content to be available to the public as quickly as possible.
-
I would say it is not the preferred way to alert Google when you have a new page, and it is pretty limited. What is better, and frankly more effective, is to do things like the following (a minimal sitemap sketch follows the list):
- add the page to your XML sitemap (and make sure the sitemap is submitted to Google)
- add the page to your RSS feeds (and make sure your RSS is submitted to Google)
- add a link to the page on your home page or another "important" page on your site
- tweet about your new page
- post a status update on Facebook about your new page
- share your new page on Google Plus
- feature your new page in your email newsletter
Obviously, depending on the page, you may not be able to do all of these, but normally Google will pick up new pages in your sitemap. I find that G hits my sitemaps almost daily (your mileage may vary).
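For anyone who hasn't built one, a sitemap entry is only a few lines of XML. This is a minimal sketch following the sitemaps.org protocol; the URL and dates are made up:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <lastmod> hints that the page is new or changed -->
  <url>
    <loc>http://www.example.com/blog/my-new-post</loc>
    <lastmod>2013-06-20</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>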
I only use Fetch if I am trying to diagnose a problem on a specific page, and even then I may just fetch but not submit. I have only submitted when there was some major issue with a page that could not wait for Google's regular crawl of my site. As an example, we had a release go out with a new section, and that section was blocked by our robots.txt. I went ahead and fetched and submitted the robots.txt to encourage Google to re-read it sooner, so that our new section would be "live" to Google sooner, as G does not hit our robots.txt very often. Otherwise, for 99.5% of the other pages on my sites, the options above work well.
The other thing is that you get very few fetches a month, so you are still very limited in what you can do. Your sitemaps, by contrast, can include thousands of pages each, which is another reason I reserve Fetch for my time-sensitive emergencies. You can even ping Google whenever a sitemap changes rather than waiting for its next visit; a quick sketch of that follows.
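Here is a minimal Python sketch of that sitemap ping. I'm assuming the sitemap-submission ping endpoint Google documents (google.com/ping) - treat that as an assumption and check Google's sitemap help pages - and the sitemap URL is hypothetical:

# Nudge Google to re-read a sitemap after publishing a new page.
# Assumes Google's documented sitemap ping endpoint; the sitemap URL
# below is hypothetical - substitute your own.
import urllib.parse
import urllib.request

SITEMAP_URL = "http://www.example.com/sitemap.xml"  # hypothetical

ping_url = ("http://www.google.com/ping?sitemap="
            + urllib.parse.quote(SITEMAP_URL, safe=""))
response = urllib.request.urlopen(ping_url)
# A 200 response only means the ping was received; it says nothing about
# when (or whether) the new pages will actually be crawled and indexed.
print(response.getcode())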
-
https://support.google.com/webmasters/answer/158587?hl=en#158587
I just double-checked, David, and it looks like the allocation may not be different for different sites. According to Google, you get 500 fetches and 10 "URL + linked pages" submissions every week.
-
You're welcome, David, and no, this isn't a lifetime limit at all. I believe it resets at least once every 30 days, maybe more often than that. I manage four different sites, some large, some small, and I've never run out of fetches yet.
-
Thanks, Dana. Is it possible to get more fetches? Presumably it's not a lifetime limit, right?
-
No, I wouldn't worry about this at all. This is why Google has allocated a finite number of "fetches" and "URL + linked pages" submissions to your account. These numbers are based on the size of your site: larger sites are allocated more and smaller sites less. [Please see my revised statement above regarding Google's "Fetch" limits - it isn't based on site size.] I don't think enough webmasters take advantage of Fetch as often as they should.
Hope that helps!
Dana