Fetch as Google in GWT - Functionality
-
Hi,
For example, some of the HTML Improvements notices in GWT report duplicate meta descriptions or titles for pages that have since been 301 redirected or had a canonical tag added.
So, my idea is to force Google to re-read those pages using "Fetch as Google", hoping that it will then see the 301 redirect or the fix we have implemented.
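For context, the two kinds of fixes mentioned here usually look something like this (a minimal sketch with placeholder URLs; adapt the paths to your own site and server):

```apache
# .htaccess (Apache): permanently redirect the old URL to the new one
Redirect 301 /old-page.html https://www.example.com/new-page.html
```

```html
<!-- In the <head> of the duplicate page: point engines at the preferred URL -->
<link rel="canonical" href="https://www.example.com/new-page.html">
```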
Does this work? How long does it take? Lastly, should I just click "Fetch as Google", or should I also click the "Submit to index" button?
Thanks!
-
BJS1976
The Fetch as Google tool allows you to see what Google sees. Yes, it can help you get the page indexed more quickly, but first take a look in WMT and see how the indexing is going now. If your site is being indexed regularly, look at crawl errors: do you see a problem there? When you fix the problem, mark it as fixed and it will be removed from the list. The next time G crawls it, the problem will reappear if it still exists; either way, you will know it was crawled. This allows you to dig deeper.
NOTE: Fetch as Google will not follow a redirect.
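Since the tool stops at the first response, you can double-check the redirect yourself. A minimal stdlib-only Python sketch (the URL you pass is whatever page you fixed):

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Redirect handler that refuses to follow redirects."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # returning None makes urllib raise HTTPError instead

def redirect_status(url):
    """Return (status_code, Location header or None) for the first response only."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url)
        return resp.status, None  # no redirect: a plain 200, etc.
    except urllib.error.HTTPError as e:
        return e.code, e.headers.get("Location")
```

A result like `(301, "https://www.example.com/new-page")` confirms the redirect is live on the server, whatever the tool shows.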
If you feel you are still not getting the page reindexed, I would resubmit the sitemap.
I hope this helps a bit.
Robert
-
If you have made these changes, you will just have to wait, as Google may take a very long while to refresh its data.
As for the 301s, I would suggest you fix them and wait. I have found many pages that were redirected a long time ago still appearing in my duplicate meta titles report.
-
The HTML Improvements report can take a while to refresh. If you have already fixed the issues displayed, just sit tight.
As for Fetch as Google: if you made changes and then submit the fetched page to the index, it will increase the chances of being re-indexed faster, but it gives you no guarantee.
Related Questions
-
Not showing in Google map listing. Why?
We have a client whose law firm is the highest Google-reviewed and sits on page two or three for "St. Louis personal injury lawyer", but it does not show in the map listing. Any ideas why this would happen, or how to ensure they are viewable in the map listing?
-
Did Google update the length of characters allowed in Meta Description?
Hey all, I do SEO. I'm currently working with another SEO firm on a project. The lady mentioned to me that Google recently updated (a couple of months ago) and changed their font, causing them to lower the meta description to 55 characters. Is this true? I have not heard of this. Could she be confusing the meta description with the title tag? I didn't know Google had even updated the title tag either.
-
Why do some sites not rank in Google but do in Bing and Yahoo?
A few of my sites, e.g. Business-Training-Schools.com and Ultrasoundtechnicians.com, don't get many visits from Google, but they rank at the top in Bing and Yahoo. I have tried searching for an answer to this question, but I did not find anything convincing.
-
Why has my homepage been replaced in Google by my Facebook page?
Hi. I was wondering if others have had this happen to them. Lately, I've noticed that on a couple of my sites the homepage no longer appears in the Google SERP. Instead, a Facebook page I've created appears in the position the homepage used to get. My subpages still get listed in Google--just not the homepage. Obviously, I'd prefer that both the homepage and Facebook page appear. Any thoughts on what's going on? Thanks for your help!
-
Google indexing my website's Search Results pages. Should I block this?
After running the SEOmoz crawl test, I have a spreadsheet of 11,000 URLs, of which 6,381 are search results pages from our website that have been indexed. I know I've read that /search should be blocked from the engines, but I can't seem to find that information at this point. Does anyone have facts behind why they should be blocked? Or not blocked?
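For what it's worth, blocking internal search results is typically done with a robots.txt rule like this (assuming the results live under /search; adjust the path to match your own URLs):

```
# robots.txt at the site root
User-agent: *
Disallow: /search
```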
-
Is Google Rotating Good Matches?
I have a theory that Google may be trying to be fair to white-hat SEO sites that are doing the right things with blogging, linking, social media, etc. (i.e., sites that deserve equally good positioning) by cycling them to and from the first page, perhaps on a weekly or monthly basis. My theory is that they are purposefully doing it to give those sites more equal exposure. My case: I've had top rankings for http://thedogbitelawyer.com for almost all of the important terms for dog bite lawyers for a couple of years now. When Penguin came out we lost some ground across the board, and identified that perhaps there was too much duplicate content left over from when I inherited the site. I reworked the site wording and link structure a bit and regained positioning. Since then we have been up and down like a yo-yo on the top terms! Does anybody else have this suspicion? If it's true, I don't need to stress; if we are bouncing around for other reasons, I'd better keep stressing!
-
How do you get photo galleries indexed on Google News?
I work for a news site and some of our photo galleries get indexed by Google News while others never do. I'm trying to determine why some are more successful than others even though they all follow the same guidelines regarding keyword-rich headlines & copy, h1s, etc. When comparing what's been indexed in the past with current galleries, there doesn't appear to be an obvious pattern. Can anyone share some insight into this?
-
Google SERPS problem - "block all results from this domain - click here".
Anyone know what can be done when this happens to one of your own domains? On the Google SERP page, underneath the title, next to the description, Google has added "Block all results from this domain?". I understand that this is a new "feature" aimed at allowing users to filter out results from low-quality, pornographic, or offensive sites. But the site in question is none of the above; any ideas how to tackle this? I couldn't find anything yet by searching.