Is there a way to get Google to index our site quicker?
-
I have updated some pages on a website. Is there a way to get Google to index the pages quicker?
-
Fetching as Googlebot and ensuring the page is visible elsewhere on the web (marketing!) are the best ways to spur quicker crawling and indexing, as others have said. If you notice the cache date of the pages not updating and changes not making it into Google's index, it would be time to check for larger issues that might be preventing or dissuading Google from reaching the site more regularly.
-
Please also be aware that you can only do a fetch 10 times a month, so make them count! Only do it when you must.
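(Since fetches are limited, another nudge that doesn't use them up is pinging Google with your updated sitemap. A minimal sketch of building that ping URL; the endpoint is the historical sitemap-ping address and the example sitemap URL is made up for illustration, so no guarantee it triggers a crawl:)

```python
# Sketch: build the (historical) Google sitemap-ping URL.
# The example sitemap URL below is an assumption for illustration.
from urllib.parse import urlencode

def build_sitemap_ping_url(sitemap_url):
    """Return a ping URL asking Google to re-read the given sitemap."""
    return "https://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

print(build_sitemap_ping_url("https://www.example.com/sitemap.xml"))
```

You'd then just issue a GET request to that URL after updating the sitemap.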
-
Google Plus is a great way. There was a study done by Stone Temple Consulting
http://www.stonetemple.com/measuring-google-plus-impact-on-search-rankings/
It concluded that there was no direct impact on ranking, but here was the interesting part: Googlebot visited a page within 6 minutes of it being shared on Google Plus.
All of the other points on fetching in GWT, etc. are valid as well; it was just interesting to me how quickly Googlebot reacts to Google Plus.
Cheers!
-
I would be sure to share the page on Google Plus. Since you can't otherwise control crawl frequency, make sure your site is well-optimized so that Googlebot doesn't have problems crawling it: check the page speed, fix any HTML errors, and correct any missing URLs and broken links.
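(A quick way to start a broken-link check is to pull every href off a page and then test each one. A minimal sketch using Python's built-in HTML parser; the sample HTML snippet is made up for illustration:)

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Made-up sample page for illustration
html = '<p><a href="/about">About</a> <a href="http://example.com/dead">Dead?</a></p>'
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # → ['/about', 'http://example.com/dead']
```

Each collected URL would then be fetched and checked for a 200 status; anything returning 404 is a broken link to fix.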
-
Fetch as Google works well. Alternatively, you can also post the page on Twitter and it will get crawled from there, depending on how popular your account is.
-
I agree; I would definitely fetch in WMT, or you could update your content or post a blog to get them to recrawl.
-
This is a little more specific:
You can get there by going to GWT, clicking on your site, then, on the left, clicking on "Crawl" and then "Fetch as Google." Enter the URL you want indexed and hit "Fetch." You can then pick between fetching just the page or the page and all the pages linked to it. That's pretty much up to you, but if you don't use the tool all that often, you might as well pick the "page and the pages linked to it" option.
Sometimes you'll get a weird error message, but that's most likely not your fault. I've had it happen every now and then. I just try again a few times, and it usually works. If not, just try again in a few hours.
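(If you're scripting against a flaky endpoint, the "just try it again" approach can be automated with a simple retry-with-backoff helper. A generic sketch; the `RuntimeError` and the timings are placeholders, not anything specific to GWT:)

```python
import time

def retry(action, attempts=3, delay=1.0):
    """Call action(); on failure, wait and retry, doubling the wait each time."""
    for i in range(attempts):
        try:
            return action()
        except RuntimeError:
            if i == attempts - 1:
                raise  # out of attempts; give up
            time.sleep(delay)
            delay *= 2
```

Calling `retry(fetch_page)` would then absorb the occasional transient error instead of failing on the first one.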
hope this helps,
Ruben
-
Yes. You can use Fetch as Google in Webmaster Tools. It's more of a request than a demand. However, it has worked for me in the past, and Google has indexed my pages faster when I used it.
- Ruben