Is there a way to get Google to index our site quicker?
-
I have updated some pages on a website. Is there a way to get Google to index the pages quicker?
-
Fetching as Googlebot and ensuring the page is visible elsewhere on the web (marketing!) are the best ways to spur quicker crawling and indexing, as others have said. If you notice the cache date of the pages not updating and changes not making it into Google's index, it would be time to check for larger issues that might be preventing or dissuading Google from reaching the site more regularly.
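If you want to script the cache-date check mentioned above, one option is to build the cache URL for each page and open it to read the "as retrieved on ..." date. A minimal sketch; the `webcache.googleusercontent.com` pattern is the public cache-lookup URL, but automating requests against it may be rate-limited, so this only constructs the URL:

```python
from urllib.parse import quote

def google_cache_url(page_url: str) -> str:
    """Build the URL of Google's cached copy of a page.

    Open the result in a browser to see the cache date; if that date
    stops advancing, it is a sign Google is not recrawling the page.
    """
    return ("https://webcache.googleusercontent.com/search?q=cache:"
            + quote(page_url, safe=""))

print(google_cache_url("http://www.example.com/page.html"))
```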
-
Please also be aware that you can only do a fetch 10 times a month, so make them count! Only do it when you must.
-
Google Plus is a great way. There was a study done by Stone Temple Consulting:
http://www.stonetemple.com/measuring-google-plus-impact-on-search-rankings/
It concluded that there was no direct impact on rankings, but here was the interesting part: Googlebot visited a page within six minutes of it being shared on Google Plus.
All of the other points about fetching in GWT, etc. are valid as well; it was just interesting to me how quickly Googlebot reacts to Google Plus.
Cheers!
-
I would be sure to share the page on Google Plus. Since you can't otherwise control crawl frequency, make sure your site is well optimized so that Googlebot doesn't have problems crawling it: check the page speed, fix any HTML errors, and correct any missing URLs and broken links.
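To act on the "fix broken links" advice above, a quick first step is pulling every href out of a page's HTML so each one can be status-checked. A stdlib-only sketch (the actual status check is left as a comment, since looping HEAD requests against your own server should be throttled):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href values from <a> tags for later status-checking."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html: str) -> list:
    """Return every anchor href found in the given HTML string."""
    parser = LinkCollector()
    parser.feed(html)
    return parser.links

# Each collected link could then be requested (e.g. via urllib.request)
# and any 404s flagged for repair before Googlebot hits them.
sample = '<p><a href="/about">About</a> <a href="http://example.com/">Home</a></p>'
print(extract_links(sample))
```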
-
Fetch as Google works well. Alternatively, you can post the page on Twitter and it may get crawled from there, depending on how popular your account is.
-
I agree; I would definitely fetch in WMT, or you could update your content or post a blog to get them to recrawl.
-
This is a little more specific:
You can get there by going to GWT, clicking on your site, then, on the left, clicking "Crawl" and then "Fetch as Google." Enter the URL you want indexed and hit "Fetch." You can then choose between fetching just the page or the page and all the pages linked from it. That's pretty much up to you, but if you don't use the tool all that often, you might as well pick the "page and the pages linked to it" option.
Sometimes you'll get a weird error message, but that's (most likely) not your fault. I've had it happen every now and then. I just try it again a few times and it usually works. If not, just try again in a few hours.
hope this helps,
Ruben
-
Yes. You can use Fetch as Google in Webmaster Tools. It's more of a request than a demand, but it has worked for me in the past, and Google has indexed my pages faster when I used it.
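Alongside Fetch as Google, resubmitting your sitemap can also prompt a recrawl; Google has supported a ping endpoint for this (assuming your sitemap is already registered in Webmaster Tools). A small sketch that builds the ping URL:

```python
from urllib.parse import quote

def sitemap_ping_url(sitemap_url: str) -> str:
    """Build Google's sitemap-ping URL, which requests a sitemap re-fetch."""
    return "https://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")

# Fetching this URL (e.g. with urllib.request.urlopen) asks Google to
# re-read the sitemap. Like Fetch as Google, it's a request, not a demand.
print(sitemap_ping_url("http://www.example.com/sitemap.xml"))
```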
- Ruben