Is there a way to get Google to index our site quicker?
-
I have updated some pages on a website. Is there a way to get Google to index those pages more quickly?
-
Fetching as Googlebot and ensuring the page is visible elsewhere on the web (marketing!) are the best ways to spur quicker crawling and indexing, as others have said. If you notice the cache date of the pages not updating and changes not making it into Google's index, it would be time to check for larger issues that might be preventing or dissuading Google from reaching the site more regularly.
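Beyond Fetch as Googlebot, another lightweight nudge is pinging Google with your sitemap URL after you publish updates. A minimal sketch of building that ping URL, assuming a hypothetical sitemap at example.com (you would then issue a plain GET request to the resulting address):

```python
from urllib.parse import urlencode

def sitemap_ping_url(sitemap_url):
    """Build Google's sitemap-ping URL for a given sitemap address."""
    return "https://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

# Hypothetical sitemap location -- substitute your own.
ping = sitemap_ping_url("http://www.example.com/sitemap.xml")
print(ping)
```

This only asks Google to recrawl the sitemap; like Fetch as Google, it is a request rather than a demand.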
-
Please also be aware that you can only do a fetch 10 times a month, so make them count! Only do it when you must.
-
Google Plus is a great way. Stone Temple Consulting ran a study on this:
http://www.stonetemple.com/measuring-google-plus-impact-on-search-rankings/
It concluded that there was no direct impact on rankings, but here's the interesting part: Googlebot visited a page within six minutes of the page being shared on Google Plus.
All of the other points about fetching in GWT, etc. are valid as well; it was just interesting to me how quickly Googlebot reacts to Google Plus.
Cheers!
-
I would be sure to share the page on Google Plus. Since you can't otherwise control crawl frequency, make sure your site is well optimized so that Googlebot doesn't have problems crawling it: check the page speed, fix any HTML errors, and correct missing URLs and broken links.
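To illustrate the broken-link check, here's a minimal sketch using Python's standard-library HTML parser to pull every href off a page so the list can then be checked for dead or malformed URLs (the HTML sample is hypothetical):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect all href values from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# Hypothetical page snippet -- in practice you'd fetch the real page body.
html = '<p><a href="/about.htm">About</a> <a href="">empty link</a></p>'
parser = LinkCollector()
parser.feed(html)

# Empty hrefs are one easy class of error to flag; a fuller crawler
# would also request each URL and report non-200 responses.
suspect = [h for h in parser.links if not h]
print(parser.links)
print(suspect)
```

Running this over your own pages gives a quick inventory of links worth verifying before the next crawl.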
-
Fetch as Google works well. Alternatively, you can also post the page on Twitter and it may get crawled from there, depending on how popular your account is.
-
I agree. I would definitely fetch in WMT, or you could update your content or post a blog entry to prompt a recrawl.
-
This is a little more specific:
You can get there by going to GWT, clicking on your site, then on the left clicking "Crawl," then "Fetch as Google." Enter the URL you want indexed and hit "Fetch." You can then choose to fetch just the page, or the page and all the pages linked to it. That's pretty much up to you, but if you don't use the tool all that often, you might as well pick the "page and the pages linked to it" option.
Sometimes you'll get a weird error message, but that's most likely not your fault. I've had it happen every now and then; I just try again a few times and it usually works. If not, try again in a few hours.
hope this helps,
Ruben
-
Yes. You can use Fetch as Google in Webmaster Tools. It's more of a request than a demand, but it has worked for me in the past, and Google has indexed my pages faster when I used it.
- Ruben