Getting page cached
-
I am reworking some content that is deep in my site. What is the best way to get Google to find it? Some of the pages were cached about 3 weeks ago, but I don't want to wait too long for it to see the new content (and links).
-
Cool, will give it a try!
-
OKay, thanks again!
-
I haven't seen any documentation from Google supporting this, but in my experience it has been successful. I haven't run any significant tests to back it up; it has just worked for me, so I'm hoping it works for others too. That said, Fetch as Googlebot was still in Labs when I saw this working, so I'm not sure whether anything has changed since.
-
I've heard from others that this can be very effective, and I've seen good results getting pages cached quickly (within a couple of days) after using it. I've used it very sparingly, so I don't know the period over which the allotment resets. I also used this when it was in Google Labs; they may have made a few changes when they brought it out of the lab.
-
If this works, I have also learnt something new today. I always thought this just gave webmasters a chance to see pages the way Google would see them. Are you sure this actually sends the real Googlebot to crawl and cache a page? I didn't see it mentioned here:
http://www.google.com/support/webmasters/bin/answer.py?answer=158587
Or have you found that it just speeds up the process, i.e. Google has been given some form of indication of where to visit? Definitely interesting!
-
Stupid question, Joe, but is that safe to do? I just submitted 3 URLs successfully, and it says I have 47 left. Is 50 a monthly allotment? And if a page was fetched successfully, does that mean Google will be caching it soon?
This is pretty cool if it really works.
Thanks,
Mike
-
Go into Webmaster Tools > Diagnostics > Fetch as Googlebot, and enter the URL of the page you want crawled.
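Fetch as Googlebot itself is UI-only, but for anyone who wants a scriptable nudge: pinging Google with your sitemap URL is another way to signal that content has changed. A minimal sketch below, with the caveat that the helper names are my own and a ping is only a hint to Google, not a guaranteed crawl:

```python
# Sketch: build and send a Google sitemap "ping" as a recrawl hint.
# Function names here are illustrative, not part of any Google tool.
from urllib.parse import quote
import urllib.request

def google_sitemap_ping_url(sitemap_url):
    """Build the Google sitemap ping URL for the given sitemap."""
    return "http://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")

def ping_google(sitemap_url):
    """Send the ping; returns the HTTP status code."""
    with urllib.request.urlopen(google_sitemap_ping_url(sitemap_url)) as resp:
        return resp.status
```

If your sitemap's `<lastmod>` dates are kept current for the reworked pages, the ping plus a Fetch as Googlebot request covers both the site-wide and per-page signals.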
-
If your homepage is crawled regularly, link to the new pages from your homepage for a period of time. Alternatively, social signals have worked a charm for me recently: tweet about the pages, share them on Facebook, and get other people to share them too.