The "Fetch As Google" limit has been decreased - what now?
-
Since Google decreased the "Fetch As Google" limit to ten pages per day, we've been a bit stuck. We're publishing 20-30 content pages per day, targeting a huge range of search queries. Circa 40% of our traffic comes to us through these pages.
Since we're now heavily restricted on submitting these to Google, who's got other ideas to get the pages picked up quickly? I'm slightly concerned because although the pages link outwards to other areas of the website, no other areas of the site link to these pages. They're purely top-of-the-funnel.
We can't be the only people with this concern. How would you address it?
-
Thank you. Will have a go at Google News - not sure we fit the criteria, but it can't hurt to try, eh?
-
Thank you very much. That's really helpful and I appreciate you putting so much time into your answer.
-
You may be right about Google+, but that said, the Google News guidelines do suggest that they may use Google+ to 'better surface content':
"Add or edit your Google+ Page URL. We may use publicly available information from your Google+ page to deliver a better news experience and to better surface your content. "
https://support.google.com/news/publisher/answer/4581428?hl=en-GB
-
Hi MSG
You should not have to fetch as Google to get your pages indexed, and, with all due deference to Paul above, posting on Google+ will not speed up indexing. If it did, people would be all over it, and they are not. You can also only apply to Google News if you have a newsworthy site, and I can tell you right now that 90%+ of sites are rejected because what they post is not news.
The fact that a page is published puts it in the sitemap, which is picked up by Google. Also, depending on the size of your site, some of it is crawled by Google every day - just go to Google Search Console > Crawl Stats to see that.
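To make the sitemap point concrete: each newly published page should appear as a `<url>` entry in your XML sitemap, with `<lastmod>` signalling freshness to crawlers. A minimal sketch (the URL, date, and values are illustrative, not from your site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <lastmod> tells crawlers the page is fresh -->
  <url>
    <loc>https://www.example.com/guides/new-top-of-funnel-page/</loc>
    <lastmod>2016-02-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

If your CMS regenerates this file on every publish and the sitemap is submitted in Search Console (or referenced in robots.txt), Google can discover new pages without any manual fetching.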
I have seen pages indexed in under a minute (non-news sites - yes, seriously) and others take a week. What is of major concern is that you say these pages are not linked from anywhere, because therein lies a large part of your problem. You need a proper site structure so that link juice passes from the main menu down through all the pages of your website. The more 'shallow' the structure, the better, as the pages are closer to the top.
If you are burying pages deep in your website with no logical route to them, then they are just not going to rank quickly.
Read more on site architecture here: https://moz.com/blog/site-architecture-for-seo It's old but still very relevant.
You need a solid site structure which starts with the main menu with proper departments, categories and if necessary, sub-categories.
Sort that out and you will see your pages indexed much quicker.
If you can get do-follow backlinks, this will help - but certainly not from Facebook or G+, as links there are all nofollow.
Regards
Nigel
-
Although I think it depends a little on how 'important' Google thinks your site is, Google crawls sites with regularly updated content very frequently, so as long as you're linking to this content from somewhere prominent on your site, you shouldn't have an issue.
You could also use a sitemap to tell Google about fresh content. Perhaps also consider applying to Google News if you can, and then use a news sitemap, which Google will certainly be very quick to check and spider new URLs from. The Google News guidelines also suggest posting content to Google+, so I'd use that to post new content too, as it's another way of telling Google you have something fresh for them to spider.
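For reference, a news sitemap is a regular XML sitemap with an extra `news:` extension block per URL. A hypothetical entry might look like this (publication name, URL, and dates are placeholders - check the Google News sitemap documentation for the current required tags):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://www.example.com/news/article-slug/</loc>
    <!-- The news extension carries publication details and the publish timestamp -->
    <news:news>
      <news:publication>
        <news:name>Example Publication</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2016-02-01T09:00:00+00:00</news:publication_date>
      <news:title>Example Article Title</news:title>
    </news:news>
  </url>
</urlset>
```

News sitemaps are intended only for articles published in roughly the last couple of days, so they tend to be crawled far more aggressively than a standard sitemap.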