The "Fetch As Google" limit has been decreased - what now?
-
Since Google decreased the "Fetch As Google" limit to ten pages per day, we've been a bit stuck. We're publishing 20-30 content pages per day, targeting a huge range of search queries. Circa 40% of our traffic comes to us through these pages.
Since we're now heavily restricted on submitting these to Google, who's got other ideas to get the pages picked up quickly? I'm slightly concerned because although the pages link outwards to other areas of the website, no other areas of the site link to these pages. They're purely top-of-the-funnel.
We can't be the only people with this concern. How would you address it?
-
Thank you. Will have a go at Google News - not sure we fit the criteria, but it can't hurt to try, eh?
-
Thank you very much. That's really helpful and I appreciate you putting so much time into your answer.
-
You may be right about Google+, but that said, in the Google News guidelines they do suggest that they may use Google+ to 'better surface content':
"Add or edit your Google+ Page URL. We may use publicly available information from your Google+ page to deliver a better news experience and to better surface your content."
https://support.google.com/news/publisher/answer/4581428?hl=en-GB
-
Hi MSG
You should not have to use Fetch as Google to get your pages indexed, and, with deference to Paul above, posting on Google+ will not speed up indexing. If it did, people would be all over it, and they are not. You can also only apply to Google News if you have a newsworthy site, and I can tell you right now that 90%+ of sites are rejected because what they post is not news.
When a page is published it should be added to your sitemap, which is picked up by Google. Also, depending on the size of your site, some of it is crawled by Google every day; just go to Google Search Console's Crawl Stats report to see that.
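To illustrate what I mean, here is a minimal sitemap entry (the URL and date are placeholders): a fresh lastmod value on a newly published page can help Google decide what to recrawl.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per published page -->
  <url>
    <loc>https://www.example.com/new-content-page/</loc>
    <!-- lastmod hints to Google that the page is new or updated -->
    <lastmod>2017-03-01</lastmod>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```

Resubmitting the sitemap in Search Console after you publish is another way to nudge Google without touching Fetch as Google.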
I have seen pages indexed in under a minute (non-news sites, yes, seriously) and others take a week. What is of major concern is that you say these pages are not linked from anywhere, because therein lies a large part of your problem. You need a proper site structure so that link juice passes from the main menu down through all the pages of your website. The more 'shallow' that structure is, the better, as the pages sit closer to the top.
If you are burying pages deep in your website with no logical route to them, then they are just not going to rank quickly.
Read more on site architecture in this post (it's old but still very relevant): https://moz.com/blog/site-architecture-for-seo
You need a solid site structure that starts with the main menu and flows down through proper departments, categories and, if necessary, sub-categories.
Sort that out and you will see your pages indexed much quicker.
If you can get followed (dofollow) backlinks then this will help, but certainly not from Facebook or G+, as links from those are all nofollow.
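To be clear about the difference, here is an illustrative snippet (placeholder URLs):

```html
<!-- A followed link: passes link equity ('link juice') by default -->
<a href="https://www.example.com/new-page/">New page</a>

<!-- A nofollow link: social networks add rel="nofollow" automatically,
     so these links do not pass equity -->
<a href="https://www.example.com/new-page/" rel="nofollow">New page</a>
```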
Regards
Nigel
-
Although it depends a little on how 'important' Google thinks your site is, Google crawls sites with regularly updated content very frequently, so I'd suspect that as long as you're linking to this content from somewhere prominent on your site, you shouldn't have an issue.
You could also use a sitemap to tell Google about fresh content. Perhaps consider applying to Google News too if you can, and then use a news sitemap, which Google will certainly be very quick to check and spider new URLs from (see the sketch below). In the advice for Google News they also suggest posting content to Google+, so I'd post new content there as well, as it's another way of telling Google you have something fresh for them to spider.
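If you are accepted into Google News, the news sitemap is a small extension of the standard sitemap format. A minimal sketch based on Google's published schema (publication name, URL and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://www.example.com/articles/fresh-story/</loc>
    <news:news>
      <news:publication>
        <news:name>Example Publication</news:name>
        <news:language>en</news:language>
      </news:publication>
      <!-- News sitemaps should only list articles from the last two days -->
      <news:publication_date>2017-03-01T09:00:00+00:00</news:publication_date>
      <news:title>Fresh Story Headline</news:title>
    </news:news>
  </url>
</urlset>
```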