Pages not Indexed after a successful Google Fetch
-
I am trying to understand why Google isn't indexing key content on my site.
www.BeyondTransition.com is indexed and new pages show up in a couple of hours.
My key content is 6 pages of information for each of 3000 events (driven by MySQL on a WordPress platform).
These pages are reached via a search page, but there is no direct navigation to them from the home page.
When I link to an event page from an indexed page it doesn't show up in search results.
When I use Fetch in Webmaster Tools the fetch is successful, but the page is then not indexed - or, if it does appear in results, it's directed to the internal search page.
For example, http://www.beyondtransition.com/site/races/course/race110003/ has been fetched and submitted with links, but when I search for BeyondTransition Ironman Cozumel I get these results...
So what have I done wrong, and how do I go about fixing it? All thoughts and advice appreciated.
Thanks
Denis
-
Thanks Nick. I'll work through all of those points.
-
Not sure if it was a connection issue on my end or what, but that page takes a very long time to load, which could explain the lack of indexing of the pages linked from it.
Also, Google states that pages submitted with the Fetch as Googlebot tool are not guaranteed to be indexed, so there may be quite a delay on that. Are all pages included in your XML sitemap? An XML sitemap is the preferred way to notify Google of pages it may not normally find. Here is a link to more about XML sitemaps: https://www.google.com/support/webmasters/bin/answer.py?answer=156184&hl=en
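For reference, a minimal sitemap is just a list of <url> entries, something like this (the race URL below is the one from your question; the lastmod date is a placeholder):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per event page; lastmod is optional but helps Google prioritise re-crawls -->
  <url>
    <loc>http://www.beyondtransition.com/site/races/course/race110003/</loc>
    <lastmod>2012-01-15</lastmod>
  </url>
  <!-- ...repeat for the other event pages... -->
</urlset>
Most WordPress sitemap plugins can generate and update a file like this automatically.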
Even with an XML sitemap, Google may not immediately crawl many pages. Actually, indexing is rarely immediate. The frequency of crawling and speed of indexing have to do with many of the same factors as your ranking - quality, number of inbound links and PageRank, site performance, etc. If all your pages load quickly and you are in pretty good shape as far as links go, you could also try something to draw Google's attention to the new pages - like tweeting a link or posting to Google+. That seems to "force" faster indexing in some cases.
I just checked your site with webpagetest.org and it is showing a load time of about 14 seconds. Tools.pingdom.com seemed to get hung up on some of the JavaScript files and couldn't complete its test. Doing what you can to speed up the site and address any other "quality" issues will help with indexing, and with your performance in search engine results in general.
-
I'm not sure - I created this page yesterday as a map of all the races and added it to the bottom of the home page as 'site map'. I then submitted 'site map' for indexing using Fetch in Webmaster Tools and used the submit-links option. This morning it's been indexed, but after a quick sample none of the links from it have been indexed (or appear in Google search results).
This suggests it's something wrong with my page or page design - but what?
So a widget will help, but only once I've figured out the underlying problem.
-
Since it may not be practical to have every event linked through navigation, perhaps a widget that shows the last ten or so events would be good enough.
-
Hi Nick,
Thanks for the answer. I've got a WordPress plugin, but I don't think it captures everything, so I'm in the process of manually generating an XML sitemap - I think you have your finger on the answer to why the pages aren't crawled.
Navigation is on the list of things to do - it's a matter of working out the relative urgency.
I like the RSS idea - time for some research on how to do it.
-
You should use an XML sitemap to keep Google up to date with new pages; I could not find one for your site. Otherwise, if the event pages can only be found by using the search feature on your site, they will probably not be crawled and indexed. Fetch as Googlebot may work, but it probably will not be as fast as using a sitemap.xml file.
Would it be possible to have the event pages available through some kind of navigation in addition to being found by your site's search?
You might also consider setting up an RSS feed of the events and submitting it to FeedBurner and other RSS sites. That may be a little complicated, but would also help speed up indexing.
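If you go that route, a bare-bones RSS 2.0 feed for the events could look something like this (the item below is only a placeholder built from the race page mentioned in the question):
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>BeyondTransition Races</title>
    <link>http://www.beyondtransition.com/</link>
    <description>Newly added race pages</description>
    <!-- add one <item> per new or updated event page -->
    <item>
      <title>Ironman Cozumel</title>
      <link>http://www.beyondtransition.com/site/races/course/race110003/</link>
      <description>Course information for Ironman Cozumel.</description>
    </item>
  </channel>
</rss>
Submitting a feed like this to FeedBurner or similar services gives Google another path to discover new event pages shortly after they are published.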
Related Questions
-
Google not Indexing images on CDN.
My URL is: https://bit.ly/2hWAApQ We have set up a CDN on our own domain: https://bit.ly/2KspW3C We have a main XML sitemap: https://bit.ly/2rd2jEb and https://bit.ly/2JMu7GB is one of the sub-sitemaps with images listed within. The image sitemap uses the CDN URLs. We verified the CDN subdomain in GWT. The robots.txt does not restrict any of the photos: https://bit.ly/2FAWJjk. Yet, GWT still reports that none of our images on the CDN are indexed. I've followed all the steps and still none of the images are being indexed. My problem seems similar to this ticket https://bit.ly/2FzUnBl, but different because we don't have a separate image sitemap but instead have listed image URLs within the sitemaps themselves (a sketch of what such an entry typically looks like is below). Can anyone help please? I will promptly respond to any queries. Thanks
Technical SEO | TNZ | Deepinder
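For comparison, image URLs listed inside a regular sitemap normally use Google's image sitemap extension namespace, roughly like this (the domain and filenames are placeholders, since the real URLs in the question are behind link shorteners):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- the page URL stays on the main domain -->
    <loc>https://www.example.com/some-page/</loc>
    <!-- the image URL can point at the CDN subdomain, as described above -->
    <image:image>
      <image:loc>https://cdn.example.com/images/photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
-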
Single page website vs Google
Hi, I was wondering about this issue: there is a website for a guesthouse. It has all its information on one page (it is a valid page, with legitimate content). How does Google treat such pages? Would it treat it as a doorway page, or apply some other penalty? What about bounce rate? It will be pretty high, as there is no option to go anywhere else. What is your opinion on single-page websites - SEO-wise? Is it a shot in the foot? Thanks!
Technical SEO | LeszekNowakowski
-
Fetch as Google - stylesheets and js files are temporarily unreachable
Fetch as Google often says that some of my stylesheets and JS files are temporarily unreachable. Is that a problem for SEO? These stylesheets and scripts aren't blocked, and Search Console shows that a normal user would see the page just fine.
Technical SEO | WebGain
-
Why is Google not indexing my site?
I'm a bit confused as to why my site just isn't indexing on Google. Even if I type in my brand name, my social channels rank and there's no evidence of my website. I've followed all of the advice I've read, gone into Webmaster Tools and installed the WordPress Yoast plugin, but nothing seems to be making a difference. One thing I've noticed: in Google Webmaster Tools it says "Couldn't communicate with the DNS server." in site errors. I've called GoDaddy and they said that everything is fine. A bit frustrating. Trying to work out what my next steps should be but feeling a bit lost to be honest! Any help GREATLY appreciated!
Technical SEO | j1066s
-
Should I put meta descriptions on pages that are not indexed?
I have multiple pages that I do not want to be indexed (and they are currently not indexed, so that's great). They don't have meta descriptions on them, and I'm wondering if it's worth my time to go in and insert them, since they should hypothetically never be shown. Does anyone have any experience with this? Thanks! The reason this is a question is that one member of our team was linking to one of these pages through Facebook to send people to it and noticed random text on the page being pulled in as the description (a rough sketch of the setup is below).
Technical SEO | Viewpoints
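As an illustration, the setup described above boils down to a page head that carries a noindex directive and, optionally, a description - a minimal sketch (the title and description text are placeholders):
<head>
  <title>Example page kept out of the index</title>
  <!-- keeps the page out of Google's index -->
  <meta name="robots" content="noindex">
  <!-- not shown in search results for a noindexed page, but other services
       (e.g. Facebook link previews) may fall back to it instead of pulling
       arbitrary text from the page -->
  <meta name="description" content="Placeholder summary of this page.">
</head>
-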
Page Indexing increase when I request Google Site Link demote
Hi there, has anyone seen a page crawling increase in Google Webmaster Tools when they have requested a site link demotion? I did this around the 23rd of March; the next day page crawling started to rise and rise, showing a very visible spike in activity, and to this day it is still relatively high. From memory, I asked about this in SEOmoz Q&A a couple of years ago and was told that page crawl activity is a good thing - OK, fine, no argument.
However, at nearly the same time I noticed that my primary keyword rank for my home page dropped away to somewhere on the 4th page of Google US and has stayed there since March. The exact same query in Google UK (using the SEOmoz Rank Checker) has remained in the same position (around 11th) - it has barely moved. I decided to request an undemote in GWT for this page link and the page crawl started to drop, but not to the level before March 23rd. However, the rank situation for this keyword term has not changed and the content on our website has not changed, but something has come adrift with our US ranks. Using Open Site Explorer, not one competitor listed has a higher domain authority than our site - page authority, domain links, you name it - but there they sit on the first page.
Sorry, the above is a little bit of frustration. This question is not impulsive - I have sat for weeks analyzing causes and effects but cannot see why this disparity is happening between the two countries' ranks when it has never happened for this length of time before. Ironically, we are still number one in the United States for a keyword phrase which I moved away from over a month ago and do not refer to at all on our index page!! Bizarre.
Granted, site link demotion may have no correlation to the KW ranking impact, but looking at the activities carried out on the site and the timing of the page crawling, this is the only sizable factor I can identify that could be the cause. Oh, and the SEOmoz 'On-Page Optimization Tool' reports that the home page gets an 'A' for this KW term. I have, however, this week commented out the canonical tag in the index page header for the moment to see if this has any effect. Why? Because this was another (if minor) change I employed to get the site to an 'A' grade with the tool.
Any ideas or help appreciated as to what could be causing the rank differences. One final note: the North American ranks were initially high, circa 11th-12th, but then dropped away to the 4th page - the UK rankings, though, witnessed no impact. Sorry, one final thing: the rank in the US is my statistical outlier; using Google Analytics I have an average rank position of about 3 across all countries where our company appears for this term. Include the US and it pushes the average to 8th/9th. Thanks, David
Technical SEO | David-E-Carey
-
Why did google pick this page to rank over another one?
I recently started working here and I have noticed that Google is ranking some pages over others for the main keyword. Example: we are ranking on page one for "ATV tires" with this URL: http://www.rockymountainatvmc.com/t/43/81/165/723/ATV-Tires-All. I thought Google would pick http://www.rockymountainatvmc.com/c/43/81/165/ATV-Tires since it is higher up in the folder structure. I have a couple of ideas about why they are picking the other one - mostly link signals from one other site and a footer link. Any other thoughts? If we want Google to rank the second URL instead, what would you suggest?
Technical SEO | DoRM
-
De-indexing millions of pages - would this work?
Hi all, we run an e-commerce site with a catalogue of around 5 million products. Unfortunately, we have let Googlebot crawl and index tens of millions of search URLs, the majority of which are very thin on content or duplicates of other URLs. In short: we are in deep. Our bloated Google index is keeping our real content from ranking; Googlebot does not bother crawling our real content (product pages specifically) and hammers the life out of our servers. Since having Googlebot crawl and de-index tens of millions of old URLs would probably take years (?), my plan is this:
1. 301 redirect all old SERP URLs to a new SERP URL.
2. If the new URL should not be indexed, add a meta robots noindex tag on the new URL.
3. When it is evident that Google has indexed most "high quality" new URLs, disallow crawling of the old SERP URLs in robots.txt (a rough sketch of this step follows below).
4. Then remove all old SERP URLs directory-style in the GWT URL Removal Tool.
This would be an example of an old URL:
www.site.com/cgi-bin/weirdapplicationname.cgi?word=bmw&what=1.2&how=2
This would be an example of a new URL:
www.site.com/search?q=bmw&category=cars&color=blue
I have two specific questions:
1. Would Google both de-index the old URL and not index the new URL after 301 redirecting the old URL to the new URL (which is noindexed), as described in point 2 above?
2. What risks are associated with removing tens of millions of URLs directory-style in the GWT URL Removal Tool? I have done this before, but then I removed "only" some useless 50,000 "add to cart" URLs. Google says themselves that you should not remove duplicate/thin content this way and that using the tool this way "may cause problems for your site".
And yes, these tens of millions of SERP URLs are the result of a faceted navigation/search function let loose for far too long. And no, we cannot wait for Googlebot to crawl all these millions of URLs in order to discover the 301s - by then we would be out of business.
Best regards,
TalkInThePark
Technical SEO | TalkInThePark
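To make step 3 of the plan above concrete, a rough sketch of the robots.txt rule, with the pattern taken from the example old URL in the question (it only makes sense once Google has processed the 301s, since a blocked URL can no longer pass on its redirect or noindex signal):
User-agent: *
# block the old CGI-driven search URLs once the redirects/noindex have been seen
Disallow: /cgi-bin/weirdapplicationname.cgi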