Pages not Indexed after a successful Google Fetch
-
I am trying to understand why Google isn't indexing key content on my site.
www.BeyondTransition.com is indexed and new pages show up in a couple of hours.
My key content is six pages of information for each of 3,000 events (driven by MySQL on a WordPress platform).
These pages are reached via a search page, with no direct navigation from the home page.
When I link to an event page from an indexed page it doesn't show up in search results.
When I use Fetch as Googlebot in Webmaster Tools, the fetch is successful but the page is then not indexed - or, if it does appear in results, the result points to the internal search page.
e.g. http://www.beyondtransition.com/site/races/course/race110003/ has been fetched and submitted with links but when I search for BeyondTransition Ironman Cozumel I get these results....
So what have I done wrong and how do I go about fixing it? All thoughts and advice appreciated
Thanks
Denis
-
Thanks, Nick. I'll work through all of those points.
-
Not sure if it was a connection issue on my end or what, but that page takes a very long time to load, which could explain the lack of indexing of the pages linked from it.
Also, Google states that pages submitted with the Fetch as Googlebot tool are not guaranteed to be indexed, so there may be quite a delay on that. Are all pages included in your XML sitemap? An XML sitemap is the preferred way to notify Google of pages it may not normally find. Here is a link to more about XML sitemaps: https://www.google.com/support/webmasters/bin/answer.py?answer=156184&hl=en
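For reference, a minimal sitemap.xml entry looks like this (the lastmod date and frequency below are placeholders - fill them from your event data):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.beyondtransition.com/site/races/course/race110003/</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

Only `<loc>` is required; one `<url>` block per event page, up to 50,000 URLs per sitemap file.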
Even with an XML sitemap, Google may not immediately crawl many pages - indexing is rarely immediate. The frequency of crawling and speed of indexing depend on many of the same factors as your ranking: quality and number of inbound links, PageRank, site performance, etc. If all your pages load quickly and you are in pretty good shape as far as links go, you could also try something to draw Google's attention to the new pages - like tweeting a link or posting to Google+. That seems to "force" faster indexing in some cases.
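If you want a quick command-line spot check of a page's load time before reaching for the online tools, something like this works (a sketch, assuming curl is installed):

```shell
# Rough load-time check for a single page (sketch; assumes curl is available).
# %{time_total} reports the full transfer time in seconds for the HTML document
# only - it won't include images, scripts, or other page assets.
curl -o /dev/null -s -w 'HTTP %{http_code}  total %{time_total}s\n' \
  http://www.beyondtransition.com/
```

This only measures the raw HTML response, so a full-page tool like webpagetest.org will always report a larger number.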
I just checked your site with webpagetest.org and it is showing a load time of about 14 seconds. Tools.pingdom.com seemed to get hung up on some of the JavaScript and couldn't complete its test. Doing what you can to speed up the site and address any other "quality" issues will help with indexing, and with your performance in search engine results in general.
-
I'm not sure - I created this page yesterday as a map of all the races and added it to the bottom of the home page as 'site map'. I then submitted 'site map' to the index using Fetch in Webmaster Tools with the submit-links option. This morning it has been indexed, but after a quick sample none of the links from it have been indexed (or appear in Google search results).
This suggests something is wrong with my page or page design - but what?
So a widget will help, but only once I've figured out the underlying problem.
-
Since it may not be practical to have every event linked through navigation, a widget that shows, say, the last ten events might be good enough.
-
Hi Nick,
Thanks for the answer. I've got a WordPress plugin, but I don't think it captures everything, so I'm in the process of manually generating an XML sitemap. I think you have your finger on the reason why pages aren't crawled.
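The manual generation can be sketched in a few lines of Python once the event URLs are pulled from the database (the URL pattern and race IDs below are illustrative, not Denis's actual data):

```python
from xml.sax.saxutils import escape


def build_sitemap(urls):
    """Render a minimal sitemap.xml document from a list of absolute URLs."""
    entries = "\n".join(
        "  <url><loc>{}</loc></url>".format(escape(u)) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries + "\n"
        "</urlset>\n"
    )


# Hypothetical race IDs - in practice these would come from the MySQL table
urls = [
    "http://www.beyondtransition.com/site/races/course/race%d/" % race_id
    for race_id in (110001, 110002, 110003)
]
print(build_sitemap(urls))
```

Write the output to sitemap.xml at the site root and submit it in Webmaster Tools; rerunning the script when events are added keeps the file current without hand-editing 3,000 entries.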
Navigation is on the list of things to do - it's a matter of working out the relative urgency.
I like the RSS idea - time for some research on how to do it.
-
You should use an XML sitemap to keep Google up to date with new pages; I could not find one for your site. If the event pages can only be found by using the search feature on your site, those pages will probably not be crawled and indexed. Fetch as Googlebot may work, but it probably will not be as fast as using a sitemap.xml file.
Would it be possible to have the event pages available through some kind of navigation in addition to being found by your site's search?
You might also consider setting up an RSS feed of the events and submitting it to FeedBurner and other RSS sites. That may be a little complicated, but it would also help speed up indexing.
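A bare-bones RSS 2.0 feed for the events might look like this (the item title and date are hypothetical examples, not actual site data):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>BeyondTransition Race Listings</title>
    <link>http://www.beyondtransition.com/</link>
    <description>Newly added event pages</description>
    <item>
      <title>Ironman Cozumel</title>
      <link>http://www.beyondtransition.com/site/races/course/race110003/</link>
      <pubDate>Sun, 15 Jan 2012 00:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```

Appending an `<item>` whenever an event page goes live gives feed aggregators - and, indirectly, Google - a steady signal that new URLs exist.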