Google Index/Cache questions
-
I have 15k+ pages, but only 4.5k pages indexed.
What relation does the Google cache have to indexing pages? My site gets cached every two days. The competition in my SERP takes 2-3 weeks to get cached. What does this indicate? Is your cache date your last Google crawl?
How can I get Google to crawl my site? Is there a way I can get Google to crawl my site starting from an internal page? That way I could set up a better linking structure and benefit from doing activities that get that page indexed, to help get my site indexed more thoroughly...
-
I guess I might try that. I was concerned that 40-50 links on an internal page (very internal = no PR) would kill that page and any chance of the internal pages ranking.
-
Yes, you have a good take on it, JML. I too would love to hear what others have to say about this.
Have you thought about putting a "View All" option (link) on your listings pages? This could make it possible for both users and Googlebot to access all of the listings without dealing with the latency inherent in paginated pages. (Google's own research has shown that users prefer scrolling through lots of content to moving from one page to another... just a thought.)
-
So if I can get Google to come to an interior page, how deep will it dig? Perhaps I need to figure out how many submissions Google allows per month, then have that many interior pages, each with a significant number of links leading to additional static pages, each of which links out as well.
Like a pyramid. Google enters interior page (A). It's a static page (WordPress) with 99 links, each pointing to a page that itself links to 99 pages, which in turn link to 99 more. Google would go about three levels in? I could get more pages indexed like this.
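Doing the arithmetic on that pyramid idea (the three-level crawl depth is an assumption here, not a documented Googlebot limit), a quick sketch of how many pages each level could expose:

```python
def reachable_pages(fanout: int, depth: int) -> int:
    """Total pages reachable within `depth` link-hops, counting the entry page."""
    return sum(fanout ** level for level in range(depth + 1))

# Entry page (A) with 99 links per page, followed one, two, three levels deep:
for depth in (1, 2, 3):
    print(f"depth {depth}: {reachable_pages(99, depth):,} pages reachable")
```

At three levels deep that is already around 980,000 pages, far more than the 15k on the site, so in principle the fanout is not the bottleneck.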
My site is linked well. It's dynamic content (real estate listings) with hubs set up. There is a lot of pagination, which is what I think Google stops at. It enters subdivision "X" and there are 10 listings per page, paginated. There are 300 listings in that subdivision. I don't think Google is going very far into that subdivision's listing pages. I need to understand Google indexing better.
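One way to make sure Googlebot can discover the deep paginated pages regardless of how far it crawls is to list every paginated URL in an XML sitemap. A minimal sketch; the domain and the `?page=` URL scheme are hypothetical stand-ins for the site's real listing URLs:

```python
import math
from xml.sax.saxutils import escape

def paginated_urls(base_url: str, total_listings: int, per_page: int):
    """Build one URL per results page for a paginated listing hub."""
    pages = math.ceil(total_listings / per_page)
    return [f"{base_url}?page={n}" for n in range(1, pages + 1)]

def build_sitemap(urls):
    """Assemble a bare-bones XML sitemap from a list of URLs."""
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )

# Subdivision "X": 300 listings at 10 per page -> 30 paginated URLs.
urls = paginated_urls("https://example.com/subdivision-x/", 300, 10)
sitemap_xml = build_sitemap(urls)
```

The generated file would then be submitted through Webmaster Tools so Google sees all 30 pages of the subdivision without having to click through the pagination one page at a time.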
I invite more participants. Thank you, Dana. I need more info, all.
Help!
-
Yes, you can get Google to crawl your site starting from an interior page using "Fetch as Googlebot," which is available as an item in the left-hand navigation menu in Google Webmaster Tools. After entering the URL of the page, click on "Submit to Index," and a box will open that gives you the choice of submitting just that URL or submitting the URL plus all linked pages. Choose the second option. You can only do this a limited number of times a month, so be judicious about which pages you submit this way.
As far as cache date goes, conventional wisdom is that the more recent your cache date, the better. It means Google is actually crawling your site more often. A cache date of 3 weeks or more happens when Google has come to your site or page many times but rarely finds any new content. When that's the case, Googlebot visits less frequently. Because Google values fresh content, it's better to have a newer cache date and to have Googlebot visiting more frequently.
I would be a little bit concerned that you have 15K pages and only 4.5K indexed. That ratio seems low to me and might be an indication that Googlebot is having trouble crawling your entire site. Also, how new is your site? Is it a few months old or a few years old? Googlebot crawls new sites more frequently, simply because they are new content. Over time, depending on how often your content is edited, updated, or new content is posted, the crawl schedule will change.
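If you want to check whether Googlebot is actually reaching the deeper pages, one rough approach is to parse your server's access log, keep only Googlebot hits, and bucket them by URL depth. A sketch assuming Apache/Nginx combined log format; the sample lines below are made up, so point it at your real log instead:

```python
import re
from collections import Counter

# Matches the request path and the user-agent field of a combined-log-format line.
LOG_PATTERN = re.compile(
    r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" \d+ \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_depth_counts(log_lines):
    """Count Googlebot requests grouped by URL path depth (/a/b/c -> depth 3)."""
    depths = Counter()
    for line in log_lines:
        m = LOG_PATTERN.search(line)
        if m and "Googlebot" in m.group("agent"):
            path = m.group("path").split("?")[0]
            depths[len([seg for seg in path.split("/") if seg])] += 1
    return depths

sample = [
    '66.249.66.1 - - [01/Jan/2012:00:00:01 +0000] "GET /subdivision-x/ HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [01/Jan/2012:00:00:02 +0000] "GET /subdivision-x/listing/123 HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '10.0.0.5 - - [01/Jan/2012:00:00:03 +0000] "GET /about HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
counts = googlebot_depth_counts(sample)
```

If the counts fall off sharply past depth two or three, that supports the theory that Googlebot is giving up before it reaches the deep listing pages.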
I know this is only a partial answer to your question. But hopefully it provides a little insight.
Dana