Google Index/Cache Questions
-
I have 15k+ pages, but only 4.5k of them are indexed.
How does the Google cache relate to getting pages indexed? My site gets cached every two days, while my competitors in the same SERP take 2-3 weeks to get cached. What does this indicate? Is your cache date the date of your last Google crawl?
How can I get Google to crawl my site? Is there a way to get Google to start crawling from an internal page? That way I could set up a better linking structure and benefit from doing activities that get that page indexed, which would help get my site indexed more thoroughly...
-
I guess I might try that. I was concerned that 40-50 links on an internal page (very deep internal = no PR) would kill that page and any chance of the linked internal pages ranking.
-
Yes, you have a good take on it, JML. I too would love to hear what others have to say about this.
Have you thought about putting a "View All" option (link) on your listings pages? This could make it possible for both users and Googlebot to access all of the listings without having to deal with the latency inherent in paginated pages. (Google's own research has shown that users prefer scrolling through lots of content to moving from one page to another... just a thought.)
-
So if I can get Google to come to an interior page, how deep will it dig? Perhaps I need to figure out how many submissions Google allows per month, then have that many interior pages, each with a significant number of links leading to additional static pages, which in turn link out as well.
Like a pyramid: Google enters interior page A, a static page (WordPress) with 99 links, each leading to a page with 99 more links, which link to 99 pages in turn. Would Google go about three levels in? I could get more pages indexed like this.
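A rough back-of-the-envelope sketch of that pyramid idea (hypothetical numbers, assuming the crawler actually follows every link at every level, which is not guaranteed in practice):

```python
# Sketch: total pages reachable from one interior entry page if every
# page links to 99 new static pages and the crawler follows links
# `depth` levels deep. The 99-link fan-out is the hypothetical figure
# from the pyramid idea above.
def pages_reachable(fan_out, depth):
    """Total pages reached within `depth` levels, counting the entry page."""
    return sum(fan_out ** level for level in range(depth + 1))

if __name__ == "__main__":
    for depth in range(1, 4):
        print(f"depth {depth}: {pages_reachable(99, depth)} pages")
```

So even if Googlebot only goes three levels in, a 99-link fan-out covers nearly a million pages on paper; the real constraint is whether it chooses to crawl that many.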
My site is well linked internally. It serves dynamic content (real estate listings) organized around hub pages. There is a lot of pagination, which is where I think Google stops. It enters subdivision "X", which has 300 listings paginated at 10 per page, and I don't think Google is crawling very far into that subdivision's listing pages. I need to understand Google indexing better.
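A quick sketch of why pagination depth matters here (using the 300-listing subdivision above, and assuming the only path through the paginated listings is sequential next-page links, versus a single "View All" page):

```python
# Sketch: how many clicks from the subdivision hub page the deepest
# listings page sits, with sequential pagination vs. a view-all page.
# Assumes the hub links only to page 1 and each page links to the next.
import math

def deepest_click_depth(listings, per_page, view_all=False):
    """Clicks from the hub page to reach the last page of listings."""
    if view_all:
        return 1  # hub -> single view-all page holding every listing
    return math.ceil(listings / per_page)  # hub -> pg1 -> pg2 -> ...

print(deepest_click_depth(300, 10))        # sequential pagination
print(deepest_click_depth(300, 10, True))  # with a view-all page
```

Under those assumptions the last listings page is 30 clicks deep, which is well past where a crawler will typically go; a view-all page collapses that to one click.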
I invite more participants. Thank you, Dana. I need more info from everyone.
Help!
-
Yes, you can get Google to crawl your site starting from an interior page using "Fetch as Googlebot", which is available in the left-hand navigation menu in Google Webmaster Tools. After entering the URL of the page, click "Submit to Index"; a box will open giving you the choice of submitting just that URL or the URL plus all linked pages. Choose the second option. You can only do this a limited number of times per month, so be judicious about which pages you submit this way.
As far as the cache date goes, conventional wisdom is that the more recent your cache date, the better; it means Google is actually crawling your site more often. A cache date of 3 weeks or more happens when Googlebot has visited your site or page many times but rarely found new content. When that's the case, Googlebot visits less frequently. Because Google values fresh content, it's better to have a recent cache date and to have Googlebot visiting more frequently.
I would be a little concerned that you have 15K pages and only 4.5K indexed. That ratio seems low to me, and might be an indication that Googlebot is having trouble crawling your entire site. Also, how new is your site? Is it a few months old or a few years old? Googlebot crawls new sites more frequently, simply because they are new content. Over time, depending on how often your content is edited, updated, or added to, the crawl schedule will change.
I know this is only a partial answer to your question. But hopefully it provides a little insight.
Dana