My website's internal pages are not being cached with the latest data
-
On our website we have a sector list. The home page displays the main category list; the user clicks a main category, then selects a sub-category to reach the result page.
EX: Agriculture->Agribusiness->Rice
The Agriculture page is indexed, but the Agribusiness and Rice pages are not being cached; they still show an old cache date of 23 July 2013. I have submitted the sitemap four times since then, and I have submitted some URLs manually in Webmaster Tools, but my pages still have not been re-cached.
Please suggest a solution and what the problem might be.
Thank you in advance,
Anne
-
Hi Anne
I would make sure the page is in fact accessible via the crawler.
1. First check the page itself in something like URI Valet and make sure it responds with a 200 OK status code. Use Googlebot as the user agent.
2. You can also "Fetch as Googlebot" in Webmaster Tools and submit the URL from there. Do the fetch and, assuming it returns your 200 code, re-submit the page to the index.
3. You can also try crawling the site with Screaming Frog SEO Spider (with Googlebot as the user agent) and see if those pages come up in the crawl.
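The status-code check in step 1 can also be scripted. Here is a minimal sketch using only Python's standard library, with Googlebot's published user-agent string; the example URL in the usage note below is a placeholder, not one of Anne's actual pages:

```python
import urllib.request

# Googlebot's advertised user-agent string
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def make_googlebot_request(url: str) -> urllib.request.Request:
    """Build a request that identifies itself as Googlebot."""
    return urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})

def fetch_status(url: str) -> int:
    """Return the HTTP status code seen when fetching as Googlebot.
    Note: urlopen follows redirects, so this is the final status."""
    with urllib.request.urlopen(make_googlebot_request(url)) as resp:
        return resp.status
```

A 200 from something like `fetch_status("http://example.com/agribusiness")` tells you the page itself is reachable to a Googlebot-identified client; a 404, 500, or unexpected redirect would point straight at the crawl problem.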
Lastly, I am curious how you know the "indexed date" of the page. If a page is cached you can see its cache date, but I am not sure where an indexed date would come from. Also, Google may simply not re-cache or re-index a page for a while if it has lower PageRank and/or the content is not new and fresh; it sees no reason to update the cache.
Also, have these URLs ever been cached?
-Dan
Related Questions
-
Car Dealership website - Duplicate Page Content Issues
Hi, I am currently working on a large car dealership website. I have just run a Moz crawl, and it's flagging a lot of duplicate page content issues, mostly on used car pages. How can I get around this when the site stocks many near-identical cars (same model, colour, age, mileage, etc.)? The only unique thing about them is the registration plate. How do I get past this duplicate content issue if all the info is essentially the same? Has anyone experienced this when working on a car dealership website? Thank you.
Technical SEO | karl621 -
How do I setup sitemaps for an international website?
I am adding translated versions of my site on subdomains, for example es.example.com. Should I add each subdomain to Google Webmaster Tools? Will each need its own sitemap?
Technical SEO | EcommerceSite0 -
3,511 Pages Indexed and 3,331 Pages Blocked by Robots
Morning, So I checked our site's index status in WMT, and I'm being told that Google is indexing 3,511 pages and the robots are blocking 3,331. This seems slightly odd, as we're only disallowing 24 pages in the robots.txt file. In light of this, I have the following queries: Do these figures mean that Google is indexing 3,511 pages and blocking 3,331 other pages? Or does it mean that it's blocking 3,331 of the 3,511 indexed pages? As only 24 URLs are disallowed in robots.txt, why are 3,331 pages being blocked? Will these be variations of the URLs we've submitted? Currently, we don't have a sitemap. I know, I know, it's pretty unforgivable, but the old one didn't really work and the developers are working on the new one. Once submitted, will this help? I think I know the answer to this, but is there any way to ascertain which pages are being blocked? Thanks in advance! Lewis
Technical SEO | PeaSoupDigital0 -
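On the last point in the question above, which URLs a robots.txt file blocks can be checked locally. A minimal sketch with Python's standard `urllib.robotparser`, using hypothetical disallow rules in place of the site's real robots.txt:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# Hypothetical rules standing in for the site's actual robots.txt
rp.parse("""\
User-agent: *
Disallow: /private/
Disallow: /tmp/
""".splitlines())

print(rp.can_fetch("Googlebot", "http://example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/public/page.html"))   # True
```

Running each indexed URL through `can_fetch` shows exactly which ones the disallow rules catch; comparing that count against WMT's 3,331 figure would show whether the blocking really comes from robots.txt or from something else, such as meta robots tags or X-Robots-Tag headers.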
Many Pages Being Combined Into One Long Page
Hi All, In talking with my internal developers, UX, and design team, there has been a big push to move from a "tabbed" page structure (where each tab is its own page) to combining everything into one long page. It looks great from a user-experience standpoint, but I'm concerned that we'll lose rankings for the tabbed pages that are going away, even with a 301 in place. I initially recommended #! or pushState for each "page section" of the long-form content, but there are technical limitations with this in our CMS. The next idea I had was to still leave those pages out there and link to them in the source code, but this approach may get shot down as well. Has anyone else had to solve this issue? If so, how did you do it?
Technical SEO | AllyBank1 -
Blog Ranking NOT home page main website?!
Hi, Our blog (http://blog.thailand-investigation.com) is ranking for some of our major keywords, but our home page (http://www.thailand-investigation.com) is not. Our blog is WordPress and our main website is HTML. It seems the search engines treat them as two separate websites. When I check the incoming links to our website, I also get the blog's links. Is that normal? Do I have to build some kind of relationship, or add some code saying that it is our blog? I don't know! I'm not an SEO specialist or even a webmaster; I'm a small business owner who takes care of his own website. I built it myself and never formally learned any of this, so please help! Thanks
Technical SEO | MichelMauquoi0 -
Getting 404 error when open the cache link of my site
My site is hazanstadservice.se, and when I try to open its cached copy to check the cache date, I get a 404 error from Google. I don't know why. The cache page URL is http://webcache.googleusercontent.com/search?q=cache:j99uW96RuToJ:www.hazanstadservice.se/+&cd=1&hl=en&ct=clnk.
Technical SEO | Softlogique0 -
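For anyone checking cache dates by hand: the cache viewer also accepts a plain `cache:` query built from the page URL alone, so the long token in the question's URL is not required. A minimal sketch of building that lookup URL:

```python
from urllib.parse import quote

def google_cache_url(page_url: str) -> str:
    """Build the webcache.googleusercontent.com lookup URL for a page."""
    return ("http://webcache.googleusercontent.com/search?q=cache:"
            + quote(page_url, safe=":/"))

print(google_cache_url("http://www.hazanstadservice.se/"))
# http://webcache.googleusercontent.com/search?q=cache:http://www.hazanstadservice.se/
```

A 404 at this URL typically just means Google currently holds no cached copy of the page, not that the page itself is broken, so it is worth checking the live URL's status separately.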
Mass 404 pages
Hi Guys, If I had to take down the majority of my site, removing all that content and the links pointing to it, how would the search engines react? Would I get a penalty for the majority of the site suddenly going missing? My only concern is a loss of traffic on the remaining pages. Thanks!
Technical SEO | DPASeo0 -
Our development team is planning to make our website nearly 100% AJAX and JavaScript. My concern is crawlability, or lack thereof. Their contention is that Google can read the pages using the new #! URL string. What do you recommend?
Discussion around AJAX implementations and if anybody has achieved high rankings with a full AJAX website or even a partial AJAX website.
Technical SEO | DavidChase0
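For context on that last question: Google's AJAX crawling scheme (since deprecated) worked by mapping `#!` URLs to an `_escaped_fragment_` query parameter, which the server had to answer with a static HTML snapshot. A minimal sketch of that mapping, using a hypothetical URL:

```python
from urllib.parse import quote

def escaped_fragment_url(url: str) -> str:
    """Map a #! (hashbang) URL to the _escaped_fragment_ form that
    crawlers requested under Google's AJAX crawling scheme."""
    base, sep, fragment = url.partition("#!")
    if not sep:
        return url  # no hashbang: nothing to rewrite
    joiner = "&" if "?" in base else "?"
    return base + joiner + "_escaped_fragment_=" + quote(fragment, safe="=&")

print(escaped_fragment_url("http://example.com/cars#!model=sedan"))
# http://example.com/cars?_escaped_fragment_=model=sedan
```

The catch is that the server must return a pre-rendered HTML snapshot at the `_escaped_fragment_` URL; if it serves the same empty JavaScript shell there, the `#!` scheme gains the crawler nothing.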