Cache Not Working on Our Site
-
We redesigned our site (www.motivators.com) back in April. Ever since then, we can't view the cache: it loads as a blank white page, but the cache text is at the top saying:
"This is Google's cache of http://www.motivators.com/. It is a snapshot of the page as it appeared on Jul 22, 2013 15:50:40 GMT. The current page could have changed in the meantime. Learn more. Tip: To quickly find your search term on this page, press Ctrl+F or ⌘-F (Mac) and use the find bar."
Has anyone else ever seen this happen? Any ideas as to why it's happening? Could it be hurting us? Advice, tips, suggestions would be very much appreciated!
-
Thank you for the kind words. I sent you a private message; feel free to contact me, no charge, for my advice.
-
You are more than welcome. Did it solve your problem? Because it does not seem to me that the problem is with your cache; it seems to me that the problem is too many calls to the DNS, and the site structure. Do you want to talk to me about this? I can tell you I know quite a bit about site speed.
Thomas
tzickell@blueprintmarketing.com
-
Thank you, Bereijk and Thomas, for your responses!
-
To give you a short answer: a content delivery network will help with caching quite a bit. Implement MaxCDN; your site will not be perfect, but you will not have to worry about a cache problem.
However, the problems with your site run much deeper than what your question suggests is the issue.
Honestly, you need to clean up the code quite a bit.
However, using a CDN, even Cloudflare (a free CDN), would help your site dramatically.
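In case it helps to see what a CDN actually changes: it serves your static files (images, CSS, JS) from its own edge hostname instead of your origin server. A minimal Python sketch of the idea; `cdn.example.com` and the `/static/` paths are made up for illustration, not Motivators' real setup:

```python
import re

# Made-up CDN edge hostname for illustration only.
CDN_HOST = "cdn.example.com"

def rewrite_static_urls(html):
    # Point local /static/ asset references at the CDN hostname, so the
    # browser fetches them from the CDN edge instead of the origin server.
    pattern = r'(src|href)="/(static/[^"]+\.(?:css|js|png|jpg|gif))"'
    return re.sub(pattern, rf'\1="https://{CDN_HOST}/\2"', html)

page = '<link href="/static/site.css"><img src="/static/logo.png">'
print(rewrite_static_urls(page))
```

In practice the CDN (or your platform) handles this rewriting for you; the point is just that static requests stop hitting your own server.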
Sincerely,
Thomas
-
Dallas, you have an extremely slow-loading website:
Page size: 2.7 MB
Load time: 9.99 s
Requests: 195
I honestly rarely see a page that has almost 200 requests; this is not a good thing, especially considering you're not using a quality DNS company. Use DynECT, or, if you think Dyn is a lot to spend, use DNS Made Easy.
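To put that 195 in perspective: every `<img>`, `<script src>`, and stylesheet `<link>` on the page is one more HTTP request the browser has to make before the page is done. A rough Python sketch that counts them in a snippet of markup (the HTML here is made up for illustration):

```python
from html.parser import HTMLParser

class AssetCounter(HTMLParser):
    """Counts tags that trigger an extra HTTP request."""
    def __init__(self):
        super().__init__()
        self.requests = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script", "iframe") and attrs.get("src"):
            self.requests += 1
        elif tag == "link" and attrs.get("href"):
            self.requests += 1

parser = AssetCounter()
parser.feed('<link href="a.css"><script src="b.js"></script><img src="c.png">')
print(parser.requests)  # 3 requests from just three tags
```

Run that over a real homepage's HTML and you see quickly how a page gets to 195.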
I have never heard of your hosting company. You are using http://www.webair.com/, so I cannot tell you if it is good or bad. I would move to a company that I trust; however, I know nothing of your hosting company, so it could be good, I just don't know it. Check out FireHost; they will have everything you need, including DNS and content delivery networks. But the real issue is your front end: the code is extremely bad, and there's a lot of JavaScript getting in the way of everything. I would recommend moving to Media Temple and using their DV server with Cloudflare's Railgun; it is free on Media Temple hosting accounts. And I strongly advise you not to use their GS server; it is garbage.
You have a Google PageSpeed score of 82/100.
The page "Promotional Products and Promotional ..." got an overall PageSpeed Score of 82 (out of 100).
https://developers.google.com/speed/pagespeed/insights#url=www.motivators.com&mobile=false
Because Cloudflare's Railgun rewrites your code for you, this could help you immensely, but you really need to hire a web developer and have your site code fixed to have the best possible outcome. I can recommend firsthand using
He can fix your site
https://builtwith.com/mobile.aspx?http://www.motivators.com/
http://tools.pingdom.com/fpt/#!/dCpDKu/www.motivators.com
I would add a content delivery network. MaxCDN is an inexpensive and very easy-to-implement CDN, along with being one of the few that offer support at such a low price.
I hope this was of help,
Thomas
-
The homepage takes more than 9 seconds to load; I can imagine that Google's crawler hasn't taken the time to create a snapshot of the page. I would try to speed up the website first: much better for user experience and also for rankings.
Check this report: http://www.webpagetest.org/result/130722_RJ_13WT/ and this one: http://www.webpagetest.org/result/130722_3R_143F/
Check this site about speed: http://www.strangeloopnetworks.com/web-performance-infographics/.
Not sure if that is the solution, but I would start fixing the speed issue rather than worrying about the Google cache pages...