Is anyone else noticing way longer than usual caching delays in Chrome?
-
I typically see browsers refresh within 48 hours at the longest. We pushed some changes to production about a week ago and Chrome still has the old version cached. I'm seeing some similar posts and wonder if Google is up to something and we are starting to "cache" on (pun intended)?
-
Hi Brett, thanks for the response. I am on Chrome in Windows. This is the first time that I have noticed a page cached for more than 72 hours after a site update.
-
Hi Emily,
I wonder if there's a difference between the Windows and Apple versions. I haven't noticed it on my Mac, but I haven't paid close attention on my PC. What are you running on?
Related Questions
-
Can anyone help me diagnose an indexing/sitemap issue on a large e-commerce site?
Hey guys. Wondering if someone can help diagnose a problem for me. Here's our site: https://www.flagandbanner.com/ We have a fairly large e-commerce site, roughly 23,000 URLs according to crawls using both Moz and Screaming Frog. I have created an XML sitemap (using SF) and uploaded it to Webmaster Tools. WMT is only showing about 2,500 URLs indexed. Further, WMT is showing that Google is indexing only about half (approx. 11,000) of the URLs. Finally (to add even more confusion), when doing a site search on Google (site:) it's only showing about 5,400 URLs found. The numbers are all over the place! Here's the robots.txt file:

User-agent: *
Allow: /
Disallow: /aspnet_client/
Disallow: /httperrors/
Disallow: /HTTPErrors/
Disallow: /temp/
Disallow: /test/
Disallow: /i_i_email_friend_request
Disallow: /i_i_narrow_your_search
Disallow: /shopping_cart
Disallow: /add_product_to_favorites
Disallow: /email_friend_request
Disallow: /searchformaction
Disallow: /search_keyword
Disallow: /page=
Disallow: /hid=
Disallow: /fab/*
Sitemap: https://www.flagandbanner.com/images/sitemap.xml

Anyone have any thoughts as to what our problems are? Mike
Intermediate & Advanced SEO | webrocket
-
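One quick sanity check on a robots.txt like the one above: feed the rules to Python's stdlib parser and ask which URLs are blocked. A sketch with the rules abridged to a few literal Disallow lines (the stdlib parser applies rules in file order and does not understand Google's `*` wildcard extension, so the `Allow: /` and `/fab/*` lines are omitted here; Google's own robots.txt tester is authoritative). The test paths are my own examples:

```python
import urllib.robotparser

# Abridged copy of the rules from the question
robots_txt = """\
User-agent: *
Disallow: /shopping_cart
Disallow: /search_keyword
Disallow: /temp/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

for path in ["/shopping_cart", "/temp/old.html", "/products/us-flag"]:
    print(path, "allowed" if rp.can_fetch("Googlebot", path) else "blocked")
```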
Did not get a good reply to my previous query. Can anyone help me?
I did not get a satisfactory answer to my previous question here: https://moz.com/community/q/please-provide-solution-for-my-website-duplicate-content-problem Please help me.
Intermediate & Advanced SEO | Alexa.Hill
-
Leverage browser caching
Anyone know a good tutorial on how to implement leverage browser caching? Do I need something like Cloudflare, or can I add meta tags to do this?
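For what it's worth, meta tags won't do it: browser caching is controlled by HTTP response headers (Expires / Cache-Control), which you set on the server or through a CDN like Cloudflare. A minimal sketch, assuming an Apache server with mod_expires enabled (the file types and lifetimes are illustrative, not a recommendation):

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Browsers may reuse these files without re-requesting until the period expires
  ExpiresByType image/png              "access plus 1 month"
  ExpiresByType image/jpeg             "access plus 1 month"
  ExpiresByType text/css               "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

On nginx the equivalent is the `expires` directive inside a `location` block.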
Intermediate & Advanced SEO | Cocoonfxmedia
-
Rel=Canonical to Longer Page?
We've got a series of articles on the same topic, so we consolidated the content and pasted it all together on a single page. We linked from each individual article to the consolidated page, and we put a noindex on the consolidated page. The problem: inbound links to individual articles in the series will only count toward the authority of those individual pages, and inbound links to the full article will be worthless. I am considering removing the noindex from the consolidated article and putting rel=canonicals on each individual post pointing to the consolidated article. That should consolidate the PageRank. But I am concerned about pointing a rel=canonical to an article that is not an exact duplicate (although it does contain the full text of the original; it's just that it contains quite a bit of additional text). An alternative would be not to use rel=canonicals, nor to place a noindex on the consolidated article. But then my concern would be duplicate content and unconsolidated PageRank. Any thoughts?
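For reference, the tag in question would sit in the `<head>` of each individual article, pointing at the consolidated page (the URLs here are placeholders):

```html
<!-- On /article-part-1, /article-part-2, etc. -->
<link rel="canonical" href="https://example.com/consolidated-article/" />
```

Since rel=canonical is a hint rather than a directive, Google may ignore it when the pages aren't near-duplicates, which is exactly the risk the question raises.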
Intermediate & Advanced SEO | TheEspresseo
-
Large number of new web pages to launch in bulk or to stretch over longer time?
We are going to launch about 60,000 new web pages under one established domain (Google PageRank 6), which currently has about 30,000 web pages, so the page count will triple. The new pages contain keyword variations of existing pages, with pages targeting specific niches. All pages have unique and useful content for visitors, and all pages are heavily interlinked with each other. Would you recommend stretching the launch of the new pages over some time period, or do you see no problem with launching them all on one day? In about 3 months we plan to launch another 180,000 new web pages. Thanks.
Intermediate & Advanced SEO | lcourse
-
Google SERPs do not display "cached"
When I am signed in with Google and searching sites, the snippets do not display the "cached" link. Not good, since I am trying to see when a particular page was crawled. If I log in to another server that I never use to browse and search from there, the "cached" link does show up. Assumption: Google knows who I am on my machine and is "helping" me... but is there an easy way to turn this help off?
Intermediate & Advanced SEO | Eyauuk
-
Quickseoresults.com - Anyone used them?
Has anyone had any experience with or used quickseoresults.com? I'm just looking into them now. They seem to offer a 30-day free trial based on 'white hat' tactics that gets results; you can then pay to continue their services. They seem to base their services heavily around link building, so I'm dubious.
Intermediate & Advanced SEO | PeterAlexLeigh
-
Best way to handle old re-directs?
What happens if you go back and change old 301 redirects? So instead of redirecting from A to B and then to C, we write a new redirect from A straight to C. What does Google see this as the next time it crawls the site?
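Concretely, collapsing the chain just means each legacy URL points at the final destination in the server config. A sketch assuming Apache's mod_alias (the paths are placeholders):

```apache
# Before: /page-a -> /page-b -> /page-c (two hops)
# After: each legacy URL redirects straight to the final page
Redirect 301 /page-a /page-c
Redirect 301 /page-b /page-c
```

Googlebot does follow short redirect chains, but a single hop is generally considered to pass signals more reliably and is faster for users.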
Intermediate & Advanced SEO | anchorwave