Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Is using 2 cache plugins good or not?
-
Hi, can anyone tell me whether using 2 cache plugins helps, or whether it causes issues? Also, when I used the W3 Total Cache plugin in WordPress, an inline CSS issue was flagged that needed clearing. So I tried Autoptimize, but my website Socprollect crashed while using it. Is there a solution, and can anyone tell me which plugin works best to speed up the site by optimizing JavaScript and inline CSS at the same time?
-
Thank you
-
From a technical standpoint, using 2 caching plugins only causes conflicts and unnecessary queries in the backend. Knowing you're running WordPress, I can see the (bad) score appearing on https://developers.google.com/speed/pagespeed/insights/?url=https%3A%2F%2Fwww.socprollect-mea.com%2F
-
The best way is to stick with one plugin that handles most of it. Autoptimize is, in my experience, one of the better plugins for this. The JS compression feature is a tricky one, as it can break things on your website, like the mobile menu or other features that rely on JavaScript. You need to test this before applying it.
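If the JS optimization breaks only a script or two, you do not necessarily have to switch it off entirely. A minimal sketch, assuming Autoptimize's autoptimize_filter_js_exclude filter, to be added to your theme's functions.php or a small plugin (the filename is a hypothetical example):

// Keep fragile scripts out of Autoptimize's JS optimization.
// 'mobile-menu.js' is a placeholder; list the real files that break.
add_filter( 'autoptimize_filter_js_exclude', function ( $exclude ) {
    return $exclude . ',mobile-menu.js';
} );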
-
If you want a quick speed-up, install WebP Express. It's a plugin that converts the existing JPEG files on your website to WebP, which offers far better compression than JPEG while retaining quality. This will earn you some points in Insights.
-
Try to remove any external features you do not need. I see you load a lot of external files, which adds render/load time, as Insights tells you.
-
Try to disable plugins you do not need. They cause extra queries, and that adds to the time to first byte (server response time).
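If you have shell access, WP-CLI makes this audit quick. A sketch (the plugin slug is just an example):

# List the active plugins, then deactivate one you no longer need.
wp plugin list --status=active
wp plugin deactivate hello-dolly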
I optimize both WordPress and other websites, and I usually get very good scores (around 85% for WordPress, and 99% for plain non-WordPress sites). These scores do have an effect on SEO, but it's only marginal. You also need fewer resources to serve the same content to your visitors.
-
Related Questions
-
Is a 301 redirect the only way when using vanity URLs?
We have been using vanity URLs for some of our pages, mostly pages that have a long URL. But now the problem is that the vanity URL is displayed in the search engine results when a keyword related to the page is entered. I checked Google Search Console: the vanity URL is indexed and the original URL remains unindexed. What should I do? Is adding a 301 redirect to the vanity URLs a solution? Some of the vanity URLs are not redirecting to the original, and some of the original pages are not getting traffic. Also, can using a canonical tag help?
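For illustration, the redirect itself is simple; a minimal plain-PHP sketch (both URLs are placeholders, and on WordPress you would normally let a redirect plugin or a server rule handle this instead):

<?php
// Permanently (301) redirect a vanity URL to the original page.
header( 'Location: https://example.com/the-original-long-url/', true, 301 );
exit;

A canonical tag on the vanity page pointing at the original is a softer hint; a 301 is the stronger signal if you want the vanity URL to drop out of the index.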
Technical SEO | tejasbansode
Why does my Google web cache redirect to my homepage?
Why does my Google web cache appear for a short period of time and then automatically redirect to my homepage? Is there something wrong with my robots.txt? The only paths I have blocked are below:

User-agent: *
Disallow: /bin/
Disallow: /common/
Disallow: /css/
Disallow: /download/
Disallow: /images/
Disallow: /medias/
Disallow: /ClientInfo.aspx
Disallow: /*affiliateId*
Disallow: /*referral*
Technical SEO | Francis.Magos
Is it good to redirect millions of pages to a single page?
My site has approximately 10 lakh (1 million) genuine URLs. But due to some unidentified bug, the site has created approximately 10 million irrelevant URLs. Since we don't know the origin of these non-relevant links, we want to redirect or remove all of them. Is it good to redirect such a high number of URLs to the home page, or to throw a 404 for these pages? Any other suggestions to solve this issue?
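For what it's worth, redirecting millions of junk URLs to the home page tends to be treated by Google as soft 404s, so answering them with a 404 or 410 directly is usually cleaner. A minimal WordPress sketch, assuming the junk URLs share a recognizable pattern (the pattern below is hypothetical):

// Return 410 Gone for junk URLs matching a known pattern.
add_action( 'template_redirect', function () {
    // Hypothetical pattern; replace with whatever the bug actually generates.
    if ( preg_match( '#^/junk-prefix/#', $_SERVER['REQUEST_URI'] ) ) {
        status_header( 410 );   // tells crawlers the URL is intentionally gone
        nocache_headers();
        exit;
    }
} );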
Technical SEO | vivekrathore
Am I wasting my time using pingler.com?
OK, so here is the question. A few months ago I decided to join pingler.com and pay for the service, having previously used the free version. But after four months I have not noticed any changes, and I am wondering whether I am wasting my time with the paid service. I would love to hear from people who have used or are using the service: is this a waste of time, and could my money be better spent elsewhere? I look forward to hearing your thoughts.
Technical SEO | ClaireH-184886
Should we use Google's crawl delay setting?
We’ve been noticing a huge uptick in Google’s spidering lately, and along with it a notable worsening of render times. Yesterday, for example, Google spidered our site at a ratio of 30:1 (Google spider vs. organic traffic). In other words, for every organic page request, Google hits the site 30 times. Our render times have lengthened to an average of 2 seconds (and up to 2.5 seconds). Before this renewed interest Google has taken in us, we were seeing closer to one-second average render times, and often half of that. A year ago, the ratio of spider to organic was between 6:1 and 10:1. Is requesting a crawl delay from Googlebot a viable option? Our goal would be only to reduce Googlebot traffic, and hopefully improve render times and organic traffic. Thanks, Trisha
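For reference, the directive is a single robots.txt line, as in the sketch below, but note that Googlebot does not honor Crawl-delay; Google's crawl rate is adjusted through Search Console instead. Other crawlers, such as Bingbot, do respect it:

User-agent: *
Crawl-delay: 10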
Technical SEO | lzhao
Which is the best WordPress sitemap plugin?
Does anyone have a recommendation for the best XML sitemap plugin for WordPress sites? Or do you steer clear of plugins and use a sitemap generator, then upload the file to the root manually?
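For the manual route, a sitemap is just an XML file uploaded to the site root; a minimal sketch following the sitemaps.org protocol (URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/sample-page/</loc>
    <lastmod>2020-01-01</lastmod>
  </url>
</urlset>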
Technical SEO | simoncmason
How to use overlays without getting a Google penalty
One of my clients is an email subscriber-led business offering deals that are time sensitive and which expire after a limited, but varied, time period. Each deal is published on its own URL, and in order to drive subscriptions to the email, an overlay was implemented that would appear over the individual deal page, so that the user was forced to subscribe if they wished to view the details of the deal. Needless to say, this led to the threat of a Google penalty, which appears (fingers crossed) to have been narrowly avoided as a result of a quick response on our part to remove the offending overlay. What I would like to ask you is whether you have any safe and approved methods for capturing email subscribers without revealing the premium content to users before they subscribe? We are considering the following approaches:

1. First Click Free for Web Search - an opt-in service by Google which is widely used for this sort of approach, and which stipulates that you have to let the user see the first item they click on from the listings, but can put up the subscriber-only overlay afterwards.

2. No index, no follow - if we simply noindex, nofollow the individual deal pages where the overlay is situated, will this remove the "cloaking offense" and therefore the risk of a penalty?

3. Partial view - if we show one or two paragraphs of text from the deal page, with the rest covered by the subscribe-now lock-up, will this still be cloaking?

I will write up my first SEOmoz post on this once we have decided on the way forward and monitored the effects, but in the meantime, I welcome any input from you guys.
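On the noindex/nofollow option (2 above), the tag is a single line in each deal page's head, as sketched below; bear in mind it removes the pages from the index entirely, so they would stop earning search traffic of their own:

<meta name="robots" content="noindex, nofollow">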
Technical SEO | Red_Mud_Rookie