Is a WordPress Website Backup Service Worth the Investment?
-
I was horrified to learn that my hosting company, InMotion Hosting, does not offer redundant backups; it is on the customer to set up backups to ensure they don't lose their data.
I plan to back up to Google Drive three times a week, keeping 12 backups, and also keep three backups on our server (Sunday, Tuesday, Thursday). So if something goes wrong and we catch it within a week, we can restore directly from our server.
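For context, here's a minimal sketch of how that rotation could be scripted on the server without a plugin. The paths, database name, and retention count are placeholders for our setup, and the Google Drive upload would be a separate step (e.g. via rclone):

```python
#!/usr/bin/env python3
"""WordPress backup with simple retention -- a sketch, not production code.

Paths, the database name, and the retention count are placeholders;
adjust them for your own server. Shipping the archive offsite
(e.g. to Google Drive) would be a separate step.
"""
import subprocess
import tarfile
from datetime import datetime
from pathlib import Path

SITE_DIR = Path("/var/www/html")      # assumed WordPress root
BACKUP_DIR = Path("/var/backups/wp")  # where on-server archives live
KEEP = 3                              # on-server copies (Sun/Tue/Thu)

def backup() -> None:
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")

    # Dump the database first; keeping credentials in ~/.my.cnf
    # means they never appear on the command line.
    sql_path = BACKUP_DIR / f"db-{stamp}.sql"
    with open(sql_path, "wb") as fh:
        subprocess.run(["mysqldump", "wordpress"], stdout=fh, check=True)

    # Archive the site files together with the fresh database dump.
    archive = BACKUP_DIR / f"site-{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(SITE_DIR, arcname="site")
        tar.add(sql_path, arcname=sql_path.name)
    sql_path.unlink()

    # Retention: keep only the newest KEEP archives on the server.
    for old in sorted(BACKUP_DIR.glob("site-*.tar.gz"), reverse=True)[KEEP:]:
        old.unlink()

if __name__ == "__main__":
    backup()
```

Cron would run this on the Sunday/Tuesday/Thursday schedule.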
There are website backup services such as BlogVault. Do they offer any meaningful advantage over taking the contents of the entire server (16 gigs) and backing it up? They also offer malware removal. Does this have value?
Is backing up to an external service like Google Cloud while simultaneously backing up on the server a safe way to proceed? If not, what is the simplest and most effective way to back up? I prefer to avoid adding any plugins to WordPress, as our site already has too many (about 30).
Thanks!!
Alan -
I agree with Salem; UpdraftPlus is the way to go, though I'd pay for the premium version, as you get support & help for $1 per week.
-
I am using UpdraftPlus, the free version.
It automates the backup to my AWS S3 bucket, and I think it's better than many paid plugins.
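If you ever want to reproduce that S3 upload step without a plugin, it's roughly this (a boto3 sketch; the bucket name and archive path are made up):

```python
# Rough sketch of the offsite upload a plugin like UpdraftPlus automates.
# Bucket name and archive path are hypothetical; credentials come from
# the standard AWS config (environment, ~/.aws, or an instance role).
import boto3

def upload_backup(archive_path: str, bucket: str = "my-wp-backups") -> None:
    s3 = boto3.client("s3")
    key = "wordpress/" + archive_path.rsplit("/", 1)[-1]
    s3.upload_file(archive_path, bucket, key)  # multipart is handled for you
    print(f"uploaded {archive_path} to s3://{bucket}/{key}")

upload_backup("/var/backups/wp/site-20240101-020000.tar.gz")
```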
-
With web hosting companies, it depends on whether you have literally just bought the hosting space alone (to save as much money as possible) or a bundle that includes other services. Even an automated backup solution is part of their product, so it usually costs extra (depending on who you go with, of course!)
If a deal seems too good to be true, it usually is.
I'm not sure about any of the auto-backup services you can plug into a WordPress site, but to be honest I'd be extremely wary of any automated malware removal. On my desktop PC I often script little tools to help me out or make my workflow more efficient. Because they don't fit the 'mold' of what regular software is 'expected' to do, they often get (falsely) flagged by my own AV as malware.
A lot of malware shields these days look for 'unusual' items instead of 'harmful' ones. This makes me wonder: if you innovated and designed or developed something really cool on your site, would the malware shield strip it out? Then if your site gets hit and you have to restore it, you might lose some of your site's unique architecture and functionality just because it didn't fit the 'norm'.
On a desktop environment, I know there are lots of spam tools produced to solve problems that don't exist, which can often make a machine run worse than before. A good example is all the sh*tty driver-update and registry-'mending' software that just leaves your system a steaming pile of garbage. Anti-malware can be good, but it has a propensity to go OTT, and for the devs to get extremely lazy and build tools that seek out unusual applications instead of damaging ones (prejudice in coding).
I would assume the same problems creep into web-based software. So personally I'd try to be smart about what I installed and how I protected the site, but rather than having some half-assed malware strip on backup, I'd just do the backups without any 3rd-party intervention. When you back up your site, you know it's working. Why would you want to automatically alter the backup as it's saved, in a way that might break it when you restore the site's image? Seems like an unnecessary risk sold as a product to me.
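If the goal is knowing a backup is intact without letting anything rewrite it, a plain checksum is enough: record a SHA-256 hash when the archive is created, then verify it before any restore. A minimal sketch (file locations are hypothetical):

```python
# Record a SHA-256 checksum next to each backup archive, then verify it
# before restoring. File locations are hypothetical.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):  # 1 MiB chunks
            digest.update(chunk)
    return digest.hexdigest()

def record(archive: Path) -> None:
    # Writes e.g. site-20240101.tar.gz.sha256 alongside the archive.
    (archive.parent / (archive.name + ".sha256")).write_text(sha256_of(archive))

def verify(archive: Path) -> bool:
    expected = (archive.parent / (archive.name + ".sha256")).read_text().strip()
    return sha256_of(archive) == expected
```

Restore only from a copy that verifies, and skip anything that wants to 'clean' the archive on the way in.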