Best Website Security Service
-
Having been hacked twice and, it seems, still suffering problems as a result (file changes and alterations, etc.), I'm wondering if there is a good paid service that provides security for websites?
I'm an online travel agent, so our website's up-time and search engine position are vital, but I'm spending more time trying to sort out our website's security than selling travel.
It would be sensible to pay for a service that would ensure, as far as possible, that our site is secure and that any optimisation we do is not cancelled out by hacking and security problems.
If anyone could recommend a good company I would be very grateful.
Colin
-
Thanks very much Matt.
-
No problem - an SSL certificate on its own isn't going to prevent the problem you have, which is why I mentioned going down the route of an SSL with extras such as daily malware scanning and a weekly vulnerability assessment.
Verisign would most likely have alerted you to the potential issue beforehand, so you could have fixed the vulnerability. It will also alert you if your site is compromised with malware, so you can take immediate action.
As far as SSL goes: now that your site has been compromised, and given that you are acting as a travel agent, you want to make sure your pages are as secure as possible, and a Verisign logo will help customers trust the site enough to enter their data.
You will still need to employ the help of someone who is knowledgeable about the CMS you are using, to make sure it is set up optimally and not leaving easily exploited windows open, so to speak.
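To illustrate the principle behind that kind of daily scan (this is only a sketch of the idea, not how Verisign actually implements it): a scheduled script that fingerprints every file and compares the hashes against the previous run will flag exactly the sort of silent file alterations described above.

```python
# Hypothetical sketch of a file-integrity check: fingerprint files and
# diff today's scan against a saved baseline. Paths here are illustrative.
import hashlib


def file_fingerprint(data: bytes) -> str:
    """Return a SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(data).hexdigest()


def changed_files(baseline: dict, current: dict) -> list:
    """Compare a saved {path: digest} baseline against a fresh scan and
    return the paths whose contents changed or newly appeared."""
    return sorted(
        path for path, digest in current.items()
        if baseline.get(path) != digest
    )
```

Run from cron, any path returned by `changed_files` that you didn't edit yourself is a red flag worth investigating immediately.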
-
Most compromises of that type I've seen have been down to a flaw in either the CMS or the way it was set up (usually file permissions). This type of breach is far more common than cracked passwords, servers being compromised, etc. They can be annoying to track down, but are usually more easily fixed.
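As a quick illustration of the permissions point (a minimal sketch, not a full audit): the classic set-up mistake is leaving files world-writable, which lets any process on a shared server modify them.

```python
# Hypothetical sketch: spot world-writable entries from (path, mode) pairs,
# e.g. collected by walking the web root with os.walk + os.stat.
import stat


def world_writable(paths_and_modes):
    """Given an iterable of (path, mode) pairs, return the paths that
    any user on the server could write to (the 'other write' bit set)."""
    return sorted(
        path for path, mode in paths_and_modes
        if mode & stat.S_IWOTH
    )
```

On the command line, `find /path/to/webroot -perm -o+w` does the same job; anything it reports should almost certainly be tightened to 644 (files) or 755 (directories).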
-
Thanks Matt. I take your point.
We've added a number of security measures to the CMS, but it looks like we haven't added enough.
Your suggestion to search the platform specific communities is really useful.
Cheers.
-
Just adding an SSL certificate isn't going to help if you are having the sort of problems that you mention.
Your core issue is most likely to be with your CMS. If that is the case, then you are probably best finding someone who specialises in that CMS (or generally in ColdFusion) who can find the source of the problem and lock it down.
If you are using an off-the-shelf CMS, make sure that it is up to date and fully patched. Check the platform-specific communities for people having similar issues and see whether they have successfully prevented the problem recurring.
Good luck. You might just find that there is an insecure upload script or something similar, and once you find it the problems will simply end.
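For what it's worth, the fix for an insecure upload script usually comes down to strictly whitelisting what can be uploaded. A rough sketch of the idea (illustrative Python; the same check would be written in ColdFusion on this site, and the allowed extensions are just examples):

```python
# Hypothetical sketch: whitelist-based upload filename validation.
# Rejects path components and double extensions like "shell.cfm.jpg".
ALLOWED_EXTENSIONS = {"jpg", "jpeg", "png", "gif", "pdf"}


def is_safe_upload(filename: str) -> bool:
    """Accept only plain filenames with a single, whitelisted extension."""
    if "/" in filename or "\\" in filename or filename.startswith("."):
        return False
    parts = filename.split(".")
    if len(parts) != 2:
        return False
    return parts[1].lower() in ALLOWED_EXTENSIONS
```

The key design choice is whitelisting known-safe extensions rather than blacklisting dangerous ones, since attackers are endlessly inventive with extensions a blacklist forgets.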
-
Hi Matt, thanks for replying.
I have looked at Verisign but wonder if it's comprehensive enough (if there is such a service).
I wonder if Verisign's service would flag up, or better still prevent, something like my robots.txt file being altered by a malicious script?
Or whether the malicious script would have been unable to access my site at all if I was with Verisign?
Colin
-
Have you considered Verisign - http://www.verisign.com/ - and one of their SSL solutions, with extras such as:
http://www.symantec.com/verisign/ssl-certificates/secure-site-pro-ev?inid=vrsn_symc_ssl_SSPEV