No Silver Bullets
There is no single process or technology that will secure your organization's information
A recent study of endpoint detection and response (EDR) software by a team of Greek academics (An Empirical Assessment of Endpoint Security Systems Against Advanced Persistent Threats Attack Vectors, by George Karantzas and Constantinos Patsakis) highlights the challenges CISOs face when choosing which tools to deploy in their environments to counter malware and APT threats.
In what will likely come as a surprise to many CISOs, the results show that almost all of the top EDR tools on the market could be bypassed by common attack methods used by threat actors. The researchers enabled nearly all of the protections the tools offer in order to gauge their effectiveness. Considering that some of those protections would likely be turned off in real-world deployments because of their impact on users, the tools' real-world effectiveness is probably even worse than these results suggest.
As with similar published studies in the past, I expect a lot of noise from vendors claiming that the results are invalid, that their tools were not configured appropriately, and so on. My intent is not to point fingers at any vendor. Whoever tops the heap in one test is likely to fall in the next; technologies and attack methods are constantly changing, and tool vendors are in a perpetual game of catch-up.
Rather, I wanted to use this study to highlight the challenge CISOs face in making technology choices to mitigate threats. This report is a rarity in that it provides in-depth information on how the test was structured and how the tools were configured. For most of the thousands of security products on the market, all CISOs have to rely on is the vendor's biased marketing and test results, and possibly some functional comparisons from a firm like Gartner. Even companies that do internal evaluations are more likely to focus on management aspects, with only basic checks of functionality. I don't believe I have ever seen technical evaluations conducted to this depth in any company I have worked for or with; almost no company has the time, resources, or expertise to go to this level.
Even feedback from other CISOs is not likely to be much help. For one, CISOs have little way of knowing how well their chosen tool is working unless they suffer a significant breach that is traceable to the tool's failure to prevent it. For another, it is a rare CISO who will publicly admit to making a suboptimal choice.
The problem is that, in the end, CISOs rarely have the kind of empirical data they need to make effective tool-selection decisions, so the decision is more likely to be based on the slickness of the presentation, the tool's UI, and the price than on the tool's real-world effectiveness. This is not the CISO's fault; they often have no way to know.
I state this so that CISOs can be clear, both when making their choices and when presenting them to leadership, that to a certain (perhaps large) extent they are taking a shot in the dark. Never promise, or fall for the hype, that any tool is the silver-bullet solution; instead, be realistic in stating that you made the best choice you could based on limited data, and that something is usually better than nothing at all. Just as with Heuristic Risk Management, you need to practice heuristic tool selection and not worry about finding the "perfect" tool - it does not exist. Do your due diligence, make your selection, and move on: your attackers are not waiting.
The other lesson this report highlights is the need to focus on detection and response, not just prevention. It's not a matter of if you will be breached, but when (or when you will notice). How you respond will be the true measure of your success in surviving the attack.