Why cybersecurity products always defy traditional user reviews

I read with interest the latest batch of evaluation data from MITRE on various endpoint solutions, this time focused on the detection, response and containment capabilities of those solutions against malware attributed to the FIN7 and Carbanak threat groups. While academically interesting, it illustrates how difficult it is to review cybersecurity products in the endpoint protection category and to attribute a “best” label to a specific product in a specific category (be it endpoint or anything else).

First of all, let's focus on the report itself. The samples were derived from two threat groups: Carbanak and FIN7.

Carbanak is a strange choice, seeing as the leader of the group was arrested in Spain back in 2018 by Europol and the group is no longer active. Moreover, all of its malware samples are now publicly accessible, so even a legacy signature-based anti-virus would detect and stop them just fine.

FIN7 is also effectively retired: U.S. law enforcement identified the front company it operated behind in 2018. The group had targeted the U.S. retail, restaurant, and hospitality sectors since mid-2015, often using point-of-sale malware. So how is any of this relevant to businesses today?

Well, simply put, it isn't. If you were an e-commerce company selling widgets online in Europe, none of these results would be relevant to you. At all. Even setting aside the fact that the tests were conducted years ago and the threat actors have since disbanded, it highlights the growing disconnect between the labels attributed to a product and the requirements companies actually have for an endpoint solution. Taking these reports at face value, it would be easy to pick the product that performed “best” and add it to your cybersecurity arsenal, but you'd simply be falling into the requirements trap that befalls many a company when acquiring technology: you're basing the acquisition not on your own requirements, but on someone else's.

If we took a different product, let's say the humble web application firewall (WAF), your requirements list would be quite long, divided roughly into two sections (functional and non-functional requirements) and comprising a giant checklist of activities it needs to perform, at the specific level your business requires. Can it process a certain amount of traffic? What kinds of attacks can it actually see? Can it see them in both GET and POST requests? (You'd be surprised how many WAFs can't see things like XXEs in POST requests.)
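As a rough illustration of that last checklist item, the sketch below builds the same benign XXE test payload into both a GET query string and a POST body, so you can fire each at a staging endpoint behind the WAF and compare which one gets blocked. The URL and payload here are illustrative assumptions, not a real endpoint or a definitive test harness.

```python
# Hedged sketch: construct paired GET/POST probes carrying the same XXE
# payload, to check whether a WAF inspects POST bodies as well as URLs.
import urllib.parse
import urllib.request

# Harmless-by-design probe: references a local file via an external entity.
XXE_PROBE = (
    '<?xml version="1.0"?>'
    '<!DOCTYPE r [<!ENTITY x SYSTEM "file:///etc/hostname">]>'
    "<r>&x;</r>"
)

def build_probes(base_url: str):
    """Return (GET, POST) request objects carrying the same XXE probe.

    If the WAF blocks the GET but passes the POST, it is likely not
    inspecting request bodies. base_url is a placeholder for your own
    staging endpoint.
    """
    get_req = urllib.request.Request(
        base_url + "?xml=" + urllib.parse.quote(XXE_PROBE),
        method="GET",
    )
    post_req = urllib.request.Request(
        base_url,
        data=XXE_PROBE.encode(),
        headers={"Content-Type": "application/xml"},
        method="POST",
    )
    return get_req, post_req
```

Each probe can then be sent with `urllib.request.urlopen` against a test environment you own; comparing the two responses (blocked vs. passed) tells you where the WAF actually looks.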

These are lists you need to draw up yourself based on your needs, not on the vendor's feature set. Once you have them, you can begin the long, laborious process of product testing, proof of value and finding something you can actually afford, which brings me to one of the important caveats missing from all product reviews: budget.

All tools have a monetary cost. Even when you are using open-source technology, you have an increased investment in manpower and support (even though the product itself is effectively free). Pricing and cost are conspicuously absent from all product reviews.

While it is understandable vendors are reluctant to divulge product pricing, those that do typically refer to “list price,” which is usually far above what any company will pay for the product.

Budget is a fundamental requirement for any acquisition, be it of an endpoint solution or anything else.

Let’s say you were a U.S.-based retailer (a popular target for the FIN7 group back in the day) and used the MITRE list as a way to pick an endpoint solution.

You would pick the top-performing endpoint solution, only to quickly find out that you cannot afford it. Tools that label themselves XDR or EDR are naturally going to be pricier than “legacy” alternatives, which can hover as low as $5 per head.

The list also ignores the fact that endpoint products must cover things like USB control, DLP, application control, encryption and other such functionality companies expect out of an endpoint solution these days. Endpoint is no longer just about protecting devices from malware.

When evaluating reviews, your last avenue is usually to seek advice from your contacts in the infosec industry, but again: beware of loose comparisons. A “good” WAF recommended by your colleague in Company A may not match your requirements, since their company will differ from yours in any of the following: infosec budget, technology stack, company culture, skillsets, business model and vertical. If by some miracle all of these match, you can give that recommendation some weight; otherwise you need to go back to basics.

Relying on reviews found online is a losing gamble when it comes to cybersecurity tools. As with everything else, caveat emptor stands: draw up your requirements first, do your own research, perform your own testing, and you'll end up with something you actually need, rather than something a vendor wants you to have.

Haynes, Alex (2021). Why cybersecurity products always defy traditional user reviews. Help Net Security. Retrieved 22 June 2021, from https://www.helpnetsecurity.com/2021/05/28/reviews-cybersecurity-products/