Complexity is Still the Enemy of Security
Ease of Use, Ease of Integration Encourages Data Protection
By Gregory Hoffer, CEO of Coviant Software
In 1999, noted cybersecurity expert Bruce Schneier wrote in his Crypto-Gram newsletter that, “The worst enemy of security is complexity. This has been true since the beginning of computers, and it’s likely to be true for the foreseeable future.” In that essay Schneier explained that the complexities inherent in the design of technology products made them difficult to test for security. Whether software, hardware, or interconnected systems, a product was built to do a certain job, and only once it was complete could it be tested to see whether it was secure. Even then, Schneier observed, testing for security was not a high priority.
“This is a time-consuming and expensive process, and almost no one bothers to go through it. If they did, they would quickly realize that most systems are far more complex to analyze, and that there are security flaws everywhere,” Schneier continued.
It’s Not 1999 Anymore
The now-common “security by design” approach was not yet in vogue, even though threat actors and cyberattacks were on the rise and recognized as a problem. Security flaws were an accepted part of using technology; a cost of doing business. A 1999 CNN Sci-Tech article on cyberattacks reported that financial losses “rose to more than $100 million for the third straight year.” How quaint. Fast forward to 2023, and some estimates put the total cost of cybercrime at a staggering $8 trillion.
Today’s technologies are far from perfect, but cybersecurity is top of mind for most organizations, and the array of tools and services available to protect networks and data goes far beyond the seemingly primitive firewall-and-antivirus approach that was common a quarter century ago. Still, complexity remains an enemy of security, though not only in the way Mr. Schneier described.
Even when technology is created using the security by design approach and tested to assure the absence of known vulnerabilities, if it is difficult to use, people may avoid the product and instead find insecure workarounds, creating more security issues for the organization. At times those workarounds manifest as a stubborn refusal to abandon the old processes or, if that is not an option, as a new way to complete whatever task is involved. That is a natural response to change when the new way is complicated, whether actually or merely perceived as such. And when that happens, complexity’s enmity with security rears its ugly head.
Workarounds are Anti-Security
We have seen this far too often in the realm of data transfer. Unfortunately, there is no shortage of easy and familiar ways of sending data from one place to another that are far from secure, so when a secure but complex process frustrates staff, they may look for an easier option like email, file sharing utilities, or consumer-grade cloud services. These work just fine when you want to send grandma a batch of photos and videos of the kids’ dance recital or the family’s summer trip to the Grand Canyon, but not for business-critical files. When organizations use them to send sensitive, even regulated, data such as personally identifiable information (PII), protected health information (PHI), intellectual property, financial files, and data tied to contractual obligations, it puts everyone at risk.
Some organizations believe they can solve the problem themselves by creating an in-house file sharing process from existing components, open-source software, and a bit of ingenuity from someone on the IT team. Once again, the problem is that the resulting solution is usually convoluted, which discourages its use. And roll-your-own tools are rarely documented by the person who created them, inevitably leading to problems when that person leaves the organization, gets sick, or goes on vacation.
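To make the risk concrete, here is a minimal sketch, with entirely hypothetical paths and names, of what such an undocumented roll-your-own transfer job often looks like in practice:

```python
# Hypothetical in-house "file transfer" job of the kind described above:
# hardcoded paths, no encryption, no logging, and a bare except that
# hides every failure from everyone.
import shutil

OUTBOX = "/data/outbox/quarterly_report.csv"  # hardcoded by the original author
DROPBOX = "/mnt/partner/incoming/"            # breaks silently if the mount is gone

def nightly_transfer() -> bool:
    try:
        shutil.copy(OUTBOX, DROPBOX)  # plain copy: no integrity check, no receipt
        return True
    except Exception:
        return False  # failure is swallowed; nobody is alerted, nothing is logged
```

When the source file or the partner mount disappears, this job simply returns `False` into the void: there is no alert, no audit trail, and no one left who remembers why the paths were chosen.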
There is also the question of what happens when something breaks, a key piece of software becomes obsolete and unsupported, or a trading partner changes its standards. When the poop hits the fan ten years after the person who wrote the code left the company, where do you go for support? It’s a lot to ask someone to untangle a miasmic puzzle of code, bug fixes, extensions, and inexpertly bolted-on changes. Furthermore, in-house solutions lack the features necessary for ensuring file security and for proving compliance with applicable regulations. If a file goes missing and you can’t prove that it was encrypted, the assumption must be that the data are compromised.
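Proving compliance requires evidence. As a rough sketch (the field names here are hypothetical, not any particular product’s format), this is the kind of audit record a managed file transfer platform generates automatically for every transfer, and an in-house script almost never does:

```python
# Sketch of a per-transfer audit record: what was sent, when, its checksum,
# and evidence that encryption was applied before the file left the building.
import hashlib
import json
from datetime import datetime, timezone

def audit_record(filename: str, payload: bytes, cipher: str) -> str:
    """Return a JSON audit entry tying a file to its encrypted payload."""
    entry = {
        "file": filename,
        "sha256": hashlib.sha256(payload).hexdigest(),  # integrity evidence
        "cipher": cipher,                                # e.g. "OpenPGP/AES-256"
        "sent_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(entry)

record = audit_record("payroll.csv", b"<encrypted bytes>", "OpenPGP/AES-256")
```

With a record like this on file, a missing transfer can be answered with proof of encryption; without it, the default assumption of compromise stands.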
Make Security Easy
Finally, even when a commercial managed file transfer product is chosen, it may have inherent complexities that make it difficult to implement and confusing to use. Too many customizations, both of features that should be standard and of some that are unnecessary, increase the chances of getting things wrong, whether by omission or commission. And then there are the all-too-common bottom-up implementations that require navigating a vendor-specific “pseudo-coding” language to get all the functionality needed to do the job, instead of intuitive top-down implementations consistent with the no-code ethic.
When the goal is to deliver a product that helps an organization conduct its business securely, it doesn’t help to undermine that effort with a product that is difficult to use. A cynic might wonder whether the complexity is not a bug but a feature, designed to get the customer to buy into a costly support contract and, ultimately, fall victim to the sunk cost fallacy. That may be a good (if short-sighted) business strategy, but it is not a sound approach to cybersecurity.
We are All Security Stakeholders
The White House’s recent National Cybersecurity Strategy states that, “To counter common threats, preserve and reinforce global Internet freedom, protect against transnational digital repression, and build toward a shared digital ecosystem that is more inherently resilient and defensible, the United States will work to scale the emerging model of collaboration by national cybersecurity stakeholders to cooperate with the international community.”
We believe that every technology vendor is a stakeholder in strengthening our national cybersecurity. As such, every technology vendor should make products that are secure by design, and that includes designing them to be easy to install and use. Security should not frustrate the user. We are the ones with the skills to make the security experience integral to our products and easy for the user: using process automation to handle essential steps that might otherwise be skipped or forgotten, backstopping the customer with alerts and documentation, and not only streamlining the functions our products perform, but making the people and organizations who use them more productive.
About the Author
Gregory Hoffer is CEO of San Antonio-based Coviant Software, maker of the secure managed file transfer platform Diplomat MFT. Greg’s career spans more than two decades of successful organizational leadership and award-winning product development. He was instrumental in establishing ground-breaking technology partnerships that helped deliver support for Federal Information Processing Standards (FIPS), the DMZ Gateway, OpenPGP, and other features essential for protecting large files and data in transit.
For more information visit Coviant Software online, or follow Coviant Software on Twitter and LinkedIn.