The Role of Trust and Failure in Information Security
The principles that define the information security field are decades older than computing, and we'd do well to learn from the lessons that precede our industry.
Early in our careers, we as security professionals naively construct an "us versus them" model when attempting to defend our networks. As we develop more of a salty patina, the realization that we shouldn't trust everything begins to set in, transforming earlier revisions of our security model from "assume the cow is a sphere in a vacuum at absolute zero" levels of oversimplification into something more worthy. How do we accelerate that learning process?
Confronting Failure
IT professionals have a truly bizarre relationship with the concept of failure, and it has produced a notably oppressive culture. Burying failure makes our systems vulnerable, and this deep-seated crisis of ego has undermined companies a great deal.
When re-reading Cyberpunk: Outlaws and Hackers on the Computer Frontier, my experiential lens provided new insight: Kevin Mitnick probably would not have been as successful had DEC been more transparent about how compromised their systems were. Viewed through rose-tinted glasses, DEC was considered a company that provided full-service computing - all maintenance and software loading was done by a DEC employee.
DEC needed total implicit trust from their customers to operate, and for a number of years did not disclose their history of compromise in order to keep the ego-driven narrative ("we have no problems") going. This choice empowered Kevin Mitnick and others to continue compromising DEC customers for years and to evade capture.
The industry has learned quite a bit about its problems handling failure since 1991, but it could do much better. Vindictive behavior at the emergence of a new breach is common nowadays; language like "how could they be compromised?" gets bandied about as if we didn't all know about the thankless "dependence on somebody from Nebraska" pattern, and the conversation turns in a counter-productive direction. We need transparency from those who provide us paid software, yet we punish them for providing it.
Google feels strongly about learning from failure, and so should we. The truer, longer-lived engineering professions have long since begun to analyze failure as a method of teaching, proving that we waste a wealth of information every time we revert to blame in the face of a problem.
The entire industry needs to figure out how to constructively learn from failure, while simultaneously applying appropriate levels of pressure on all product vendors to ensure that vulnerabilities, breaches, and other problems are disclosed fairly and appropriately. Easy, right?
Building a Foundation for Trust
Despite this cycle of abuse, the industry does want to see more from companies that provide tech products, and vulnerability disclosure programs have been particularly successful and important because of that sustained pressure to improve. Locksmithing is of particular interest here, as is the Enigma story - technology doesn't passively improve over time; it requires conscious effort and does not progress until problems are acknowledged.
Vulnerability disclosure made big strides in the transition from a more negative past (see AT&T's stance here), when the courts would wield the CFAA as a sledgehammer to cover up or mask problems, to the current day's model - "Heartbleed" and "Shellshock". Examine those websites: the vulnerability campaigns maintain blameless language and focus consumers on how to resolve the issues and what questions to ask of their vendors. We complain about "vulnerability fatigue" often, forgetting that we only began transforming the industry toward a more secure future a mere 8 years ago.
Let's commit to some meaningful changes to help us get to that future - we aren't there yet!
- Encourage and Promote Transparency: When a company provides you information on a security problem, push for more information. CloudFlare publishes their post-mortems here as an example.
- Don't be Punitive: This part doesn't specifically apply to security, or even IT. The person nearest to you probably has nothing to do with your issue.
- For bonus points, don't allow others to paint you this way.
- Focus on the Fix: Some people find this part easier than others - shift the focus to solving problems and delivering real results. Continually ask yourself "am I contributing to the objectives of this conversation?" and ensure that the emphasis stays on what to do next or how something will be prevented.
- Persuade Others: Group-think begins working against you when building trust or establishing a culture of disclosure. Don't allow others to steer the conversation back into punitive territory:
- Listen: If others paint your behavior as punitive, listen to what they have to say and examine it objectively.
- This conversation also needs to remain constructive, so the tactics above may cascade down to it as well. Operating in good faith is key.
- Recognize Contribution: It takes courage to share information about a problem; sincerely recognize those who disclose with a direct verbal acknowledgment.
- Restate Commitments: A business relationship, like any other human relationship, requires maintenance. In times of strain, it's important to be forthright and remind participants of their commitments to each other.
- Sympathize: Find common ground with those who failed. We've all done it; blur the factional lines by reflecting on other failures - but bring up only your own, to avoid creating adversarial tension.
Learning from failure is a crucial aspect to improving oneself, improving others, and building trust. Don't let a good failure go to waste by fighting over it.