Volkswagen

At one point I worked for an organization that banned the use of open source software. They were concerned that in the event of issues "there would be no support." At the time they believed that commercial software from companies like IBM, Oracle, Microsoft, and SAP was "safer" because a commercial entity stood behind the products it sold and provided support.

Now that’s changing

Most technology leaders today recognize that proprietary software is not inherently safer than open source software. Basic metrics, such as the number of disclosed vulnerabilities and the time from disclosure to patch availability, suggest that open source software is at least patched faster. Beyond that, open source gives you the option to inspect the code yourself and to patch it yourself, options that are simply not available with commercial, closed software. This is forcing a re-evaluation of the role proprietary, commercial software plays in operating a safe and secure business.

The recent Volkswagen scandal raises the broader question of how vulnerable we are to software that is beyond oversight. With software now running critical systems such as medical devices, transportation systems, elevators, and cars, we are realizing this code must be open to inspection. That does not mean the code must be open source, only that it must be available for inspection and oversight.

For example, the Food and Drug Administration (FDA) is responsible for evaluating the risks of new devices and monitoring the safety and efficacy of those currently on market. The software on these devices performs life-sustaining functions such as cardiac pacing and defibrillation, drug delivery, and insulin administration.

However, the agency is unlikely to scrutinize the software running on devices during any phase of the regulatory process unless a model that has already been surgically implanted repeatedly malfunctions or is recalled. Despite the crucial importance of these devices, medical device software is considered the exclusive property of its manufacturers, meaning neither patients nor their doctors are permitted to access the source code or test its security.

Proprietary software puts us at risk

Eben Moglen, a Columbia University professor and founder of the Software Freedom Law Center, has argued that proprietary, and thus invisible, software puts us all at risk. He points out that the ability to inspect an elevator's mechanical condition is a foundational principle of elevator safety. Shouldn't the software that operates elevators be open to inspection as well?

“Proprietary software is an unsafe building material”

— Eben Moglen, Columbia University

How can we have "safe" cars or medical devices when the software is a black box and we simply have to trust the manufacturers? The code in automobiles is tightly protected under the Digital Millennium Copyright Act. Last year, several groups sought to have the code made available for "good-faith testing, identifying, disclosing and fixing of malfunctions, security flaws or vulnerabilities." In response, a group of automobile manufacturers argued that opening the code to scrutiny could create "serious threats to safety and security." Think about that for a moment.

When proprietary code inflicts harm, as Volkswagen's polluting cars did, the creators of that code must be held accountable. It is time to start demanding appropriate transparency.

References

Volkswagen’s Diesel Fraud Makes Critic of Secret Code a Prophet
VW’s Cheating Proves We Must Open Up the Internet of Things