
In the last post I wrote about a simple way to help create so-called safe code, or code that executes correctly. I wrote that code in C++ using gcc/g++ (the GNU Compiler Collection). What was implied, but never explicitly addressed, was my use of the standard libraries that ship with GCC along with the compiler itself, and the implicit trust I put in both not to do evil with my small application. It’s that trust I want to address in this post. I’ll start by quoting Ken Thompson (co-creator of Unix) and his ACM Turing Award lecture, “Reflections on Trusting Trust,” published in 1984:
You can’t trust code that you did not totally create yourself. (Especially code from companies that employ people like me.) No amount of source-level verification or scrutiny will protect you from using untrusted code. (emphasis mine)
The entire paper is devoted to the Thompson Hack, a method he came up with for what he (and later, many others) thought was undetectable: creating malicious versions of critical applications by compromising the C compiler itself. As an example, he described a compiler that adds a backdoor to login whenever it detects that it is compiling login. The crucial twist is that the compromised compiler also recognizes when it is compiling its own source and re-inserts the hack, so the malicious logic never appears in any source code you could review. Although delivered in 1984, the lecture describes an attack that had been identified a full decade earlier, in 1974. This was pretty interesting and prescient, given today’s revelations. But lest you think it couldn’t get worse, it could. As Ken Thompson continues in the same paragraph (I’ll sketch the shape of the trick in code right after the quote):
In demonstrating the possibility of this kind of attack, I picked on the C compiler. I could have picked on any program-handling program such as an assembler, a loader, or even hardware microcode. As the level of program gets lower, these bugs will be harder and harder to detect. A well-installed microcode bug will be almost impossible to detect.
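Thompson’s paper walks through the real code; what follows is only a minimal toy sketch of the shape of the trick, with hypothetical names and simple string matching standing in for the recognition logic a real code generator would use, just to make the two triggers concrete:

```cpp
// Toy sketch of the Thompson Hack idea (not the real thing): a "compiler"
// that quietly injects extra behavior when it recognizes two specific
// inputs. Function names and file contents are hypothetical.
#include <iostream>
#include <string>

// Stand-in for code generation; a real compiler would emit object code,
// and the injection would happen there, invisibly.
std::string compile(const std::string& source) {
    std::string output = source;

    // Trigger 1: this looks like the login program, so splice in a backdoor
    // that also accepts the attacker's hard-coded password.
    if (source.find("int check_password(") != std::string::npos) {
        output += "\n/* injected: also accept the attacker's master password */\n";
    }

    // Trigger 2: this looks like the compiler itself, so re-insert both of
    // these injections. The hack therefore survives a rebuild from perfectly
    // clean compiler source, which is why reviewing the source finds nothing.
    if (source.find("std::string compile(") != std::string::npos) {
        output += "\n/* injected: reproduce both of these injections */\n";
    }

    return output;
}

int main() {
    // Compiling something that looks like login picks up the backdoor.
    std::cout << compile("int check_password(const char* pw) { /* ... */ }") << "\n";
}
```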
The comment about microcode is telling; it strikes at the heart of computing, the hardware itself. In 2008 IEEE Spectrum published “The Hunt for the Kill Switch,” which detailed some of the ways our adversaries could cripple our defense systems (if they hadn’t already) through hardware kill switches or back doors built into the chips the DoD was buying (the constantly troubled F-35 was brought up as a prime example). Who would put such things into chips? The very people we’ve off-shored chip manufacturing to, primarily the Chinese. Since the 1980s commercial US chip makers have been pushing chip fabrication and packaging offshore to save money, and East Asian countries, China chief among them, have been falling all over themselves to oblige. The response at the time was DARPA’s Trusted Integrated Circuits (TRUST) program. Whether that has had any positive effects has yet to be seen. I personally have my doubts…
Are you sufficiently paranoid yet? Let’s dial that paranoia back a bit and climb back up the rabbit hole into the realm of mere software. Returning to the Thompson Hack, David A. Wheeler countered it with “Countering Trusting Trust through Diverse Double-Compiling,” in which, you guessed it, he provides a way to protect yourself from the hack. The key to Wheeler’s solution is a second, trusted compiler. The paper is thorough about the technique itself, but the question of where you get that second trusted compiler is never satisfactorily answered.
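Roughly, Wheeler’s check goes like this: build the suspect compiler’s source once with the suspect binary and once with an independent, trusted compiler, then let each result rebuild that same source a second time. If the builds are deterministic, the two second-stage binaries should come out bit-for-bit identical; a self-propagating trojan in the suspect binary shows up as a difference. Here is a toy driver for that idea, assuming deterministic builds and hypothetical compiler and file names; it is a sketch of the technique, not the exact procedure from the paper:

```cpp
// Toy driver for the diverse double-compiling idea. Assumes deterministic
// compilers and that "./suspect-cc", "./trusted-cc", and the source file
// are hypothetical stand-ins supplied by you.
#include <cstdlib>
#include <fstream>
#include <iostream>
#include <iterator>
#include <string>

// Read a file's raw bytes so the two results can be compared exactly.
static std::string slurp(const std::string& path) {
    std::ifstream in(path, std::ios::binary);
    return std::string(std::istreambuf_iterator<char>(in), {});
}

static void run(const std::string& cmd) {
    std::system(cmd.c_str());
}

int main() {
    const std::string source  = "suspect-compiler-source/main.c"; // claimed source
    const std::string suspect = "./suspect-cc";                   // binary under test
    const std::string trusted = "./trusted-cc";                   // independent compiler

    // Stage 1: build the claimed source with each seed compiler.
    run(suspect + " -o stage1_s " + source);
    run(trusted + " -o stage1_t " + source);

    // Stage 2: let each stage-1 result rebuild the same source. Honest seed
    // compilers differ only in code-generation quirks, and those wash out
    // here; a Thompson-style trojan in the suspect binary does not.
    run("./stage1_s -o stage2_s " + source);
    run("./stage1_t -o stage2_t " + source);

    if (slurp("stage2_s") == slurp("stage2_t"))
        std::cout << "stage-2 binaries match: the suspect binary corresponds to its source\n";
    else
        std::cout << "stage-2 binaries differ: do not trust the suspect binary\n";
}
```

std::system and a byte-wise comparison keep the sketch short; a real check would also pin down the entire build environment, since any nondeterminism produces a false alarm.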
But running untrusted code doesn’t have to get this convoluted; sometimes the risk comes from nothing more exotic than a commonly used library or tool you pull into your system. Consider two of the more egregious errors that led to security lapses: Shellshock and Heartbleed, both disclosed in 2014. Shellshock came about because of buggy code checked into the Bash shell sources back in 1989, during a much more innocent period. Then the web came along, many developers used Bash as part of a web server’s CGI backend, and because CGI hands request data to handlers through environment variables, the old function-parsing bug became a remote code execution hole. Heartbleed, a bug introduced into OpenSSL’s TLS heartbeat extension and shipped in early 2012, was so devastating because it let an attacker read chunks of server memory, potentially including the private keys behind the digital certificates used to authenticate servers and encrypt traffic between those servers and their users. It meant that web server operators needed not only to patch their systems, but to revoke and reissue their certificates. Just in case. What led to this was too much code written over too long a period of time and maintained by too few developers paid too little. The folks who wanted something for nothing didn’t bother to look at the open source, assuming someone else had already done the critical vetting. Which just goes to expose the fallacy in Linus’ Law (“given enough eyeballs, all bugs are shallow”).
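To make the Heartbleed failure concrete before moving on: the TLS heartbeat extension echoes a payload back to the peer, and the buggy code trusted the payload length claimed in the request instead of the number of bytes that actually arrived. Below is a stripped-down sketch of that class of mistake, with hypothetical names; it is not the actual OpenSSL code:

```cpp
// Simplified illustration of the Heartbleed class of bug: trusting a
// peer-supplied length when echoing a payload back. All names here are
// hypothetical; this is not the real OpenSSL implementation.
#include <algorithm>
#include <cstdint>
#include <cstring>
#include <vector>

// 'record' holds the bytes that actually arrived on the wire;
// 'claimed_len' is the payload length the peer *says* it sent.
std::vector<std::uint8_t> build_heartbeat_reply(const std::vector<std::uint8_t>& record,
                                                std::uint16_t claimed_len) {
    std::vector<std::uint8_t> reply(claimed_len);

    // BUG: copies claimed_len bytes even when the record is shorter, reading
    // past the received data and leaking whatever sits next to it in memory.
    std::memcpy(reply.data(), record.data(), claimed_len);

    // FIX: never echo back more than was actually received.
    // const std::size_t n = std::min<std::size_t>(claimed_len, record.size());
    // std::memcpy(reply.data(), record.data(), n);

    return reply;
}

int main() {
    // An honest peer: claims 5 bytes and sends 5 bytes, so nothing leaks.
    std::vector<std::uint8_t> record = {'h', 'e', 'l', 'l', 'o'};
    build_heartbeat_reply(record, 5);
    // A malicious peer claims, say, 16384 bytes while sending 5; the reply
    // then carries back kilobytes of adjacent server memory.
}
```

The fix that eventually shipped goes a step further than the commented-out bounds check and simply discards heartbeats whose claimed length doesn’t fit inside the record that was actually received.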
Reading these kinds of reports over the last ten to fifteen years has made me so depressed that I’ve seriously considered selling all my computers and going to live off the grid. But I’m way too old for that now. Running from the problems means the bad people exploiting all of this win. I hate when that happens. I may not be able to stop them, but I can certainly do my part to slow them down as much as possible: keep my mouth shut, keep paying attention to what gets published in these areas, and keep on keeping on.