The Clipper Chip was a dangerous Clinton-era technology that sought to provide “back door” access for law enforcement (and presumably intelligence agencies) to encrypted digital communications. The Ars Technica article “What the government should’ve learned about backdoors from the Clipper Chip” asks a critical question that needs to be answered by those involved in the current debate over encryption.
The Clipper Chip initiative failed mainly because, when push came to shove (and the government shoved really hard), the dominant private players in the market (mostly US-based at the time) followed the advice of actual computer scientists over government apologists.
The article does an excellent job of laying out the facts surrounding the 1993 Clipper Chip controversy, and of answering the question posed in its title. It deserves a careful reading. In full.
For me it was a reminder of the bipartisan nature of these sorts of attacks on civil liberties. The Clipper technology wasn’t invented by Bill Clinton, but both he and his then Vice President, Al Gore (famously a self-acknowledged expert on the Internet), fought hard for it — as hard as the NSA under George W. Bush fought to conduct mass surveillance of everyone on the Internet, or as hard as James Comey (a Republican FBI Director under a Democratic President) now fights to compromise strong encryption. Whether the next President is Hillary Clinton or Ted Cruz, we can expect that the U.S. government will continue to put intelligence gathering ahead of the civil liberties that gathering is supposed to preserve, in pursuit of a “Golden Key” that allows the government to circumvent encryption.
Perhaps my favorite part of the Ars coverage is this reader comment attached to the article, which cogently sums up the consequences of living in a technologically illiterate society:
The statement from Comey about how people in the field just aren’t trying hard enough summarizes pretty much all the reasons why I’m pessimistic about the possibility that this is an issue we’ll ever see a reasonable discussion on. Far too many people think that software is essentially just magic, that you can make it do anything as long as you wish hard enough. They’re used to dealing with people in their work, where you can say something as vague as “do something about this” and get results. They have no concept at all of working with a system where anything that can’t be represented by a specific set of rules is flatly impossible, let alone how complex and error-prone those rules would need to be in order to accomplish the things they want.