Mounting concern over information security has motivated a growing number of governments to demand access to software source code as a condition for doing business in their country. Officially, these governments want to verify that the software is not vulnerable to compromise, whether through negligence or by design. Critics worry that the real purpose of such code reviews is to identify vulnerabilities that national intelligence services can exploit. No government appears immune to the temptation to peek behind the curtain, and a large number of software firms have complied, compelled not so much by government pressure as by the lure of the market share that follows.
Governments have long demanded the cooperation of IT companies in various endeavors. Most famously, following a 2015 terrorist attack in the United States, the FBI demanded Apple's help to access an iPhone used by one of the attackers. Apple refused, arguing that it could not be compelled to create a key to unlock the phone, a demand quite different from merely handing over an existing key. The company also argued that compliance would invite a flood of similar demands and that the key would eventually leak into the public domain.
Law enforcement officials dismissed the second concern, but that confidence proved hollow. A year later, hackers obtained a suite of hacking tools from the U.S. National Security Agency, proof that even the world's premier signals intelligence agency can be penetrated and that even "the crown jewels" cannot be protected. And, confirming Apple's worst suspicions, those NSA tools were released earlier this year and have since been repurposed for the WannaCry and Petya ransomware attacks of recent weeks.