As the Apple/FBI dispute continues, court documents reveal the government’s argument that Apple has been providing forensic services to law enforcement for years without those tools being hacked or leaked from Apple. Quite the contrary: information leaks out of Foxconn all the time, and in fact some of the software and hardware tools used to hack iOS products over the past several years (IP-BOX, Pangu, and so on) have originated in China, where Apple’s manufacturing takes place. Outside of China, jailbreak after jailbreak has taken advantage of vulnerabilities in iOS, some with the help of tools leaked out of Apple’s headquarters in Cupertino. Devices have continually been compromised, and even today Apple’s security response team releases dozens of fixes for vulnerabilities that have been exploited outside of Apple. Setting all of this aside for a moment, however, let’s take a look at the more immediate dangers of such statements.
By affirming that Apple can and will protect such a backdoor, Comey’s statement admits that Apple will face not only the burden of breathing this forensics backdoor into existence, but also the burden of taking perpetual steps to protect it once it’s been created. In other words, the courts are forcing Apple to create what would be considered a weapon under the latest proposed Wassenaar rules, and charging the company with preventing that weapon from getting out: either the code itself, or the weaknesses that Apple would have to continue allowing to be baked into its products for the weapon to work.
The Wassenaar Arrangement is a multilateral export control regime designed to control the export of weapons including battle tanks, armored combat vehicles, warships, and – you guessed it – intrusion tools:
“Software” specially designed or modified to avoid detection by ‘monitoring tools’, or to defeat ‘protective countermeasures’, of a computer or network capable device, and performing any of the following:
a. The extraction of data or information, from a computer or network capable device, or the modification of system or user data; or
b. The modification of the standard execution path of a program or process in order to allow the execution of externally provided instructions.
Based on the Wassenaar Arrangement’s definition, a more appropriate term than “backdoor” may be intrusion tool. Apple is being ordered to develop an intrusion tool: a tool so dangerous that it’s subject to weapons export controls and cannot be exported outside of the United States, yet one that government officials publicly claim is so innocuous and inconsequential that Apple has no reason to complain.
What are the consequences if this “weapon” should find its way out of Apple, or if Apple is compelled to use this intrusion tool for another government? Would Apple executives risk prosecution? Would the company be fined into oblivion? At the very least, Apple would suffer catastrophic and irreparable damage to its brand. Apple is arguing that it may not have control over how the tool is used once it’s created, whether by our government or any other, which puts the company in a very vulnerable position on the world stage.
The only other viable option is to destroy this weapon so that it no longer exists. However, this option (as Apple argues in its motion to vacate) puts an even greater burden on the company, given that the Department of Justice already has over a dozen similar orders pending for Apple that would come due as soon as the first order is complied with. This would leave Apple in the position of having to create, then recreate, then recreate this backdoor over and over again; and the burden of doing it even once is not light, according to Apple:
E. The Resources And Effort Required To Develop The Software Demanded By The Government
The compromised operating system that the government demands would require significant resources and effort to develop. Although it is difficult to estimate, because it has never been done before, the design, creation, validation, and deployment of the software likely would necessitate six to ten Apple engineers and employees dedicating a very substantial portion of their time for a minimum of two weeks, and likely as many as four weeks. Neuenschwander Decl. ¶ 22. Members of the team would include engineers from Apple’s core operating system group, a quality assurance engineer, a project manager, and either a document writer or a tool writer. Id.
So there you have it: Apple is being ordered to bear either the burden of protecting, and carrying the liability for, what our own laws consider a weapon subject to export controls, or the burden of allocating massive resources and time to developing the backdoor over and over again to service future court orders that will likely extend far beyond US-based federal agencies.
Would this be reasonable assistance to anyone?
Update: In very suspicious timing, the government has just announced, in the midst of this case, that it is revising the proposed Wassenaar rules pertaining to intrusion tools. Perhaps they see the overall dangers in treating intrusion tools as weapons, or perhaps this is a legal obstacle they anticipated in this case.