Apple and the FBI

The story of a company told to break into its own device.

Pandora's iPhone

As a security nerd, and someone who has worked extensively in the intelligence community, I have followed various security cases over the years, but this one takes the cake. No surprise here: this Apple/FBI fight is a big deal, and I wanted to join the conversation with some thoughts on what the FBI actually wants from Apple and how this case lines up with legal precedents set over time.

Privacy has been a hot topic ever since the commercialization of the computer and the “internetification” (so I made up a word – go with me here) of the TCP/IP network. Apple made huge strides on privacy with the release of iOS 8 in late 2014. That release encrypted user data with AES-256 keys derived from the passcode, which prevented law enforcement from unlocking a seized device without it (iOS 7 could already wipe itself after enough wrong passcode entries, but iOS 8 tied the encryption itself to the passcode). Then iOS 9 stepped things up again with two-factor authentication alongside Touch ID. The latest release also moved the default from 4-digit passcodes (10,000 combinations) to 6-digit passcodes (1,000,000 combinations), with longer alphanumeric passcodes pushing the keyspace far higher. That’s a problem for anyone who seizes your phone, but it’s a great feature if your phone is lost.
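To put those keyspace numbers in perspective, here is a quick back-of-the-envelope sketch (my own illustration; the character-set assumptions are mine, not Apple’s):

```python
# Back-of-the-envelope passcode keyspace math (illustrative only).
# iOS derives the data-encryption key from the passcode plus a per-device
# hardware key, so the practical attack on the data is guessing the passcode.

import string

numeric = len(string.digits)                                 # 10 characters
alphanumeric = len(string.ascii_lowercase + string.digits)   # 36, ignoring case/symbols

print(f"4-digit numeric:     {numeric ** 4:>15,} combinations")       #          10,000
print(f"6-digit numeric:     {numeric ** 6:>15,} combinations")       #       1,000,000
print(f"6-char alphanumeric: {alphanumeric ** 6:>15,} combinations")  #   2,176,782,336
```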

Over the years there have been a few cases that required the courts to force their way into the digital realm for evidence. Phone passcode wins and losses; computer password wins and losses (those are often easy to circumvent when you have physical access, using a tool like chntpw to reset the password or EnCase to simply mount the data). There have also been forced decryptions of computers, and courts have even compelled fingerprints to unlock devices, reasoning that since an individual can be required to submit to a DNA test, fingerprinting falls into the same category.

Forced decryption of passcodes and passwords has produced the stickiest cases over the years because of the 4th and 5th Amendments. The 4th Amendment protects you from unlawful searches (but not all searches), yet the list of exceptions and loopholes that law enforcement relies on keeps growing. The 5th Amendment is where due process comes into play, and its promise that no person “shall be compelled in any criminal case to be a witness against himself” is what protects our passwords, passcodes, and the encryption of our devices.

In 2015 an interesting case came up called SEC v. Huang. A federal trial court in Pennsylvania held that the government can’t force a person to give up his passcode to his smartphone. The case’s conclusion:

“Additionally, the foregone conclusion doctrine does not apply as the SEC cannot show with “reasonable particularity” the existence or location of the documents it seeks. Accordingly, the SEC’s motion to compel the passcodes is denied.”

Even as this case made its way to the appeals court, that phrase “reasonable particularity” remained the primary reason for denial. The prosecution failed to spell out what, specifically, was on the phone that was needed as evidence. Questions arose: what had already been acquired from Apple with the last iCloud backup? Which applications would require further authentication to get at their data (iMessage, Kik, Silent Text, Confide, Wickr, Threema, ChatSecure, Telegram, Signal… a Snowden favorite… and Snapchat all require individual password authentication)? Then there are phone records, SMS records, and the mapping out of everyone involved in those conversations… would the government need to compel each of those companies to turn over data records as well? This case set a precedent for how specific one has to get when taking legal action to break into a person’s private device.

With all that being said, I respect Apple for what they are doing, but I also think Tim Cook left out a few very important details that customers deserve to know. Apple has complied with plenty of law enforcement requests over the years but considers this one different. Given the Edward Snowden leaks, CISA (S. 754, passed in the omnibus on Dec 18, 2015), and how the intelligence network operates, I don’t think this stand against the FBI will last long. Unfortunately, I also believe that if Apple does comply, we may never know about it. Tim Cook’s customer letter came so close to the FBI’s five-day deadline that it reads like he was shooting from the hip in reaction to the FBI’s three requests; I would be surprised if he consulted legal counsel before publishing it. Whoever drafted this request on the FBI’s behalf knew exactly what they were doing (because it has been done before) and likely never intended it as a backdoor to privacy invasion – maybe it really is all about this single phone. Essentially, Apple is being asked to jailbreak the device, or to build some software update/recovery image, so the FBI can brute-force the passcode.

Below are excerpts from the warrant issued to Apple and my thoughts on each part.

  1. Apple shall assist in enabling the search of a cellular telephone, Apple make: iPhone 5C, Model: A1532, P/N:MGFG2LL/A, S/N:FFMNQ3MTG2DJ, IMEI:358820052301412, on the Verizon Network, (the “SUBJECT DEVICE”) pursuant to a warrant of this Court by providing reasonable technical assistance to assist law enforcement agents in obtaining access to the data on the SUBJECT DEVICE.

Paragraph 1: Apple shall help with the search of this iPhone 5C and provide reasonable technical assistance. From what I have read, Apple has already done this by helping the FBI understand the limits of what they are up against.

  2. Apple’s reasonable technical assistance shall accomplish the following three important functions: (1) it will bypass or disable the auto-erase function whether or not it has been enabled; (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE and (3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.

Paragraph 2: Apple must figure out how to disable the auto-erase feature that wipes the device after 10 bad passcode attempts. Brute-forcing a device can take days, months, even years depending on whether the owner used a 4- or 6-digit code (or something longer). The problem is that you get only nine tries before being forced to stop, wait an extended period of time, or go slowly enough not to set it off.
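For a sense of what brute forcing looks like even with the auto-erase and software delays removed, here is a rough sketch under my own assumptions (the ~80 ms figure is Apple’s published estimate of the hardware key-derivation time per guess; the keyspaces are the ones discussed above):

```python
# Rough worst-case brute-force time if auto-erase and software-imposed delays
# were disabled (illustrative sketch; ~80 ms/guess is Apple's stated hardware
# key-derivation cost, and the keyspace sizes are assumptions from this post).

SECONDS_PER_ATTEMPT = 0.08  # ~80 ms per passcode try, enforced in hardware

def worst_case(keyspace: int) -> str:
    seconds = keyspace * SECONDS_PER_ATTEMPT
    days = seconds / 86_400
    return f"{keyspace:>13,} guesses -> {seconds:>13,.0f} s (~{days:,.1f} days)"

print("4-digit numeric: ", worst_case(10 ** 4))   # ~13 minutes worst case
print("6-digit numeric: ", worst_case(10 ** 6))   # ~22 hours worst case
print("6-char alphanum: ", worst_case(36 ** 6))   # ~5.5 years worst case
```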

  3. Apple’s reasonable technical assistance may include, but is not limited to: providing the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File (“SIF”) that can be loaded onto the SUBJECT DEVICE. The SIF will load and run from Random Access Memory (“RAM”) and will not modify the iOS on the actual phone, the user data partition or system partition on the device’s flash memory. The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE. The SIF will be loaded via Device Firmware Upgrade (“DFU”) mode, recovery mode, or other applicable mode available to the FBI. Once active on the SUBJECT DEVICE, the SIF will accomplish the three functions specified in paragraph 2. The SIF will be loaded on the SUBJECT DEVICE at either a government facility, or alternatively, at an Apple facility; if the latter, Apple shall provide the government with remote access to the SUBJECT DEVICE through a computer allowing the government to conduct passcode recovery analysis.

Paragraph 3: This is where the warrant gets specific, offering Apple suggestions on how to hack the device. Apple signs its software, so Apple would have to assist here unless the FBI could hire experienced hackers and figure out how to access the phone in recovery mode on its own. The FBI even wants this, if possible, to work only on this one phone, which to me means they know more about this process than they’re letting on.
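For readers less familiar with code signing, here is a minimal sketch of the idea using generic ECDSA via the Python cryptography package. This is purely illustrative and is not Apple’s actual signing scheme: the point is that a device only accepts an image whose signature verifies against the vendor’s public key, which is why only Apple can produce a SIF the phone will load.

```python
# Minimal illustration of why only the vendor can produce an acceptable image:
# the device ships with the vendor's public key and refuses anything whose
# signature does not verify. Generic ECDSA sketch, NOT Apple's real scheme.

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.exceptions import InvalidSignature

# Vendor side: the private signing key never leaves the vendor.
vendor_private_key = ec.generate_private_key(ec.SECP256R1())
firmware_image = b"...recovery image tied to one device's unique identifier..."
signature = vendor_private_key.sign(firmware_image, ec.ECDSA(hashes.SHA256()))

# Device side: only the public key is baked into the device.
vendor_public_key = vendor_private_key.public_key()

def device_will_boot(image: bytes, sig: bytes) -> bool:
    try:
        vendor_public_key.verify(sig, image, ec.ECDSA(hashes.SHA256()))
        return True   # signature checks out: load and run the image
    except InvalidSignature:
        return False  # tampered or third-party image: refuse to boot it

print(device_will_boot(firmware_image, signature))                  # True
print(device_will_boot(b"image built by anyone else", signature))   # False
```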

By the way… this warrant was likely written by a former Apple employee (or someone very familiar with iOS). The biggest indicator is that whoever wrote it knows Apple’s software is signed and knows how jailbreaks have worked in the past. They also suggest that the software needed for this case could stay with Apple at an Apple facility. This is just my opinion (well, all of this is really), but if Apple trusts its OS and its employees, then the risk to individual privacy is greatly decreased.

  4. If Apple determines that it can achieve the three functions stated above in paragraph 2, as well as the functionality set forth in paragraph 3, using an alternate technological means from that recommended by the government, and the government concurs, Apple may comply with this Order in that way.
  5. Apple shall advise the government of the reasonable cost of providing this service.

Paragraphs 4 and 5: If Apple can achieve the functions in Paragraphs 2 and 3 by some other technical means, and the government agrees, that still counts as compliance with this order. Apple also gets to bill the government for the reasonable cost of providing the service.

  6. Although Apple shall make reasonable efforts to maintain the integrity of data on the SUBJECT DEVICE, Apple shall not be required to maintain copies of any user data as a result of the assistance ordered herein. All evidence preservation shall remain the responsibility of law enforcement agents.

Paragraph 6: Apple will make reasonable efforts to maintain the integrity of the data on the device during this process, but they are not required to keep copies of any user data; evidence preservation stays with law enforcement. How nice of them.

  7. To the extent that Apple believes that compliance with this Order would be unreasonably burdensome, it may make an application to this Court for relief within five business days of receipt of the Order.

Paragraph 7: If Apple believes compliance would be “unreasonably burdensome,” it has five business days from receipt of the order to ask the court for relief. By the way, that day is today.

All in all, the FBI isn’t asking Apple to hack the planet, but they are asking for help with this one phone – and I think it’s possible for Apple to comply. This is a HUGE DEAL. It is undoubtedly the most pivotal case of its kind, with the potential to completely change the security landscape depending on how things play out. I’m glad Apple is standing up for privacy. I also think that if the national security apparatus weighs in, the pressure will be too much. The weight of this case is such that it doesn’t matter what political party you subscribe to – this decision will determine how device and personal security are handled in the future.

I do wonder where (and if) it will stop if Apple loses this fight and gives in to the pressure. The security nerd in me says “don’t back down, Apple,” but the intelligence nerd says it’s inevitable that they’ll have to give in under a Homeland Security gag order and non-disclosure orders (this is why we never heard about PRISM until Snowden). It’s upsetting to me that, in either case, decisions like these aren’t being made by informed IT security experts. Words matter, and when there are billions of dollars and national security on the line, it becomes a landscape for confusion, propaganda, and ultimately a bad decision. If Apple loses this battle and the legal precedent is set, our digital lives will be forever changed.

My goal with this blog was to present the information from both sides and offer some professional insight into each. Like I said, I’m a security nerd to the core but also have extensive experience in the intelligence community, and honestly, I can see this case from both sides. Based on the requests in the warrant, I don’t think the risk of backdooring every iOS device over this one case is truly that high. I also get that if you can do this for one phone, the possibility of “rinse and repeat” is undoubtedly there, and therein lies the threat. The only way anyone in my field could get on board with creating this software is if it were done in an extremely controlled manner (e.g., for domestic acts of terrorism only), but who knows whether a line like that could ever be drawn.

In conclusion, I hope all the information I’ve presented here has served to further confuse you like it did me as we wait to see what happens (kidding, but really… this is as messy as it gets, folks).

 

By the way, a couple of things that would have helped in this case:

  1. If the phone had been an iPhone 5s or later with a Touch ID fingerprint reader, investigators could have replicated the fingerprint and opened it with no problem, and we wouldn’t be here.
  2. If the local investigators hadn’t reset the iCloud account password, the phone could have made a fresh iCloud backup that Apple could have handed over, and we wouldn’t be here. The phone hadn’t been backed up in a long time (based on a couple of reports), so they might still have figured something out depending on which applications were used for communication.

 

References:

Apple’s response to the world about the situation.

http://www.apple.com/customer-letter/

Copy of the FBI Court Order.

https://www.documentcloud.org/documents/2714001-SB-Shooter-Order-Compelling-Apple-Asst-iPhone.html

SEC v. Huang Case Details.

https://scholar.google.com/scholar_case?case=8061258343044074065&hl=en&as_sdt=6&as_vis=1&oi=scholarr

4th Amendment Deep Dive.

http://www.uscourts.gov/about-federal-courts/educational-resources/about-educational-outreach/activity-resources/what-does-0

Even Bill Gates says this fight isn’t black and white.

http://money.cnn.com/2016/02/23/technology/bill-gates-apple-fbi-encryption/
