How the FBI Managed To Get Into the San Bernardino Shooter's iPhone (theverge.com) 94

A new report from The Washington Post reveals how the FBI gained access to an iPhone linked to the 2015 San Bernardino shooting. Apple refused to build a backdoor into the phone, citing the potential to undermine the security of hundreds of millions of Apple users, which kicked off a legal battle that only ended after the FBI successfully hacked the phone. Thanks to the Washington Post's report, we now know the methods the FBI used to get into the iPhone. Mitchell Clark summarizes the key findings via The Verge: The phone at the center of the fight was seized after its owner, Syed Rizwan Farook, perpetrated an attack that killed 14 people. The FBI attempted to get into the phone but was unable to due to the iOS 9 feature that would erase the phone after a certain number of failed password attempts. Apple attempted to help the FBI in other ways but refused to build a passcode bypass system for the bureau, saying that such a backdoor would permanently decrease the security of its phones. After the FBI announced that it had gained access to the phone, there were concerns that Apple's security could have been deeply compromised. But according to The Washington Post, the exploit was simple: [An Australian security firm called Azimuth Security] basically found a way to guess the passcode as many times as it wanted without erasing the phone, allowing the bureau to get into the phone in a matter of hours.

The technical details of how the auto-erase feature was bypassed are fascinating. The actual hacking was reportedly done by two Azimuth employees who gained access to the phone by exploiting a vulnerability in an upstream software module written by Mozilla. That code was reportedly used by Apple in iPhones to enable the use of accessories with the Lightning port. Once the hackers gained initial access, they were able to chain together two more exploits, which gave them full control over the main processor, allowing them to run their own code. After they had this power, they were able to write and test software that guessed every passcode combination, ignoring any other systems that would lock out or erase the phone. The exploit chain, from Lightning port to processor control, was named Condor. As with many exploits, though, it didn't last long. Mozilla reportedly fixed the Lightning port exploit a month or two later as part of a standard update, which was then adopted by the companies using the code, including Apple.

  • Apple and Privacy (Score:5, Interesting)

    by Thaelon ( 250687 ) on Wednesday April 14, 2021 @06:07PM (#61274434)

    This story is why I still favor Apple and iPhones over Android. One of the reasons, anyway.

    The FBI didn't just want Apple to unlock the phone, as they and their lapdog press claimed. They wanted Apple to create a version of iOS that would bypass all security on all iPhones forever. Not even the media howling about terrorism, and about how Apple was the bad guy for "helping the Terrorist", got them to cave, and I was impressed.

    • Re: (Score:3, Interesting)

      by AmiMoJo ( 196126 )

      So is this an older iPhone or are Apple lying?

      For a number of years now they have supposedly been using a secure sub-processor to protect the passcode, which should not be vulnerable to this kind of attack.

      • by Entrope ( 68843 )

        It's an iPhone 5C. As TFS notes, the shooting was in 2015. The shooter and his wife tossed their phones into a pond (IIRC, maybe it was a ditch or lake), committed their murders, and were shot to death by police in a shootout when the police tried to arrest them.

        • Re:Apple and Privacy (Score:5, Interesting)

          by gnasher719 ( 869701 ) on Wednesday April 14, 2021 @06:54PM (#61274558)

          The shooter and his wife tossed their phones into a pond

          As far as I remember, he (didn't read about his wife) destroyed two Samsung phones, the hard drive of his Mac disappeared and was never found, and the iPhone, owned by his employer, was left untouched. Apple could have recovered some data in iCloud, but the FBI made a clumsy attempt to change the iCloud password which destroyed all access that Apple would have had.

          The fact that the shooter left this iPhone untouched should have indicated that there was nothing of value on it, and that turned out to be the case in the end. So all the money that the FBI spent was wasted in the end.

          • The fact that the shooter left this iPhone untouched should have indicated that there was nothing of value on it, and that turned out to be the case in the end. So all the money that the FBI spent was wasted in the end.

            And since the phone was owned by the Inland Regional Center, the Center should have had the phone properly managed, which would have given them admin control of the phone.

          • Re:Apple and Privacy (Score:5, Interesting)

            by tlhIngan ( 30335 ) <[ten.frow] [ta] [todhsals]> on Thursday April 15, 2021 @05:04AM (#61275600)

            As far as I remember, he (didn't read about his wife) destroyed two Samsung phones, the hard drive of his Mac disappeared and was never found, and the iPhone, owned by his employer, was left untouched. Apple could have recovered some data in iCloud, but the FBI made a clumsy attempt to change the iCloud password which destroyed all access that Apple would have had.

            The fact that the shooter left this iPhone untouched should have indicated that there was nothing of value on it, and that turned out to be the case in the end. So all the money that the FBI spent was wasted in the end.

            Basically that. But it was a convenient omission by the FBI and President Trump in order to exert pressure on Apple. After all, the FBI has been complaining a lot about not being able to decrypt iPhones. Something they didn't have many problems with on Android which is why they never complained.

            But the fact that the guy destroyed all the things that had real data on them, but not the iPhone (which belonged to his employer), indicates the guy was at least practicing proper OpSec - the iPhone was employer-provided and not trusted, so it should not have contained any data. Use it for work and that's it.

            Of course, the FBI played it up like it could've had everything on it, even though destroyed hard drives and destroyed phones would've pointed to the fact that there was nothing valuable on it. At best, to a naive investigator, they'd spend all their time on breaking the phone and getting nothing while any other evidence rots away. It's like, destroyed phones, destroyed laptop, perfectly intact phone. Red herring material.

            Considering you can securely erase an iPhone from the OS itself (it dumps the encryption key and regenerates a new one, as well as forces a reboot which would reset the memory encryption key and re-scramble both memory addresses and data), all evidence points to it not having any useful data on it.
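The crypto-erase idea described above (dump the key, regenerate, and the data is gone even though the flash is untouched) can be sketched in a few lines. This is a toy illustration only: XOR stands in for real AES, and nothing here is Apple's actual implementation.

```python
import os

# Toy illustration of crypto-erase: ciphertext stays on "flash", but
# once the key is discarded and regenerated, the old data is gone.
# (XOR keystream stands in for real AES; this is not Apple's design.)

class ToyFlash:
    def __init__(self, size: int = 32):
        self.key = os.urandom(size)           # per-device encryption key
        self.cipher = b""

    def store(self, plaintext: bytes):
        self.cipher = bytes(p ^ k for p, k in zip(plaintext, self.key))

    def read(self) -> bytes:
        return bytes(c ^ k for c, k in zip(self.cipher, self.key))

    def secure_erase(self):
        self.key = os.urandom(len(self.key))  # dump key, regenerate

flash = ToyFlash()
flash.store(b"secret notes")
assert flash.read() == b"secret notes"
flash.secure_erase()                          # ciphertext untouched...
assert flash.read() != b"secret notes"        # ...but now unreadable
```

The erase is instant because only the small key is destroyed, not the bulk data; a reboot that regenerates memory-scrambling keys works on the same principle.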

            • This started during the Obama administration. James Comey was the person publicly pushing for it. Obama didn't say much about it, just that we've never had such vast warrant-proof stores of data before and need to maintain a balance between privacy and subpoena powers. Under a non-Trump administration, an investigation into a lone wolf is not something the president should be actively involved with.
      • The passcode on an iPhone is a numeric key of 4 to 6 digits.

        If you can get past the limited attempt hurdle, a brute-force search is easy.

        Many people use dates such as "200769" which have a much smaller search space than random attempts. At one attempt per second, you can often crack a code in a few hours.
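The arithmetic behind this is easy to sketch. The one-attempt-per-second rate and the DDMMYY date shape are assumptions for illustration, not measured figures:

```python
# Rough brute-force timing for numeric passcodes, assuming the
# auto-erase limit has been bypassed. Rates here are illustrative.

def crack_time_hours(digits: int, attempts_per_sec: float = 1.0) -> float:
    """Worst-case hours to exhaust an all-digit passcode space."""
    return 10 ** digits / attempts_per_sec / 3600

for d in (4, 6):
    print(f"{d}-digit passcode: up to {crack_time_hours(d):,.1f} hours")

# A date-shaped code (e.g. DDMMYY) draws from a much smaller pool:
date_space = 31 * 12 * 100           # day * month * two-digit year
print(f"date-shaped codes: ~{date_space / 3600:.1f} hours max at 1/sec")
```

Even at one guess per second, a 4-digit space falls in under 3 hours; date-shaped 6-digit codes fall far faster than the full 6-digit space.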

        • The default passcode on an iPhone is a 6 digit number, I think (it's either 6 or 8). Back in 2015, it may have been 4. However, even back then you had the option of choosing to use a custom alphanumeric passcode of arbitrary length - which is what I've always done.

          • Re: (Score:2, Informative)

            by tlhIngan ( 30335 )

            The default passcode on an iPhone is a 6 digit number, I think (it's either 6 or 8). Back in 2015, it may have been 4. However, even back then you had the option of choosing to use a custom alphanumeric passcode of arbitrary length - which is what I've always done.

            Actually, the passcode option has been around since iPhone OS 1.0 or so, and I think even then there was the option to use a password.

            It's a really old option.

            In fact, the encryption key used by the iPhone is easily crackable with a 4 digit PIN, w

            • by ebvwfbw ( 864834 )

              Wonder what it was? 4321 LOL

            • Well, 4 digits is easily crackable if you get around the security measures. One non-optional one is a delay after too many wrong attempts; I don't know if it's 30 minutes or 2 hours per attempt after ten wrong ones. 5,000 attempts half an hour apart takes about 110 days if you have people trying 24/7. I think an early hack was copying the memory, doing 8 attempts in five minutes, copying the memory back, doing another 8 attempts, and so on. And copying/restoring all memory isn't trivial.
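As a sanity check on that estimate (the 30-minute delay per attempt is the parent's assumption, not a documented value):

```python
# Time to walk a passcode space when each attempt costs a fixed
# lockout delay. The 30-minute figure is an assumption from above.

def lockout_days(attempts: int, delay_minutes: float) -> float:
    """Days needed for `attempts` tries spaced `delay_minutes` apart."""
    return attempts * delay_minutes / 60 / 24

# Half of the 10,000-code space, one try every half hour, 24/7:
print(f"{lockout_days(5000, 30):.0f} days")
```

The answer lands a little over 100 days, which is why bypassing the delay entirely (as Azimuth did) matters so much more than raw guessing speed.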
      • Re:Apple and Privacy (Score:5, Informative)

        by Distan ( 122159 ) on Wednesday April 14, 2021 @06:29PM (#61274492)

        Secure enclave was introduced on the iPhone 5S. Farook's phone was a 5C which was one generation older than that.

        • by AmiMoJo ( 196126 )

          Thanks, that explains it. Good to know that so far the secure enclave seems to have remained fairly secure. I'm sure that will be of great comfort to the idiot who modded my post "flamebait" because the thought of Apple being less than magical upset them.

          • There is a known hardware hack that involves desoldering the NAND flash chips from the main PCB, copying their contents, and soldering in a rig so that every time the maximum retry limit is reached, the phone is shut down, the original contents of the NAND flash chips are re-flashed, and the process starts again. It's a bit tedious, but it does work, and from what I've read the FBI does use it in high-severity cases like this.
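The control flow of that NAND-mirroring loop can be sketched as follows. The hardware steps (dump, re-flash, power-cycle) are stubbed out as hypothetical callables here; this shows only the shape of the attack, not a working tool:

```python
# Sketch of the NAND-mirroring attack described above. The hardware
# operations are injected as hypothetical helpers (placeholders only).

MAX_TRIES_BEFORE_WIPE = 10  # stay below the auto-erase threshold

def mirror_attack(candidates, try_pin, dump_nand, restore_nand, reboot):
    """Guess passcodes, rolling flash back before the wipe triggers."""
    baseline = dump_nand()                  # pristine flash image
    tries_this_cycle = 0
    for pin in candidates:
        if try_pin(pin):
            return pin                      # correct passcode found
        tries_this_cycle += 1
        if tries_this_cycle >= MAX_TRIES_BEFORE_WIPE - 2:
            restore_nand(baseline)          # roll back the retry counter
            reboot()
            tries_this_cycle = 0
    return None
```

The tedium the parent mentions is in that restore/reboot cycle, which has to happen every handful of guesses.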
      • So is this an older iPhone or are Apple lying?

        Obviously flamebait.

        They have had the "Secure Enclave" processor for a while. As designed, it is unrelated to this kind of attack. There is a security feature on all newer iPhones that will erase the phone after ten unsuccessful attempts. The user can turn this on or off. With the feature turned off, the FBI could always do a million attempts if they wanted. And if you are very clumsy, or sometimes try to unlock your phone while drunk, you shouldn't turn this on.

        This feature needs a _counter_ to erase the content after 10 attempts. That counter wasn't sufficiently protected.

        • by Anonymous Coward

          And it's 8 bit, so it would be impossible for Apple to implement "erase after 300 wrong passwords" using the counter in the Secure Enclave.

          Even the original SEP was an ARMv7-A which was a 32-bit processor. Perhaps you're referring to the 2nd generation Secure Storage Component's counter lockboxes that have a 128-bit salt, a 128-bit passcode verifier, an 8-bit counter, and an 8-bit maximum attempt value? There's nothing preventing Apple from applying a signed firmware update to the SEP that expands and rewrites those values. But what would be the point? That's decreasing security.
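For illustration, the lockbox fields listed above can be modeled like this. Field sizes follow the parent comment; the packing and the saturating behavior are guesses at the general shape, not Apple's actual SEP logic:

```python
from dataclasses import dataclass

# Illustrative model of the counter-lockbox fields listed above
# (sizes per the parent comment; the behavior is a sketch, not
# Apple's actual Secure Storage Component implementation).

@dataclass
class CounterLockbox:
    salt: bytes          # 128-bit salt
    verifier: bytes      # 128-bit passcode verifier
    counter: int         # 8-bit failed-attempt counter
    max_attempts: int    # 8-bit maximum attempt value

    def record_failure(self) -> bool:
        """Increment the counter; True once the wipe threshold is hit."""
        self.counter = min(self.counter + 1, 255)   # 8-bit saturating
        return self.counter >= self.max_attempts

box = CounterLockbox(salt=b"\x00" * 16, verifier=b"\x00" * 16,
                     counter=0, max_attempts=10)
```

An 8-bit counter caps `max_attempts` at 255 either way, which is the AC's sarcastic point: the width was never the obstacle to a "300 attempts" policy, the policy itself was.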

        • Re:Apple and Privacy (Score:4, Interesting)

          by swillden ( 191260 ) <shawn-ds@willden.org> on Wednesday April 14, 2021 @11:08PM (#61275070) Journal

          This feature needs a _counter_ to erase the content after 10 attempts. That counter wasn't sufficiently protected.

          Yes, that's the point. In a reasonable implementation the counter would be in the secure enclave. But, as others have pointed out, the iPhone 5C didn't have a secure enclave processor; that was added in the 5S. Prior iPhones probably had a secure enclave in TrustZone, but they apparently didn't use RPMB (or similar) to protect the counter.

          FWIW, Android devices have been required to have a sort of secure enclave since 2015 (many did earlier, but it wasn't mandatory), and to implement password checking with an RPMB-based counter -- and to increment the counter before checking the password, not after. Note that the result of too many consecutive failures on Android devices isn't to wipe the device (in most cases), but to require exponentially-increasing delays between allowed attempts, and the delay timing is also done in the secure environment, not by the main system. Many Android devices with a secure element do this in the secure element, including the delays. All Pixels have done so since the Pixel 2, and all Pixels since Pixel 2 have also implemented "insider attack resistance", which makes it impossible for Google to retroactively backdoor them the way the FBI wanted Apple to.
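A minimal sketch of the increment-before-check pattern with exponential backoff described here. The thresholds and delay constants are made-up illustrative values, not the real Android ones:

```python
import time

# Sketch of increment-before-check passcode throttling as described
# above: the counter is bumped *before* verification, so cutting power
# mid-check can't dodge the penalty. Delay constants are made up.

class ThrottledChecker:
    def __init__(self, correct_pin: str):
        self._pin = correct_pin
        self.failures = 0

    def backoff_seconds(self) -> int:
        if self.failures < 5:
            return 0
        return 30 * 2 ** (self.failures - 5)   # 30s, 60s, 120s, ...

    def check(self, attempt: str) -> bool:
        time.sleep(self.backoff_seconds())     # enforced wait, if any
        self.failures += 1                     # increment BEFORE checking
        if attempt == self._pin:
            self.failures = 0
            return True
        return False

checker = ThrottledChecker("0042")
assert not checker.check("1111")   # early attempts incur no delay
assert checker.check("0042")
```

Ordering the increment before the comparison is the key detail: an attacker who yanks power the instant the check starts still pays for the attempt.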

          • by tlhIngan ( 30335 )

            FWIW, Android devices have been required to have a sort of secure enclave since 2015 (many did earlier, but it wasn't mandatory), and to implement password checking with an RPMB-based counter -- and to increment the counter before checking the password, not after. Note that the result of too many consecutive failures on Android devices isn't to wipe the device (in most cases), but to require exponentially-increasing delays between allowed attempts, and the delay timing is also done in the secure environment, not by the main system.

            • FWIW, Android devices have been required to have a sort of secure enclave since 2015 (many did earlier, but it wasn't mandatory), and to implement password checking with an RPMB-based counter -- and to increment the counter before checking the password, not after. Note that he result of too many consecutive failures on Android devices isn't to wipe the device (in most cases), but to require exponentially-increasing delays between allowed attempts, and the delay timing is also done in the secure environment, not by the main system. Many Android devices with a secure element do this in the secure element, including the delays. All Pixels have done so since the PIxel 2, and all Pixels since Pixel 2 have also implemented "insider attack resistance", which makes it impossible for Google to retroactively backdoor them the way the FBI wanted Apple to.

              Alas, Android didn't require flash memory encryption until much later. If you wanted access to the data, the counter was useless - you'd just desolder the flash (at the time, eMMC), then mount it in a jig and read it out.

              Er... RPMB [westerndigital.com] has nothing to do with general device encryption. RPMB content isn't necessarily even encrypted, and there's absolutely no reason for the authn failure counter to be encrypted. The important thing about RPMB flash is that all reads and writes are authenticated. And, no, you can't work around it by desoldering and reading it out, because the eMMC/UFS chip with RPMB does the authentication checking itself. You can desolder it and try to read data out, but it won't respond if you can't apply the correct authentication key.

      • Besides the fact that it's an older model, the security processor itself can't a) wipe the whole phone or b) verify that it has actually been wiped (i.e., that the memory module isn't lying to the processor).

        Given physical access, skill, lots of time, and resources, any system can be hacked. For a skilled attacker with physical access and plenty of time, you can only make it *inconvenient* to access the data. You can't make it impossible.

        For example, one could physically cut the trace to the erase pin on the flash chip.

        • by Anonymous Coward

          Given physical access, skill, lots of time, and resources, any system can be hacked.

          Any system? Have you found a way to crack OTP encryption? If so, your Nobel Prize awaits. If not, maybe you shouldn't speak in absolutes.

    • For iPhones accepting updates, Apple has a bypass for all security on iPhones forever.

      • Re:Apple and Privacy (Score:4, Informative)

        by gnasher719 ( 869701 ) on Wednesday April 14, 2021 @06:56PM (#61274564)

        For iPhones accepting updates, Apple has a bypass for all security on iPhones forever.

        I haven't seen an update yet that would be performed without me unlocking the phone and allowing it. And they couldn't unlock the phone without the passcode.

      • Re:Apple and Privacy (Score:5, Informative)

        by karmatic ( 776420 ) on Wednesday April 14, 2021 @07:58PM (#61274708)

        Firmware updates require user consent (with passcode), or DFU mode which wipes the device. USB access requires user consent for trust, or DFU mode which wipes the device. After the phone has been idle for a while, the USB port is disabled for accessories, and requires user consent to enable, except for DFU mode that wipes the device.

        Plus, MDMs can disable USB access. Our corporate network does for anything that is missing a specific certificate that has to be installed on a computer running Apple Configurator 2.

        If I want to update the firmware, I either have to do it online and provide user consent ... or enter DFU mode which wipes the device.

    • This story is why I still favor Apple and iPhones over Android. One of the reasons, anyway.

      Google would have taken the same position as Apple, I think, had it been a Nexus/Pixel device. Of course, Google wouldn't have had any say if it were a non-Google device.

      Also, this [googleblog.com].

    • by tlhIngan ( 30335 )

      This story is why I still favor Apple and iPhones over Android. One of the reasons, anyway.

      The FBI didn't just want Apple to unlock the phone, as they and their lapdog press claimed. They wanted Apple to create a version of iOS that would bypass all security on all iPhones forever. Not even the media howling about terrorism, and about how Apple was the bad guy for "helping the Terrorist", got them to cave, and I was impressed.

      Plus a president. You have to remember President Candidate Trump making a big deal about it, and

    • They wanted Apple to create a version of iOS that would bypass all security of all iPhones forever.

      This statement is quite deceptive. While I have no doubt there were (and are) many high-ranking people in the government who "want" that, the actual lawsuit was to have the software locked to the specific phone in question [link below].

      It also sounds like you might want to replace your media sources; I don't recall any news coverage that tried to paint Apple as the bad guy. Everything I saw either presented arguments for both sides or favored Apple.

      https://en.m.wikipedia.org/wik... [wikipedia.org]

  • Is there a reason they can't use a dead person's fingerprint or face to unlock the phone? Can't be that hard to get it to unlock, assuming they still have a face that isn't blown off. I don't know how many people still use a PIN vs. the other methods.
    • PINs are required after rebooting an iPhone
    • By design biometrics time out after a while. You can change the setting under "Settings, Face ID and Passcode, Require Passcode..."

      • Disable biometrics after first unlock.
      • Assuming the suspect didn't have a passcode, the first thing the police/FBI should do is unlock the phone and remove biometrics. Do this before the battery dies or the phone reboots. I don't think the 4th Amendment applies to dead people, but I dunno.
        • Apple essentially requires a passcode, and requires the passcode to re-enable biometrics. Biometrics are useful in that if you are keeping your phone on you it will generally unlock the phone, but after a while (like an hour) you need the PIN again.

          The whole idea is that if someone gets their hands on your device while its out of your control, they can't use cloned biometrics to break into it. It just happens that the same thing applies when the FBI dredges a lake. The odds of them finding the phone, and

    • by yagmot ( 7519124 )

      Apple's Face ID requires you to actively look at the phone to unlock it. It won't work if the person is dead or asleep, or even looking in a different direction.

    • The TouchID sensor requires the skin touching it to have residual electricity that only exists while the person's alive. If that isn't actually a problem then the next issue is that once unlocked it has to be re-unlocked at least once an hour. Not terribly practical.

      PIN usage has almost certainly gone up in light of the pandemic and masks obscuring the face.

      • by Jeremi ( 14640 )

        The TouchID sensor requires the skin touching it to have residual electricity that only exists while the person's alive.

        Not to get too ghoulish, but wouldn't a 9-volt battery (or similar, with appropriate modulation) be able to supply the appropriate amount of residual electricity as necessary?

  • Was it worth it? (Score:3, Interesting)

    by eford49 ( 3521375 ) on Wednesday April 14, 2021 @07:20PM (#61274636)
    What has not been disclosed yet is whether there actually was actionable evidence found on the device they decried as so essential to protecting the public. Or was this just another agency crying wolf?
    • Re: (Score:1, Informative)

      by guruevi ( 827432 )

      Per the report, nothing of relevance was found.

    • by sconeu ( 64226 )

      There wasn't.

    • Re: (Score:3, Interesting)

      They were crying wolf before they got it unlocked. They fumbled the iCloud unlock, which cut them off from a source of information Apple would have assisted them with, then tried a PR'ish stunt to shame Apple into doing their dirty work.

      Since nobody seems to remember that, I'm worried it worked.

      • You assume they fumbled the unlock. Maybe they unlocked the phone, found nothing of value, then "fumbled" the unlock so they would have a reason to push Apple into building a backdoor (to be used on every single Apple phone worldwide).
      • by necro81 ( 917438 )

        Since nobody seems to remember that, I'm worried it worked.

        Apple remembers. Slashdot remembers. The FBI remembers, too, but that don't mean diddly to them.

    • You didn't mean "decried", you meant "insisted".

  • by cfalcon ( 779563 ) on Wednesday April 14, 2021 @08:16PM (#61274728)

    " After the FBI announced that it had gained access to the phone, there were concerns that Apple's security could have been deeply compromised."

    Oh, is it not a deep compromise of how the phone works? Let's look at the next line:

    "Azimuth Security basically found a way to guess the passcode as many times as it wanted without erasing the phone"

    Ok, that is the deepest possible compromise. The security model of the iPhone uses all this fancy encryption, but it has a weak point: it needs to be accessible via a passcode. You can set a very secure passcode, but almost no one does, because your 50-character poem would take a while to type every time you want access to the phone. Anything less than that is completely inadequate.
    If you don't use a password of arbitrary length, then you have a 6 or 4 digit passcode. A 50+ character passphrase is actually secure against brute force. A 6 or 4 digit passcode is not.

    So Apple's ENTIRE security model is to use hardware to keep the mapping from "4 digit code" to "256-bit private key" undiscoverable to the attacker, while using software and hardware to limit the number of attempts allowed before the phone will destroy the private key.

    So this means that Apple's security HAS been "deeply compromised".

    Now, they've had time to fix this vulnerability. But this was high profile, and Apple took it on the chin. Most Apple fans are just happy that they tried, but honestly, why would anyone assume these are private? You'd need like 10 years of them actually working before you can assume you have a functional product.

    • Checking any passcode takes 80 milliseconds, and there is no way to shorten the time. It can only be done on the phone itself, because one of three components of the key is inside the processor. So eight digits takes about 3 months.
      • So eight digits takes about 3 months

        Eight digits of what character set? Alpha? Alphanumeric?

        I'm not coming up with the numbers you are, but I think it might be my maths. Can you show your calculations?
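For what it's worth, the grandparent's figure works out if you assume a numeric-only 8-digit code at a fixed 80 ms per on-device attempt:

```python
# Worst-case exhaustive search of an 8-digit numeric passcode at a
# fixed 80 ms per on-device attempt (the grandparent's assumptions).

SPACE = 10 ** 8              # digits only: 100,000,000 candidates
SECONDS_PER_TRY = 0.080

days = SPACE * SECONDS_PER_TRY / 86400
print(f"{days:.0f} days")    # roughly three months, worst case
```

With an alphanumeric set the space explodes well past that, so the "about 3 months" only holds for digits.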

  • Grayshift tools (Score:4, Informative)

    by Hari Pota ( 7672548 ) on Wednesday April 14, 2021 @08:35PM (#61274768)
    This article gives a little more about Grayshift and its hardware device GrayKey.

    https://blog.malwarebytes.com/... [malwarebytes.com]

    They're still selling this product. For the San Bernardino unlock, the FBI paid $900,000.
  • IOS history (Score:4, Informative)

    by Hari Pota ( 7672548 ) on Wednesday April 14, 2021 @08:42PM (#61274792)
    This is a decent review from 2020 of the chronology of iOS security developments and the tools used to defeat them. Unlike the WaPo article, it is not paywalled - gotta love Google Scholar.

    https://www.aimspress.com/file... [aimspress.com]
  • As nobody competent will call hardware "tamper proof" these days (if somebody does, do not trust them), tamper resistance is all you get, and then it is essentially a question of effort invested. Eventually that may change, but at this time a password long enough not to be guessable, plus encryption with a key derived from it, is the only somewhat secure solution.

  • Anyone got the name?

  • After the FBI announced that it had gained access to the phone, there were concerns that Apple's security could have been deeply compromised. But according to The Washington Post, the exploit was simple: [An Australian security firm called Azimuth Security] basically found a way to guess the passcode as many times as it wanted without erasing the phone, allowing the bureau to get into the phone in a matter of hours.

    That really is a deep compromise of Apple's security. Limiting passcode guess attempts is a key part of that model.

  • Some security mechanisms are more secure than others. But security is complex, there is always some weakness somewhere, if the attacker is determined enough. If the FBI wants into your device badly enough, they'll get in.

    Still, this does not make securing your device useless. The FBI (or nation states, or other wealthy attackers) is rarely interested in spending the kind of money and time it takes to defeat a secured system. Even minimal security is better than none.

  • consider.
    make a digital copy of the iphone.
    simulate entering password input using a rainbow file.
    profit
