Apple v. FBI

I was watching part of the Congressional hearing investigating the conflict between Apple and the FBI. Both sides have an arguable point of view, which I’ll touch on, but what really struck me was that this issue is a direct consequence of our digital media world. What’s at stake here has never been at stake before.

It’s also an example of a theme I’ve hammered on several times here: It was not ever thus. This is an example of a new thing. Never have we put so much of our lives in a digital vault that depends completely on digital encryption for security.

The outcome of this debate is crucial to our future!

For one thing, this could go all the way to the Supreme Court, and with SCOTUS hanging in the balance due to a vacant seat, the political side of this suddenly rises to the foreground. Who ends up replacing Scalia could be instrumental in how this case is decided.

Another interesting thing about this is how it requires a level of technical knowledge to understand. And therefore also to judge. If you don’t understand the issues involved, your opinion on them is pretty meaningless.


The good news is that the knowledge required isn’t out of the reach of any intelligent person willing to pay attention. It’s not like math is involved.

(Although, since we’re talking encryption, it can be if you want it to be.)

But the point is that this is an important modern technology issue that does require some degree of genuine understanding. (And it’s certainly not the only issue facing us that does.)

Let’s start with exactly what the FBI is asking of Apple…

Actually, at the risk of tipping my opinion early, let’s rephrase that as: What is the FBI using a court order to try to force Apple to do against its will?

The FBI wants Apple to create a new version of the phone’s O/S (operating system) with three modifications:

  1. Disable the phone’s setting to delete all data after 10 failed passcode attempts.
  2. Disable the time delay forced by the phone between passcode attempts.
  3. Add a new capability to allow passcodes to be entered electronically rather than by hand.

The combination of these items (especially #2 and #3) would allow the FBI to use their computing power (which is considerable) to brute force the passcode. That is, to try all possible combinations until they get it.
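To get a feel for why #2 and #3 matter so much, here’s a rough back-of-the-envelope sketch in Python. The attempt rate and delay figures are my own illustrative assumptions, not Apple’s actual numbers:

```python
# Rough sketch: brute-forcing a 4-digit passcode (10,000 possibilities).
# The rate and delay below are illustrative assumptions, not Apple's figures.

ELECTRONIC_RATE = 80.0   # assumed guesses per second via electronic entry
FORCED_DELAY = 3600.0    # assumed worst-case lockout (seconds) between manual tries

combos = 10 ** 4         # passcodes 0000 through 9999

# Worst case with mods #2 and #3 in place (no delay, electronic entry):
fast_seconds = combos / ELECTRONIC_RATE

# Worst case as the phone ships (manual entry, long forced delays):
slow_seconds = combos * FORCED_DELAY

print(f"Electronic, no delay: ~{fast_seconds / 60:.0f} minutes")
print(f"Manual, with delays:  ~{slow_seconds / 86400:.0f} days")
```

Minutes versus years, roughly. And without #1, ten wrong guesses wipes the phone, so even the slow attack is moot: the brute force never gets past attempt ten.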


That would give them access to whatever is on the phone. Which, as in Geraldo Rivera’s infamous vault, might be nothing. Keep that in mind. The FBI doesn’t know what, if anything, is on the phone.

The FBI’s basic position is that, while of course they support the right to privacy of all Americans (duh!), a court order can overcome that right. In principle, such a court order can overcome any and all privacy rights.

For example, a judicial order can compel a DNA sample or even surgery to recover needed evidence (such as a bullet). It can certainly make available all your personal and business information.

But a key point Apple makes is that court orders compel existing work product or evidence. They cannot, generally, compel you to create new work product (or evidence) to satisfy the ruling.

As such, the court cannot compel Apple to require its workers to create a new version of the operating system.

The FBI counters with the example of compelling a landlord to create a new key allowing access to one of the building’s units.

(I would counter with how that’s not a new key, that’s a copy of an existing key. It’s like ordering a business to provide copies of their documents.)

So it does seem the FBI is asking for something out of the ordinary when they try to force Apple to create a new work product.

§

A big part of Apple’s resistance comes from the fear that, once this modified O/S is created, there are two dangers:

  1. Despite the FBI’s claim this is a one-off, there will be other requests to repeat this trick.
  2. The wrong people might get their hands on the modified O/S.

The way out of those dangers, Apple says, is to never make the modified O/S!


I’m less supportive of the second danger (I suspect Apple is very careful about the security of its O/S code), but the first one is bad enough.

Other law agencies have already said they want in on this.

And what happens when England or Germany or Israel asks us to help stop a terrorist plot by cracking a terrorist phone? Do we say no?

What happens when China asks? Or Iran? Would we help Iran crack the phone of an ISIS terrorist? What if the plot was against the USA?

Very often in the world, once a thing is done once, once the virginity is lost, so to speak, doing it again is much, much easier. This is true for many things in life. The first time is the challenge.

§

If this were a matter of getting into some terrorist communications, it would be much less of an issue. But today people put their entire lives on their smartphones.

Someone with access to your phone might also have access to your banking, your medical records, your utilities, your home security, your social media and anything stored in the cloud, even your car.

And consider how much information is available through the phones of government workers and businessmen.

The stakes here are extremely high, so we all need to educate ourselves about this and pay attention!

And, if it isn’t obvious, I do side with Apple on this.

§


One last note: This modified O/S is not a “backdoor” to encryption, although some of the same issues do apply.

Encryption is the mathematical process of turning the numbers representing the plaintext (readable text) into other numbers (the ciphertext, which looks like random noise).

Decryption reverses the process, restoring the plaintext.

The encryption and decryption processes use private (and sometimes public) keys. The idea being that only those with the key can read the text.
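As a toy illustration of that idea (a simple XOR cipher in Python; far too weak for real use, but it shows the plaintext/key/ciphertext relationship):

```python
# Toy XOR cipher: the same operation with the same key both encrypts and decrypts.
# Real systems use vetted algorithms (e.g. AES); this is for illustration only.

def xor_crypt(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the (repeating) key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

plaintext = b"Meet at noon"
key = b"secret"

ciphertext = xor_crypt(plaintext, key)  # looks like random noise without the key
restored = xor_crypt(ciphertext, key)   # applying the key again restores the text

print(ciphertext.hex())
print(restored.decode())  # prints "Meet at noon"
```

Anyone without the key sees only noise; anyone with it recovers the plaintext.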

A backdoor is a bit of secret code in the encryption-decryption algorithms that allows those who know about the backdoor to read all encrypted text without knowing the key!

Backdoor code is considered very, very bad, because (unlike a modified O/S) the bad guys only have to know about the backdoor. Once that secret is out, all your encrypted traffic (that used that process) is dead beef.
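To make the danger concrete, here’s a hypothetical backdoor in the same toy XOR setting: the encryption routine secretly smuggles the user’s key out alongside the ciphertext, disguised under a master key. Everything here (the scheme, the master key) is invented for illustration:

```python
# Hypothetical backdoored cipher: encryption quietly appends the user's key,
# hidden under a master key, so whoever knows the backdoor never needs the key.

MASTER_KEY = b"\x13\x37\x42\x99"  # the secret that breaks everything once leaked

def xor_crypt(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def backdoored_encrypt(plaintext: bytes, user_key: bytes) -> bytes:
    body = xor_crypt(plaintext, user_key)   # looks like ordinary encryption...
    leak = xor_crypt(user_key, MASTER_KEY)  # ...but the key rides along, disguised
    return body + leak + bytes([len(leak)])

def backdoor_read(blob: bytes) -> bytes:
    """Recover the plaintext knowing only MASTER_KEY -- no user key required."""
    n = blob[-1]
    leak, body = blob[-1 - n:-1], blob[:-1 - n]
    user_key = xor_crypt(leak, MASTER_KEY)
    return xor_crypt(body, user_key)

blob = backdoored_encrypt(b"Meet at noon", b"secret")
print(backdoor_read(blob).decode())  # prints "Meet at noon", never knowing "secret"
```

Once MASTER_KEY leaks, every message ever encrypted with this cipher is readable, which is exactly why backdoors are considered so dangerous.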

The FBI counters (accurately) that they’re asking Apple to just remove the guard dogs so they can try to pick the lock. No backdoor involved.

§

In closing, there are some questions I have about this case (please speak up if you know any of the answers):

Why would a locked phone allow its operating system to be replaced?

If that’s possible, why doesn’t the FBI hire someone to do that?

Is it the case that the phone’s O/S can be uploaded but not downloaded (and studied to determine how to modify it)?

So why doesn’t the FBI demand the (existing) source code for the O/S (or at least the relevant sections) along with the (existing) technique for uploading a modified O/S and do the work themselves?


24 responses to “Apple v. FBI”

  • Steve Morris

    I don’t have a strong opinion either way, but it seems to me that Apple is simply using a technicality to try to get out of doing something it doesn’t want to do. As you say, many people might not wish to comply with a legal request to hand over private information. But the law doesn’t make exceptions. It compels people to do things they don’t think ought to happen. It’s not up to them.

    It’s very easy to put forward a strong moral case for unlocking the private data of a terrorist. What if 9/11 could have been stopped, but Apple refused to help?

    • Wyrd Smythe

      “[I]t seems to me that Apple is simply using a technicality to try to get out of doing something it doesn’t want to do.”

      Why do you think they don’t want to do it?

      “But the law […] compels people to do things they don’t think ought to happen.”

      How would you feel about a court order forcing you to spend two weeks creating a work product both you and your company felt was wrong and a bad idea?

      “What if 9/11 could have been stopped, but Apple refused to help?”

      Yeah. This is, in some regards, a variation on the Trolley problem.

      There is a difference between a known clear and present danger and violating our principles because we think maybe there might be useful information. A police officer in “hot pursuit” can sometimes do things not normally allowed. Or the more extreme case of allowing even an ordinary citizen to kill in self-defense under clear and present life threat.

      If we knew there was useful, critical, information on the phone, that might be a different story depending on the nature of the info. But it might not. At what point do we compromise our principles? In the long run, what is worth that?

      There is also the fact that terrorists (and other bad actors) use their own encryption tools, so even if the FBI does break into the phone, and even if there was useful intel there, it might still be inaccessible due to that encryption. That would actually be the expected condition for any real terrorist, but those two were apparently a solo act, so encryption isn’t likely. But neither is a lot of useful intel likely in that case.

      • Steve Morris

        “Why do you think they don’t want to do it?”
        Possibly an ethically-driven viewpoint, but more probably a desire to sell a product that ensures a user’s privacy.

        “How would you feel about a court order forcing you to spend two weeks creating a work product both you and your company felt was wrong and a bad idea?”
        I would hate it. But I hate lots of things about the law, and the law compels people to do things they don’t want to. That’s the reason for its existence.

        “If we knew there was useful, critical, information on the phone, that might be a different story.”
        If Apple hadn’t encrypted the data, we would already know that 🙂

        “At what point do we compromise our principles?”
        I don’t know what principle is being compromised. As you pointed out, a court order can overcome any and all privacy rights.

        I don’t actually have much sympathy for the privacy argument in this case. The way I see it is that Apple created a problem for itself. It deliberately created an OS that couldn’t be hacked. It must surely have anticipated that at some point the authorities would try to compel it to break the encryption, so in building the OS it deliberately created a problem, knowing that it would have to use legal arguments to try to avoid complying with the authorities. Let’s see what the Supreme Court decides. After all, Apple doesn’t get to decide what the law says, and it makes me quite angry that the company thinks it can.

      • Wyrd Smythe

        “Possibly an ethically-driven viewpoint, but more probably a desire to sell a product that ensures a user’s privacy.”

        It’s clear we disagree on this, and that you feel strongly about the issue, so let me just say first that I respect your opinion and understand the arguments for it.

        In fact, I do not feel strongly largely because I see both sides. I’m on Apple’s side, but it’s by no means a slam dunk for me.

        I do think Apple’s main concerns are ethical. Other companies doing related business, as well as others in the industry, have all come out in support of Apple’s position.

        Considering that, if the FBI won this hands down, Apple and Microsoft and others would still make millions of dollars on their products, I’m convinced for myself the main reason is ethical.

        I do think the ethical debate is substantial enough to make the battle necessary to have without there needing to be any financial consideration. (It’s a little bit like the battle for the IEEE floating point standard. There were vested interests, but the primary concern was a useful standard, and that standard benefits us all daily.)

        “But I hate lots of things about the law, and the law compels people to do things they don’t want to.”

        Would you say no to the law if you thought it was wrong? (Although this is a side point not really connected with the ethical issue itself.)

        “I don’t know what principle is being compromised.”

        The general principles of law. Warrants are usually very clear about what they seek. Sometimes so specific that other illegal evidence found during a warrant search cannot be used in court.

        There is no hot pursuit, there is no imminent danger, there are no known connections, there is no indication there is anything useful on that phone.

        Yet Apple is being asked to do something unprecedented legally. Something that needs to be considered very carefully, I think.

        “The way I see it is that Apple created a problem for itself. It deliberately created an OS that couldn’t be hacked.”

        You are opposed to the idea of an unhackable O/S?

        Are you also opposed to the idea of uncrackable encryption?

        “Apple doesn’t get to decide what the law says, and it makes me quite angry that the company thinks it can.”

        I can see that, and, given your views, I can see why. FWIW, I would say Apple is more saying what the law cannot do (and surely we agree the law cannot do just anything).

        There is the ancient equation of balance between the lock-makers and the lock-pickers. That’s a battle that never ends. Lock-picking fascinates me, but more in the “know the enemy sense.” I side with the lock-makers.

        There is also the equation of security versus liberty. We all have our own view of both equations.

        I recall that political placement chart you posted about and how I fell decidedly in the Left-Libertarian zone. Such is definitely my worldview! 🙂

      • Steve Morris

        Actually I don’t have a strong opinion on this, and it seems largely to rest on legal technicalities that I am not qualified to judge.

        But people and companies have to comply with the law. Some laws are bad, but it’s not up to individuals or corporations to make that call.

        You can choose to disobey the law, but then you have to face the legal consequences of that. Apple can make itself into a martyr if it wants, but I very much doubt that it will. It will just drag its feet over this until either its lawyers win, or the FBI’s lawyers win.

        I still don’t understand what the ethical debate hinges on. If I hide some secret information in my house, lock the door, and throw away the key, does this make the information special in some way? The police will break down the door if they have a warrant to search my house.

        If the legal issue is simply that there is no warrant for the FBI to access the information on the phone, then it is irrelevant whether it is encrypted or not, so that cannot be the basis of this dispute.

        “Are you also opposed to the idea of uncrackable encryption?”
        Not in the slightest, but if it breaks the law, then the person or corporation that created it is bound to end up in trouble.

        My opinion on this whole issue is that it is a narrow legal battle between Apple and the FBI. There is of course a bigger question about the balance between liberty and security, but that is not for Apple to decide.

      • Wyrd Smythe

        “But people and companies have to comply with the law. Some laws are bad, but it’s not up to individuals or corporations to make that call.”

        As a general rule, I agree completely with the first sentence! But I disagree pretty strongly with the conclusions of the second. I think it is up to people (and companies made of people) to stand up to laws that are wrong.

        My country was founded on that idea (that not having religious freedom is wrong), and it’s deeply embedded in our psyche. A good example is the civil disobedience that led to changes in segregation and “Jim Crow” laws. Those laws were absolutely wrong.

        “You can choose to disobey the law, but then you have to face the legal consequences of that.”

        I agree completely. One has to pick one’s battles, but if one does pick a battle, then the consequences do follow. One may view that battle, and the consequences, worth it.

        The premise here is that Apple (and the computing community as a whole) think this is a battle worth fighting. (FWIW, I agree.)

        “If I hide some secret information in my house,…”

        Suppose they break down the door (with a warrant) and take everything you have because they think — based only on context — that there might be useful information somewhere among it?

        Suppose there was, but you encoded it using your own personal code and the only known key is in your head (and you ain’t tellin’).

        There is also the fact that it’s possible to construct a physical safe that, while it can be cracked, will destroy its contents in the process. Technology raises the interesting concept of inaccessible information, and that is a fact of modern living we may have to learn to live with.

        (These are exactly the sorts of issues I’m talking about with regard to changes in modern life. The jump to digital information is as significant as the jump to electricity or to fire. It changes just about everything.)

        “Not in the slightest, but if it breaks the law,…”

        (That in response to my question about uncrackable encryption.) How would uncrackable encryption break the law? That’s actually a very interesting question that’s right in the heart of the lock-pick versus lock-maker tension.

        Maybe the question boils down to whether you feel uncrackable encryption should be against the law.

        I’m fundamentally anti-authoritarian, so I say Hell, No! I say learning to live in a world where information can be inaccessible is a consequence of opening the Pandora’s Box labeled “Digital” in the first place.

      • Steve Morris

        “Maybe the question boils down to whether you feel uncrackable encryption should be against the law.”
        I have absolutely no opinion on that.

        I take a purely pragmatic approach to privacy. I don’t believe that privacy is a fundamental right. Many facts about me are known to anyone who cares to observe me – my physical appearance, for instance. My name and plenty of personal and administrative details are available to many people, corporations and organisations.

        If I leave notes for myself on my kitchen table, anyone who enters my kitchen will be able to read those notes. I can’t claim that they are private and expect some kind of right to prevent people using their own eyes.

        But if I keep a secret journal and hide it well, I can keep secrets. Of course, that journal may be discovered and then it is no longer secret. That depends on how well I hid it. I don’t understand why I would have a right to keep secrets in my journal if the law says I can’t.

        If I really want to keep a secret, I can keep it inside my head. Then there is no practical way (other than torture, which is illegal) that anyone can discover that. My view of the matter is purely pragmatic.

        In this case, it is difficult to extract information from the phone, but only because Apple chose to make it difficult. If it is inconvenient for Apple to break into the phone, it’s because they designed it that way. They have only themselves to blame.

        But if the phone was genuinely unhackable, and impossible to break into under any circumstances, then Apple would not be in this situation. They might be in a different situation, if the FBI tried to force them to withdraw the OS from use, and make uncrackable encryption illegal. That’s an entirely different argument, and again is one where I don’t have a strong opinion.

        My opinion on the whole matter is that different issues are being conflated in an unhelpful manner.

      • Wyrd Smythe

        “If it is inconvenient for Apple to break into the phone, it’s because they designed it that way. They have only themselves to blame.”

        This not about Apple’s convenience. They are taking a stand on an important issue. Others in their business have joined them.

        I think we have reached a basic point of disagreement in worldviews.

        If I’m following, you are making an authoritarian argument: Apple, as a member of the state, should obey the state.

        As I said, while I agree in general, I believe there is also a moral imperative involved. One does not obey laws that are wrong.

        I’ll go further and say that, as an artist, I have a fundamental view that rules are suggestions that should always be questioned (see: Breaking the (Art) Rules). Plus, I’m a life-long iconoclast, so that desire to question the rules is even stronger.

        I simply don’t share an authoritarian view. (Which is one reason Trump so horrifies me — he has exactly the same view with regard to Apple and to authority in general.)

        “If I really want to keep a secret, I can keep it inside my head.”

        Or you can use strong encryption methods. Or older methods of encoding your secrets. There are uncrackable codes that even pre-date digital.

        Crucially, you can keep secrets regardless of what the state says. Encryption is well-studied science, and exactly as with guns, if the state does manage to take strong encryption away from the (law-abiding) people, then the only ones with strong encryption will be those who are willing to break those laws.

        And strong encryption is not that much more difficult to create than guns are. There is nothing secret about encryption algorithms; it’s all published work.

        “I don’t believe that privacy is a fundamental right.”

        Where do fundamental rights come from?

        What are our fundamental rights?

        Is privacy an aspect of personal sovereignty and, if so, is personal sovereignty a right?

      • Steve Morris

        “I think we have reached a basic point of disagreement in worldviews.”
        Not so fast! It’s simply a case that I just don’t understand what the fundamental principle is supposed to be. Forgive me if I am slow, but why is it OK for the FBI to read private information on my Android phone, but not on my iPhone?

        Is the argument that the FBI has no right to enter my home and read any of my secret information under any circumstances? Or is it that I have a right to encrypt secret information with an unbreakable code so that the FBI cannot access it? Or is it something else?

        You see, I simply don’t understand what people are arguing for or against.

      • Wyrd Smythe

        “It’s simply a case that I just don’t understand what the fundamental principle is supposed to be.”

        Authoritarianism. Not everyone supports the idea. For example, I do not.

        This particular case with Apple is seen as a case of state over-reach by those who oppose it. It seems to violate important American values.

        “[W]hy is it OK for the FBI to read private information on my Android phone, but not on my iPhone?”

        That statement of the situation ignores key aspects that make this case so important. I’ve gone into those aspects, and you’ve offered no rebuttal to them, so I’m not sure why you’re ignoring them.

        Who says the FBI can read information on your Android? If you used an encryption app, they couldn’t.

        “Is the argument that the FBI has no right to enter my home and read any of my secret information under any circumstances?”

        That conflates two things. The right of entry is a right some do oppose, but it is a right well-supported by judicial action and legislation.

        The right of access to your secret information? Many more people oppose that idea, especially if it means no way to ever have private information.

        Fortunately, that has never been the case. It has always been possible to keep private information private.

        “Or is it that I have a right to encrypt secret information with an unbreakable code so that the FBI cannot access it?”

        Yes, that is exactly it. And you have that right merely in virtue of pragmatics. Inaccessible information is (and always has been) possible, so therefore it exists.

        And therefore, we need to learn to live in a world where that’s possible.

      • Steve Morris

        Wyrd, just so you know, I am not ignoring what you said, nor am I being deliberately obstructive. I just don’t understand what the argument is!

        If authoritarianism is the argument, sign me up, I’m a libertarian!

        Except that the FBI/police/etc already have rights to invade my privacy, and Apple isn’t arguing about that. Hell, the FBI stole the guy’s iPhone, so that’s not the issue.

        Now if I create an unbreakable encryption algorithm, or just invent a really good cipher, and use that to encode secrets, then the FBI can’t get access to those secrets. I understand that. It’s not in doubt. But if I just hide my diary under my bed and the FBI find it, I can’t stop them reading it. Similarly, if I use an encryption method that can be broken (like putting it on my iPhone), then the FBI can again read it. They just need a court order to force Apple to help them. Which may or may not be granted.

        Again, it just seems like something lawyers will argue about, and the court will decide, not some brand new threat to my privacy. After all this is done and dusted, the FBI still can’t get my secrets unless they have a warrant to do so, and if I encrypt them or hide them well enough I can still keep them secret.

        As you say, it has always been possible to keep information secret. And it still is. Apple wants to make it easier for people to keep secrets, and the FBI obviously doesn’t want that to happen. So this isn’t about taking away people’s rights, it’s about a desire to give everyone an unbreakable encryption tool.

        Note that I am not taking sides on this issue, just trying to clarify it. I am sometimes very slow.

      • Wyrd Smythe

        “I just don’t understand what the argument is!”

        Not as laid out in my post and in our conversation? As I’ve said, I think we just see this differently, but there are key points you haven’t addressed.

        “[The] FBI/police/etc already have rights to invade my privacy,”

        Our physical privacy, yes, with a court order. Our personal privacy is another matter. For example, they cannot brainwash or torture to produce information.

        One of Apple’s implicit arguments is that people use technology to extend their privacy, and given the degree to which this happens, maybe we need to rethink our approach to the state’s access to our physical privacy.

        “Now if I create an unbreakable encryption algorithm,…”

        The argument here hinges on the coincidence that the information seems to be physically accessible, so therefore a court order should clear the way.

        (But we don’t know if there is any useful info or that, if there is, it’s not well-encrypted. I wonder a little if people are just curious and want to know why two terrorists squandered themselves by attacking his co-workers at a party. I hope the FBI does get into the phone just so they can discover… nothing. 🙂 )

        A key point here is that the FBI is trying to compel Apple to create a new work product, which I’m not sure warrant authority extends to. I kind of think it doesn’t to the extent necessary to create O/S mods.

        (If you read the conversation I had with Tina here you’ll see how I question why the FBI doesn’t go at this by requesting existing work product, something well within the bounds of a court order. And, as an added bonus for you, would discomfort the crap outta Apple. “We have to give you our O/S code? ARG!!”)

        Further, Apple is arguing that the tool they are being asked to create is dangerous and should not exist. They are concerned about the potential of losing control of the tool, and they are concerned that, once it exists, it’ll become easier for LEOs to demand they use it. (As mentioned, we know this to already be the case.)

        “Apple wants to make it easier for people to keep secrets, and the FBI obviously doesn’t want that to happen.”

        Which is exactly why Apple sees this as a precedent-setting case. The first time something like this is litigated sets the stage for the future.

        Apple feels there is too much new territory here, and too much at stake, to just casually go along with this court order. This needs to be considered carefully and may very well end up before Congress or the SCOTUS.

        “Note that I am not taking sides on this issue, just trying to clarify it.”

        Unless I misunderstand you, you’re saying Apple should do what the FBI wants. Isn’t that a stand?

      • Steve Morris

        No, I’m saying Apple should do what the law says.

      • Wyrd Smythe

        I don’t understand. How is that different?

      • Steve Morris

        The FBI should also do what the law says!

      • Wyrd Smythe

        “The law” equally refers to agents of the law. You’re suggesting Apple (and the FBI) follow existing written laws (although, in that they are agents of that law, obeying the FBI generally amounts to obeying that existing law).

        As far as I know, they both are obeying the existing written laws.

        Apple (or the FBI) wouldn’t be violating any laws in resisting a court order until such order is ruled on by the SCOTUS (or they decline to rule — or they do rule and tie! (part of what makes this interesting is the condition of the SCOTUS right now) — in which case the ruling of the highest court that took the case stands).

        There is no existing law forcing a company to create new work product under order by LEOs. This is the first time something like this has come up.

        So it’s an important case worthy of careful consideration, is the point! 🙂

  • rung2diotimasladder

    I’ve been wondering if you’d write a post on this. Thanks for the info!

    “But a key point Apple makes is that court orders compel existing work product or evidence. They cannot, generally, compel you to create new work product (or evidence) to satisfy the ruling.

    As such, the court cannot compel Apple to require its workers create a new version of the operating system.”

    I’ve also been wondering why the FBI can’t “do the work themselves” but I figured there was some reason for it. This does seem to be the key issue.

    I was sitting there thinking, why can’t they just run through passcode combinations until they hit the right one? I figured there was some new passcode device since I have an old phone. I didn’t know the phones delete all data after 10 tries. Seems like something I would find out the hard way. Glad I know it now!

    • Wyrd Smythe

      “I’ve also been wondering why the FBI can’t ‘do the work themselves’ but I figured there was some reason for it.”

      It’s possible the reason is that they’re incompetent, but I hope that isn’t so.

      The key question is about the O/S and how accessible it is. Apple agrees what the FBI asks is possible, so we know the O/S can be modified by uploading. Either an O/S patch can be applied or the O/S as a whole can be changed from outside the phone.

      That doesn’t mean (one way or the other) that the O/S can be downloaded or inspected from outside the phone. It would certainly be in Apple’s interests to prevent that, since it exposes their O/S code to this exact inspection.

      As I write this, it occurs to me, from an engineering point of view, that I’d design the phone either so the O/S is completely locked when the phone is locked (which is known not to be the case here) or — if it was a design requirement — provide a mechanism that allows authorized patches to be applied.

      That is, I’d make the O/S “write only.”

      So, if the FBI got smart and demanded the existing O/S code along with the protocol for applying a patch, then they really ought to be able to do it themselves. Under those conditions, the only barrier would be skill level.

      “I didn’t know the phones delete all data after 10 tries.”

      It’s a security feature (as I understand it) you have to enable. You have to choose to lose your own data if you forget your passcode! This is one of Apple’s points: users are making explicit choices to secure their data. (Some phones have things like this on by default, so it’s wise to know your security settings very well.)

      This is also possible with physical safes. No safe is uncrackable (as the FBI Director said, as a last resort, we just blow the door off). But you can create safes that destroy their contents.

      It’s one thing to secure objects, but information takes things to a new level.

      • rung2diotimasladder

        “So, if the FBI got smart and demanded the existing O/S code along with the protocol for applying a patch, then they really ought to be able to do it themselves. Under those conditions, the only barrier would be skill level.”

        Gaw. They’re the FB-Freaking-I!

      • Wyrd Smythe

        I know, right?!

        There’s really no way (that I can see, but what do I know) that Apple can deny a court order to produce: [1] the O/S modification protocol; [2] the relevant sections of the source code; and [3] how to tie it all together.

        That should all be existing work product, which is one plank of Apple’s argument.

      • rung2diotimasladder

        I’m wondering if there’s something we don’t know going on here? But you know me, always full of conspiracy theories. I don’t believe what’s been reported could be the full story. I’m sort of kind of thinking this is some kind of: “Apple is so awesome not even the FBI can keep up.” But what do I know.

      • Wyrd Smythe

        Yeah, I’m sure there’s more going on than we know about. We’ll just have to see how it all unfolds!

  • Steve Morris

    By the way, just don’t reply if you’re getting bored, frustrated or angry by this exchange 🙂

    • Wyrd Smythe

      None of the above. 🙂

      (I have been told repeatedly that my enthusiasm often comes off as the wrong sort of passion. That’s just me, and some days I try to moderate that more than others. The older I get, the more inclined I am to just be me. Much to the dismay of everyone else. 😮 )

      To the extent you think Apple should obey the court order while I think they should take a stand, I think we just disagree based on our worldviews. I am very much in favor of taking a stand against laws that are wrong, and in this case I think what the FBI is asking might be wrong.

      One aspect of the nature of this case is that it can’t be decided afterwards when one of Apple’s contentions is that it’s a huge mistake to make the mods in the first place. Apple is pleading to leave this Pandora’s Box closed, because, if opened once, that’s all she wrote.

And what do you think?
