Why You Should Side with Apple Against the FBI

There’s an old joke:

  • Would you sleep with me for a million euros?
  • Hmmm, ok.
  • Would you sleep with me for €100?
  • No, what kind of person do you think I am?  A prostitute?
  • We’ve already established that.  Now, we’re just arguing about the price.

The case between Apple and the FBI has nothing to do with decrypting a single phone, or even with terror.  It’s simply about establishing whether the US government can force a company to create software that allows it to circumvent encryption.

Note that I’ve been particularly lazy about references in this post.  It is already very long and took a long time to write; adding references for every little claim I’ve picked up in the past couple of weeks would add at least another couple of hours (as I tend to follow references to the source material and double-check them).  I may have gotten some details wrong, but I don’t think I’ve gotten anything substantially wrong.  If I did get something wrong, let me know in the comments (preferably with actual sources) and I’ll mitigate, in a very open-source, laissez-faire (aka lazy) approach to writing.

The Case at a Glance

Last year there was a terror attack on US territory.  It was very likely done by one guy on his own, with access to the plethora of guidance you can get with a simple Google search and enough insanity to realize such plans.  The guy managed to kill some people and was himself killed during the attack.

Basically, the case is closed: the attack is done, the guy is dead.  The authorities now want to check that the guy was indeed acting on his own and, if not, scoop up any contacts to other terrorists or groups.  The guy had at least a private phone and one supplied by his employer, a government agency.  He managed to smash up his private phone to the point that the authorities could not recover any information from it.  He did not attempt to smash up his work phone.

The case is about this work phone, an iPhone 5c.

The FBI wants Apple to help them recover data from this phone.  Apple always does this whenever possible if ordered by a court.  Apple provided the FBI with data from its iCloud service, where the phone was backed up.  This backup was pretty old, so Apple suggested they put the phone on a charger on a known wifi network, so it would make a fresh backup that could then be handed over.  The FBI botched this by changing the iCloud password, so the phone could not make the backup.

The last resort, the FBI claims, is to brute-force the passcode, i.e., try all possible combinations.  Unfortunately, the iPhone has three protections in place making this hard: 1) the computation required to convert the passcode into a decryption key is intentionally slow and runs on slow hardware, 2) the phone imposes delays after wrong attempts that get progressively longer, and 3) after 10 attempts, the phone deletes all data.  There’s no real way around the first stumbling block, but the last two hindrances are part of the operating system and can be removed.
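
To make hindrances 2 and 3 concrete, here is a minimal sketch of roughly how the escalating-delay and wipe logic behaves.  The specific delay schedule is an assumption based on publicly reported iOS behavior at the time, not Apple’s actual code, and that is exactly the point: it is just code in the operating system.

    # Hedged sketch of iOS-style passcode throttling. The delay schedule is
    # an assumption based on public reports, not Apple's implementation.
    DELAYS = {5: 60, 6: 300, 7: 900, 8: 900, 9: 3600}  # seconds after N failures

    def wipe_device():
        # Protection 3: after 10 failures the encryption key is discarded,
        # making the data permanently unrecoverable.
        raise SystemExit("device wiped")

    def on_failed_attempt(failures):
        failures += 1
        if failures >= 10:
            wipe_device()
        return failures, DELAYS.get(failures, 0)  # protection 2: forced wait

Remove the call to wipe_device and the returned delay, and both protections are gone; that is the “new operating system” in a nutshell.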

This is what the FBI is asking Apple to do: write a new operating system that removes security measures 2 and 3, so they can try the 10,000 possible 4-digit passcodes one after another until they find the right one.  They need Apple to do this because software updates must be signed by Apple to be loaded onto the device.
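
The arithmetic of why this works is straightforward.  The passcode-to-key derivation is reportedly calibrated to take roughly 80 milliseconds per attempt; with the delays and the wipe removed, a sketch of the brute force looks like this (the iteration count below is a stand-in I picked for illustration, not the real one):

    import hashlib, os

    salt = os.urandom(16)  # stand-in for the hardware-entangled device UID

    def derive_key(pin):
        # Protection 1: deliberately slow derivation. The iteration count
        # is a placeholder tuned to cost on the order of 80 ms per guess.
        return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 200_000)

    # 10,000 four-digit codes at ~0.08 s each is about 800 s: roughly
    # 13 minutes to exhaust the whole passcode space.
    for pin in (f"{i:04d}" for i in range(10_000)):
        key = derive_key(pin)  # try `key` against the data; stop on success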

It is very important to note that they are not asking Apple to “unlock the phone”; they are asking Apple to make an entirely new operating system specialized for their use: an operating system that is basically a piece of forensics software that does not exist at this time.  Apple has always provided data when they have it, and decrypted it when they could.  This is a case where they cannot provide the data and cannot decrypt it.  They are not unwilling; they are simply unable.

It is technically possible for Apple to change the device so they can provide the data and decrypt it.  They CAN write a new operating system for the 5c.  They cannot do that (at least using the approach suggested) for the iPhone 5s/6/6s.

If Apple writes such a piece of software, it is of course easy to also use it on other devices, or for it to fall into the wrong hands (as if the FBI were not sufficiently wrong hands).  To get around that, the FBI has suggested that Apple tie the software to the particular device; iPhones have a lot of unique identifiers, making this technically possible.  Additionally, they say that Apple does not have to hand over the software; they can install it and run it in their own buildings, break the passcode, and hand over the data to the FBI.
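
How might the software be tied to a single device?  A plausible sketch, assuming the signature covers a unique hardware identifier such as the chip’s ECID (Apple’s real scheme uses asymmetric signatures; the HMAC below is a simplification I chose for brevity):

    import hashlib, hmac

    APPLE_SIGNING_KEY = b"placeholder"  # only Apple holds the real key

    def sign_image(image, target_ecid):
        # The signature covers both the code and the target device's unique
        # ID, so the image only validates on that one phone.
        return hmac.new(APPLE_SIGNING_KEY, image + target_ecid,
                        hashlib.sha256).digest()

    def device_accepts(image, sig, my_ecid):
        return hmac.compare_digest(sig, sign_image(image, my_ecid))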

Cherry-picked Case

This case is not about the individual phone.  It’s not about terror.  It’s about setting a precedent.  For most Europeans (and sane people in general), the American judicial system is a bit weird.  It is a common-law system, which means the laws are just general guidelines and are fleshed out by the courts.  That’s why winning or losing a lawsuit is a big deal in the US: it sets a precedent for future cases.

Precedent is the reason many cases are settled out of court; the losing party simply doesn’t want a court to rule against them, as this would harm their chances in future cases.  That’s the reason so many cases about, e.g., software patents are settled out of court: better to pay off some attacker than to lose a lucrative patent or set an uncomfortable precedent that will be hard to change in the future.

The case against Apple is picked to set a precedent in a couple of ways we’ll get to later.  It is picked as a terror attack on American soil with American casualties, to nudge public opinion in the direction of “‘merica, fuck yeah!”  It’s about a dead terrorist; there is no way he can help them with the information they seek.  It’s about a phone belonging to a dead guy (who is also a terrorist), whose privacy is therefore not something many think is important to preserve.  Not only that, the phone more or less belongs to the authorities.  The authorities dangle the possibility of preventing future attacks if they can just get the Skype names of all of ISIS, which may or may not (very likely not) be stored on the phone in question.  Even if the FBI doesn’t get the data, it doesn’t really matter: it is very, very likely the guy was operating alone, and if not, it is very unlikely he would store important information on his government-owned phone when he took care to smash his private one.

Negative Consequences of Ruling Against Apple

There are three negative consequences of a ruling against Apple.  None of them are technical, and as IANAL, the following is just based on my layman guesses and what I’ve read.  Which I’m sure is just as hilarious to somebody who understands law as it is to me when people with no knowledge of security discuss what Apple can and cannot do to make the iPhone more secure.

The first reason a ruling against Apple is devastating is that it will transform the world from a place where the special government version of the iPhone software doesn’t exist to one where it does.  When the software doesn’t exist, Apple legitimately cannot help law enforcement.  When it does, they can.  That means law enforcement can also get access to the software in lesser cases.  If it can be used for terrorism, why not for murder?  Or child pornography?  Or rape?  Or other violence?  Or conspiring against the state?  Or tax evasion?  It is already known (weasel word) that the US government has at least 12 additional iPhones it would very much like unlocked.  Apple can to some extent get around this by deleting the software after use, but any software developer knows that writing the software the second time around is a lot easier.  Also, there’s always…

The second reason a ruling is bad is that it will set a precedent.  Even if there’s no software, if they had Apple write it once, why wouldn’t they be able to have them write it again?  Why couldn’t they ask Apple to write it for murder, CP, rape, violence, conspiring, tax evasion?  This is all very speculative, but experience shows that the salami method of privacy invasion works very well.

Both of these are not really too bad on their own; they require court orders and very likely do prevent crime.  The main negative consequence is that it desensitizes people towards surveillance.  For completeness, here are the two numb-nut libertarian conspiracy-dipshit quotes that are often thrown around when it comes to surveillance:

Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.

Ben Franklin

and this one

First they came for the Socialists, and I did not speak out—
Because I was not a Socialist.

Then they came for the Trade Unionists, and I did not speak out—
Because I was not a Trade Unionist.

Then they came for the Jews, and I did not speak out—
Because I was not a Jew.

Then they came for me—and there was no one left to speak for me.

Martin Niemöller

The second one is about Nazis, so anybody quoting it ever has lost, thanks to good ol’ Godwin.

Both do have a point, though.  They stress how we are trading away the right not to be under constant surveillance for a little bit of (the illusion of) security.  They stress how surveillance sneaks up on us and gets worse.

The third reason I have for it being so crucial that Apple wins the case is way more tinfoil-hatty than the others, and much more speculative: I’ve already argued several times that this is not about an individual phone, and it is not about terror.  This case is about forcing a company to circumvent a security measure to allow the government access.  This is a big point, so I’ve devoted an entire section to it.  That’s the next section.  Feel free to jump right into the crazy.

If Apple Loses, It Might Lead to Mass Surveillance (Now Is a Good Time to Put on Your Tinfoil Hats)

If the FBI gets a ruling allowing them to force Apple to write a piece of software circumventing the phone’s security measures, why can’t Apple be forced to write software circumventing stronger security measures?

The iPhone 5s/6/6s have stronger built-in security measures and would not be vulnerable to this particular attack.  From what I’ve read, it is possible to circumvent the security measures of even these phones, but I don’t recall where I read it or what exactly the method would be.  That’s sufficiently well-documented to put on the front page!

But what if Apple develops an iPhone that’s so safe they legitimately cannot break into it?  That is very likely possible, and definitely something Apple is working on now, according to reports (weasel word).  Apple is already taking measures; recently there was a tiny shitstorm about “Error 53,” which would happen if you had the screen replaced by a third party.  The same “security experts” who are as qualified to talk about security as I am to talk about law were quick to say that Apple was “blocking out third-party repairs to strengthen lock-in,” but I’m very certain the main reason was security-related.  Apple was simply authenticating that the fingerprint reader was genuine.  If it was not, it would be possible to replace it with a big computer masquerading as a fingerprint reader, trying to break the fingerprint lock; just like the FBI is asking Apple to replace the passcode check with code built to break it.  Error 53 was mostly (or perhaps even entirely) a security measure.  To me, this illustrates that Apple really is trying to beef up security to a point where not even they can break into your phone.
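
A minimal sketch of the kind of check I believe Error 53 enforced: the operating system challenges the fingerprint reader to prove it knows a secret paired at the factory.  The pairing-secret scheme here is my assumption; the detail of Apple’s actual Touch ID pairing protocol is not public.

    import hashlib, hmac, os

    PAIRING_SECRET = os.urandom(32)  # shared at the factory: phone <-> sensor

    def sensor_response(secret, challenge):
        # The genuine sensor proves knowledge of the secret without
        # ever revealing it.
        return hmac.new(secret, challenge, hashlib.sha256).digest()

    def os_trusts_sensor(phone_secret):
        challenge = os.urandom(32)
        response = sensor_response(PAIRING_SECRET, challenge)  # from sensor
        expected = hmac.new(phone_secret, challenge, hashlib.sha256).digest()
        return hmac.compare_digest(response, expected)  # False -> Error 53

A third-party replacement sensor does not know the pairing secret, so the check fails, and a brute-forcing box masquerading as a sensor fails the same way.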

The weakness is also illustrated by how the Error 53 issue could be fixed in software; if it can be fixed in software, it can likely also be circumvented in software.  That is bad and has to be fixed.  I believe Apple is trying to do exactly this to not be vulnerable to lawsuits like this one.

But if Apple has a verdict saying they can be forced to circumvent security measures, why can they not also be forced to circumvent these new measures making the iPhone impenetrable, for example by including a backdoor?  This could take the form of a master password only Apple has access to that can unlock any phone (or a more cryptographically sophisticated analogue).  In fact, rumor has it (weasel word to the max) that such a bill is circulating around Capitol Hill.  Microsoft already does this in their BitLocker encryption software for Windows: they can (and will) decrypt a hard drive encrypted by their software.
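
What would such a master-key backdoor look like?  A common construction for key escrow, sketched here purely as an assumption about how a mandated backdoor might be built (neither Apple nor Microsoft has published such a design): the device key is wrapped twice, once for the user and once under an escrow key.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    ESCROW_KEY = AESGCM.generate_key(bit_length=256)  # held by the vendor

    def protect_device_key(device_key, user_kek):
        n_user, n_escrow = os.urandom(12), os.urandom(12)
        # The same device key, wrapped once for the user...
        for_user = (n_user, AESGCM(user_kek).encrypt(n_user, device_key, None))
        # ...and once under the escrow key. Whoever holds ESCROW_KEY can
        # unlock ANY device; a single leak breaks everybody's security.
        for_escrow = (n_escrow,
                      AESGCM(ESCROW_KEY).encrypt(n_escrow, device_key, None))
        return for_user, for_escrow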

And if we can do that, why not go all-out paranoia and do a chain of escalations: if we can get access to any phone, why not allow such access remotely, so we can decrypt the phone of a terrorist before he kills?  If we can do so remotely, why not do so automatically, so we can monitor suspected terrorists and pick out the real ones?  Or murderers, or CP dealers, rapists, violinists (i.e., those playing the violin; no worse kind of violence exists), tax evaders?

As soon as the door is opened towards the backdoor, the same escalation is likely to happen as people get accustomed to it.

Why Is Mass Surveillance Bad?

This is actually the point where I get really torn.  I mostly don’t care about being under surveillance; I legitimately have nothing to hide.  I share much more online than many recommend.  I am still careful with some details (for example, I block out credit card numbers and keys because I’m not an idiot, and I block out personal details about others).

I also don’t think there’s anything wrong with the FBI getting access to the terrorist’s phone.  In fact, we had a discussion on Facebook on this topic, and somebody (note how I don’t mention their name!) mentioned that it makes sense for the authorities, after a court order, to have access to such information in all cases.  I largely agree with this.  A very good case can be made for allowing the police all such weapons if it can help investigations.

The thing is, this is the first time in human history that we actually have the ability to do mass surveillance.

The much-hated late Supreme Court justice Scalia, who is regarded by many as an uber-conservative, is really on the same side:

…the Constitution sometimes insulates the criminality of a few in order to protect the privacy of us all.

Antonin Scalia

The thing is, never before did we have the ability to monitor everybody efficiently.  The Stasi archives in Germany were opened, and one of the surprising things was how crappy they were.  There was information about far fewer people than people thought.  The information about people was much spottier than people thought.  There was random, irrelevant information making it hard to find the relevant information about the person you wanted.

The problem was that the information was gathered by hand and categorized by hand.  The same was true for the KGB: while they could gather a lot of information, they simply could not process it.

Looking back a bit further, we did not even have the ability to gather the information.  Before telegraph and phone lines, you could not just tap a phone line, because there was none.  You had to literally be near the person you wanted to spy on.  This was true for the entirety of human history until very recently (at the earliest until the phone was invented, and more realistically until much more recently, when we got a global phone system).

In the 90s it was very popular to put words that would trigger the FBI/CIA/NSA’s perceived filters on the internet to overload them (bomb, terror, George Bush, Iraq, etc.).  It’s quite likely they had, and still have, such software running.  I think it is conceivable that the effort was entirely wasted: the software was very likely bad at sorting the information, and very likely could not deal with other media like images, audio, or video.

Today we can, though.  Speech recognition is good enough that we use it on our phones and even watches.  Google automatically transcribes videos and recognizes pictures of cats.  I scan all my paper mail and can immediately search it on my laptop.  We have the processing power to scan and understand all media.  On top of that, we all carry around a GPS receiver, so it is possible to know exactly where we are at all times, and to cross-reference this information with that of everybody we meet up with.  Facebook has access to this information and has admitted to sharing it with law enforcement.  Computers are powerful enough to gather pretty much any information about all of us.

In addition, we have algorithms that are efficient enough, and work well enough, to sort the data.  We know algorithms that are eerily good at recognizing spam; Facebook can recognize and even influence your mood; Google can cluster ambiguous searches (do a Google Image search for, e.g., “Britney” and at the top you’ll see various categories of images) and find images that are similar to others.

For the first time in history, we have the ability to gather all information about everybody, and to analyze it.

I’m sure it will only be used for good now, but in the interest of “won’t somebody think of the children” or “‘merica, fuck yeah!” there’s no reason we won’t use it preventively.  Heck, a lot of European countries are already talking about session logging of internet data.  To prevent terrorism, sure, but why not include CP and the others?  Most countries already make it illegal to travel with large amounts of cash and track all electronic transfers.  To prevent terrorism and organized crime, sure, but it is already being used to track tax evasion.

Those are all very good things to prevent!  I am not even in the camp of libertarians who want to be allowed to break the law by speeding or committing other “victimless crimes.”  I believe very strongly that if there’s a law, you abide by it.  If you are unhappy with the law, you change the law.  I obviously live in a country where the government isn’t too bad at taking advantage of this, and when I was in China and Russia I used a VPN to browse all the illegal newspapers, because I’m double-moral as all fuck.  What I am trying to say is that I am not against surveillance because I want the government to track “the others” breaking the law but not me.

I’m against mass surveillance because it causes people to change their behavior.  If you are being watched, you are much less likely to pick your nose or scratch your bum.  In Russia and China I encrypted my traffic.  When being observed, we alter our behavior.  There are, of course, the good changes, where people under surveillance stop looking up the “Anarchist Cookbook,” don’t get in contact with terror groups, and stop downloading CP.  All sane people want less of that.  There are also more unfortunate consequences, like people avoiding taboo subjects: who would search for their favorite shemale midget clown pornography if the neighbor could check your search history?  Or check the symptoms and treatments for erectile dysfunction?  Or the name of the new Justin Bieber song?  Or the connection between the two?  Even if many would claim the world would be a better place with less crime, fewer religious nut-jobs, and fewer perverts, I think most would agree the world would be a worse place if people with mental illnesses had an even harder time asking for help because any inquiry online is monitored, if kiddie-fiddlers stopped looking for help for fear of being monitored, or if teenagers had to talk with their parents or teachers about their nascent sexuality instead of just googling “why are boobs so awesome.”

It can seem like this is a largely academic discussion, but algorithms impact us every day without us necessarily knowing.  Algorithms at Google and Facebook make sure we see content we are interested in.  That has the effect of reinforcing our existing beliefs.  Unless you actively look to challenge your beliefs, you might think that socialism or libertarianism is a good idea, you might be kept in the wrong belief that there are gods other than Britney, and you might wrongfully be under the impression that Sporty isn’t the best Spice.  Algorithms govern the stock markets; people have a strong belief in technical analysis and unironically talk about the price of a stock meeting resistance at, e.g., 100, and how breaking free of that limit will send it sky-high.  This despite stocks having no inherent resistance either way; the price is based on a largely arbitrary number-system representation of a largely arbitrarily valued currency.  There is no reason stocks should have such a resistance, yet systems that trade based on such magic rules tend to actually work, because they are not trading against a rational market, but against a market consisting largely of such magical systems with voodoo rules.

Right now, there are big moral conundrums about autonomous vehicles: how should a self-driving car react when there’s a choice between saving the family in the car by landing it safely in a group of kindergarten children crossing the street, and going into the truck at 100 km/h?  It is the trolley problem from ethics, except instead of a thought experiment it’s about actual lives, and it’s solved by people whose autism is only surpassed by their smelliness, people who don’t even realize that that is what they are doing, or people who make more mistakes in a month than Lindsay Lohan does in an entire day!

The point is that maybe we should allow mass surveillance; perhaps the good of catching terrorists outweighs the discomfort of having to discuss the funny feeling when you pee, or the pain in your chest, with your children.  Or maybe the infraction on civil liberty is too great to accept at any cost.  There are very good arguments on both sides, and it is an argument I think we need to have.  It should not be sneaked in through the backdoor (ha!) while pretending we are talking about terrorism in the FBI/Apple case.  For the first time in history, we have on the one hand the ability to completely hide information from prying eyes with cryptography that can likely never be broken, and on the other the ability to collect and analyze all information about everybody.  The two can no more exist in the same universe than an impenetrable wall and an unstoppable force.

As anybody who has seen a video of Charlton Heston speaking at an NRA meeting knows, it is a very sane and smart thing to take cues from gun-nuts.  To paraphrase the hick militia: “If having reliable cryptography is criminal, only criminals will have reliable cryptography.”

The question of whether Britney herself can create cryptography so strong not even she can break it has been answered, and the answer is yes.  This leads to the next question of whether she should.

Why Trust Apple?

A natural question, then, is: why trust Apple?  Why would Apple genuinely create a phone they cannot break into?

Google is by now the butt of jokes about their motto “don’t be evil.”  They started with that ideal and slowly had to compromise.  The reason is that the people in charge changed; the business changed; the motto got in the way of the business.  There are very likely a lot of people at Google who still believe in it, and Google does make a lot of financially unwise moves just to do good.  But at the end of the day, Google is a for-profit responsible to its owners.  It is a freely traded company on NASDAQ, which means the owners change.  Google is not owned by idealists who can afford to have principles, but by people relying on it to make money for their retirement funds.

The thing is that

If you are not paying for it, you’re not the customer; you’re the product being sold.

blue_beetle

You are (most likely) not paying Google.  Even if you pay for an Android phone, you likely buy it from Samsung or HTC.  They might pay Google a pittance for some of the applications on Android, but you do not.  Android largely exists to drive users to Google’s services.  You most likely don’t pay for those either.  Google’s services mostly exist to harvest information about you to feed into Google’s knowledge engine and sell advertisements.  You are using a phone to drive the 20-years-later version of “punch the monkey” banner ads.

There is nothing wrong with that.  You just have to know that your desire not to be monitored is in conflict with Google’s desire to sell ads to its customers.  Which are not you.  By better monitoring everything you do, Google can better sell ads to its customers.  Which are, again, not you.

Google has a clear desire not to alienate its users (which is you, and which is not its customers), and there is largely also a big part of the “don’t be evil” policy left.  But there is a clear conflict between you not wanting to be monitored and Google wanting to monitor the hell out of you.

As an example, back in the day, website owners would run statistics on their own servers.  This would be done by scripts collecting information about where users came from, including which queries they performed in search engines.  Google resolved the conundrum of “users do not want all that information to be freely available” versus “we want all the information we can get” by encrypting their search engine.  This means website owners no longer get information about the search words; they were previously available in the Referer header, which was designed for exactly this sort of thing, but this header is not passed on from encrypted sites because the URLs are supposed to be secret.  To retain the information, website owners had to sign up for Google’s convenient Analytics service, which requires them to insert a bit of code on their site that sends all this information to Google, which will then in return give them back information such as search terms.  This gives Google more information to sell ads with, and reveals all information about users, while disguising itself as a privacy measure.
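
For the curious, this is roughly what the old mechanism looked like from the site owner’s side: pull the search terms straight out of the Referer header the browser sent along.  The q= parameter is how Google’s search URLs carried the query at the time; the example header itself is invented for illustration.

    from urllib.parse import parse_qs, urlparse

    # A hypothetical pre-HTTPS Referer header, as a server log would show it.
    referer = "http://www.google.com/search?q=why+are+boobs+so+awesome"

    def search_terms(referer_url):
        parsed = urlparse(referer_url)
        if "google." in parsed.netloc:
            return parse_qs(parsed.query).get("q", [None])[0]
        return None

    print(search_terms(referer))  # -> "why are boobs so awesome"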

Google needs access to your information.  This means that at some point they need to have the information in clear text.  It is possible to do a lot of clever tricks to minimize their knowledge of you, but it is entirely impossible for them to immediately encrypt data with a user’s key so they cannot decrypt it themselves (either on the handset or directly upon receipt by their cloud services) without compromising their desire to snoop around in your e-mail and dick-pics.

Apple, on the other hand, has its wishes aligned with yours.  Apple doesn’t sell services (well, it does, and they suck, and people don’t purchase Apple products for the services; they live with Apple’s services because they like Apple’s products).  The iTunes Music Store was introduced as a means to sell iPods, and today it is a means to sell iPhones.  If we look at how much Apple makes on services (iTunes, App Store, iCloud, software), it’s roughly $20 billion in 2015.  It’s steadily rising, and many companies would be thrilled to make $20 billion in revenue, full stop, but Apple made more on the Mac.  It made roughly the same on the iPad.  It made more than six times that on the iPhone.  Apple is decidedly a hardware company and only makes software and services to sell its hardware.

Apple tried getting into the ad business, primarily to piss off Google.  They were never really good at it and recently announced they are shutting the business down.  Perhaps they did so because their privacy drive would sound hollow if they harvested data in their own way at the same time.  More likely they just sucked at selling ads, as they do at pretty much all their other services.  At the end of the day, Apple does not rely on advertising at all.  They pretty much have one customer: you.

That means Apple has no incentive to mine your data, so it doesn’t sound hollow when they promise, as they recently did, to encrypt their cloud so that not even they can read your data.  It sounds impossible but isn’t: at the simplest level, they just encrypt everything on the phone or computer and only store already-encrypted data in the cloud.  They can also allow the user to upload a key, but require that the key itself be encrypted with a password, so they can only access the key when it is unlocked with that password.  If they never store the unencrypted key, they genuinely cannot read your data.  They could add a backdoor storing the decrypted key without your knowledge, but why would they?  That would only force them to oblige requests for user data.  By denying themselves access, they cannot help with such requests no matter how much they want to.
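
A minimal sketch of that last scheme, assuming PBKDF2 and AES-GCM as the primitives (Apple’s real design is undoubtedly more elaborate): the cloud only ever stores the wrapped key, and without the password the stored blob is useless to Apple or anyone else.

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM
    from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

    def _kek(password, salt):
        # Slow derivation so the password cannot cheaply be brute-forced.
        return PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt,
                          iterations=600_000).derive(password.encode())

    def wrap_for_cloud(data_key, password):
        salt, nonce = os.urandom(16), os.urandom(12)
        wrapped = AESGCM(_kek(password, salt)).encrypt(nonce, data_key, None)
        # Only (salt, nonce, wrapped) is uploaded; password and KEK never are.
        return salt, nonce, wrapped

    def unwrap(salt, nonce, wrapped, password):
        return AESGCM(_kek(password, salt)).decrypt(nonce, wrapped, None)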

When the Error 53 controversy was raging, discussions were about whether it was done by Apple on purpose to limit third-party repairs, or whether it was a necessary security feature.  The answer, I believe, is “why can’t it be both?”  I believe it was an honest mistake Apple made while implementing a security feature.  They wanted to authenticate the fingerprint reader and never really considered the case of unsupported repairs.  Why should they?  That’s why they are called unsupported.  Therefore, the error handling in the case of an unsupported repair just failed instead of consciously working around it.  They “fixed” the problem by simply ignoring an unauthenticated fingerprint reader, which is perfectly fine for security: you can no longer replace the fingerprint reader with an FBI Device 2000 for hacking fingerprints.

In iOS 7 or 8, Apple introduced anti-theft features to completely kill your phone if stolen.  It is a great feature because it makes the phone less attractive to thieves (and hence more attractive to customers) and makes it possible to ensure that at least data is not leaked from a stolen phone.  It is also a great feature for Apple, because it removes a stolen iPhone from the market, so they can sell another iPhone.  Making the App Store a walled garden means less malware on the iPhone than on other platforms, but it also means Apple can make money off all software on the phones the company produces.

Making a phone impossible to unlock has the same benefit for Apple: it removes stolen phones from the market.  For customers, it has the advantage of being secure.  The more secure the phone is, the more stolen phones are removed from the market.  Furthermore, the more secure a phone is, the less likely it is that Apple will be inconvenienced by the authorities.

At the end of the day, a lot of the time, users’ (= customers’!) wishes are aligned with the wishes of Apple.  And that is why I would rather entrust my data to them than to Google.  Google may change direction if they change owners and leadership, but even if Apple does, their direction is likely to be unchanged as they are still in the market to make money.

Conclusion

At the end of the day, we have to ask ourselves whether we want to prostitute ourselves.  We should not let the thought of a shiny million push us into a decision we are not prepared to make, and end up just haggling over the price of a trick.

We should not open the backdoor to our encryption while pretending that the FBI vs. Apple case is in any way about terrorism.  Because all that is left afterwards is agreeing on exactly how fucked we want to be by the government.

There is nothing inherently wrong with deciding that the pros of mass surveillance outweigh the cons, but it should be a conscious decision rather than an emergent one disguised as something else.  For that reason, Apple needs to win this case, so we can have a good, long, open discussion about the principles at stake instead of making a rash decision in the wake of a tragedy.

There’s another backdoor and prostitution joke to be made, but I’ll leave the details to the reader.
