Fair warning, this will most likely be a controversial post. While I’m generally in favor of a citizen’s right to privacy, and I support companies that actually go to bat for our civil liberties, I’ve gotta side with law enforcement on this one–well, sorta. Let me be absolutely clear, this isn’t a question of where my political loyalties lie, but rather a question of security and precedent. I’m also not suggesting we hand over universal, back-door keys to our devices to anyone, including the government. What I am suggesting is that this problem is complicated, and we need to find a creative solution to it that allows our law enforcement agencies to do their jobs.
To explain the meat of my argument, I’ll use an example. Let’s say that Harley is a nutbag that shoots a bunch of people on a college campus. He gets caught red-handed, but an investigation of Harley’s life must still take place. We, the people, need to know whether or not he had accomplices. Was he clinically insane? Was Harley the leader of a group of like-minded individuals?
Facts like these are important for the prosecution so that they can push for what they consider to be a just punishment. They’re important for the defense because further investigation might turn up mitigating circumstances that would impact sentencing. Police need to know whether or not someone is planning to do something similar, or if one of Harley’s buddies aided in the murders and is still at large.
As part of the investigation into all of that stuff, Harley’s apartment needs to be searched. Harley’s landlord, Judy, is eager and willing to unlock the apartment (when shown a warrant) so that the police can search it for evidence pertinent to the shooting.
The police have probable cause to search the apartment, so they get a warrant and perform the search. Our law enforcement agencies and legal system rely on these investigations, and we would have a LOT more trouble putting criminals away if Judy the landlord said, “No, that door was locked by Harley, and I’m not letting you in.”
I’m sure you can see where I’m going with this. The FBI and the courts are just trying to do their job, and despite a lot of negative press recently, please keep in mind that most cops get into their line of work because they are good people who want to protect other people (like you). Yes, there are bad cops out there, but they’re the minority ruining it for all the good, hard-working officers out there.
A lot of people rely on smartphones not just as a primary means of communication, but also as a handy place to store information. Heck, some people even use them as their primary computing devices. Law enforcement agencies have been decrypting hard drives from personal computers and searching through digital files for years, if not decades. iPhones (and every other smartphone) are extensions of this. Not letting law enforcement agencies investigate these devices is like handing semi-intelligent bad guys (like organized terrorist cells) a free pass to store incriminating evidence without fear.
Look, I’m not saying we should give the government carte blanche to break into your phone whenever they want. I am saying that, if they have probable cause, they need to be able to search your device. If you shoot a bunch of people, and the police need to look at your phone to see if you were planning another mass murder with a group of friends that are still out there, then they should damn well get access to it.
I don’t care where you fall in the political spectrum, just try to think about this from a “how do we catch the bad guys” perspective. Imagine what CSI or Castle would be like if the police could never search an apartment for evidence, or trace someone’s phone, or look into someone’s bank records.
Additionally, I don’t think it’s a very good idea to allow Apple to set this precedent. Allowing them to defy a court order and impede a federal investigation gives them more power and influence than I’m comfortable with. Corporations in this country hold way more clout than they should already, and we shouldn’t hand them more.
The other side of the argument
The people (and companies) backing Apple in this case have a very good point. Giving anyone custom software to break the encryption on any smartphone is dangerous because, once it’s out there, it could fall into the wrong hands, which would be disastrous (to say the least). The tool could essentially turn into a universal decryption key for specific phones, and would be kind of like having access to digital nukes.
On top of that, if the government is able to compel Apple to make custom firmware for this one case, what’s to stop them from forcing other companies to do the same, but in more invasive ways? Imagine Big Brother trolling your Facebook page to look for evidence of drug use. It could be 1984 for real, except Orwell may have underestimated the amount of damage that these policies could do to you on a personal level.
Heck, in the old days, if you wanted to keep something secret, you could put it in a safe. That way, if someone wanted to get at it, they would at least have to physically be there to remove it from that safe. In today’s age, it can all be done remotely, and, if a universal back door is found, people with the key could get to your stuff without your knowledge.
I get it–it’s scary. Read this Time article for more details on this side of the argument.
If you want to understand some of the more technical aspects of this side of the battle, check out this post on BGR. Keep in mind, the “hack” the investigators are asking for would be specific to that phone, and it’s not actually some universal set of decryption keys. Apple is primarily afraid that giving in will make them look bad, and open the door for more unreasonable requests in the future. In other words, the “slippery slope” argument.
So what do we do?
We need a “middle-ground” solution. We need to provide assistance to law enforcement agencies when it’s warranted, but we need to do it in a way that doesn’t allow them to actively monitor any or all of us whenever they want, just because they can. In other words, like the apartment search, they need to have strong and just cause to search your device, but if they can prove it’s necessary (within reason), then they’re going to get the access they need.
We also need to be damned careful with the tools we allow the government to use. Preventative measures to protect against abuse must be in place, and use of said tools should be monitored both by the companies involved (Apple, in this case) and by proper oversight of law enforcement officials. Use of these tools should also be fully disclosed to the public so that we understand how and when they’re being used. It should not be an easy process to invoke these searches.
That’s how I’d solve it anyway, and make no mistake, what I’m suggesting is difficult (at best) to implement. Heck, from a technical perspective, I’m not sure how we could accomplish it all with 100% effectiveness. That might not even be possible, but I think a solution along these lines is absolutely vital–both for our privacy and our security. Cops, courts, and lawyers need evidence. Don’t forget, it can exonerate as well as it can condemn, so it’s in your best interests too.
Just give this issue some deep thought before you decide which side of it you’re on–that’s all I’m asking. I can see and understand both sides of the argument, and there are excellent points on each.
And another thing . . .
I might be more willing to give Apple the benefit of the doubt on their firm proclamation outlining their stance on this issue, and their claim that they’re doing this for the good of all of us, if it weren’t for the fact that they’re the same company that’s bricking phones just because owners get them fixed at a less expensive, independent shop. You want to be outraged about something? Be outraged about that, because practices like that one screw the consumer. If you buy a phone, it’s your phone. You should be able to repair it yourself if you want–the worst they should be able to do is void your warranty (which is fine by me).
I only point this out because trust is very much at the center of this decryption issue, and I sure don’t trust companies like Apple all that far. The same goes for the government, to be fair. /end rant

UPDATE: Right after I posted this, I read that Apple pushed an update to “fix” Error 53. This is likely due to people like me being angry about it and posting our frustration on the internet, so go us! Given Apple’s history of not wanting ANYONE to open an iPhone up except for them, I’m pretty sure the bricking was intentional, but they’ve reacted exactly as they should have to the criticism (shortly after talk of a class action suit). 🙂 So, I give them props for quickly turning it around with an update to do the right thing. Well done, my fruity friends!