> “market something expressly to allow people to place themselves beyond the law.”
One thing the FBI does not realize, or does not care, is that THEY ARE NOT THE ONLY ONES THE BACKDOORS WORK FOR.
A backdoor is a backdoor. Period. When so much private information is on a single device it's not just about being beyond the law (which is a great reason in and of itself because law enforcement is itself beyond the law). It's about being safe.
It's more like "Give your house keys and safe combination to BigCorp, so that if we get a warrant for the contents of your safe, we can serve the warrant on BigCorp and more easily covertly execute the warrant without you knowing that it has happened. And try not to worry that no one at BigCorp will ever abuse the fact that they have your house keys and safe combination, or fail in their responsibility to safeguard those items from criminals."
Well, I kinda view giving my keys to BigCorp as leaving the door unlocked. From a privacy standpoint, I think that is also how it works [e.g. 3rd party doctrine]
Er...no. In the real world they don't always arrive with a SWAT team. Many warrants are served peacefully, and they have even been known to use locksmiths. I don't agree with stereotyping encryption users as if they were criminals, nor do I agree with stereotyping law enforcement as if they were shock troops.
> legally entitled to break into your house even if you don't want them to.
You even admit I'm right yet claim I'm not. So I'm kinda confused.
The key phrase there is "break in". As in, they acquire access without the consent of the owner. In this case, the "door" is the encryption key which they are expected to acquire and/or break so that they can gain access.
A conventional door you can:
1) Get a locksmith to essentially duplicate the essential function of a key to gain access. [Steal the Key]
2) Kick it in. [Brute Force]
3) Convince the owner to cooperate. [Social engineering]
They have the same options to acquire an encryption key. The fact it makes their job more difficult isn't relevant.
Cash makes their job more difficult as well, since it's essentially anonymous when handled with gloves. Should we ban cash and gloves too?
Of course it is. You keep your front door locked, you might have a lockable screen door and bars on the windows, or a burglar alarm or a number of other security precautions, but I'm pretty sure you don't live in an impregnable fortress or a bank vault.
More likely you have a basic level of security that deters crimes of opportunity, perhaps even a serious burglar, and that is sufficient, same as for most people. Now maybe I'm wrong and you live in a decommissioned missile silo or some other super-hardened structure, but to pretend that there's no difference between basic security measures and leaving your front door unlocked is total BS.
Encryption isn't an impregnable fortress. It is a basic, locked door.
Say you use TextSecure. If someone installs malware on the phone and keylogs, they can capture your password. With the password and the phone, they can decrypt the messages.
I'm not sure why this is the equivalent of a bank vault to you? It can be defeated by some guy in his basement.
Law enforcement has shown the ability and willingness to use malware as well.
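A toy sketch of why keylogging the passphrase defeats the encryption. This assumes a generic password-KDF design, not TextSecure's actual internals: once the attacker has the passphrase, turning it into the decryption key is purely mechanical, because the salt is stored on the device next to the ciphertext.

```python
import hashlib

# Toy illustration (NOT TextSecure's actual scheme): an attacker who has
# keylogged the passphrase just runs the same key-derivation function the
# app runs, offline, and gets the same key.
def derive_key(passphrase: bytes, salt: bytes) -> bytes:
    # Any standard password KDF works; PBKDF2-HMAC-SHA256 for illustration.
    return hashlib.pbkdf2_hmac("sha256", passphrase, salt, 100_000)

salt = b"stored-on-device"          # the salt is not secret
key = derive_key(b"hunter2", salt)  # "hunter2" = the keylogged passphrase
# Deterministic: the attacker derives exactly the same key the app did.
assert key == derive_key(b"hunter2", salt)
```

The point being that the cryptography is only as strong as the endpoint: nothing in the math resists an adversary who already holds the password and the device.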
1) The app store is still controlled by the corporation.
2) Malware to capture the key is all that is required.
3) Law enforcement is capable of producing malware competently and legally forcing it onto machines.
4) The corporation can be legally compelled to install said malware via an automatic update [thereby disabling and/or capturing the key]. If that isn't an option, there are various other methods [such as taking the person into custody, installing the malware manually, then releasing them].
The only difference is the number of steps involved.
This is absurd. The complaint is that if service providers like Apple and Google maintain a backdoor to assist law enforcement, then it's wide open to any other attacker - fair enough.
Now you're saying it's just a basic locked door and it's trivial to break anyway and law enforcement can legally force malware onto machines if they need to do password harvesting, which to my mind sounds worse than having a backdoor in the first place that might be exploitable.
If I understand you right, you just want some additional steps for law enforcement to go through (although the legal step of getting a warrant should be sufficient; making it more difficult after that is obstructing a lawful search rather than an opportunistic one). But now you also potentially have malware in the wild, because it's not beyond the capacity of bad guys to set up a honey pot to attract the attention of law enforcement.
Sorry, I just find your argument fundamentally circular.
No, pedantry and fundamental misrepresentations are absurd. That is exactly what you are doing.
Let us start from the top to get this across:
Me:
> "Leave your front door unlocked. Just in case we want to search you for drugs/obscene material/weapons/etc."
I said people have the right to lock their door [encryption key -> encryption software == key -> door].
You:
> Except it isn't. When the FBI has a warrant they are legally entitled to break into your house even if you don't want them to.
Yes, they have the right to break into your phone. That doesn't stop me from having a lockable door. It is perfectly legal and has always been so. There was a fight about it in the 90s with PGP and such.
So you are agreeing with me. Except for the fact you put "except it isn't" without providing any actual proof it isn't. Because y'know, there is no actual proof I'm wrong.
Me:
> The key phrase there is "break in". As in, they acquire access without the consent of the owner. In this case, the "door" is the encryption key which they are expected to acquire and/or break so that they can gain access.
You:
> Your original claim was that you were being asked the equivalent of leaving your house unlocked for their convenience, which is a wild exaggeration.
That isn't a wild exaggeration. Plaintext stored accessible from the internet is the equivalent of leaving your house unlocked. It's basic, fundamental security. It's why you hash your passwords and store your backups in an encrypted container.
I don't understand why basic, standard precautions that any competent IT person takes are "wild exaggerations".
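A minimal sketch of the "basic precaution" being referred to, assuming a generic salted-KDF design (the specific function and iteration count are illustrative choices): never store the plaintext password, store a salted hash and compare on login.

```python
import hashlib
import hmac
import os

# Store a salted hash instead of the plaintext password.
def store(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)  # fresh random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

# Verify by re-deriving and comparing in constant time.
def verify(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = store("correct horse")
assert verify("correct horse", salt, digest)
assert not verify("wrong guess", salt, digest)
```

If the stored file leaks, the attacker gets salts and digests, not passwords; that is the whole point of not keeping plaintext reachable from the internet.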
Me:
> No, it isn't. If I don't have the ability to prevent 3rd parties from entering my house, it isn't a wild exaggeration.
If strangers can walk into my house without breaking in and do WTF they want, it isn't locked by any sane definition.
You:
> Of course it is. You keep your front door locked, you might have a lockable screen door and bars on the windows, or a burglar alarm or a number of other security precautions, but I'm pretty sure you don't live in an impregnable fortress or a bank vault.
Now that is a wild exaggeration. Encryption is not an impregnable fortress.
Me:
> http://www.wired.com/2013/09/freedom-hosting-fbi/
> 1) The app store is still controlled by the corporation. 2) Malware to capture the key is all that is required. 3) Law enforcement is capable of producing malware competently and legally forcing it onto machines. 4) The corporation can be legally compelled to install said malware via an automatic update [thereby disabling and/or capturing the key]. If that isn't an option, there are various other methods [such as taking the person into custody, installing the malware manually, then releasing them].
A couple of ways in which encryption can be circumvented (malware, two delivery methods)
You:
> This is absurd.
At no point did I say it was "trivial". You've resorted to misrepresentations, putting words in my mouth, etc. to argue with me.
The fact you are going to such lengths to argue against an obvious and accurate analogy [in the majority opinion, since I'm being upvoted while you are downvoted] is absurd.
You:
> If I understand you right you just want some additional steps for law enforcement to go through
At no point did I say that. What I have said this whole time is "Same steps as serving a physical warrant? Well, same for technology."
Get Warrant from Judge -> Serve Warrant [Force Entry if required] -> Find Evidence.
That is how physical warrants are handled and it is how digital warrants should be handled. That is how it works for computers too, fyi. Like desktops with full disk encryption.
Agreed, and it's shocking that this Washington Post article does not raise these points at all, or consult an "expert" that is not repeating the same PR rhetoric. For normal people to understand the issue and take a stance on it both sides should be illuminated. It could have been as simple as including Apple's or Google's original statements as to why they are taking this stance.
This story was likely published because the companies themselves told them to take a hike, and they are now trying to garner public sympathy. I don't think they are going to get it. Unauthorized intruders, whether or not they have a piece of paper signed by a guy that wears a robe to work, are not welcome in my phone.
I hope Apple & Google take a hard line on this, at least until the US government passes a law making it illegal to engineer things in a manner that makes it impossible for companies to comply with search warrants. I'm sure that will come, but hopefully it will take a while.
Step 1: Have the stenographers at the Washington Post publish your scary press release as "news" full of emotionally charged hyperbole like "Apple will become the phone of choice for the pedophile".
Step 2: After the next telegenic kidnapping / bombing / school shooting, claim "I told you so" as the perpetrator would have been stopped if not for a secure phone / encrypted chat / requirement to get a warrant.
Step 3: 95% of the public demands mandatory backdoors, criminalizing strong encryption, and warrantless dragnet surveillance.
I hope you're right. Outside of my tech-oriented friends, most people I've discussed surreptitious electronic government surveillance with are either unconcerned about it or actually for it. And these are highly educated people who should know better.
Is there not any rational reason for them to be unconcerned or for it?
They probably believe that they will not be targeted by it, that if an algorithm scans over their data that it's no big deal, and that it may lead to some criminals being caught. They can live their entire lives from start to finish, and whether their own communications were ever intercepted will likely have had zero impact on their life.
I'm not trying to be dismissive of people who are concerned, just wanting to point out that there are also valid reasons to take the other view.
These same people seem awfully shy when I ask to go through their photos, just to see if they have any of themselves using drugs.
But that's exactly it: if you leave a gaping hole for the feds to get through I (in the sense of a hypothetical attacker) can go through it too, especially with how easily most federal secrets leak out and the fact that rekeying is essentially impossible for this use.
Of course they're shy of you going through their photos. They know you personally, and you are a person, not an algorithm. Whilst it is possible that a person may go through their photos as part of mass surveillance operations, they believe it's unlikely to happen.
Your second point is true only if the access mechanism is reliant on weak security. It doesn't have to be.
>They probably believe that they will not be targeted by it
That would not be rational. They have no capacity (nor does anyone else) to predict which of their behaviors will be cause for targeting at some point in the future. Additionally, once they are profiled for their 'normal usage pattern', any life changes which cause statistically significant alteration of that pattern are likely to result in scrutiny.
For instance, you are currently involved in a conversation on a website called "Hacker News" with people whose backgrounds and motivations you do not know. The likelihood you're NOT connected to someone who is a criminal is probably very, very low. Everyone should expect to be targeted. If not today, then at any point for the rest of your life, because all of these stores are maintained forever.
A mass, non-algorithmic, deep-dive approach is simply not economically feasible, so I don't agree that everyone should expect to come under scrutiny. The approach taken has to be targeted. That's not to say that there won't be completely innocent people who get scrutinized, but to me it seems that the average non-criminal person would be quite correct in assuming that the likelihood of ending up scrutinized by humans is minuscule.
I probably should have mentioned in my original comment that looking at it this way is purely self-interested. In reality, perhaps there are non-average attributes (eg. particular religions) which might see clusters of innocent people scrutinized. People outside those clusters may or may not take issue with that. It depends on their values. I don't mean to suggest that one opinion or the other on this is "correct", just that there might be rational (albeit self-interested) views on both sides of the argument.
>so I don't agree that everyone should expect to come under scrutiny
I think we're just using the word 'scrutiny' differently. I count extensive computational analysis of behavior patterns to be scrutiny. My worries are not that a human being will listen to my calls, read my emails, etc. My worries are that a piece of software will do it. The piece of software knows exactly and precisely whether I deviate from the norm and it would have no tolerance whatever for such deviation. Yes, we could pretty much completely guarantee maintenance of the status quo with automated analysis and really very civilized and quiet means of disrupting communication. It would prevent revolutions, riots, terrorism, and all sorts of negative things. But it would also doom everyone to a quiet tyranny and completely prevent any improvements as well as detriments.
If there were a terror group operating in the US, killing 10,000 children a week in all states with bombs, chemicals, and biological weapons, and you tried to stop the government, police, army, and other federal agencies from doing everything possible to stop that and save the lives of 4-year-old kids, you'd likely find yourself in jail or dead.
People in America who grew up on Nintendo and Xbox don't have a clue how stupidly brutal the world can be. This is why these "highly educated people" are "actually for it".
I will grant you that these activities should be under the oversight of the US government, State governments, who represent the People of the United States. The FBI is such a federal government agency.
If a group of terrorists develop the capability to kill 10,000 children a week, they are probably in league with the same US government.
Just a question, why would the terrorists be targeting children in particular? Why not a mix of infants, children, adults, and elderly people? Your scum-sucking interest in over-reaching government intrusiveness is leading you to come up with demented and perverse scenarios focused on children.
If there was a terror group operating in the US doing all you claim, access to their phones would provide us with no protection whatsoever. We have absolute, concrete proof of this thanks to the 9/11 Commission Report. The NSA knew where Mohammed Atta was. They knew exactly what he was planning. And they kept it secret. They are so afraid of revealing their methods and capabilities that there is literally NO tragedy so great that they would be willing to speak up to prevent it. When the CIA and FBI both asked the NSA for their information about Atta, the NSA refused to supply it or even warn them.
I think people have a very good handle on how brutal the world can be. In fact, they believe it to be quite a bit more brutal than it actually is. There has been ONE really major terrorist attack in the US in the past century. And having access to every single one of their conversations did not do anything to prevent it. Having separate entrances for pilots and passengers on planes would have prevented it completely, though.
My guess is more that the DoJ has agreed with their collaborators to help repair their tarnished public images. Shouting loudly about not being able to unlock a phone and get its content is pretty meaningless when they still have total access to everything that ever transits the network to/from that phone.
But it will make some people think "good, these companies are finally standing up for us" which reduces the chances that people will look into products and services that offer end-to-end encryption or from companies that aren't as enthusiastically cooperative.
Democracy has turned into a sham. Its inadequacy is cracking the world. The signs of it are everywhere. Since Snowden most of us have felt the world become darker. There were suggestions before that of course but since then there has been no doubt that the United States has become malevolent.
I think you and I and other people here will be lucky to get to the latter half of the century without being scathed. A hard reboot is on the menu.
Choose wisely yourself. Personally I am leaving. There is no hope for reform but to paraphrase The Road, we can carry the fire.
Apple, Google, EFF, and lots of others disagree with you. The right to use legally available tools to protect our privacy is just as valid and important as the right of the government to prosecute wrongdoers.
The quote that plants this firmly in pretending-to-be-serious land: “The average pedophile at this point is probably thinking, I’ve got to get an Apple phone.”
That was my thought, too. That assertion is a re-working of the sentiment that only people who have done something wrong have something to hide, and it's a dangerous one.
Let's be honest, improved encryption is going to restrain the government's ability to enforce the law. Beyond pedophiles, there are definitely going to be cases where innocent people get hurt as a result.
I'm okay with that. The whole idea of our government and society is that the mass of law-abiding and decent people are stronger than the criminal and malicious minority. People are by and large responsible, which is why they can and should govern themselves. Limiting the government's ability to snoop and intrude on citizens is a crucial check on the very real (if long-term) threat of government over-reach.
But let's not kid ourselves that our privacy, and its constraints on the government, is without consequences.
The government currently kills people as part of exercising its policing powers, which most people agree are used overly aggressively.
If this lowers that rate, it's possible that the difference in beatings, shootings, and home invasions will basically even out. There isn't a large portion of the criminal element waiting for better-encrypted cellphones before doing these things (hint: other things about cellphones make these problematic, and most prosecutions depend on other evidence anyway), whereas there is reason to think that police being restrained from illegal investigation methods will decrease the rate at which police use other illegal investigation methods.
People that are OK with innocent people getting hurt never seem to realize that this usually guarantees that criminals get away with their crimes by the wrong person being convicted. Is that also something you are OK with?
Maybe he is trying to explain why so many people in his department have recently purchased Apple phones. Latent criminal tendencies are the only possible explanation.
Hey FBI, if you want the data on the phone, get a Warrant that requires the person who owns the phone to unlock it for you - or go to jail for not complying. You don't go to a next-door neighbor who has the key to a house and serve him the Warrant to get into their neighbor's house. Don't go to Apple or Google and make them unlock the phone. 4th Amendment, unlawful search and seizure and all that jazz.
Unless they know that the owner has the password, and can recall it, this potentially infringes on the 5th amendment. This has been tested repeatedly in court, although not always with the same result.
Forced key disclosure is legal in the UK, with up to 2 years (5 if terrorism is involved) in prison if you fail to comply. In the US it's more iffy and you'd be trying your luck with the judge.
Not sure if you can do this with Location Services on iOS, but I'd pay for an app that destroys my private key on the device if it detects I've been within X feet of a police station for longer than Y minutes.
You can then have my passphrase. It's no longer any use.
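A hypothetical sketch of the idea in the comment above: wipe a key if the device dwells inside a geofence for too long. The coordinates, radius, and dwell time here are made-up parameters, and this ignores real-world complications like GPS spoofing and the OS's background-location restrictions.

```python
import math

# Made-up geofence parameters for illustration only.
RADIUS_M = 150.0                 # geofence radius in metres
DWELL_S = 300.0                  # how long you may dwell before the key is wiped
STATION = (47.6062, -122.3321)   # example (lat, lon); not a real rule

def haversine_m(a, b):
    # Great-circle distance in metres between two (lat, lon) pairs.
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(h))

def should_wipe(fixes):
    """fixes: iterable of (timestamp_seconds, (lat, lon)) GPS samples, in order."""
    entered = None
    for t, pos in fixes:
        if haversine_m(pos, STATION) <= RADIUS_M:
            entered = t if entered is None else entered
            if t - entered >= DWELL_S:
                return True      # dwell exceeded: destroy the on-device key
        else:
            entered = None       # left the geofence: reset the dwell timer
    return False
```

Driving past the station resets the timer; only a sustained stay inside the radius triggers the wipe.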
FTA: The irony, some say, is that while the legal and technical changes are fueled by anger over reports of mass surveillance by the National Security Agency, the consequences are being felt most heavily by police detectives, often armed with warrants certifying that a judge has found probable cause that a search of a smartphone will reveal evidence of a crime.
I believe Apple has implemented things so that they cannot divulge the necessary data. Having it and withholding it would be contempt of court, which has unlimited penalties.
There was a WaPo article[0] posted on HN a while back:
>If I understand how it works, the only time the new design matters is when the government has a search warrant, signed by a judge, based on a finding of probable cause. Under the old operating system, Apple could execute a lawful warrant and give law enforcement the data on the phone. Under the new operating system, that warrant is a nullity. It’s just a nice piece of paper with a judge’s signature. Because Apple demands a warrant to decrypt a phone when it is capable of doing so, the only time Apple’s inability to do that makes a difference is when the government has a valid warrant. The policy switch doesn’t stop hackers, trespassers, or rogue agents. It only stops lawful investigations with lawful warrants.
> Under the old operating system, Apple could execute a lawful warrant and give law enforcement the data on the phone. [...] The policy switch doesn’t stop hackers, trespassers, or rogue agents. It only stops lawful investigations with lawful warrants.
Incorrect: under the old operating system a human at Apple could give anyone the data on the phone. Apple had procedural safeguards that aimed to ensure that no one at Apple would do that without proper approval, but as anyone who pays attention to the news should be aware, procedural safeguards of data do not always prevent humans who are motivated to violate those safeguards. (Including, inter alia, "hackers, trespassers, and rogue agents" -- as certainly the US government is aware; it's hardly as if the NSA didn't have procedural safeguards that applied to the data that Edward Snowden released.)
The change (which is a technical change, not a policy change) means no human at Apple can do this, and the phone user is protected from humans who might violate Apple's policies, including "hackers, trespassers, and rogue agents." It incidentally means that any warrant for data on the phone will have to be served on the only person who has access to that data, the phone owner.
It doesn't "stop lawful investigations with lawful warrants", it reduces the number of people who can access a phone, and therefore the set of people on whom a lawful warrant can be served. But law enforcement has no inherent right to expect that some third party will have access to my private property (whether physical or virtual) that enables law enforcement to serve a warrant on a third person, and that third person having access is always a compromise of my security that makes me more vulnerable to rogue actors. It is not simply a convenience for "lawful investigations with lawful warrants" that provides no risk to me outside of such investigations.
> and that third person having access is always a compromise of my security that makes me more vulnerable to rogue actors.
You distill it well, a broken security solution is not a good security solution. Foreign intelligence services, "hackers", etc... they all want our trade secrets. We need real security systems to defend against them.
Funny: "That led investigators to a Facebook post, made two days after the homicide, in which another man posed in a cell phone selfie with the same gun."
You'd think a selfie would be enough to find someone the traditional way, but they seem to think they needed to locate the phone that took the picture.
It's possible to positively, uniquely identify a gun from a selfie? Was the selfie of the gun, showing its most intimate private parts: the serial number?
So is this whole thing REALLY about the US Government losing access or more about them not wanting to expose their over-the-air code execution techniques?
Since presumably if they capture an already running device, they can just get the warrant, and ask the cellular network to send their specially crafted packet (which can unlock the phone, SMS the encryption key, or similar).
Sorry but if the Intelligence Services cannot unlock an encrypted device STILL RUNNING then I'll eat my hat.
> So is this whole thing REALLY about the US Government losing access or more about them not wanting to expose their over-the-air code execution techniques?
Obviously, the latter.
More to the point, the type of regular police who would push for this probably aren't even aware that such code execution techniques might exist.
He said he could not understand why companies would “market something expressly to allow people to place themselves beyond the law.” - I guess the FBI and the NSA don't talk much?
Hey if the FBI were obeying the law, then I might have some concern for them. But they aren't. They're going out there, organizing groups of people, setting them up with terrorist plots, providing them with material and financial support, and then busting them. Whether that's entrapment or not, I don't care--- it is a violation of anti-terrorism conspiracy laws.
Not to mention the NSA, and I believe FBI, conducting illegal spying operations.
Government has proven it cannot be trusted, now complains when companies stop trusting it?
Fortunately as yet there is no way for the USG to grant a monopoly on the manufacture of mobile devices in exchange for access to the contents. An alternate route they can take is through the monopolies granted on various radio frequencies. FCC approval in the future may require backdooring.
Would that prevent the end user from adding another layer of security onto the device? Kind of like how you can log on to my machine, but that doesn't make it any easier to decrypt my Truecrypt partition (I think...).
If it communicates over the network though, all of your traffic would still be recorded and decryptable. You'd have to exclusively use end-to-end encrypted services.
This is pretty funny. It will be interesting to watch the narrative, and it is surprising how overtly partisan it is. I just wish I had a big PR budget I could use to place articles in the right places to adjust the narrative.
Well, if they have a well-formed warrant they actually do have that right. The constitution contemplates the issuance of warrants precisely because there are circumstances where seizure is entirely appropriate.
You are confusing a right with a power, granted by consent of the governed, that can be withdrawn and can't rewrite the rules of mathematics. People have rights, and under constitutions written the way the US constitution is written, those rights are not enumerated or limited.
You have a right to use strong encryption because you have not consented to give away that right. The government has no actual rights at all.
Not just the NSA. The regular ol' police gives off the appearance of being at war with the citizens it's supposed to protect sometimes. They are extremely militarized, very technologically advanced, but not always fair and even-keeled.
The FBI is more than talented enough to get into phones, or areas, they just need to invest more time if something's locked. Which honestly, is a good thing. It means they'll have to actually choose on a targeted basis who is worth surveilling, which is more conducive to democracy than just saying, "Give us all your keys". An easy system to crack is an easy system to crack, period.
When I made a point about Orrin Kerr shilling for the law enforcement agencies https://news.ycombinator.com/item?id=8349006 one or two people didn't agree. I guess it is now obvious what that piece was about.
The disingenuous dimension to this issue is that smart phones are basically powerful hand-held personal computers which make calls and take photos on the side. Less than 1% of their capability is used for making calls. Orrin Kerr pretends not to understand this. He wouldn't insist that Microsoft, Apple and Linux developers weaken the encryption on regular laptops, desktops or servers for law enforcement to have access to data.
Why should he insist on it for smart phones which these days are just as powerful as PCs, the only difference being that people carry them around, make calls and store contact details on them?
Steve Jobs and Apple's desire to have complete control and access to users' information brought this situation about, and in doing so made them virtual accessories to whatever crimes people stored evidence of on their phones. Now that they realize it made them appear to be agents and collaborators of the investigative agencies, they have decided to extricate themselves from that situation, leaving them close to being labelled 'pedophile and terrorist facilitators'.
It's nice to see the media digest and swallow this bull from the FBI & co without bothering to even chew it a little first. "Oh the FBI said something, IT MUST be true!".
All it requires is for someone to find that backdoor & its mechanism and that's it, they can exploit the backdoor designed for someone else for their own purposes. And if that someone doesn't have legitimate intentions, what then? Am I supposed to accept making my device that much less secure because the FBI or another state actor may one day decide I'm trouble? The chance of that backdoor being exploited by someone other than the FBI is considerably higher than the FBI ever using it themselves, and the FBI seem to be in lalaland on this.
I think that under cover of all this noise, both OSes are being stuffed with more backdoors than at any time before.
Ouch, those Apple and Google guys are so bad that they're following the Fourth Amendment; the poor, evil FBI and NSA can't do anything with American companies. They're soo-o-o angry about the OS security that you can definitely trust it.
Wrong, FBI: this is a human rights issue, pure and simple. Our devices augment our minds, and you want to read our minds. And we don't want you to.
It is disingenuous to imply (let alone assert, as in this case) that you can't investigate criminal activity without the ability to access our phones/read our minds. Somehow you managed for many decades prior to the iPhone. To claim you need this access now is rather silly.
I would suggest that you allow people their private, augmented minds, enjoy the better society that will (hopefully) give us all, and pursue other investigatory avenues, of which there is no shortage and which should readily occur to you.
>He said he could not understand why companies would “market something expressly to allow people to place themselves beyond the law.”
Perhaps he's entirely unaware of any history where "the law" was used to oppress. Perhaps he's also unaware that something being "the law" does not make it just. For instance, in Washington, where I live, just two years ago it was illegal to own marijuana. Now it's legal. Does this now mean that the moral status of owning and smoking marijuana has changed, or does it merely show that the law is inconstant and often at odds with justice? Much is taken for granted in his rhetorical question.
I can't wait to see the FBI point the finger at the NSA and say something to the effect of "The FBI is trying to protect you; we aren't the NSA violating your rights."
So here's what I don't get: This is just standard, off-the-shelf encryption, right? Like the kind I've been using on my desktop since forever? How is this such a big deal?
I realize that from a PR perspective, Apple would like to appear to be privacy conscious. And perhaps Apple was asking for legal fallout by flouting that particular aspect of correctly implemented encryption. It just seems a little too ironic that this is just standard off-the-shelf encryption we're talking about.
The technology has been there for a while, it's just finally gotten to the point where people's mistrust of the government is something tech companies can market to.
It would be an awful lot easier for police to catch criminals if they were permitted to simply kick in random doors and raid houses just to poke around and look for evidence of crimes, too. Police do not have an absolute right to easily-gathered evidence. Especially given the malleability of digital content, they should be relying on actual physical evidence more than ever anyway.
What do the watchers think we should do when we can't even have an open, rational discussion about constitutional rights, especially regarding new technology and information? I am willing to discuss and debate. My data was yours with a warrant. The tools to go beyond that weren't mainstream. By refusing to communicate, the internet is working around you. Rightfully so.
> “market something expressly to allow people to place themselves beyond the law.”
Why isn't he complaining about Starbucks? Conspirators can meet in a Starbucks and discuss illegal things over coffee. If Starbucks cared about people, they'd bug all of the tables.
The funniest part is that as a customer, I still have very little guarantee that the company can't access the phone's content at will; it's still promises to me.
> “Our ability to act on data that does exist . . . is critical to our success,” Hosko said. He suggested that it would take a major event, such as a terrorist attack, to cause the pendulum to swing back toward giving authorities access to a broad range of digital information. - related article ( http://www.washingtonpost.com/business/technology/2014/09/17... )
USA, get ready for a "foreign terrorist attack" in the next few days (;
Of course they're outraged, in the same way that a serial burglar is outraged when they discover that someone's backdoor, which has been wide open for years, is suddenly locked. What's their outrage got to do with anything?