We've been round and round this before with the "Clipper chip" and their "Law Enforcement Access Field".
Yeah, I'd say the post is shockingly blind to empirical evidence.
Even if this were law, it would be trivial to bypass. This type of law isn't just tyrannical, it's impossible to enforce. So you're absolutely right in saying it would be selectively enforced.
Every tech founder needs to stand up to these attempts at control.
Make the hypothetical law: "Use of encryption without a golden key is an automatic assumption of guilt."
All it takes is a complicit supreme court.
And a change of constitution.
Specifically, unreadable speech is still protected by the 2nd Amendment.
Just make your cipher output speech (gibberish words, form of steganography) and you're set.
As a reminder, this is how Zimmerman's PGP was exported. Printed in a book.
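To make the "cipher output as speech" idea concrete, here's a rough sketch in Python. The syllable table and framing are made up for illustration; note this is just an encoding layer, not encryption itself -- the bytes you feed it should already be ciphertext from a real cipher.

```python
# Sketch: encode arbitrary bytes as pronounceable gibberish "words".
# Hypothetical syllable alphabet -- not any real stego scheme.
CONS = "bdfgklmnprstvz"   # 14 consonants
VOWS = "aeiou"            # 5 vowels -> 70 consonant+vowel syllables
SYLLABLES = [c + v for c in CONS for v in VOWS]

def bytes_to_words(data: bytes) -> str:
    # Each byte (0-255) becomes two syllables: byte = 16*hi + lo
    words = []
    for b in data:
        hi, lo = divmod(b, 16)
        words.append(SYLLABLES[hi] + SYLLABLES[lo])
    return " ".join(words)

def words_to_bytes(text: str) -> bytes:
    # Inverse mapping: each 4-letter word decodes back to one byte.
    out = []
    for w in text.split():
        hi = SYLLABLES.index(w[:2])
        lo = SYLLABLES.index(w[2:])
        out.append(16 * hi + lo)
    return bytes(out)
```

The output reads as runs of nonsense words ("beka fobu ..."), which is closer to "speech" than base64 -- though a statistical test would still flag it as unnatural language.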
Maybe you mean the first amendment?
You wouldn't need to change the constitution if the arbiter of what's constitutional (i.e. the Supreme Court) were complicit.
I think that would lead to "crypto-encryption", where the fact of encryption is hidden.
Encrypted data looks a lot like random noise, which might be a good basis for hiding it in regular traffic.
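One rough way to see the "looks like random noise" point: measure the byte-value entropy. Well-encrypted data sits near the maximum of 8 bits per byte, while plaintext sits well below it. A quick sketch:

```python
import math
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte (8.0 = indistinguishable
    from uniform random at the byte-frequency level)."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

English text typically comes out around 4-5 bits/byte, while `os.urandom` output (a stand-in for good ciphertext) lands above 7.9 -- which is also why a censor scanning for "high-entropy traffic" would flag ordinary compressed files too.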
The term you're looking for is "steganography". Pretty fun to read about the existing work, including deniable filesystems.
The Chinese sure enforce the hell out of it--"you have illegal crypto on your device!" <bullet to the head>
It is. It's so feasible, and even probable given the combo of government aggression plus citizen apathy, that I did a design exploration of high-assurance lawful intercept just in case we were forced to add it. I'd rather be ready with something good than whatever purposely-0day-ridden crap they come up with. It was also partly inspired by the judge in the Lavabit trial, who offered to consider an alternative if Levison could produce it on the spot. He asked for money and time, whereas a pre-certified solution that meets a warrant's requirements might have gotten accepted.
I just revised that proposal from 2014 for better clarity:
Furthermore, it strikes me as very likely that Hanlon's Razor applies to these 'ignorant authoritarian extremists': they are actually ignorant.
Don't play chicken with idiots.
Nice discussion, but the central call to action -- challenging technically ignorant authoritarian extremists in the legislature like McConnell and Feinstein to do something horrible, trusting that they truly won't do it at the end of the day because uhhh their "desire to do the right thing?" -- this is naive tech ideology at work. Realistically, the golden key law is totally feasible, politically viable, and should scare all of us. It would not be used to ban all encryption (which is impossible) but rather to target undesirables selectively and to ensure that either every person's every communication is recorded forever by the government so that they can be prosecuted for their speech later at will, or that they are put on a list of criminals for using banned encryption, so that they can be prosecuted later at will.
No tech person should ever breathe a word of support for this, especially not to play a psychotic game of chicken with people who would permanently disable their brakes in an instant if it would fire up their base for a news cycle.
Even better, the HDMI encryption system was made to handle that leak - rather than have one key, each manufacturer would have one key and then they disable any leaked keys.
Worked wonderfully, until somebody posted the master key matrix...
DeCSS was more about the CSS scheme being trivial to break, wasn't it?
We went through it in Coursera's crypto-1 course, as soon as you can see how it works you can see how to break it pretty trivially and recover the key.
In retrospect, yes it's easy to break. Getting to that understanding, however, was greatly facilitated by the aforementioned key leak. I recall the great excitement when it happened, and the subsequent proposals of making that number "illegal" (discouraging dissemination).
Upshot, regardless of technical details: we've been over the "golden key law" concept many times thru several decades, with numerous dramatic examples of why it's a really really bad idea. We can continue arguing about what exactly went wrong then, and why "it will be different this time", but if we've learned anything it's "no, it won't be different this time."
Never forget the great TSA Master Key fiasco.
To facilitate airline baggage security while easing bag searches, the TSA approved locks which could be opened with a master key (well, one from a small set of keys). These keys, like the "golden key" in the indicated article, were carefully guarded. Anyone using a luggage lock that either wasn't TSA-master-key-compliant, or not already open, was of course considered with great suspicion.
Then someone wrote an article about it. And talked a TSA security agent into posing with the keys displayed. http://www.extremetech.com/wp-content/uploads/2015/09/CM8Naj... was published, and within hours people were 3D-printing copies.
Never forget the great DVD CSS fiasco.
To ward off unauthorized copying, DVDs were encrypted. Authorized playback devices were given the secret decryption key. Controlled properly, this key would never be actually revealed.
Then someone didn't control it properly, storing it unsecured in a playback product. The key was found, copied, disseminated, and DeCSS software proliferated. It was even printed on t-shirts: https://c2.staticflickr.com/4/3451/3228250152_cf84bbd87d_b.j...
No, it's not "going to be different this time". Universal back-door keys get compromised/copied/disseminated eventually.
There's a huge difference between bomb making and crypto. To make a bomb, you need to gather materials and then carry out some difficult, dangerous steps to turn them into explosives and assemble a bomb. Then if you want another bomb, you have to do it again. If you want a bigger bomb, you need more materials. Consider that any random person can easily buy some ammonium nitrate, but if you try to buy enough to blow up a building without a legitimate reason, you'll run into major obstacles with the law.
Crypto is different. Once you set up crypto, you have it forever. Crypto that can encrypt 1kB can also encrypt 1TB. Crypto that works for one person can easily be made to work for a million people. Computers handle all the hard parts, and there's no obstacle to scale.
It's true that the government could require big companies to put in a master key. The problem is that criminals would trivially bypass such a requirement. There's no computer equivalent to requiring all ammonium nitrate sellers to report sales. And in the other direction, there's no physical equivalent to the havoc that would occur if the master key ever leaked. The risks and benefits are completely topsy turvy for a master key scheme.
Note that the government doesn't really try to restrict bomb making instructions, just bomb making materials. With crypto, the instructions are the only thing there is. There's no such thing as crypto materials, unless you propose to regulate CPUs in general.
> There's no computer equivalent to requiring all ammonium nitrate sellers to report sales.
Say, perhaps, every processor had a second processor attached, one that would run code not modifiable by the owner? With little public information about its full capabilities? That had full, unfettered access to the primary processor, its memory, and its I/O ports? And insist that it's mandatory to the function of the primary processor?
Sounds a lot like Intel's ME?
How does the ME determine if I'm running an encryption algorithm using a legal, compromised key vs. running illegal encryption with a non-compromised key? And more importantly, how does this ME-turned-spy actually determine what is encryption and what isn't? It'd have to determine on the fly whether all the operations on your computer were encryption-like, because otherwise just tweaking the code so the outputs are identical but the instructions and memory accesses are vastly different would completely bypass the ME-snitch.
Yep, but you're not going to do something like that with all the embedded CPUs out there, and they're still quite capable of crypto.
You also don't have to build crypto from first principles, we have plenty of reusable software components at all layers of computing, that implement strong crypto.
This whole thing is a straw man.
It is we the people who should be dismantling the security state, in order to solve the terror/war problem - we sure as hell don't need career experts dictating this. The public truly needs to fight back on this.
Crypto is Math.
Broken crypto is broken Math. But Math is broken anyway, because we can't really stop people using Math to hurt us.
After crypto is banned, what next? The Golden Rule?
We can't have Math, because "Terrorism".
The only solution is an open and honest society, in which sufficient resources are applied, at the human being level, to ensure that terrorism doesn't happen.
Like, we could deploy battleships with bombs all we want, but those same battleships would go a long way towards actually making peace in the world, by healing it instead.
But, we "can't do that", apparently, because "Threats/Force/Terror are more important".
Let we, the people, continue to use crypto to undo the need for such power structures as the NSA, and all the rest of the spooks ..
> Just like they control the sale of explosives, for public security, they can also require Microsoft, Apple and Google to put in a master-key. And I don't see a problem with that.
The problem is that there's no good way to control that master key. What do you do with it? Lock it on an encrypted USB drive in a fire proof safe bolted to the floor under the google headquarters?
That works until a police agency wants to use it. And then another.
How many people do you think you can reasonably give this key to before someone who isn't "supposed" to have it gets their hands on it?
What happens when a criminal gets his hands on the master key that unlocks every encryption made by apple, google and microsoft?
Regulate encryption like you regulate heavy weapons. You can carry a gun, but you can't carry an assault rifle. But you can get special permission for special cases.
Yes, the master key for your SMSes will be leaked pretty quickly, but every business with a legitimate use for encryption will be able to use it. In the end your bank account will not get hacked, terrorists will continue using strong encryption, but law enforcement will be pacified.
"every business with a legitimate use for encryption" is "every business" (SSL and SSH, remember?). "Every business" is "every person", given sole corporations.
> In the end your bank account will not get hacked
Equifax are still trying to explain to congress whether or not they are encrypting data: https://www.wsj.com/articles/equifax-ceo-to-congress-not-sur...
> terrorists will continue using strong encryption
Terrorists don't generally use encryption, surprisingly. There have been very few incidents recently where the authorities have found themselves with data they were unable to decrypt that might be relevant. The only one I can think of is https://en.wikipedia.org/wiki/2015_San_Bernardino_attack
Regulating math is not something that usually has a good outcome. See the 1897 Indiana Pi Bill for a good lesson in unintended consequences.
You can't take a gun, put it into a machine, push a button, and have the machine make you a million guns. If you could, gun control would go from difficult and incomplete to downright impossible. That's how encryption is now.
Are you familiar with Defense Distributed, DEFCAD, and the Ghost Gunner? That stuff is already here. Every time I hear someone talk about banning "high capacity magazines" I think how trivial something like that is to 3d print...
I've heard about them a little. This stuff is definitely getting easier, but there's still a world of difference between the difficulty of replicating a gun and the difficulty of replicating a copy of crypto code.
Many banks send 2FA codes via SMS, so I question how my bank account will not get hacked if the master key for SMSes gets leaked "pretty quickly".
> Regulate encryption like you regulate heavy weapons.
They were at one time. Encryption was regulated under the ITAR laws until 1997.
Didn't the US try that one already?
I believe so: https://en.m.wikipedia.org/wiki/Clipper_chip
I was thinking more of https://en.wikipedia.org/wiki/Pretty_Good_Privacy#Criminal_i..., actually, but that's a good example too.
Hi, this is not how encryption works. Please go read up on the subject before participating in this discussion.
Until you can kill someone with encryption this is absolute bullshit FUD. Are you a paid shill?
> they can also require Microsoft, Apple and Google to put in a master-key. And I don't see a problem with that.
How much do you suppose such keys would be worth? Trillions of dollars?
Every nation-state and criminal organization on the planet would be after those keys. They'd use every technical and social trick in the book, including assassination, to get them.
The loss would be inevitable, swift, and disastrous.
Have a high schooler generate a one time pad. Unbreakable security in a simple system based on relatively simple mathematical concepts.
Yeah let's just walk around with our pants down because belts can be used to strangle people.
You can't download explosives. You can't type text into any computer and have it generate explosives.
Encryption is not a physical object, which makes it completely unlike explosives. Not only that, there isn't even a specific piece of software that is encryption, it is simply a series of math functions you run on data.
You can make all the rules in the world about companies having to have a master key. People have already downloaded PGP, it is too late to put that back in the bag.
Unfortunately, even governments are unable to secure the private-key part well enough. Add the risk of outdated devices with compromised keys, and it would be catastrophic, forcing everyone to buy newer overpriced hardware without, in the end, adding anything to our privacy.
> But coding these algorithms is no joke.
They've already been written. RSA isn't going to magically disappear.
Open source implementations of quality crypto aren't going to up and vanish, just as how instructions to make bombs won't vanish.
Ostensibly, the crypto implementations available today will be good well into the future, barring quantum computing advances.
Implementations suffer not just from actual bugs, but from implementation choices that leak information (power consumption, timing data, etc...)
Crypto implementations available today are only good until they aren't, and it's impossible to say when that might happen. For example, a new compiler optimization might expose something that wasn't previously exposed.
Side channel attacks are generally irrelevant for the scenario where a terrorist is using their own electronics to communicate with their fellows.
Which is the oft-cited reason we need government key escrow...
These situations do exist, but that doesn't change the point that the genie is out of the bottle.
People who want this software are going to be able to have it. End of story. If a timing attack happens to pop up in it, great, that's a win for the government. But by and large, these messages will go without decryption.
As with anything: these measures will not stop the determined lawbreaker - not that they necessarily need to.
> Don't accuse others of astroturfing or shillage. Email us and we'll look into it.
> In a December 2015 Monday Note titled Let’s Outlaw Math, I mocked our government officials and Law and Order public servants for their obdurate disregard for a fundamental mathematical property that makes well-designed encryption unbreakable
This is the wrong way to think about it. You can say similar things about building a bomb. The steps to manufacture explosives from common materials are simple if you are good at chemistry. Likewise, this mathematical explanation makes encryption sound like a high-schooler could do it with an abacus. But it's not. It's simple for a person to understand, but coding these algorithms is no joke.
Now if you want the government to not impose restrictions on crypto software, that is the equivalent of wanting to lift controls off explosive substances. Yes these are fundamental truths about the world (mix a and b and get an explosion, multiply prime numbers and you can't factorize), but that does not mean the government cannot try to control them. Just like they control the sale of explosives, for public security, they can also require Microsoft, Apple and Google to put in a master-key. And I don't see a problem with that.
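For what it's worth, the "multiply primes and you can't factorize" asymmetry the parent alludes to fits in a dozen lines. A toy RSA sketch using the standard textbook parameters (tiny primes, purely illustrative -- real RSA uses 2048-bit-plus moduli and padding, so never use anything like this):

```python
# Toy RSA: multiplying p*q is one instruction; recovering p and q
# from n is the hard direction that the whole scheme rests on.
p, q = 61, 53                 # tiny primes (textbook example values)
n = p * q                     # public modulus: 3233
phi = (p - 1) * (q - 1)       # 3120
e = 17                        # public exponent, coprime to phi
d = pow(e, -1, phi)           # private exponent: modular inverse of e

msg = 65                      # a message, as an integer < n
cipher = pow(msg, e, n)       # encrypt: m^e mod n
plain = pow(cipher, d, n)     # decrypt: c^d mod n
assert plain == msg
```

This rather supports the parent's point in one sense (the math is simple) while undercutting it in another: the hard parts -- constant-time arithmetic, padding, key sizes -- are already done in widely available libraries, so nobody has to code them from scratch.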
I've certainly considered proposing this kind of "smash the piggybank" protocol, but it also needs to come with penalties for abuse and mandatory compensation for the value of the device. Otherwise it's going to be just like TSA smashing your luggage.
This type of thing needs to be considered, because I think people are really fooling themselves if they think Apple, et al won't be required to do this within the next 5-10 years.
There will be an incident that turns public opinion on this to such a degree that it will be inevitable.
How would it even work? How would some random hardware feature stop me from using PGP in user space?
It wouldn't. But convincing people may be easy. Just wait for the next bomb to go off or something, then say they were using encryption in their communication.
For example, by logging all of your keystrokes / input actions and storing them. (edit: this is why people are so concerned about things like Intel ME)
Logging them where? Intel ME doesn't have some huge storage location or anything.
Well, sending them over the network in real time is one option, or if we are talking custom hardware as in this discussion, then it could be made with local storage (and wouldn't need to be all that big...).
Sending them over the network would be trivial to detect and block, since you could just look at the outgoing packets on the switch the computer is connected to.
That can be blocked. Not to imply that your concern is invalid, but it would be difficult to enforce something like this when a countermeasure can easily be created.
No, this sort of thing could only be done in hardware.
But what would stop someone from skipping the hardware encryption and just using a different encryption program? It would be technically impossible to prevent someone from using a different encryption scheme that doesn't touch this hardware encryption device.
Nothing stops that.
It's the same as with the old telephone system. The government can do a wiretap but you are always free to use an encrypting telephone if you are worried about that.
I've tried to think of something that hardware manufacturers could do to grant access to law enforcement and minimize abuse potential.
The best I could come up with would be to make decryption possible iff law enforcement had physical possession of the phone and if the act of decryption would make the phone unusable after (e.g. hardware access requires blowing some fusible links) and if the acquired data was still encrypted with a key that only the device manufacturer can provide.
I'd prefer to see no concessions made, but if decryption is going to be required, it should be expensive, require possession of the device and the cooperation of multiple parties.
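The "cooperation of multiple parties" requirement above could be as simple as key splitting: XOR-split the escrow key between, say, the manufacturer and a court, so that neither share alone reveals anything. A sketch of the 2-of-2 case (not Shamir's general threshold scheme, and the party names are hypothetical):

```python
import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """2-of-2 XOR secret split. Each share alone is uniformly random
    noise; only the XOR of both shares recovers the key."""
    share_a = secrets.token_bytes(len(key))          # e.g. held by manufacturer
    share_b = bytes(k ^ a for k, a in zip(key, share_a))  # e.g. held by court
    return share_a, share_b

def combine_shares(share_a: bytes, share_b: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share_a, share_b))
```

A compromise of either party's vault then yields nothing usable, which raises the cost of abuse without making decryption impossible under a warrant. (It doesn't, of course, answer the thread's larger objection that escrow keys leak eventually.)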
> the FBI backed off, probably fearful of the PR consequences.
There was also a PR battle involved and Apple won.
Defending encryption is hard, because it is primarily a PR battle and the enemy always has the high ground. Notice how all these cases hinge on some terrible crime: terrorism, human trafficking, etc. Because the govt then gets to say "Aha, so who wants to stand up and defend terrorists!? Nobody? That's what we thought, so let's pass this new law then".
But what Apple did (and kudos to their PR team) is turn it around and say it wasn't just a 1st Amendment issue, but also a practical personal safety issue. Not having encryption means being exposed to identity theft and fraud. It is not just something abstract but a specific and real danger that everyone has either experienced or knows someone who it happened to.
Read it here: https://www.apple.com/customer-letter/
It is really a great example of good PR and a good punch back in the encryption battle. It helps sometimes when a tech giant throws their weight behind this.
... and overestimates legislators' attachment to principles.
I think this writer dreadfully underestimates just how little legislators know about complex subjects, and how unwilling to learn they are.
They already tried, with the Clipper chip. It was clearly a very bad idea, and flawed, but that won't stop more attempts (and with software, the costs are much more hidden, and the flaws are probably more hidden as well).
It's quite possible for legislation involving key escrow/recovery, import controls and mandatory sentencing for using noncompliant crypto and so forth to pass the current senate and house, regardless of the technical shortcomings of the solution.
My belief is that the LEAs are waiting for a sufficiently egregious event involving crypto so they can push through legislation rather than attempt shaky arguments in court with the current laws. Guessing that the phone involved in the recent shooting was unlockable, at least initially, and didn't contain anything sufficiently interesting to make political hay out of.
On the one hand, there's already a lot of US legal precedent for encryption as a form of arms, in the form of laws that classify encryption technology as such for export purposes.
On the other hand, there's even more US legal precedent demonstrating the Supreme Court thinks that the 2nd Amendment applies very narrowly to a very specific class of arms. The Supreme Court fairly recently decided that my city's former ban on handguns is unconstitutional. This decision has not, however, been construed to also cover local bans on things like throwing knives. Nor, I think, would it cover the bans on things like small rockets, even though they are usable as (and originally invented to be) weapons.
I'm inclined to say that the latter body of precedent is the one that the Court would choose to lean on if this were cast as a 2nd Amendment issue.
> I'm inclined to say that the latter body of precedent is the one that the Court would choose to lean on if this were cast as a 2nd Amendment issue.
Like I said: emotionally, if not constitutionally. For legislators, what the courts say is immaterial, if voting for such a bill is likely to make them unelectable.
I've been making that argument for some time now. The similarity between "responsible encryption laws" and "responsible gun laws" is stunning.
Just wait until all the dirty tricks from the gun control debate start getting used against encryption. Lots of cognitive dissonance incoming for many in our community...
We have "responsible gun laws" from sea to shining sea and they haven't helped much. Background checks are the greatest false sense of security ever invented. The people who call for responsible gun laws really just want guns banned. Which not coincidentally is what the people who want responsible encryption laws to do also--they have wanted to ban crypto since Zimmerman poked them all in the eye back in the '90s. It truly amazes me that we are even having this debate again!
Either we are free people able to do as we please with our property or we are serfs on a plantation. Free people have both the right to self defense and access to the proper tools without them being dumbed down.
I really doubt it. The NRA should have been a natural ally in the Philando Castile case, for example, but they weren't. It might have worked back in the 90s when they were all about resisting the government and complaining about "jack-booted thugs" in uniform, but these days they're mostly in favor of authoritarianism.
I don't think it's a 2nd amendment issue, per se. But it's very similar. If you're afraid of your freedom to stand up against a tyrannical government being taken away, you should be especially afraid of your right to talk freely about those issues being taken away. That's what the NRA people should be thinking about.
I mean, I'm in favor of gun control for practical, statistical reasons. I want less gun violence, if we can manage to implement some means of accomplishing that. Hard problem, I know.
But I understand the intent of the second amendment. It's not to allow hunting. It's not to allow you to protect your home from robbers. It's definitely a statement of defiance to tyrannical government. So if the NRA doesn't agree to some similar line of thought on encryption, they're hypocrites.
I can't really articulate why but, for some reason, I don't really see the NRA getting involved in this too deeply. I could see them issuing a statement opposing it or some such but not much beyond that.
(Lifetime member of the NRA but I have no special insight or knowledge.)
I hope so. We're going to need all the allies we can muster.
I've long wondered whether this is a fight where the NRA would be an unlikely ally.
It shouldn't be that hard to cast strong encryption as a 2nd amendment issue, in emotional if not constitutional terms.
The voting record of the House is clear: the fact that something like a coherent plan for Obamacare repeal does not exist is not a reason not to vote for it. The evidence indicates that some members believe voting is like "liking" on Facebook. Consequences are entirely virtual. Until they are not anymore.
It is worth noting that Congress members may be worried about keeping some private things private. That may act as a real check.
From the article:
> Once they get close enough to the precipice, they’ll experience a salutary fear of consequences.
No, they won't. Look at how Trump got elected or Brexit was decided. Excessive stupidity of a project won't stop people from pursuing it.
Daring the current kakistocracy to do something this stupid seems like a bad idea.