On Secrecy

“When everything is classified, then nothing is classified.”

I should suppose that moral, political, and practical considerations would dictate that a very first principle of that wisdom would be an insistence upon avoiding secrecy for its own sake. For when everything is classified, then nothing is classified, and the system becomes one to be disregarded by the cynical or the careless, and to be manipulated by those intent on self-protection or self-promotion. I should suppose, in short, that the hallmark of a truly effective internal security system would be the maximum possible disclosure, recognizing that secrecy can best be preserved only when credibility is truly maintained.

Justice Stewart, New York Times v. United States, 1971.

Posted on October 2, 2013 at 1:28 PM • 62 Comments

Comments

Bryan October 2, 2013 1:46 PM

Bruce,

OT: It’s too bad your book Liars and Outlia^Hers went to press before the NSA scandal broke. The creation and handling of this situation by our elected and NSA leaders shows just how NOT to build trusted systems — more like how to disable them.

–Bryan

maxCohen October 2, 2013 2:32 PM

Although I agree with the quote above, the Justice does continue down a road that is taken advantage of by today’s Administration and Congress.

“But, be that as it may, it is clear to me that it is the constitutional duty of the Executive — as a matter of sovereign prerogative, and not as a matter of law as the courts know law — through the promulgation and enforcement of executive regulations, to protect [p730] the confidentiality necessary to carry out its responsibilities in the fields of international relations and national defense.
This is not to say that Congress and the courts have no role to play. Undoubtedly, Congress has the power to enact specific and appropriate criminal laws to protect government property and preserve government secrets.
…Moreover, if Congress should pass a specific law authorizing civil proceedings in this field, the courts would likewise have the duty to decide the constitutionality of such a law, as well as its applicability to the facts proved.” – Justice Stewart, New York Times v. United States, 1971. [ http://www.law.cornell.edu/supct/html/historics/USSC_CR_0403_0713_ZC3.html ]

Nick P October 2, 2013 2:36 PM

Interesting. Hard to say if it’s accurate, though. Many leakers and former intelligence types of the past pointed out that the culture had a lure of “you’re one of us and get all the secrets nobody else can have.” Building up walls of secrecy, while keeping everyone else guessing, goes along with that rather nicely.

So, if they were motivating loyalty partly by letting people in on their exclusive world of secrets, then wouldn’t minimizing secrets undermine that? Both the value of being an insider and the perceived knowledge advantage they had over others? I doubt that Justice Stewart was looking into that angle.

Scott October 2, 2013 2:54 PM

One thing that becomes problematic is that the more you require a security clearance for, the more security clearances you have to give out. The more security clearances you give out, the more people with clearances there are to be compromised. It’s the same problem Windows had: if enough programs need administrator permissions to run, everyone who wants to use them needs to be an administrator. Now every application you download, your web browser, that executable someone you have never heard of emailed to you, etc. runs with admin permissions; what can go wrong?

Either we need to stop classifying things that are trivial as confidential or secret, or we need two new security levels: “Career Damaging” and “Embarrassing.”

Jonathan North October 2, 2013 3:01 PM

When everything is classified, then nothing is classified.

That would be true if everyone who had access to anything classified had access to everything that’s classified, and if it really were everything that was being classified.

If it’s only internal government documents that are classified, then that effectively neuters FOIA-type laws.

If government and private documents are classified, but each government agency protects its documents from all other agencies and non-government officials who do not have a reason to access that information, then that both neuters FOIA and limits potential abuse (for instance, a Department of Energy intern can’t use NSA-gathered data to stalk their significant other).

There are good arguments against governments classifying as much as possible, but that Justice Stewart quote doesn’t provide one.

Noah B. October 2, 2013 3:06 PM

I find the judge’s comments absurd, and not congruent with the prevailing social norms of our times.

The vast majority of us wear clothing to conceal our bodies, and do not live in glass houses. We make reasonable efforts to hide our liquid assets from public view. (The obvious exception is our relationship with tax authorities – and even here, not all people are equally transparent).

If I am reading the judge’s words correctly, he seems to be saying: “Hide nothing, put it all on display”.

WHY SHOULD WE LISTEN? What are the social and personal benefits gained by heeding this advice?

Scott October 2, 2013 3:15 PM

@Noah B.

WHY SHOULD WE LISTEN? What are the social and personal benefits gained by heeding this advice?

The idea is that democracies are governments by the people. Democracies work when people can make informed decisions; the more propaganda there is, the more that is made secret, and the more politicians and department heads lie about what they are doing, the less functional our democracy is. Not that the USA has a very functional democracy to begin with, given that we usually only have a choice between two people at any level of government, but it’s getting harder and harder to make an informed choice between those two people.

Mike October 2, 2013 3:59 PM

It would be great if nothing was classified and there were no secrets. But as long as countries don’t trust one another, I don’t see this ever happening. I don’t think you can just look at a single country alone when analyzing this problem. It’s a big playing field.

Bauke Jan Douma October 2, 2013 5:28 PM

Interesting article about a wise man (Stewart), who suffers from an affliction more often seen among the wise: the presumption of similarly high intellectual and rational function and moral acuity among other executives (of government). Even with that — esp. in the USA we have seen a fast deterioration of these traits in the past few decades, and an accelerated decline after 9/11.

The USA seems a ‘country’ enveloped in total neurotic fear: a bonanza on oil, a bonanza on data. What this all reminds me of is Lebensraum. No, I won’t be mentioning the H-word here (at least I won’t be the first), but there are similarities (as there are oddities).

Some more comments:

a.
’71 plus 28 is 1989? Which is it 18 or 1999?

b.
The article is about a matter that was put before a US justice. Important
fact. Where in the current crisis (and crisis it is) is the parallel to 1971?

bjd

Impossibly Stupid October 2, 2013 6:04 PM

@Noah B.
If I am reading the judge’s words correctly, he seems to be saying: “Hide nothing, put it all on display”.

Then you’re not, because he quite directly states “maximum possible disclosure”. Instead of starting up a security evaluation for absolutely every bit of information (which leads to silly things like treating leaks as Top Secret after they’ve already been published), you only invest in security for things that absolutely must be secure.

In a related aside: Thank you, Bruce, for (still) not requiring us to set up accounts to comment on your blog. One less bit of secrecy I have to maintain. It is appreciated.

Jay October 2, 2013 8:15 PM

@Noah B

If the government is entitled to look at pictures of you naked (even if you thought they were private), search your house, and look into your financial records – then why do you think you aren’t putting everything on display?

Quid pro quo – we can keep no secrets from them; so an effective democratic government cannot keep secrets from its citizens either.

Dirk Praet October 2, 2013 8:24 PM

Reminds me of the common shop-floor wisdom “When everything is urgent, nothing is urgent”. Managers and sales reps hunting for bonuses usually don’t like it when cranky engineers pull that on them, especially when the close of a fiscal quarter or year is coming up again.

Spaceman Spiff October 2, 2013 8:51 PM

Well, the old saw about “the best kept secrets are those hidden in plain sight” keeps coming to mind.

65535 October 3, 2013 2:09 AM

Most of the comments have covered my thoughts. I will make my post short.

“National Security” has been grossly overstated and twisted. General Alexander doesn’t need every American’s communication records, financial records, and social records for the last five years to do his job! Sure, that enormous power makes his job easier or more interesting – but it is unjustified and unneeded. He has gone too far!

“National Security” should never be used as a tool to circumvent the Fourth Amendment or any other part of the US Constitution. “National Security” and “Secret Courts” are an abuse of power. And that abuse of power is intolerable.

In my opinion, The NSA could do more with less. I say cut their budget significantly and reduce their ability to poke into every corner of our lives.

Peter A. October 3, 2013 5:39 AM

A side note on the expression “national security” and its meaning.

One should contemplate what or whom all the measures taken in the name of “national security” actually secure, and from what. Do they secure “the nation” from some indescribable threats, or maybe the indescribable affairs of the guvmint from “the nation”?

It doesn’t help at all if you recall that historically almost every organizational tool of oppression of fallen regimes in the past few decades had “security” or an equivalent word as part of its name.

vas pup October 3, 2013 9:36 AM

@ Nick P: “motivating loyalty partly by letting people in on their exclusive world of secrets”.
Nick, bingo! That creates loyalty as being choosen to know something exclusive (and negative in particular) about somebody special: ‘You are special: having billions, power, social/celebrity status, but I am special (like Mr. Snowden) having all private information on you’. Being choosen creates status ( at least psychologically) and boost Ego of somebody with brains, but no other assests stated above’.

I guess different levels of classification should be applied to methods (general disclosure of a government/agency’s methods of conducting day-to-day activity – min classification) and to the details of particular operations/targets/identities/etc. applying those methods (max classification).

Scared October 3, 2013 1:30 PM

The Lavabit Story: The Real Reason Why The Private Email Provider Was Forced To Shut Down
http://siliconangle.com/blog/2013/10/03/the-lavabit-story-the-real-reason-why-the-private-email-provider-was-forced-to-shut-down/

Levison apparently gave in and offered the keys for ONE user (guess who)…. but with a twist:

‘Levison was not about to just give the government what it wanted without making it sweat however, and so he fired back by producing an 11-page print out in 4-point type of the private SSL keys. The government was not amused and stated that in order for the information to be usable, the FBI would have to “manually input all 2,560 characters, and one incorrect keystroke in this laborious process would render the FBI collection system incapable of collecting decrypted data.”’

So now we know that not only are the scientists at NSA (FBI???) lazy, they are also poor typists.

Nick P October 3, 2013 1:52 PM

re Lavabit

So, we now have details. Rather than some sneaky stuff, they used a pen register with court authorization. I’m more inclined to be on the government’s side here. They have the legal right to collect information on someone with a court’s approval. They got the approval. Lavabit’s owner resisted them at every point even sending them the data in 4pt font (LOL) in one case. Then, rather than comply with the law, he shut his service down. Textbook case of obstruction of justice.

The article concludes with:

“What happened to Lavabit is proof that the government will do anything and everything it can to spy on its people, even if good people, like Levison is fighting to keep people’s privacy.”

Which is utter BS. The government was acting very legitimately in this case. What happened to Lavabit was proof that if you do everything you can to resist a court order you might be in legal trouble. Americans as a whole know “go f*** yourself” is not an acceptable defence to a court order. Lavabit’s owner did this to himself and did it over the wrong case, imho. He should have saved it for a much less legitimate (NSA-style) issue.

And, seriously, he ran a privacy-centered email business without expecting that the government might demand information on someone? And with legal authorization to get it? Amazing…

CallMeLateForSupper October 3, 2013 4:36 PM

“… an 11-page print out in 4-point type of the private SSL keys. The government was not amused…”

But I was amused. 🙂 “Here…. I’ll even give your typists a swift kick to get ’em started.”

Seriously though, mistaking “o”, “O”, or “0” for one of the other two can be very easy, depending on the typeface. I typed a 1024-bit key from a printout some years back, and those particular characters gave me fits. I finally got it right though. The NSA should be able to do at least as well. Maybe all the wire and fibre sucking spoiled them?

Scott October 3, 2013 5:06 PM

For a 4pt font at 11 pages, that must have been a 4096-bit key printed in binary with one line per page. Otherwise either the key isn’t an SSL key as stated (you’d be talking about keys far longer than anything in common use), or it’s BS.
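A rough way to sanity-check those numbers, assuming the keys were ordinary RSA keys in the usual PEM text encoding (the reporting doesn’t spell out the exact format, so treat this as a ballpark only):

```python
# Ballpark check of printed key length, assuming RSA private keys serialized
# as PEM (the common format for SSL keys). Needs the "cryptography" package.
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

for bits in (2048, 4096):
    key = rsa.generate_private_key(public_exponent=65537, key_size=bits)
    pem = key.private_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PrivateFormat.TraditionalOpenSSL,
        encryption_algorithm=serialization.NoEncryption(),
    )
    print(f"{bits}-bit RSA private key: about {len(pem)} printed characters")

# Typically around 1,700 characters for a 2048-bit key and 3,200 for a
# 4096-bit key, so the reported 2,560 characters is in the same ballpark as
# one large key (or a couple of smaller ones) in text form, whatever the
# exact format actually was.
```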

@Nick P

The article stated it was the SSL private key. That isn’t just one user’s data, that allows them to decrypt all data for all users.

Bauke Jan Douma October 3, 2013 5:29 PM

@ Nick P

Wow, some apology you’re giving there.

Have you forgotten what this was all about? No you haven’t,
but you call it:
“(…) utter BS. The government was acting very legitimately
in this case”.

Well, in that case, the government, because it’s the government,
you know, must by definition be acting legitimately.
Nice.

bjd

Scott October 3, 2013 5:47 PM

Getting a court order for an SSL private key is like getting a search warrant for Manhattan.

Anon October 3, 2013 5:53 PM

@Scott

Of course, if Lavabit had played ball to begin with and given up Snowden’s email account, the feds never would have asked for the SSL private key.

@Anon October 3, 2013 6:10 PM

That doesn’t make it right. Getting the keys necessary to decrypt all traffic from all users because the person who ran the service was being difficult is ridiculous. “You defied the order? Well, screw you, now we want to be able to read everyone’s email, and not just the original subject of the order, because spite!”

Nick P October 3, 2013 11:45 PM

@ Bauke Jan Douma

Why I’m not blaming Feds for Lavabit situation

There’s no apology. There’s no interpretation. And although I’m a fan of Snowden’s revelations, there’s no legal debate about the Lavabit situation. This situation, per the new details, is one-way as far as responsibility is concerned, and it isn’t on the government (for once). In case you live outside the USA and its laws, let me explain.

America has a Constitution that outlines rights and responsibilities for both the people and government. Just the basics, at least, and only when the government cares (sigh). Our Constitution grants us protection against unlawful search and seizure. Law enforcement is “supposed” to have probable cause, obtain a warrant from a court if there’s no consent, and search for exactly what’s in the warrant. When you’re served a warrant, you can try a legal counter, but generally the debate happens later in court, where you try to get the evidence dismissed on procedural grounds. If you resist, you may be hit with a number of responses that result in massive problems for your life. The last thing anyone with brains says to police with a warrant is to get lost. Americans know better.

(The “hardcore gangstas” are an exception to that rule. They often brag in hip hop songs and on media about how much profanity they screamed at judges and cops. They also brag about how much jail time they served. There might be a connection there. Moving on.)

Context is the next issue you mentioned. Yes, I certainly remember the context of the Lavabit issue as I was in the debate here on this blog. Snowden, a Lavabit user, leaked classified information about NSA programs that they considered highly important for their mission. These programs constituted [according to certain people including me] massive violations of diplomacy and civil rights although they were, legally speaking, in a grey area. Snowden then publicly confessed to that felony. Far as courts go, that’s open and shut against him.

So, although I’m a fan of his revelations, he chose to be a martyr for his cause because the law is pretty clear on what happens when you publish classified information. Snowden even said as much on video. That he used Lavabit means Feds moved to target his account there to gather evidence for their investigations.

So, now we’ve made it to Lavabit. More context. At the time, none of us knew what was happening to Lavabit. I thought that perhaps NSA and feds were playing hardball with Lavabit trying to get them into a program like the ones commonly correlated with Prism in the media. Many of us thought Lavabit was being abused heavily and asked to do very immoral things like backdoor everything as a response to the situation. What I did know is:

  1. If I were Snowden, I’d assume my Lavabit account would be compromised especially given the nature of capabilities he said NSA had. At best, it would be good until Snowden ID’d himself and only a temporary obstacle from there.
  2. If I were Lavabit, operating in a country that mandates lawful intercepts at times, I would have a method for doing that to give me a shot at protecting the majority of my customers if the Feds came after just one or a small group.

So, what happened? The guy was extorted or renditioned? NSA forced him to install taps on all their customers like with certain carrier type companies? He was put on the Do Not Fly list like certain reporters? He was forced to use extra weak crypto like in leaks? He was forced to send malware to his users a la Silk Road? His assets were frozen by IRS entirely like many American businesses? His equipment all seized without notice by FBI like they did to several colos, putting his company out of business? He got blown up by a predator drone? He committed “suicide” just before a trial? I mean, considering these have happened to people who did far less than harbor a high profile target like Snowden, who knows what our very scary (and scared at the time) government might do to Lavabit’s owner for resisting.

Actually, none of that happened. What happened was they went to a court, presented some evidence to obtain court permission, and delivered that to Lavabit. Lavabit resisted. This went back and forth a bit without the owner being thrown in prison for obstruction, contempt of court, supporting the enemy, or who knows what else they might have come up with, angry as they were. His assets weren’t frozen, his equipment wasn’t seized, the courts weren’t bypassed, his initial resistance was tolerated, and he’s still alive. If you look at my paragraph on our legal system, you’ll see that the Feds’ operation against Lavabit… followed the rules. I can’t tell if I felt surprised, confused, or grateful at the news that this was all the Lavabit owner was being hit with.

So, we have a system where all of our citizens know we must follow a court approved search request. Service providers know they must provide information on customers given such a request. And realists like this blog attracts know we live in a surveillance state that can go MUCH further without accountability [unfortunately]. So, the thought that Lavabit’s owner got a series of proper requests and resisted them while also trying to piss the court off is simply… ridiculous. It makes no sense from my perspective as an American citizen who knows there’s a fine line between flexing one’s rights and asking to go to prison.

Of course, like I said in the prior debate, I respect his decision if he was entirely focusing on a personal principle of total resistance to what he perceived as tyrannical, like a modern Thoreau. So be it if he chooses to destroy his business, leave his customers in a rough situation, and go to prison to stop the feds from obtaining Snowden’s emails. To be fair, though, I had to point out that the feds acted more properly in this case than they usually do, or even needed to given the target. They have to go after Snowden, and it should be encouraging to privacy service providers that they acted so properly in this case.

For good principled reasons or for foolishness, the mess Lavabit’s owner is in is entirely his own doing. It would have happened with any case if he acted the way he did toward US law enforcement and courts. Most people here know better than to act that way and those that don’t routinely suffer the consequences.

@ Scott

It’s a decent point but Anon 5:53pm is getting my overall gripe. Lavabit’s owner countering the court by offering just the data on the desired account, or having a measure for selective collection, might have worked out. Maybe. Instead, he was 100% non-cooperative across the board. That doesn’t work out in America unless you’re one of a few hyperpowerful entities such as Goldman Sachs or Halliburton. 😉

Figureitout October 4, 2013 12:05 AM

Nick P
–I don’t understand why it’s necessary for one to run a business to be legal. There are many, many stupid laws, and the law is such a bloated thing that no one really knows all of it; it degrades many citizens’ quality of life if enforced to the letter. When the service you’re offering is INFOSEC with regard to email, then from an intellectual perspective (not a legal one), why is that stupid? If you don’t provide that service, then you’re shortchanging your customers and they should just go grab a yahoo.com email address for free and let their ideas/work get stolen easy. Why must one keep his/her business operating if it violates its sole mission, and what if there were undisclosed financial problems with the business? I don’t fully trust what is being said by the prosecutors; let’s hear what the defense has to say.

Look into the American Revolution, meaning the individual Americans that revolted in their own ways, and the build-up to this overwhelming outburst that defeated the most powerful army in the world at the time. Those genes still kind of exist in the population here; they’ve just been beaten down by a police state.

Figureitout October 4, 2013 12:18 AM

Nick P
–Basically, what this situation is telling all the entrepreneurs out there trying to protect their ideas from Chinese copiers, to get yahoo.com email addresses. And to kill the INFOSEC private industry b/c it doesn’t give the gov’t the keys. Thanks, I know not to give a sh*t about internet security and just use yahoo.com and facebook for my business that I don’t want stolen by any gov’t. So this business will move to places where it’s allowed….Hello Svalbard or Totally unnamed island where I can store my server and run a cable to. Bye bye $$$ and innovation in the USA b/c what self-respecting security expert wants to start a company in a police state?

Dirk Praet October 4, 2013 4:52 AM

@ Nick P

Why I’m not blaming Feds for Lavabit situation

I agree with your analysis. Both Levison and Snowden should have known that Lavabit under common (as opposed to secret) US law could have been served with a targeted warrant for his communications.

Which is not to say that I wasn’t highly amused by Levison handing over the keys on paper and the feds complaining about it. It kinda reminded me of an hilarious scene in the Return of the Pink Panther (1975) where Peter Sellers shouts “Follow that car” to a cabbie who promptly gets out of the taxi and makes a run for it. Not that I would ever dream of comparing the FBI to Inspector Clouseau, though. I am a very respectful person.

Impossibly Stupid October 4, 2013 10:53 AM

@Nick P
“the law is pretty clear”

This bit sums up everything that is wrong with your line of thinking. What is legal doesn’t define what is right. What is legal is not immune from influence by bad actors. And, no, what is legal is hardly ever clear. If it were clear, after all, we wouldn’t need courts to make rulings on it. This is especially true when secret courts exist to rubber stamp the activities of secret agencies. Due to their own actions, the “Feds” no longer get the benefit of my doubts; if they get the benefit of yours over your fellow citizens, then you’ve lost the idea of America.

More directly, take your (off topic) blathering to your own blog. Establish your identity if you think there is any value in your rambling on. Your actions here make you sound like little more than another anonymous paid shill.

Dirk Praet October 4, 2013 11:31 AM

@ Impossibly Stupid

Just laws are about striking a right balance between individual freedoms and the interests of society as a whole. In this particular case, the FBI’s actions to obtain Snowden’s communications (and his only) IMHO were proportionate and entirely in line with both the letter and the spirit of the 4th Amendment. There was probable cause and a warrant was served.

Although indeed off-topic, there is no reason to scold someone for taking an unpopular stand, especially if that person takes the time to come up with a more than reasonable and well-articulated argument. That is a trademark of any type of civilised discussion, to the benefit and intellectual enrichment of both parties, irrespective of whether they agree or not.

name.withheld.for.obvious.reasons October 4, 2013 12:00 PM

@ Dirk
Well articulated Dirk, and another stab at jurist humor.

Thanks

Nick P October 4, 2013 2:19 PM

@ Dirk

“Which is not to say that I wasn’t highly amused by Levison handing over the keys on paper and the feds complaining about it. It kinda reminded me of an hilarious scene in the Return of the Pink Panther (1975) where Peter Sellers shouts “Follow that car” to a cabbie who promptly gets out of the taxi and makes a run for it. Not that I would ever dream of comparing the FBI to Inspector Clouseau, though. I am a very respectful person.”

Yeah, it was really funny. Nice comparison.

@ Impossibly Stupid

Yes, there are bad laws, bad judges, bad prosecutors, etc. Yet, Americans tolerate our system because it often works well enough compared to others. And there are certain legal concepts that are extremely clear having been established over a hundred years of court cases. One is search and seizure with its rules which our Founding Fathers established in the Constitution itself. If the government fails to follow them, it can have its case thrown out due to “procedural error” and this has happened many times. However, if they do things right, we’re expected to turn over whatever is on the warrant and debate it later in court.

This is law 101. This is what every defence attorney will tell you to do when faced with a warrant. We can all live in make believe land where we pretend the law doesn’t exist over whatever abuses might happen, tell the cops to go f*** themselves, and risk a shutdown or re-enactment of Waco. Or we can anticipate that we might be served a warrant for information on our services’ users and have a plan for how to deal with that. And push Congressmen and fellow voters to improve upon existing laws as American voters are supposed to. I choose the latter approach because I can’t protect anyone (esp family) if I’m in prison or dead.

@ figureitout

“I don’t understand why it’s necessary for one to run a business to be legal. ”

To keep from going to prison. Same reason you don’t commit crimes in front of cops personally. That simple. In this case, a guy committed a serious felony and used an email account in the process. Like his cause or not, that’s a very justifiable reason for cops to search his email account. Snowden’s only chances stateside are Congress changing laws with immunity for him, a Presidential pardon after he’s convicted, or jury nullification in court. On the other hand, fighting the concept of law itself is foolish and won’t go anywhere.

People wanting to save Snowden should put effort into one of the three options I mentioned, esp nullification, rather than griping that Feds are investigating felonies and shouldn’t. (rolls eyes)

” If you don’t provide that service, then you’re shortchanging your customers and they should just go grab a yahoo.com email address for free and let their ideas/work get stolen easy. ”

“Basically, what this situation is telling all the entrepreneurs out there trying to protect their ideas from Chinese copiers, to get yahoo.com email addresses. And to kill the INFOSEC private industry b/c it doesn’t give the gov’t the keys.”

That’s a false dichotomy. It’s not a choice between “have nothing” or “give opponents everything.” Where did you even get that? There’s a ton of threats to software, I.P., email, etc. They include organized crime, shady employees, black hat hackers, foreign states (main IP thieves), and the Feds. In the US, you can legally protect yourself from ALL of these except for the Feds during an investigation. Is it ideal? No. Does it let you stop hackers, IP theft, etc? Yes. And, of course, there are ways to deal with that last one for people who are willing to go through the trouble.

Likewise, Lavabit protected users from many threats. Even with domestic searches, it would be way better than regular email service. When he closed it down, he left his customers to the competition (like Yahoo you mentioned) which left them vulnerable to every other threat. Very bad for them. So, if his principles forced it so be it, but a huge amount of security (minus Fed protection) is better than no security. And his former customers now have no security. And no access to their emails or email contact lists if they depended on Lavabit too much. Bad squared.

“Look into the American Revolution, meaning the individual Americans that revolted in their own ways, ”

People keep comparing it to this but it’s nothing like this. In the American Revolution, Americans were opposed to British rule and taxation without representation to the point they’d be willing to fight in large numbers. They ended up using bullets against their opponents who used the law only to serve themselves. The current situation is a minority of Americans that are displeased with NSA activities that the majority let happen and continue to ignore. And nonviolent options are being considered. It’s as opposite the Revolution as it could possibly be.

I mean, it would be interesting to see it all get changed by American laypeople rushing Quantico and Ft Meade shouting about their Facebook getting spied on but… I just can’t wrap my mind around it actually happening.

Impossibly Stupid October 4, 2013 7:00 PM

@Dirk Praet

The mistake you’re making is that you apparently think the laws are still just. Again, we live in an America that has secret courts that rubber stamp secret actions. There is nothing “proportionate” about the fact that Lavabit still cannot freely discuss what the government has been doing to them. Until the pendulum swings back, the only stance for an American to take is on the sides of their fellow citizens. Popularity is not the issue.

@Nick P

Oh, please! Americans continue to tolerate things because people are lazy and change is hard. The Patriot Act threw out a lot of the things you claim are so cut and dried. The issue, if you have any wish to stay on topic, is about the ills of the current climate of extreme SECRECY, not the two centuries of public law that came before it. Unless you can focus on that, I’m not going to waste my time with you.

Rick Damiani October 4, 2013 7:27 PM

@Nick P

It’s my understanding that Lavabit did turn over all the Snowden-related data they had. Where he balked was when the FBI asked for the ability to get into all the accounts.

Look here:

http://www.nytimes.com/2013/10/03/us/snowdens-e-mail-provider-discusses-pressure-from-fbi-to-disclose-data.html?pagewanted=all&_r=0

From the article:

[…]Mr. Levison was willing to allow investigators with a court order to tap Mr. Snowden’s e-mail account; he had complied with similar narrowly targeted requests involving other customers about two dozen times.

But they wanted more, he said: the passwords, encryption keys and computer code that would essentially allow the government untrammeled access to the protected messages of all his customers. That, he said, was too much.

“You don’t need to bug an entire city to bug one guy’s phone calls,” Mr. Levison, 32, said in a recent interview. “In my case, they wanted to break open the entire box just to get to one connection.” […]

Anon October 4, 2013 9:47 PM

@Rick

You should read the released documents to get the full history, https://www.documentcloud.org/documents/801182-redacted-pleadings-exhibits-1-23.html, but this appears to be the timeline.

June 10th – a judge issues a court order just for Snowden’s metadata

June 11th – Levison receives the order, but tells the FBI to screw itself.

June 28th – a judge issues a second court order for Snowden’s metadata, Levison tells the FBI to screw itself again.

June 28th – a judge issues a third court order for Snowden’s metadata, this time threatening Levison with contempt of court, but Levison tells the FBI to screw itself a third time

July 11th – Levison is served with a subpoena telling him to bring the encryption keys to court on the 16th

July 13th – Levison makes a ridiculous counter-offer to the FBI, that he would provide the requested data, but only after an absurd 60 day delay and then only once every 60 days instead of in real or near real time.

July 16th – Another court hearing with Levison

August 1st – Levison offers to comply with court order.

If Levison had just given the FBI what it wanted after the first, second, or even third court order, the FBI never would have asked for Lavabit’s private keys.

Nick P October 5, 2013 1:04 AM

@ Rick, anon, Dirk Praet

The court documents: VERY interesting

Thanks anon for the link. Rick, I was about to agree with your point and work from there until I saw those court documents anon posted. Lavabit’s owner was definitely pushing their buttons, both the Feds’ and the court’s. The Feds might have been overreaching as well, although their request looks pretty vanilla. The docs made for interesting reading. You get to see the games both sides’ lawyers played. Both Lavabit’s motion to quash and the Feds’ counters made good points that will probably be copied in the future.

This counter from the Feds was particularly clever:

“Any resulting loss of customer “trust” is not an “unreasonable” burden if Lavabit’s customers trusted that Lavabit would refuse to comply with lawful court orders. All providers are statutorily required to assist the government in the implementation of pen-traps… and requiring providers to comply with that statute is neither “unreasonable” nor “oppressive.” In any event, Lavabit’s privacy policy tells its customers that ‘Lavabit will not release any information related to an individual user unless legally compelled to do so.”” (emphasis in original doc)

Which is pretty much what I said in original discussion: nobody could reasonably expect US companies to not cooperate with US LEO’s. Turns out their own Terms of Service says so and was used against them in court. Ouch.

The arguments that there could be no abuse of the encryption keys because the law forbids it would sound good to a lay judge, but were bogus in light of the Snowden revelations. However, Lavabit’s attorney failed to show evidence that the data might be abused, especially in light of the Snowden leaks. The judge apparently thought it was wild speculation, outside the court’s experience in real cases, so the judge backed the other side. Future cases might want to focus more on this area as WE KNOW THERE IS ABUSE. The trick will be proving that to lay judges without too much confusion despite all the technical details.

Takeaway for Privacy-centered Services: Most important thing in case!

This part right here that the judge said:

“I can understand why the system was set up, but I think the government is — government’s clearly entitled to the information that they’re seeking, and just because you-all have set up a system that makes that difficult, that doesn’t in any way lessen the government’s right to receive that information just as they would from any telephone company or any other e-mail source that could provide it easily. Whether it’s – in other words, the difficulty or the ease in obtaining the information doesn’t have anything to do with whether or not the government’s lawfully entitled to that information.”

The judge says that every US company offering a service like this should anticipate that the court might need to acquire such information. Further, Lavabit was designed in a way (due to ultraprivacy needs) to require putting all its users at risk to comply with the order for information on just a few. This may not have been intentional. However, the judge found that the trouble Lavabit’s own design posed for handling an intercept was irrelevant to the requirement that Lavabit turn over the information. (Unsurprising.) If Lavabit wanted to create difficulties for intercepts, the consequences of that were Lavabit’s own responsibility.

The Big FAIL

People, the judge ASKED FOR AN ALTERNATIVE TO THE PEN REGISTER! The judge wanted a clear alternative that would get the job done [from his perspective]. The only alternative Lavabit suggested involved a week to put together, 60 days of delay for the data, a few grand, uncertainty, and the requirement to trust the person who’d been resisting so far. Some “alternative”… Lavabit screwed themselves at that point. They might have convinced the judge to adopt an alternative if it was cheap, quickly coded, and had independently verifiable integrity protection.

A Legal Problem With a Technical Solution?

Seeing how close Lavabit got, I think a future privacy-oriented communication service could do better if they design their system with high-confidence, selective, real-time, lawful intercept in mind. I’d recommend using a TPM-based solution (e.g. NSA’s HAP from General Dynamics), as one could show documents proving that NSA itself trusts such solutions to prevent software tampering and even helped design them for government use. All of US government and defence contractors’ product concepts on things such as attestable integrity would be evidence for the privacy-preserving company. (The irony!)

So, the service would be designed to protect individual users, intercept a subset of them, run software the Feds could evaluate for integrity, run on trusted-computing-enabled machines to prove that software was running, and give them information in real time. I’d add support to shift select workloads or IPs’ activity in a way that’s unnoticeable to customers, and only put targeted accounts on the machines the Feds could access. That gives the Feds live data on targeted accounts and keeps other accounts off those machines. And for icing on the cake, throw in an independent evaluation by a government-favored entity like SAIC or Cygnacom that says the software should work as described, with their signature on the system image.

This might work. If it does, it protects the majority of users while giving the court the needed data with evidence of total compliance. If it doesn’t, it’s more evidence in the long-term case of their overreach. At least it’s something that might help the next company keep from having to compromise all their users to deliver data on a few.
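To make the selective-placement part of that idea concrete, here is a minimal sketch. All the names in it are hypothetical, and it deliberately skips the hard parts (attestation, independent evaluation, signed images); it only shows how accounts named in an order could live on the one attested box while everyone else never touches it.

```python
# Hypothetical sketch of selective placement for lawful intercept: only the
# accounts named in a court order are routed to the attested intercept host;
# all other accounts stay on the normal, untapped pool.
import zlib

TARGETED_ACCOUNTS = {"account-under-order@example.com"}   # from the order (made up)

NORMAL_POOL = ["mail-01.internal", "mail-02.internal", "mail-03.internal"]
INTERCEPT_HOST = "mail-li.internal"   # the TPM/attestation-enabled machine

def backend_for(account: str) -> str:
    """Pick which backend stores and serves this account's mail."""
    if account in TARGETED_ACCOUNTS:
        return INTERCEPT_HOST          # live data for targeted accounts only
    # a stable hash keeps untargeted users consistently off the intercept host
    return NORMAL_POOL[zlib.crc32(account.encode()) % len(NORMAL_POOL)]

assert backend_for("alice@example.net") in NORMAL_POOL
assert backend_for("account-under-order@example.com") == INTERCEPT_HOST
```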

Clive Robinson October 5, 2013 6:27 PM

@ Nick P, Dirk Praet,

After reading this a thought occurred…

The argument made by the authorities is that “metadata” is basically “public” because it can be “seen” by those working for the “common carrier”, and this somehow makes it not private (/secret).

However, it is easily possible to set an email system up so that the metadata is never at any time visible to those working for the “common carrier”.

The first step in this is securing the link from the customer end point (i.e. SSL etc.)…

Now if the “metadata” is only visible at the end points, then the standard LEO argument fails, because it’s no longer on the outside of the message; it is part of the message and thus protected.

Now, how you would build such a system is not immediately obvious, but some of the techniques used by mix networks and P2P networks would go a ways towards it. The problem is of course getting the idea across to a judge that what appears to be “magic” is in fact a technical problem of some complexity that renders the LEO argument completely void.
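As a toy illustration of that point, assuming the two endpoints already share a key (the key exchange and the mix/P2P routing are the genuinely hard parts mentioned above), the addressing fields can simply travel inside the ciphertext so the carrier only ever relays an opaque blob:

```python
# Toy sketch: the "metadata" (from/to/subject) is part of the encrypted
# message body, so the common carrier never sees it. Assumes a pre-shared
# key; uses the "cryptography" package's Fernet recipe for brevity.
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice, negotiated end-to-end
f = Fernet(key)

message = {
    "from": "alice@example.org",     # addressing now lives inside the message
    "to": "bob@example.net",
    "subject": "lunch?",
    "body": "Noon at the usual place.",
}
blob = f.encrypt(json.dumps(message).encode())

# The carrier relays `blob` to an opaque mailbox identifier; only the
# recipient's client can recover who wrote to whom and about what.
recovered = json.loads(f.decrypt(blob))
assert recovered["to"] == "bob@example.net"
```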

Nick P October 5, 2013 10:50 PM

@ Clive Robinson

That’s possible. The other issue here is that the lawful intercept system, if eventually forced, should be easy enough to implement to (a) convince court to use it and (b) not require giving them control of the network. If keys alone won’t do it, their next step will be to force the vendor to backdoor the software (or send entirely different software). They’d get what they want one way or another so the protections should be strong, but allow selective disabling.

RobertT October 5, 2013 11:16 PM

I think we’re going to see a period where security products are simply no longer developed in the US.

Not to put too fine a point on it, but US products simply won’t be trusted.

The semiconductor industry went through a period post-tech-wreck where the patent enforcement process was so complex in the US that product development was practically impossible. A number of very successful companies established themselves in Asia simply to avoid the US patent reach.

If all US security products must be backdoored for NSA compliance, then the solution is rather simple…

Anon October 6, 2013 12:55 AM

@RobertT

Your comment doesn’t make any sense to me. Any company that wants to sell products in the US has to be compliant with US patents. Where development happens doesn’t matter.

Figureitout October 6, 2013 1:27 AM

Anon
–Then even worse, the secure products won’t even be sold in the U.S. Lol at “where development happens doesn’t matter”… Yeah, it means the human capital will move to other countries and all the businesses around here will be fashion malls, McDonald’s, and Walmart.

Nick P October 6, 2013 12:27 PM

@ RobertT, anon, figureitout

Whether or not avoiding US is a good idea, I continue to design solutions for the US case b/c leaving isn’t an option for many companies that care about the bottom line. The current American software market is $136.6 billion, or almost half of all worldwide software sales. Every software product or accompanying device can be a backdoor.

So, we will have no shortage of backdoors or weak security products in the future because security/software vendors aren’t giving up $136.6 billion to protect us from something that people kind of voted for. And to do business here, they will be forced to comply with US rules. Economics trumps ethics for most companies, so the situation will not change.

(On the other hand, foreign countries might buy fewer US products. We’re seeing signs of this already.)

James Sutherland October 6, 2013 4:32 PM

It is indeed a valid point, that the more broadly you apply your “classified” rubber stamp, the less significance it carries, in multiple ways.

If the canteen menu is top secret, you need to give your catering staff top secret clearance. No, that doesn’t mean they get access to the nuclear launch codes – but it means you are marking them as having your highest level of trust, when they don’t actually need that in a sane system. You’re diluting your personnel vetting resources. You see a canteen worker talking on an external line, holding an orange-banded file – and you’re told it’s OK, that’s just next week’s food order.

In many ways it’s like the “minimum privilege” approach to operating systems: if absolutely every operation needs ‘root’-type access, every daemon is going to need to run as root. Apply a bit more care, and you can limit that to a much smaller set of code – and devote much more effort to securing that sensitive code properly.

Like a comment above says, when everything is maximum priority, nothing has any priority: focus on the stuff which is genuinely important and you will get better results.
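For anyone who wants the operating-system half of that analogy in code, here is a minimal Unix sketch of a daemon doing the one step that needs root and then dropping it. Illustrative only, with error handling and the choice of service account left out.

```python
# Least-privilege sketch (Unix): bind the privileged port as root, then drop
# to an unprivileged account before handling any untrusted input.
import os
import pwd
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("0.0.0.0", 443))   # the only step that actually requires root
sock.listen(16)

unpriv = pwd.getpwnam("nobody")   # any dedicated low-privilege account will do
os.setgroups([])                  # shed supplementary groups first
os.setgid(unpriv.pw_gid)          # then the group,
os.setuid(unpriv.pw_uid)          # then the user; root is gone for good

# From here on, a bug in the request-handling code yields "nobody",
# not the whole machine: the same economy the canteen-menu example argues for.
```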

wael October 6, 2013 6:26 PM

@ Nick P,

Are you implying that foreign companies are more ethical than their US counterpart?

Nick P October 6, 2013 6:44 PM

@ Wael

I’m implying protection from US LEO/TLA threats can be much easier outside of the US. 😉

Nick P October 7, 2013 4:54 PM

@ Wael

Ha! Up to a point, probably. I’d particularly think like you for chips made at the fabs for mobile devices. There are so few of them that it’s hard for me to imagine a nation state NOT compromising them. The fab problem worries me more than any other problem. Let’s call it the modern, hardware version of Ken Thompson’s “Trusting Trust” issue. 😉

Wael October 7, 2013 6:06 PM

@ Nick P,

I previously wrote: No one has total control (let alone assured) – subversion is a possibility as well here.
We previously covered the subversion discussion. Ummm, my disposition has not changed since. And apparently, neither has yours!

PS: Replace “nothings” with “nothing’s” in the above link.

Nick P October 7, 2013 7:20 PM

@ Wael

“We previously covered the subversion discussion. Ummm, my disposition has not changed since. And apparently, neither has yours!”

Seems to be the case. Although, I do differentiate between provably no subversion and provably low likelihood of subversion. I believe an individual can obtain the latter if careful. This is easier with older hardware. It can also be done with hardware designed for limited, non-Internet applications that can be modded into general-purpose, connected systems (e.g. PlayStation 2). Both of these have a low probability of NSA subversion.

Wael October 7, 2013 7:36 PM

@ Nick P,

Hmm. Why are you limiting your accusations to NSA? Their counterparts around the world are just as likely to subvert “things”. Be fair!
Then again, how do you quantify the likelihood of subversion? Heuristics, rules of thumb, and using “older hardware”? You know, older hardware was state of the art at one point! And your buddies have surely subverted it 🙂 I’ll buy your argument if you can tell me the date NSA started subverting systems and chips. I’ll be looking for your post titled: “The day NSA subverted Microprocessors” 🙂

Nick P October 8, 2013 2:34 PM

@ Wael

“Hmm. Why are you limiting your accusations to NSA? Their counterparts around the world are just as likely to subvert “things”. Be fair!”

My opponent for privacy technology at the time was NSA, not other agencies around the world. They have little to no power over me. Domestic LEO’s, on the other hand, have quite a bit. So, the focus was on them for potential subversion.

“Then again, how do you quantify the likelihood of subversion? Heuristics, rules of thumb, and using “older hardware”?”

Stuff like that pretty much. Probably best to illustrate it with that example.

One thing people keep forgetting is that NSA was much more careful about targeting Americans back then. They had strict rules on it, and they seem to have often been enforced. Their philosophy at the time was to restrict strong crypto because otherwise they couldn’t break it. They also promoted public initiatives like key escrow. I have no evidence to suggest this was all a ploy to hide the fact that they had remote, root, kernel-mode access to all PC’s in America. The more likely answer is that NSA didn’t subvert the majority of chips for the majority of the industry’s life and started their hardware subversion work closer to (or after) 9/11. And this was why they were so focused on the crypto such systems used back then.

The other thing I’m factoring in about old hardware is its properties. Old hardware was limited and pushed to the max due to high competition. Backdoors might take up precious resources. The chips were also easier to verify, and in quite a few different ways (incl. visual inspection). The competition in the hardware market meant, rather than a monoculture, there was a diverse array of hardware and software options. Even networking had competing standards: SNA vs DECnet vs TCP/IP vs IPX/SPX. All this together would make a subverter’s job extremely difficult, esp. considering the legal climate back then didn’t favor NSA.

The last point is that NSA was still working out the details to COMPUSEC and INFOSEC back then. They were seasoned amateurs, rather than masters. Remember that it was NSA that ran the Orange Book evaluations, fostered development of highly secure commercial systems, and actually used many products they evaluated. The B3/A1 products didn’t get certified until there were no remaining vulnerabilities that NSA pen testers could find. After a short while, those were using Intel processors and boards. 😉 I think two had custom firmware for it but that would be necessary anyway given the requirements. So, that’s a potential risk area, but the overall trend was NSA themselves trusted the chips in systems that had to be “provably secure” by their standards.

Conclusion

So, you can combine these and come to the conclusion that you’re entirely right: I have no evidence the systems I recommended were subversion-free and everyone following the advice might be doomed. At the same time, one can conclude that the odds are against the older systems being backdoored as evidence is against it. One can replace firmware and drivers, or put a guard in front of it, if extra paranoid. Such kinds of analysis can be applied to modern systems looking at features, fab processes, suppliers, etc. It’s just way more complicated and I have little confidence in any conclusions from those analyses.

Bonus tidbit: One of the older secure OS’s was KeyKOS. EROS and COYOTOS improved on KeyKOS’s design. I recently found out that KeyKOS was ported to SPARC at one point. There are open SPARC cores available. Also, several research projects building security/integrity into chips use SPARC-based cores for prototypes. So, we have an open processor design (w/ supporting open SOC components), new tech to protect trusted software integrity on it, a field proven high security OS architecture, and modern implementations of it that resolve old issues. Wael, can you see the kind of trustworthy computing base one might be able to make if these were combined? 😉 Just a thought….

Clive Robinson October 8, 2013 5:31 PM

@ Wael, Nick P,

There is another issue to consider as to why the NSA is the organisation to defend against…

Whilst the “brains trust” working there may not be the brightest or best, when it comes to a bankroll for tangibles they are probably number one.

And this has a knock-on effect in that the “brains trusts” at other 5-Eyes members and of other countries’ intel services feed backwards into them. In effect they try to punch above their financial weight by intellectual prowess to buy their slice of the pie.

It’s one of the reasons I still don’t think the US were actually the instigators of Stuxnet; they just jumped on what was seen as a “good idea” and provided technical input from other work they had been involved with.

Most of the Western World has a false perception of who is the greatest nuclear threat to world peace outside of the 5 permanent UN Security Council Members. At the top of the “real list” would be Israel, with something approaching 300-400 weapons based on the technical sophistication they were known to have had back in the 1980’s (see M. Vanunu revelations). Importantly, not only do they have the weapons, they have the delivery systems as well. Next down on the list are India & Pakistan: they both have weapons and medium-range delivery systems, and a highly unstable political situation over Kashmir with state-sponsored terrorist organisations fighting in and around the area. Then North Korea: they have a very limited nuclear capability, but importantly they have a space-capable delivery system, which in effect means it’s “world spanning”. South Korea, Japan and Taiwan all have delivery capability due to their interests in satellite launch systems; what their actual nuclear weapons capability is, is unknown. Though we know they certainly have sufficiently advanced technology and academic knowledge to build quite sophisticated nukes if they wished to, as both S.Korea and Japan are at the leading edge of fission/fusion research for energy security.

As for Iran, which Israeli & US publicity are trying to “scare up”: they are a long, long way down on the list. They don’t currently have devices or delivery systems, and as has been observed by international monitors, their visible systems are aimed at energy production, not weapons production. Much of the technology they have for producing fuel has come from Pakistan via A.Q. Khan, and whilst they did have ties with N.Korea, these were fostered due in the main to US support of Iraq and Saddam. Since the demise of Saddam, contact with N.Korea appears to have waned. It appears that originally Iran was passing on uranium enrichment knowledge to N.Korea in return for rocket technology to use against Iraq (which was developing its own long-range rocket systems to attack Iran, alongside the supergun of Gerry Bull, whom Israel assassinated).

I know that my reasoning that US involvement was aimed at N.Korea has been less than popular, but at the end of the day, unlike Iran it is very much a closed country, and exploiting the uranium enrichment process tie-up between Iran and N.Korea would have been “too good to pass up”: not just because it would have set N.Korea back, it would also have sown considerable mistrust between the two countries. As it turns out, unbeknownst to the US and international weapons inspectors, the relationship had cooled anyway and N.Korea had headed off in their own direction using their own improvements to enrichment. N.Korea knew that they were targets of Stuxnet for a number of reasons, and to prove the point that the “US had missed the boat” they called in UN nuclear inspectors to in effect “rub the US’s nose in it” and make it abundantly clear the US plan had failed and further attempts along that line would be pointless.

Thus in the end Stuxnet achieved very little other than to start to open the world’s eyes to what the US’s real intent was with regards to the “cyber-estate” and dominance through the back door. I don’t know if this had any bearing on Ed Snowden’s thought processes and pushed him in the direction of exposing the NSA or not, but it would certainly fall fairly well into what is known of his timeline.

As for “future predictions”: problems between N.Korea and S.Korea appear to have taken a couple of steps back from the toe-to-toe preparing for a punch-up that was evident six months to a year ago, and more recently surface changes in Iran have also caused a cooling down from the issues of the past decade and a half. I guess the world is waiting to see what Obama wants his “swan song” to be, and currently it appears to be “Obamacare”, which a discordant minority is trying their best to stop; the net effect is keeping Obama’s attention focused on “home” not “abroad”, which benefits the rest of mankind a lot.

Nick P October 9, 2013 12:20 PM

@ Clive Robinson

Why Iran, Why Stuxnet: Just Loose Ends…

Interesting analysis. I think you might be getting too caught up in the overt politics and building on that. Remember that overt politics are misleading unless you include the [known] covert ops in for the backdrop. This vid is one of my favorites as it presents a good summary of our activities in Iran, Iraq and Saudi Arabia. (3 1/2 min long)

So, we overthrow Iran’s leader for oil. That works for about 20 years. The situation blows back with the rise of Khomeini. Saddam freaks out thinking it might spill over to his country and starts fighting Iran. We partner up with Saddam, even sending him money and supplies. He invades Kuwait, maybe heading for Saudi Arabia, where there’s plenty of oil and our troops. We turn on Saddam, start comparing him to Hitler in the media, and Americans are convinced we must fight him. OBL, another creation of ours maybe gone rogue, threatens us for having troops in Saudi Arabia, which we remove. In the end, we invade Iraq, the 2nd largest oil stash in the Middle East, because we’re interested in “liberating” the people (that we caused to die en masse with sanctions). We also smash Afghanistan, dismantling OBL’s network.

So, after a long time dominating Iraq, the US starts talking about how it must hit Iran. To me, the Iran situation is pretty simple: they’re a loose end. Iran’s situation is entirely US created. Their government hates the U.S. for what it did over there. They hate the U.S. support of Israel. They are also pursuing nuclear options. So they’re a loose end from old covert ops, an oil stash, and a potential nuclear threat (directly or indirectly through others). That’s enough for Rome, err the U.S., to want to squash Iran. Any North Korea connection would be icing on the cake. Possible, but not the main objective.

“Thus in the end Stuxnet achieved very little other than starting to open the world’s eyes to the US’s real intent with regard to the “cyber-estate” and dominance through the back door.”

Well said. That I totally agree with. It’s kind of sad when you think about it. The old papers I link to showed that the U.S. practically invented information security and the theories behind securing our tech infrastructure. We could have easily led the way being the Toyota of highly robust infrastructure and software. Instead, we’re the KGB or SS of infrastructure and software. (shrugs) What can ya do…

RobertT October 9, 2013 6:55 PM

@NickP and others
For the record:
I have NEVER actually seen an intentionally subverted semiconductor Chip.

I’ve seen many instances of on-chip stupidity where decisions made during the design process created rather obvious information leakage paths. Privately I’ve thought about the likelihood that the mistakes were intentional, but usually a 10 minute talk with the person/team responsible is all it takes to convince you that you shouldn’t search for a complex answer when incompetence adequately explains away all the stuff-ups.

While it would be relatively easy for me to insert a vulnerability into a chip, the hard part is making sure that the vulnerability is maintained even when the block is redesigned, re-laid-out, re-synthesized and (and this is a BIG AND) still passes all the back-end formal verification tool flows. The type of error that meets these criteria is baked into the specification, which is precisely why protocol attacks are such powerful tools with which to subvert system security.
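To make that “baked into the specification” point concrete, here is a toy sketch, purely illustrative and not based on any real standard: a hypothetical spec that derives the keystream only from the device key and a per-session counter that resets to zero. Every faithful implementation, however it is redesigned, re-synthesized or formally verified, inherits the two-time-pad flaw, because the flaw lives in the spec, not in the netlist.

```python
# Toy illustration (not any real standard): a hypothetical spec that says the
# keystream depends only on the device key and a message counter that resets
# to zero each session. Any correct implementation of this spec leaks.
import hashlib

def keystream(device_key: bytes, counter: int, length: int) -> bytes:
    """Deterministic keystream exactly as the (made-up) spec defines it."""
    out, block = b"", 0
    while len(out) < length:
        out += hashlib.sha256(device_key + counter.to_bytes(4, "big")
                              + block.to_bytes(4, "big")).digest()
        block += 1
    return out[:length]

def encrypt(device_key: bytes, counter: int, plaintext: bytes) -> bytes:
    ks = keystream(device_key, counter, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

key = b"per-device-secret"
# Two different sessions, but the spec resets the counter to 0 each time:
c1 = encrypt(key, 0, b"ATTACK AT DAWN   ")
c2 = encrypt(key, 0, b"HOLD THE POSITION")

# An eavesdropper who never touches the key still learns p1 XOR p2:
xor_of_plaintexts = bytes(a ^ b for a, b in zip(c1, c2))
print(xor_of_plaintexts)  # classic two-time-pad leakage, baked into the spec
```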

Wael October 9, 2013 7:07 PM

@ RobertT,

I have NEVER actually seen an intentionally subverted semiconductor Chip.

Would your record change if you expanded the meaning of a semiconductor Chip to include the silicon, the Microcode, and the firmware?

RobertT October 9, 2013 8:17 PM

@Wael,
“Would your record change if you expanded the meaning of a semiconductor Chip to include the silicon, the Microcode, and the firmware?”

At the raw silicon / wafer level I’m not sure that an attack is possible.

Microcode is easily, though not often, subverted; it certainly makes you wonder when silly microcode capabilities exist that serve no function except to weaken security. I think the engineering concept of maintaining this “soft”, infinitely flexible intermediate level is absolute stupidity from a security perspective. A system with direct hardware support for specific critical functions is much easier to secure than the infinite flexibility of a microcoded intermediate level. Unfortunately, if you remove the flexibility of microcode, it takes about ten times as much effort to verify a chip design with simulation tools.

Firmware is often subverted and used to make attacks sticky (persistent despite re-formatting disks and re-installing the OS, and effective against Live-CD approaches). It is really unforgivable that so little attention is given to device firmware security. I had a laptop several years ago where the firmware had been intentionally altered to enable easy support for new viruses. I found the firmware problem because of driver incompatibility issues (sometimes it pays to run non-standard OSs); just for a test I loaded a standard MS OS and, guess what, the infected firmware worked perfectly (still infected… but no operational problems).

Figureitout October 9, 2013 8:45 PM

RobertT
–What about embedded peripherals added in a supply chain all around Asia? Yeah, it’s a stealthy infection, but it’s the mostly normal operation of my laptop that has seriously tarnished my trust in computer security as a whole. All the log files trying to figure out what I do… they were a tad too late, and they’ve been riding the Derp Train for a couple of years now. And the persistent ones pop up no matter what you do… well, I guess I’ve got a ham radio station PC, or a mystery to [try] to solve when whoever seriously infected me (I have some very likely suspects) decides to brick the comp. lol. Maybe something done to the SATA bus. Like they did to my innocent mother’s; they will really pay for that.

Clive Robinson October 10, 2013 5:32 AM

@ RobertT,

    …the hard part is making sure that the vulnerability is maintained even when the block is redesigned, re-laid-out, re-synthesized and (and this is a BIG AND) still passes all the back-end formal verification tool flows. The type of error that meets these criteria is baked into the specification, which is precisely why protocol attacks are such powerful tools with which to subvert system security.

You’ve touched on an area I’ve been thinking about for a while. As you and others have seen me say for quite a few years now, the likes of the NSA would be directing their attention to subversion via,

1, Plaintext (specifically known plaintext).
2, Protocols.
3, Standards.

We now have sufficient evidence to show that this list is in effect correct, and that it probably predates the AES competition by some years (which probably brings CELP and other audio codec standards into the frame).

What I’ve been thinking about is subversion not of the actual chip directly but of the tools, especially for FPGA and bespoke ASIC designs (basically the only options for low volume production security products).

As you are aware, the tools are far enough away from transparency to be considered without doubt a “black box” by low volume designers, and I assume the same is true for SoC designers as well.

The same is not true for the NSA, who have the resources not only to get hold of the design tools used by the designers but also any confidential “in house” tools used by the foundries. They also have sufficient staff and other resources to “reverse engineer” these tools and identify their characteristics, which, as at the end of the day we are dealing with “device physics”, will be broadly similar across many if not all the tools (there’s really only one way to “skin Schrodinger’s cat” and that’s to open the box).

Now if you know the strengths and weaknesses of these design tools, it would be possible to make any particular design specification play to either.

If we consider weaknesses like “power characteristics”, then for argument’s sake using lots of “multiply steps” in an algorithm will make it more susceptible to side channel attacks. However, when looking at the algorithm from just the mathematical analysis, multiply steps are little different from addition or XOR steps.

Now it’s been suggested that the more an algorithm uses “reversible” steps (mul / add / XOR) the easier it is to analyse its power signature. Further, the more linear the general steps are, the easier the power signature is to analyse. But CPUs in general don’t have fast nonlinear steps, so algorithm designers pushing for speed on CPUs are going to go with linear steps rather than nonlinear ones.
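As a toy illustration of the kind of power-signature analysis being described, here is a minimal correlation sketch under a standard Hamming-weight leakage model; the “device”, the noise level and the key byte are all made up, it simply shows how a key-dependent multiply result that leaks through the supply current falls to a simple correlation attack.

```python
# A minimal correlation-power-analysis sketch (toy model, not a real device):
# the "power trace" is modelled as the Hamming weight of a key-dependent
# multiply result plus Gaussian noise. All names and parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
SECRET_KEY = 0xA7            # one secret byte, odd so p*k mod 256 is a bijection
N_TRACES = 3000

hw = np.array([bin(i).count("1") for i in range(256)])    # Hamming weight table

plaintexts = rng.integers(0, 256, N_TRACES)
intermediate = (plaintexts * SECRET_KEY) & 0xFF           # the leaky multiply step
traces = hw[intermediate] + rng.normal(0, 1.0, N_TRACES)  # HW leakage + noise

# Attacker: correlate each key hypothesis against the measured traces.
best_key, best_corr = None, -1.0
for k in range(1, 256):            # k = 0 gives a constant hypothesis, skipped
    hyp = hw[(plaintexts * k) & 0xFF]
    corr = abs(np.corrcoef(hyp, traces)[0, 1])
    if corr > best_corr:
        best_key, best_corr = k, corr

print(hex(best_key), round(best_corr, 3))   # recovers 0xa7 with high probability
```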

Thus, as with AES and the issue of CPU caches, “Not Saying Anything” will give rise to such linear processes becoming embedded in algorithms, to the detriment of practical implementations.

Now, several years after the AES competition, we are still not taking on board these “practical implementation” issues in crypto primitive competitions.

And whilst we sort of know about some aspects of power signature analysis, I’m reasonably confident the likes of the NSA, GCHQ, et al are sufficiently far ahead of the curve to be “up around the bend and out of sight” of the open community.

Thus I find myself thinking: are there other “side channel” techniques in silicon that can be enhanced by the way you design, or don’t design, the actual algorithm? Further, can such side channels be used at a distance, as most “time based” side channels can be (due to the issue I also keep trotting out, “Security-v-Efficiency”)?

Now I am not saying this is what the NSA et al have done or are doing; however, if I was in their shoes it’s certainly something I would investigate as a potential vector.

And a further thought arises… We know the NSA are not getting the best brains these days; industry is more lucrative by quite a large margin. So how do you “leverage the brains” out there to do it for you? Well, I can think of one way, and it’s like the old “use a poacher to catch a poacher” technique. What you do is get researchers to look for subversion detection techniques; they will think up all sorts of subversion techniques (as we have seen with one or two papers) as part of the process of coming up with detection techniques. You start the process as an “open” one, which gets fresh faced graduates going down that path for their doctoral research; you then ensure that the only job openings are either in the universities or defence companies, over which you have a significant hold due to the way you allocate the funding.

Which is kind of what we see currently…

Feel free to pick holes in my thoughts because the process will in part refine them and in part open new avenues to consider.

RobertT October 10, 2013 6:17 PM

@Clive Robinson

Codecs
I was involved with the whole 3G audio codec effort and pushed hard for MELP; frankly, I was amazed by the amount of negative feedback we were getting. Why wasn’t everyone as excited as we were about MELP? Half rate MELP (1200 bps) is a heck of a codec, and due to the lowish bit rates it obfuscates patterns in the speech that are obvious with CELP. I’ve always wondered why telcos resisted the introduction of a codec that could at least double or even triple their users per kHz of bandwidth.

FPGA tools:
I don’t have a lot of experience with FPGA tools except as far as using them as function prototypes to check out the interaction of new chip functional blocks. Basically we take our chip top level description and create an FPGA flow; this is done mainly so the software / firmware guys can get started while the main SOC is in development. We are never really interested in optimizing the FPGA flow; in fact we often wish to force it to use gate primitives rather than the built-in macro blocks. If I were looking for subversion in an FPGA I’d be focused on these macro blocks. Things like FFT/IFFT engines and large multipliers are the building blocks and are completely invisible to the user, whereas the gate-level equivalents can usually be figured out. By including a MULT requirement, and a speed requirement for the MULT, you drive the design specification towards including the dedicated functional block, and that gives you the built-in side channel.

The actual FPGA synthesis tools would also be an obvious point of attack; however, many users will take the time to understand what the FPGA tool synthesized from what they intended, because different styles of Verilog coding can produce vastly different sized synthesized blocks, so optimizing your coding style lets you use a smaller, cheaper device. That said, reconfigurable FPGAs are a security nightmare: they could contain almost anything without the user ever knowing what they really had.

SOC Tools
Corrupting an SOC digital synthesis tool in an undetectable manner would be a very difficult task. Frankly I’d probably just focus on corrupting the parts of the flow where chip designers don’t pay much attention. The obvious target here is testability.

Chip testing is a real black magic area: the design flow basically adds more serial (observability / controllability) points until some magic testability score is reached. Asking why we can’t simply…… is pointless, because the testability methods (especially full scan) are all built into the physical test equipment. So even if you can find a better way using BIST (Built-In Self Test), chances are the production tester would not support your method.

Now observability is the enemy of security, so you need to be VERY cautious about which registers make up the scan end points. I’ve seen a few snafus in this area where I did wonder “could anyone be that thick……”; it was interesting to see that future screw-ups by that same individual confirmed he was just incompetent. In some ways a chip designer surreptitiously inserting security weaknesses is akin to a grandmaster trying to pass themselves off as a high school level chess player: it is extremely hard to make errors (weaknesses in this case) without showing in other ways that you did so intentionally.

The best weaknesses are the ones where the Verilog code looks fine but the compiler/synthesizer builds something unexpected; making that happen requires an in-depth knowledge of the exact internal workings of the compiler. If you look at other instances of the same logic structure in their coding (maybe for other chips) and find that in all other instances they coded correctly, then you have a smoking gun. That said, most chip companies are too busy to employ anyone to look at security weaknesses in their hardware.
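A toy model of that scan-chain worry, purely illustrative (no real chip or tool flow is modelled): full scan threads every flip-flop onto one long shift register, so if the key register’s flops end up on the chain, “observability” means the tester, or anyone with test-mode access, can simply clock the key out.

```python
# Toy model of why scan-chain observability is dangerous (illustrative only).
# In test mode the whole chain is clocked out through the scan-out pin; if the
# key register's flops were included in the chain, shifting out the state *is*
# the attack. The registers and chain order below are made up.

def shift_out(scan_chain_bits):
    """Model of clocking the scan chain: each clock emits one bit at scan-out."""
    for bit in scan_chain_bits:
        yield bit

status_reg = [0, 1, 1, 0]
key_reg    = [1, 0, 1, 1, 0, 0, 1, 0]   # the secret nobody excluded from scan
counter    = [0, 0, 0, 1]
scan_chain = status_reg + key_reg + counter   # order fixed by place-and-route

captured = list(shift_out(scan_chain))
print("tester sees:", captured)
print("key bits   :", captured[4:12])   # attacker only needs the chain map
```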

You raise an interesting point about the NSA not attracting the best’n’brightest from the chip design sector. I’d say this is definitely correct; they are way behind the times. Many of the papers that I see for hardware exploits belong in a museum, because they might be possible but they target a design flow / production methodology that disappeared 10 or 15 years ago.

Fortunately I’m kept reasonably busy these days, because given enough time and budget I’m certain I could come up with some really interesting compromises. As for leveraging the brains out there, maybe they intentionally publish half-baked, outdated ideas to get others to reveal what is new and current. This is not a new strategy; from what I’ve heard it has been a cornerstone method of the great game since the game started.

I remember reading somewhere that the Polish mathematicians originally working on the Enigma problem got some help indirectly from the Germans by proposing some stupid ideas and being corrected by responses to the effect of “here’s what you’ve got to do…..”

Clive Robinson October 10, 2013 11:25 PM

@ RobertT,

Yup, I can see why MELP would be unpopular with some regulatory bodies’ national representatives. I know that the UK used to regularly “put the boot in”[1] on certain telco standards specifically to keep certain “useful features” alive and well. They even encouraged certain companies to make “encrypting phones” for export that used “variable frequency inversion” back in the 60s & 70s, knowing full well a sonogram would reveal not just the spoken words but the speaker as well.
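For anyone who hasn’t met it, a minimal sketch of why band-inversion “scrambling” is so weak (the fixed-inversion case; the “variable” version just hops the inversion point around, and the sample rate and tones below are made up for illustration):

```python
# Band-inversion sketch: multiplying samples by (-1)^n shifts the spectrum by
# fs/2, which mirrors a real baseband signal about fs/4. The operation is its
# own inverse and the energy/formant structure survives, which is why a
# sonogram of the scrambled audio still shows the speech (mirrored, not hidden).
import numpy as np

fs = 8000
t = np.arange(0, 0.05, 1 / fs)
speech_like = np.sin(2 * np.pi * 300 * t) + 0.5 * np.sin(2 * np.pi * 2300 * t)

invert = (-1.0) ** np.arange(len(speech_like))   # the whole "cipher"
scrambled = speech_like * invert
descrambled = scrambled * invert                 # same operation undoes it

print(np.allclose(descrambled, speech_like))     # True: an involution, no key needed

# The 300 Hz component now sits at fs/2 - 300 = 3700 Hz: mirrored, not hidden.
spec = np.abs(np.fft.rfft(scrambled))
print(int(np.argmax(spec) * fs / len(speech_like)))   # ~3700
```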

I think my idea about algorithm standards and the tools might have got slightly lost in translation. I don’t think they need to change the tools in any way, just characterise the process the tools implement. Knowing the weaknesses from that characterisation, they then specify algorithm standards that play to those weaknesses, giving rise to side channels in the practical implementation.

As I’ve said before, the NSA must have known long before the AES competition the likely consequences of CPU cache hits and “loop unrolling”. By specifying the competition the way they did, pushing for both speed on CPUs and minimisation of chip area, NIST almost guaranteed to open up side channels, and at least one open community researcher warned of this early on but was ignored.

The result, as we now know, was that most AES implementations used the competition code with all the loops unrolled, and thus with easily exploited time based side channels. And even today there are a lot of practical AES implementations out there with time based side channels, due not just to legacy executable use but also to new products using legacy code libraries…
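A toy simulation of the first-round cache-line leak in a T-table implementation (Osvik/Shamir/Tromer style); the timing measurement itself is abstracted into a function that simply reports which table cache lines were touched, and all names and sizes are illustrative:

```python
# Toy model of the first-round cache-line leak in a T-table AES implementation.
# With 4-byte table entries and 64-byte cache lines, 16 entries share a line,
# so a timing/eviction measurement reveals the top 4 bits of each first-round
# index p[i] XOR k[i]. Here the measurement is simulated, not actually timed.
import os

KEY = list(os.urandom(16))

def touched_lines(plaintext):
    """Simulated side channel: the set of T-table cache lines touched in round 1."""
    return {(p ^ k) >> 4 for p, k in zip(plaintext, KEY)}

# Attacker: a candidate top-nibble n for key byte i survives a trace only if
# the line it predicts, (pt[i] >> 4) XOR n, was actually touched.
candidates = [set(range(16)) for _ in range(16)]
for _ in range(50):                               # random known plaintexts
    pt = list(os.urandom(16))
    lines = touched_lines(pt)
    for i in range(16):
        candidates[i] = {n for n in candidates[i] if ((pt[i] >> 4) ^ n) in lines}

print([sorted(c) for c in candidates] ==
      [[k >> 4] for k in KEY])   # True (with overwhelming probability):
                                 # the top nibble of every key byte is recovered
```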

The reason it happened was that the competition, besides encouraging speed/area optimisation, only considered mathematical/logical attacks on the algorithm, not attacks on its practical implementation, so such side channels were not considered let alone investigated. And of all the AES finalists it appears that some people believe Rijndael was the most likely to have implementation side channels…

This separation between theoretical algorithm analysis and practical implementation errors is a really good way to get theoretically secure but practically weak systems in place. As I said, if I was the NSA, GCHQ, et al it’s exactly the area I would look to exploit at all levels.

Hence my thinking about how you would go about pushing the theoretical design in directions that exploit weaknesses/issues in the tool chain.

As I said, we already know some examples (mul & DPA), which raises the question of how many more there are that we don’t know about, due to the lack of open community research in that area.

It could be, as you suggest, that the closed community of the NSA et al has fallen several generations behind and is in effect only on par with the open community, but it’s not a proposition I would bet on.

[1] The French equivalent is “put the clog in”, with the French word for a wooden shoe being “sabot”, which gives us the word “sabotage”.
