NSA Outs Top-Secret Report That Missed the Future of Supercomputing

A Cray XMP-24 supercomputer used by the NSA. Photo: NSA

The mid-1990s were dark years for the National Security Agency. Its budget had been slashed, top technical talent was seeping out, and the company that made its supercomputers was in trouble.

You can get a sense of the agency’s worry — and its myopia — in a top-secret report on the state of supercomputing that the U.S. spy agency recently declassified.

Originally published in the winter 1995 edition of the NSA-only code-breakers’ journal Cryptologic Quarterly, the report opens a small window onto the secretive agency — although the glimpse it provides is now rather dated. One thing it does show is how wrong everyone was in the mid-’90s about what lay ahead for the supercomputer. Even the United States’ premier supercomputer user was in the dark.

The report calls on the NSA to do what it can to save its favorite supercomputing company: Cray Research Inc., or CRI. “The commercial viability of CRI and the rest of the supercomputing industry is critical not only to NSA but also to the entire Western world cryptanalytic community,” says the report.

Cray Research was sold to Silicon Graphics the next year and then spun off into another company. Although the new Cray still makes some of the world’s most powerful supercomputers, they’re nothing like the futuristic Freon-cooled machines the NSA was trying to preserve.

The NSA did get some things right. It correctly predicted the rise of companies such as IBM, whose Unix systems would become important in the supercomputing space. But it missed the major change that would completely reshape supercomputing over the course of the next decade: the rise of cheap consumer PCs clustered together and — most importantly — the rise of open source software such as Linux that made this all possible.

Clustered systems started taking over the supercomputing space just five years after the report was published, according to supercomputer expert Jason Lockhart, who spoke to us via e-mail. Lockhart should know; he’s one of the Virginia Tech researchers who shocked the world in 2003 by building one of the world’s largest supercomputers out of cheap Macintosh computers.

“The advances in commodity computing architectures made purpose built systems almost completely un-viable financially,” Lockhart says. “We saw most of the large system vendors collapse. Both SGI and Cray had to completely rethink their R&D efforts in order to remain alive in the marketplace.”

It’s hard to fault the NSA for missing these trends, says Matthew Aid, an NSA historian who first blogged about the report. After all, the entire technology industry was caught flat-footed by the rise of Linux.

But the report was also published during a particularly tough time for the NSA’s code-breakers, who were hammered by budget cuts in the early 1990s. “In the span of eight years, the NSA literally went deaf because of the changes in the technology,” Aid says. “The NSA lost a third of its manpower and a third of its budget in the five years after the Soviet Union collapsed.”

Today the NSA is again building the world’s most powerful supercomputers, which will be housed in a 260,000-square-foot Multiprogram Computational Data Center at Oak Ridge National Laboratory that is set to be completed in 2018.

Embedded document: NSA and the Supercomputer — https://www.scribd.com/embeds/112742911/content?start_page=1&view_mode=scroll&access_key=key-6ui3cazzvez0zdiewg2