In a startling reversal of a 65-year-old assumption, researchers at MIT and the National University of Ireland, Maynooth have published a paper showing that a long-standing application of a 1948 information entropy theory is inaccurate.
Essentially, hackers and code breakers can break encryption significantly faster than anyone realized.
The paper, publicized today in an MIT news release, challenges the standard use of Shannon entropy, a measure based on the average probability that a given string of information bits will occur in a particular type of digital file. Authors Ken Duffy and Mark Christiansen of NUI Maynooth, along with MIT electrical engineering professor Muriel Médard and graduate student Flávio du Pin Calmon, explained that Shannon's theory of averages holds for general communications, but not for cryptography, where the worst-case scenario, such as an improbable but exploitable correlation, is the key to cracking an encryption.
One vulnerable area is user-selected passwords. “It’s still exponentially hard, but it’s exponentially easier than we thought,” Duffy said. “Attackers often use graphics processors to distribute the problem; you’d be surprised at how quickly you can guess stuff.”
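The gap between average-case and worst-case security measures can be illustrated with a small sketch. The toy distribution below is hypothetical (it is not from the paper): a few passwords are far more common than the rest, as in real user data. An attacker who guesses in decreasing order of probability needs, on average, far fewer attempts than the 2^H figure implied by Shannon entropy; the relevant worst-case measure, min-entropy, is set by the single most likely password.

```python
import math

# Hypothetical toy password distribution (illustrative only; not from the paper).
# Three common passwords dominate; 55 rarer ones share the remaining mass.
probs = [0.20, 0.15, 0.10] + [0.55 / 55] * 55  # sums to 1.0

# Shannon entropy: the *average* surprise, in bits.
shannon = -sum(p * math.log2(p) for p in probs)

# Min-entropy: determined by the single most likely password; this is the
# measure relevant to an attacker's best first guess.
min_entropy = -math.log2(max(probs))

# Expected number of guesses when guessing in decreasing-probability order.
ordered = sorted(probs, reverse=True)
expected_guesses = sum((i + 1) * p for i, p in enumerate(ordered))

print(f"Shannon entropy:  {shannon:.2f} bits")
print(f"Min-entropy:      {min_entropy:.2f} bits")
# Expected guesses comes out well below 2^shannon for this skewed distribution.
print(f"Expected guesses: {expected_guesses:.1f} (vs 2^H = {2 ** shannon:.1f})")
```

For this distribution the average guesswork is noticeably smaller than the 2^H count of "typical" passwords, which is the averaging pitfall the researchers describe.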
The researchers also extended their findings to possible hacking vulnerabilities in embedded chips on credit cards and entry key cards, a concept they’re presenting at the 2013 Asilomar Conference on Signals, Systems and Computers.
Yet because the paper points to a fundamental misconception about encryption security in general, its implications could reach further. Encryption is an inexact science and a topic of much debate lately. Whether this discovery relates to recent SD Times reports on encryption flaws in SIM cards, or to the fervor over e-mail encryption in the wake of PRISM, remains to be seen.