To this day, no other encryption scheme is known to be unbreakable. The problem with the one-time pad (so called because an agent would carry around his copy of a key on a pad and destroy each page of digits after it was used) is that the two parties to the communication must each have a copy of the key, and the key must be kept secret from spies or eavesdroppers.
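The scheme itself is a one-line operation. The sketch below is a minimal illustration (the sample message, and the use of Python's secrets module to generate the key, are my own choices, not details of any historical pad): each byte of the message is XORed with a byte of truly random key, decryption is the identical operation, and the key must be at least as long as the message and never reused.

```python
import secrets

def one_time_pad(message: bytes, key: bytes) -> bytes:
    """XOR each message byte with the corresponding key byte.

    Encryption and decryption are the same operation. The key must be
    truly random, at least as long as the message, and used only once.
    """
    if len(key) < len(message):
        raise ValueError("key must be at least as long as the message")
    return bytes(m ^ k for m, k in zip(message, key))

plaintext = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(plaintext))   # the shared secret "pad"
ciphertext = one_time_pad(plaintext, key)
assert one_time_pad(ciphertext, key) == plaintext
```

Everything therefore hinges on the key: producing enough truly random digits, getting an identical copy to the other party, and keeping both copies secret, which is exactly the distribution problem described next.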
Quantum cryptography solves that problem. More properly called quantum key distribution, the technique uses quantum mechanics and entanglement to generate a random key that is identical at each end of the quantum communications channel.
The quantum physics ensures that no one can eavesdrop and learn anything about the key: any surreptitious measurements would disturb subtle correlations that can be checked, similar to error-correction checks of data transmitted on a noisy communications line.
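A toy simulation conveys the flavor of how eavesdropping reveals itself. This is a purely classical sketch of a BB84-style exchange under simplifying assumptions (perfect devices, a noiseless channel, an eavesdropper who measures and resends every bit), not a description of any deployed system: the sender encodes random bits in randomly chosen bases, the receiver measures in random bases, the two keep only the positions where their bases agree, and then sacrifice a sample of those bits to check for disturbance.

```python
import random

def measure(bit: int, prep_basis: int, meas_basis: int) -> int:
    """Measuring in the preparation basis returns the encoded bit;
    measuring in the other basis gives a random outcome."""
    return bit if prep_basis == meas_basis else random.randint(0, 1)

def bb84_error_rate(n: int, eavesdrop: bool) -> float:
    """Toy BB84 run: return the disagreement rate Alice and Bob see
    when they compare their sifted key bits."""
    alice_bits  = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.randint(0, 1) for _ in range(n)]
    bob_bases   = [random.randint(0, 1) for _ in range(n)]

    bob_bits = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if eavesdrop:
            # Intercept-and-resend: Eve measures in a random basis and
            # forwards a fresh bit prepared in that basis.
            e_basis = random.randint(0, 1)
            bit, a_basis = measure(bit, a_basis, e_basis), e_basis
        bob_bits.append(measure(bit, a_basis, b_basis))

    # Sifting: keep only positions where Alice and Bob chose the same basis.
    sifted = [(a, b) for a, b, x, y in zip(alice_bits, bob_bits, alice_bases, bob_bases)
              if x == y]
    return sum(a != b for a, b in sifted) / len(sifted)

print(bb84_error_rate(10_000, eavesdrop=False))  # essentially 0
print(bb84_error_rate(10_000, eavesdrop=True))   # roughly 0.25
```

Without an eavesdropper the sifted bits agree essentially perfectly; an intercept-and-resend attack introduces disagreements in roughly a quarter of them, so comparing a random sample of the sifted key over a public channel exposes the intrusion, and the surviving bits can serve as the shared pad.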
Encryption based on the Vernam cipher and quantum key distribution is perfectly secure: quantum physics guarantees security of the key and Shannon's theorem proves that the encryption method is unbreakable. At Bell Labs and later M.I.T., Shannon was known for riding a unicycle down the hallways, sometimes juggling as he went.
At other times he hopped along the hallways on a pogo stick. He was always a lover of gadgets and among other things built a robotic mouse that solved mazes and a computer called the Throbac ("THrifty ROman-numeral BAckward-looking Computer") that computed in Roman numerals.
In 1950 he wrote an article for Scientific American on the principles of programming computers to play chess [see "A Chess-Playing Machine," by Claude E. Shannon; Scientific American, February 1950]. In the 1990s, in one of life's tragic ironies, Shannon came down with Alzheimer's disease, which could be described as the insidious loss of information in the brain. The communications channel to one's memories--one's past and one's very personality--is progressively degraded until every effort at error correction is overwhelmed and no meaningful signal can pass through.
The bandwidth falls to zero. The extraordinary pattern of information processing that was Claude Shannon finally succumbed to the depredations of thermodynamic entropy in February 2001. But some of the signal generated by Shannon lives on, expressed in the information technology in which our own lives are now immersed. Graham P. Collins is on the board of editors at Scientific American.
The paper laid out what is now known in finance theory as the Kelly Criterion, a rule for allocating capital to risky propositions, whether at the blackjack table or in the stock market. Information scholars are currently following the lead of physics, exploring beyond classical assumptions about the state of a communication system and incorporating concepts from quantum mechanics.
Instead of using bits that resolve to one of two binary states, quantum information processing considers the possibility of information having multiple states that can be superposed. Shannon had a way of getting behind things. Are there other phenomena that could be understood and described using the template provided by information theory?
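Returning for a moment to that idea of superposition: in the usual textbook notation (an illustration added here, not drawn from the article itself), a quantum bit is not confined to 0 or 1 but occupies a weighted blend of both,

$$
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,
$$

where a measurement yields 0 with probability $|\alpha|^2$ and 1 with probability $|\beta|^2$.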
The parallels between information and economics identified by Kelly open the question of whether the concepts should complete the round-trip from Shannon to Kelly and back to Shannon.
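Kelly's rule itself is compact. The snippet below implements the standard formula with illustrative numbers of my own choosing (Kelly's 1956 paper derives it by maximizing the logarithmic growth rate of capital): for a wager that pays b-to-1 and is won with probability p, the growth-optimal fraction of the bankroll to stake is (bp - (1 - p))/b.

```python
def kelly_fraction(p: float, b: float) -> float:
    """Optimal fraction of capital to wager on a bet that pays b-to-1
    and is won with probability p. A result <= 0 means the bet has no
    edge and should be skipped."""
    q = 1.0 - p
    return (b * p - q) / b

# A 60% chance of winning at even odds: stake 20% of the bankroll.
print(kelly_fraction(p=0.6, b=1.0))  # 0.2
```

Bet more than this and long-run growth suffers from volatility; bet less and capital compounds more slowly. The logarithm that sets the optimum is the same one that appears in Shannon's entropy.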
The information theory equivalents of financial concepts like risk, return, volatility, and the Sharpe Ratio could offer insight and discipline to some forms of communication. Without meaning, information theory can solve the engineering problem—but only the engineering problem. Granting engineering problems primacy in questions of meaning, however, seems like a hasty, ignominious surrender.
We should at least wait to size up our coming mechanical overlords by taking stock of how current advances in artificial intelligence and machine learning play out. Shannon clearly did not envision that kind of future for information theory.
Smack in the middle of his paper, in a statement that almost serves as a giant asterisk, Shannon turned to James Joyce: "The Basic English vocabulary is limited to 850 words and the redundancy is very high.
This is reflected in the expansion that occurs when a passage is translated into Basic English. Joyce on the other hand enlarges the vocabulary and is alleged to achieve a compression of semantic content." Yet it is impossible to conclude that this comment, referencing possibly the least-mechanical English-language writer of all, was anything but intentional.
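Redundancy in Shannon's sense is easy to glimpse with a general-purpose compressor. The comparison below is a rough illustration using made-up sample strings, not a measurement from his paper: highly repetitive, predictable text shrinks dramatically, while letters chosen at random resist compression.

```python
import random
import string
import zlib

def compression_ratio(text: str) -> float:
    """Compressed size divided by original size; lower means more redundant."""
    data = text.encode("utf-8")
    return len(zlib.compress(data, level=9)) / len(data)

repetitive = "the cat sat on the mat. " * 200                        # highly redundant
scrambled = "".join(random.choices(string.ascii_lowercase, k=4800))  # little structure

print(f"repetitive: {compression_ratio(repetitive):.2f}")  # a tiny fraction of the original
print(f"scrambled:  {compression_ratio(scrambled):.2f}")   # compresses far less
```

Ordinary English falls between the two extremes, and its statistical regularities are precisely the redundancy Shannon was quantifying when he set Basic English and Joyce at opposite ends of the scale.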
While winning his court case for a meaning-free information theory, Shannon left the door open for appeals. Theoretical approaches to meaning, in fact, do abound. One came from Warren Weaver, an early champion of machine translation. By definition, translation is a task that requires more than the mere replication of a message.
Weaver once wrote about a hack that would help prepare verbal inputs to make them digestible for even early computers: "Suppose we take a vocabulary of 2,000 words, and admit for good measure all the two-word combinations as if they were single words. The vocabulary is still only four million: and that is not so formidable a number to a modern computer, is it?"
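The four-million figure is just counting (my gloss on the arithmetic, not part of Weaver's quotation): with 2,000 base words, the ordered two-word combinations number

$$
2{,}000 \times 2{,}000 = 4{,}000{,}000.
$$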
Shannon's maze-solving mouse did more than just send or receive information about its maze. In considering meaning, the humanists and the scientists are heading toward the same destination. What Shannon was doing was not simply hard math. He was thinking about problems that, at the same time, consumed people in linguistics and philosophy as well.
Could information theory open a channel between the math-science tribe and the humanities tribe? Or perhaps between people and machines? The bots were trained in English, but suddenly became fluent in a pidgin understandable only to each other. It is reminiscent of the transformations that human language has undergone at the hands of text messaging (IMHO).
If we develop that thought further, information theory may have a role to play as a rubric for good communication—one that cuts across at least two cultures. This came as a total shock to the communication engineers of the day. Given that framework of uncertainty and probability, Shannon set out in his landmark paper to systematically determine the fundamental limit of communication.
His answer came in three parts. First, Shannon came up with a formula for the minimum number of bits per second to represent the information, a number he called its entropy rate, H. This number quantifies the uncertainty involved in determining which message the source will generate. The lower the entropy rate, the less the uncertainty, and thus the easier it is to compress the message into something shorter.
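In symbols (the formula is Shannon's; the indexing convention here is the usual textbook one), a source that produces message $i$ with probability $p_i$ has entropy

$$
H = -\sum_i p_i \log_2 p_i \ \text{bits},
$$

which is largest when every message is equally likely and falls as the source becomes more predictable.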
For example, texting at the rate of 100 English letters per minute means sending one out of 26^100 possible messages every minute, each represented by a sequence of 100 letters. If all of those sequences were equally likely, about 470 bits per minute would be needed, since 2^470 ≈ 26^100.
In reality, some sequences are much more likely than others, and the entropy rate is much lower, allowing for greater compression. Second, Shannon showed that every noisy channel has a capacity, C, the maximum number of bits per second that can be pushed through it reliably. Third, he proved that reliable communication is possible precisely when the entropy rate of the source is less than the capacity of the channel. Thus, information is like water: If the flow rate is less than the capacity of the pipe, then the stream gets through reliably. While this is a theory of communication, it is, at the same time, a theory of how information is produced and transferred — an information theory.
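The gap between the equally-likely bound and real text can be estimated in a few lines (an illustration with an assumed sample sentence, not a calculation from the article): the uniform bound is log2 26 ≈ 4.7 bits per letter, while the empirical single-letter distribution of English already comes in lower, and modeling longer context lowers it much further.

```python
from collections import Counter
from math import log2

def entropy_bits_per_symbol(text: str) -> float:
    """Empirical entropy -sum(p * log2 p) of the single-letter distribution."""
    letters = [c for c in text.lower() if c.isalpha()]
    counts = Counter(letters)
    total = len(letters)
    return -sum((n / total) * log2(n / total) for n in counts.values())

sample = (
    "the quick brown fox jumps over the lazy dog and then the dog "
    "chases the fox around the quiet brown field until both are tired"
)

print(f"uniform over 26 letters: {log2(26):.2f} bits per letter")   # 4.70
print(f"single-letter estimate:  {entropy_bits_per_symbol(sample):.2f} bits per letter")
```

Even the single-letter estimate overstates the true entropy rate; accounting for the dependence of each letter on the ones before it pushes English down toward the roughly one bit per letter Shannon later estimated, which is why text compresses so well.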
His theorems led to some counterintuitive conclusions. Suppose you are talking in a very noisy place. How can you make sure your message gets through? Maybe repeating it many times? Sure, the more times you repeat yourself, the more reliable the communication is, but you pay for that reliability in speed. Shannon showed us we can do far better. Repeating a message is an example of using a code to transmit a message, and by using different and more sophisticated codes, one can communicate fast — all the way up to the speed limit, C — while maintaining any given degree of reliability.
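The repetition-versus-better-codes point shows up in a small simulation (a sketch with parameters of my own choosing, not from the article): over a channel that flips each bit with probability 0.1, sending every bit three times and taking a majority vote cuts the error rate sharply, but it also cuts the transmission rate to one third, whereas Shannon's theorem says rates up to the channel's capacity of about 0.53 bits per channel use (1 minus the binary entropy of 0.1) are achievable with arbitrarily small error, given sufficiently sophisticated codes.

```python
import random

FLIP_PROB = 0.1  # the channel flips each transmitted bit with this probability

def noisy_channel(bit: int) -> int:
    """A binary symmetric channel: flip the bit with probability FLIP_PROB."""
    return bit ^ 1 if random.random() < FLIP_PROB else bit

def repetition_code(bit: int, copies: int = 3) -> int:
    """Send several noisy copies of the bit and decode by majority vote."""
    received = [noisy_channel(bit) for _ in range(copies)]
    return 1 if sum(received) > copies // 2 else 0

def error_rate(send, trials: int = 100_000) -> float:
    """Fraction of randomly chosen bits that are decoded incorrectly."""
    errors = 0
    for _ in range(trials):
        bit = random.randint(0, 1)
        if send(bit) != bit:
            errors += 1
    return errors / trials

print("uncoded bit:  ", error_rate(noisy_channel))    # about 0.10
print("3x repetition:", error_rate(repetition_code))  # about 0.028, at one third the rate
```

Repetition buys reliability only by slowing down; the codes Shannon's theorem promises, which modern turbo and LDPC codes approximate, approach the capacity limit without that sacrifice.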
This surprising result is a cornerstone of the modern digital information age, where the bit reigns supreme as the universal currency of information. His theory is as fundamental as the physical laws of nature. In that sense, he was a scientist. Shannon invented new mathematics to describe the laws of communication. He introduced new ideas, like the entropy rate of a probabilistic model, which have been applied in far-ranging branches of mathematics such as ergodic theory, the study of long-term behavior of dynamical systems.