Shannon-Weaver Model of Communication

Maeve Donohue
Design Theory & Critical Thinking
3 min read · Jan 24, 2017

In 1948, Claude Shannon, an American engineer and mathematician at the Bell Telephone Company, published the article “A Mathematical Theory of Communication” in the “Bell System Technical Journal.” The following year, Warren Weaver, an American scientist, co-authored an expanded book version with Shannon, and the framework became known as the “Shannon-Weaver model of communication.”

Shannon’s goal was to create a theory to help engineers find the most efficient way of transmitting electrical signals from one place to another.

The image below is a widely reproduced diagram of the Shannon-Weaver model:

My concept map of the model:

It hasn’t been reviewed by my teacher yet, so I’m not sure if it’s right :)

The concepts in this model were widely used in communications research. I found this website to be very helpful in explaining some of the concepts I didn’t fully grasp from the reading. Selected notes from that site below:

1.) Entropy: the measure of uncertainty in a system

“Uncertainty or entropy increases in exact proportion to the number of messages from which the source has to choose. In the simple matter of flipping a coin, entropy is low because the destination knows the probability of a coin’s turning up either heads or tails. In the case of a two-headed coin, there can be neither any freedom of choice nor any reduction in uncertainty so long as the destination knows exactly what the outcome must be. In other words, the value of a specific bit of information depends on the probability that it will occur. In general, the informative value of an item in a message decreases in exact proportion to the likelihood of its occurrence.”
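The coin-flip example in that passage can be checked numerically. A minimal sketch of Shannon's entropy formula, H = -Σ p·log₂(p) (the function name `entropy` is mine, not from the reading):

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: the average uncertainty of a message source."""
    h = -sum(p * math.log2(p) for p in probabilities if p > 0)
    return h if h > 0 else 0.0  # avoid returning -0.0 for a certain outcome

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit of uncertainty
print(entropy([1.0]))        # two-headed coin: 0.0 bits -- the destination knows the outcome
print(entropy([0.25] * 4))   # four equally likely messages: 2.0 bits
```

Note how entropy grows with the number of equally likely messages the source can choose from, exactly as the quote describes, and drops to zero for the two-headed coin, where there is no freedom of choice.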

2.) Redundancy: the degree to which information is not unique in the system

“Those items in a message that add no new information are redundant. Perfect redundancy is equal to total repetition and is found in pure form only in machines. In human beings, the very act of repetition changes, in some minute way, the meaning or the message and the larger social significance of the event. Zero redundancy creates sheer unpredictability, for there is no way of knowing what items in a sequence will come next. As a rule, no message can reach maximum efficiency unless it contains a balance between the unexpected and the predictable, between what the receiver must have underscored to acquire understanding and what can be deleted as extraneous.”

3.) Noise: the measure of information not related to the message

“Any additional signal that interferes with the reception of information is noise. In electrical apparatus noise comes only from within the system, whereas in human activity it may occur quite apart from the act of transmission and reception. Interference may result, for example, from background noise in the immediate surroundings, from noisy channels (a crackling microphone), from the organization and semantic aspects of the message (syntactical and semantical noise), or from psychological interference with encoding and decoding. Noise need not be considered a detriment unless it produces a significant interference with the reception of the message. Even when the disturbance is substantial, the strength of the signal or the rate of redundancy may be increased to restore efficiency.”
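The last sentence of that quote — that redundancy can be increased to restore efficiency over a noisy channel — is the idea behind error-correcting codes. A toy illustration (my own, not from the article) is the repetition code: send every bit three times and let the receiver take a majority vote:

```python
def encode(bits, n=3):
    """Repetition code: deliberately add redundancy by sending each bit n times."""
    return [b for b in bits for _ in range(n)]

def decode(received, n=3):
    """Majority vote over each group of n copies; corrects up to (n-1)//2 flips per group."""
    return [int(sum(received[i:i + n]) > n // 2)
            for i in range(0, len(received), n)]

message = [1, 0, 1]
print(encode(message))            # [1, 1, 1, 0, 0, 0, 1, 1, 1]

# Noise flips one copy in each of the first two groups; the vote still recovers it:
garbled = [1, 1, 0,  0, 1, 0,  1, 1, 1]
print(decode(garbled))            # [1, 0, 1]
```

The channel now carries three times as many symbols for the same message, which is exactly the trade the quote describes: efficiency is spent on redundancy so that noise no longer produces significant interference with reception.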

4.) Channel Capacity: the measure of the maximum amount of information a channel can carry

“The battle against uncertainty depends upon the number of alternative possibilities the message eliminates. Suppose you wanted to know where a given checker was located on a checkerboard. If you start by asking if it is located in the first black square at the extreme left of the second row from the top and find the answer to be no, sixty-three possibilities remain, a high level of uncertainty. On the other hand, if you first ask whether it falls on any square at the top half of the board, the alternative will be reduced by half regardless of the answer. By following the first strategy it could be necessary to ask up to sixty-three questions (inefficient indeed!); but by consistently halving the remaining possibilities, you will obtain the right answer in no more than six tries.”
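The halving strategy in that quote is just binary search, and the "no more than six tries" falls out of log₂(64) = 6. A small sketch (the function names are mine, and I number the squares 0-63 for simplicity):

```python
import math

def questions_needed(n):
    """Halving the possibilities each time takes ceil(log2(n)) yes/no questions."""
    return math.ceil(math.log2(n))

def find_square(target, n=64):
    """Binary search over squares 0..n-1; returns (square found, questions asked)."""
    lo, hi, asked = 0, n, 0
    while hi - lo > 1:
        mid = (lo + hi) // 2
        asked += 1                # one yes/no question: "is it below square mid?"
        if target < mid:
            hi = mid
        else:
            lo = mid
    return lo, asked

print(questions_needed(64))   # 6
print(find_square(37))        # (37, 6)
```

Each yes/no answer carries at most one bit, so six questions resolve the six bits of uncertainty in a 64-square board — which is why the one-square-at-a-time strategy, eliminating a single possibility per question, is so inefficient by comparison.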
