The Machine That Is Our Society
University of Colorado at Boulder
Communication controls, and it takes on this role whether or not the communicator intends it to. As the saying goes, "one cannot not communicate." Everything we do communicates, so understanding communication, and thus how to avoid communication problems, becomes all the more important. By applying cybernetic theory to the concept of social contracts, and to a specific communication problem currently facing Toyota, we can disassemble the system into individual components, allowing problems to surface.
Explanation of the Theory
Norbert Wiener (1954) defines cybernetics as "the science of communication and control in machines and living organisms." His theory holds that the basic function of communication is to control the environment, and the theory explains how that process works. Under it, communication is viewed as "information processing" (R.T. Craig, lecture, February 23, 2010). Wiener proposes that humans are not essentially different from machines: when he gives an order to a machine, he is aware of the order that was sent out and receives some sort of feedback that the order was received, just as he would when giving an order to a person.
In every system there is entropy, the amount of randomness or disorder (R.T. Craig, lecture, February 23, 2010). Noise is a common source of entropy: if the message is not received accurately, errors are created, and the higher the rate of errors in the system, the higher the entropy becomes. A completely unpredictable system would have the highest entropy. Because entropy naturally tends to increase, counteracting it is necessary to maintain organization. Simple machines, such as a battery-operated clock, have no way to counteract entropy. Cybernetic machines, on the other hand, like thermostats and people, are able to use information to make the necessary corrections. They track the environment through "sense organs," which allow them to make adaptive responses.
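Wiener's notion of entropy has a precise information-theoretic counterpart in Shannon entropy. As an illustrative sketch (my own addition, not drawn from the essay's sources), the entropy of a message source rises as its messages become less predictable, which is why a high error rate makes a system more disordered:

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: higher values mean a less predictable source."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A reliable channel: one message overwhelmingly likely, so disorder is low.
print(entropy([0.97, 0.01, 0.01, 0.01]))  # low entropy, about 0.24 bits

# A noisy channel: every message equally likely, so disorder is maximal.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # prints 2.0
```

The uniform distribution is the "completely unpredictable system" of the paragraph above: no distribution over four messages can exceed its 2.0 bits of entropy.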
There are two kinds of feedback: negative and positive. Negative feedback counteracts change in the environment, while positive feedback amplifies it (R.T. Craig, lecture, February 23, 2010). Negative feedback is corrective; positive feedback reinforces a deviation. The two kinds can work together, but if positive feedback continues without being checked by negative feedback, the change becomes uncontrolled and can destroy the system.
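The thermostat mentioned above is the classic cybernetic machine: it senses a deviation from a set point and applies negative feedback against it. A minimal sketch of both kinds of feedback (the function name, gain value, and temperatures are my own illustrative assumptions):

```python
def thermostat_step(temperature, setpoint, gain=0.5):
    """Negative feedback: correct the temperature back toward the setpoint
    by an amount proportional to the current deviation."""
    error = temperature - setpoint       # sense organ: measure the deviation
    return temperature - gain * error    # counteract the deviation

# Negative feedback: starting far from the setpoint, repeated corrections converge.
temp = 30.0
for _ in range(10):
    temp = thermostat_step(temp, setpoint=20.0)
print(round(temp, 2))  # prints 20.01 -- nearly back at the setpoint

# Positive feedback: the same deviation, amplified instead of corrected,
# grows without bound (a 1-degree deviation becomes ~58 degrees in ten steps).
runaway = 21.0
for _ in range(10):
    runaway = runaway + 0.5 * (runaway - 20.0)
```

The second loop is the "uncontrolled change" described above: with nothing to check it, each step reinforces the last one's deviation.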
To apply cybernetic theory to a practical problem, consider social contracts. The term social contract covers a broad variety of philosophical theories about how social order is maintained (New World Encyclopedia). In essence, to benefit from social order, citizens must give up some rights to the government. John Locke, one of the philosophers who laid the groundwork for democracy, wrote, "The care… of every man's soul belongs unto himself and is to be left unto himself. Laws provide, as much as is possible, that the goods and health of subjects be not injured by the fraud and violence of others; they do not guard them from the negligence or ill-husbandry of the possessors themselves" (Locke, 1689/2004). Locke is saying that while humans are able to make their own decisions, to be active members of society they must abide by the rules in place, thereby relinquishing some of their control. More basically, wherever there is a government, there is a social contract. However, the mere existence of laws does not ensure they will be followed.
Oftentimes changes or additions must be made to maintain order throughout the system. A key component of the social contract is that the civil rights it grants are not indelible; they can be negotiated and changed through elections and legislation (New World Encyclopedia). On a large scale, our government is a complex cybernetic machine that regulates control. Under our democracy we are able to change or modify our laws, and we do so continually. The system's sense organs gauge changes in the environment, and the system adapts its organization to counteract increasing entropy through negative feedback. When a law fails to prove successful, legislators must provide negative feedback to counteract the change and preserve the equilibrium of the system.
A current example can be seen in the case against Toyota for its manufacturing defect. Toyota had been selling cars with gas pedals that would occasionally "stick," an extremely dangerous flaw (New York Times). Toyota's profits had acted as positive feedback, as the company had been, and still is, extremely profitable. Even after receiving negative feedback from buyers, Toyota failed to acknowledge receipt of their complaints and offered these customers no support. Because it did not use negative feedback to counteract the natural tendency for entropy to increase, the errors in the system multiplied. In fact, Toyota continued to ignore the messages for four months before the problem was leaked to the public after a serious accident. Recently, the National Highway Traffic Safety Administration sought the maximum fine of $16.4 million against Toyota for failing to report its knowledge of the safety defect within the required five-day window (New York Times). This fine serves as negative feedback within the system, attempting to counteract the disorganization that occurred.
It is important to understand that under the current social contract, Toyota's penalty is the maximum fine that can be levied. That amount is small compared to Toyota's net profits, which has prompted calls to raise the maximum fine for future cases (New York Times). Such a change could take years, but the delay would not be the fault of the people being slow to respond; it would be the government responding slowly. That would be a flaw in this particular cybernetic system, not in cybernetic theory in general.
Through the lens of cybernetics, both social contracts and, more broadly, our government can be viewed as communication systems of control. By communicating laws and regulations for how to live, social order is maintained. When errors or gaps in the system arise, laws and regulations are negotiated and renegotiated to reduce entropy. Both the Toyota case and the push to raise the maximum fine in the future can be understood in these terms.
Wiener articulated cybernetic theory decades before the current state of computers and information technology. His theory is especially useful when thinking of communication systems as machines. Its components, including the sending and receiving of messages, entropy, negative and positive feedback, and sense organs, remain practical and informative ways of understanding communication systems today. However, while cybernetic theory explains these attributes quite well, it is less successful at explaining how meaning is created in the communication process or at identifying the sources of entropy in a system. Systems theory as discussed by Stanley Deetz provides the necessary framework for this understanding.
Deetz writes, "A systems perspective assumes that meaning arises between people as they communicate rather than being a knowable fixed property prior to interaction" (Deetz, p. 2). A systems perspective assumes that interactants' meanings can be unclear, multiple, and multi-leveled. Because it is impossible not to communicate, messages are constantly being produced in the system whether or not the sender intends to convey anything. These assumptions go beyond cybernetic theory in explaining how communication errors form and are magnified, increasing entropy. Furthermore, Deetz's systems theory assumes that a system can either be "representative of the participants and openly form them in response to the environment… or may become skewed and systematically distorted" (Deetz, p. 5). This is especially important given the power differences in our government, and in society at large, which grant certain people more control than others.
Another shortcoming of cybernetic theory is that it treats humans as comparable to machines. While this comparison is also one of the theory's strengths, the reality is that humans are fundamentally different from machines. Humans have a genuine capacity to empathize, show a wide range of emotions, and argue over morality. Machines deal in "yes" or "no" answers and have no capacity for grey areas. What is right and wrong can be subjective, and because computers do not understand morality, they are not equipped to deal with such problems. This, I argue, is a serious obstacle to applying cybernetic theory to humans.
All communication theories are abstractions of the broader concept of communication, and different theories frame different problems. Cybernetic theory calls attention to the machine-like features of communication, frames communication problems as system malfunctions, and addresses issues of control within a system. Other theories, such as Stanley Deetz's systems theory, are more applicable when analyzing how meanings are formed, translated, and sometimes distorted. Even though Wiener developed cybernetic theory long before the onset of computers and modern technology, it remains applicable to communication problems today, offering insight into both improving and understanding complex communication systems.
References
Deetz, S. (n.d.). Linear or systems models of communication. University of Colorado at Boulder.
Locke, J. (2004). Two treatises of government and a letter concerning toleration (I. Shapiro, Trans.). New Haven: Yale University Press. (Original work published 1689)
Maynard, M. (2010, April 5). U.S. is seeking a fine of $16.4 million against Toyota. New York Times. Retrieved from http://www.nytimes.com/2010/04/06/business/06toyota.html?scp=4&sq=toyota&st=cse
New World Encyclopedia. (n.d.). Social contract. Retrieved from http://www.newworldencyclopedia.org/entry/Social_contract
Stanford Encyclopedia of Philosophy. (2007). John Locke. Retrieved from http://plato.stanford.edu/entries/locke/
Wiener, N. (1954). Cybernetics in history. In The human use of human beings: Cybernetics and society (pp. 15-27). Boston: Houghton Mifflin.