In psychology, instrumental learning occurs when an organism learns a behavior that brings about certain outcomes.
Think about pet dogs. They often obey their owners' commands: when the owner says “sit”, the dog sits down. How does a dog, which cannot understand human language, perform this act?
They are taught such behavior through instrumental learning, also known as operant conditioning. Not just animals but human beings also learn various activities (say driving cars or using computers) through the same procedure.
Thorndike’s Instrumental Learning Concept
The term “instrumental learning” was coined by the American psychologist Edward L. Thorndike, who explained this process through his “law of effect”, which states that:
in a given situation, a response followed by a satisfying consequence will become more likely to occur and a response followed by an annoying consequence will become less likely to occur. (Passer & Smith, 2004)
Thorndike was exploring how animals learn to solve problems and built a special cage, called the puzzle box, for this purpose (Passer & Smith, 2004). This box could be opened by stepping on a lever, and Thorndike placed a hungry animal (such as a cat) inside it.
The food was kept outside, so to get to it, the animal had to learn how to open the box.
The cat tried various things—scratching the walls, pacing around, digging through the floor—until it finally stepped on the lever, opening the door.
Over many trials, the cat’s performance improved, and eventually, it learned to press the lever as soon as the door was shut. Thorndike concluded that, since the performance improved slowly, the animal did not gain insight into the solution.
Instead, through trial and error, it gradually eliminated the responses that did not work and became more likely to perform the action that opened the door. He called this process instrumental learning because the animal’s behavior is instrumental in bringing about a certain outcome (Thorndike, 1911).
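To make the trial-and-error character of this process concrete, here is a minimal simulation sketch in Python. It is not Thorndike’s own model; it simply assumes that each candidate response has a selection weight and that only the successful response (pressing the lever) is strengthened after a trial, so escapes gradually require fewer futile attempts.

```python
import random

# Candidate responses the cat can try in the puzzle box; only one opens the door.
responses = ["scratch walls", "pace around", "dig at floor", "press lever"]
weights = {r: 1.0 for r in responses}  # start with equal selection weights

def run_trial():
    """One trial: sample responses until the lever is pressed; return attempts used."""
    attempts = 0
    while True:
        attempts += 1
        choice = random.choices(responses, weights=[weights[r] for r in responses])[0]
        if choice == "press lever":
            weights[choice] += 1.0  # law of effect: strengthen the successful response
            return attempts

for trial in range(1, 21):
    print(f"Trial {trial:2d}: escaped after {run_trial()} attempt(s)")
```

Because only the effective response gains weight, the number of attempts per trial falls gradually rather than dropping to one all at once, which mirrors the slow learning curves Thorndike observed.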
Skinner’s Operant Conditioning Theory
Building upon the work of Thorndike, Skinner developed the concept of “operant conditioning”.
He chose this term because the organism “operates” on its environment in some way, and he defined the concept as follows:
Operant conditioning is a type of learning in which behavior is influenced by the consequences that follow it. (Skinner, 1938)
Like Thorndike, Skinner also built a special chamber, now commonly known as the Skinner box. Inside it, a lever is placed above a small cup, and when the lever is pressed, food drops into the cup. A hungry rat is put inside the chamber, and as it moves around, it accidentally presses the lever.
The food drops into the cup and the rat eats it. The rat’s behavior is recorded with a cumulative recorder, and Skinner found that the rat pressed the lever more frequently over time. He also identified different kinds of consequences, most importantly reinforcement and punishment.
Reinforcement occurs when a response is strengthened by an outcome that follows it (Passer & Smith, 2004). By “strengthened”, we usually mean that there is an increase in the frequency of the response.
In our example, the food is a reinforcer, as it causes the rat to press the lever more frequently.
In contrast, punishment occurs when a response is weakened by outcomes that follow it. Imagine that in our box, instead of giving food, the lever delivers a tiny electric shock. Now, the shock will act as a punisher, reducing the rat’s frequency of lever pressing.
Following Darwin, Skinner viewed operant conditioning as a type of natural selection that allows an organism to adapt to the environment. Through this process, organisms learn to increase behaviors that lead to favorable results and reduce those that lead to unfavorable results; this is in line with Thorndike’s law of effect.
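To illustrate how reinforcement and punishment pull the response rate in opposite directions, here is a small Python sketch. The update rule and numbers are assumptions made for illustration, not a model taken from Skinner’s work: each consequence nudges the probability of lever pressing up (food) or down (shock).

```python
def condition(press_prob, consequence, trials=50, step=0.05):
    """Repeatedly apply a consequence and return the adjusted probability of pressing."""
    for _ in range(trials):
        if consequence == "food":     # reinforcement strengthens the response
            press_prob += step * (1.0 - press_prob)
        elif consequence == "shock":  # punishment weakens the response
            press_prob -= step * press_prob
    return press_prob

baseline = 0.2  # the rat occasionally presses the lever by accident
print("After food follows each press :", round(condition(baseline, "food"), 2))   # rises toward 1
print("After shock follows each press:", round(condition(baseline, "shock"), 2))  # falls toward 0
```

Changing the consequence, and nothing else, sends the same starting behavior toward opposite extremes, which is the point of the distinction between reinforcement and punishment.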
Operant Conditioning Examples
- Education: Operant conditioning is widely used in the field of education. Skinner was troubled by the inefficiency of traditional instructional methods, so he created mechanical teaching machines. Each machine presented study material, quizzed the student, and provided immediate feedback. Students could repeat the lesson or proceed to the next chapter. Later, personal computers brought Skinner’s vision to the entire world. Computerized instruction still rests on the same core principles of immediate performance feedback and self-paced learning, and it is used in business, industry, and the military (Perez et al., 2006).
- Workplace: Skinner’s work also revolutionized the workplace by changing our perception of motivation. A key behaviorist assumption is that poor performance should not simply be attributed to laziness or a bad attitude (Passer & Smith, 2004); instead, the environment is probably not offering the proper consequences to reinforce the desired behaviors. Today, most corporations run training programs that enhance managers’ effectiveness in bringing about desired worker behavior. They also use incentive systems, such as bonuses and stock options, to encourage employees. Finally, token economies are also popular: they reinforce desired behavior with tokens (like points) that can later be traded for tangible rewards such as recreation time or prizes (see the sketch after this list).
- Applied Behavior Analysis: Operant conditioning led to a field called applied behavior analysis, which combines the behavioral approach with the scientific method to solve individual and societal problems (Passer & Smith, 2004). Applied behavior analysts design and implement programs to change behavior and measure their effectiveness by gathering data before and after the program. Applied behavior analysis has helped change a wide range of behaviors, from chronic hair-pulling to seat belt use (Byrd et al., 2002). It has also helped improve student performance, increase employee productivity, and promote energy conservation.
- Training Animals: From training your dog to sit on command to making animals perform remarkable acts, operant conditioning can do it all. As we saw earlier, both Thorndike and Skinner began their work by studying how animals learn, and animal learning continues to be a major part of the field. Today, many animals are trained to perform on screen or in circuses, while many others assist people with disabilities. Police dogs help officers on routine patrol, often using their sense of smell to locate illegal drugs, hidden bombs, and missing people (Gazit & Terkel, 2003). The US Navy even uses dolphins to patrol the waters around nuclear submarine bases (Morrison, 1988).
- Military Training: Modern military training uses both classical conditioning and operant conditioning to closely simulate actual combat. As human beings, we have an innate resistance to killing a member of our own species. However, this reluctance is a serious problem for the military, as S.L.A. Marshall’s book Men Against Fire revealed: only 15% of WWII soldiers fired with the aim of killing (Marshall, 1947). After the US Army accepted his research, it introduced new training protocols that resemble operant conditioning. Firing ranges were equipped with three-dimensional, man-shaped pop-up targets that collapsed when hit, providing immediate feedback that acted as positive reinforcement. More realistic training, praise from superiors, and marksmanship awards are some of the other methods the military employs.
- Parenting: Operant conditioning principles also shape parenting. Parents use them to instill various behaviors in their children, creating a reward-based learning system. For example, a child who regularly ignores the chore of tidying their room might receive their favorite snack for doing it. Over time, the child associates cleaning the room with the reward, and eventually with a sense of accomplishment, which reinforces the behavior. Likewise, operant conditioning can discourage inappropriate or harmful behaviors: when such behavior is simply not rewarded, the child learns that it does not earn the parents’ acknowledgment. By applying these principles from behavioral psychology, parents can constructively shape their children’s habits and values (Larson, 2010).
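As mentioned in the workplace example above, a token economy is essentially bookkeeping: target behaviors earn tokens, and tokens are later exchanged for backup rewards. The sketch below is a hypothetical illustration of that bookkeeping; the behaviors, token values, and rewards are invented for the example.

```python
class TokenEconomy:
    """Minimal token-economy bookkeeping (hypothetical behaviors and rewards)."""

    def __init__(self, token_values, reward_costs):
        self.token_values = token_values   # tokens earned per target behavior
        self.reward_costs = reward_costs   # token price of each backup reward
        self.balance = 0

    def reinforce(self, behavior):
        """Award tokens immediately after a target behavior occurs."""
        self.balance += self.token_values.get(behavior, 0)

    def exchange(self, reward):
        """Trade accumulated tokens for a tangible backup reward, if affordable."""
        cost = self.reward_costs[reward]
        if self.balance < cost:
            return f"Not enough tokens for {reward} ({self.balance}/{cost})"
        self.balance -= cost
        return f"Redeemed: {reward}"

economy = TokenEconomy(
    token_values={"report finished on time": 3, "helped a colleague": 2},
    reward_costs={"extra break": 5, "gift card": 10},
)
economy.reinforce("report finished on time")
economy.reinforce("helped a colleague")
print(economy.exchange("extra break"))  # 5 tokens earned, so this succeeds
print(economy.exchange("gift card"))    # balance is back to 0, so this fails
```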
Conclusion
Instrumental learning, which later developed into operant conditioning, teaches organisms to engage more frequently in behavior that leads to favorable outcomes.
So, if stepping on a lever brings food, then a rat will do it more frequently. On the other hand, if it brings an electric shock, the rat will do it less often. These are examples of reinforcement and punishment respectively.
Operant conditioning helps organisms, including human beings, adapt to their environment. From parenting children at home to managing employees at work, operant conditioning can improve our lives in innumerable ways.
References
Byrd, M. R., Richards, D. F., Hove, G., & Friman, P. C. (2002). Treatment of early onset hair pulling as a simple habit. Behavior Modification, 26, 400–411.
Gazit, I., & Terkel, J. (2003). Explosives detection by sniffer dogs following strenuous physical activity. Applied Animal Behaviour Science, 81, 149–161.
Hopson, J. (2001). Behavioral game design. Game Developer.
Marshall, S.L.A. (1947). Men Against Fire: The Problem of Battle Command in Future War. Washington: Infantry Journal.
Passer, M. W., & Smith, R. E. (2004). Psychology: The science of mind and behavior. London: McGraw-Hill.
Perez, R. S., Gray, W., & Reynolds, T. (2006). Virtual reality and simulators: Implications for web-based education and training. In H. F. O’Neil & R. S. Perez (Eds.), Web-based learning: Theory, research, and practice. Erlbaum.
Skinner, B. F. (1938). The behavior of organisms: An experimental analysis. Appleton-Century.
Thorndike, E. L. (1911). Animal intelligence: Experimental studies. Macmillan Press.
Morrison, D. C. (1988). Marine mammals join the Navy. Science, 242, 1503–1504.