A variable ratio schedule of reinforcement delivers a reward after a varying number of occurrences of a goal behavior. It is one of four types of partial reinforcement schedule.
The variable schedule creates unpredictability: people don't know when they will be rewarded (or punished) for their behavior, but they know there is a chance each time.
For example, the reward may be given after the 5th, then 3rd, then 11th occurrence of the goal behavior.
The variable ratio schedule is one of four schedules of reinforcement identified by B. F. Skinner. The other three are fixed ratio, fixed interval, and variable interval. Each reinforcement schedule produces a unique pattern of behavior and has different strengths and weaknesses, such as differing speeds of behavior acquisition.
The variable ratio schedule produces a very high and steady frequency of the goal behavior, with very little post-reinforcement pause. Because the number of responses required for reward changes, the subject maintains a steady rate of responding.
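The mechanics described above can be sketched in a few lines of code. This is a minimal illustration, not a standard implementation: the class name and the choice of a uniform draw around the mean are assumptions made here for demonstration.

```python
import random

class VariableRatioDispenser:
    """Reward dispenser on a variable ratio schedule: the number of
    responses required for a reward is redrawn after every payoff.
    (Illustrative sketch; the uniform draw is an assumption.)"""

    def __init__(self, mean=6, spread=4, rng=random):
        self.mean, self.spread, self.rng = mean, spread, rng
        self._new_requirement()

    def _new_requirement(self):
        # Draw the next required response count around the mean,
        # e.g. the 5th, then 3rd, then 11th response.
        self.required = self.rng.randint(
            max(1, self.mean - self.spread), self.mean + self.spread)
        self.count = 0

    def respond(self):
        """Record one response; return True if it earns the reward."""
        self.count += 1
        if self.count >= self.required:
            self._new_requirement()
            return True
        return False

random.seed(0)
dispenser = VariableRatioDispenser()
# Which responses (1st, 2nd, ...) earned a reward out of 30 responses:
print([i for i in range(1, 31) if dispenser.respond()])
```

Because the required count is redrawn after every reward, no single response is predictably the winning one, which is exactly what sustains the steady response rate described above.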
Variable Ratio Schedule of Reinforcement Examples
- Mrs. Linwood loves to play the scratch-offs at the convenience store near her home.
- Ben checks his Facebook account frequently to see how many likes his most recent post received.
- Mrs. Jones likes to give pop quizzes to her students. This rewards students who study frequently.
- Jasmine and her sister like going door-to-door selling girl scout cookies. Sometimes they get a sale, sometimes they don’t.
- Professor Jenkins likes to call on students at random to see if they read that week’s assignment.
- Coach Jacobs likes to praise his players for trying hard during drills. But he doesn’t give them praise each and every time; sometimes he does, and sometimes he doesn’t.
- Mrs. Singh checks on her teenagers’ homework every night. Sometimes she gives them a reward for doing well, but not always.
- Mitchell lives in Las Vegas, and slot machines are his favorite game. One time he won twice in one day, but then he didn't win again for months.
- Jenna likes to bake cookies for her boyfriend when they get along really well. But, she doesn’t want to spoil him so she doesn’t do it every time he is nice.
- Mr. Jones likes to distribute small bonuses randomly to his employees for working hard.
- The police sometimes staff a speed trap and sometimes don't. Because drivers know there is a chance the police will be there, they tend to obey the speed limit at that spot every time.
Case Studies of Variable Ratio Schedule of Reinforcement
1. Slot Machines
There are many features of slot machines that make them so enjoyable to play for so many people. There are the bright colors, the flashing lights, and the high-pitched exciting sounds. And then, there’s the payoff…or at least the possibility of a payoff.
Each slot machine can be independently programmed to produce a very specific schedule of payoff. Casinos apply a variable ratio schedule of reinforcement, so that the payoff is highly unpredictable.
For example, one machine may have a VR-120 schedule. That means that on average, it will produce one payoff for every 120 times played.
However, it's not always 120. The number of plays required before a payoff varies each time: one payoff might come after 90 plays, another after 55, another after 155. But the average across many payoffs will equal 120.
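A short simulation makes the averaging concrete. One common way to implement a VR-120-like schedule in software, assumed here for illustration, is to give every play an independent 1-in-120 chance of paying off: individual payoff counts then vary wildly, but their long-run average is 120.

```python
import random

def plays_until_payoff(p=1 / 120, rng=random):
    """Count the plays needed for one payoff when every play has an
    independent probability p of paying off. Individual counts vary,
    but they average 1/p (here, 120)."""
    plays = 1
    while rng.random() >= p:
        plays += 1
    return plays

random.seed(0)
counts = [plays_until_payoff() for _ in range(10_000)]
print(min(counts), max(counts))   # wildly different from payoff to payoff
print(sum(counts) / len(counts))  # close to 120 over many payoffs
```

The spread between the minimum and maximum shows why the next payoff always feels possible, while only the long-run average is fixed.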
2. Video Game Loot Boxes
A “loot box” refers to a container in a video game that holds various kinds of rewards. The rewards in the box could alter the game in some exciting and meaningful way; they could change the player’s avatar or improve their game play.
Like other aspects of video game-play, obtaining rewards as a function of in-game performance is often on a variable ratio schedule. While players know clearly how to obtain some rewards, others are less predictable.
Game designers are well-versed in the use of reinforcement schedules to keep players locked-in to game-play. For example, the opportunity to purchase a loot box can occur at any moment.
However, loot boxes have become controversial because they can sometimes be purchased with real money. From the perspective of some academics and concerned government officials, this makes the situation similar to gambling.
The issue has been recognized as so serious that several countries, including Belgium, the Netherlands, and Denmark, have enacted legislation, with several other nations (including the United Kingdom, Australia, Sweden, and the United States) considering similar action (McCaffrey, 2019).
3. The Skinner Box
The Skinner box is an enclosed chamber, also known as an operant conditioning chamber, that contains a lever or button an animal can press to receive reinforcement.
The box has a device that allows the researcher to control the schedule of reinforcement and charts the animal’s behavior.
When the lever or button is pressed, the animal is rewarded with a food pellet or water. Other stimuli can be presented as well, such as a light or sound that signals when food is available (known as a discriminative stimulus), or a mild electric current applied to the floor as a form of punishment (in which case the rat presses the lever to escape the aversive stimulus, known as escape learning).
Using the box allows the researcher to control nearly all aspects of the environment, a key principle of good research. A researcher can examine the effects of different reinforcement schedules on behavior while eliminating the influence of other factors.
4. Checking Your Social Media Posts
There is no doubt that most people in the industrialized world spend a lot of time on social media. Among the many ways of measuring this activity, some numbers stand out. For example, in North America, people spend an average of more than 2 hours a day on social media such as Facebook and Instagram.
There are numerous factors involved that make people so consumed with social media: engaging content, ease and availability of news and entertainment, and desire to connect with friends.
It's possible to examine this issue from a schedule of reinforcement perspective as well. For example, when we check our posts to see if they have been liked, we are putting ourselves on a variable ratio schedule.
Each time we check, we might be rewarded by seeing that we have gained 10 likes for our post. The next time we check however, there may be no new likes.
This is a variable ratio schedule of reinforcement. The number of behaviors required in order to receive reward changes each time.
Of the four schedules of reinforcement, the variable ratio leads to the highest rate of behavior.
5. Predator Hunting Success Rates
Believe it or not, the hunting success rates of some of the world's fiercest predators are actually quite low, while those of seemingly more docile creatures, like the domestic cat, are quite high. Success rates vary widely across the animal kingdom.
It's possible to examine these success rates from a schedule of reinforcement perspective. If a predator's success rate were 100%, that would clearly be a fixed ratio schedule of one: each hunt results in one meal.
However, when the success rate is in the single digits, say 5%, the predator must hunt an average of 20 times before being rewarded.
In reality, the number of attempts required to earn a reward changes. One week, the predator may need to hunt 15 times before succeeding; in subsequent weeks, that number may be higher or lower.
This means the predator is on a variable ratio schedule. The number of hunts required for reward is unpredictable.
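The arithmetic behind the "20 hunts" figure is just the reciprocal of the success rate, and a quick simulation shows how much any single stretch of hunting scatters around that average. The 5% figure and the independent-hunts assumption are illustrative, not a claim about any particular species.

```python
import random

def hunts_until_meal(success_rate, rng=random):
    # Each hunt succeeds independently at the given rate (an
    # illustrative assumption; real hunts are not independent).
    hunts = 1
    while rng.random() >= success_rate:
        hunts += 1
    return hunts

rate = 0.05  # hypothetical 5% success rate from the text
print(f"expected hunts per meal: {1 / rate:.0f}")  # prints 20
random.seed(7)
print([hunts_until_meal(rate) for _ in range(5)])  # varies every time
```

The expected value is fixed at 1/0.05 = 20, but the sampled counts differ from one meal to the next, which is precisely the unpredictability that puts the predator on a variable ratio schedule.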
The variable ratio schedule of reinforcement produces the highest rate of behavior of all the schedules. Because the number of responses required for a reward changes each time, the subject, whether animal or human, keeps responding steadily.
Slot machines and video games take advantage of the variable ratio schedule by integrating it into game-play to keep players locked-in mentally. This can lead to serious issues of addiction and gambling.
The variable ratio schedule can also be seen in the animal kingdom. Success rates of predators can be surprisingly low and are always unpredictable, which means the number of hunts required for a reward is constantly changing.
Drummond, A., & Sauer, J.D. (2018). Video game loot boxes are psychologically akin to gambling. Nature Human Behaviour, 2, 530-532.
Ferster, C. B., & Skinner, B. F. (1957). Schedules of reinforcement. New York: Appleton-Century-Crofts.
King, D., Delfabbro, P., & Griffiths, M. (2010). Video game structural characteristics: A new psychological taxonomy. International Journal of Mental Health and Addiction, 8(1), 90-106.
McCaffrey, M. (2019). The macro problem of microtransactions: The self-regulatory challenges of video game loot boxes. Business Horizons, 62(4), 483–495.
Morgan, D. L. (2010). Schedules of reinforcement at 50: A retrospective appreciation. The Psychological Record, 60, 151–172. https://doi.org/10.1007/BF03395699
Reed P. (2001). Schedules of reinforcement as determinants of human causality judgments and response rates. Journal of Experimental Psychology. Animal Behavior Processes, 27(3), 187–195.
Skinner, B. F. (1958). Reinforcement today. American Psychologist, 13(3), 94–99.
Spicer, S.G., Nicklin, L.L., Uther, M., Lloyd, J., Lloyd, H.M., & Close, J. (2021). Loot boxes, problem gambling and problem video gaming: A systematic review and meta-synthesis. New Media & Society, 24, 1001-1022.