Slot Machines Operate On A ____ Reinforcement Schedule

With a slot machine, we never know when we're going to win, but we know we won't win if we stop pulling that handle. That's what keeps us playing in the hope of hitting the jackpot. Fixed-interval reinforcement is like your paycheck: you go to work every day and, on a schedule, you're rewarded with a sum of money. A slot machine, by contrast, rewards you after an unpredictable number of pulls, which is a variable-ratio schedule. Modern slot machines use a computer to generate random numbers, and these determine the outcomes of the game. The important thing to remember is that the results are truly random: the game doesn't work on any kind of cyclical basis, slot machine jackpots don't become due, and slots don't get hot or cold.
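To make the "truly random" point concrete, here's a minimal Python sketch (my own illustration, not any manufacturer's actual software; the 1% win chance is just a placeholder): each spin is a fresh, independent random draw, so the outcome never depends on past spins or on how long it's been since the last jackpot.

```python
import random

def spin(win_probability=0.01):
    """One slot machine spin: an independent random draw.
    The machine keeps no memory of past spins, so a jackpot never
    becomes 'due' and the machine can't run hot or cold."""
    return random.random() < win_probability

# Ten spins in a row: each result is independent of all the others.
results = [spin() for _ in range(10)]
print(results)
```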

I’m back at my keyboard after a few weeks’ hiatus during which I spent all my free time landscaping. I completely understand how the stereotypical scene of the stay-at-home mom working out in the garden developed. You can use your hands in a productive manner, making progress rather than just cleaning up and fixing things. There are small, well-defined projects that can be accomplished and checked off your list, rather than the ongoing and endless refinement of child rearing. And perhaps the most appealing feature of gardening is that your plants cannot talk, whine, yell, etc. 🙂

Back to the topic at hand: we’ve already had an introduction to operant conditioning (see Understanding Reinforcement vs. Punishment from 2/8/18 and Using Operant Conditioning to Train Your Children to Have Good Manners from 3/1/18), so now it’s time to delve deeper and discuss schedules of reinforcement. Here we’ll continue to focus on parenting techniques derived from operant conditioning (e.g., positive and negative reinforcement) and develop an understanding of how the timing of these techniques makes a huge difference in our child’s response.

Schedules of Reinforcement

There are two categories for reinforcement schedules. First, a continuous schedule of reinforcement means that every single behavior is reinforced. For example, every time your child eats their veggies at dinner they get dessert.

Second, a partial schedule of reinforcement means that the behavior is only reinforced some of the time. In this example, sometimes your child eats their veggies and gets dessert, but sometimes no dessert is offered.
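If it helps to see the two categories side by side, here's a tiny Python sketch (my own toy illustration; the 50% reward chance is just a stand-in for "sometimes"): a continuous schedule rewards every instance of the behavior, while a partial schedule rewards only some of them.

```python
import random

def continuous_schedule(ate_veggies):
    """Continuous reinforcement: every instance of the behavior is rewarded."""
    return ate_veggies  # veggies eaten -> dessert, every single time

def partial_schedule(ate_veggies, reward_probability=0.5):
    """Partial reinforcement: the behavior is rewarded only some of the time.
    (The 50% chance here is just a placeholder for 'sometimes'.)"""
    return ate_veggies and random.random() < reward_probability

# Five nights of eating veggies: continuous always rewards, partial only sometimes.
for night in range(1, 6):
    print(night, continuous_schedule(True), partial_schedule(True))
```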

Which reinforcement schedule is better? In his research with pigeons and rats, Skinner discovered what he dubbed the partial-reinforcement effect: behaviors that are only partially reinforced (i.e., not reinforced every time they occur) are longer lasting, or, in his terms, more resistant to extinction. That’s a relief from a parental perspective because you don’t have to reinforce every single desired behavior to get your children to behave well. According to this theory, if your child gets dessert every night and you stop that reward, they’ll quickly stop eating their veggies, whereas if you only sometimes give your child dessert and then stop, they’ll continue eating their veggies for a while longer.

Things get a little more complicated when you look at the different ways that partial reinforcement schedules can be defined. There are four terms to keep track of: ratio, interval, fixed, and variable. If the schedule is based on the frequency of a behavior, it is called a ratio schedule. If the schedule is based on the amount of time that has elapsed, it is called an interval schedule. Bear with me; these concepts are going to be critical later when I introduce sleep training techniques like the Ferber Method and Cry It Out. It’s about to get a little tricky because it’s the combination of these subcategories (fixed or variable, crossed with ratio or interval) that has utility. We’re going to need a chart to wrap our heads around the different combinations.

             | Ratio                     | Interval
-------------|---------------------------|---------------------------
Fixed        | Fixed # of behaviors      | Fixed amount of time
             | Ex: Factory worker        | Ex: Friday spelling tests
Variable     | Variable # of behaviors   | Variable amount of time
             | Ex: Slot machine          | Ex: Pop quizzes

In a fixed-ratio schedule, your child’s behavior is reinforced after a fixed number of times, like a factory worker who gets paid $10 for every 100 caps he puts on tubes of toothpaste. For children, this might mean that for every 5 times they eat their veggies, they get a treat.
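In code terms, a fixed-ratio schedule is just a counter. Here's a minimal Python sketch (my own illustration, reusing the $10-per-100-caps and 5-veggie-dinners numbers from the paragraph above): the reinforcer arrives on every Nth instance of the behavior.

```python
def fixed_ratio(behavior_count, ratio):
    """Fixed-ratio: reinforce on every `ratio`-th instance of the behavior."""
    return behavior_count % ratio == 0

# Factory worker: paid $10 after every 100 caps. 300 caps -> 3 payments -> $30.
caps = 300
paychecks = sum(fixed_ratio(n, 100) for n in range(1, caps + 1))
print(paychecks * 10)

# Child: a treat after every 5th plate of veggies (True on the 5th and 10th).
print([fixed_ratio(n, 5) for n in range(1, 11)])
```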

In a variable-ratio schedule, the number of times a child has to exhibit the behavior to get the reinforcer varies randomly, as with slot machines. So your child might get a treat after eating their veggies today, but then not get another treat until they’ve eaten their veggies 6 more times, and after that it might take only 3 more veggie dinners to earn a treat.
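A variable-ratio schedule works the same way, except that the required count is redrawn each time a reward is earned. A minimal sketch (again my own illustration; the 1-to-6 range is arbitrary):

```python
import random

def variable_ratio_targets(num_rewards, low=1, high=6):
    """Variable-ratio: after each reward, draw a new (random) number of
    behaviors required to earn the next one."""
    return [random.randint(low, high) for _ in range(num_rewards)]

# e.g. [1, 6, 3] -> a treat after 1 veggie dinner, then after 6 more, then after 3 more.
print(variable_ratio_targets(3))
```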

There are still two subcategories to cover. In a fixed-interval schedule, the behavior must be exhibited after a specified amount of time has elapsed before the reinforcer is given. For example, a student who has a spelling test every Friday engages in the behavior of studying and is rewarded with a good test grade, but only on Fridays. Studying during the week might help them on Friday but they only get the reward on Friday. Back to the dessert example, if your child eats their veggies on Friday, they get dessert and it’s not dependent on whether they ate their veggies the rest of the week.
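In code terms, a fixed-interval schedule only checks the clock: the behavior is reinforced once the fixed interval is up, no matter how much or how little of the behavior happened in between. A rough sketch (my own illustration, with days of the week standing in for the interval):

```python
def fixed_interval(day_of_week, behavior, reward_day="Friday"):
    """Fixed-interval: the behavior is only reinforced once the fixed
    interval has elapsed (here: on Fridays), regardless of earlier behavior."""
    return behavior and day_of_week == reward_day

# Studying on Monday or Wednesday isn't directly rewarded; Friday's test is.
for day in ["Monday", "Wednesday", "Friday"]:
    print(day, fixed_interval(day, behavior=True))
```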

In a variable-interval schedule, there is a varying amount of time that must pass between rewarded behaviors, as with pop quizzes. For a young child, this might mean dessert is offered tonight, then not for a week, then not for two days, then offered again the next day. In this dessert example, the difference between variable-ratio and variable-interval schedules is subtle: it comes down to how the reward is earned, by an accumulated number of behaviors or by an amount of time that must pass before the one critical behavior that earns the dessert.
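And a variable-interval schedule is the same idea with the waiting period redrawn after each reward, which is what makes pop quizzes (or unpredictable dessert nights) feel so unpredictable. A minimal sketch (my own illustration; the 1-to-7-day range is arbitrary):

```python
import random

def variable_interval_waits(num_rewards, min_days=1, max_days=7):
    """Variable-interval: after each reward, draw a new (random) waiting
    period that must pass before the behavior can earn the next reward."""
    return [random.randint(min_days, max_days) for _ in range(num_rewards)]

# e.g. [7, 2, 1] -> dessert tonight, then not for a week, then after two more
# days, then again the very next day.
print(variable_interval_waits(3))
```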

Which partial reinforcement schedule is the best? It depends on what behavior you are trying to change and what you know about your child’s emotional stability and understanding of delayed gratification. There are some well-researched phenomena to help guide your reward distribution. Ratio schedules (fixed or variable) are most likely to increase the frequency of a behavior: the more the child cleans up, the more likely they are to get the treat. Compared to variable-ratio schedules, fixed-ratio schedules tend to show more of a lull in the desired behavior immediately after the reinforcer is given, because the child knows how many times they have to do the desired behavior before earning the next treat. In fixed-interval schedules (like the spelling test), you tend to see long periods without the desired behavior (studying) and then a surge of behavior just before the end of the interval (the test). Variable-interval schedules tend to result in consistent patterns of behavior, where you study regularly just in case there’s a pop quiz tomorrow.

From a parental perspective, if you want to see change fast, implement a ratio schedule. If you want to train your child to be consistent in their behavior, variable schedules, whether ratio or interval, are better than fixed schedules – keep them on their toes! Variable schedules are also harder to extinguish, meaning that your child will keep up the good behavior for a longer time than with fixed schedules even if you remove the reinforcer. If you’re a psychology nut like me, this is fascinating stuff, though a little tricky to wrap your head around at first. And the parenting applications are numerous; more on that another day.