Slot Machines Represent A Variable Ratio Reinforcement Schedule


Term
Define Primary Reinforcer and list two examples
Definition

A primary reinforcer is a naturally occurring need that pushes us to act.

Food, Sex, Water

Term
Define Secondary Reinforcer and list two examples
Definition

A secondary reinforcer is a learned reward that pushes us to act.

Money, Token, Grades

Term

Identify the type of reinforcement schedule

fixed ratio, variable ratio, fixed interval, variable interval

For every 4 eggs I stuff, I get a chocolate chip.

Definition
fixed ratio
Term

Identify the type of reinforcement schedule

fixed ratio, variable ratio, fixed interval, variable interval

After 10 minutes, if I have put together a Halloween basket, I get a quarter.
Definition
fixed interval
Term

Identify the type of reinforcement schedule

fixed ratio, variable ratio, fixed interval, variable interval

Every 3 minutes or so, I get a sticker if I'm sitting in my seat.
Definition
variable interval
Term

Identify the type of reinforcement schedule

fixed ratio, variable ratio, fixed interval, variable interval

For every five gifts I wrap, I will get a hug.
Definition
fixed ratio
Term

Identify the type of reinforcement schedule

fixed ratio, variable ratio, fixed interval, variable interval

For approximately every 15 pulls of the slot machine lever, I win.
Definition
variable ratio
Term

Identify the type of reinforcement schedule

fixed ratio, variable ratio, fixed interval, variable interval

After about 2 hours, I check my Facebook status update and there is a comment.
Definition
variable interval
Term

Identify the type of reinforcement schedule

fixed ratio, variable ratio, fixed interval, variable interval

Every 2 hours Mom does a room check, and if I'm in bed when she comes to my room, I get a smiley face on my chart.
Definition
fixed interval
Term
Describe the Bobo doll experiment conducted by Albert Bandura
Definition

A child is shown a movie of an adult beating up a Bobo doll. Then the child is placed in a room with the doll. Not only did the children beat up the doll, but they found new and creative ways to dole out punishment.

Children who didn't observe an adult hitting the Bobo doll were less novel in their aggression.

This indicates that behavior can be learned through observation.

Term

(1) _____ memory requires conscious awareness, while

(2) _____ memory does not.

Definition

1. explicit

2. implicit

Term
What are the 2 types of explicit memory?
Definition

A. Semantic

B. Episodic

Term
What are the 3 stages of memory? Describe each.
Definition

1. Sensory - Fleeting memory that is an exact copy of the input, held for about a second.

2. Short term - Short-term storage for temporary information, about 7 +/- 2 items. Stored as images and sounds.

3. Long Term - Lasting memories that are stored forever. Stored based on meaning and importance.

Term
What is the serial position effect?
Definition
When remembering a list of items, we usually recall the first items because they are in our long-term memory and the last items because they are still in our short-term memory.
Term

Extra Credit:

In what stage of memory do we find the working memory?

Definition
Short Term Memory AKA Mental Scratch Pad

I’m back at my keyboard after a few weeks’ hiatus during which I spent all my free time landscaping. I completely understand how the stereotypical scene of the stay at home mom working out in the garden developed. You can use your hands in a productive manner, making progress rather than just cleaning up and fixing things. There are small, well-defined projects that can be accomplished and checked off your list rather than the ongoing and endless refinement of child rearing. And perhaps the most appealing feature of gardening is that your plants cannot talk, whine, yell, etc. 🙂

Back to the topic at hand, we've already had an introduction to operant conditioning (see Understanding Reinforcement vs. Punishment from 2/8/18 and Using Operant Conditioning to Train Your Children to Have Good Manners from 3/1/18), so now it's time to delve deeper and discuss schedules of reinforcement. Here we'll continue to focus on parenting techniques derived from operant conditioning (e.g., positive and negative reinforcement) and develop an understanding of how the timing of these techniques, when we use them, makes a huge difference in our child's response.

Reinforcement

There are two categories for reinforcement schedules. First, a continuous schedule of reinforcement means that every single behavior is reinforced. For example, every time your child eats their veggies at dinner they get dessert.

Second, a partial schedule of reinforcement means that the behavior is only reinforced some of the time. In this example, sometimes your child eats their veggies and they get dessert but sometimes no dessert is offered.
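
If it helps to see those two categories as rules rather than definitions, here's a tiny, purely illustrative Python sketch; the 50/50 coin flip is just a stand-in for "some of the time," not a recommended number.

```python
import random

def continuous_schedule(ate_veggies: bool) -> bool:
    # Continuous reinforcement: every single occurrence of the behavior earns dessert.
    return ate_veggies

def partial_schedule(ate_veggies: bool, chance: float = 0.5) -> bool:
    # Partial reinforcement: only some occurrences earn dessert.
    # The coin flip is a placeholder; the next section breaks
    # "some of the time" into specific schedules.
    return ate_veggies and random.random() < chance
```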

Which reinforcement schedule is better? Researching his pigeons and rats, Skinner discovered what he dubbed the partial-reinforcement effect: behaviors that are only partially reinforced (i.e., not reinforced every time they occur) are longer lasting and less prone to extinction, as he called it. That's a relief from a parental perspective because you don't have to reinforce every single desired behavior to get your children to behave well. According to this theory, if your child gets dessert every night and you then stop that reward, they'll quickly stop eating their veggies, whereas if you only sometimes give your child dessert and then stop, they'll continue eating their veggies for a while longer.

Partial Reinforcement Schedules

Things get a little more complicated when you look at the different ways that partial reinforcement schedules can be defined. There are 4 subcategories for partial reinforcement schedules: ratio, interval, fixed, and variable. If the schedule is based on the frequency of a behavior, it is called a ratio schedule. If the schedule is based on the amount of time that has elapsed, it is called an interval schedule. Bear with me; these concepts are going to be critical later when I introduce sleep training techniques like the Ferber Method and Cry It Out. It's about to get a little tricky because it's the interaction of these subcategories that has utility. We're going to need a chart to wrap our heads around the different combinations of these subcategories.

             Ratio                          Interval
Fixed        Fixed # of behaviors           Fixed amount of time
             Ex: Factory worker             Ex: Friday spelling tests
Variable     Variable # of behaviors        Variable amount of time
             Ex: Slot machine               Ex: Pop quizzes
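
If you'd rather read the chart as rules than boxes, here's a rough Python sketch of the four schedules, each one a little "do I hand out the reward right now?" decision. Ratio rules only count behaviors and ignore the clock; interval rules only watch the clock and ignore the count. The specific numbers (every 5 behaviors, every 7 days, and so on) are invented purely for illustration.

```python
import random

class FixedRatio:
    """Reward after a fixed number of behaviors (the factory worker)."""
    def __init__(self, every=5):
        self.every, self.count = every, 0

    def behavior(self, day):
        self.count += 1                 # ratio rules only count behaviors
        if self.count == self.every:
            self.count = 0
            return True                 # hand out the reward
        return False

class VariableRatio:
    """Reward after a randomly varying number of behaviors (the slot machine)."""
    def __init__(self, average=5):
        self.average, self.count = average, 0
        self.target = random.randint(1, 2 * average - 1)

    def behavior(self, day):
        self.count += 1
        if self.count >= self.target:
            self.count = 0
            self.target = random.randint(1, 2 * self.average - 1)
            return True
        return False

class FixedInterval:
    """Reward the first behavior after a fixed amount of time (the Friday spelling test)."""
    def __init__(self, days=7):
        self.days, self.last_reward = days, 0

    def behavior(self, day):
        if day - self.last_reward >= self.days:   # interval rules only watch the clock
            self.last_reward = day
            return True
        return False

class VariableInterval:
    """Reward the first behavior after a randomly varying amount of time (the pop quiz)."""
    def __init__(self, average_days=7):
        self.average, self.last_reward = average_days, 0
        self.wait = random.uniform(0, 2 * average_days)

    def behavior(self, day):
        if day - self.last_reward >= self.wait:
            self.last_reward = day
            self.wait = random.uniform(0, 2 * self.average)
            return True
        return False

# Example: the child does the behavior once a day for a month.
schedule = VariableRatio(average=5)
dessert_days = [day for day in range(1, 31) if schedule.behavior(day)]
print("dessert handed out on days:", dessert_days)
```

The slot machine is the VariableRatio case: the payout depends only on how many times you pull the lever, and the number of pulls it takes keeps changing.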


In a fixed-ratio schedule, your child's behavior is reinforced after a fixed number of times, like a factory worker who gets paid $10 for every 100 caps he puts on tubes of toothpaste. For children, this might mean that for every 5 times they eat their veggies, they get a treat.

In a variable-ratio schedule, the number of times a child has to exhibit the behavior to get the reinforcer varies randomly, as with slot machines. So, if your child eats their veggies today, they get a treat, but they won’t get another treat until they eat their veggies 6 times, and after that it will be 3 veggie eatings to earn a treat.

There are still two subcategories to cover. In a fixed-interval schedule, the behavior must be exhibited after a specified amount of time has elapsed before the reinforcer is given. For example, a student who has a spelling test every Friday engages in the behavior of studying and is rewarded with a good test grade, but only on Fridays. Studying during the week might help them on Friday but they only get the reward on Friday. Back to the dessert example, if your child eats their veggies on Friday, they get dessert and it’s not dependent on whether they ate their veggies the rest of the week.

In a variable-interval schedule, there is a varying amount of time that must pass between rewarded behaviors, as in pop quizzes. For a young child, this might mean dessert is offered tonight, then not for a week, then not for two days, then offered the next day. In this dessert example, the difference between variable-ratio and variable-interval schedules is subtle – the difference is simply how the reward timing is defined: by the accumulated number of behaviors, or by an amount of time that must pass before the one critical behavior that earns the dessert.
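
If the ratio-versus-interval difference still feels slippery, here's another purely illustrative sketch: the same stretch of veggie-eating days run through both variable rules. The day numbers and averages are invented, not taken from any study.

```python
import random

def variable_ratio_rewards(behavior_days, average=3):
    """Dessert after a randomly varying NUMBER of veggie-eatings."""
    rewarded, count = [], 0
    target = random.randint(1, 2 * average - 1)
    for day in behavior_days:
        count += 1
        if count >= target:
            rewarded.append(day)
            count, target = 0, random.randint(1, 2 * average - 1)
    return rewarded

def variable_interval_rewards(behavior_days, average_days=3):
    """Dessert for the first veggie-eating after a randomly varying amount of TIME."""
    rewarded, last_reward = [], 0
    wait = random.uniform(0, 2 * average_days)
    for day in behavior_days:
        if day - last_reward >= wait:
            rewarded.append(day)
            last_reward, wait = day, random.uniform(0, 2 * average_days)
    return rewarded

veggie_days = [1, 2, 3, 4, 5, 10, 11, 12, 20, 25]  # days the child ate veggies
print("variable ratio   ->", variable_ratio_rewards(veggie_days))
print("variable interval ->", variable_interval_rewards(veggie_days))
```

Run it a few times: the ratio rule tends to pay out repeatedly during the busy stretch of days 1 through 5 because it's counting eatings, while the interval rule ignores the extra eatings and simply waits out the calendar.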

Which partial reinforcement schedule is the best? It depends on what behavior you are trying to change and what you know about your child's emotional stability and understanding of delayed gratification. There are some well-researched phenomena to help guide your reward distribution. Ratio schedules (fixed or variable) are most likely to increase the frequency of a behavior – the more the child cleans up, the more likely they are to get the treat. Compared to variable-ratio schedules, in fixed-ratio schedules you tend to see more of a lull in the desired behavior immediately after the reinforcer is given, because the child knows how many times they have to do the desired behavior before earning the next treat. In fixed-interval schedules (like the spelling test), you tend to see long periods without the desired behavior (studying) and then a surge of behavior right before the end of the interval (the test). Variable-interval schedules tend to result in consistent patterns of behavior, where you study regularly just in case there's a pop quiz tomorrow.

From a parental perspective, if you want to see change fast, implement a ratio schedule. If you want to train your child to be consistent in their behavior, variable schedules, whether ratio or interval, are better than fixed schedules – keep them on their toes! Variable schedules are also harder to extinguish, meaning that your child will keep up the good behavior for a longer time than with fixed schedules even if you remove the reinforcer.

If you're a psychology nut like me, this is fascinating stuff, though a little tricky to wrap your head around at first. And the parenting applications are numerous; more on that another day.