All MCAT Social and Behavioral Sciences Resources
Example Questions
Example Question #1 : Associative Learning
Alice is watching videos on her cellular phone during math class and her teacher notices. Her teacher scolds her to stop and tells her she must write an essay about why math is important.
Which type of operant conditioning is the teacher using?
Positive reinforcement
Negative reinforcement
Negative punishment
Positive punishment
None of these
Positive punishment
The correct answer is “positive punishment.” The teacher is adding a stimulus (the essay assignment), making it positive, and she is aiming to decrease the behavior of watching videos in class, making it a punishment. In general, if something is removed (e.g. loss of computer privileges), the conditioning is negative, and if the aim is to increase a behavior (e.g. a parent trying to make a student study more), it is reinforcement.
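Because this grid is small, it can also be written out as a simple lookup. The following is a hypothetical Python sketch (the names OPERANT_TYPES and classify are illustrative, not part of the question) of how the two dimensions combine:

# Hypothetical lookup: is a stimulus added or removed, and is the goal to
# increase or decrease the behavior?
OPERANT_TYPES = {
    ("added",   "increase"): "positive reinforcement",
    ("removed", "increase"): "negative reinforcement",
    ("added",   "decrease"): "positive punishment",
    ("removed", "decrease"): "negative punishment",
}

def classify(stimulus_change: str, behavior_goal: str) -> str:
    """Label a scenario by how the stimulus changes and what the goal is."""
    return OPERANT_TYPES[(stimulus_change, behavior_goal)]

# The teacher adds an essay assignment to decrease video-watching in class:
print(classify("added", "decrease"))  # -> positive punishment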
Classical conditioning, by contrast, involves pairing a neutral stimulus with a stimulus that already produces the desired response. A famous example is Pavlov's dogs, which salivated (i.e. unconditioned response) in response to seeing food (i.e. unconditioned stimulus). During the conditioning period, a neutral stimulus (e.g. a bell) is rung while showing the food to the dogs; that is, the neutral stimulus is paired with the unconditioned stimulus. Following the conditioning period, the bell alone (i.e. without the presence of food) triggers salivation in the dogs. The bell becomes the conditioned stimulus and the salivation becomes the conditioned response.
Example Question #1 : Operant Conditioning
Alice is trying to train her dog to sit on command before her in-laws come over. She does not care if the behavior lasts after their visit; she just wants him to learn the trick quickly. She plans to use dog biscuits at first to reinforce the behavior.
Based on her situation and desires, what type of reinforcement schedule should Alice follow?
Intermittent punishment
Intermittent reinforcement
Continuous reinforcement
None of these
Continuous punishment
Continuous reinforcement
The question specifies that Alice wants the dog to learn quickly, regardless of whether he loses the skill quickly. This describes the result of following a continuous reinforcement schedule (i.e. rewarding the dog with a biscuit every time he sits after the command). Continuous reinforcement produces rapid acquisition of the desired behavior; however, it also results in rapid extinction.
In contrast, intermittent reinforcement results in slow acquisition yet slow extinction of the behavior. If Alice were determined for her dog to acquire this skill for the long-term, this schedule would be more appropriate. That is, she should sometimes reward the dog with a biscuit and sometimes not. It will take longer for the dog to learn to sit on command, but the behavior will last longer.
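The two schedules themselves are easy to state; the difference in acquisition and extinction speed is an empirical finding rather than something code can show. As a hypothetical Python sketch (the 40% reward rate and the function names are assumptions for illustration):

import random

def continuous_reinforcement(trial: int) -> bool:
    # Continuous schedule: every correct sit earns a biscuit.
    return True

def intermittent_reinforcement(trial: int, p: float = 0.4) -> bool:
    # Intermittent schedule: only some correct sits earn a biscuit.
    return random.random() < p

for schedule in (continuous_reinforcement, intermittent_reinforcement):
    biscuits = sum(schedule(trial) for trial in range(10))  # ten correct sits
    print(schedule.__name__, "->", biscuits, "biscuits in 10 sits")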
Example Question #11 : Associative Learning
Joey hates cleaning his room. When his father asks him to clean up, he throws a tantrum. As a result, his father—wanting the tantrum to end—gives up and allows him to continue playing instead of cleaning.
The next day Joey's mom walks towards his room, and he anticipates that she is going to ask him to clean his room. As she enters the room, he quickly pretends he isn't feeling well and asks if he can lie down to watch television and she reluctantly agrees.
What are the two types of operant conditioning demonstrated in the given scenario?
Negative punishment followed by positive punishment
Both are examples of classical conditioning.
Negative reinforcement followed by positive reinforcement
Escape followed by active avoidance
Positive punishment followed by negative reinforcement
Escape followed by active avoidance
The first example demonstrates escape conditioning. Once presented with the aversive stimulus (i.e. being asked to clean his room), Joey learns that a particular behavior (i.e. throwing a tantrum) will allow him to escape from the aversive stimulus.
The second example demonstrates active avoidance conditioning. In this situation, Joey anticipates the aversive stimulus (i.e. he hears his mother walking towards his messy room). He learns that a particular behavior (i.e. faking illness) will allow him to avoid the presentation of the aversive stimulus.
Both are subtypes of negative reinforcement, which is a type of operant conditioning not classical conditioning. Joey's behaviors are being encouraged through the removal of an unwanted stimulus—cleaning his room.
Example Question #4 : Operant Conditioning
Jimmy and Nate both volunteer at the dog pound. Jimmy loves animals of all kinds and loves the chance to be around dogs. Nate doesn’t particularly like animals, but he needs service hours for a club he is in at school.
The boys take the dogs out to exercise twice every week. When it is time to go, they open the door and call the dogs to come back inside. If the dogs return without additional coaxing, then they get a treat. If the boys have to go get the dogs, then the animals do not get a treat. How would a researcher in operant behavior describe this practice?
Positive punishment and negative reinforcement
Negative punishment and negative reinforcement
Positive reinforcement and negative punishment
Positive reinforcement only
Positive reinforcement only
The boys are applying a stimulus to encourage good behavior (i.e. positive reinforcement). It may appear that they are also taking away a stimulus to discourage bad behavior (i.e. negative punishment); however, in this situation, it must be one or the other, not both. Since the “normal condition” is no treat, the treat is an added stimulus. The poorly behaved dogs are not subject to the removal of something good, but rather to the normal condition. Positive punishment would be correct if the boys physically hurt the dogs that did not obey. Negative reinforcement would be correct if the boys took away something (e.g. a shock collar) from the dogs that came in on time.
Example Question #5 : Operant Conditioning
Jimmy and Nate both volunteer at the dog pound. Jimmy loves animals of all kinds and loves the chance to be around dogs. Nate doesn’t particularly like animals, but he needs service hours for a club he is in at school.
At feeding time, Jimmy rings a bell and the dogs immediately go to the feeding station. While Jimmy is away on vacation for two months, Nate rings the bell at many different times other than mealtime. When Jimmy returns and rings the bell, the dogs look up, but do not go to the feeding station. Which of the following best describes this situation?
Symbolic interactionism
Extinction
Reductionism
Weber’s law of perception
Extinction
When a conditioned stimulus is repeatedly presented without the corresponding reinforcement, behaviors learned through operant conditioning can be lost (i.e. “extinction”). In this case, the dogs were originally rewarded with food every time the bell rang. When Nate rang the bell and did not feed them, they gradually stopped associating the two events.
The other choices are incorrect. “Reductionism” describes how oversimplification (especially of human processes) can lead to a loss of meaning by overlooking complexities. “Weber's law of perception” describes the smallest detectable change in a stimulus, which is proportional to the magnitude of the original stimulus. Last, “symbolic interactionism” is a sociological theory which states that people's interactions are based on the symbolic meanings of gestures, objects, and titles.
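For readers who like a concrete picture of extinction, here is a toy Python sketch (the update rule and learning rate are assumptions, not from the question) in which the bell-food association strengthens while Jimmy feeds the dogs after every ring and weakens while Nate rings the bell without feeding them:

def update(strength: float, food_followed: bool, rate: float = 0.3) -> float:
    # Move the bell-food association toward 1.0 when food follows the bell,
    # and back toward 0.0 when it does not.
    target = 1.0 if food_followed else 0.0
    return strength + rate * (target - strength)

strength = 0.0
for _ in range(20):                  # Jimmy: bell is always followed by food
    strength = update(strength, True)
print(f"after training:   {strength:.2f}")   # close to 1.0

for _ in range(20):                  # Nate: bell rung with no food (extinction)
    strength = update(strength, False)
print(f"after extinction: {strength:.2f}")   # back near 0.0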
Example Question #6 : Operant Conditioning
American psychologist B.F. Skinner is best known for his work in operant conditioning. Like all great academics, Skinner was not without influence. His work was inspired primarily by Thorndike's Law of Effect. This law states that behaviors followed by pleasant consequences are likely to be repeated, while behaviors followed by unpleasant consequences are likely to be avoided.
While this principle inspired Skinner, he researched it further and named the resulting framework operant conditioning. Skinner's research showed that conditioning (learning) could occur through the use of punishments and rewards. The two central concepts of operant conditioning are punishment and reinforcement. Finally, Skinner also discussed the possibility of a neutral operant, which neither increases nor decreases the likelihood of a behavior occurring.
A young girl does not want to eat her spinach at dinner. She knows that if she does not eat her spinach, her father will not let her leave the table. In response, the girl gives her spinach to the family's dog under the table while her father is not looking. This is an example of which of the following?
Escape
Extinction
Acquisition
Avoidance
Avoidance
"Avoidance" refers to someone performing a behavior in order to avoid a negative stimulus. In this case, the young girl is avoiding the negative stimulus (i.e. not being allowed to leave the dinner table) by giving her spinach to the dog. "Escape" is similar to avoidance, except that it involves engaging in a behavior to get away, or stop a negative stimulus, as opposed to avoiding that stimulus altogether. Finally, "acquisition" and "extinction" refer to processes related to conditioning and a loss of a conditioned response, respectively.
Example Question #7 : Operant Conditioning
American psychologist B.F. Skinner is best known for his work in operant conditioning. Like all great academics, Skinner was not without influence. His work was inspired primarily by Thorndike's Law of Effect. This law states that behaviors followed by pleasant consequences are likely to be repeated, while behaviors followed by unpleasant consequences are likely to be avoided.
While this principle inspired Skinner, he researched it further and named the resulting framework operant conditioning. Skinner's research showed that conditioning (learning) could occur through the use of punishments and rewards. The two central concepts of operant conditioning are punishment and reinforcement. Finally, Skinner also discussed the possibility of a neutral operant, which neither increases nor decreases the likelihood of a behavior occurring.
In Skinner's famous experiment, he created a "Skinner box" that he used to learn about the behaviors of rats. In one experiment, when a rat pressed a lever in the box, a food pellet dropped down. Which of the following best describes how the rat initially learned to press the lever?
Negative punishment
None of these
Observational learning via Skinner
Random chance
Random chance
In Skinner's famous experiment, the rats ran around in their boxes, and accidentally pressed the lever by "random chance," which made a food pellet drop down. "Observational learning" is a term used to describe learning through imitating others, and is unlikely to occur between different species. Finally, based on the example, we have no reason to believe that any sort of punishment is occurring.
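A rough way to picture this is a lever-press probability that starts near zero (pure chance) and is nudged upward each time a press is reinforced. The sketch below is a hypothetical Python illustration of that idea, not Skinner's actual procedure:

import random

press_prob = 0.05            # at first the rat presses the lever only by chance
for trial in range(200):
    pressed = random.random() < press_prob
    if pressed:               # each press delivers a food pellet (reinforcement)
        press_prob = min(1.0, press_prob + 0.1)

print(f"press probability after 200 trials: {press_prob:.2f}")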
Example Question #8 : Operant Conditioning
American psychologist B.F. Skinner is best known for his work in operant conditioning. Like all great academics, Skinner was not without influence. His work was inspired primarily by Thorndike's Law of Effect. This law states that behaviors followed by pleasant consequences are likely to be repeated, while behaviors followed by unpleasant consequences are likely to be avoided.
While this principle inspired Skinner, he researched it further and named the resulting framework operant conditioning. Skinner's research showed that conditioning (learning) could occur through the use of punishments and rewards. The two central concepts of operant conditioning are punishment and reinforcement. Finally, Skinner also discussed the possibility of a neutral operant, which neither increases nor decreases the likelihood of a behavior occurring.
Which type of reinforcement schedule refers to reinforcement occurring after an unpredictable number of behavioral occurrences?
Variable-ratio
Fixed-ratio
Variable-interval
Fixed-interval
Variable-ratio
Immediately, both answer choices with the word "fixed" can be eliminated because the problem states that reinforcement occurs after an unpredictable number of occurrences. Additionally, a "variable-interval" schedule refers to reinforcement occurring after an unpredictable amount of time, rather than a number of occurrences. Finally, as the question asks, a "variable-ratio" schedule is one in which reinforcement occurs after an unpredictable number of behavioral occurrences.
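One way to keep the four schedules straight is that each is simply a rule for when the next reinforcement is due. The sketch below is a hypothetical Python illustration (the specific values, such as every fifth response or every 60 seconds, are assumptions for the example):

import random

def next_requirement(kind: str) -> float:
    # How many responses (ratio) or how much time (interval) must pass
    # before the next response is reinforced.
    if kind == "fixed-ratio":
        return 5                          # every 5th response
    if kind == "variable-ratio":
        return random.randint(1, 9)       # unpredictable count, ~5 on average
    if kind == "fixed-interval":
        return 60.0                       # first response after 60 seconds
    if kind == "variable-interval":
        return random.uniform(1, 119)     # unpredictable delay, ~60 s on average
    raise ValueError(kind)

# Example: a variable-ratio schedule, as in this question, reinforces after an
# unpredictable number of responses.
responses, requirement, rewards = 0, next_requirement("variable-ratio"), 0
for _ in range(100):                      # 100 responses
    responses += 1
    if responses >= requirement:          # reward, then draw a new requirement
        rewards += 1
        responses, requirement = 0, next_requirement("variable-ratio")
print("rewards earned over 100 responses:", rewards)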
Example Question #9 : Operant Conditioning
American psychologist B.F. Skinner is best known for his work in operant conditioning. Like all great academics, Skinner was not without influence. His work was inspired primarily by Thorndike's Law of Effect. This law states that behaviors followed by pleasant consequences are likely to be repeated, while behaviors followed by unpleasant consequences are likely to be avoided.
While this principle inspired Skinner, he researched it further and named the resulting framework operant conditioning. Skinner's research showed that conditioning (learning) could occur through the use of punishments and rewards. The two central concepts of operant conditioning are punishment and reinforcement. Finally, Skinner also discussed the possibility of a neutral operant, which neither increases nor decreases the likelihood of a behavior occurring.
A man wants to teach his puppy to sit. In order to teach the dog, the man tells the puppy "sit" and rewards her with a treat each time she does so successfully. Eventually, he also uses a hand signal while saying the word "sit." After some time, the dog begins to respond to both the word "sit" and to the hand signal. In this example, which of the following can be considered a secondary reinforcer?
Both the word "sit", and the hand signal
The word sit
The treat
The hand signal
Both the word "sit", and the hand signal
A secondary reinforcer is learned; therefore, it has no innately desirable attributes. On the other hand, a primary reinforcer is innately pleasing. In this example, the treat is innately pleasing to the dog and is the primary reinforcer. Both the word "sit" and the hand signal have been paired with each other, as well as with the treat. Because they have been paired with the treat (i.e. the primary reinforcer), they have become secondary reinforcers.