Psyc Study Guide
o Behaviorism
Roots
Watson (1913)
Skinner (1938)
Similarities and Differences
o What is Behavior
Behavior (B): any action (Voluntary or involuntary) exhibited by an
organism
Eye blink
Walking
o Watson’s Methodological Behaviorism (1913)
Method: Introspection → Experimental method
Subject matter of psychology: Consciousness of mind → Directly observable and measurable behavior
Causes of behavior: Mentalistic → Environmental events
o Why should we reject Mentalism
Circular logic
You must provide an explanation that is independent of behavior in
question
According to scientific standards, mentalistic events are NOT real
Ascribing mental causes obstructs inquiry into the real causes
o Modern Behaviorism: Skinner's Radical Behaviorism
Two types of internal events
Private events
o Hunger, thoughts, sensations, dreams
o Natural/ physical
o Covert behavior
Mental events
o Non-natural, fictional
o Abstractions, ideas, concepts, constructs
o Skinner and Watson: Another difference
Watson: All behaviors are reflexive
Skinner: Two types of behaviors
Reflexive behaviors
o Mechanistic causality
Operant Behaviors
o Affected by consequences
o Research Methods
Descriptive research
Experimental Research
Types of Experimental Methods
Group Design
Single-Subject Design
o ABA Reversal
o Multiple Baseline
o Research Methods
Descriptive: Provide a description of B
Experimental: Provides explanation for B
Goal of an experiment: To establish if a certain factor, or variable causes a
change in behavior.
Independent Variable (IV): factor under investigation
Dependent Variable (DV): behavior being measured
Burden of Proof: IV is indeed what caused change; no other variable in
study could have done it
o Experimental Research
Compare 2 or more variables
All other conditions must be equal (No confounding variables)
You need:
A comparison
Everything else equal, except for the IVs,
Measures
Random assignment of subjects to conditions
o Types of experimental designs or methods
Group Design
Single-subject design
ABA reversal
Multiple Baseline
Lecture 2
o Innate Behavior
Innate behavior defined
Four Criteria
Types of Innate Behavior
Reflexes
Fixed Action Patterns
Reflexes and FAPs, similarities and differences
Innate and Learned Behavior in Balance
o Innate Behavior: Behavior that is not learned
4 Criteria: Unlearned, Invariant, Universal, Adaptive
How do we show that a behavior is unlearned?
o Two Types of Innate Behavior
1. Reflexes (S -> R)
US -> UR
Example of Innate Reflexes
Allergens → sneeze
Food → saliva
Light → pupil constriction
2. Fixed Action Pattern (FAP): specific sequence, or pattern, of behavior
elicited by a specific stimulus ("releaser" or "sign stimulus")
Stickleback Fish
FAPs are neither intentional nor purposeful
o Spiders and cocoon building
o Graylag goose
Motivational Conditions Sometimes needed
o Reflexes and FAPs: similarities and differences
Similarities:
4 Criteria
Specific Stimulus
Differences
Reflex: 1 Action
FAP: more than 1 action/ behavior
Reflex: Part of organism
FAP: Whole organism
FAPs in humans? (Motherly crap)
o Balancing Innate and Learned Behavior in Nature
Do all organisms need ability to learn in order to survive?
Static vs. Dynamic Environments
The advantage of learning
The cost of learning
o Effects of Repeated stimulation
Habituation
Related Phenomena
Sensitization
When Habituation? When Sensitization?
o Dual Process Theory
o Habituation
Habituation: decrease in the strength of a behavior/ response
White Noise; intermittent weak stimulus
Adaptive: Avoid sensory overload
Keeps us open to new stimuli
Stimulus-specific
o Phenomena related to Habituation
Spontaneous Recovery
Recovery of a habituated response after a break
Need a reasonable amount of time
Retention of habituation (Long term)
Complete spontaneous recovery may not occur
Dishabituation
Recovery of a habituated response after a new stimulus is presented
What is common and what is different (in terms of procedure and what is
measured) between spontaneous recovery and dishabituation?
o Common: measure the amount of response to the stimulus
o Difference: spontaneous recovery requires a span of time;
dishabituation presents a single novel stimulus
Procedures used to demonstrate stimulus-specificity of habituation
and dishabituation
o Sensitization
Sensitization: increase in strength of behavior
Gunfire, moviegoer
NOT stimulus-specific, presentation of one stimulus may increase
response to another stimulus
o Dual Process theory
Each stimulus presentation results in 2 opposite processes, called
habituation and sensitization. Any change in behavior is a net result of 2
processes
Higher intensity → sensitization
Lower intensity → habituation
Moderate intensity → sensitization, then habituation (see the toy sketch below)
o When habituation when sensitization? Dual Process Continued
Habituation is a continuous process: it continues to increase with
stimulus presentations and starts to decay only after the stimulus ceases
Sensitization is a temporary process: it begins to decay while the stimulus is
still being presented
o Stimulus Frequency and Intensity
Frequency
Both are direct functions of stimulus frequency
Intensity
Sensitization: direct function of stimulus intensity
Habituation: Inverse function of stimulus intensity
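A toy numerical sketch of the intensity claims above (my own illustration with arbitrary numbers, not a formal dual-process model): habituation accumulates faster for weak stimuli, sensitization builds with intense stimuli, and the observed response is the net of the two.

# Toy dual-process illustration (Python, assumed parameters only)
def run_session(intensity, n_presentations, baseline=10.0):
    H, S = 0.0, 0.0                                # habituation and sensitization processes
    responses = []
    for _ in range(n_presentations):
        H += 1.0 / intensity                       # habituation grows each presentation (more for weak stimuli)
        S = max(0.0, S * 0.5 + intensity - 1.0)    # sensitization builds with intense stimuli (partial decay each step)
        responses.append(max(0.0, baseline - H + S))  # net response = baseline - habituation + sensitization
    return responses

print(run_session(intensity=0.5, n_presentations=10))  # weak stimulus: response shrinks (habituation wins)
print(run_session(intensity=4.0, n_presentations=10))  # intense stimulus: response grows (sensitization wins)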
Lecture 3
o Classical Conditioning (Pavlovian Conditioning)
Associations/ pairings
Terminology
Measures/ types of trials
Related Concepts
o Learning Via Association
Learning in Classical conditioning is by association
Key: Pair 2 stimuli together; one has some important survival
characteristics, one does not
The learned stimulus must occur before presentation of the stimulus
that elicits the reaction
Through pairing, the once neutral stimulus becomes a conditioned
stimulus
CS → CR chain is a learned conditioned reflex
Most stimuli are external
Important questions to ask in classical conditioning
What was learned (conditioned)?
What was innate (unlearned/unconditioned)?
Which is the stimulus?
Which is the response?
o Pavlov's Procedure
Food paired (associated) with metronome (NS)
Result: NS becomes a CS
Food paired with bell
NS becomes a CS
How do we know this change occurred?
Saliva flowed during presentation of just the CS, before the US was
presented
Measures in classical conditioning
Percentage of CRs: % of trials in which CR occurred
Magnitude of CR: amount of saliva produced
o Percentage and magnitude of CR should both increase with
successive trials
o Latency of CR: time between onsets of CS and CR
Latency typically decreases with successive trials
o Direct physiological response
Changes in HR, BP, muscular tension
Measures in Classical Conditioning
Indirect measurements
o Approach to/avoidance of the CS (a quick worked example follows)
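A quick worked illustration of these measures (my own numbers, purely hypothetical): if the CR appears on 8 of 10 test trials, the percentage of CRs is 8/10 = 80%; if the dog produces 0.2 mL of saliva on trial 5 and 0.6 mL on trial 20, magnitude has increased; and if CR onset moves from 4 s after CS onset early in training to 1 s late in training, latency has decreased. All three changes indicate stronger conditioning.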
Two Types of Trials
Conditioning trials (training trials, regular trials): trials in which
there is a CS-US pairing
Test trials: trials in which the CS is presented alone
o Usually interspersed among conditioning trials
o Typically present ~1 test trial among every 10 conditioning
trials or so
o Related Concepts
Extinction: CS repeatedly presented w/o US
Ring bell: No food -> Little or no saliva
Crying “wolf”
Disinhibition
Recovery of the CR during extinction after a novel stimulus
Like dishabituation, but for a CR inhibited by extinction
Spontaneous Recovery: Reappearance of CR after time passes after
extinction
o Running into your ex, those feelings return (briefly)
Lecture 4
o Classical Conditioning: Special Procedures
Excitatory/ Inhibitory Conditioning
Effects of experiences that precede ClassCon
Latent Inhibition
Higher order Conditioning
Sensory Pre-Conditioning
Compound Stimuli
Blocking
Overshadowing
Timing
o Excitatory/Inhibitory Conditioning
Excitatory conditioning: CS+
NS → presentation of US (bell → food)
Inhibitory conditioning: CS−
NS → absence or removal of US
Owner of scary dog is there → dog doesn't bite
Occasion setting: a stimulus signals the CS–US contingency
Presence or absence of the stimulus affects the CR
E.g., light: bell → food; no light: bell → no food
o Higher order conditioning
Second order conditioning
Metronome: food → salivation
Metronome → salivation
Light: metronome → salivation
Light → salivation
Pairing a new stimulus with an established CS to elicit an established CR
The new stimulus becomes a CS2 and elicits CR2
CR2 is usually lower in magnitude than CR1
Latent Inhibition
A novel stimulus is more effective for conditioning; a familiar one is harder to condition
An explanation of the "friend zone"
Sensory pre-conditioning
Like higher-order conditioning, a stimulus becomes a CS even though it
was never paired with the US
Difference: the two stimuli are paired before the US was ever presented –
neither had yet become a CS
Compound Stimuli
Overshadowing
o The stronger component of a compound stimulus becomes
a CS, but the weaker component will not
Gunfire + light tapping: candy → saliva
Gunfire → saliva
Light tapping → no saliva
Blocking
o Presence of an established CS interferes with conditioning
a new CS
Red light: candy → saliva
Red light and green light: candy → saliva
Green light → no saliva
Too focused on red to care about green
o Timing of classical conditioning
Delayed conditioning: most effective
CS onset, US onset, CS offset, US offset
Trace Conditioning
CS onset, CS offset, US onset, US offset
Simultaneous conditioning
CS and US at same time
Backward conditioning: least effective
US onset, then CS onset
o Theories of classical conditioning
Two types of theories
Type of association formed
Nature of the CR
Pavlov's stimulus substitution theory
Siegel's compensatory-CR theory
Rescorla-Wagner theory
o Two types of theories
Type of association formed as a result of classical conditioning
S-S (stimulus-stimulus)
S-R (Stimulus- response)
o Research emphasizes S-S associations more
Form/ nature of the CR ( eye blink, wing beats)
o Pavlov's stimulus substitution theory
The US stimulates a US center in the brain, which excites a response center
The CS stimulates a different part of the brain than the US
After pairings, a CS-US neural connection is made
The CR should take the form of the UR: light → food: dog licks the light
Preparatory response theory
Form of the CR depends on the type of stimulus
Rat: shock → jump; light paired with shock → light: freeze
o Siegel's Compensatory-CR Theory
US: drug + primary effect of drug
Coffee US: caffeine and alertness
UR = response that opposes the drug's primary effect
Coffee example: UR = sleepiness
UR is a compensatory response
UR occurs after the drug's primary effect
The situation/environment in which you take your drug, which always precedes
your drug intake, becomes the CS
Starbucks becomes the CS
CR = UR (sleepiness); both are compensatory
Another conditioning example
Beer intake (bar setting = CS) → increased CR
CR occurs before the primary effect
Size of the CR increases with training
o Opposes the primary effect more => drug has a lesser effect
o This is known as chronic tolerance
Siegel's Compensatory-CR Theory
o Results from learning association between drug intake and
environment, NOT from repeated exposure to drug
o Depends on context of drug intake: situational specificity
o Context becomes CS and brings out CR
o Rescorla-Wagner Theory
A US supports a limited amount of conditioning
Associative value is distributed among the CSs
A stronger US supports more conditioning
Explains overshadowing, blocking, and the overexpectation effect
Tone: V = 0 → 10 with food (max = 10) → salivation
Light: V = 0 → 10 with food (max = 10) → salivation
Tone V = 5 → salivation (value shared when tone and light are trained together)
Limits of "love" to give (limited associative strength; see the simulation sketch below)
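The Rescorla-Wagner update rule is usually written ΔV = αβ(λ − V_total), where V_total is the summed associative value of all CSs present on a trial and λ is the maximum the US can support. Below is a minimal simulation sketch (my own illustrative code, not course material; the learning rate and trial counts are arbitrary) showing how blocking falls out of the model.

# Toy Rescorla-Wagner simulation (Python, illustrative assumptions only)
def rw_trial(V, present, lam=1.0, alpha_beta=0.3):
    # Apply dV = alpha*beta*(lambda - V_total) to every CS present on the trial
    v_total = sum(V[cs] for cs in present)
    for cs in present:
        V[cs] += alpha_beta * (lam - v_total)

V = {"tone": 0.0, "light": 0.0}
for _ in range(20):                    # Phase 1: tone -> food; tone's V approaches lambda
    rw_trial(V, ["tone"])
for _ in range(20):                    # Phase 2: tone + light -> food
    rw_trial(V, ["tone", "light"])     # little is left for light to gain: blocking
print(V)                               # tone ends near 1.0, light stays near 0.0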
o Therapeutic Applications of Classical Conditioning
Stimulus Generalization and discrimination
Classical conditioning based therapies for eliminating the CR
Flooding
Counterconditioning
o Systematic Desensitization
o Aversion therapy
o Generalization/ discrimination
Stimulus Generalization
CR elicited by stimuli similar to original CS
Stimulus discrimination
CR elicited more by specific stimuli
More similarity to CS -> stronger CR
Experimental neurosis: difficulty discriminating
o Unpredictable events → neurotic symptoms/anxiety
o E.g., circle = CS+, oval = CS−, gradually made
indistinguishable
Generalization contributes to phobias
Eliminating the CR
Extinction: Presenting the CS for short periods of time repeatedly
without presenting US
Metronome (CS) → no salivation
Needs short CS presentations, just as before
Fears and phobias
o Acquired via classical conditioning
Dog: bite → pain; dog => fear
Fears and phobias
Acquired via other means (e.g., observation)
o Classical condition based therapies can help treat phobias,
regardless of how they were acquired
Flooding
o Prolonged exposure to CS while preventing escape
o Counterconditioning
Counterconditioning: learning a new incompatible association that
counteracts the original one
CS → CR: metronome → saliva
Old CS (metronome): shock → fear
Metronome → fear
New CR incompatible with old CR: reciprocal inhibition
Systematic desensitization: a series of relaxations paired with a hierarchy of
fear-eliciting stimuli
Phobias: relaxation paired with the old CS of the phobia
Old CR (fear) and new CR (calm) are incompatible
o Counterconditioning Systematic Desensitization
1. Fear hierarchy is established
2. Physical relaxation training
Train to be aware of tension
3. Gradual presentation of hierarchy items
Start with least fear arousing
Can be in vivo or imaginal
Aversion therapy: therapy for detrimental behaviors in which one cannot
stop engaging
For addictive behaviors
Develop an aversive CR to stimuli associated with the problematic (desired) behavior
Study Guide
Chapter 1
1. Learning: A relatively permanent change in behavior that results from some type of
experience
2. Behavior: Any activity of an organism that can be observed or somehow measured
3. Behaviorism: Natural Science approach to psychology that traditionally focuses on
study of environmental influences on observable behavior
4. Latent Learning: Learning that occurs in the absence of an observable demonstration of
learning and only becomes apparent under a different set of conditions
5. Law of Contiguity and Parsimony: (Contiguity) Law of association holding that events
that occur in close proximity to each other in time and space are readily associated with
each other.
(Parsimony) Simpler explanations for a phenomenon are generally preferable to more
complex explanations
6. Empiricism vs. Nativism: Nativism is the assumption that a person's characteristics are inborn
(the nature perspective), while empiricism holds that behavior patterns are learned, not inherited
(the nurture perspective)
7. Social Learning Theory: brand of behaviorism that strongly emphasizes the importance of
observational learning and cognitive variables in explaining human behavior. Has more
recently been referred to as social cognitive theory
8. Reciprocal Determinism: assumption that environmental events, observable behavior,
and person variables reciprocally influence each other
9. Radical Behaviorism: A brand of behaviorism that emphasizes the influence of the
environment on overt behavior, rejects the use of internal events to explain behavior, and
views thoughts and feelings as behaviors that themselves need to be explained
10. S-R theory: The theory that learning involves establishment of connection between
stimulus and response
Internal Events
Consciously perceived thoughts and feelings and unconscious drives and motives, not supported
by Watson
Difficult because we rely on verbal reports of what people are internally feeling
Difficult to determine the actual relationship of thoughts and feelings to behavior; do thoughts and
feelings precede behavior?
We do not have means of directly changing internal events; the means of changing internal events
and external behavior is to change some aspect of the environment. Instruction creates relaxing thoughts
by manipulating the environment
Pseudo explanations
Skinner: believed that behavior was the result of an interaction between genes and environment.
Operant conditioning bears a resemblance to the evolutionary principle of natural selection.
Adaptive characteristics in a population increase. Behaviors that lead to favorable outcomes are
likely to be repeated
Watson: humans inherit only fundamental reflexes along with 3 basic emotions (love, rage, and
fear); everything else, he believed, is learned
Other vocab
1. Operant learning: rat more likely to hit lever because food will come out
2. Law of similarity: cars and trucks same category
3. Law of contrast: tall and short clean and dirty
4. Law of contiguity: associate thunder and lightning
5. Law of Frequency: frequent items friend with perfume
6. Mind-body dualism: holds that some human behaviors (reflexes) are automatically elicited
by external stimulation, while other behaviors are freely chosen and controlled by the mind
(Descartes)
7. British Empiricists: knowledge is function of experience
8. Structuralism: possible to determine the structure of the mind by identifying its basic elements
(Edward Titchener)
9. Introspection: describe internal experiences
10. Functionalism: the mind evolved to help us adapt to the world around us (William James)
11. Methodological behaviorism: psychologists should study only behaviors that are
observable
12. Neobehaviorism: brand of behaviorism that utilizes intervening variables in form of
hypothesized physiological processes to explain behavior (Hull)
13. Cognitive behaviorism: utilizes intervening variables, form of hypothesized cognitive
process to explain behavior
14. Cognitive map: mental representation of ones spatial surroundings
Chapter 2
Case studies are also descriptive, involving intensive examination of one or a few individuals.
Based on observation with bias reduced to a minimum; detailed info about behavior
Experimental research: one or more independent variable are systematically varied to determine
their effect on dependent variable.
Control group design: individuals are assigned to an experimental group or a control group. The experimental
group is manipulated, controls are not. (Free food makes it harder to respond to get other food.)
Single-Subject Designs: require only one or a few subjects to conduct the entire experiment. In a
single-comparison design, behavior in a baseline condition is compared to behavior in a
treatment condition.
Other Vocab
Chapter 3:
1. Habituation: Decrease in strength of an elicited behavior. One tap startle, multiple tap
not so much
2. Dishabituation: When habituated responses reappear following presentation of a
seemingly irrelevant novel stimulus. Be OK with gunshots; a stud comes by, you get distracted and
are startled again
3. Spontaneous Recovery
4. Sensitization: Increase in the strength of an elicited behavior following repeated presentations
of a stimulus (e.g., bombs)
5. Fixed Action Pattern: fixed sequence of responses elicited by a specific stimulus.
6. Opponent-process theory
7. Reflex: relatively simple, automatic response to a stimulus
8. Sign Stimulus/releaser
9. Type of Conditioning
10. US: Unconditioned stimulus
11. UR: Unconditioned response
12. CS: Conditioned Stimulus
13. CR: Conditioned Response
Other Vocab
Opponent-process theory: an a-process is elicited by an event; a b-process, elicited by the a-process,
counteracts the a-process. The b-process is slower to increase and slower to decrease. Repeated
presentations cause the b-process to increase in both strength and duration
Chapter 4
Compare and contrast variations, extensions, and limitations of classical conditioning and the
additional phenomena associated with conditioning
Key terms
Chapter 5:
Key Concepts:
Compensatory-response model and its implications for drug use and tolerance
A CS that has been repeatedly associated with the primary response (A) to a US will eventually come to
elicit a compensatory response (B)
Heroin → decreased BP (A) → increased BP (B); an NS, such as a certain room, becomes a CS; the body
is prepared, so rather than going through the decreased blood pressure it goes straight to the increased
blood pressure
In vivo vs. imaginal: benefits to both; in vivo may elicit a huge response, but it also keeps
people from being conditioned just to the imagined (fake) stimulus.
S-R (stimulus-response) model: the NS becomes directly associated with the UR and therefore elicits the UR
S-S (stimulus-stimulus) model: the NS becomes directly associated with the US and elicits the response because of this association.
Dog bites kid: bite = pain; child fears dog
Rescorla-Wagner Theory: proposes that a given US can support only so much
conditioning; every CS is given an associative value, and the CSs can't use up more than the available resources.
Article 1
Like a rat:
Before (Antecedents), during (Behavior), and after (Consequences). What influenced animals to press the lever
in a laboratory cage. Different species learn faster; reward and punishment offer the same
results.
Only a brief period is needed in order to change behavior. Also reward for a hands-off approach.
Tiny Tyrants:
Don't beat your kids. Telling them what you want and then rewarding them for following through
is better than explaining what they did wrong or yelling at them.
Class Notes:
General info:
-Questions drawn from lectures, videos, articles, and relevant book chapters; this guide will cover the
most important concepts, but any assigned reading material or content discussed in class is fair game!
From chapter 6
Key Concepts: operant conditioning- what does it involve; think about examples (from class as well as
your own) and differentiate between positive & negative and reinforcement & punishment.
From chapter 7:
Key Terms: bliss point, chained schedule, continuous & intermittent reinforcement schedule, DRH, DRL,
DRP, drive reduction theory, FD/FI/FR/FT/VD/VI/VR/VT schedules, goal gradient effect, noncontingent
schedule of reinforcement, Premack principle, response deprivation hypothesis.
Key Concepts: compare/contrast the schedules of reinforcement and the typical response patterns they
elicit. Compare/contrast the theories of reinforcement.
From chapter 8:
Key Terms: DRO, discrimination training, discriminative stimulus for extinction, extinction, extinction
burst, fading, generalization gradient, multiple schedule, partial reinforcement effect, peak shift,
resurgence, spontaneous recovery, stimulus control, stimulus generalization,
Key Concepts: aspects of extinction and stimulus control. Don’t worry about matching-to-sample or
behavioral contrast.
From chapter 9:
Key Terms: learned helplessness, one- and two-process theories of avoidance, experimental neurosis.
Key Concepts: problems with punishment, effective use of punishment, types of punishment, compare &
contrast the forms of noncontingent punishment. Don’t worry about theories of punishment.
Key Terms: bias, commitment, concurrent schedule, impulsiveness, matching law, melioration theory,
overmatching, self-control, undermatching.
Key Concepts: deviations from matching; drawbacks of melioration; Ainslie-Rachlin Model of self-
control; factors that influence self-control and impulsiveness.
No Brakes article: adolescent risk: why it happens & common interventions don’t work
Cognition, Creativity, and Behavior video: understand the purpose of the experiments and the main
points that Epstein and Skinner were making about mental processes, creativity, and self-concept.
Lecture 6:
Operant Conditioning
o Classical vs. Operant conditioning
o 4 basic operant procedures
o Factors that make operant procedures more effective
o Primary and secondary reinforcers
o Shaping
o Conditioning
Classical Conditioning vs. Operant Conditioning
Classical: reflex responses are associated with new stimuli | Operant: learning based on consequences of responding
Classical: reflexive behavior | Operant: operant behavior
Key component — Classical: association between 2 stimuli | Operant: consequences of one's behavior
Measures — Classical:
Percentage of CRs
Magnitude of CRs
Latency of CRs
Direct physiological response
Indirect (approach to/avoidance of the CS)
Measures — Operant: measured only by the ...
o 4 basic procedures
Positive Reinforcement
Negative Reinforcement
Positive Punishment
Negative Punishment
o Operant Conditioning – 4 consequences
Something good can start or be presented
Something Good can end or be taken away
Something bad can start or be presented
Something bad can end or be taken away
o Operant Conditioning (Voluntary)
Operant Conditioning: Learning through voluntary behavior and its
subsequent consequences
Reinforcement: Strengthens a response and makes it more likely to occur
Punishment: weakens a response and makes it less likely to recur
o Operant Conditioning – 4 consequences
Positive: technical term for an event started or an item presented, since it's
something that is added to the animal's environment
Negative: technical term for an event ended or an item taken away, since
it's something that is subtracted
o Operant Conditioning – Reinforcements
Positive reinforcement: when a response is followed by receiving a reward
or other desirable event
Negative reinforcement when a response is followed by removal of an
unpleasant event ( stop headache aspirin)
o Operant Conditioning – Punishments
Punishment: Any event that follows a response and decreases likelihood of
occurring again
Positive punishments = adding a stimulus weakens likelihood of
reoccurrence (run 4 extra laps)
Negative punishment = taking away a stimulus weakens likelihood of
reoccurrence (makes you sit somewhere else)
o Side Effects of Punishment
Increased Aggression
Passive aggressiveness
Avoidance behavior
Modeling
Temporary suppression
Learned helplessness
Uses of partial reinforcement
o Shaping
Reinforce successive approximations of target behavior
One step at a time
Keep raising the bar
Creating new behaviors
Shaping
o Begin with a behavior the subject already does and shape it
to the behavior you want
o Need to reinforce each step (successive approximation)
Stop reinforcing a step to encourage subject to try
new step
Goal: perform target behavior
Subtypes of negative reinforcement
o Escape
Experience something, then perform behavior (Rain open umbrella)
o Avoidance
Perform behavior before you experience something (umbrella going into
rain)
Making Operant conditioning more effective
o Contingency (Dependency)
o Contiguity (Immediacy)
o Size
o Deprivation
Primary and secondary Reinforcers
o Reinforcer = consequence that strengthens behavior
o Primary positive SR: biological need
Food, water, sex, sleep
o Primary negative SR : escape
o Secondary Reinforcers (conditioned reinforcers)
SRs have acquired value through being:
Paired with established SR
Exchanged with an established SR
Established SRs can be first degree or 2nd reinforcers
o Money; Praise, social SR
Secondary Reinforcer Subtypes
o Standard: exchangeable only for one type of primary SR
o Generalized Conditioned Reinforcer
Can be exchanged for multiple reinforcers
o Advantages
Cheaper
More immediate
Avoid satiation
More conveniently delivered
Not tied to any specific deprivation
Extrinsic and Intrinsic Reinforcers
o Extrinsic Reinforcement: getting an external reinforcer for performing a behavior
Reading War and Peace for a good grade in class
o Intrinsic reinforcement: getting reinforcement simply by performing the behavior
Reading War and Peace for fun
Lecture 7
Schedules of Reinforcement
o Continuous vs. Intermittent Schedules
Ratio vs. Interval Schedules
Fixed vs. Variable Schedules
o Response Patterns and Rates
Continuous vs. Intermittent Schedules
o Schedule of Reinforcement: rules according to which the behavioral response (R)
is reinforced
o Continuous (CRF) = every response is reinforced
o Intermittent schedules are far more common
Ratio Schedules: reinforcement SR depends on number of times a
response R is repeated, or the amount of behavior that is emitted.
More responses (Rs) = more reinforcers (SRs)
Interval Schedule: reinforcement SR depends on passage of time, plus one
response
Responses during the interval are not reinforced; subjects must wait until
enough time has passed and then make one response to get the SR
Fixed schedule: SR is predictable
Fixed Ratio (FR) = SR after every X responses
Fixed Interval (FI) = get SR after every X seconds, plus one
response
Variable schedule: SR is unpredictable
Variable Ratio (VR) = get SR after about every X responses
Variable Interval (VI) = get SR after about every X seconds plus
one response
Response Patterns and Rates
o Pattern of responses: how responses are distributed between reinforcers
o Rate of responses: number of responses per minute
o FR: Rate is high until SR, then pause; pattern resembles steps
o FI: Rate starts low, increases closer to SR with post reinforcement pause
Pattern resembles shallow scallops
o VR: Rate is extremely high and steady with NO pause
Pattern is steep slope
o VI: rate is steady with no pauses, not as high as VR pattern is a moderate slope.
Schedules of reinforcement
o Fixed ratio – buy 5 get 1 free cards
o Variable ratio – slot machine
o Fixed interval – paycheck
o Variable interval – studying for a pop quiz
(The four rules are written out explicitly in the sketch below.)
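A compact way to keep the four basic schedules straight is to write each reinforcement rule as an explicit check. This is my own illustrative sketch (hypothetical function names and parameter values, not course code); VR is approximated as a per-response probability.

# Toy reinforcement-schedule rules (Python, illustrative assumptions only)
import random

def fixed_ratio(responses_since_sr, x=5):
    # FR-x: reinforce the x-th response since the last reinforcer
    return responses_since_sr >= x

def variable_ratio(x=5):
    # VR-x: approximated as each response paying off with probability 1/x (about x responses per SR on average)
    return random.random() < 1.0 / x

def fixed_interval(seconds_since_sr, x=30):
    # FI-x: reinforce the first response made after x seconds have elapsed
    return seconds_since_sr >= x

def variable_interval(seconds_since_sr, current_interval):
    # VI-x: reinforce the first response after an unpredictable interval (mean of x seconds)
    return seconds_since_sr >= current_interval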
Superstition
o What occurs when a reinforcer is not contingent
o The subject connects unnecessary behaviors with the non-contingent SR
Human behavior will change over time
We do learn, even under non-contingent reinforcement
Lecture 8:
Theories of reinforcement
o Skinner's view of reinforcement
No theory required
An empirical issue
If something acts to increase a behavior then it’s a SR
Inductive view point
A SR is a stimulus that reinforces the operant R
o Two basic theory types
o Hull's drive reduction theory
o The Premack principle
o Response Deprivation Theory
o Bliss point theory
Two types of theories
o Reinforcers as stimuli (Hull)
o Reinforcers as activities
SR = eating the food
SR = watching TV
o 3 main reinforcers as activities theories
Hull’s Drive-Reduction theory
o Stimuli that reduce a biological need, or that reduce strong aversive stimulation,
will become reinforcers
o Drive: hunger → Act: lever press → Drive reduced: receive food
o Drive: escape shock/pain → Act: lever press → Drive reduced: negative SR (shock
ends)
o 3 types of reinforcement
Primary reinforcers
Stimuli that reduce drives (food)
Stimuli that result in reduction of another strong aversive stimulus
(painkillers)
Secondary Reinforcers
Stimuli associated with drive-reducing stimuli (kitchen, cooking
shows, meeting with snacks)
Problem with Hull's theory
o Researchers found stimuli that had high reinforcement value, yet did not reduce
any drives
Monkey pulled chains to see lab and its workers
Rats pressed levers to stimulate pleasure centers
Premack's principle
o Reinforcement is a contingency between 2 behaviors
Instrumental R and contingent R
o Getting to do the second behavior is contingent upon first performing the
instrumental behavior
o Doing the first behavior is instrumental in your being able to do the second,
contingent behavior
o Can consider the contingent activity as the reinforcer
o Not all behaviors are reinforcers
o More probable Behaviors reinforce less probable behaviors
o Contingent Behaviors have higher probability than instrumental behavior
Instrumental behavior goes down - contingent behavior goes up
Measure through free baseline (paired baseline)
o The bottom line:
Getting to do the more probable behavior is contingent upon performing the
less probable instrumental behavior
To demonstrate a significant contingency effect you must perform the
instrumental behavior above its baseline level
o Shortfalls of Premack's principle
Premack assumed that the more preferred activities were those that people
spent more time doing during baseline
Doing homework: 30 minutes – multiple hours
Having sex: typically less time than homework
o Premack did not attach time amounts to his behaviors
Free baseline: run 60 min, drink 30 min
Contingency: drink 15 min to run 30 min
Result – rat drank 30 min, then ran 60 min
No increase above baseline, no contingency effect
Response-Deprivation Theory
o A reinforcement effect will occur only when the reinforcement contingency
deprives the subject of the contingent activity (a worked check follows below)
Performing the baseline amount of the instrumental R earns you less than the baseline
amount of the contingent R
Make the subject perform the instrumental behavior at a higher level than its baseline
to obtain at least the baseline level of the contingent behavior
o Premack: behavior probabilities; response deprivation: behavior preferences
o The contingent activity doesn't have to be more probable
Any activity can be reinforcing as long as the subject is deprived of
performing it at its baseline level
o Use a less preferred activity as the instrumental behavior to gain access to the more
preferred contingent behavior
Employ these in ratios that are above baseline
o Subjects do not always perform contingencies as predicted/calculated by response
deprivation
o Subjects instead alter their behaviors to compromise
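One common way to formalize the response-deprivation condition (my own restatement, not quoted from the notes): with I and C the scheduled amounts of the instrumental and contingent behaviors, and O_I and O_C their free-baseline amounts, the contingency is predicted to reinforce when I/C > O_I/O_C, i.e., when meeting the requirement at baseline rates would leave the subject below its baseline level of the contingent activity. Worked check against the rat example above: I/C = 15/30 = 0.5 and O_I/O_C = 30/60 = 0.5, so the inequality fails and no reinforcement effect is predicted — matching the "no increase above baseline" result.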
Bliss Point theory
o Baseline = behavioral ideal
Behavioral ideal drives performance
Any contingency imposed disrupts bliss point
S increases frequency of BI as a compromise
Point of Minimum Deviation
Accurately predicted by formulae
Lecture 9
Extinction
o Operant extinction
No longer reinforcing operant behavioral response
Ultimately reducing behavior probability to zero
o Ideally combined with DRO (esp. DRI)
Effects of Extinction
o Initial and temporary effects
Extinction burst
Emotional behaviors
Frustration
o Emotion when no longer reinforced
Aggression
o Kicking, cursing, biting
o Other effects (most occur soon, disappear quickly)
Increase in behavior variability
Resurgence
Depression
o Spontaneous recovery
Recovery of extinguished response after a sufficient break
Ultimately response decreases
Resistance to extinction (RTE)
o Ext. takes longer under intermittent reinforcement
PRE (partial reinforcement effect)
o Resistance to extinction: extinction takes longer
CRF schedules have low RTE
Intermittent schedules have high RTE
o To decrease RTE best strategy
Combine extinction of the inappropriate behavior with reinforcement of an
incompatible behavior (DRI)
Generalization and Discrimination
o Discriminative stimuli
o Stimulus control
o Stimulus generalization
Generalization gradient
o Stimulus discrimination
Discrimination training
o Peak shift
Stimulus Discrimination: tendency for an operant response R to be emitted more in
presence of one stimulus than another
Discriminative Stimuli
o Pigeon learns to emit a key-peck R to obtain an SR of food
R (peck) → SR (food)
o Reinforce R only in the presence of a lighted key
SD (yellow key): R (peck) → SR (food)
o Discriminative Stimulus (SD) a stimulus in presence of which responses are
reinforced and in the absence of which they are not reinforced
Stimulus Control
o Stimulus control: when presence of SD reliably affects probability of behavior R.
Examples:
o Pigeon more likely to peck light when lit up
o Drivers more likely to pull into gas station when low fuel light comes on
Stimulus generalization: tendency for an operant response R to be emitted in the presence
of a stimulus that is similar to SD
o SD (yellow key): R (peck) → SR (food)
Discrimination Training
o Not always desirable for a behavior to occur in every situation
(flirting during a funeral)
o How do we behave appropriately in each situation
o Discrimination training involves reinforcement of responding in the presence of
one stimulus (SD) and not another stimulus
Discriminative Stimulus for extinction S delta: stimulus that signals absence of
reinforcement
Discrimination Training Example
o Pigeon: SD (yellow key): R (peck) → SR (food)
o AND S-delta (blue key): R (peck) → NO FOOD
2 types of discrimination training
o Intradimensional: both SD and Sdelta are from the same dimension (same
spectrum, noise type, etc.)
o Interdimensional: SD and Sdelta are from different dimensions (SD =light; S delta
= sound)
Having a stimulus being ON vs. OFF also counts SD = light on S delta =
light off
o Intra = within
o Inter = between
The Peak Shift
o Peak shift has two crucial elements:
Peak is no longer centered over SD
The entire gradient has shifted away from Sdelta
o What causes the peak shift?
Interaction between excitatory and inhibitory gradients (see the sketch below)
Excitatory gradient (around SD)
Inhibitory gradient (around S-delta), usually flatter
Peak shift occurs only after intradimensional training
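A quick numerical illustration of the gradient-interaction account (my own toy example; the Gaussian shapes, wavelengths, and weights are arbitrary assumptions): subtracting a flatter inhibitory gradient centered on S-delta from an excitatory gradient centered on SD leaves a net gradient whose peak is pushed away from S-delta.

# Toy peak-shift illustration (Python, assumed numbers)
import numpy as np

wavelengths = np.linspace(500, 600, 201)   # hypothetical stimulus dimension (nm)
sd, s_delta = 550, 560                     # SD and S-delta values (made up)
excitatory = np.exp(-((wavelengths - sd) ** 2) / (2 * 15 ** 2))              # gradient centered on SD
inhibitory = 0.6 * np.exp(-((wavelengths - s_delta) ** 2) / (2 * 25 ** 2))   # flatter gradient around S-delta
net = excitatory - inhibitory
print(wavelengths[np.argmax(net)])         # peak of the net gradient lies below 550, i.e., shifted away from S-delta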
Lecture 10
Lecture 11
Timeout Article
Timeouts should be used sparingly; the side effects of excessive punishment are more
significant than the benefits
Brief: a timeout's positive effect on behavior is almost all concentrated in the first 2 minutes
Immediate: done immediately following the unwanted behavior
Isolation: leave him alone
Administered calmly and without repeated warnings
Pivot and be firm; take away privileges
Anger Article
Shock and Awe: respond swiftly with indignation, one of most common and least
effective
o Immediate: negative back and forth with the kid
o Long-term effect: increases occurrence of disrespect
o Side effect: teaches children to do the same to others
The Evil Eye: stare down the child with a dire expression and say nothing
o Immediate: the child's behavior escalates and continues
o Long term: better than full rage
o Side: helps develop calmer behavior on the child's part in the long run; too harsh is still
inflammatory
Rational Saint:
o Go to the child and in a gentle voice explain why what she is doing is not appropriate;
ineffective
o Immediate: does not change behavior
o Long term: modeling calm in response to rage will have a positive influence
over time; slow to occur
o Side: does not teach the child proper behavior
Ringmaster: divert the child's interest to something else
o Immediate: not likely to work
o Long term: no effect
o Side: avoids the task of teaching other ways to handle stress
Void: ignore and walk away
o Immediate: withholding all attention deescalates the child's behavior, ends the child's
comments
o Long term: ignoring decreases the likelihood of disrespect over time
o Side: a weak response
Mona Lisa:
o Show no emotion, slightly amused; relatively effective
o Immediate: deescalates
o Decreases likelihood of future battles
o Side: no lesson is taught, but it asks for restraint from yourself
Parking ticket
o Most effective option: take away a privilege and walk away
o Long-term: decreases it happening again
Misbehavior Article
Find the positive opposite; identify what can be let go or not. Age creates
Why focus on it?
Chapter 7
Schedule of Reinforcement
Schedule of Reinforcement is response requirement that must be met to obtain
reinforcement
o Ex: Does each lever press by the rat result in food, or only after several lever presses?
o Did mommy give a cookie once or every time?
Continuous Versus Intermittent Schedules
o Continuous reinforcement schedule: one in which each specified response is
reinforced. Each time a rat presses the lever, it obtains a food pellet. Each time the dog
rolls over it gets a treat; each time you turn the ignition the motor starts
o CRF is useful when a behavior is first being shaped or strengthened. Praise the child each time
they brush their teeth
Intermittent Reinforcement Schedule: some responses are reinforced
o Ex: one in which only some responses are reinforced, only some of rats lever
presses result in food pellet. Not every person accepts a date.
o 4 Types: Fixed Ratio, Variable Ratio, Fixed Interval and Variable Interval
o Steady State behaviors: stable pattern
o Fixed Ratio Schedule: reinforcement is contingent upon a fixed, predictable
number of responses.
FR 5: Press 5 times to obtain food
FR 50: press lever 50 times to obtain food pellet.
Earning a dollar every 10 toys assembled = FR10
FR1 = CRF (continuous reinforcement): each response is reinforced
FR schedules produce a high rate of response along with a pause after each reinforcer, called
the post-reinforcement pause.
Take a break following the reinforcer (e.g., a short break after each chapter);
after the break, a high rate of response.
FR pattern – "break and run" pattern: a short break followed by a run
of responses
Starting a task is harder
Longer tasks (FR 100 vs. FR 30) produce a longer pause
Reinforcer easy to obtain: dense or rich; hard to obtain: lean
Moving from dense to lean should be a slow process; too fast of a
jump is called ratio strain, a disruption in responding
Variable Ratio Schedule: reinforcement is contingent upon a varying,
unpredictable number of responses
VR5, emit average of 5 lever presses for each food pellet.
Generally produce a high and steady rate of response with little or
no postreinforcement pause.
o Each response on VR schedule has potential of resulting in
reinforce
o Gambling: the unpredictable nature of the activity results in a high
rate of behavior
o The less often the victimizer reinforces the victim, the more
attention he or she receives from the victim. The victim works so
hard to get the partner's attention that he or she actually
reinforces the process of being largely ignored.
Fixed Interval Schedule: reinforcement is contingent upon the first
response after a fixed, predictable period of time.
First lever press after a 30 second interval has elapsed results in
food, another 30 seconds must follow for food pellet.
Produces a post-reinforcement pause followed by a
gradually increasing rate of response as the interval elapses
Indicated time would be discriminative stimulus SD, As time
progresses look at watch more.
Variable Interval Schedules: reinforcement is contingent upon the first
response after a varying, unpredictable period of time. E.g., an average interval of
30 seconds will result in food.
Produce a moderate, steady rate of response with little or no
postreinforcement pause.
Comparing the 4 basic schedules:
o Produce quite different patterns of behavior
Ratio schedules produce higher response rates than interval schedules. The reinforcer in ratio
schedules is entirely response contingent.
Fixed schedules produce post-reinforcement pauses, whereas variable
schedules do not. On variable schedules there is always the possibility of a relatively immediate
reinforcer, even if one has just attained a reinforcer.
On a fixed schedule, attaining one reinforcer means the next reinforcer is
some distance away.
On an FR schedule, this results in a short post-reinforcement pause before
grinding out another set of responses
On an FI schedule, the post-reinforcement pause is followed by a gradually
increasing rate of response as the interval draws to a close and the reinforcer
becomes imminent
Other Simple Schedules of Reinforcement
o Duration Schedules: reinforcement is contingent on performing a behavior
throughout period of time.
Ex: Rat must run 60 seconds to earn one pellet of food (FD-60 Schedule)
o Variable Duration Schedule: behavior must be performed continuously for a
varying unpredictable period of time.
o On FR schedules, one knows precisely what was done to achieve the reinforcer; on an
FD schedule, it is unclear what constitutes continuous performance.
Ex: a lazy rat can walk or an energetic rat can run fast. Both will receive the
reinforcer.
Response Rate Schedules
o Different types of intermittent schedules produce different rate of response
o Reinforcement is contingent upon organisms’ rate of response.
o Differential Reinforcement of High Rate (DRH): Reinforcement is contingent
upon emitting at least a certain number of responses, reinforcement is provided
for responding at a fast rate.
One type of response is reinforced while another is not.
DRH schedule, reinforcement is provided for a high rate of response and
not for a low rate.
Ex: the rat gets food only if it presses the lever at LEAST 30 times within a minute; DRH
ensures a high rate of responding
o Differential Reinforcement of Low Rate (DRL): a minimum amount of time
must pass between each response before the reinforcer is delivered. Reinforcement is provided
for responding at a slow rate
The rat gets food only if it waits 10 seconds between each press. On a DRL
schedule, responses prevent reinforcement from occurring; responding
before 10 seconds have passed must not occur
o Differential Reinforcement of Paced Responding (DRP): reinforcement is
contingent upon emitting a series of responses at a set rate; reinforcement is
provided for responding neither fast nor slow
Ex: the rat gets food if it emits 10 consecutive responses, each separated by 1.5-2.5
seconds
Orchestra, music
Noncontingent Schedules: Non contingent schedule of reinforcement, the reinforcer is
delivered independently of any response. Response not required for reinforcer to be
obtained.
o Fixed Time Schedule: reinforcer is delivered following a fixed period of time
Ex: pigeon gets food every 30 seconds regardless of what happens
Free reinforcer
o Variable Time Schedule: the reinforcer is delivered following an unpredictable period of
time
o Noncontingent reinforcement accounts for forms of superstitious behavior
Ex: accidentally reinforced behaviors in pigeons. They would repeat the same
things, hoping for the same result.
Adjunctive Behaviors: fidgeting while waiting
o Carrying lucky charm
o Pigeons that receive free reinforcers will work less vigorously for the contingent
reinforcers.
o Effective in reducing frequency of maladaptive behaviors. If attention given on
noncontingent basis, they will not act out.
o Unconditional positive regard viewed as form of noncontingent social
reinforcement.
Complex Schedules of Reinforcement
o Complex Schedule: combination of 2 or more simple schedules
o Conjunctive schedules: type of complex schedule in which requirements of 2 or
more simple schedules must be met before reinforcer delivered
Ex: on an FI 2-minute FR 100 schedule, reinforcement is contingent upon
emitting 100 lever presses and completing one lever press after a 2-
minute interval has elapsed
o Adjusting Schedules: Response requirement changes as a function of the
organisms performance while responding for the previous reinforcer
Ex: FR 100 schedule, if rat completes all 100 within 5 minutes, increase
requirement to 110
o Chained Schedules: sequence of two or more simple schedules, each of which
has own SD and the last of which results in a terminal reinforcer. In other words,
person or animal must work through series of component of schedules to obtain
reinforcer.
o Green key: peck → red key: peck → food
SD → R → SR/SD → R → SR
SD: discriminative stimulus
SR: reinforcing stimulus
o Presentation of red key is both secondary reinforcer for completing VR 20
schedule and SD for responding on subsequent FI 10 second schedule. Can create
3 link chain
o Once pigeons learn which schedule is associated to which key, show appropriate
response patterns for schedules.
Pigeons display long pauses and slow rate of response on white key
compared to other two. Responded most on red key
o Goal Gradient Effect: increase in strength or efficiency of responding as one
draws near to the goal.
Ex: rat runs faster through maze as it goes
Backwards chaining: training final link first and initial link last
Establish the red key as a secondary reinforcer associated with food.
Presentation of the red key can then reinforce responding on the corresponding green key, and the
green key can then be paired with the white
Theories of Reinforcement
o Drive Reduction Theory: an event is reinforcing to the extent that it is associated
with a reduction in some type of physiological drive
o Hunger produces food drive, seek food, drive reduced when food found
o Incentive Motivation: Motivation derived from some property of reinforcer.
Ex: playing video game for fun
Spicy food is incentive motivation
o Premack Principle: reinforcers can often be viewed as behaviors rather than
stimuli. Lever pressing not for food but for act of eating food (behavior)
o Premack Principle: high probability behavior can be used to reinforce low
probability behavior
Ex: the rat is hungry; eating food has a higher likelihood of occurrence than
running on the wheel, so eating food is the HPB. The rat will run on the wheel (LPB) in order to
obtain access to food (HPB)
o Work then play
Response Deprivation Hypothesis:
o The Premack principle requires us to know the relative probabilities of 2 behaviors before we
can judge. The response deprivation hypothesis states that a behavior can serve as a reinforcer when:
1. Access to the behavior is restricted
Its frequency falls below its preferred level (baseline level of occurrence)
The rat runs 1 hr at baseline; only let it run 15 min, and now it will work to run on the
wheel
Behavioral Bliss Point Approach
o Given 2 or more activities, an organism with free access to alternative activities
will distribute its behavior in such a way as to maximize overall reinforcement
o The distribution of behavior represents the optimal reinforcement available from the 2
activities
A rat forced to run on the wheel twice as long as it explores the maze (when it likes to do the
opposite) will find a way to get as close as possible to its preferred ratio of the two
activities.
Chapter 8
Chapter 9
Chapter 10
Class notes
Autism and Behavioral Therapy
o Autism Spectrum Disorder
Autism, Asperger's Syndrome, PDD-NOS
Autism Spectrum Disorders
o 1 in 50 have autism spectrum disorder
o Prevalence of ASD has increased by 285 percent over 12 years
o 5 times more common in boys
o Present across all culture and socio economic statuses
o Very Heterogeneous Population
Diagnostic Criteria 1
o Impairment in social interaction (at least 2)
Nonverbal social interaction
Peer relationships
Spontaneous sharing
Social or emotional reciprocity
Diagnostic Criteria 2
o Impairment in communication
Delay in language development
Impairment in initiating and maintaining conversation
Stereotyped and repetitive use of language
Lack of imaginative play
Diagnostic Criteria 3
o Restricted, repetitive, and stereotyped patterns of behavior, interests, and activities (at least 1)
Restricted range of interests
Insistence on specific nonfunctional routines or rituals
Stereotyped and repetitive body movements
Preoccupation with parts of object
Basic behavioral Therapy components
o Antecedent: stimulus that elicits the child's behavior [pick up a box of crayons, say "crayons"]
o Behavior: response by the child [child says "crayons"]
o Consequence: what maintains the behavior [give the child the box of crayons]
Prompting: getting the child to say something
You want more water? You say "yes," the child says "yes," give the child water
The child says "ba," you say "book," the child says "book," give him the book
Pivotal response training
o Pivotal areas
Motivation, responsivity to multiple cues
Why target pivotal areas?
o Maximum effectiveness of the intervention
o Minimize the treatment time necessary to learn new skills
o Pivotal response training
Antecedents (attention, shared control, multiple cues, maintenance skills)
Consequences (Contingent reinforcement, reinforcement of attempts, direct
reinforcement)
o Skills we can teach with PRT (language, play skills, social skills, academic skills)
Child attention and appropriate prompting
o Question/instruction/opportunity to respond should be clear, appropriate to the task,
uninterrupted and the child must be attending
Maintenance/acquisition: intersperse maintenance tasks with acquisition tasks to keep
motivation high and frustration low
Shared Control
o Teaching interactions should include strategies of shared control to enhance motivation,
these strategies include (Child choice, take turns)
Multiple cues: structure the environment to increase the child's responsivity to multiple cues
Contingent reinforcement: any response to a child's behavior must be contingent on the correct
behavior or attempt
Reinforcements of attempts: goal directed attempts to respond to questions, instructions, or
opportunities should be reinforced
Direct reinforcement: there should be a direct relationship between the reinforcement and the
desired behavior
Physical cognition: tool use; social cognition; social learning (diet, mate selection, imitation and
mimicry)
Physical cognition: encoding, storage, retrieval, and processing of feature information related to
objects; determining relationships between objects in the world, recognizing objects, tool use,
acquiring food, nest building, weapons, concealment
Social cognition: encoding, storage, retrieval, and processing of social information related to other
members of a community; individuals living in a social group regularly interact with other
members of that group
o To be successful in a social group, animals need to have knowledge of interactions and
what they are currently doing. Western scrub jays cache seeds but pay attention to
whether other jays are around
Social learning, animals learn a behavior by observing another animal
o A chimpanzee community has unique traditions; chimpanzees on the west but not the
east side crack nuts. Suggests cultural, rather than genetic, transmission of behavior
Animals learn a behavior by observing another animal
o Demonstrator: the animal producing the behavior
o Observer: the animal watching the behavior and later reproducing all or part of it
o A mouse prefers the demonstrated chocolate over cinnamon (smells the food)
o Guppies: female observers select males previously seen with a partner, as opposed to a
male seen alone
Imitation: copying the response of a demonstrator when it leads to rewards
o Overimitation: the tendency to copy unnecessary actions; chimpanzees open the box
copying only the necessary actions, while children copy all the actions. Why? Eager to please
o The demonstrator intentionally modifies its behavior in the presence of a naïve animal, incurs a cost
or no immediate benefit, and the observer learns faster than it would have
Mimicry: copying the response of a demonstrator when it does not result in any tangible reward;
social
Tell parents to back off: stop asking the child to do the desired behavior; it's OK not to do it at all. Mask
an attitude of enthusiasm or rage with bland disinterest.
Behavior is controlled by antecedents; urgency can inspire push-back and resistance to even the
most rational pitch. Normal
Reactance: a reaction that is directly opposite to some rule or request. Occurs when someone
feels he is being pressured and there is some added limit being placed on his freedom or
choice.
Ironic processes: oppositional actions within ourselves. Pressure we put on ourselves to do
something backfires
However, when we are stressed or overloaded—when we're trying to multitask, for instance—
the monitoring can break down and those other things we're trying to exclude are much more
likely to come up and be expressed
Motivation Article
People who have a lot to do (a full suitcase): tunneling, focusing on what's at hand and not on the
bigger picture
The possibility of borrowing resulted in lower scores for the time-poor overall. The time-starved person
rushes to meet deadlines without taking time to order his affairs for the future
The stress of scarcity leaves insufficient energy to pay attention to long-run decisions. The poor aren't
poor because they make bad financial decisions; it's the stress of poverty that leads to bad
financial decisions in the first place
Take on fewer obligations; should have occasional periods of self-reflection
People are more readily afraid of events that have some type of evolutionary association with
danger, such as with snakes rather than cars. This innate tendency for an organism to more
easily learn certain types of behaviors or to associate certain types of events with each other is
known as Preparedness.
Preparedness and Conditioning
o Fear conditioning is one form of classical conditioning in which preparedness plays an
important role
Taste aversion conditioning: a form of classical conditioning in which a food item
that has been paired with a stomach ache becomes an aversive stimulus. You don't like the food
after it makes you feel bad
Even though it may not be the food (it could be the flu), the person still doesn't like the
food. After making a rat sick with injections, the rat doesn't want sweet water
anymore
Sweet water (NS): X-ray (US) → Nausea (UR)
Sweet water (CS) → Nausea (CR)
Stimulus generalization: occurs when a food tastes similar to the aversive item and
is also perceived as aversive. Hate one fish, hate all fish.
Can be extinguished if the aversive food item is repeatedly ingested without
illness
Overshadowing: develop an aversion to the stronger-tasting food rather than the mild-
tasting food (onions vs. potatoes).
o The presence of a food that has aversive associations can block the
development of aversive associations to other foods.
o Eat peas (which you hate) and bad fish, and you will think you are getting sick because
of the peas
Latent inhibition: more likely to associate a relatively novel item, such as an
unusual liquor, with sickness than familiar items such as beer.
o Ex: a rat encounters poison, becomes ill, and doesn't go for the poison again;
it eats a little and moves on to food it has eaten before.
Differences in taste aversion
o The formation of Associations over Long Delays.
NS and US must occur in temporal proximity in classical conditioning for it to
work.
Taste aversions can develop 24 hrs later, protecting the animal from
eating poison over and over again.
o One Trial Condition
Strong conditioned taste aversion can be achieved with only a single pairing.
Much like fear conditioning. Minimize possibility of repeat
o Specificity of Association
Feel nauseous → associate it with the meal; the rat associates not feeling good with the food,
not the injection
Associate it with food rather than anything else. Called CS-US relevance: we associate
certain types of stimuli with each other
A rat conditioned with sound and sweet water, then given an injection, goes for normal
water; given a shock, it still goes for the sweet water.
Physical pain is associated with visual and auditory stimuli; nausea is associated with taste
Quail and mice were given a choice between dark blue sour water, dark blue
water, and sour water. The mouse goes for the dark blue water because its aversion deals
with taste; quail go for the sour water because they use vision for food. Rats are
night-time feeders and use smell and taste
Women are better at odor detection and form more taste aversions. During pregnancy,
these taste aversions help avoid ingesting anything bad.
Preparedness in operant conditioning
o Song can be used to train birds to go to certain areas and food used to peck at certain
things. Biological preparedness plays role in operant conditioning
o Easier for bird to fly away to avoid shock but not peck something to avoid shock
o Aversive stimulus elicits Species-Specific Defense Reaction (SSDR), in natural
environment is often effective in countering danger
Rats run or freeze in a painful situation because these reactions are naturally brought out in the
face of danger.
They may even stand still and get shocked
Operant-Respondent Interactions
o 2 further examples of overlap between operants and respondents: instinctive drift and
sign tracking
o Instinctive drift:
The Brelands trained animals to deposit coins in a bank. A pig began rooting the coin around instead of depositing it, and a raccoon did much the same.
Fixed action patterns had emerged and interfered with the operant behavior patterns: the coin became associated with food, and the raccoon began rubbing the coin as it would rub shellfish.
Coin (SD): Deposit in bank (R) -> Food (SR)
Coin (NS): Food (US) -> Rooting (UR)
Coin (CS) -> Rooting (CR)
As the classically conditioned response increased in strength, it overrode the operantly conditioned response.
Instinctive drift: an instance of classical conditioning in which a genetically based fixed action pattern gradually emerges and displaces the behavior that is being operantly conditioned.
o Sign Tracking
The light is not only associated with food but also acquires appetitive properties.
In sign tracking, an organism approaches a stimulus that signals the presentation of an appetitive event. The approach behavior seems very much like an operant behavior because it appears quite goal directed, yet the procedure that produces it is more closely akin to classical conditioning.
Ex: a light is presented before food; the dog salivates at the sight of the light, walks over to it, and starts displaying food-related behaviors toward it.
Autoshaping: a type of sign tracking in which a pigeon automatically pecks at a response key because the key light has been associated with the delivery of food.
Behavior that starts off as elicited behavior is transformed into operant behavior: the pigeon initially pecks the key because the key light predicts food, and later pecks the key in order to obtain food.
When a bird pecked a key associated with water, it did so with eyes closed and beak open, as if trying to drink the key; it would "eat" a key associated with food.
Pigeons will even peck the key when doing so cancels the delivery of food.
The key light exerts such strong control over behavior that it overrides the negative punishment contingency. Sign tracking that persists despite the resultant loss of the reinforcer is called negative automaintenance.
Adjunctive behavior
o Instinctive drift and sign tracking represent two types of anomalous behavior patterns that develop during operant conditioning procedures.
o Adjunctive behavior: an excessive pattern of behavior that emerges as a by-product of an intermittent schedule of reinforcement for some other behavior. As one behavior is being strengthened, another emerges as a side effect of the procedure; also called schedule-induced behavior.
o Basic Procedure and Defining Characteristics
Falk was the first to investigate this: rats trained to press a lever for food also began drinking excessive amounts of water, about three times the normal amount in a 3-hour session.
Schedule-induced polydipsia: this excessive drinking developed rapidly, beginning in the first session and becoming firmly established by the second session.
Studies of adjunctive behavior typically employ an FI or FT schedule.
Adjunctive behavior occurs during the interreinforcement interval, the period in the schedule when the reinforcer is not available (low or zero probability of reinforcement).
Pigeons exposed to an FI or FT schedule began attacking a nearby pigeon or a picture of a stuffed pigeon. Adjunctive behaviors can also be generated without using food as the reinforcer; electrical stimulation of pleasure centers has been used to produce adjunctive eating.
According to Falk, adjunctive behavior has several distinguishing features:
o Adjunctive behaviors typically occur in the period immediately following consumption of the intermittent reinforcer.
The rat eats the delivered food and then drinks water; the start of the interval tends to be dominated by drinking, while the later end is dominated by lever pressing (see the sketch after this list).
o Adjunctive behavior is affected by the level of deprivation for the scheduled reinforcer.
The greater the level of deprivation, the stronger the adjunctive behavior; more severe food restriction means more drinking.
o Adjunctive behaviors can function as reinforcers for other behaviors.
High-probability behaviors serve as effective reinforcers for low-probability behaviors; a rat that presses a lever for food will also press a lever for the opportunity to drink.
o There is an optimal interval between reinforcers for the development of an adjunctive behavior.
Rats will drink less with a 5-second interval than with a 180-second interval.
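A minimal Python sketch of the within-interval timing pattern described in the first point above. The FT 60-second schedule and the 15-second drinking window are assumed numbers for illustration only, not values from the notes or the studies.

# Toy illustration (assumed numbers): on a fixed-time schedule, drinking tends
# to dominate right after food delivery, and lever pressing tends to dominate
# toward the end of the interreinforcement interval.

FT_INTERVAL = 60   # assumed seconds between free food deliveries
DRINK_WINDOW = 15  # assumed window in which schedule-induced drinking dominates

def dominant_behavior(seconds_since_food: int) -> str:
    """Which behavior tends to dominate at this point in the interval."""
    if seconds_since_food < DRINK_WINDOW:
        return "schedule-induced drinking (adjunctive)"
    return "lever pressing (operant)"

for t in range(0, FT_INTERVAL, 10):
    print(f"{t:2d} s after pellet: {dominant_behavior(t)}")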
Adjunctive behavior in humans
o Adjunctive processes contribute to development of drug and alcohol abuse.
o Humans produce adjunctive behavior too, just not as severe. People exposed to fixed interval schedules of monetary reinforcement display an increased tendency to drink water; the length of the interval between reinforcers is an important variable.
o Drinking beer and smoking tend to occur in the period following a reinforcer.
o Adjunctive processes may encourage an individual to frequently consume an addictive substance, with the person eventually becoming addicted to it.
Adjunctive behavior as displacement Activity
o Adjunctive behaviors may represent a type of displacement activity: an apparently irrelevant activity displayed by animals when confronted by conflict or thwarted from attaining a goal (the animal may simply be frustrated).
o Displacement activities may serve two purposes.
First, they provide a more diversified range of behaviors in a particular setting, which is often beneficial; picking at twigs can lead a bird to uncover more food.
Second, they help the animal remain in a situation where a significant reinforcer might eventually become available; pecking at the ground gives the bird something to do while waiting for a bug to emerge.
Adjunctive behavior can also be viewed in terms of self-control: people who give up an addiction are likely to seek out a replacement activity.
Activity Anorexia
o One type of behavior generated as adjunctive behavior is wheel running
o Activity anorexia has important implications for people who undertake a combined diet and exercise program to lose weight, since restricted eating can drive ever more running.
o Basic and defining characteristics
Rats allowed access to food for a single 1.5-hour feeding period per day, with access to a running wheel during the 22.5-hour interval between meals, will spend an increasing amount of time running during that interval.
The more they run, the less they eat; and the less they eat, the more they run.
A feedback cycle develops in which running increases and eating decreases. Within a week the rats are running almost constantly and eating almost nothing, and if the process continues the rats will die.
Activity anorexia: a high level of activity and a low level of food intake generated by exposure to a restricted schedule of feeding. Rats given the same restricted feeding schedule but no wheel do just fine, and rats given a wheel with adequate food display only a moderate level of running; neither shows a tendency toward self-starvation.
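A minimal Python sketch of the feedback cycle described above. All numbers (starting values, update factors) are arbitrary assumptions chosen only to show the runaway pattern; they are not data from the studies.

# Toy illustration (assumed numbers): each "day", more running suppresses
# eating, and reduced eating drives still more running.
eating = 100.0   # arbitrary units of daily food intake
running = 10.0   # arbitrary units of daily wheel running
for day in range(1, 9):
    running *= 1.3 + (100.0 - eating) / 100.0   # less eating -> more running
    eating *= max(0.0, 1.0 - running / 300.0)   # more running -> less eating
    print(f"Day {day}: eating = {eating:5.1f}, running = {running:6.1f}")
# Eating falls toward zero while running escalates, mirroring the
# destructive cycle described in the notes.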
o Comparisons with Anorexia Nervosa
Anorexia nervosa is a psychiatric disorder in which the patient refuses to eat an adequate amount of food, losing so much weight that hospitalization may be required.
Anorexia nervosa is often accompanied by a high level of activity, such as a demanding exercise program or a severe sort of restlessness.
Sometimes the increase in activity follows the decrease in food intake; sometimes the decrease in food intake follows an increase in exercise.
Athletes and dancers have a higher chance of becoming anorexic.
Both anorexic people and anorexic rats still seem to enjoy their food and take time eating what little they do eat.
The rat is physically restricted from accessing food, whereas humans impose the diet on themselves even when food is available; however, the free availability of food for a dieter may be more apparent than real.
Bulimia: bingeing on food and then purging by vomiting or taking laxatives.
o Two types of anorexia nervosa: the restricting type, involving simple food restriction, and
o the binge-eating/purging type, in which dieting is combined with bingeing and purging.
Underlying Mechanisms
o Endorphins may play a role in activity anorexia. These morphine-like substances in the brain help with pain reduction and are implicated in feelings of pleasure, such as the runner's high.
o Activity anorexia can be halted by suddenly providing a high level of food.
Clinical Implications
o Activity anorexia has several clinical implications. Behavioral treatments for anorexia should focus as much on establishing normal patterns of activity as on establishing normal patterns of eating. Endorphin blockers might also be used to reduce the pleasure derived from the excessive activity.
o The activity anorexia model suggests that people who are dieting should eat several small meals per day rather than one large meal, should increase their exercise level only slowly, and should ensure that their meals are well balanced.
Behavior Systems Theory
o Biological dispositions play a strong role in conditioning.
o Behavior systems theory: an animal's behavior is organized into various motivational systems (feeding, mating, avoiding predators, and so forth). Each system encompasses a set of relevant responses that can be activated by particular cues. Different systems overlap, such that a response typically associated with one system may be instigated by another system.
o Consider the feeding system in the rat. When the rat is hungry, it engages in food-related responses such as salivating and chewing. When food is about to arrive it salivates; when food is not yet present it searches the cage; and when food is even more distant it runs around and explores.
o Lever pressing and maze running work well as laboratory tasks because rats have evolved to run along enclosed spaces and to use their forepaws to pick up food; manipulating something like a lever is therefore natural for them.
o Behavior systems theory assigns an important role to the environmental cues that determine which behavior system will be activated.
In one experiment, a chamber was used in which a stimulus rat could be mechanically inserted and withdrawn.
When the stimulus rat was inserted just before the delivery of food, thus becoming a CS for food, the participant rat would approach the stimulus rat and engage in social behaviors: sniffing its mouth, pawing, and grooming. (Rats pay close attention to other rats, which may steal their food.)
The stimulus rat brought out the social component of the feeding system in the participant rat. A wooden block did not bring out this social component, and a rolling marble instead brought out clawing and gnawing.
A dog that bites when you take away its food is asserting dominance over the food; this too can be understood in terms of behavior systems theory.
Processes that allow us to acquire new behavior patterns through indirect means.
Observational or Social Learning
o In observational learning, we watch what other people do and then follow their example.
The behavior of a model is witnessed by an observer, and the observer's behavior is subsequently altered. Because observational learning is a social process (humans are social beings and acquire many new behavior patterns this way), it is often referred to as social learning. A social situation can change behavior.
Observational learning plays a role in both classical and operant conditioning. There are also two more elementary forms of social influence: contagious behavior and stimulus enhancement.
Contagious Behavior and Stimulus Enhancement
o Contagious behavior: a reflexive behavior triggered by the occurrence of the same behavior in another individual (everyone yawns because one person yawned).
o It takes only one startled duck to get the rest of the flock to fly off; laugh tracks help get people laughing.
o Orienting responses are also contagious: we orient ourselves toward stimuli, and we also orient ourselves in the direction others have oriented. This works between dogs and humans as well; dogs can detect things we do not.
o Stimulus enhancement: the probability of a behavior changes because an individual's attention is drawn to a particular item or location. A girl going for candy leads you to go for the candy because she drew your attention to it; the incentive itself does the rest.
o When animals of the same species come across another's scent, that alone is enough to draw their attention and help them find food.
o Contagious behavior and stimulus enhancement are best seen as rudimentary forms of social influence; they result in only a momentary change in behavior.
Observational Learning in Classical Conditioning
o Observational learning is involved in development of classically conditioned responses.
Stimuli involved are usually emotional in nature.
o Classically conditioned emotional responses can result from seeing the emotional responses exhibited by others; this type of conditioning is called vicarious emotional conditioning.
It can take place in two ways. First, the expression of fear in others may act as an unconditioned stimulus that elicits fear in oneself; we thereby quickly learn which events are dangerous.
Jellyfish (NS): Look of fear in others (US) -> Fear in oneself (UR)
Jellyfish (CS) -> Fear in oneself (CR)
Second, the look of fear in others may itself be a conditioned stimulus (through earlier pairings with frightening events), making vicarious conditioning a form of higher-order conditioning:
Look of fear in others (NS): Frightening events (US) -> Fear in oneself (UR)
Look of fear in others (CS1) -> Fear in oneself (CR)
Jellyfish (NS2): Look of fear in others (CS1) -> Fear in oneself (CR)
Jellyfish (CS2) -> Fear in oneself (CR)
Positive emotions can be vicariously conditioned too: after watching happy children play with a toy, an observing child wants to play with it as well.
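A minimal Python sketch of the two-stage (higher-order) pairing shown above, represented as a simple association table. The stimulus names and the 0.8 transfer factor are illustrative assumptions only, not values from the notes.

# Toy illustration (assumed values): a stimulus paired with something that
# already elicits fear acquires a weaker copy of that fear-eliciting property.
fear_strength = {"frightening events": 1.0}  # unconditioned elicitor of fear

def pair(neutral_stimulus, paired_with):
    """Pair a neutral stimulus with an existing fear elicitor."""
    if paired_with in fear_strength:
        fear_strength[neutral_stimulus] = 0.8 * fear_strength[paired_with]

pair("look of fear in others", "frightening events")  # first-order conditioning
pair("jellyfish", "look of fear in others")           # higher-order conditioning
print(fear_strength)
# -> frightening events: 1.0, look of fear in others: 0.8, jellyfish: ~0.64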
Observational Learning in Operant Conditioning
o Acquisition and performance
A teenager may acquire the basic information needed to drive long before being of age to actually perform the behavior.
o Acquisition
Acquisition first requires that the observer pay attention to the behavior of the model. We are very sensitive to the consequences of the model's behavior: if the behavior is reinforced, the observer is more likely to attend to and acquire it.
Second, the observer may receive reinforcement for attending to the model's behavior; teaching is based on this principle.
Third, acquisition depends on whether the observer has sufficient skills to benefit from the modeling; learning Chopsticks by watching is very different from learning Beethoven.
Finally, the personal characteristics of a model strongly influence the extent to which we attend to their behavior: we attend to models who resemble us (for example, people who dress the same way) and to people we respect.
o Performance: reinforcement and punishment modify behavior in modeling situations in three ways.
First, we are more likely to perform a modeled behavior when we have observed the model experience reinforcement for that behavior.
Vicarious reinforcement: the model's perfume attracts people, so we wear the same perfume; we tell the same joke that made people laugh for someone else.
Second, we are more or less likely to perform a modeled behavior depending on whether we ourselves experience reinforcement or punishment for the behavior.
If people like the joke, we keep telling it; if they frown, we stop telling it.
Third, our own history of reinforcement or punishment for performing modeled behaviors matters: we learn what is appropriate to model, whom it is appropriate to model, and when it is appropriate to perform behaviors that have been modeled by others.
Imitation
o The term imitation is often used interchangeably with observational learning. True imitation is a form of observational learning that involves the close duplication of a novel behavior.
o Ex: if one person gets past a guard by flirting and Kelly then flirts in a different way, it is not true imitation; if she flirts in the same novel way, true imitation has taken place.
o Children generally imitate a model
o Generalized imitation: the tendency to imitate a new modeled behavior with no specific reinforcement for doing so.
By reinforcing the imitation of some behaviors, therapists can produce in the children they work with a generalized tendency to imitate.
Can Animals Imitate?
o Kittens imitate their mother better than a strange cat, perhaps because of stimulus enhancement: they pay more attention to her, follow her, and then manipulate the box themselves.
o In one experiment, pigeons were more likely to perform the specific action they had seen another pigeon perform.
o Adult imitation is more similar to that of monkeys than to that of children, even though children are thought of as the great imitators.
o There is debate over whether animals can truly imitate; orangutans, however, appear able to copy novel behaviors.
Social learning and Aggression
o In the Bobo doll studies, children observed adult models behaving aggressively toward a Bobo doll. Children who observed the aggression were then violent toward the Bobo doll themselves, replicating the behavior and performing the same motor movements toward the same target.
o Children who heard disapproving comments about the aggression produced far fewer aggressive behaviors, but only when the disapproving adult was present.
o In that adult's absence, aggression increased again.
Social Learning and Media Violence: From Bobo doll to Grand Theft Auto
o Filmed violence was as effective as live violence for inducing violent behaviors in
observers
o From Pac-Man to Grand Theft Auto: the amount of violent media viewed in childhood is correlated with aggressive and antisocial behavior 10 years later, and is significantly related to adult criminality.
o Amount of TV watched in childhood is correlated with amount of aggressive violent
behavior toward others
o Males are more aggressive than females and hold a more hostile view of the world. Girls tend to inhibit violence, and they direct aggression more often at other girls than at boys.
o Desensitization to violence can lead females to feel that violence and aggression are normal aspects of life, which could lead them into violent relationships.
o Some people view media violence and never become violent themselves; however, exposure to media violence can also increase the likelihood of becoming a victim of violence.
Language
o Written, Spoken, Symbolic
o Humans may be born with an innate mechanism (a "black box") that helps us acquire language.
o The ability to use arbitrary symbols to stand for objects and events is called reference.
o Grammar: the set of rules that controls how strings of words convey meaning. Monkeys have different alarm calls for different predators.
o Productivity: a language has a finite number of rules, but once they are learned, an infinite number of expressions can be generated.
o Situational Freedom, which means that language can be used in a variety of contexts
and is not fixed to a particular situation
Can Animals Talk
o Cross-fostering experiments: so called because the chimpanzees were raised in human foster homes.
Sign Language Experiments
o Researchers tested whether apes could produce and understand another form of language, American Sign Language.
o Project Washoe: a chimpanzee raised by two scientists, who taught her using modeling (demonstrating a sign, such as "open") and a technique called molding (placing the ape's hand in the correct signing position and associating the sign with its referent).
o Strictly controlled tests of language use were performed with many of the chimpanzees trained in sign language; all of the chimps seemed to pass the test of reference.
Artificial Language Experiments
o These chimpanzees lived in laboratories and conversed via an artificial language, Yerkish, whose simple grammar and vocabulary contain the basic features of language. The artificial and highly controlled surroundings made systematic assessment relatively easy, though critics note the animals could simply have memorized sequences such as "Please machine give X."
o Dolphins are social animals and relatively easy to train. Researchers have worked with two dolphins trained on a symbolic language; the dolphins appear to understand a bit of grammar.
Rule Governed Behavior
o The use of language enhances our ability to interact with one another.
o Rule: a verbal description of a contingency, i.e., a statement that in a certain setting, performing a certain behavior will lead to a certain consequence.
o Rule-governed behavior: behavior generated through exposure to rules.
o Rules or instructions: extremely useful for rapidly establishing appropriate patterns of
behavior
o We follow the rules we are given in order to behave effectively in certain settings.
o We develop a generalized tendency to follow instructions because doing so usually produces good results.
Some disadvantage of rule-governed behavior
o Rule-governed behavior is often less efficient than behavior directly shaped by the natural contingencies; you cannot learn golf just by reading instructions, you have to devote time to playing.
o A second drawback is that rule-governed behavior is sometimes surprisingly insensitive to the actual contingencies of reinforcement operating in a particular setting; a golfer can stay locked into the notion of a certain swing even when it is not working.
Personal Rules in self regulation
o The advantages of rules generally outweigh the disadvantages.
o Personal rules: verbal descriptions of contingencies that we present to ourselves, such as "I will go to the library to study."
o Say-do correspondence: a close match between what we say we are going to do and what we actually do; we carry out our promises.
o Personal rules work best when they specify when, where, and how a goal is to be accomplished.
o Personal process rules: personal rules that indicate the specific process by which a task
is to be accomplished.
Terms and Definitions
Watson: This man founded the psychological approach of behaviorism. He was also known for his unethical "Little Albert" experiment.
Spatial Contiguity: This is when two things appear near each other in location.
Fixed Interval Schedule: This has a response pattern that starts low and increases closer to the reinforcement, after which there is a pause.
Intensity and Immediacy: These are the two main factors that affect the strength of a punishment and how effective it is in reducing a behavior.