Psyc Study Guide

The document discusses behaviorism and its key concepts. It covers the roots and differences between Watson's and Skinner's behaviorism, research methods, innate versus learned behavior, habituation and sensitization processes, and classical conditioning including Pavlov's experiments pairing stimuli.

 Lecture 1

o Behaviorism
 Roots
 Watson (1913)
 Skinner (1938)
 Similarities and Differences
o What is Behavior
 Behavior (B): any action (Voluntary or involuntary) exhibited by an
organism
 Eye blink
 Walking
o Watson’s Methodological Behaviorism (1913)
 Method: introspection (mentalism) vs. experimental method (behaviorism)
 Subject matter of psychology: consciousness/the mind vs. directly
observable and measurable behavior
 Causes of behavior: mentalistic vs. environmental events
o Why should we reject mentalism?
 Circular logic
 You must provide an explanation that is independent of behavior in
question
 According to scientific standards, mentalistic constructs are NOT
real
 Ascribing mental causes obstructs inquiry into the real causes
o Modern Behaviorism: Skinner's Radical Behaviorism
 Two types of internal events
 Private events
o Hunger, thoughts, sensations, dreams
o Natural/ physical
o Covert behavior
 Mental events
o Non-natural, fictional
o Abstractions, ideas, concepts, constructs
o Skinner and Watson: Another difference
 Watson: All behavior is reflexive
 Skinner: Two types of behaviors
 Reflexive behaviors
o Mechanistic causality
 Operant Behaviors
o Affected by consequences
o Research Methods
 Descriptive research
 Experimental Research
 Types of Experimental Methods
 Group Design
 Single-Subject Design
o ABA Reversal
o Multiple Baseline
o Research Methods
 Descriptive: Provide a description of B
 Experimental: Provides explanation for B
 Goal of an experiment: To establish if a certain factor, or variable causes a
change in behavior.
 Independent Variable (IV) Factor under investigation
 Dependent Variable (DV) Behavior being measured
 Burden of Proof: IV is indeed what caused change; no other variable in
study could have done it
o Experimental Research
 Compare 2 or more variables
 All other conditions must be equal (No confounding variables)
 You need:
 A comparison
 Everything else equal, except for the IVs
 Measures
 Random assignment of subjects to conditions
o Types of experimental designs or methods
 Group Design
 Single-subject design
 ABA reversal
 Multiple Baseline

 Lecture 2
o Innate Behavior
 Innate behavior defined
 Four Criteria
 Types of Innate Behavior
 Reflexes
 Fixed Action Patterns
 Reflexes and FAPs, similarities and differences
 Innate and Learned Behavior in Balance
o Innate Behavior: Behavior that is not learned
 4 Criteria: Unlearned, Invariant, Universal, Adaptive
 How do we show that a behavior is unlearned?
o Two Types of Innate Behavior
 1. Reflexes (S -> R)
 US -> UR
 Example of Innate Reflexes
 Allergens  Sneeze
 Food  Saliva
 Light  Pupil Constriction
 2. Fixed Action Pattern (FAP): specific sequence, or pattern, of behavior
elicited by a specific stimulus ("releaser" or "sign stimulus")
 Stickleback Fish
 FAPs are neither intentional nor purposeful
o Spiders and cocoon building
o Graylag goose
 Motivational Conditions Sometimes needed
o Reflexes and FAPs: Similarities and differences
 Similarities:
 4 Criteria
 Specific Stimulus
 Differences
 Reflex: 1 Action
FAP: more than 1 action/ behavior
 Reflex: Part of organism
FAP: Whole organism
 FAPs in humans? (e.g., motherly behaviors)
o Balancing Innate and Learned Behavior in Nature
 Do all organisms need ability to learn in order to survive?
 Static vs. Dynamic Environments
 The advantage of learning
 The cost of learning
o Effects of Repeated stimulation
 Habituation
 Related Phenomena
 Sensitization
 When Habituation? When Sensitization?
o Dual Process Theory
o Habituation
 Habituation: decrease in the strength of a behavior/response with repeated stimulation
 White Noise; intermittent weak stimulus
 Adaptive: Avoid sensory overload
 Keeps us open to new stimuli
 Stimulus-specific
o Phenomena related to Habituation
 Spontaneous Recovery
 Recovery of a habituated response after a break
 Need a reasonable amount of time
 Retention of habituation (Long term)
 Complete spontaneous recovery may not occur
 Dishabituation
Recovery of a habituated response after a new stimulus is presented
 What is common and what is different (in terms of procedure and what is
measured) between spontaneous recovery and dishabituation?
o Common: both measure the amount of response to a stimulus
o Difference: spontaneous recovery requires a span of time;
dishabituation presents a single novel stimulus
 Procedures used to demonstrate stimulus-specificity of habituation
and dishabituation
o Sensitization
 Sensitization: increase in strength of behavior
 Gunfire, moviegoer
 NOT stimulus-specific, presentation of one stimulus may increase
response to another stimulus
o Dual Process theory
 Each stimulus presentation results in 2 opposite processes, called
habituation and sensitization. Any change in behavior is the net result of the 2
processes
 Higher intensity  sensitization
 Lower intensity  habituation
 Moderate intensity  sensitization, then habituation
o When habituation when sensitization? Dual Process Continued
 Habituation is a continuous process: it continues to increase with
stimulus presentation and starts to decay only after the stimulus ceases
 Sensitization is a temporary process: it begins to decay while the stimulus is
still being presented
o Stimulus Frequency and Intensity
 Frequency
 Both are direct functions of stimulus frequency
 Intensity
 Sensitization: direct function of stimulus intensity
 Habituation: Inverse function of stimulus intensity
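The dual-process account above can be sketched as a toy simulation. All numeric parameters here (the growth rates, the decay factor, the baseline response of 10) are assumed values chosen only to reproduce the qualitative pattern from lecture — habituation builds during stimulation and dominates for weak stimuli, while sensitization builds faster for intense stimuli but decays even while stimulation continues.

```python
def simulate(n_presentations, intensity, baseline=10.0):
    """Return the observed response after each stimulus presentation.

    Observed response = baseline - habituation + sensitization.
    Habituation grows steadily with every presentation; sensitization
    grows in proportion to stimulus intensity but decays even while
    the stimulus is still being presented (dual-process theory).
    """
    habituation, sensitization = 0.0, 0.0
    responses = []
    for _ in range(n_presentations):
        habituation += 0.5                 # builds while stimulation continues
        sensitization += intensity * 0.4   # direct function of intensity
        sensitization *= 0.6               # decays even during stimulation
        responses.append(baseline - habituation + sensitization)
    return responses

weak = simulate(10, intensity=0.5)    # weak stimulus: habituation dominates
strong = simulate(10, intensity=5.0)  # strong stimulus: sensitization first,
                                      # then habituation takes over
```

With these assumed numbers, the weak-stimulus responses decline from the first presentation on, while the strong-stimulus responses first rise above baseline (net sensitization) and then fall (net habituation), matching the "moderate/high intensity  sensitization, then habituation" pattern.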
 Lecture 3
o Classical Conditioning (Pavlovian Conditioning)
 Associations/ pairings
 Terminology
 Measures/ types of trials
 Related Concepts
o Learning Via Association
 Learning in Classical conditioning is by association
 Key: pair 2 stimuli together - one has some important survival
characteristics, one does not
 Learned stimulus must occur before presentation of the stimulus
that elicits the reaction
 Through pairing, the once neutral stimulus becomes a conditioned
stimulus
 CS  CR chain is a learned conditioned reflex
 Most stimuli are external
 Important questions to ask in classical conditioning
 What was learned (conditioned)?
What was innate (unlearned/unconditioned)?
 Which is the stimulus?
 Which is the response?
o Pavlov's Procedure
 Food paired (associated) w/ metronome (NS)
 Result: NS becomes CS
 Food paired with bell
 NS becomes CS
 How do we know this change occurred?
 Saliva flowed during presentation of just the CS, before the US was
presented
 Measures in classical conditioning
 Percentage of CRs: % of trials in which CR occurred
 Magnitude of CR: amount of saliva produced
o Percentage and magnitude of CR should both increase with
successive trials
o Latency of CR: time between onsets of CS and CR
 Latency typically decreases with successive trials
o Direct physiological response
 Changes in HR, BP, muscular tension
 Measures in Classical Conditioning
 Indirect measurements
o Approach to/ avoidance of CS
 Two Types of Trials
 Conditioning Trials (Training trials, regular trials) Trials in which
there is a CS-US pairing
 Test trial: Trials in which the CS is presented alone
o Usually interspersed among Conditioning trials
o Typically present ~1 test trial among every 10 conditioning
trials or so
o Related Concepts
 Extinction: CS repeatedly presented w/o US
 Ring bell: No food -> Little or no saliva
 Crying “wolf”
 Disinhibition
 Recovery of the CR during extinction after a novel stimulus
 Like dishabituation, but for a CR inhibited by extinction
 Spontaneous Recovery: Reappearance of CR after time passes after
extinction
o Running into your ex, those feelings return (briefly)
 Lecture 4
o Classical Conditioning: Special Procedures
 Excitatory/ Inhibitory Conditioning
 Effects of experiences that precede classical conditioning
 Latent Inhibition
 Higher order Conditioning
 Sensory Pre-Conditioning
 Compound Stimuli
 Blocking
 Overshadowing
 Timing
o Excitatory/Inhibitory Conditioning
 Excitatory conditioning: CS+
 NS  presentation of US (bell  food)
 Inhibitory conditioning: CS-
 NS  absence or removal of US
 Owner of scary dog is there  dog doesn't bite
 Occasion setting: signals the CS-US contingency
 Presence or absence of a stimulus affects the CR
 E.g., light: bell  food; no light: bell  no food
o Higher order conditioning
 Second order conditioning
 Metronome:food  salivation
 Metronome salivation
 Light:Metronome  salivation
 Light  Salivation
 Pairing a new stimulus with an established CS to elicit an established CR
 The new stimulus becomes a CS2 and elicits CR2
 CR2 is usually lower in magnitude than CR1
 Latent Inhibition
 A novel stimulus is more effective for conditioning than a familiar one
 Explanation of the "friend zone"
 Sensory pre-conditioning
 Like higher-order conditioning: a stimulus becomes a CS even though it
was never paired with the US
 Difference: the two stimuli are paired before the US is ever presented,
when neither has yet become a CS
 Compound Stimuli
 Overshadowing
o The stronger component of a compound stimulus becomes
a CS, but the weaker component will not
 Gunfire + light tapping: candy  saliva
 Gunfire  saliva
 Light tapping  no saliva
 Blocking
o Presence of an established CS interferes with conditioning
a new CS
 Red light: candy  Saliva
 Red light and green light: candy  saliva
 Green light  no saliva
 Too focused on red to learn anything about green
o Timing of classical conditioning
 Delayed conditioning: most effective
 CS onset, US onset, CS offset, US offset
 Trace Conditioning
 CS onset, CS offset, US onset, US offset
 Simultaneous conditioning
 CS and US at same time
 Backward conditioning: least effective
 US onset then CS onset
o Theories of classical conditioning
 Two types of theories
 Type of association formed
 Nature of the CR
 Pavlov's stimulus substitution theory
 Siegel's compensatory CR theory
 Rescorla-Wagner theory
o Two types of theories
 Type of association formed as a result of classical conditioning
 S-S (stimulus-stimulus)
 S-R (Stimulus- response)
o Research emphasizes S-S associations more
 Form/ nature of the CR ( eye blink, wing beats)
o Pavlov's stimulus substitution theory
 US stimulates a US center in the brain which excites a response center
 CS stimulates a different part of the brain than US
 After pairings, a CS-US neural connection is made
 CR should take the form of the UR (light: food  dog licks the light)
 Preparatory Response theory
 Form of the CR depends on the type of US
 Rat: shock  jump; light: shock pairing  light: freeze
o Siegel's Compensatory CR theory
 US: drug + primary effect of drug
 Coffee US: caffeine and alertness
 UR = response that opposes the drug's primary effect
 Coffee example: UR = sleepiness
 UR is a compensatory response
 UR occurs after the drug's primary effect
 The situation/environment in which you take the drug, which always
precedes drug intake, becomes the CS
 Starbucks becomes the CS
 CR = UR = sleepiness; both are compensatory
 Another conditioning example
 Beer intake (bar setting = CS)  increased CR
 CR occurs before the primary effect
 Size of CR increases with training
o Opposes the primary effect more => the drug has a lesser effect
o This is known as Chronic Tolerance
 Siegel's Compensatory CR theory
o Results from learning association between drug intake and
environment, NOT from repeated exposure to drug
o Depends on context of drug intake: situational specificity
o Context becomes CS and brings out CR
o Rescorla-Wagner Theory
 A US supports a limited amount of conditioning
 Associative value is distributed among the CSs
 A stronger US supports more conditioning
 Explains overshadowing, blocking, and the overexpectation effect
 Tone: food  salivation (V: 0  10; max = 10)
 Light: food  salivation (V: 0  10; max = 10)
 Tone + light compound with the same food  each V drops to 5 (overexpectation)
 "Limits of love to give"
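The limited-capacity idea can be sketched with the standard Rescorla-Wagner update, ΔV = α(λ − V_total), applied to the red/green blocking example from lecture. The learning rate α = 0.3 and asymptote λ = 10 are assumed illustrative values, not figures from the course.

```python
def rw_trial(values, present, alpha=0.3, lam=10.0):
    """One conditioning trial: every CS present gains alpha * (lam - V_total)."""
    v_total = sum(values[cs] for cs in present)
    delta = alpha * (lam - v_total)
    for cs in present:
        values[cs] += delta

values = {"red": 0.0, "green": 0.0}

# Phase 1: red light alone is paired with candy (the US);
# V(red) climbs toward the asymptote lambda = 10.
for _ in range(20):
    rw_trial(values, ["red"])

# Phase 2: red + green compound is paired with candy.  Red already
# predicts the US (V_total is near lambda), so the prediction error is
# near zero and green gains almost no associative value: blocking.
for _ in range(20):
    rw_trial(values, ["red", "green"])

print(round(values["red"], 2))    # prints 10.0
print(round(values["green"], 2))  # prints 0.0
```

Running the two phases in reverse order (or with a stronger US, i.e. a larger λ) would leave associative value for green, which is how the same rule also accounts for overshadowing and the overexpectation effect.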
o Therapeutic Applications of Classical Conditioning
 Stimulus Generalization and discrimination
 Classical conditioning based therapies for eliminating the CR
 Flooding
 Counterconditioning
o Systematic Desensitization
o Aversion therapy
o Generalization/ discrimination
 Stimulus Generalization
 CR elicited by stimuli similar to original CS
 Stimulus discrimination
 CR elicited more by specific stimuli
 More similarity to CS -> stronger CR
 Experimental neurosis: difficulty discriminating
o Unpredictable events -> neurotic symptoms/anxiety
o E.g., circle = CS+, oval = CS-, gradually made
indistinguishable
 Generalization contributes to phobias
 Eliminating the CR
 Extinction: presenting the CS repeatedly, for short periods of time,
without presenting the US
 Metronome CS  no salivation
 Needs short CS presentations, just as before
 Fears and phobias
o Acquired via classical conditioning
 Dog: bite  pain; dog  fear
 Fears and phobias
 Acquired via other means, e.g., observation
o Classical conditioning based therapies can help treat phobias,
regardless of how they were acquired
 Flooding
o Prolonged exposure to CS while preventing escape
o Counterconditioning
 Counterconditioning: learning a new incompatible association that
counteracts the original one
 CS  CR (metronome  saliva)
 Old CS metronome: shock  fear
 Metronome  fear
 New CR incompatible with old CR: reciprocal inhibition
 Systematic Desensitization: series of relaxations paired with hierarchy of
fear eliciting stimuli
 Phobias: relaxation paired with the old CS of the phobia
 Old CR (fear) and new CR (calm) are incompatible
o Counterconditioning Systematic Desensitization
 1. Fear hierarchy is established
 2. Physical relaxation training
 Train to be aware of tension
 3. Gradual presentation of hierarchy items
 Start with least fear arousing
 Can be in vivo or imaginal
 Aversion therapy: therapy for detrimental behaviors in which one cannot
stop engaging
 For addictive behaviors
 Develop an aversive CR to stimuli associated with the desired behavior

Study Guide
Chapter 1

1. Learning: A relatively permanent change in behavior that results from some type of
experience
2. Behavior: Any activity of an organism that can be observed or somehow measured
3. Behaviorism: Natural Science approach to psychology that traditionally focuses on
study of environmental influences on observable behavior
4. Latent Learning: Learning that occurs in the absence of an observable demonstration of
learning and only becomes apparent under a different set of conditions
5. Law of Contiguity and Parsimony: (Contiguity) Law of association holding that events
that occur in close proximity to each other in time and space are readily associated with
each other.
(Parsimony) Simpler explanations for a phenomenon are generally preferable to more
complex explanations
6. Empiricism vs. Nativism: Nativism is the assumption that a person's characteristics are inborn
(the nature perspective), while empiricism holds that behavior patterns are learned, not inherited
(the nurture perspective)
7. Social Learning Theory: brand of behaviorism that strongly emphasizes the importance of
observational learning and cognitive variables in explaining human behavior. More
recently it has been referred to as social cognitive theory
8. Reciprocal Determinism: assumption that environmental events, observable behavior,
and person variables reciprocally influence each other
9. Radical Behaviorism: A brand of behaviorism that emphasizes the influence of the
environment on overt behavior, rejects the use of internal events to explain behavior, and
views thoughts and feelings as behaviors that themselves need to be explained
10. S-R theory: The theory that learning involves establishment of connection between
stimulus and response

Skinner and Watson’s views on behaviorism


Watson: behaviorism is a natural science approach to psychology that focuses on the study of
environmental influences on observable behavior. Principles governing the behavior of
nonhuman species may be relevant to the behavior of humans. Law of Parsimony: simpler is better.
Objective study based solely on directly observable behavior and the environmental events that
surround it. Study perceivable events, not consciousness. Believed in S-R connections that propel
the animal forward in step-by-step fashion. Skinner believed in CS.

Skinner: radical behaviorism emphasizes the influence of the environment on overt behavior, rejects
the use of internal events to explain behavior, and views thoughts and feelings as behaviors that
themselves need to be explained. Does not reject the inclusion of internal events in a science of
behavior. Skinner viewed internal events such as sensing, thinking, and feeling as covert or
private behaviors that are subject to the same laws of learning as overt behavior. Studying  good
grades, based on experience. Uninterested in using a person's description. Environmental
consequences of our behavior: manipulate the environment to alter behavior. Believed the
environment determines both external behavior and internal behavior.

Internal Events
Consciously perceived thoughts and feelings and unconscious drives and motives, not supported
by Watson

Difficult because we rely on verbal reports of what people are internally feeling

Difficult to determine actual relationship of thoughts and feelings to behavior, do thoughts and
feelings precede behavior?

We do not have means of directly changing internal events; the means of changing internal events
and external behavior is to change some aspect of the environment. Instruction creates relaxing
thoughts by manipulating the environment

Pseudo explanations

Hereditary factors … compare and contrast their positions

Skinner: believed that behavior was the result of an interaction between genes and environment.
Operant conditioning bears resemblance to the evolutionary principle of natural selection.
Adaptive characteristics in a population increase; behaviors that lead to favorable outcomes are
likely to be repeated

Watson: humans inherit only fundamental reflexes along with 3 basic emotions (love, rage, and
fear); everything else, he believed, is learned

Other vocab

1. Operant learning: rat more likely to hit lever because food will come out
2. Law of similarity: cars and trucks same category
3. Law of contrast: tall and short clean and dirty
4. Law of contiguity: associate thunder and lightning
5. Law of Frequency: more frequently paired items are more strongly associated, e.g., a friend
with her perfume
6. Mind-body dualism: holds that some human behaviors (reflexes) are automatically elicited
by external stimulation, while other behaviors are freely chosen and controlled by the mind
(Descartes)
7. British Empiricists: knowledge is function of experience
8. Structuralism: possible to determine the structure of the mind by identifying its basic elements
(Edward Titchener)
9. Introspection: describe internal experiences
10. Functionalism: the mind evolved to help us adapt to the world around us (William James)
11. Methodological behaviorism: psychologists should study only behaviors that are
observable
12. Neobehaviorism: brand of behaviorism that utilizes intervening variables in form of
hypothesized physiological processes to explain behavior (Hull)
13. Cognitive behaviorism: utilizes intervening variables, form of hypothesized cognitive
process to explain behavior
14. Cognitive map: mental representation of ones spatial surroundings

Chapter 2

Compare and contrast: descriptive vs. experimental research


Descriptive research: describing behavior and the situation within which it occurs. Does not involve
manipulation of variables. (Naturalistic observation and case studies)
Naturalistic observation: involves systematic recording and observation of behavior in its natural
environment; casual observations are biased toward the researcher. Used to study inherited
behavior patterns, e.g., monkeys washing potatoes. Difficult to determine why these behaviors
occur; cannot intervene or ask participants questions of clarification

Case studies are also descriptive, involving intensive examination of one or a few individuals.
Based on observation, with bias reduced to a minimum; provides detailed info about behavior

Experimental research: one or more independent variables are systematically varied to determine
their effect on a dependent variable.

Control group design: individuals are assigned to an experimental group or a control group. The
experimental group is manipulated; the control group is not. E.g., free food makes it harder to
respond to get other food

Single-subject designs: require only one or a few subjects to conduct the entire experiment. In a
simple-comparison design, behavior in a baseline condition is compared to behavior in a
treatment condition.

Methods and advantages/disadvantages of various research designs

Contiguity vs. contingency (read notes)

1. Appetitive/Aversive Stimulus: Appetitive: an event that an organism seeks out; food is an
appetitive stimulus when hungry. Aversive: an event that an organism avoids, e.g., electric shock
2. Baseline: Normal frequency of the behavior that occurs before some intervention. E.g., use a
card to check-mark the number of cigarettes smoked; record for several days to find the
frequency.
3. Contingency: predictive relationship between two events; the occurrence of one event
predicts the probable occurrence of another. E.g., the contingency between a lever press and food;
getting a balloon is contingent on visiting the dentist
4. Control Group
5. Dependent/ Independent variables: Variable is a characteristic of a person, place, or
thing that can change over time. Temperature is a variable. Independent variable is aspect
of an experiment that systematically varies across different conditions in the
environment. It is manipulated. Food pellets independent. Dependent variable aspect of
environment allowed to vary freely to see if affected by change in independent variable.
What is measured in experiment. Total number of errors (Relation is called Functional
relationship)
6. Deprivation/ satiation: Deprivation: prolonged absence of an event, which tends to
increase the appetitiveness of that event. Satiation: prolonged exposure to an event
7. Establishing operation: Procedure that affects the appetitiveness or aversiveness of a
stimulus
8. Multiple-baseline design: treatment is instituted at successive points in time for two or
more people, settings, or behaviors. E.g., 3 people trying to use self-punishment: establish a
baseline for each person; at the end of the 1st week one begins treatment, then the 2nd in the 2nd
week, and so forth. If improvement coincides with the start of treatment for each person, the
treatment likely caused it.
9. Reversal design: ABA or ABAB design; a type of single-subject design that involves
repeated alternations between a baseline period and a treatment period. E.g., baseline, then
punishment; smoking should decrease during each treatment phase.
10. Spatial/temporal contiguity (bottom)

Other Vocab

1. Overt Behavior: behavior that can be observed


2. Covert Behavior: behavior that can be perceived only by the person performing it; dreaming,
thinking, etc.
3. Contiguity: closeness or nearness.
4. Temporal Contiguity: extent to which events occur together in time. Lightning and
thunder; a rat presses the lever and food arrives
5. Spatial Contiguity: extent to which events are situated close to each other in space. Food
dispenser close to the lever. Ring the doorbell as opposed to knocking: the sound of a knock is
close to the door, the doorbell's sound is not.
6. Intensity: responding can be measured in terms of intensity. The intensity of a behavior
is the force or magnitude of the behavior, e.g., strength measured by the amount of saliva.
7. Duration: length of time that an individual repeatedly performs a behavior. Concerned
with increasing or decreasing length of time behavior occurs.
8. Speed: Measure of how quickly or slowly a behavior occurs. Length of time to run
through a maze
9. Latency: length of time required for behavior to begin. Strength of conditioning can be
measured not in terms of amount of saliva but in terms of how soon dog begins to
salivate
10. Interval Recording: measurement of whether or not a behavior occurs within a series of
continuous intervals; one does not need to record every response. Useful if it is difficult to
determine the point at which the behavior starts and stops.
11. Time Sample Recording: variant of interval recording; one measures whether or not a
behavior occurs within a series of discontinuous intervals
12. Topography: physical form of a behavior; not the rate of pressing a lever but how it presses
the lever, e.g., right or left paw
13. Number of errors: Any behavior in which responses can be categorized as right or
wrong can be assessed in number of errors. Number of wrong turns rat makes in maze

Chapter 3:

Compare/Contrast: Reflex vs. Fixed Action Pattern

Habituation vs. Sensitization

Dishabituation vs. Spontaneous Recovery,

Classical Conditioning-what does it involve

1. Habituation: Decrease in strength of an elicited behavior. One tap startles; multiple taps,
not so much
2. Dishabituation: When habituated responses reappear following presentation of a
seemingly irrelevant novel stimulus. E.g., habituated to gunshots; an attractive stranger walks by
(a novel stimulus), and one is distracted and startled by the gunshots again
3. Spontaneous Recovery
4. Sensitization: Increase in strength of an elicited behavior following repeated presentations
of a stimulus (e.g., bombs)
5. Fixed Action Pattern: fixed sequence of responses elicited by a specific stimulus.
6. Opponent-process theory
7. Reflex: relatively simple, automatic response to a stimulus
8. Sign Stimulus/releaser
9. Type of Conditioning
10. US: Unconditioned stimulus
11. UR: Unconditioned response
12. CS: Conditioned Stimulus
13. CR: Conditioned Response

Other Vocab
Opponent Process theory: an a-process is elicited by an event; a b-process, elicited by the
a-process, counteracts the a-process. The b-process is slower to increase and slower to decrease.
Repeated presentations cause an increase in both its strength and duration

Excitatory Conditioning: the CS comes to elicit a certain response, such as salivation or fear.


Inhibitory Conditioning: as a result of inhibitory conditioning, the CS comes to inhibit the occurrence of a response

Chapter 4

Compare and contrast variations, extensions, and limitations of classical conditioning and the
additional phenomena associated with conditioning

Key terms

1. Acquisition: Process of developing and strengthening a conditioned response through
repeated pairings of an NS (or CS) with a US
2. Blocking: Presence of an established CS interferes with conditioning a new one (red light /
green light)
3. Compound stimulus: complex stimulus that consists of the simultaneous presentation of
two or more individual stimuli
4. Disinhibition: Sudden recovery of a response during an extinction procedure when a
novel stimulus is introduced
5. Experimental Neurosis: experimentally produced disorder in which animals exposed to
unpredictable events develop neurotic-like symptoms (oval and circle)
6. External Inhibition: Decrease in strength of conditioned response due to the
presentation of a novel stimulus at the same time as conditioned stimulus. Metronome
with dog = food, turn on light which distracts dog so less saliva
7. Extinction: Process whereby a conditioned response is weakened or eliminated when the
CS is repeatedly presented in the absence of the US.
8. Higher Order Conditioning: process whereby a stimulus paired with an established CS
itself becomes a CS; the second CS is weaker
9. Latent inhibition: phenomenon in which a familiar stimulus is more difficult to condition as
a CS than an unfamiliar stimulus. If a dog has repeatedly heard the sound of a metronome before
conditioning, trials may result in little or no conditioning to the metronome. Metronome ×40
presentations, then metronome with food  salivation pairings, then metronome  no salivation
10. Occasion setting: Procedure in which a stimulus signals that a CS is likely to be followed
by the US with which it is associated. The light is the occasion setter: when it is on, the
metronome is followed by food, so there is saliva; when it is off, the metronome is not followed
by food, so there is no saliva even if the metronome is on
11. Overshadowing: phenomenon in which the most salient member of a compound stimulus is
more readily conditioned as a CS (gunfire over light tapping)
12. Pseudoconditioning: an elicited response that appears to be a CR is actually sensitization
rather than conditioning. E.g., sensitization in which soldiers startle at any loud sound; they
have been made hypersensitive to the stimulus
13. Semantic generalization: generalization of a conditioned response to verbal stimuli that
are similar in meaning to CS
14. Sensory preconditioning: one stimulus is conditioned as CS another stimulus it was
previously associated to becomes CS
15. Spontaneous recovery: reappearance of CR following extinction
16. Stimulus discrimination: tendency for CR to occur in presence of stimulus that is
similar to CS, saliva at 2000 hz not 1900 hz
17. Temporal conditioning: form of classical conditioning in which the CS is the passage of time.
A dog given food every 10 minutes is already salivating by the 10-minute mark
18. US revaluation: process involving postconditioning presentation of the US at a different level
of intensity, thereby altering the strength of the response to the previously conditioned CS.
Metronome: small food  small saliva; then, after presentation of large food, metronome 
large saliva

Chapter 5:

Key Concepts:

Compensatory-response model and its implications for drug use and tolerance

Conditioning eventually results in a CR that appears to be opposite of the original UR. Heroin 
decreased blood pressure; however, with repeated drug use, it comes to cause increased blood
pressure.

A CS that has been repeatedly associated with primary response A to a US will eventually come to
elicit compensatory response B.
Heroin  decreased BP (A)  increased BP (B). The NS (such as a certain room) becomes a CS:
the body is prepared, so rather than going through the decreased blood pressure, it goes straight
to the increased blood pressure

Phobias and treatment

1. Training in relaxation: deep muscle relaxation procedure


2. Creation of a hierarchy of imaginary scenes that elicit progressively intense levels of fear,
from a minor degree to a larger degree (poodle  pitbull)
3. Pairing of each item in the hierarchy with relaxation: visualize the scene, then engage in
relaxation

Other therapies based on classical conditioning


Key terms:

1. Aversion Therapy: Reduces the attractiveness of a desired event by associating it with an
aversive stimulus. E.g., putting a large spider in wine creates a feeling of repulsion
2. Counterconditioning: a CS that elicits one type of response is associated with an event
that elicits an incompatible response. E.g., cats fed while moving into a room that looks more and
more similar to the one where they were shocked: the CS that first elicited fear comes to elicit a
positive response
3. Flooding: put a rat in a box that it is scared of without letting it leave. Preventing the
avoidance response allows the fear to fade: prolonged exposure to the feared stimulus gives
maximal opportunity for the conditioned fear response to be extinguished
4. Incubation: strengthening of a conditioned fear response as a result of brief exposures to
the aversive CS. Bitten by a dog, you run away from dogs each time and become more and more scared
5. Overexpectation effect: Decrease in conditioned response that occurs when two
separately conditioned CSs are combined into a compound stimulus for further pairings with the
US.
6. Preparatory response theory: purpose of CR is to prepare organism (get ready) for the
presentation of the US. The dog salivates to the tone to get ready for food, rat freezes in
response to light to get ready for shock
7. Preparedness: refers to a genetically based predisposition to learn certain kinds of
associations more easily than others; an inherited predisposition to develop fears of certain
types of objects or events. Innate tendency to fear certain conditions
8. Reciprocal inhibition: certain responses are incompatible with each other and
occurrence of one response necessarily inhibits the other. Postive countered negative
9. Selective sensitization: Increase in ones reactivity to a potentially fearful stimulus
following exposure to an unrelated stressful even. Scared of driving on the traffic when
husband broke up
10. Systematic desensitization: Slowly bring rabit closer and give cookies. Within months,
baby holding rabbit while eating cookies. Behavioral treatment for phobias that involve
pairing relaxation with a succession of stimuli that elicit increasing level of fear

In vivo vs. imaginal exposure: there are benefits to both. In vivo exposure may elicit a very strong
response; purely imaginal exposure risks the person being conditioned only to the imagined (fake) stimulus.

S-R (stimulus-response) model: the NS becomes directly associated with the UR and therefore elicits the UR.
S-S (stimulus-stimulus) model: the NS becomes directly associated with the US and elicits the response
because of this association. A dog bites a kid; bite = pain; the child comes to fear dogs.

Rescorla-Wagner theory: proposes that a given US can support only so much conditioning. Each CS is
given an associative value, and the CSs compete for the limited total; one CS can use up all the resources.
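The competition idea can be made concrete with the standard Rescorla-Wagner update rule, ΔV = αβ(λ − ΣV). A minimal sketch of a blocking experiment (parameter values are arbitrary, chosen only for illustration):

```python
# A minimal sketch of the Rescorla-Wagner update rule (assumed parameter
# names: alpha = CS salience, beta = US learning rate, lam = the maximum
# associative value the US can support).
def rw_trial(V, present, alpha=0.3, beta=1.0, lam=1.0):
    """Update associative values V for the CSs present on one trial."""
    total = sum(V[cs] for cs in present)   # combined prediction of the US
    error = lam - total                    # shared prediction error
    for cs in present:
        V[cs] += alpha * beta * error      # every present CS shares the error
    return V

# Blocking demo: condition A alone, then the compound AB.
V = {"A": 0.0, "B": 0.0}
for _ in range(50):
    rw_trial(V, ["A"])          # A alone acquires most of lam
for _ in range(50):
    rw_trial(V, ["A", "B"])     # A has "used up" the US's value...
print(round(V["A"], 2), round(V["B"], 2))  # ...so B gains very little
```

Because A already predicts the US perfectly when B is introduced, the shared error term is near zero and B acquires almost no associative value, which is exactly the "limited resources" point above.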

Article 1

Like a Rat:

 Antecedents come before the behavior; consequences come after. What influenced animals to press the lever
in the laboratory cage? Different species learn at different rates, but reward and punishment offer the same
results.
 Only a brief period is needed in order to change behavior. Rewards also allow a more hands-off approach.

Tiny Tyrants:

 Don't hit your kids. Telling them what you want and then rewarding them for following through
is better than explaining what they did wrong or yelling at them.
Class Notes:

PSYC 4: STUDY GUIDE TEST 2

General info:

-Multiple choice (remember to bring pencil)

-Questions drawn from lectures, videos, articles, and relevant book chapters; this guide will cover the
most important concepts, but any assigned reading material or content discussed in class is fair game!

-Good luck, happy studying : )

From chapter 6

Key Terms: Discriminative stimulus, escape behavior, extrinsic/intrinsic reinforcement, generalized
reinforcer, law of effect, natural reinforcers, negative punishment, negative reinforcement, operant
conditioning, positive punishment, positive reinforcement, primary/ secondary reinforcer, shaping,
three-term contingency

Key Concepts: operant conditioning- what does it involve; think about examples (from class as well as
your own) and differentiate between positive & negative and reinforcement & punishment.

From chapter 7:

Key Terms: bliss point, chained schedule, continuous & intermittent reinforcement schedule, DRH, DRL,
DRP, drive reduction theory, FD/FI/FR/FT/VD/VI/VR/VT schedules, goal gradient effect, noncontingent
schedule of reinforcement, Premack principle, response deprivation hypothesis.

Key Concepts: compare/contrast the schedules of reinforcement and the typical response patterns they
elicit. Compare/contrast the theories of reinforcement.

From chapter 8:
Key Terms: DRO, discrimination training, discriminative stimulus for extinction, extinction, extinction
burst, fading, generalization gradient, multiple schedule, partial reinforcement effect, peak shift,
resurgence, spontaneous recovery, stimulus control, stimulus generalization,

Key Concepts: aspects of extinction and stimulus control. Don’t worry about matching-to-sample or
behavioral contrast.

From chapter 9:

Key Terms: learned helplessness, one- and two-process theories of avoidance, experimental neurosis.

Key Concepts: problems with punishment, effective use of punishment, types of punishment, compare &
contrast the forms of noncontingent punishment. Don’t worry about theories of punishment.

From chapter 10:

Key Terms: bias, commitment, concurrent schedule, impulsiveness, matching law, melioration theory,
overmatching, self-control, undermatching.

Key Concepts: deviations from matching; drawbacks of melioration; Ainslie-Rachlin Model of self-
control; factors that influence self-control and impulsiveness.

Articles & Videos:

Bribery article: making reinforcement effective

Timeouts article: how to make timeouts effective

No, You Shut Up article: compare/contrast strategies for responding to misbehavior

The Messy Room Dilemma article: choosing battles, changing behaviors

Harry video, methods & results of study; identify examples of positive/negative
reinforcement/punishment.

No Brakes article: adolescent risk: why it happens & why common interventions don’t work

No Brakes cont’d article: effective strategies for risky adolescent behavior

Cognition, Creativity, and Behavior video: understand the purpose of the experiments and the main
points that Epstein and Skinner were making about mental processes, creativity, and self-concept.
Lecture 6:

 Operant Conditioning
o Classical vs. Operant conditioning
o 4 basic operant procedures
o Factors that make operant procedures more effective
o Primary and secondary reinforcers
o Shaping
o Conditioning
 Classical Conditioning vs. Operant Conditioning
 Classical: reflex responses are associated with new stimuli / Operant: learning based on the consequences of responding
 Behavior type: reflexive behavior / operant behavior
 Key component: association between 2 stimuli / consequences of one's behavior (B)
 Measures: percentage, magnitude, and latency of CRs (direct physiological responses) / indirect measures (approach/avoidance)
o 4 basic procedures
 Positive Reinforcement
 Negative Reinforcement
 Positive Punishment
 Negative Punishment
o Operant Conditioning – 4 consequences
 Something good can start or be presented
 Something Good can end or be taken away
 Something bad can start or be presented
 Something bad can end or be taken away
o Operant Conditioning (Voluntary)
 Operant Conditioning: Learning through voluntary behavior and its
subsequent consequences
 Reinforcement: Strengthens a response and makes it more likely to occur
 Punishment: weakens a response and makes it less likely to recur
o Operant Conditioning – 4 consequences
 Positive: technical term for an event started or an item presented, since it is something
added to the animal's environment
 Negative: technical term for an event ended or an item taken away, since it is something
subtracted
o Operant Conditioning – Reinforcements
 Positive reinforcement: when a response is followed by receiving a reward
or other desirable event
 Negative reinforcement: when a response is followed by the removal of an
unpleasant event (aspirin stops a headache)
o Operant Conditioning – Punishments
 Punishment: any event that follows a response and decreases the likelihood of its
occurring again
 Positive punishment = adding a stimulus weakens the likelihood of
recurrence (running 4 extra laps)
 Negative punishment = taking away a stimulus weakens the likelihood of
recurrence (being made to sit somewhere else)
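The four consequence types reduce to a 2×2 lookup: was a stimulus added or removed, and did the behavior become more or less likely? A tiny sketch, with function and label names of my own choosing:

```python
# Classify an operant consequence from two facts: whether a stimulus was
# added or removed, and whether the behavior became more or less likely.
def classify(stimulus_change, behavior_change):
    """stimulus_change: 'added' or 'removed'; behavior_change: 'up' or 'down'."""
    kind = "reinforcement" if behavior_change == "up" else "punishment"
    sign = "positive" if stimulus_change == "added" else "negative"
    return f"{sign} {kind}"

print(classify("removed", "up"))   # aspirin ends a headache -> negative reinforcement
print(classify("added", "down"))   # extra laps after a mistake -> positive punishment
```

The point of the lookup is that "positive/negative" track only the stimulus change, while "reinforcement/punishment" track only the behavior change; the two dimensions are independent.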
o Side Effects of Punishment
 Increased Aggression
 Passive aggressiveness
 Avoidance behavior
 Modeling
 Temporary suppression
 Learned helplessness
 Uses of partial reinforcement
o Shaping
 Reinforce successive approximations of target behavior
 One step at a time
 Keep raising the bar
 Creating new behaviors
 Shaping
o Begin with a behavior the subject already does and shape it
to the behavior you want
o Need to reinforce each step (successive approximation)
 Stop reinforcing a step to encourage subject to try
new step
 Goal: perform target behavior
 Subtypes of negative reinforcement
o Escape
 Experience something, then perform the behavior (rain → open umbrella)
o Avoidance
 Perform the behavior before you experience something (open the umbrella before going into
the rain)
 Making Operant conditioning more effective
o Contingency (Dependency)
o Contiguity (Immediacy)
o Size
o Deprivation
 Primary and secondary Reinforcers
o Reinforcer = consequence that strengthens behavior
o Primary positive SR: biological need
 Food water sex sleep
o Primary negative SR : escape
o Secondary Reinforcers (conditioned reinforcers)
 SRs that have acquired value through being:
 Paired with an established SR
 Exchanged for an established SR
 The established SR can itself be a primary or a secondary reinforcer
o Money; praise; social SRs
 Secondary Reinforcer Subtypes
o Standard: exchangeable only for one type of primary SR
o Generalized Conditioned Reinforcer
 Can be exchanged for multiple reinforcers
o Advantages
 Cheaper
 More immediate
 Avoid satiation
 More conveniently delivered
 Not tied to any specific deprivation
 Extrinsic and Intrinsic Reinforcers
o Extrinsic reinforcement: getting an external reinforcer for performing a behavior
 Reading War and Peace for a good grade in class
o Intrinsic reinforcement: getting reinforcement simply from performing the behavior
 Reading War and Peace for fun

Lecture 7

 Schedules of Reinforcement
o Continuous vs. Intermittent Schedules
 Ratio vs. Interval Schedules
 Fixed vs. Variable Schedules
o Response Patterns and Rates
 Continuous vs. Intermittent Schedules
o Schedule of Reinforcement: rules according to which the behavioral response (R)
is reinforced
o Continuous (CRF) = every response is reinforced
o Intermittent schedules are far more common
 Ratio Schedules: reinforcement SR depends on number of times a
response R is repeated, or the amount of behavior that is emitted.
 More responses (Rs) = more reinforcers (SRs)
 Interval Schedule: reinforcement (SR) depends on the passage of time, plus one
response
 Responses during the interval are not reinforced; subjects must wait until
enough time has passed and then make one response to get the SR
 Fixed schedule: SR is predictable
 Fixed Ratio (FR) = get SR after every X responses
 Fixed Interval (FI) = get SR after every X seconds, plus one
response
 Variable schedule: SR is unpredictable
 Variable Ratio (VR) = get SR after about every X responses
 Variable Interval (VI) = get SR after about every X seconds, plus
one response
 Response Patterns and Rates
o Pattern of responses: how responses are distributed between reinforcers
o Rate of responses: number of responses per minute
o FR: rate is high until SR, then a post-reinforcement pause; pattern resembles steps
o FI: rate starts low and increases closer to SR, with a post-reinforcement pause
 Pattern resembles shallow scallops
o VR: rate is extremely high and steady with NO pause
 Pattern is a steep slope
o VI: rate is steady with no pauses, not as high as VR; pattern is a moderate slope
 Schedules of reinforcement
o Fixed ratio – buy-5-get-1-free cards
o Variable ratio – slot machine
o Fixed interval – paycheck
o Variable interval – studying for pop quizzes
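As a rough sketch of the ratio/interval distinction, the two fixed schedules can be modeled as objects that answer whether a given response earns the SR (class names and the virtual-clock setup are my own, not the lecture's):

```python
class FixedRatio:
    """FR-n: reinforce every nth response; the clock is irrelevant."""
    def __init__(self, n):
        self.n, self.count = n, 0
    def respond(self, t):          # t is ignored: only response count matters
        self.count += 1
        if self.count >= self.n:
            self.count = 0
            return True
        return False

class FixedInterval:
    """FI-t: reinforce the first response made after t seconds have elapsed."""
    def __init__(self, t):
        self.t, self.last = t, 0.0
    def respond(self, t_now):      # only elapsed time matters, plus one response
        if t_now - self.last >= self.t:
            self.last = t_now
            return True
        return False

fr5 = FixedRatio(5)
fi10 = FixedInterval(10)
# Five rapid responses satisfy FR-5 but do nothing under FI-10:
print([fr5.respond(t) for t in range(5)])   # -> [False, False, False, False, True]
print([fi10.respond(t) for t in range(5)])  # -> all False: too early
print(fi10.respond(10))                     # -> True: first response after the interval
```

This mirrors the definitions above: under a ratio schedule more responses mean more SRs, while under an interval schedule responding faster than the clock is wasted effort.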
 Superstition
o Occurs when a reinforcer is not contingent on behavior
o The subject connects unnecessary behaviors with the non-contingent SR
 Human behavior will change over time
 We do learn, even under non-contingent reinforcement

Lecture 8:

 Theories of reinforcement
o Skinner's views of reinforcement
 No theory required
 An empirical issue
 If something acts to increase a behavior then it’s a SR
 Inductive view point
 A SR is a stimulus that reinforces the operant R
o Two basic theory types
o Hull's drive-reduction theory
o The Premack principle
o Response deprivation theory
o Bliss point theory
 Two types of theories
o Reinforcers as stimuli (Hull)
o Reinforcers as activities
 SR = eating the food
 SR = watching TV
o 3 main reinforcers as activities theories
 Hull’s Drive-Reduction theory
o Stimuli that reduce a biological need, or that reduce strong aversive stimulation,
will become reinforcers
o Drive: hunger → Act: lever press → Drive reduced: receive food
o Drive: escape shock/pain → Act: lever press → Drive reduced: shock ends (negative SR)
o 3 types of reinforcement
 Primary reinforcers
 Stimuli that reduce drives (food)
 Stimuli that result in the reduction of another strong aversive stimulus
(painkillers)
 Secondary reinforcers
 Stimuli associated with drive-reducing stimuli (the kitchen, cooking
shows, a meeting with snacks)
 Problem with Hull's theory
o Researchers found stimuli that had high reinforcement value yet did not reduce
any drives
 Monkeys pulled chains to see the lab and its workers
 Rats pressed levers to stimulate pleasure centers
 Premack's principle
o Reinforcement is a contingency between 2 behaviors
 The instrumental R and the contingent R
o Getting to do the second (contingent) behavior is contingent upon first performing the
instrumental behavior
o Doing the first behavior is instrumental in being able to do the second,
contingent behavior
o Can consider the contingent activity to be the reinforcer
o Not all behaviors are reinforcers
o More probable behaviors reinforce less probable behaviors
o Contingent behaviors have a higher probability than the instrumental behavior
 Instrumental behavior goes down; contingent behavior goes up
 Measure probabilities through a free baseline (paired baseline)
o The bottom line:
 Getting to do the more probable behavior is contingent upon performing the
less probable instrumental behavior
 To demonstrate a significant contingency effect, the subject must perform the
instrumental behavior above its baseline level
o Shortfalls of Premack’s principle
 Premack assumed that the more preferred activities were those that people
spent more time doing during baseline
 Doing homework: 30 minutes to multiple hours
 Having sex: typically less time than homework, yet clearly preferred
o Premack did not attach time amounts to his behaviors
 Free baseline: run 60 min, drink 30 min
 Contingency: drink 15 min → run 30 min
 Result: the rat drank 30 min and then ran 60 min
 No increase above baseline → no contingency effect
 Response-Deprivation Theory
o A reinforcement effect will occur only when the reinforcement contingency
deprives the subject of the contingent activity
 Performing the baseline amount of the instrumental R earns you less than the baseline
amount of the contingent R
 The subject must perform the instrumental behavior at a higher level than its baseline
to obtain at least the baseline level of the contingent behavior
o Premack: behavior probabilities; response deprivation: behavior preferences
o The contingent activity doesn't have to be more probable
 Any activity can be reinforcing as long as the subject is deprived of
performing it at its baseline level
o Use a less preferred activity as the instrumental behavior to gain access to the more
preferred contingent behavior
 Employ these in ratios that are above baseline
o Subjects do not always perform contingencies as predicted/calculated by response
deprivation
o Subjects instead alter their behaviors to compromise
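The deprivation condition can be written as a one-line check: the contingency is reinforcing when the scheduled contingent-per-instrumental ratio falls below the free-baseline ratio. A sketch using the rat example from the Premack shortfalls above (variable names are assumptions):

```python
# A rough sketch of the response-deprivation check. Symbols are assumed:
# i, c = scheduled instrumental/contingent amounts per cycle;
# base_i, base_c = free-baseline amounts of each behavior.
def deprives(i, c, base_i, base_c):
    """True if performing the instrumental behavior at its baseline level
    would earn LESS than the baseline level of the contingent behavior."""
    return (c / i) < (base_c / base_i)

# Rat baseline: drink 30 min, run 60 min. Schedule: drink 15 min -> run 30 min.
# The scheduled ratio (2:1) equals the baseline ratio, so no deprivation:
print(deprives(i=15, c=30, base_i=30, base_c=60))   # -> False (no contingency effect)
# Tighten the schedule (drink 15 min -> run only 15 min) and deprivation occurs:
print(deprives(i=15, c=15, base_i=30, base_c=60))   # -> True
```

The first call reproduces the notes' result: the rat could keep both behaviors at baseline, so drinking never rises above its baseline level.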
 Bliss Point theory
o Baseline = the behavioral ideal (bliss point)
 The behavioral ideal drives performance
 Any imposed contingency disrupts the bliss point
 The subject redistributes its behavior as a compromise
 Point of minimum deviation from the bliss point
 Accurately predicted by formulae

Lecture 9

 Extinction
o Operant extinction
 No longer reinforcing operant behavioral response
 Ultimately reducing behavior probability to zero
o Ideally combined with DRO (esp. DRI)
 Effects of Extinction
o Initial and temporary effects
 Extinction burst
 Emotional behaviors
 Frustration
o Emotion felt when no longer reinforced
 Aggression
o Kicking, cursing, biting
o Other effects (most occur soon and disappear quickly)
 Increase in behavior variability
 Resurgence
 Depression
o Spontaneous recovery
 Recovery of extinguished response after a sufficient break
 Ultimately response decreases
 Resistance to extinction (RTE)
o Ext. takes longer under intermittent reinforcement
 PRE (partial reinforcement effect)
o Resistance to extinction: extinction takes longer
 CRF schedules have low RTE
 Intermittent schedules have high RTE
o To decrease RTE, the best strategy is to:
 Combine extinction of the inappropriate behavior with reinforcement of an
incompatible behavior (DRI)
 Generalization and Discrimination
o Discriminative stimuli
o Stimulus control
o Stimulus generalization
 Generalization gradient
o Stimulus discrimination
 Discrimination training
o Peak shift
 Stimulus Discrimination: tendency for an operant response R to be emitted more in
presence of one stimulus than another
 Discriminative Stimuli
o Pigeon learns to emit key peck R to obtain SR of food
 R (peck)  SR (food)
o Reinforce R only in presence of a lighted key
 SD yellow key: R(peck)  SR (food)
o Discriminative Stimulus (SD) a stimulus in presence of which responses are
reinforced and in the absence of which they are not reinforced
 Stimulus Control
o Stimulus control: when presence of SD reliably affects probability of behavior R.
 Examples:
o Pigeon more likely to peck light when lit up
o Drivers more likely to pull into gas station when low fuel light comes on
 Stimulus generalization: tendency for an operant response R to be emitted in the presence
of a stimulus that is similar to SD
o SD yellow key: R (peck)  SR (Food)
 Discrimination Training
o Not always desirable for a behavior to occur in every situation
 (flirting during a funeral)
o How do we learn to behave appropriately in each situation?
o Discrimination training involves reinforcement of responding in the presence of
one stimulus (SD) and not another stimulus
 Discriminative Stimulus for extinction S delta: stimulus that signals absence of
reinforcement
 Discrimination Training Example
o Pigeon SD yellow key: R peck  SR food
o AND S delta (Blue Key): R Peck  NO FOOD
 2 types of discrimination training
o Intradimensional: both SD and Sdelta are from the same dimension (same
spectrum, noise type, etc.)
o Interdimensional: SD and Sdelta are from different dimensions (SD =light; S delta
= sound)
 Having a stimulus being ON vs. OFF also counts SD = light on S delta =
light off
o Intra = within
o Inter = between
 The Peak Shift
o Peak shift has two crucial elements:
 Peak is no longer centered over SD
 The entire gradient has shifted away from Sdelta
o What causes the Peak Shift?
 Interaction between Excitatory and Inhibitory Gradients
 Excitatory gradient (SD)
 Inhibitory gradient (S delta) usually more flat
 Peak Shift occurs only after Intradimensional Training
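The excitatory/inhibitory interaction can be illustrated numerically: subtracting a flatter inhibitory gradient (centered on S-delta) from an excitatory gradient (centered on SD) moves the peak of the net gradient past SD, away from S-delta. All numbers below are invented for illustration:

```python
# A toy sketch of the excitatory/inhibitory account of the peak shift.
# Wavelengths, gradient widths, and heights are made up for this demo.
import math

def gaussian(x, center, width, height=1.0):
    return height * math.exp(-((x - center) ** 2) / (2 * width ** 2))

SD, S_DELTA = 550, 590   # e.g., key-light wavelengths in nm

def net_tendency(x):
    exc = gaussian(x, SD, width=20)                    # excitatory gradient around SD
    inh = gaussian(x, S_DELTA, width=30, height=0.6)   # flatter inhibitory gradient
    return exc - inh

# Find the stimulus value with the strongest net response tendency:
peak = max(range(450, 650), key=net_tendency)
print(peak < SD)   # True: the peak sits on the far side of SD from S-delta
```

Both crucial elements fall out of the subtraction: the peak is no longer centered over SD, and the whole gradient is pushed away from S-delta, which is why the shift only appears after intradimensional training (only then do the two gradients overlap on one dimension).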

Lecture 10

 Aversive Control and Avoidance
o Aversive Control
o Negative Reinforcement
 Escape
 Avoidance
 Avoidance procedures
o Avoidance Theories
 Two-Factor Theory
 One-Factor Theory
 Negative Reinforcement
o Escape: the aversive stimulus is already occurring; the organism's R terminates it
 Raining on you → open umbrella
o Avoidance: the R prevents the aversive stimulus from ever occurring
 Open the umbrella before stepping into the rain
 Avoidance Procedures
o Signaled Avoidance (Aka Discriminated Avoidance, Discrete Trials Avoidance)
 Signal Stimulus Precedes aversive stimulus
o Non-signaled avoidance, or free-operant avoidance (aka Sidman avoidance, Non-
reinforced avoidance)
 No signal precedes aversive stimulus
 Theories of Avoidance
o What reinforces the avoidance response?
 We can't say that the lack of the aversive stimulus reinforces avoidance, because the
absence of something cannot reinforce behavior
o Reinforcement = something is added (positive) or taken away (negative)
 If the aversive stimulus is neither added nor taken away, then its absence can't be reinforcing
the behavior
 2 factor theory
o Roommate frown  Explosion
o Frown = SD; explosion = aversive stimulus; Hide in room = R
o Hiding  reduction in fear resulting from frown
 Dissipation of fear = SR
o Must learn to fear the signal for aversive stimulus
 2 factor theory
o Factor 1: classical conditioning of fear to a signal for the aversive stimulus
o US = aversive stimulus; UR = fear
o CS = signal for aversive stim; CR = fear of signal
o Factor 2: operant conditioning of the avoidance response (RAV), which decreases the CR of fear
o RAV is negatively reinforced
 Fear is reduced via termination of the CS
 Two-Factor Theory, essential elements:
 Classical factor: fear (CR) elicited by the CS (signal)
 Operant factor: termination of fear (CR) as a consequence of RAV: RAV → fear ↓
 The absence of shock is NOT an SR, according to the theory: the termination of the CR
(fear), which is a consequence of emitting the avoidance response RAV, reinforces the
avoidance behavior. SR = reduction of fear
 How does Two-Factor Theory explain NON-signaled avoidance? (no exteroceptive
signal)
 CS (signal for aversive stimulus) = Passage of time
 More time (CS)  More fear (CR)
o Two-Factor only works with Fixed time intervals
o In Variable time situations, you’re always afraid
 Two-factor theory is molecular
o Concentrates on what happens immediately after the behavior
 One-factor theory is molar
o Concentrates on a larger, overall timeframe
 Both are well supported
o Two-factor: signaled conditions
 Punishment
o Influences on the effectiveness of punishment
o What if the punished behavior is positively reinforced?
o Potential problems with use of punishment
 Punishment basics
o Basic procedure akin to that for positive reinforcement
 Punisher must be more intense than positive reinforcer
o A punisher is considered to be effective if it:
 Reduces the behavior quickly
 Reduces the behavior significantly
o Intensity = more effective
o Method of Introduction
 Punishment less effective if you start at low intensity and gradually build
up
o Immediacy
 More immediate = more effective
 Factors that influence the effectiveness of punishment
o Contingency
 Best if punisher occurs b/c of behavior not random
o Frequency of punishment
 Continuous
o Negative vs positive
 Negative better
o Discrimination
 No SD signaling that punisher is in effect
 Punishment competes with reinforcement
o Stronger one is, the less effective the other will be
 Frequency and size of positive reinforcement for R
 The consequence that is larger/ more immediate/ more frequent/
more contingent will be stronger
 Level of Deprivation
 Greater SR deprivation  SR is more reinforcing
 Availability of alternitve RsS for smilar SR
 Ineffective punishments
o Fines for speeding
 The punisher is intermittent and delayed
 The reinforcer is continuous and immediate
 The punisher is small and only gradually increased
 The punisher is often signaled
 Radar signs
 Speeding is highly reinforced

Lecture 11

 Complex schedules and matching
o Simple vs. complex schedules
o Complex schedules
 Chained schedules
 Multiple schedules
 Concurrent schedules
o Matching
 Herrnstein's matching law
 Overmatching, undermatching, and bias
 Simple vs. Complex Schedules
o Simple Schedule of Reinforcement
 Single Schedule (VI60)
o Complex schedules of reinforcement
 Schedules that combine 2 or more simple schedules
 Chained Schedule: a specific sequence of 2 or more simple schedules
o Each link has its own SD
o The terminal link results in the final reinforcer (SR)
o Green key (SD): FR 10 → Red key (SD, also a secondary SR): VI 120 → food (terminal SR)
 Multiple Schedule: 2 or more independent simple schedules presented in sequence of
some sort
o Each component has its own SD and primary reinforcer
o Green key: FR 10 → food / Red key: VI 120 → food
o SD – R – SR; SD – R – SR
 Concurrent schedules
o Two or more schedules available simultaneously
 2 boy or girlfriends available at same time
 VI 160 and VI 30 at same time
 Each independent
 How do we find out which alternative is preferred?
 Frequency of choosing one over the other
 Time spent with one versus the other
 How do we assess how much one is preferred?
 Measure preference with a ratio: R1/(R1+R2)
 Herrnstein's matching law
o What determines preference for one option over another?
 It is affected by the reinforcers that are contingent on your responses
 Preference for an option = the proportion of SRs gained from that choice
 Matching law: R1/(R1+R2) = r1/(r1+r2), where R = number of responses and r =
number of reinforcers per unit of time
o Preference for an option matches its relative rate of reinforcement
 If A reinforces me 3x as often as B, I will prefer A 75 percent of the time
 Given the relative reinforcement rate, we can use the matching law to
predict the relative response rate
o Concurrent FR schedules will result in exclusive preference for the richer schedule
 Responding on both concurrent ratio schedules is wasteful
 Responding on both concurrent interval schedules is optimal
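The matching law itself is a one-line computation. A minimal sketch:

```python
# Herrnstein's matching law: relative response rate matches relative
# reinforcement rate, R1/(R1+R2) = r1/(r1+r2).
def predicted_preference(r1, r2):
    """Predicted proportion of responses allocated to option 1,
    given reinforcement rates r1 and r2 (reinforcers per unit time)."""
    return r1 / (r1 + r2)

# If A reinforces 3x as often as B, matching predicts 75% preference for A:
print(predicted_preference(3, 1))   # -> 0.75
```

Overmatching and undermatching (next section) are then just observed preferences that are more or less extreme than this predicted proportion.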
 Deviations from matching
o Overmatching: the preferred schedule is chosen more often than predicted
o Undermatching: the preferred schedule is chosen less often than predicted
o Bias: the organism expresses a preference for one option regardless of the specific
reinforcement schedule
 Cannot be explained by the reinforcement rates alone
 Self Control
o Definition
 A choice between a smaller-sooner (SS) reinforcer and a larger-later (LL) reinforcer
 Self-control is exhibited when LL is chosen
 Impulsivity is exhibited when SS is chosen
o Dynamics of value over time
 The value of an SR does not remain constant
 With a short delay to LL, LL appears more valuable; with a long delay, less valuable
 $1 million is worth more next week than 25 years from now
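The Ainslie-Rachlin idea that value changes with delay is often modeled with a hyperbolic discounting function, V = A / (1 + kD); the crossover between SS and LL then falls out of the arithmetic. A sketch with made-up amounts, delays, and k:

```python
# Hyperbolic discounting: a reward's present value shrinks with its delay.
# The amounts, delays, and the discount rate k are invented for this demo.
def value(amount, delay, k=0.5):
    return amount / (1 + k * delay)

ss = (4, 1)    # smaller-sooner: 4 units, 1 day away
ll = (10, 8)   # larger-later: 10 units, 8 days away

# Far in advance (add 10 days to both delays), LL still looks better:
print(value(4, 11) > value(10, 18))   # -> False: LL preferred
# At the last moment, SS overtakes LL -> the impulsive choice:
print(value(4, 1) > value(10, 8))     # -> True: SS preferred
```

This preference reversal is exactly why a commitment response made early (while LL > SS) works: it removes the later choice point where SS would win.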
o Commitment
o Some factors affecting self control
 Skinner's take on self-control
 Self-control is involved when a behavior has two conflicting consequences: one
immediate, the other delayed
o Immediate pleasure (SR) and delayed punishment
 Smoking/drinking
 Spending
o Immediate punishment and delayed benefit (SR)
 Going to the dentist / exercising
 Commitment
o Organisms need to do something to prevent choosing SS when its value is
greater than LL's
o Make a commitment response (controlling response) to secure LL before getting the
opportunity to choose between SS and LL
 At the point of commitment, LL > SS
 The commitment response prevents impulsiveness
o Change your environment
 Leave credit card home
o Satiation
 Drink lots of water before eating
o Do something else instead
 go to fruit cart
Bribery Article

 Rewarding desired behavior is one element of positive reinforcement
 If a child does not know a behavior yet, what is the best way to jump-start it?
 As children grow older they no longer need the reinforcer; the habit is established
 The home does not have to work like the real world
 Winging it can cause random reinforcement
 Hail Mary reward system: don't just tell them to get straight A's as the goal
 Complex reward systems are too complicated to follow through on
o Parents' attention and praise are very good main rewards
 To make praise effective:
o Be specific about the behaviors you want: for example, say "when you talk to
grandma, be calm," then praise with "you stayed in your seat the entire time"
o Identify a small number of behaviors: no more than 2 or 3. The reward itself
does not produce the results; it encourages repeated practice of the behaviors
o Model the behaviors you want: show the child exactly what the behavior would look like;
demonstrate and point out desirable behaviors
o The key is repetition, so practice: for example, rehearse going to bed in pretend situations
o Shape the desired behavior by rewarding gradual approximations of it: praise shorter
durations and build over time to an hour of violin practice
 Praise is one of the biggest positive experiences that build the relationship; praise for good behavior
should be more frequent than punishment

Timeout Article

 Timeouts should be used sparingly; the side effects of excessive punishment are more
significant than the benefits
 Brief: a timeout's positive effect on behavior is almost all concentrated in the first 2 minutes
 Immediate: delivered immediately following the unwanted behavior
 Isolation: leave the child alone
 Administered calmly and without repeated warnings
 Pivot and be firm; take away privileges

Anger Article

 Shock and Awe: respond swiftly with indignation; one of the most common and least
effective responses
o Immediate: a negative back-and-forth with the kid
o Long term: increases the occurrence of disrespect
o Side effect: teaches children to do the same to others
 The Evil Eye: stare down the child with a dire expression and say nothing
o Immediate: the child's behavior escalates and continues
o Long term: better than full-blown rage
o Side effect: helps develop calmer behavior on the child's part in the long run, but too
harsh a look is still inflammatory
 Rational Saint: go to the child and in a gentle voice explain why what she is doing is not
appropriate; ineffective
o Immediate: does not change the behavior
o Long term: modeling calm in response to rage will have a positive influence
over time, but it is slow to occur
o Side effect: does not teach the child proper behavior
 Ringmaster: divert the child's interest to something else
o Immediate: not likely to work
o Long term: no effect
o Side effect: avoids the task of teaching other ways to handle stress
 The Void: ignore and walk away
o Immediate: withholding all attention de-escalates the child's behavior and ends the child's
comments
o Long term: ignoring decreases the likelihood of disrespect over time
o Side effect: a weak response
 Mona Lisa: show no emotion, look slightly amused; relatively effective
o Immediate: de-escalates
o Long term: decreases the likelihood of future battles
o Side effect: the child is not taught a lesson; this approach asks for restraint from yourself
 Parking Ticket: the most effective option; take away a privilege and walk away
o Long term: decreases the likelihood of it happening again

Misbehavior Article

 Find the positive opposite; identify what can be let go or not. Age creates
 Why focus on it

Risky Behavior Adolescence Article

Risk Behavior Modification Article

Chapter 6 Operant Conditioning: Introduction

 Operant Behaviors: behaviors influenced by their consequences
 The effects of those consequences are called operant conditioning
o The response operates on the environment to produce a consequence; this learning is
also called instrumental conditioning because the response is instrumental in producing the
consequence
 Thorndike's Law of Effect
o A hungry cat in a puzzle box, with food outside, must learn to escape the box
o No flash of insight, just gradual improvement in getting out
o Law of effect: behaviors leading to a satisfying state of affairs are strengthened, or
"stamped in"
 Skinner's Selection by Consequences
o Skinner box: a rat is able to earn pellets by pressing a response lever
o Skinner's procedure is known as free operant, because the rat freely responds with a
particular behavior (pressing the lever for food) at any rate
o Skinner demonstrated that the rate of behavior in the operant chamber was controlled by
conditions set up by the experimenter
o A pigeon pecks a response key
o Behaviors are divided into two categories:
 1. Involuntary, reflexive-type behaviors, which can be classically conditioned to
occur in new situations (called respondent behavior)
 2. Operant behavior: behaviors that are more voluntary in nature and
controlled by their consequences rather than by the stimuli that precede them
 Operant Conditioning
o A type of learning in which the future probability of a behavior is affected by its
consequences
o Skinner avoided speculation about what the animal might be thinking and emphasized the
effect of the consequence on the future probability of the behavior
o The operant conditioning process is conceptualized as 3 components:
 1. A response that produces a certain consequence (a lever press produces food)
 2. A consequence that serves to either increase or decrease the probability of the
response that preceded it (the rat presses the lever again for food)
 3. A discriminative stimulus that precedes the response and signals that a
certain consequence is now available (a tone signals that a lever press
will produce food)
 Operant behavior
o A class of emitted responses that result in certain consequences; these consequences
affect the future probability or strength of those responses. Operant responses are called
operants
 Lever press → food
 Effect: lever pressing increases
 Tell joke → no one laughs
 Effect: don't tell the joke again
o The behavior in question (lever pressing or joke telling) is the operant response, or operant,
because its occurrence results in the delivery of a certain consequence
o Operant behaviors are emitted by the organism and have a more voluntary, flexible quality
o Defined as a class of responses, with all responses in the class capable of producing
the consequence
 Pressing the lever with different hand positions, etc., all belong to the class of lever presses
 Operant Consequences: Reinforcers and Punishers
o Second component of operant conditioning is consequence that either increases or
decreases frequency of a behavior.
 Event is a reinforce, follows a behavior and future probability of that
behavior increases
 Event is punisher if follows a behavior future probability of behavior
decreases
 Reincorcers = SR reinforcing stimulus
 Punishers = SP punishing stimulus
 Operant Response = R
 Lever Press  Food pellet R SR
 Tell Joke  No one laughs R SP
o Behavior is reinforced or punished
o Reinforcement/ punishment refer to process or procedure
 Using food to increase strength of lever press is an example of
reinforcement; frowning to encourage Jon to stop is an example of punishment
 Food pellet is the reinforcer, frown is the punisher
o Weakening of behavior through the withdrawal of reinforcement for that behavior
is known as extinction
 Operant Antecedents: Discriminative Stimuli
o When behavior consistently reinforced or punished in presence of certain stimuli,
stimuli will influence occurrence of behavior
 Pressing lever produces food only when the tone is present, so the rat learns to press
the lever when there's a tone
 Tone: SD: Lever Press R  Food Pellet SR
 Tone is called a discriminative stimulus: a stimulus in presence of which
responses are reinforced and in absence of which they are not reinforced
o Discriminative stimulus, operant behavior, and the reinforcer or punisher constitute
the Three-Term Contingency.
 Antecedent event (preceding event), Behavior, and a consequence
 Antecedent Behavior Consequence
 Susan SD Tell her a Joke R  She Laughs
 Tone: SD Lever Press R  Food Pellets
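The three-term contingency can be sketched as a small data structure (a minimal illustration of the ABC layout above; the class and field names are my own, not from the text):

```python
from dataclasses import dataclass

@dataclass
class ThreeTermContingency:
    """Antecedent (SD) -> Behavior (R) -> Consequence (SR or SP)."""
    antecedent: str   # discriminative stimulus signaling the consequence is available
    behavior: str     # operant response
    consequence: str  # reinforcer or punisher that follows the response

# The two examples from the notes:
rat = ThreeTermContingency("tone", "lever press", "food pellet")
joke = ThreeTermContingency("Susan is present", "tell her a joke", "she laughs")

print(f"{rat.antecedent}: {rat.behavior} -> {rat.consequence}")
```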
o Stimuli can be associated with punishment. A stimulus that signals that a response
will be punished is called a Discriminative Stimulus for punishment.
 Water Bottle SD: Meow R  Get Sprayed SP
 Police car SD: Speed R  Receive Fine SP
o Discriminative stimulus may signal occurrence of extinction
 Stimulus that signals absence of reinforcement = discriminative stimulus
for extinction
 S-Delta
 Tone SD Lever Press R  Food Pellet SR
 Buzzer SDelta Lever Press R  No Food
o During the course of operant conditioning, the tone will also become a CS eliciting a
conditioned response of drooling
 4 types of Contingencies
o Reinforcers and Punishers
o If response is followed by a reinforcer, a Contingency of Reinforcement exists.
(Delivery of the reinforcer is contingent upon the response)
o If response is followed by a punisher, a Contingency of Punishment exists
 Positive Reinforcement
o Consists of the presentation of a stimulus following a response, which then leads to an
increase in the future strength of that response. It is positive reinforcement because the
consequence involves the presentation of something – namely, food
 Negative Reinforcement
o Is the removal of a stimulus following a response, which then leads to an increase
in the future strength of that response. Loosely speaking, the behavior results in
prevention or removal of something the person or animal hates
 Open Umbrella R  escape Rain SR
o Negative reinforcement involves 2 types of behavior: escape and avoidance
 Escape Behavior: results in termination of an aversive stimulus.
 Avoidance behavior: occurs before aversive stimulus is presented,
umbrella opened before stepping into rain.
 Positive Punishment
o Consists of presentation of a stimulus following a response which leads to
decrease in future strength of that response
 Talk back to the boss R  get reprimanded SP
 Swat at Wasp R  get stung SP
o People frequently confuse positive punishment with negative reinforcement,
 One reason for this is the fact that many behaviorists use the term negative
reinforcer to refer to an aversive stimulus
 and positive reinforcer to refer to an appetitive stimulus
 Negative Punishment
o Consists of the removal of a stimulus following a response which leads to a
decrease in future strength of response
 Stay out past curfew R  Lose car privilege SP
 Positive Reinforcement: Further Distinctions
o Immediate Versus Delayed Reinforcement
 The more immediate the reinforcer, the stronger its effect
 Primary and Secondary Reinforcers
o Primary reinforcer (unconditioned reinforcer): an event that is innately reinforcing –
food, water, proper temperature
 Food is a highly effective reinforcer
o Secondary reinforcer (conditioned reinforcer) is an event that is reinforcing because
it has been associated with some other reinforcer
 Nice car, fine clothes
 Conditioned Stimuli that have been associated with appetitive unconditioned stimuli
(USs) can also function as secondary reinforcers.
o Metronome NS: Food US  Salivation UR
o Metronome CS  Salivation CR
o Lever Press R  Metronome SR
o Tone SD: Lever PressR  Food SR
 Generalized reinforcer (also known as generalized secondary reinforcer) is a type of
secondary reinforcer that has been associated with several other reinforcers
o Money is a powerful generalized reinforcer for humans because it is associated with food
etc.
 Intrinsic and Extrinsic Reinforcement
o Intrinsic Reinforcement is reinforcement provided by mere act of performing the
behavior.
 Party with friends b/c like company
o Extrinsic Reinforcement is reinforcement provided by some consequence that is
external to behavior
 Reading book to pass exam, passing exam is extrinsic consequence,
driving to get somewhere
 Natural and Contrived Reinforcers
o Natural Reinforcers are reinforcers that are naturally provided for a certain behavior
 Money is natural for selling merchandise, a medal for good performance
o Contrived Reinforcers: reinforcers that have been deliberately arranged to modify
a behavior; not a typical consequence of the behavior in that setting
o Intrinsic reinforcers are always natural while extrinsic reinforcers can be natural
or contrived.
 Shaping
o Gradual creation of new operant behavior through reinforcement of successive
approximations to that behavior. With our rat, reinforce touching lever step by
step
o Teach animals tricks
Chapter 7
Schedule of Reinforcement
 Schedule of Reinforcement is response requirement that must be met to obtain
reinforcement
o Ex: Does each lever press by the rat result in food, or only after several lever presses?
o Did Mommy give a cookie once or every time?
 Continuous Versus Intermittent Schedules
o Continuous reinforcement schedule: one in which each specified response is
reinforced. Each time a rat presses the lever, it obtains a food pellet. Each time the dog
rolls over it gets a treat; each time you turn the ignition, the motor starts
o CRF useful when behavior first shaped or strengthened. Praise child each time
brush teeth
 Intermittent Reinforcement Schedule: some responses are reinforced
o Ex: one in which only some responses are reinforced, only some of rats lever
presses result in food pellet. Not every person accepts a date.
o 4 Types: Fixed Ratio, Variable Ratio, Fixed Interval and Variable Interval
o Steady State behaviors: stable pattern
o Fixed Ratio Schedule: reinforcement is contingent upon a fixed, predictable
number of responses.
 FR 5: Press 5 times to obtain food
 FR 50: press lever 50 times to obtain food pellet.
 Earning a dollar every 10 toys assembled = FR10
 FR1 = CRF continuous reinforcement, each response is reinforced
 Produce a high rate of response along with a short pause following each
reinforcer, called the post reinforcement pause.
 Take a short break following each reinforcer, e.g., a short break after
each chapter read; the pause is then followed by a high rate of response.
 FR pattern – “break and run” pattern: a short break followed by a run
of responses
 Starting a task is harder than continuing it
 Longer ratio requirements (FR 100 vs FR 30) produce longer pauses
 Schedule where the reinforcer is easy to obtain = dense or rich; hard to obtain = lean
 Moving from dense  lean should be a slow process; too fast a
jump causes ratio strain, a disruption in responding
 Variable Ratio Schedule: Reinforcement is contingent upon a varying,
unpredictable number of responses
 VR5, emit average of 5 lever presses for each food pellet.
 Generally produce a high and steady rate of response with little or
no postreinforcement pause.
o Each response on VR schedule has potential of resulting in
reinforce
o Gambling: the unpredictable nature of the activity results in a high
rate of behavior
o The less often the victimizer reinforces the victim, the more
attention he or she receives from the victim. The victim works so
hard to get the partner’s attention that he or she actually
reinforces the partner’s process of largely ignoring the victim.
 Fixed Interval Schedule: reinforcement is contingent upon the first
response after a fixed, predictable period of time.
 First lever press after a 30-second interval has elapsed results in
food; another 30 seconds must then elapse before a press again produces a food pellet.
 Produces a post reinforcement pause followed by a
gradually increasing rate of response as the interval draws to a close
 Elapsed time acts as a discriminative stimulus (SD); as time
progresses, you look at your watch more often.
 Variable Interval Schedules: Reinforcement is contingent upon the first
response after a varying, unpredictable period of time. Ex: the first press after an
average interval of 30 seconds will result in food.
 Produce a moderate, steady rate of response with little or no
postreinforcement pause.
 Comparing the 4 basic schedules:
o Produce quite different patterns of behavior
 Ratio schedules produce higher response rates than interval schedules. The
reinforcer in ratio schedules is entirely response contingent.
 Fixed schedules produce postreinforcement pauses whereas variable
schedules do not. On a variable schedule there is always the possibility of relatively
immediate reinforcement even if one has just attained a reinforcer.
 On a fixed schedule attaining one reinforcer means the next reinforcer is
some distance away.
 On an FR schedule, this results in a short post reinforcement pause before
grinding out another set of responses
 On an FI schedule the post reinforcement pause is followed by a gradually
increasing rate of response as the interval draws to a close and the reinforcer
becomes imminent
 Other Simple Schedules of Reinforcement
o Duration Schedules: reinforcement is contingent on performing a behavior
throughout period of time.
 Ex: Rat must run 60 seconds to earn one pellet of food (FD-60 Schedule)
o Variable Duration Schedule: behavior must be performed continuously for a
varying unpredictable period of time.
o On FR schedules, one knows precisely what must be done to obtain the reinforcer; on an
FD schedule, it is unclear what constitutes continuous performance.
 Ex: a lazy rat can walk or an energetic rat can run fast; both will receive the
reinforcer.
 Response Rate Schedules
o Different types of intermittent schedules produce different rate of response
o Reinforcement is contingent upon organisms’ rate of response.
o Differential Reinforcement of High Rate (DRH): Reinforcement is contingent
upon emitting at least a certain number of responses within a certain period of time;
reinforcement is provided for responding at a fast rate.
 One type of response is reinforced while another is not.
 DRH schedule, reinforcement is provided for a high rate of response and
not for a low rate.
 Ex: Rat gets food for making at LEAST 30 lever presses within a minute; DRH
ensures a high rate of responding
o Differential Reinforcement of Low Rate (DRL): a minimum amount of time
must pass between each response before the reinforcer is delivered; reinforcement is
provided for responding at a slow rate
 Rat gets food only if it waits 10 seconds between each press. In a DRL
schedule, early responses prevent reinforcement from occurring; responding
before 10 seconds have elapsed must not occur
o Differential Reinforcement of Paced Responding (DRP): reinforcement is
contingent upon emitting a series of responses at a set rate; reinforcement is
provided for responding neither too fast nor too slow
 Ex: rat gets food if it emits 10 consecutive responses, each separated by 1.5-2.5
seconds
 Orchestra, music
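The DRL timing rule above can be sketched as follows (an illustrative toy model; treating the very first response of a session as reinforceable is my own assumption):

```python
def drl(min_gap):
    """Differential reinforcement of low rates: a response is reinforced only
    if at least `min_gap` seconds have passed since the previous response.
    Per the notes, responding too early prevents reinforcement -- and every
    response, reinforced or not, restarts the clock."""
    last_response = None
    def respond(t):
        nonlocal last_response
        ok = last_response is None or (t - last_response) >= min_gap
        last_response = t  # the clock restarts on every response
        return ok
    return respond

# DRL 10-sec: the rat gets food only if it waits 10 s between presses.
press = drl(10)
print(press(0), press(5), press(16))  # the press at t=5 came too soon
```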
 Noncontingent Schedules: Non contingent schedule of reinforcement, the reinforcer is
delivered independently of any response. Response not required for reinforcer to be
obtained.
o Fixed Time Schedule: reinforcer is delivered following a fixed period of time
 Ex: pigeon gets food every 30 seconds regardless of what happens
 Free reinforcer
o Variable Time Schedule: reinforcer is delivered following a varying, unpredictable period of
time
o Noncontingent reinforcement may account for some forms of superstitious behavior
 Ex: behaviors the pigeons happened to be performing when free food was delivered
were accidentally reinforced, so they repeated those same behaviors.
 Adjunctive Behaviors: fidgeting while waiting
o Carrying lucky charm
o Pigeons that receive free reinforcers will work less vigorously for the contingent
reinforcers.
o Effective in reducing frequency of maladaptive behaviors. If attention given on
noncontingent basis, they will not act out.
o Unconditional positive regard viewed as form of noncontingent social
reinforcement.
 Complex Schedules of Reinforcement
o Complex Schedule: combination of 2 or more simple schedules
o Conjunctive schedules: type of complex schedule in which requirements of 2 or
more simple schedules must be met before reinforcer delivered
 Ex: FI 2-minute FR 100 schedule: reinforcement is contingent upon
completing 100 lever presses and emitting at least one lever press after a 2-
minute interval has elapsed
o Adjusting Schedules: Response requirement changes as a function of the
organism’s performance while responding for the previous reinforcer
 Ex: FR 100 schedule, if rat completes all 100 within 5 minutes, increase
requirement to 110
o Chained Schedules: sequence of two or more simple schedules, each of which
has own SD and the last of which results in a terminal reinforcer. In other words,
person or animal must work through a series of component schedules to obtain the
reinforcer.
o Green Key: Peck  Red Key: Peck  Food
 SD R SR/SD R SR
 SD: Discriminative Stimulus
 SR: Reinforcing Stimulus
o Presentation of red key is both secondary reinforcer for completing VR 20
schedule and SD for responding on subsequent FI 10 second schedule. Can create
3 link chain
o Once pigeons learn which schedule is associated to which key, show appropriate
response patterns for schedules.
 Pigeons display long pauses and slow rate of response on white key
compared to other two. Responded most on red key
o Goal Gradient Effect: increase in strength or efficiency of responding as one
draws near to the goal.
 Ex: rat runs faster through the maze as it nears the goal
 Backwards chaining: training final link first and initial link last
 Establish the red key as a secondary reinforcer associated with food.
Presentation of the red key can then reinforce pecking the green key, and the
green key can then reinforce pecking the white key
 Theories of Reinforcement
o Drive Reduction Theory: an event is reinforcing to the extent that it is associated
with a reduction in some type of physiological drive
o Hunger produces food drive, seek food, drive reduced when food found
o Incentive Motivation: Motivation derived from some property of reinforcer.
 Ex: playing video game for fun
 Spicy food is incentive motivation
o Premack Principle: reinforcers can often be viewed as behaviors rather than
stimuli. Lever pressing not for food but for act of eating food (behavior)
o Premack Principle: high probability behavior can be used to reinforce low
probability behavior
 Ex: rat is hungry; eating food has a higher likelihood of occurrence than
running on the wheel, so eating food is the HPB. The rat will run on the wheel
(LPB) in order to obtain access to food (HPB)
o Work then play
 Response Deprivation Hypothesis:
o The Premack principle requires us to know the relative probabilities of 2 behaviors before
we can judge. The response deprivation hypothesis states that a behavior can serve as a
reinforcer when:
 1. Access to the behavior is restricted
 2. Its frequency falls below its preferred level (baseline level of occurrence)
 Rat normally runs 1 hr; only let it run 15 min, and it will now work for the
chance to run on the wheel
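The response deprivation test reduces to a simple comparison against baseline (a toy sketch; the function name and minute-based units are my own):

```python
def can_reinforce(baseline_minutes, allowed_minutes):
    """Response deprivation hypothesis (sketch): access to a behavior will
    function as a reinforcer when it is restricted below its baseline
    (free-access) level of occurrence."""
    return allowed_minutes < baseline_minutes

# The rat normally runs 60 min/day; restricted to 15 min, running becomes
# a reinforcer the rat will work for. With unrestricted access it does not.
print(can_reinforce(60, 15))
print(can_reinforce(60, 60))
```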
 Behavioral Bliss Point Approach
o Given 2 or more activities, an organism with free access to alternative activities
will distribute its behavior in such a way as to maximize overall reinforcement
o Distribution of behavior represents the optimal reinforcement available from the 2
activities
 If a rat is forced to run on the wheel twice as long as it explores the maze
(when it prefers the opposite), it will redistribute its behavior to get as close
as it can to its preferred (bliss point) ratio
Chapter 8
 Extinction: nonreinforcement of a previously reinforced response, the result of which is a
decrease in the strength of that response
o Process and procedure, procedure: nonreinforcement of previously reinforced
response
o Process: extinction is the resultant decrease in response strength
o If lever pressing ceases entirely, it is extinguished; if not yet, it is only partially
extinguished
 Side Effects of Extinction
o When an extinction procedure is implemented, it is often accompanied by side effects;
these can mislead one into believing that the extinction procedure is not
having an effect
o Extinction burst: implementation of an extinction procedure does not always
result in an immediate decrease in responding; there is often a temporary increase in the
frequency and intensity of responding
 Ex: putting money in a machine and mashing on the buttons when nothing comes out
o Increase in variability: extinction procedure can also result in an increase in
variability of a behavior.
 Ex: push buttons on machine differently
o Emotional Behavior: extinction is often accompanied by emotional behavior.
A hungry pigeon that suddenly finds key pecking no longer produces food becomes agitated
o Aggression: extinction can also elicit aggression
o Resurgence: unusual side effect of extinction: the reappearance during
extinction of other behaviors that had once been effective in obtaining
reinforcement.
 Ex: doing something one was previously trained to do
o Depression: 1. Rats were reinforced for running down an alleyway for food, then placed
in an open area that the rats could freely explore.
 They showed low levels of activity
 Resistance to extinction
o Resistance to Extinction: extent to which responding persists after an extinction
procedure has been implemented
 Dog keeps begging for 20 min vs. 5 min.
o Schedule of reinforcement: most important factor influencing resistance to
extinction.
 Partial reinforcement effect: Behavior that has been maintained on an
intermittent schedule of reinforcement will extinguish more slowly than
behavior that has been maintained on a continuous schedule.
 FR 100 schedule will take longer to extinguish than FR 10
 VR 20 greater resistance than FR20
o The less frequent the reinforcer, the longer it takes the animal to discover that
reinforcement is no longer available.
o History of Reinforcement
 More reinforcers an individual has received for a behavior, the greater the
resistance to extinction.
 Easier to stop from 10 reinforcers vs earning 100 reinforcers.
o Magnitude of reinforcer:
 Magnitude of the reinforcer can also affect resistance to extinction. A large-
magnitude reinforcer results in greater resistance to extinction than a smaller-
magnitude reinforcer.
o Degree of Deprivation: degree to which one is deprived of the reinforcer
 Ex: Easier to stop not-so-hungry rats compared to very hungry rats
o Previous Experience with extinction: when sessions of extinction are alternated with
sessions of reinforcement, the greater the number of prior exposures to extinction, the
quicker the behavior will extinguish during subsequent exposures.
 Ex: child giving up after 10 min of whining
o Distinctive signal for extinction: extinction is greatly facilitated when there is a
distinctive stimulus that signals the onset of extinction
 Spontaneous Recovery: reappearance of an extinguished response following a rest period
after extinction.
o Skinner proposed: function of discriminative stimuli associated with start of
session
 Entering the supermarket is SD for availability of candy
 Differential Reinforcement of other Behavior
o Extinguishing target behavior and reinforcing occurrence of replacement
behavior
o Called Differential Reinforcement of other behavior (DRO)
o One variant of this procedure, known as differential reinforcement of
incompatible behavior (DRI), involves reinforcing a behavior that is specifically
incompatible with the target behavior.
o Functional communication training: communicating one's desires is differentially
reinforced
 Stimulus Control: the presence of a stimulus signals the availability of reinforcement,
thereby increasing the probability that the behavior will occur. Presence of the
discriminative stimulus affects the probability of the behavior
o Ex: if a 2000-Hz tone signals that lever pressing produces food, lever pressing is under
stimulus control
 Stimulus Generalization and Discrimination
o Stimulus Generalization: tendency for an operant response to be emitted in the
presence of a stimulus that is similar to the SD
 Ex: rat presses the lever at tones of 1800-2200 Hz
o Generalization gradient: graphic description of the strength of responding in the
presence of stimuli that are similar to the SD
 Steep gradient: responding drops off sharply as stimuli become less similar to the SD
 Flat gradient: responding drops off gradually as stimuli become less similar
o Stimulus discrimination: tendency for an operant response to be emitted more
in presence of one stimulus than another.
 Steep gradient: weak generalization strong discrimination
 Flat: strong generalization, weak discrimination
o Discrimination training: reinforcement of responding in the presence of one stimulus
(the SD) and not another stimulus
 Discriminative stimulus for extinction: stimulus that signals the absence of
reinforcement
 S-delta
 Rat will learn to press the lever at 2000 Hz and not at 1200 Hz. Strong
stimulus control
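The steep-versus-flat distinction can be illustrated with a toy gradient (the bell-curve shape and the `width` parameter are my own assumptions, purely for illustration, not a fitted model from the text):

```python
import math

def gradient(sd_hz, width, tones):
    """Toy generalization gradient: relative response strength declines with
    distance from the SD. A small `width` gives a steep gradient (strong
    discrimination, weak generalization); a large `width` gives a flat one
    (strong generalization, weak discrimination)."""
    return {hz: math.exp(-((hz - sd_hz) / width) ** 2) for hz in tones}

tones = [1200, 1600, 2000, 2400]
steep = gradient(2000, 200, tones)   # responds almost only near the 2000-Hz SD
flat = gradient(2000, 1000, tones)   # still responds well at distant tones
print(round(steep[1600], 3), round(flat[1600], 3))
```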
 Peak shift effect
o Peak of a generalization gradient following discrimination training will shift from the
SD to a stimulus that is further removed from the S-delta
o With discrimination training, the gradient drops off more sharply on the side toward the
S-delta; the rat strongly discriminates between the S-delta and the SD
o Rat now favors higher-pitched tones
o Tones less similar to the S-delta, such as those higher than 2000 Hz, get a higher rate of
responding
 Multiple Schedules and Behavioral Contrast
o Two or more independent schedules presented in sequence, each resulting in
reinforcement and each having a distinctive SD
o Completion of each component schedule results in food; in a chain, all components need
to be completed before food is delivered
o Behavioral contrast occurs when a change in the rate of reinforcement on one
component produces an opposite change in the rate of response on another component
 Negative Contrast Effect: increase in rate of reinforcement on one
component produces a decrease in rate of response on other component.
 Ex: decreased rate of response on the red key after the VI on the other component
went from 60 to 30 seconds; that component in the sequence is now more attractive
 Positive behavioral Contrast: decrease in the rate of reinforcement on one
component results in an increase in the rate of response on the other component.
 Ex: increased rate on the VI 60 component after the other one changed from VI 60
to VI 120
o Anticipatory Contrast: Rate of response varies inversely with an upcoming change
in the rate of reinforcement.
 Pigeons increased rate of responding for reinforcement when they were
presented with a stimulus signaling that extinction was imminent
 Respond more vigorously for reinforcement while reinforcement is still available.
 Ex: acting out because the girl thinks the guy is cheating
 Fading and errorless Discrimination Learning
o In the process of learning to discriminate the S-delta and the SD, subjects initially make
mistakes.
 Errorless discrimination training: minimizes the number of errors
 The S-delta is introduced early, as soon as the subject has learned to respond to the SD
 The S-delta is presented in weak form to begin with and then gradually strengthened.
 Fading: process of gradually altering the intensity of a stimulus.
 Pigeons do better when the key is gradually changed from dark to green rather
than changed abruptly; darkness was the S-delta
 Drawback: the discrimination is very hard to reverse later
 Stimulus Control Procedures for Study of Memory
o Delayed matching to sample: animal shown sample stimulus and then following
delay, required to select stimulus out of group
o Get food by pecking the right stimulus
o Directed Forgetting: when you have been told to forget something
 Ex: an O cue tells the pigeon to remember and an X cue tells it to forget; when told
to forget, pigeons do worse
 Stimulus Control: Additional Applications
o Targeting: involves using process of shaping to train an animal to approach and
touch a particular object, as in training a dog to touch the end of a stick with its
nose
o By putting behavior on cue, less likely to occur in absence of the cue.
Chapter 9
 Escape and Avoidance:
o Negative Reinforcement 2 types of behaviors:
 1. Escape Behavior, performance of the behavior terminates aversive
stimulus
 2. Avoidance behavior: performance of the behavior prevents the aversive
stimulus from occurring
o One learns to escape from aversive stimulus and then to avoid it
o Shuttle Avoidance procedure: animal shuttle back and forth in box to avoid
aversive stimulus.
o Shock: Cross barrier  removal of shock
 SD R SR
o Light: Cross barrier  Avoidance of Shock
 SD R SR
 Two Process Theory of Avoidance
o Avoidance is studied more than escape because escape is easier to understand
o Two processes are involved in learning an avoidance response
 Light: shock  Fear
 Light  Fear
o Proposes that avoidance behavior is the result of 2 distinct processes
 Classical Conditioning in where a fear response comes to be elicited by a
CS
 Operant Conditioning, moving away from CS is negatively reinforced by a
reduction in fear
 Problem: an animal, like a dog, may repeatedly encounter the CS in the absence of
the US, yet its fear of the CS is never extinguished.
 Possible answer: Anxiety Conservation Hypothesis: avoidance
responses usually occur so quickly that there is insufficient
exposure to the CS for the conditioned fear to fully extinguish
 Some animals feel no fear but avoid anyways
o One Process theory: the act of avoidance is negatively reinforced simply by the lower
rate of aversive stimulation it is associated with
 Climbing over the barrier is reinforced because it decreases the rate of shock, not
because it results in a decrease in fear
o Species-Specific Defense Reaction theory: many avoidance behaviors are elicited
behaviors rather than operant behaviors
 Avoidance Condition and phobias
o Basis of many phobias is development of a classically conditioned fear response,
failed to extinguish.
o In experimental avoidance conditioning, the animal avoids the aversive US; the rat avoids
shock (US) when it sees the light (CS)
o In human phobias, people avoid the CS itself
 A second limitation of experimental avoidance is that the avoidance
behavior seems to condition less readily than does avoidance behavior
in a phobia.
 Rat may be shocked still sometimes
 Human only need a brief conditioning to produce avoidance response that
is strong and persistent.
 (1) the reliable establishment of a fear response with only a single,
brief pairing of the CS and US, (2) subsequent avoidance of the CS as
well as the US, and (3) the occurrence of successful avoidance on
100% of trials.
 Phobic individual learns to make avoidance response early on in
chain of events as to minimize effort of avoiding
 Avoidance conditioning and Obsessive Compulsive Disorder
o Obsessive compulsive disorder (OCD): disorder characterized by persistent
thoughts, impulses, or images and repetitive, stereotyped actions that are carried out in
response to the obsessions.
o Obsessions are associated with an increase in anxiety, whereas compulsions are
associated with a decrease in anxiety.
 Ex: taking out the garbage and then showering: anxiety is a classically conditioned
response elicited by contact with the garbage; showering is the operant (avoidance)
response that reduces it
 If a compulsive behavior pattern (excessive washing) is maintained by
avoidance of an anxiety-arousing event, preventing the avoidance response
from occurring should result in the eventual extinction of the anxiety
 Exposure and Response Prevention (ERP): treatment for OCD that involves prolonged
exposure to the anxiety-arousing event while not engaging in the compulsive behavior
pattern that reduces the anxiety
 Two-process theory can't explain why many people with OCD report no conditioning event
 Punishment: weakening of a behavior through the application of an aversive stimulus or
removal of an appetitive stimulus
o Positive punishment: presentation of a certain event following a response, which
leads to a decrease in future strength of the response.
 Spanked when bad
o Negative punishment: removal of a certain event following a response
 Losing a job for bad behavior / loss of a positive reinforcer
o Time out: loss of access to positive reinforcers for a brief period of time.
 Should be brief to suppress unwanted behavior.
o Response cost: removal of specific reinforce following occurrence of a problem
behavior
 Receiving a fine for speeding or having toys taken away are examples
 Can adjust severity of punishment.
 Must identify reinforcers that have an impact on behavior
 performing the behavior no longer leads to something (in which case,
the process is extinction), or because performing the behavior leads to
the removal of something that you would otherwise possess (in which
case the process is negative punishment).
 Intrinsic Punishment: punishment that is an inherent aspect of the behavior being
punished
o The activity itself is punishing; having performed it, one will not repeat it
o Ex: watching a scary film and then not watching scary films anymore
 Extrinsic Punishment: punishment that is not an inherent aspect of the behavior being
punished,
o Chastised after lighting a cigarette.
 Primary unconditioned punisher: innately punishing
o Born to dislike
 Secondary (conditioned) punisher: an event that has become punishing through
association with another punisher
o Ex: a tone paired with shock comes to elicit fear; the tone can then punish running in a
wheel
o Generalized punisher: event that has become punishing because it has in the past
been associated with other punishers
 Icy stare is best categorized as a generalized punisher b/c
disapproving looks have no doubt been associated with numerous
unpleasant events
 Problems with use of punishments
o Punishment of an inappropriate behavior does not strengthen the occurrence of
appropriate behavior
 Kid who plays rough gets yelled at and simply stops playing altogether
o Person delivering punishment could become SD for punishment, unwanted
behavior is suppressed only when parents around
o Punishment might teach individual to avoid person delivering punishment
o Punishment likely to elicit a strong emotional response
o Punishment can elicit aggressive reaction
o Use of punishment through the process of modeling, could teach person that
punishment is an acceptable mean of controlling behavior
o Punishment often has an immediate effect in stopping an unwanted behavior, so the use
of punishment is itself often strongly reinforced
 Benefit and Effective Use of Punishment
o Punishment can produce an increase in social behavior
o Can result in an improvement of mood
 By disrupting agitation
o Can increase attention to the environment
 Become more vigilant to what is happening around them
 To maximize effectiveness
o Punishment should be immediate rather than delayed
 Don’t yell at Goldy long after he peed on the floor
o At least at the outset, punishment should consistently follow each occurrence of
unwanted behavior
o Punishment should be intense enough from the outset to suppress target behavior.
 Don’t start mild
o Negative punishment generally preferable to positive
 Time out/ response cost less likely to produce harmful side effects
o Language capacity, punishment more effective with explanation
 Frequent feedback about behavior
o Punishment of inappropriate behavior should be combined with positive
reinforcement of appropriate behavior
 Theories of Punishment
o Conditioned Suppression Theory
 When punishment stops, the behavior may come back
 Punishment generates an emotional response that suppresses the appetitive behavior
 Conditioned Suppression theory of punishment: punishment does not
weaken a behavior but produces emotional response that interferes with
occurrence of behavior
 Stronger punishments are capable of suppressing behavior for longer
periods of time
o Avoidance theory of Punishment: punishment actually involves a type of
avoidance conditioning in which avoidance response consists of any behavior
other than the behavior being punished
 Any other behavior other than lever pressing  no shock
 Any behavior other than lever pressing is negatively reinforced by absence
of shock
o The Premack approach to punishment: just as a high-probability behavior can be
used to reinforce a low-probability behavior,
 an LPB can be used to punish an HPB
 Eating food  Running in wheel: the rat will be less likely to eat food
 Effects of noncontingent Punishment
o An aversive stimulus is controllable in the sense that the animal is able to make a
response that significantly reduces its effect
o Learned helplessness
 Inescapable shock condition: dog is shocked and unable to do anything about it.
 Escapable shock condition: dog receives shock but can terminate it by
pressing a panel. A control condition received no shock.
 When all dogs were later placed in a box where they could escape the shock, the
escapable-shock dogs and the no-shock dogs learned to escape, but the
inescapable-shock dogs were unable to escape
Chapter 10
 Choice and Matching
o Concurrent Schedules: a concurrent schedule of reinforcement consists of the simultaneous
presentation of 2 or more independent schedules, each leading to a reinforcer.
o Matching law: the proportion of responses emitted on a particular schedule matches the
proportion of reinforcers obtained on that schedule
 A pigeon emits twice as many responses on a VI 30-sec as on a VI 60-sec schedule
 A pigeon emits 3 times as many on a VI 10-sec as on a VI 30-sec schedule
o Proportion of responses emitted on schedule A: Ra / (Ra + Rb)
o Number of reinforcers earned on schedule A divided by the total number earned on both:
SRa / (SRa + SRb)
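The matching law can be checked with a short calculation. A minimal sketch (the response and reinforcer counts below are hypothetical, not from the text):

```python
# Minimal sketch of the matching law with hypothetical counts:
# the proportion of responses on schedule A, Ra / (Ra + Rb), should
# match the proportion of reinforcers earned there, SRa / (SRa + SRb).

def proportion(a, b):
    """Proportion of the total accounted for by alternative A."""
    return a / (a + b)

# Hypothetical pigeon data for concurrent VI 30-sec and VI 60-sec schedules:
Ra, Rb = 2000, 1000      # twice as many responses on the VI 30-sec key
SRa, SRb = 120, 60       # the VI 30-sec key also yields twice the reinforcers

response_prop = proportion(Ra, Rb)        # Ra / (Ra + Rb)
reinforcer_prop = proportion(SRa, SRb)    # SRa / (SRa + SRb)
print(response_prop, reinforcer_prop)     # both 0.666..., i.e. matching holds
```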
 Deviations from Matching
o Undermatching: the proportion of responses on the richer schedule versus the poorer schedule
is less different than would be predicted by matching.
o Occurs when there is little cost for switching from one schedule to another
o Changeover delay, or COD
 Two fields of grass are easy to get to; rather than staying in one, the animal can graze both
o Overmatching: proportion of responses on richer schedule versus the poorer schedule is
more different than would be predicted by matching.
 Cost of moving from one to another is very high.
 Predator doesn’t want to travel around mountain to switch patch of prey
o Bias from matching: one alternative attracts a higher proportion of responses than
would be predicted by matching, regardless of whether that alternative contains the poorer or
richer schedule.
 Emitting 10 percent more responses than predicted by matching: the pigeon likes
red, so it adds extra effort on the red key over and above the amount of responding
dictated by the schedule of reinforcement.
 Indicate preference (Erin vs Jason)
 Matching and Melioration
o The distribution of behavior in a choice situation shifts toward those alternatives that have
a higher value, regardless of the long-term effect on the overall amount of reinforcement
 A pigeon first confronted with concurrent VI 30-sec and VI 60-sec schedules emits an equal number
of responses on both alternatives.
 In terms of benefits versus costs, the VI 30-sec schedule has a much higher
value than the VI 60-sec schedule.
 This results in a shift toward the VI 30-sec alternative. The shift ceases at the point of matching,
the point where the two alternatives have equal value.
 A sort of leveling process in which behavior shifts until the two alternatives have equal
value in costs versus benefits
 Problems of melioration
 First: an alternative may not require as much responding as one is
distributing toward it to obtain all of the available reinforcers.
o Conc VR 100 and VI 30: the subject should work on the ratio schedule and
then move over to collect the interval reinforcer, but instead the pigeon
stays on the interval
o Continuing to court easy customers rather than pursuing hard customers
 Second: overindulgence in a highly reinforcing alternative often results in
long-term habituation to that alternative, reducing its value as a reinforcer
o Become rich and eat lobster every day: it gets boring and not as
enjoyable
 Third: melioration often results from behavior being too strongly governed by
immediate consequences as opposed to delayed consequences
o Studying for annoying classes vs. maximizing overall GPA
(a delayed reinforcer)
 Self-Control
o Plato: people behave badly out of ignorance; Aristotle: people can know what they're doing is bad and do it anyway
o Skinner on self-control: not an issue of willpower but an issue involving conflicting outcomes.
 Managing the conflict is based on a controlling response and a controlled response
 You want to save money, so you leave it at home; this is the controlling response, and the
amount spent is the controlled response.
 Physical restraint: physically manipulate the environment to prevent the occurrence of
some problem behavior. Leaving money at home to save, or lending out your Wii so you can study, are
examples
 Deprivation and satiation: deprive or satiate yourself
 Eat before going to the grocery store, so you buy fewer fatty foods
 Doing something else:
 Quit smoking by chewing gum
 Self-reinforcement and self-punishment
 Short-circuiting the contingency by consuming the reward immediately: say you'll
study 3 hrs and then eat pizza, but you eat the pizza after 10 minutes
o Self-control as a temporal issue
 Immediate consequences are more powerful than delayed consequences
 Fun evening vs. good grades: the fun evening is right there and its pull is strong
 From a temporal perspective, lack of self-control arises from the fact that our
behavior is more heavily influenced by immediate consequences than
by delayed consequences
 You don't know for sure that cigarettes will ruin your health or that quitting will
keep you healthy. However, it is certain that the next cigarette will be enjoyable
and that quitting will cause withdrawal symptoms
 Immediate (certain) vs. delayed (uncertain)
 The task of choosing between such alternatives is known as a delay of
gratification task because the person or animal must forgo a smaller sooner
reward to obtain a larger reward later
 Self-control consists of choosing the larger later reward over the smaller sooner
reward.
 Impulsiveness consists of choosing the smaller sooner reward over the larger
later reward
 Mischel's Delay of Gratification Paradigm
o Social learning theorist Walter Mischel
o Resistance to temptation is greatly enhanced by not attending to the reward
o The ability to devise appropriate tactics to delay gratification can enhance other areas of life
 Marshmallow and pretzel example
 The Ainslie-Rachlin Model of Self-Control
o Reversal of preference occurs as time passes and the smaller sooner reward becomes imminent.
o Based on the assumption that the value of a reward is a hyperbolic function of its delay
 A birthday party at a 3-week delay is worth less than a chocolate bar, but as the day moves
closer the value of the party increases.
 The manner in which the delay functions account for preference reversal: when both
rewards are still distant, the larger later reward (LLR) is preferred; as the smaller
sooner reward (SSR) becomes imminent, its value rises above the LLR's
 Going out is the SSR, getting a good grade the LLR; the SSR is more tempting
 When the SSR is close, it is preferred; when both the LLR and SSR are far away, the LLR is
preferred
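The preference reversal described above can be sketched numerically. A minimal illustration — the hyperbolic form V = A / (1 + kD), and all amounts, delays, and the k value, are assumed for illustration, not taken from the text:

```python
# Hyperbolic discounting sketch for the Ainslie-Rachlin model:
# a reward's value falls off hyperbolically with its delay, which
# produces preference reversal as the smaller-sooner reward nears.

def value(amount, delay, k=1.0):
    """Present value of a reward `delay` time units away (hypothetical k)."""
    return amount / (1 + k * delay)

SSR, LLR = 10, 30   # smaller-sooner vs. larger-later reward amounts
# Assume the LLR arrives 5 time units after the SSR.

# Both rewards still distant (SSR in 10 units, LLR in 15): LLR is preferred.
far = (value(SSR, 10), value(LLR, 15))    # (0.909..., 1.875)

# SSR imminent (SSR now, LLR in 5 units): preference reverses to the SSR.
near = (value(SSR, 0), value(LLR, 5))     # (10.0, 5.0)
```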
 Changing the Shape of the Delay Function for the Larger Later Reward
o The basic reason preference reversal occurs is that the LLR has low value at long delays
o There appear to be innate differences in impulsivity between species
 Delay functions for humans are less deeply scalloped than they are for other animals
o There are also differences between individuals, with some more impulsive than others
o People become less impulsive with age
 With experience responding for delayed rewards, children learn to delay
gratification as they grow up
o The availability of other sources of reinforcement may be yet another factor that influences
impulsiveness
 A person wants to smoke or drink after something bad happens; under depression,
long-term goals lose their relevance
 Set up subgoals
 Making a commitment Response
o A commitment response is an action carried out at an early point in time that serves
either to eliminate or greatly reduce the value of an upcoming temptation
 A girl pays a boy to make sure she studies, making a monetary commitment
 Small but cumulative effects model
o Each individual choice on a self-control task has only a small but cumulative effect on
our likelihood of obtaining the desired long-term outcome
 Don't eat a burger every time; get tofu once in a while

Class notes
 Autism and Behavioral Therapy
o Autism Spectrum Disorder
 Autism, Asperger's Syndrome, PDD-NOS
 Autism Spectrum Disorders
o 1 in 50 have autism spectrum disorder
o Prevalence of ASD has increased by 285 percent over 12 years
o 5 times more common in boys
o Present across all cultures and socioeconomic statuses
o Very Heterogeneous Population
 Diagnostic Criteria 1
o Impairment in social interaction (at least 2)
 Nonverbal social interaction
 Peer relationships
 Spontaneous sharing
 Social or emotional reciprocity
 Diagnostic Criteria 2
o Impairment in communication
 Delay in language development
 Impairment in initiating and maintaining conversation
 Stereotyped and repetitive use of language
 Lack of imaginative play
 Diagnostic Criteria 3
o Restricted, repetitive, and stereotyped patterns of behavior, interests, and activities (at least 1)
 Restricted range of interests
 Insistence on specific nonfunctional routines or rituals
 Stereotyped and repetitive body movements
 Preoccupation with parts of objects
 Basic behavioral Therapy components
o Antecedent: stimulus that elicits the child's behavior [pick up a box of crayons, say "crayons"]
o Behavior: response by the child [child says "crayons"]
o Consequence: what maintains the behavior [give the child the box of crayons]
 Prompting: getting the child to say something
 "You want more water? Say yes" — the child says yes, give the child water
 The child says "ba"; you say "book"; the child says "book"; give him the book
 Pivotal response training
o Pivotal areas
 Motivation, responsivity to multiple cues
 Why target pivotal areas?
o Maximize effectiveness of intervention
o Minimize the treatment time necessary to learn new skills
o Pivotal response training
 Antecedents (attention, shared control, multiple cues, maintenance skills)
 Consequences (contingent reinforcement, reinforcement of attempts, direct
reinforcement)
o Skills we can teach with PRT (language, play skills, social skills, academic skills)
 Child attention and appropriate prompting
o The question/instruction/opportunity to respond should be clear, appropriate to the task,
and uninterrupted, and the child must be attending
 Maintenance/acquisition: intersperse maintenance tasks with acquisition tasks to keep
motivation high and frustration low
 Shared Control
o Teaching interactions should include strategies of shared control to enhance motivation;
these strategies include child choice and taking turns
 Multiple cues: structure the environment to increase the child's responsivity to multiple cues
 Contingent reinforcement: any response to a child's behavior must be contingent on the correct
behavior or attempt
 Reinforcement of attempts: goal-directed attempts to respond to questions, instructions, or
opportunities should be reinforced
 Direct reinforcement: there should be a direct relationship between the reinforcement and the
desired behavior

Lecture: Animal behavior: physical and social cognition

 Topics: physical cognition (tool use); social cognition; social learning (diet, mate selection, imitation and
mimicry)
 Physical cognition: encoding, storage, retrieval, and processing of feature information related to
objects  determining relationships between objects in the world, recognizing objects, tool use,
acquiring food, nest building, weapons, concealment
 Social cognition: encoding, storage, retrieval, and processing of social information related to other
members of a community; individuals living in a social group regularly interact with other
members of that group
o To be successful in a social group, animals need knowledge of past interactions and
of what other members are currently doing. Western scrub jays cache seeds but pay attention to
whether other jays are around
 Social learning: animals learn a behavior by observing another animal
o Each chimpanzee community has unique traditions; chimpanzees on the west side but not the
east side crack nuts. This suggests cultural, rather than genetic, transmission of behavior
 Animals learn a behavior by observing another animal
o Demonstrator: the animal producing the behavior
o Observer: the animal watching the behavior and later reproducing all or part of it
o A mouse prefers demonstrated chocolate over cinnamon (it smells the food on the demonstrator)
o Guppies: female observers select males previously seen with a partner, as opposed to a
male seen alone
 Imitation: copying the response of a demonstrator when it leads to rewards
o Overimitation: the tendency to copy unnecessary actions. Chimpanzees open a box by copying
only the necessary actions, while children copy all the actions. Why? Eagerness to please
o The demonstrator intentionally modifies its behavior in the presence of a naïve animal, incurs a cost
or gains no immediate benefit, and the observer learns faster than it otherwise would
 Mimicry: copying the response of a demonstrator when it does not result in any tangible reward;
it is social

Article Reverse psychology

 Tell parents to back off: stop asking the child to do the desired behavior, and make it OK not to do it at all. Mask
the attitude of enthusiasm or rage with bland disinterest.
 Behavior is controlled by antecedents; urgency can inspire push-back and resistance to even the
most rational pitch. This is normal
 Reactance: a reaction that is directly opposite to some rule or request. Occurs when someone
feels he is being pressured and some added limit is being placed on his freedom or
choice.
 Ironic processes: oppositional actions within ourselves. The pressure we put on ourselves to do
something backfires
 However, when we are stressed or overloaded—when we're trying to multitask, for instance—
the monitoring can break down, and those other things we're trying to exclude are much more
likely to come up and be expressed

Motivation Article

 Motivation is thought of as the result of an interaction between the environment and the individual's
temperament and personality. Motivation follows behavior
 Identifying the problem: lack of motivation
o Normal downtime: some is essential; create time for family
o Possible concern: problems in focusing, lack of focus
o Stress: lack of motivation can come from stress; there is a tendency to dismiss the notion that a child
is stressed. Being overweight can be a source of stress
o Specific or general lack of motivation: may be specific to one area of the child's life; children who are
bullied express disinterest in school, and children who have trouble with academics may seem like
they are not trying.
 Doing something:
o Setting goals: a short, manageable list of interests that you can encourage your child to
pursue as part of regular family life; sampling
o Expectations: not too heavy; the child will participate in the life and running of the
household with simple chores, with more expectations as they grow up
o Modeling: parents need to model; take children to the grocery store
o Building competencies: help the child develop a skill or activity
o Environmental cues: in terms of motivating a child, it is useful to leave novel and engaging
things around the house where the child can come upon them. Let the child find something
engaging
o Peers: have the home open to peers, who exert a strong influence on a child.

Better Resolutions Article

 People who have a lot to do are like a full suitcase. Tunneling: focusing on what's at hand rather than the
bigger picture
 The possibility of borrowing resulted in lower scores for the time-poor overall. A time-starved person
rushes to meet deadlines without taking time to order his affairs for the future
 The stress of scarcity leaves insufficient energy to pay attention to long-run decisions. The poor aren't
poor because they make bad financial decisions; it's the stress of poverty that leads to bad
financial decisions in the first place
 Take on fewer obligations; have occasional periods of self-reflection

Gaming and behaviorism Article

 Every game is designed around the player
 Contingencies and schedules: a set of rules governing when rewards are given out; players gain
experience the same way a rat is given food
 Ratios and intervals: with a fixed ratio it takes longer to level up each time. A variable ratio
produces the highest rates of activity (MapleStory drops). Interval schedules: instead of providing a
reward after a certain number of actions, reward after a period of time
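The ratio schedules described in the article are just rules deciding when a response earns a reward. A minimal sketch (all parameters hypothetical, not from the article):

```python
import random

# Sketch of the reward rules described above (hypothetical parameters):
# a fixed ratio pays off after a set count of responses, while a variable
# ratio pays off unpredictably with the same average requirement.

def fixed_ratio(n):
    """Return a respond() function that rewards every n-th response."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True    # reward delivered
        return False
    return respond

def variable_ratio(n, rng=random.Random(0)):
    """Reward each response with probability 1/n (an average of n per reward)."""
    def respond():
        return rng.random() < 1 / n
    return respond

fr5 = fixed_ratio(5)
rewards = [fr5() for _ in range(20)]
print(sum(rewards))    # 4 rewards in 20 responses, exactly every 5th
```

On the variable-ratio side, the timing of each reward is unpredictable, which is what the article credits for producing the highest response rates.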
 Special cases: chain schedules have multiple stages to the contingency.
 Extinction causes frustration. Behavioral contrast: doing a simple task for a rock is fine, but after
getting gold you get mad at the rock, even though the rock was OK before.
 Avoidance: participants work to keep things from happening. A simple lab rat in a cage with a small
shock: if it presses the lever the shock won't happen. It presses the lever at a slow, steady rate
 Recipes: how to make players play hard: a variable ratio schedule; how to make them play forever:
behavioral momentum
 Quitting: players quit when the prize isn't worth it

Chapter 11 Biological Dispositions in Learning

 People are more readily afraid of events that have some type of evolutionary association with
danger, such as with snakes rather than cars. This innate tendency for an organism to more
easily learn certain types of behaviors or to associate certain types of events with each other is
known as Preparedness.
 Preparedness and Conditioning
o Fear conditioning is one form of classical conditioning in which preparedness plays an
important role
 Taste aversion conditioning: a form of classical conditioning in which a food item
that has been paired with gastrointestinal illness becomes an aversive stimulus. You don't like
a food after it makes you feel bad
 Even though it may not be the food (it could be the flu), the person still dislikes the
food. After making a rat sick with injections, the rat doesn't want sweet water
anymore
 Sweet water (NS): X-ray (US)  Nausea (UR)
 Sweet Water (CS)  Nausea (CR)
 Stimulus generalization: occurs when a food tastes similar to the aversive item and
is therefore perceived as aversive. Hate one fish, hate all fish.
 Can be extinguished if the aversive food item is repeatedly ingested without
illness
 Overshadowing: an aversion develops to the stronger-tasting food rather than the
mild-tasting food (onions  potatoes).
o The presence of a food that already has aversive associations can block the
development of aversive associations to other foods.
o Eat peas (which you hate) along with bad fish, and you will think you got sick
because of the peas
 Latent inhibition: more likely to associate a relatively novel item, such as an
unusual liquor, with sickness than familiar items such as beer.
o Ex: a rat that encounters poison and becomes ill will not go for the poison again;
it eats only a little of a novel food and then moves on to food it has already eaten.
 Differences in taste aversion
o The Formation of Associations over Long Delays
 The NS and US must normally occur in temporal proximity for classical conditioning to
work.
 Taste aversions, however, can develop with delays of up to 24 hrs; this protects the animal
from repeatedly eating a poisonous food
o One-Trial Conditioning
 A strong conditioned taste aversion can be achieved with only a single pairing,
much like fear conditioning. This minimizes the chance of repeated exposure
o Specificity of Association
 When you feel nauseous, you associate it with the meal; the rat associates feeling ill with the
food, not the injection
 We associate illness with food rather than with anything else. This is called CS-US relevance: a
tendency to associate certain types of stimuli with each other
 A rat conditioned with a sound paired with sweet water: after an injection (illness) it
goes for the normal water; after a shock it still goes for the sweet water
 Physical pain becomes associated with visual and auditory stimuli; nausea with taste
 Quail and mice were made ill after drinking dark blue sour water, then given a choice between
dark blue water and sour water. The mice drank the dark blue water (their aversion was to the
taste), while the quail drank the sour water (they use vision to find food, so their aversion was to
the color). Rats are nighttime feeders and rely on smell and taste
 Women are better at detecting odors and form more taste aversions. During pregnancy,
these taste aversions help prevent ingesting anything harmful.
 Preparedness in operant conditioning
o Song can be used to train birds to go to certain areas, and food to train them to peck at certain
things. Biological preparedness plays a role in operant conditioning
o It is easier for a bird to learn to fly away to avoid shock than to peck something to avoid shock
o An aversive stimulus elicits a species-specific defense reaction (SSDR), which in the natural
environment is often effective in countering danger
 Rats run or freeze in painful situations, because these responses are naturally brought out
in the face of danger.
 They may even stand still and get shocked
 Operant-Respondent Interactions
o Two further examples of overlap between operants and respondents: instinctive drift and
sign tracking
o Instinctive drift:
 The Brelands trained animals; the pig tossed the coin around rather than depositing it in the
bank, and the same happened with a raccoon
 Fixed action patterns emerged to interfere with the operant behavior patterns.
The coin became associated with food, and the raccoon started rubbing it as it
would a shellfish
 Coin (SD)  Deposit in bank (R)  Food (SR)
 Coin (NS): Food (US)  Rooting (UR)
 Coin (CS)  Rooting (CR)
 As the classically conditioned response increased in strength, it overrode the operantly
conditioned response
 Instinctive drift: an instance of classical conditioning in which a genetically based FAP
gradually emerges and displaces the behavior that is being operantly conditioned
o Sign Tracking
 The light is not only associated with food but also acquires appetitive properties.
 In sign tracking, an organism approaches a stimulus that signals the
presentation of an appetitive event. The approach behavior seems very much
like an operant behavior because it appears to be quite goal-directed, yet the
procedure that produces it is more closely akin to classical conditioning
 Ex: a light is presented before the food cue; the dog salivates at the sight of the light, walks
over to the light, and starts displaying food-related behaviors toward it.
 Autoshaping: a type of sign tracking in which a pigeon automatically pecks at a
key because the key light has been associated with the delivery of food.
 A behavior that starts off as an elicited behavior is transformed into an
operant behavior. The pigeon initially pecks the key because the key light
predicts food; later it pecks the key in order to obtain food
 When a bird pecked a key associated with water, it did so with eyes closed
and beak open, as if drinking: it was trying to drink the key associated with
water and eat the key associated with food
 They peck even when this results in a loss of food
 The key light exerts such strong control over behavior that it overrides the
negative punishment. Sign tracking despite the resultant loss of reinforcers is called
negative automaintenance
 Adjunctive behavior
o Instinctive drift and sign tracking represent two types of anomalous behavior patterns
that develop during operant conditioning procedures
o Adjunctive behavior: an excessive pattern of behavior that emerges as a by-product of an
intermittent schedule of reinforcement for some other behavior. As one behavior is
being strengthened, another emerges as a side effect of the procedure; also called
schedule-induced behavior.
o Basic Procedure and Defining Characteristics
 Falk was the first to investigate this: rats trained to press a lever also began drinking
excessive amounts of water.
 In a 3-hr session they drank 3 times the normal amount of water
 Schedule-induced polydipsia: the excessive drinking developed rapidly, beginning in the
first session and firmly established by the second session
 Studies of adjunctive behavior typically employ an FI or FT schedule
 Adjunctive behavior occurs during the interreinforcement intervals, the time when the
reinforcer is not available on the schedule (low or zero probability of
reinforcement)
 Pigeons exposed to FI or FT schedules began attacking a nearby pigeon, or a picture or
stuffed model of a pigeon. Adjunctive behaviors can be generated without necessarily using
food as the reinforcer; electrical stimulation of pleasure centers can produce the adjunctive
behavior of eating
 According to Falk, adjunctive behavior has several distinguishing features
o It typically occurs in the period immediately following consumption of the intermittent reinforcer
 The rat eats the food delivered, then drinks water. The start of the interval tends to be
dominated by drinking; the later end is dominated by lever pressing
o Adjunctive behavior is affected by the level of deprivation for the scheduled reinforcer
 The greater the level of deprivation, the stronger the adjunctive behavior. Greater food
deprivation = more drinking
o Adjunctive behaviors can function as reinforcers for other behaviors
 High-probability behaviors serve as effective reinforcers for low-probability behaviors.
A rat that presses a lever for food will also press a lever for water
o There is an optimal interval between reinforcers for the development of an adjunctive behavior
 Rats will drink less with a 5-second interval than with a 180-second interval
 Adjunctive behavior in humans
o Adjunctive processes may contribute to the development of drug and alcohol abuse.
o Humans produce adjunctive behavior too, just not as severe. People exposed to fixed interval
schedules of monetary reinforcement display an increased tendency to drink water. The LENGTH
of the interval between reinforcers was an important variable.
o Drinking beer and smoking happen in the period after a reinforcer
o Adjunctive processes encourage an individual to frequently consume an addictive substance,
with the person eventually becoming addicted to it
 Adjunctive behavior as displacement Activity
o Adjunctive behaviors may represent a type of displacement activity: an irrelevant activity
displayed by animals when confronted by conflict or thwarted from attaining a goal.
 Maybe the animal is frustrated
o Displacement activities serve 2 purposes
 First: they provide a more diversified range of behaviors in a particular setting, which is
often beneficial. Picking at twigs can cause birds to uncover more food.
 Second: they help the animal remain in a situation where a significant reinforcer might
eventually become available. Pecking the ground gives the bird something to do while it
waits for the emergence of the bug
 Adjunctive behavior can be viewed as reflecting a lack of self-control. People who give up an addiction are
likely to seek out a replacement.
 Activity Anorexia
o One type of behavior generated as adjunctive behavior is wheel running
o Activity anorexia has some important implications for people who are undertaking a
diet and exercise program to lose weight: food restriction causes more running
o Basic and defining characteristics
 A rat allowed access to food during a 1.5-hr feeding period, with access to a running wheel
during the 22.5-hr interval between meals, will spend an increasing amount of time
running during that interval
 The more they run, the less they eat; the less they eat, the more they run
 A feedback cycle in which the two behavioral tendencies amplify each other: running
increases and eating decreases. Within a week it is all running and no eating; if the process
continues, the rats will die
 Activity anorexia: a high level of activity and low level of food intake generated
by exposure to a restricted schedule of feeding. Rats with no wheel available but restricted
food do just fine, and rats with a wheel but no food restriction display only a moderate level
of running and no tendency toward self-starvation
o Comparisons with Anorexia Nervosa
 Anorexia nervosa is a psychiatric disorder in which the patient refuses to eat an
adequate amount of food and loses so much weight as to require hospitalization.
 Anorexia comes with a high level of activity: exercise programs and a severe sort of
restlessness.
 An increase in activity follows a decrease in food intake, and a decrease in food intake
follows an increase in exercise.
 Athletes and dancers have a higher chance of becoming anorexic
 People (and rats) still enjoy their food and take time to eat it even when anorexic.
 The rat is physically restricted from accessing food; humans impose the diet on themselves
even when food is available. The free availability of food may be more apparent than real.
 Bulimia: binge on food, then purge by vomiting or taking laxatives.
o 2 types of anorexia: restricting type, simple food restriction, and
o binge-eating/purging type: dieting combined with binging
 Underlying Mechanisms
o Endorphins play a role in anorexia: morphine-like substances in the brain that help with pain
reduction and are implicated in feelings of pleasure (the runner's high)
o Anorexia can be stopped by suddenly providing a high level of food
 Clinical Implications
o Activity anorexia has several clinical implications. Behavioral treatments for anorexia should
focus as much on establishing normal patterns of activity as on establishing
normal patterns of eating. Endorphin blockers can be used to reduce the pleasure of over-activity
o The activity anorexia model suggests that people who are dieting should eat several small meals
per day as opposed to one large meal, should increase their exercise level slowly, and should
ensure that meals are balanced
 Behavior Systems Theory
o Biological dispositions play a strong role in conditioning.
o Behavior systems theory: an animal's behavior is organized into various motivational
systems: feeding, mating, avoiding predators, and so forth. Each of these systems
encompasses a set of relevant responses and can be activated by particular cues. Different
systems overlap, such that a response typically associated with one system may be
instigated by another system
o The feeding system in the rat: when the rat is hungry, it engages in food-related responses
(salivating, chewing, etc.). When food is about to come, it salivates; if food is not yet here, it
searches the cage; if food is even farther off, it runs and explores
o Lever pressing and mazes work well because rats have evolved to run along enclosed
spaces and to use their forepaws to pick up food. Manipulating something such as a lever is
natural
o Behavior systems theory assigns an important role to the environmental cues that determine
which behavior will be activated.
 A chamber in which a stimulus rat could be mechanically inserted and withdrawn
 When the stimulus rat was inserted just before the delivery of food, it became a CS for
food. The participant rat would approach the stimulus rat and engage in social behavior:
sniffing its mouth, pawing, and grooming
 Rats pay close attention to other rats, who may steal food. The stimulus rat brought out
the social component of the feeding system in the participant rat. A wooden block did not
bring out the social component; a rolling marble brought out clawing and gnawing.
 A dog that bites when you take its food is asserting dominance; this fits with
behavior systems theory

Chapter 12 : Observational learning language and Rule-Governed Behavior

 Processes that allow us to acquire new behavior patterns through indirect means.
 Observational or Social Learning
o We watch what people do and follow suit; this is called observational learning
 The behavior of a model is witnessed by an observer, and the observer's behavior is
altered. Observational learning is a social process; humans are social beings and acquire many
new behavior patterns this way, which is why it is often referred to as social
learning. A social situation can change behavior.
 It overlaps with classical and operant conditioning. Two elementary forms of social influence:
 Contagious behavior and stimulus enhancement
 Contagious Behavior and Stimulus Enhancement
o Contagious behavior: a reflexive behavior triggered by the occurrence of the same behavior in
another individual. Everyone yawns because of one person
o It takes only one startled duck to get the rest to fly off. Laugh tracks help people laugh
o Orienting responses are also contagious: we orient ourselves toward stimuli, and we also
orient ourselves in the direction others have oriented. This works between dogs and humans:
they detect things we don't detect
o Stimulus enhancement: the probability of a behavior is changed because an individual's
attention is drawn to a particular item or location. A girl goes for candy, causing you to go
for candy because she noticed it. The incentive is enough.
o When animals of the same species come across a scent, it is enough to make them pay
attention and find food.
o Behavioral contagion and stimulus enhancement are examples of social influence, but only
rudimentary forms; they result in a momentary change in
behavior.
 Observational Learning in Classical Conditioning
o Observational learning is involved in the development of classically conditioned responses.
The stimuli involved are usually emotional in nature.
o Various emotional responses: classically conditioned emotional responses can result from
seeing emotional responses exhibited by others; this type of conditioning is called
vicarious emotional conditioning
 It takes place in 2 ways. First: expressions of fear in others may act as
unconditioned stimuli that bring out fear in oneself
 We quickly learn which events are dangerous.
 Jellyfish (NS): Fear in others (US)  Fear in oneself (UR)
 Jellyfish (CS)  Fear in oneself (CR)
 Second: the look of fear in others can itself become a CS through pairing with frightening events:
 Look of fear in others (NS): Frightening event (US)  Fear in oneself (UR)
 Look of fear in others (CS)  Fear in oneself (CR)
 Jellyfish (NS2): Look of fear in others (CS1)  Fear in oneself (CR)
 Jellyfish (CS2)  Fear in oneself (CR)
 After watching happy children play with a toy, the observing child wants to play too.
 Observational Learning in Operant Conditioning
o Acquisition and performance
 We may acquire the basic information needed to drive long before we are of age to actually
perform driving
o Acquisition
 Requires that the observer pay attention to the behavior of the model. Observers are very
sensitive to the consequences of the model's behavior: if the behavior is reinforced, the
observer may perform the behavior.
 Second: whether the observer receives reinforcement for attending to the model. Teaching
is based on this principle
 Third: whether the observer has sufficient skills to benefit from modeling.
Playing chopsticks vs. Beethoven is way different.
 The personal characteristics of a model can strongly influence the extent to which
we attend to their behavior. We attend to models that resemble us (for
example, people who dress the same) and to people we respect
o Performance: reinforcement and punishment modify behavior in modeling
situations in three ways
 1st: We are more likely to perform a modeled behavior when we have observed the
model experience reinforcement for that behavior
 Vicarious reinforcement: the model wears a perfume that attracts people of the
opposite sex, so you wear the same perfume
 Telling the same joke that made people laugh for someone else
 2nd: We are more or less likely to perform a modeled behavior when we
ourselves will experience reinforcement or punishment for the behavior
 Tell a joke that people like, keep telling it; if they frown, stop telling it
 3rd: Our own history of reinforcement or punishment for performing modeled
behaviors
 What is appropriate to model, whom it is appropriate to model, and
when it is appropriate to perform behaviors that have been modeled by
others
 Imitation
o Often used interchangeably with observational learning. True imitation is a form of
observational learning that involves close duplication of a novel behavior.
o Ex: if one woman gets past a guard by flirting and Kelly then flirts in a different way, it
is not true imitation; if she flirts in the same way, true imitation has taken place.
o Children generally imitate a model.
o Generalized imitation: the tendency to imitate a new modeled behavior with no specific
reinforcement for doing so.
 By reinforcing the imitation of some behaviors, therapists can produce in these
children a generalized tendency to imitate.
 Can Animals Imitate?
o Kittens better imitate their mother because of stimulus enhancement: they pay more
attention to what she interacts with, following her and manipulating the box.
o In one experiment, pigeons were more likely to do what they had seen.
o Adult imitation is more similar to that of monkeys than to that of children, even though
children are the imitators.
o There is debate over whether animals can really imitate; orangutans are able to copy.
 Social Learning and Aggression
o Children observed adult models behaving aggressively toward a Bobo doll. Children who
observed the violence toward Bobo replicated the behavior, performing the same motor
movements toward the same target.
o Children who heard disapproving comments about the fighting produced far fewer aggressive
behaviors, but only when the disapproving adult was present.
o In the adult's absence, aggression increased.
 Social Learning and Media Violence: From Bobo Doll to Grand Theft Auto
o Filmed violence was as effective as live violence for inducing violent behavior in
observers.
o From Pac-Man to Grand Theft Auto: the amount of violent media viewed in childhood is
correlated with aggressive and antisocial behavior 10 years later, and is significantly
related to adult criminality.
o The amount of TV watched in childhood is correlated with the amount of aggressive,
violent behavior toward others.
o Males are more aggressive than females and hold a more hostile view. Girls tend to inhibit
violence, but demonstrate a higher frequency of aggression toward other girls than toward guys.
o Desensitization to violence can lead females to feel that violence and aggression are
normal aspects of life, which could lead them into violent relationships.
o Some people may view violence and never grow violent. Exposure to violent media can also
increase the likelihood of becoming a victim of violence.
 Language
o Written, spoken, symbolic.
o Humans are born with a "black box" that helps us acquire language.
o The ability to use arbitrary symbols is called reference.
o Grammar: a set of rules that controls the meaning of a string of words. Monkeys have
different calls for different predators.
o Productivity: there is a finite number of rules for any language, but once learned, an
infinite number of expressions can be generated.
o Situational freedom: language can be used in a variety of contexts and is not fixed to a
particular situation.
 Can Animals Talk?
o Cross-fostering: so called because chimpanzees were raised in human foster homes.
 Sign Language Experiments
o Apes are able to produce and understand other forms of language, such as American
Sign Language.
o Project Washoe: a chimpanzee raised by two scientists, who used modeling
(demonstrating the sign for "open") and a technique called molding, placing the ape's
hand in the correct signing position so it could associate the sign with its meaning.
o Strictly controlled tests of language use were performed with many of the chimpanzees
trained in sign language. All of the chimps seemed to pass the reference test.
 Artificial Language Experiments
o Chimpanzees lived in laboratories and conversed via an artificial language, Yerkish, whose
easy grammar and vocabulary covered the basic features of language. The artificial and
highly controlled surroundings made systematic assessment relatively easy, though the
animals could have just memorized the sequence "Please machine give X."
o Dolphins are social and easier to train. They have been trained on a symbolic language;
researchers have worked with two dolphins, which understand a bit of grammar.
 Rule-Governed Behavior
o The use of language enhances our ability to interact with one another.
o Rule: a verbal description of a contingency; in a given setting, performing a behavior will
bring about a certain consequence.
o Rule-governed behavior: behavior generated through exposure to rules.
o Rules or instructions are extremely useful for rapidly establishing appropriate patterns of
behavior.
o We follow the rules we are given in order to behave effectively in certain settings.
o We develop a generalized tendency to follow instructions because doing so produces good results.
 Some Disadvantages of Rule-Governed Behavior
o Rule-governed behavior is less efficient than behavior directly shaped by the natural
contingencies: you must devote time to playing golf, not just read the instructions.
o A second drawback: such behavior is sometimes surprisingly insensitive to the actual
contingencies of reinforcement operating in a particular setting, as when a player stays
locked into the notion of a certain golf swing.
 Personal Rules in Self-Regulation
o The advantages outweigh the disadvantages.
o Personal rules: verbal descriptions of contingencies we present to ourselves, e.g., "go to the library."
o Say-do correspondence: a close match between what we say we are going to do and what
we actually do; carrying out promises.
o We need to specify when, where, and how a goal is to be accomplished.
o Personal process rules: personal rules that indicate the specific process by which a task
is to be accomplished.
Watson: This man founded the psychological approach of behaviorism. He was also known for his unethical "Little Albert" experiment.

Skinner: This man is known as the founder of radical behaviorism and advocated a pure behaviorist approach.

Behaviorism: *This is the school of psychological study that focuses on actions and treats the mind like a black box.

Little Albert Experiment: This was an experiment in which a toddler was classically conditioned to fear small rodent-like creatures through association with a loud, unpleasant sound.

Consciousness of Mind: This is the school of psychological study that focuses on the thought processes and mental experiences of an individual.

Reflexive Behaviors: *These are behaviors with mechanistic causality; they occur automatically in response to the environment.

Operant Behaviors: *These are behaviors that are affected by consequences; we perform them in order to achieve a desired effect.
Experimental Research: This is research that provides an explanation.

Descriptive Research: This is research that does not provide an explanation but merely describes.

Independent Variable: **This is the factor in an experiment that is manipulated to determine its effect.

Dependent Variable: **This is the factor in an experiment that changes in response to a manipulated variable.

Unlearned, Invariant, Universal, Adaptive: These are the four criteria for innate behavior.

Reflexes: This is a kind of innate behavior that is a simple one-step response to something (such as a knee-jerk reaction).

Fixed Action Pattern: *This is a sequence of actions that is elicited by a specific stimulus.

Static Environment: Innate behaviors provide an advantage here.

Dynamic Environment: The ability to learn provides an advantage here.
Habituation: **This is a loss of response to a stimulus after repetition.

Sensitization: **This is a heightening of response to a stimulus after repetition.

Spontaneous Recovery: **This is the resurgence of a habituated response after a break.

Dishabituation: **This is the restoration of a response that has been weakened by habituation, following a single novel stimulus.

Higher Intensity: **This leads to sensitization.

Lower Intensity: **This leads to habituation.

Moderate Intensity: This leads to sensitization, then habituation.

Classical Conditioning: *This is learning done by association; two stimuli are paired, one of which predicts the other and elicits a response mirroring the response to the predicted stimulus.
Unconditioned Stimulus: *This is a stimulus that elicits a response by default.

Conditioned (Neutral) Stimulus: *This is a stimulus that in and of itself is unrelated to a subject but becomes relevant through pairing with a stimulus that elicits a response by default.

Unconditioned Response: *This is an innate reaction to a stimulus.

Conditioned Response (CR): *This is a response to a stimulus that a subject originally had no interest in, until after it was paired with a stimulus that elicited an innate response.

Pavlov's Dogs: *This is accepted as the first experiment regarding classical conditioning.

Percentage and Magnitude of CRs: These are the two measures of how effective classical conditioning is.

Latency of CR: This is the time between a conditioned stimulus and a conditioned response; it typically decreases with successive trials.

Conditioning Trial: **This is a trial in which there is a conditioned stimulus to unconditioned stimulus pairing.

Test Trial: **This is a trial in which a conditioned stimulus is presented without an unconditioned stimulus, at intervals between conditioning trials, in order to test how strongly a subject has been conditioned thus far.
Extinction: **This happens when a conditioned stimulus is repeatedly presented without an unconditioned stimulus, thus extinguishing the conditioned response.

Disinhibition: **This is the recovery, during extinction, of a conditioned response due to a similar but novel stimulus.

Spontaneous Recovery: **This is the resurgence of a conditioned response after extinction.

Excitatory Conditioning: This is conditioning that occurs when the probability of an unconditioned stimulus occurring is greater than not at the presentation of a conditioned stimulus.

Inhibitory Conditioning: This is conditioning that occurs when the probability of an unconditioned stimulus occurring is less than not at the presentation of a conditioned stimulus.

Latent Inhibition: **This refers to the fact that it is easier to condition a novel stimulus than one that has already taken on meaning for a subject.

Higher-Order Conditioning (Second-Order, Third-Order, etc.): **This is conditioning that first ties a conditioned stimulus to an unconditioned stimulus and then ties another conditioned stimulus to the first conditioned stimulus (and so on, any number of times).

Sensory Preconditioning: **This is conditioning in which two stimuli are first presented together with no relevance to one another. Second, one of the stimuli is paired with an unconditioned stimulus and conditioned to produce a conditioned response. Lastly, upon presentation of the other stimulus from the first stage, that stimulus should also elicit the conditioned response.

Blocking: **This is when conditioning has already occurred to one stimulus and thus prevents further conditioning to a new stimulus presented in compound with it; the second stimulus is made irrelevant.

Overshadowing: **This is when two stimuli are presented in compound with an unconditioned stimulus and only the stronger of the two is conditioned (e.g., excessively bright lights paired with a very soft ring, where only the bright lights become a conditioned stimulus).
Delayed (the conditioned stimulus comes immediately before the unconditioned stimulus): **This is the most effective timing for conditioning.

Backwards (the conditioned stimulus comes after the unconditioned stimulus): **This is the least effective timing for conditioning.

Pavlov's Stimulus Substitution Theory: This is the theory that the CS replaces the US and that the CR and UR are essentially identical.

Siegel's Compensatory CR Theory: This is the theory that an individual who performs a procedure multiple times begins to counteract the procedure's primary effects before going through the procedure (such as weakening drug effects for an addict).
Generalization: *This is when similar stimuli are broadly registered as relevant.

Discrimination: *This is when similar stimuli are rejected as being different from a noted stimulus due to slight differences.

Flooding: *This is a method of eliminating a phobia by inundating an individual with the stimulus that causes their terror until they no longer fear that stimulus.

Experimental Neurosis: This is caused by the inability to discriminate between stimuli that predict good outcomes and stimuli that predict bad ones. It causes stress and anxiety in the subject.

Systematic Desensitization: *This is a method of eliminating a phobia by ranking levels of discomfort in situations with progressively stronger interaction with the phobia, and moving up these ranks as they stop causing a phobic response.

Aversion Therapy: *This is a method of preventing an individual from continuing an undesirable habit by pairing the habit with something undesirable (for laughs and a great example: http://www.youtube.com/watch?v=1aFji7aJP7o).

Phobia: *This is a maladaptive fear.

Counterconditioning: *This is the learning of a new, incompatible association to a conditioned stimulus that negates a previously conditioned association (useful in extinguishing phobias).

Latent Learning: *This is a form of learning that is not immediately expressed in an overt response.
Law of Contiguity: This is the principle that when two ideas or events are perceived close in time or association, it is probable that they will appear close in time or associated again. This is a founding principle of conditioning.

Empiricism: This is the principle that all knowledge is derived from sense and observation.

Nativism: This is the principle that certain skills are hardwired into creatures.

Social Learning Theory: This posits that people learn within a social context through concepts such as modeling and observation.

Reciprocal Determinism: This is the theory that proposes a person's behavior both influences and is influenced by one's environment.

Stimulus-Response Theory: This is the basis for classical conditioning; it is another way of noting the link between the environment and an individual's reaction.

Control Group: *A group on which no experimentation is performed; serves as a baseline for an experimental group.
Reversal Design: This is the process of establishing a baseline, then manipulating an independent variable and measuring changes in the dependent variable.

Satiation: This is the decreasing effectiveness of a reinforcer due to excess repetition of it.

Deprivation: This is the withholding of a reinforcer in order to increase its effectiveness.

Spatial Contiguity: This is when two things appear near each other in location.

Temporal Contiguity: This is when two things appear near each other in time.

Contingency: This is when the order of experiences is consistent. It is important for conditioning, for example, that the conditioned stimulus comes prior to the unconditioned stimulus on each trial in order to serve as a reliable predictor.

Opponent Process Theory: *This postulates that experiences are divided into pleasant and unpleasant and have an aftereffect that mirrors the original experience and gets stronger with each iteration. An example of this is addiction, where the first experiences are very enjoyable, but further doses produce a less and less desirable high.

Acquisition: *The strengthening of a conditioned response.

Compound Stimulus: A complex stimulus consisting of the presentation of two or more stimuli (such as a light and a tone).
External Inhibition: This is a decrease in the strength of a conditioned response due to the simultaneous occurrence of a novel stimulus with the conditioned stimulus.

Occasion Setting: This is a procedure in which a stimulus (such as an environment or ambience, like a party) signals that a conditioned stimulus will be followed by the unconditioned stimulus with which it is associated.

Pseudoconditioning: This is when a response seems like a conditioned response but is really merely a result of sensitization.

Semantic Generalization: This is the generalization of a conditioned response to verbal stimuli synonymous with the conditioned stimulus.

Unconditioned Stimulus Revaluation: This is the presentation of an unconditioned stimulus at a different intensity than during conditioning, in order to elicit a stronger or weaker conditioned response.
Overexpectation Effect: This is when two conditioned stimuli are conditioned separately for the same unconditioned stimulus. When both conditioned stimuli are then presented at the same time in tandem with the unconditioned stimulus, the effect is actually weaker.

Preparatory Response Theory: This proposes that the conditioned response prepares an organism for the unconditioned stimulus.

Reciprocal Inhibition: This is a technique in behavior therapy in which a response is inhibited by conditioning a new response that is incompatible with the maladaptive response.

Classical Conditioning: This is the associating of reflex responses with new stimuli.

Operant Conditioning: This is the associating of voluntary behaviors and cognitive responses with new stimuli.

Primary Reinforcer: This is a natural reinforcement; no learning is required to view it as positive and desirable (e.g., food, water, sex).
Secondary Reinforcer: This is a reinforcer that has been associated with a natural reinforcement through conditioning. (A primary reinforcer can be conditioned to be a secondary reinforcer as well.)

Shaping: This is a process of operant conditioning in which a behavior is modified in steps until it matches a desired response (think of the prof's cat and trying to train it to use a toilet).

Positive Reinforcement: This is when something desirable is introduced and increases the probability of a behavior happening.

Negative Reinforcement: This is when something undesirable is taken away and increases the probability of a behavior happening.

Positive Punishment: This is when something undesirable is introduced and decreases the probability of a behavior happening.

Negative Punishment: This is when something desirable is taken away and decreases the probability of a behavior happening.
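The four consequence types above form a 2x2 grid: whether a stimulus is added or removed, crossed with whether the target behavior becomes more or less likely. A minimal sketch (the function name is illustrative, not standard terminology):

```python
# Classify an operant consequence from the 2x2 grid described above.
# "Positive"/"Negative" = stimulus added/removed;
# "Reinforcement"/"Punishment" = behavior increases/decreases.

def classify(stimulus_added: bool, behavior_increases: bool) -> str:
    kind = "Reinforcement" if behavior_increases else "Punishment"
    sign = "Positive" if stimulus_added else "Negative"
    return f"{sign} {kind}"

print(classify(True, True))    # Positive Reinforcement
print(classify(False, True))   # Negative Reinforcement
print(classify(True, False))   # Positive Punishment
print(classify(False, False))  # Negative Punishment
```

Note that "positive" and "negative" refer only to adding or removing a stimulus, never to whether the consequence is pleasant.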
Learned Helplessness: This is the condition of failing to respond, even when opportunities exist to avoid unpleasant circumstances or gain positive rewards, due to a perceived absence of control over situations.

Avoidance Behavior: This is a side effect of punishment in which the subject tries to stay away from the source of punishment.

Contingency: This is when one event is dependent on another.

Contiguity: This is when one event occurs near another event or stimulus, spatially or temporally.

Deprivation: This is the denial of a particular stimulus in order to increase its perceived value to a subject.

Standard: This is something that is exchangeable for only one type of primary reinforcer.

Generalized Conditioned Reinforcer: This is something that can be exchanged for multiple reinforcers. It possesses the advantages of being cheaper, immediate, and not being tied to any specific satiation or deprivation.

Extrinsic Reinforcer: This is a reinforcer that comes from outside the actual performance of a behavior.

Intrinsic Reinforcer: This is a reinforcer that comes simply from performing a behavior.
Continuous Schedule: This refers to when every response is reinforced.

Intermittent Schedule: This refers to when not every response is reinforced.

Ratio Schedule: This refers to when reinforcement depends on the number of times or degree to which a response is exhibited.

Interval Schedule: This refers to when reinforcement depends on how much time passes.

Fixed Schedule: This refers to when reinforcement is predictable based on the number of responses or time passed.

Variable Schedule: This refers to when reinforcement is unpredictable based on the number of responses or time passed.

Fixed Ratio Schedule: This has a response pattern that is high until the reinforcement is presented, after which there is a pause.

Fixed Interval Schedule: This has a response pattern that starts low and increases closer to the reinforcement, after which there is a pause.

Variable Ratio Schedule: This has a response pattern that is extremely high with no pauses.

Variable Interval Schedule: This has a response pattern that is fairly high with no pauses.
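A minimal sketch of how the four simple schedules above decide whether a response earns reinforcement. The ratio and interval values are illustrative assumptions, not from the course:

```python
# Toy decision rules for the four simple schedules; the parameter
# values (10 responses, 60 seconds) are hypothetical examples.
import random

def fixed_ratio(n_responses, ratio=10):
    # FR: reinforce after a predictable number of responses
    return n_responses >= ratio

def variable_ratio(mean_ratio=10):
    # VR: each response has a 1-in-mean_ratio chance of paying off,
    # so the required count is unpredictable
    return random.random() < 1.0 / mean_ratio

def fixed_interval(elapsed_seconds, interval=60.0):
    # FI: the first response after a predictable delay is reinforced
    return elapsed_seconds >= interval

def variable_interval(elapsed_seconds, mean_interval=60.0):
    # VI: the delay before reinforcement becomes available is unpredictable
    return elapsed_seconds >= random.expovariate(1.0 / mean_interval)

print(fixed_ratio(10))       # True: the 10th response is reinforced
print(fixed_interval(59.0))  # False: too early on a 60-second interval
```

The fixed rules are predictable, which is why they produce post-reinforcement pauses; the variable rules are unpredictable, which is why responding stays steady.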
Superstition: This is a false association of a behavior with a stimulus.

Hull's Drive-Reduction Theory: This is the theory that stimuli which reduce biological needs or reduce aversive stimuli become reinforcers (e.g., drive = hunger → act = press lever → drive reduced = receive food).

Premack's Principle: This is the theory that more probable behaviors will reinforce less probable behaviors (i.e., in order to condition someone to do homework, the instrumental behavior, you have to provide a contingent behavior, such as eating candy, as a reward).

Response Deprivation: This suggests there is a baseline at which people, without the restrictions of reality, will perform certain behaviors. By preventing an individual from performing those activities at his or her baseline, one can use them as a reinforcing activity to get that individual to perform a conditioned behavior.

Bliss Point: This is the point at which the utility of consumption is maximized.
Operant Extinction: This is what happens when one stops reinforcing a behavioral response, ultimately reducing the behavior's probability to zero.

Differential Reinforcement of Other Behavior (DRO): This is when a conditioner reinforces any behavior except a targeted undesired behavior.

Differential Reinforcement of Incompatible Behavior (DRI): This is when a conditioner reinforces behavior(s) that cannot be performed at the same time as a targeted undesired behavior.

Extinction Burst: This is when a subject exhibits a high rate of a behavior that once elicited reinforcement, but no longer does, in an attempt to receive that reinforcement.

Spontaneous Recovery: This is the resurgence of an extinguished response after a long amount of time has passed.

Resistance to Extinction (RTE): This refers to a subject's susceptibility to extinction.

High Resistance to Extinction: Intermittent schedules have this with regard to extinction.

Low Resistance to Extinction: Continuous reinforcement schedules have this with regard to extinction.
Discriminative Stimulus: In the presence of this, behaviors are reinforced; in its absence, behaviors are not reinforced.

Stimulus Generalization: This is the tendency for a response to be elicited by a stimulus similar to a discriminative stimulus.

Discrimination Training: This is the reinforcement of responding to one stimulus and not another (e.g., yellow key: pecks earn food; blue key: pecks don't earn food; casual dress among friends merits social approval, but with an interviewer does not).

Discriminative Stimulus for Extinction: This is a stimulus that signals the absence of reinforcement.

Intradimensional: This is the kind of discrimination training in which the discriminative stimulus and the extinction stimulus are of the same kind (different colored lights).

Interdimensional: This is the kind of discrimination training in which the discriminative stimulus and the extinction stimulus are not of the same kind (a colored light and a tone).

Peak Shift: This refers to the phenomenon in which the extinction stimulus in discrimination training can shift the point of maximum responding so that it is no longer centered over the discriminative stimulus. It only occurs in intradimensional training.
Aversive Control: This is the implementation of undesirable stimuli to discourage unwanted behavior.

Escape: This is the removal of an aversive stimulus after it has already started occurring.

Avoidance: This is the prevention of an aversive stimulus before it has started occurring.

Signaled Avoidance: This is when a stimulus precedes an aversive stimulus, warning the subject.

Non-Signaled Avoidance: This is when no stimulus precedes an aversive stimulus to warn the subject.

Two-Factor Theory: This proposes that fear can be classically conditioned to teach a subject to avoid a signaled stimulus; the fear is classically conditioned and the avoidance is operantly conditioned.

One-Factor Theory: This proposes that preventing or escaping the undesired stimulus is the only noteworthy part of avoidance.
Reduces the behavior quickly and significantly (ideally immediately and completely): A punishment is considered to be effective if it can do this.

Intensity and Immediacy: These are the two main factors that affect the strength of a punishment and how effective it is in reducing a behavior.

Less Effective: This describes how effective a punishment is if you start at low intensity and gradually build up.

More Effective: This describes how effective a punishment is if you start at high intensity.

Contingency, Frequency, Positive (as opposed to negative), No Discriminative Stimulus (to warn): These are four additional factors that affect the strength of a punishment and how effective it is in reducing a behavior.
Simple Schedule: This is a single reinforcement pattern.

Complex Schedule: This is a reinforcement pattern that combines two or more simple schedules.

Chained Schedule: This is a complex reinforcement pattern in which the goal is to acquire a terminal primary reinforcer.

Multiple Schedule: This is a complex reinforcement pattern in which each step has its own primary reinforcer.

Concurrent Schedule: This is when two or more independent simple schedules are available at the same time.

R1 / (R1 + R2): This is the formula for determining how strongly a subject prefers one concurrent interval schedule over another.

Herrnstein's Matching Law: This states that the preference for an interval option approximates its rate of reinforcement (i.e., if I like option 1 four times as much as option 2, I will use option 1 80% of the time).
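The two entries above can be checked directly with a quick sketch (the function name is a hypothetical label for the R1 / (R1 + R2) formula):

```python
# Matching-law prediction: the proportion of responses allocated to
# option 1 equals its share of obtained reinforcement, R1 / (R1 + R2).

def matching_proportion(r1: float, r2: float) -> float:
    return r1 / (r1 + r2)

# The example from the definition: option 1 reinforced four times as
# often as option 2 should attract 80% of responses.
print(matching_proportion(4, 1))  # 0.8
```

Equal reinforcement rates (R1 = R2) predict a 50/50 split; overmatching, undermatching, and bias, defined below, are systematic deviations from this prediction.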
Exclusive Preference for the Better Schedule: Concurrent fixed ratio schedules will result in this.

Overmatching: This is when the preferred schedule is chosen more often than predicted.

Undermatching: This is when the preferred schedule is chosen less often than predicted.

Bias: This is when an organism displays a preference for an option despite the reinforcement schedule.

Self Control: This describes the ability to determine that an option, even though delayed, is more worthwhile than a more immediate option.

Impulsivity: This describes the inability to determine that an option, even though delayed, is more worthwhile than a more immediate option.

Commitment: This is the act of contracting oneself to a later, larger benefit over a smaller, sooner one.
Differential Reinforcement of High Rates: This is when reinforcement is contingent on performing at least a certain number of responses.

Differential Reinforcement of Low Rates: This is when reinforcement is contingent on responding to a stimulus at a slow or low rate.

Differential Reinforcement of Paced Responding: This is when reinforcement is contingent on emitting responses at a set rate, neither too fast nor too slow.

Fixed Duration Schedule: This is a reinforcement pattern where a behavior must be held continuously for a predictable amount of time.

Variable Duration Schedule: This is a reinforcement pattern where a behavior must be held continuously for an unpredictable amount of time.

Fixed Time Schedule: This is a reinforcement pattern where behavior is irrelevant and reinforcement is given after predictable amounts of time have passed.

Variable Time Schedule: This is a reinforcement pattern where behavior is irrelevant and reinforcement is given after unpredictable amounts of time have passed.
Law of Effect: This states that a behavior is likely to increase when it elicits desirable outcomes and likely to decrease when it elicits undesirable outcomes.

Three-Term Contingency: This is the connection between a stimulus, a response, and the reinforcement or punishment that follows the response.

Goal Gradient Effect: This refers to the increased strength or efficiency of responses as a subject draws closer to a goal.

Fading: This refers to the gradual removal of a prompt in training a subject to elicit a certain behavior.
Terms Definitions

Preparedness: This is the term alluding to the biological disposition for an organism to respond in a certain way to a stimulus.

Species-Specific Defense Reaction: This is the way in which certain animals respond to aversive stimuli (birds fly away, humans and rats freeze up).

Instinctive Drift: This is when an innate animal behavior displaces a conditioned response.

Sign Tracking: This is when an organism tends to approach a pleasant conditioned stimulus that it has associated with something it desires.

Adjunctive Behavior: This is when a seemingly contingent behavior increases in frequency due to FT or FI schedules (rats drinking excessive amounts of water on such food schedules).

Displacement Activity: This is something often performed when an organism has to decide between two highly desired choices or is denied a single highly desirable action (pigeons will peck at the ground, humans will scratch their heads).

Activity Anorexia: This refers to high activity in response to restriction of food intake.
1 in 50 children: This is the proportion of children who are diagnosed with autism spectrum disorder.

5 times: This is how much more likely boys are than girls to be diagnosed with autism spectrum disorder.

Nonverbal Behavior, Peer Relationships, Spontaneous Sharing, Social Reciprocity: The diagnostic criteria for autism spectrum disorder include that at least 2 of these social deficiencies apply.

Delay in Language Development, Initiating or Maintaining Conversation, Repetitive Use of Language, Lack of Imaginative Play: The diagnostic criteria for autism spectrum disorder include that at least 1 of these communication deficiencies applies.

Restricted Range of Interest, Nonfunctional Routines, Repetitive Body Movements, Preoccupation with Parts of Objects: The diagnostic criteria for autism spectrum disorder include that at least 1 of these behavioral deficiencies applies.
Antecedent, Behavior, Consequence: In behavioral theory, these are the stimulus that elicits the child's behavior, the response by the child, and what maintains the behavior, respectively.

Prompt: After an antecedent or initial behavior, one can use these to push the child toward the desired behavior.

Pivotal Response Training: This seeks to improve motivation and the ability to respond to multiple cues, which in turn is believed to improve other behaviors.
Physical Cognition: This is the processing of information useful for the exploitation of tools.

Social Cognition: This is the processing of social information pertaining to other members of a community.

Social Learning: This is when an animal learns a behavior or information about a stimulus by watching another animal.

Demonstrator: In social learning, this is the animal originally performing a behavior.

Observer: In social learning, this is the animal watching a behavior and later reproducing all or part of it.

Imitation: This is the copying of a demonstrator's response when it leads to a reward.

Overimitation: This is the copying of excess steps a demonstrator takes in acquiring a reward.

Chimpanzees: These tend to imitate only the necessary steps in acquiring a reward such as food.

Human Children: These tend to overimitate unnecessary steps in acquiring a reward such as food.

Teaching: This is when a demonstrator deliberately modifies its behavior in the presence of a naive animal, at a cost or with no immediate benefit to itself, in order to help the animal learn faster.

Mimicry: This is the copying of a demonstrator's response when it doesn't lead to any obvious or immediate reward.