
BEHAV ANALYST (2017) 40:123–159
DOI 10.1007/s40614-017-0093-x

ORIGINAL RESEARCH

The Fuzzy Concept of Applied Behavior Analysis Research

Thomas S. Critchfield1 & Derek D. Reed2

Published online: 15 May 2017
© Association for Behavior Analysis International 2017

Abstract A seven-dimension framework, introduced by Baer, Wolf, and Risley in an iconic 1968 article, has become the de facto gold standard for identifying “good” work in applied behavior analysis. We examine the framework’s historical context and show how its overarching attention to social relevance first arose and then subsequently fueled the growth of applied behavior analysis. Ironically, however, in contemporary use, the framework serves as a bottleneck that prevents many socially important problems from receiving adequate attention in applied behavior analysis research. The core problem lies in viewing the framework as a conjoint set in which “good” research must reflect all seven dimensions at equally high levels of integrity. We advocate a bigger-tent version of applied behavior analysis research in which, to use Baer and colleagues’ own words, “The label applied is determined not by the procedures used but by the interest society shows in the problem being studied.” Because the Baer-Wolf-Risley article expressly endorses the conjoint-set perspective and devalues work that falls outside the seven-dimension framework, pitching the big tent may require moving beyond that article as a primary frame of reference for defining what ABA should be.

Keywords Applied behavior analysis · History · Historiography · Translation · Social validity

Before long, a half-century will have elapsed since the first issue of Journal of Applied
Behavior Analysis (JABA) featured Baer, Wolf and Risley’s (1968) iconic article,

* Thomas S. Critchfield
[email protected]
* Derek D. Reed
[email protected]

1 Department of Psychology, Illinois State University, Normal, IL 61761, USA
2 Department of Applied Behavioral Science, University of Kansas, Lawrence, KS 66045, USA

“Some current dimensions of applied behavior analysis” (hereafter we will refer to the article as BWR, and to its authors as Baer et al.). In the years since the article was published, the seven-dimension framework that it described (top left panel of Fig. 1) has become deeply engrained in the fabric of applied behavior analysis (ABA; see Carpintero Capell, De Barrio, & Mabubu, 2014). The framework is invoked frequently in ABA scholarship, with BWR garnering nearly 3400 scholarly citations (Google Scholar search conducted October 6, 2016) and its citation rate increasing across the decades (top right panel of Fig. 1). The BWR framework also is routinely used to initiate future professionals into the field of ABA. At the time of this writing, we examined how 12 behavior analysis textbooks, published in the years 2000–2015, defined ABA. Eleven of the books referenced BWR for this purpose, with 9 specifically describing the seven dimensions (see Fig. 1, bottom, for an example). Similarly, according to the Behavior Analyst Certification Board’s list of required competencies, a Board Certified Behavior Analyst® must be able to “Define and describe the dimensions of applied behavior analysis (Baer, Wolf, & Risley, 1968)” (Behavior Analyst Certification Board Fifth Edition Task List, retrieved February 13, 2017, from http://bacb.com/wp-content/uploads/2017/01/170113-BCBA-BCaBA-task-list-5th-ed-english.pdf).

Fig. 1 Top left: the seven dimensions of applied behavior analysis as described by Baer, Wolf, and Risley (1968). All of behavior analysis is Behavioral, Analytical, and Conceptual. ABA’s focus on problems of social importance is a subset of Behavioral, since only selected behaviors are bound up in social concerns. ABA creates Technological interventions that are Effective under the conditions in which they are implemented and also produce General outcomes. Top right: mean number of scholarly citations per year of the Baer et al. article. Bottom: example of a textbook summary of the seven dimensions. Reproduced from Mayer et al. (2012) by permission of the Sloan Publishing Company
As the preceding examples document, BWR has become the de facto frame of
reference for describing what ABA is, or should be, at the level of both research and
practice. Both domains are important, but increasingly they are separate domains. At
the time BWR was composed, seminal practitioners were also, for the most part, ABA’s
seminal scientists (Rutherford, 2009), but in recent years, the roles of scientist and
practitioner have tended to be filled by different individuals (e.g., Marr, 1991; Rider,
1991; Sidman, 2011). The present article focuses on research and examines how BWR
achieved its status as the benchmark definition of scholarship in ABA. This part of our
appraisal examines some general factors that lead people to regard a publication as
iconic and the specific historical conditions under which BWR was created.
In recognition of the fact that iconic works have wildly varied shelf lives,1 we also
explore BWR’s continuing qualifications as the gold-standard description of ABA
research. Central to this part of our appraisal is the mission that Baer et al. defined
back in 1968: the capacity of ABA research to achieve broad social relevance (defined
in terms of “the interest which society shows in the problems being studied;” BWR, p.
92).2 Among the points of discussion is a risk that an overly strict criterion for what
qualifies as ABA may focus attention on specific methods and away from many
informative lines of investigation. Because research of societal importance is framed
primarily in terms of topics or problems rather than methods, we suggest that there is no
one-size-fits-all definition of ABA research. Our core proposition will be that ABA
research should be thought of as a fuzzy concept with many possible features, not all of
which must be present in any single investigation. Throughout, we make no attempt to
thoroughly summarize BWR (its wide circulation suggests that readers will require no
such assistance) although we quote from it as necessary to show that our observations
are not subjective.

Iconic Publications Articulate a Discipline’s Rules of Engagement

People tend to value publications that shaped the discipline to which they are commit-
ted. Branches of geology, biology, and psychology, respectively, were forever altered
when Hutton (1788) proposed that rock layers hold clues to events of the distant past;
when Darwin (1859) derived the principle of natural selection; and when Skinner
(1938) showed how to reveal regularities by focusing on behavior-consequence rela-
tions. Iconic publications like these alter a discipline by describing rules for engaging
with a particular subject matter, and such rules of engagement are appreciated because

1 For example, compare Einstein’s theory of general relativity, which for more than a century has withstood every empirical challenge (Halpern, 2015), to Golgi’s (1873) notion of the nervous system as a single continuous network of tissue, which had already been disconfirmed by the time the work was recognized with a 1906 Nobel Prize.
2 Or “importance.” Debate could be waged over what defines “importance,” but for present purposes, we invoke it in the generic sense of Baer et al. (“the interest which society shows in the problems being studied;” p. 92) and assume that, as with pornography, observers will know it when they see it.

they save the reader a considerable amount of contingency shaping. Skinner (1956)
colorfully described his early career as a series of laboratory false starts that gradually
shaped an investigatory repertoire. This repertoire proved crucial to later successes of
both Skinner and those who followed his lead. Those who followed had it easier than
Skinner, of course, because Skinner shared not only what had been discovered but also
some of the rules of engagement that fueled his discoveries (e.g., Ferster & Skinner,
1957; Skinner, 1938).
By disseminating rules of engagement, iconic publications establish a network of “apprentices,” with repertoires similar to those of the mentor-author, who subsequently come into contact with significant professional reinforcers (e.g., see Skinner’s [1957] account of scientific behavior). This outcome helps to explain the esteem with which many disciplines regard their “first textbooks,” such as Thorpe’s (1898) in chemical engineering, and Keller and Schoenfeld’s (1950) Principles of Psychology in behavior analysis. In reflecting on his own impressive career accomplishments, for example, Murray Sidman (1995) was reminded of Principles of Psychology, to which he was exposed fairly early in his training:

I found that much of my own behavior that I had assumed was the result of
interactions with my own data, assisted by my own creative thought processes,
actually came from [Principles of Psychology]; the behavior had become part of
my repertoire. (unpaginated preface)

Sidman’s comment precisely captures how, for newly indoctrinated “apprentices,” an iconic publication “does not just present what is known but points out what we need to know, and suggests how we might find out” (Sidman, 1995, unpaginated preface). As the next section describes, BWR may be assumed to have served this role for early generations of ABA investigators.

BWR’s Pivotal Role in the Development of ABA

BWR is appreciated today for providing the first systematic rules of engagement for
extending behavior analysis, which began in the laboratory, to the field. These rules of
engagement had taken form gradually, and through considerable trial and error, during
the 1940s through the early 1960s, when procedures devised for studying the behavior
of rats and pigeons first were adapted for use with humans (e.g., Rutherford, 2009). As
some of these efforts shifted from laboratory to field, by the late 1950s several
published reports documented the emergence of something not very different from
ABA as Baer et al. would later describe it (Ayllon & Michael, 1959; Azrin & Lindsley,
1956; Williams, 1959; see also Morris, Altus, & Smith, 2013).
Once initiated, ABA grew rapidly. Barely a decade after those seminal reports of the
1950s, a sufficient body of work existed to found JABA, which provides as good a marker
as any of ABA becoming distinct from basic research in the experimental analysis of
behavior (Morris et al., 2013). Thus, BWR, as a principal component of JABA’s first issue,
appeared just as the efforts of a modest-sized community of pioneers were coalescing into
a scholarly movement. Later, Baer et al. would acknowledge a debt to this seminal
community by referring to BWR as “an anthropologist’s account of the group calling its culture Applied Behavior Analysis” (Baer, Wolf, & Risley, 1987, p. 313).

The rules of engagement articulated in BWR placed a heavy emphasis on addressing problems of societal importance (hereafter we will refer to this as a social-validity
perspective). Subsequently, ABA developed strongly along these lines, with attendant
growth in the number of individuals practicing ABA as BWR described it (Baer et al.,
1987), in the number of total pages published in JABA, and the proportion of JABA
articles describing BWR-style research (Hayes, Rincover, & Solnick, 1980).
At least part of this growth can be credited directly to the rules of engagement that
BWR articulated. With its clear specification of an agenda and its common-sense
explanation of defining dimensions, BWR would have proven immensely useful in
training new generations of “apprentices” according to the social-validity model. As
Baer et al. (1987) later pointed out (though BWR, surprisingly, did not), such an
emphasis on social validity is essential to harnessing contingencies of survival that
allow professional disciplines to flourish. No professional field can proceed without
resources, and those who transparently address society’s interests may be rewarded with
fee-for-service arrangements, grant funds, positions for its researchers and practitioners,
and so forth (e.g., Critchfield, 2011a; Mace & Critchfield, 2010; Stokes, 1997). BWR’s
emphasis on social validity helped to link ABA to such tangible benefits, and those who
followed its rules tended to profit accordingly. In recent years especially, this has been
evident in fervent public demand for ABA services and, in many jurisdictions, regula-
tory embrace of professional ABA certification in fee-for-service arrangements (e.g.,
Johnston, Carr, & Mellichamp, in press; Shook & Favell, 2008).

Iconic Publications Are Creations of Their Time

Like all behavior, that which creates publications is a product of past and current
influences. Whatever a publication’s subsequent impact on a professional field, it first
arises, as per Baer et al.’s (1987) Banthropologist^ comment, as a snapshot of the period
in which it was created and must be evaluated partly in this context. For example,
Watson’s classic (1913) article, “Psychology as the behaviorist views it,” may be a seminal description of behaviorism (Harzem, 2004), but it did not spring forth unprovoked. Rather, Watson penned the article to address specific problems of his day, prominent among which were the prevalence of biological nativism and subjective measurement methods (introspection). Watson’s influence has been so durable that it is easy to overlook this historical context and uncritically accept once-revolutionary ideas that have become part of a diffuse contemporary zeitgeist, “like a cube of sugar dissolved in tea; it has no major distinct existence but is everywhere” (Harzem, 2004, p. 11).

BWR and the Battle for the Soul of ABA

So it may be with BWR. Baer et al.’s seven-dimension framework has become so familiar that contemporary observers may overlook the circumstances that gave rise to
the source article. During the early days of extending behavior analysis to human
behavior, there was considerable friction between two competing camps (Rutherford,
2009). One group (which we will call the experimental-control camp) believed that
responsible application required preliminary explorations of the generality to humans of
basic behavior principles that had been derived from animal research, or as Wolf (2001)
put it:

Some were of the opinion that ... a leap [from lab to field] was premature and
unwise because we didn’t know enough, that we needed to wait for more human
operant research. (p. 290).3

At the least, it was asserted that field extensions should retain laboratory-like
procedures in order to preserve experimental control:

At the time… many of the operant conditioning people were very strongly
methodologically oriented to the point where [they felt] you could not possibly
collect reasonable data unless it was automatically collected. (Jack Michael,
quoted in Rutherford, 2009, pp. 60–61)

Those with strong backgrounds in the experimental analysis of behavior therefore were uncomfortable with what they regarded as rather unstructured tinkering in field
settings. In their view, some of the laboratory’s most powerful analytic tools were being
abandoned, at best costing opportunities for investigation, and at worst possibly
eroding the field’s conceptual core (Rutherford, 2009). For instance, in observing the
dissemination of Skinner’s air crib (an automated “baby tender”), Ogden Lindsley
remarked that

It amazed me that 250 children had been reared in air cribs … and no one had put
in a lever. No one had put in a lever.4 (quoted in Rutherford, 2009, p. 61)

Skinner (1959) might have captured the essence of the experimental-control camp’s worries when, writing about mainstream clinical psychology, he warned that clinicians might be more strongly influenced by social factors (“gratitude is a powerful generalized reinforcer;” p. 450) than by the reinforcers associated with understanding behavior’s controlling variables. Deitz (1982) has explained how the development of professional medicine was slowed by foregoing experimental analyses in hurried pursuit of socially important outcomes, and Skinner (1959) illustrated his own argument by invoking Albert Schweitzer: “If he had worked as energetically for as many years in a laboratory of tropical medicine, he would almost certainly have made discoveries which in the long run would help – not thousands – but literally billions of people” (p. 451). A parallel with the concerns of the experimental-control camp in behavior analysis is obvious.5

3 History is written by the victors, and we will show that this was not the experimental-control camp. How members of this camp felt about the central controversy of their era we could not ascertain from the archival record and thus must rely on secondary accounts.
4 So he installed one—well, a response panel, anyway—into an air crib in which his daughter spent some time (Lindsley, 2002). From this resulted the first free-operant cumulative record obtained from humans (Lindsley & Lindsley, 1952), as well as a photo in a national newsmagazine and criticism from people who thought the air crib to be inhumane (Lindsley, 2002). Such adverse publicity magnified the urgency that Baer et al. would later feel about distancing ABA from the experimental-control approach.
5 To be clear, we are aware of no evidence that Skinner had similar concerns about field extensions of behavior analysis, which he had anticipated in Walden Two (1948) and Science and Human Behavior (1953), and about which he subsequently commented approvingly in Reflections on Behaviorism and Society (1978). Moreover, the BWR framework, with its emphasis on empirical evidence of socially important change, can be interpreted as a hedge against the worst of the clinical drift about which Skinner warned.

By contrast, members of the social-validity camp were impatient with laboratory-esque extensions6 and instead sought to change behavior that had been selected for practical importance rather than laboratory convenience. To their way of thinking, basic principles of reinforcement were straightforward and dependable “like gravity” (Jack Michael, quoted in Rutherford, 2009, p. 59), and consequently “findings from human-sized Skinner boxes held little interest for this new breed of behavior-shaper” (Rutherford, 2009, p. 60). As Risley (2001) wrote:

We saw too many studies… whose importance to human affairs were [sic]
highlighted in their introduction and discussion sections but absent in their
procedures and results sections. (p. 271)

A leading member of the social-validity camp was Montrose Wolf, who, in the early
1960s, was hired at the University of Washington to run a human operant laboratory,
but instead spent most of his time consulting on child behavior problems in natural
settings:

Wolf came to consider laboratory research on human behavior to be mostly a ‘dead end in scientific trappings’ and… when he created the editorial policy of
the new Journal of Applied Behavior Analysis, he explicitly excluded laboratory
purported-analogues, in favor of the in-context observation and investigation of
real-world phenomena. (Todd Risley, quoted in Rutherford, 2009, p. 60)

Manifesto: All Social Validity, All the Time

These were the times in which BWR was composed. With a formative battle underway about whether the “extension to human behavior” would be defined more by society’s practical concerns or by a methodological debt to the experimental analysis of animal behavior, any credible observer would have been compelled to address the controversy. In doing so, Baer et al. took a decisive stand in favor of the social-validity camp. BWR’s first two pages were devoted to distinguishing clearly between ABA and the experimental analysis of behavior, and its subsequent explanation of the dimension Applied was, not surprisingly, grounded almost exclusively in social-validity arguments:

• Applied research is constrained to examining behaviors which are socially important, rather than convenient for study. (BWR, p. 92, italics added)

• In behavioral application, the behavior, stimuli, and/or organism under study are chosen because of their importance to man and society, rather than their importance to theory. (BWR, p. 92, italics added)

6 Also with narrative interpretations of everyday phenomena. “We saw too many examples of behavioral researchers behaving like other psychologists and casually extrapolating their findings to account for things they actually knew little or nothing about,” wrote Risley (2001, p. 271), who dismissed Skinner’s (1953) Science and Human Behavior as “three chapters [that] outlined an agenda for an inductive, empirical approach to a science of human behavior… followed by 26 chapters of a deductive, logical explanation of uninvestigated human behavior” (p. 271).

Table 1 Emphasis on social importance in BWR’s description of several dimensions of applied behavior analysis

Behavioral: “Applied research is eminently pragmatic. It asks how it is possible to get someone to do something... [and the behavior analyst must] defend that goal as socially important.” (BWR, p. 93, italics added)

Analytic: “Application typically means producing valuable behavior.” (BWR, p. 94, italics added)

Effective: “If an application of behavioral techniques does not produce large enough effects for practical value, then application has failed.” (BWR, p. 96, italics added)
“In application, the theoretical importance of a variable is usually not an issue. Its practical importance, specifically its power in altering behavior enough to be socially important, is the essential criterion.” (BWR, p. 96, italics added)
“In evaluating whether a given application has changed behavior enough to deserve the label, a pertinent question can be, how much did that behavior need to be changed? Obviously, that is not a scientific question, but a practical one. Its answer is likely to be supplied by people who must deal with the behavior.” (BWR, p. 96, italics added)

General: “Application means practical improvement in important behaviors.” (BWR, p. 96, italics added)

• The label applied is … determined by the … interest which society shows in the
problem being studied. (BWR, p. 92, italics added)

Further affirmation of the social validity mission suffused BWR’s discussion of the
dimensions Behavioral, Analytic, Effective, and General (Table 1). For good measure,
the central point was reiterated in BWR’s concluding comments:

• An applied behavior analysis will make obvious the importance of the behavior
changed (p. 97, italics in original)

From the vantage point of contemporary ABA, in which socially important prob-
lems are routinely addressed, these remonstrances seem excessive, but BWR was a
product of its era. With social validity as the prime directive and ABA’s status as an
independent enterprise not yet solidified, Baer et al. sought not just to identify, but also
to defend and promote, those characteristics that, in their view, most distinguished
ABA from the experimental analysis of behavior.
Because BWR was a creation of its time, some of its arguments, while technically
correct, may no longer merit the special emphasis that BWR gave them. For instance,
BWR defended direct observation (as contrasted with automated behavior recording) as
a viable means of measuring behavior in the field. Today, of course, observational
methods are an integral part of every student’s introduction to ABA (e.g., Cooper,
Heron, & Heward, 2007). In selected instances, BWR’s arguments seem not only dated
but also stylistically emblematic of the period’s prickly relations between more basic
researchers and what Baer et al. might have regarded as “truly applied” behavior
analysts. For example, when explaining how a concern for social validity might place
practical constraints on experimental design, Baer et al. noted that:

Society rarely will allow its important behaviors, in their correspondingly impor-
tant settings, to be manipulated repeatedly for the merely logical comfort of a
scientifically skeptical audience. (BWR, p. 92, italics added)

[A laboratory] experimenter has achieved an analysis of behavior when he can exercise control over it. By common laboratory standards, that has meant an
ability to turn the behavior on and off, or up and down, at will. Laboratory
standards usually have made this control clear by demonstrating it repeatedly,
even redundantly, over time. Applied research... cannot approach this arrogantly
frequent clarity of being in control.... (BWR, p. 94, italics added)

Today, it is widely understood that practical considerations in field settings can limit the
number of observations that may be made and mitigate the opportunity to reverse experi-
mental effects. The multiple baseline design, which BWR went to some lengths to justify as
an alternative to potentially troublesome reversal designs (see p. 94), is now an uncontro-
versial standard component of the ABA investigatory arsenal (Cooper et al., 2007).
Upon close inspection of such details, it becomes evident that BWR is not, as Baer et al. (1987) suggested, merely an “anthropologist’s account of ... [1968] Applied Behavior Analysis” (p. 313). It is a polemic,7 a Declaration of Independence from the experimental analysis of behavior. While the BWR framework (Fig. 1) included three dimensions that describe behavior analysis as a whole (Behavioral, Conceptual, and Analytical), within an overarching concern for social validity these were embraced only uneasily. Baer et al. took pains to emphasize that not every Behavior is socially important; that in applied work not all Analytical tools are suitable; and that Conceptual analysis is not the primary goal but rather a possible tool for arriving at socially important behavior change (“In application, the theoretical importance of a variable is usually not an issue. Its practical importance... is the essential criterion;” BWR, p. 96). The framework’s remaining dimensions (Technological, Effective, and General) were explained almost entirely within the context of the social-validity mission.
Viewing BWR as a “Declaration of Independence” makes it possible to extend several observations about the article’s influence upon the early development of ABA. First, BWR may have helped to tip the balance of power between the early social-validity and experimental-control camps not only because it articulated the former model with exceptional clarity but also because, in its edgier sections, it portrayed the alternative view as misguided and perhaps even foolish (i.e., “redundant,” “arrogant”). Years later,
when reflecting upon the development of ABA, Risley (2001) emphasized that it had

7 Though a nuanced one. Risley (2001) wrote that he and other early converts to the social-validity camp “did not think that laboratory research findings are unimportant to human affairs – quite the contrary” (p. 270). But they felt that ABA’s earliest pioneers had freed them from misconceptions derived from studying laboratory research: “Their most important ‘breakthrough’ contribution,” wrote Risley, “was the demonstration that naively simple things were actually powerfully important in the real lives of people. You see, at the time we were all talking about the principles of learning and behavior but we thought they would actually be expressed only in complex, multiply-interactive combinations in the ongoing actions of people in real life.... We assumed that their role could only be isolated and analyzed after carefully designed histories in specially arranged settings – in other words, in laboratories” (p. 267). But in viewing the results of the first field interventions, “We had never seen or imagined such power! We were all amazed at the speed and magnitude of the effects of simple adjustments of such a ubiquitous variable as adult attention... [This was] the most influential discovery of modern psychology” (p. 268). Baer et al.’s zeal for the social-validity perspective was understandable.

spread far and wide “except peculiarly in those places where operant laboratory research was strongest” (p. 270, italics added). This phrasing suggests a persistent
view that ABA and other kinds of human research were locked in a zero-sum game.
Second, the experimental-control camp produced no equivalent manifesto, and
predictably, perhaps, it produced relatively few intellectual “apprentices.” In fact, a
generation and more later, the descendants of this camp, now calling their work human
operant research or the Experimental Analysis of Human Behavior (EAHB), were still
struggling to achieve critical mass, to standardize their methods, to make sense of their
findings, and to identify their proper place in the broader discipline of behavior analysis
(e.g., Baron, Perone, & Galizio, 1991; Buskist & Johnston, 1988; Dinsmoor, 1991;
Etzel, 1987; Hake, 1982; Harzem & Williams, 1983; Michael, 1984; Perone, 1985;
Perone, Galizio, & Baron, 1988).
Third, BWR served as an explicit mission statement for the newly launched JABA,
which according to Montrose Wolf’s design explicitly excluded the kinds of investi-
gations being conducted by the experimental-control camp (Rutherford, 2009). As
Risley (2001) has noted:

The Baer, Wolf, and Risley (1968) article was written…as an attempt to differ-
entially prompt certain types of submissions….[What] most concerned Wolf and
me: the encouragement of field research; the insistence that you should seek
lawfulness in the everyday activities of people; and the pursuit of the invention
(and documentation) of new behavioral technology. (p. 270)

Investigators from the social validity camp therefore had a natural outlet for their
work, while those from the opposing camp did not. Even a generation later, with basic
science journals still featuring primarily research with nonhumans and JABA, by
design, unfriendly to laboratory research with humans, those working in EAHB found
it difficult to publish their research (e.g., Hake, 1982).

A Crisis of Confidence and a Translational Pivot

A further noteworthy development is that, roughly a decade after BWR's publication,
ABA's heady beginnings gave way to a crisis of confidence during which observers
worried that the field had lost contact with its scholarly foundations and therefore had
become overly Technological (e.g., Birnbrauer, 1979; Branch & Malagodi, 1980;
Cullen, 1981; Deitz, 1978; Michael, 1980; Moxley, 1989; Pierce & Epling, 1980;
Poling, Picker, Grossett, Hall-Johnson, & Holbrook, 1981). "We are becoming less
concerned with basic principles of behavior and more concerned with techniques per
se," wrote Hayes, Rincover, & Solnick (1980, p. 283), and Morris (1991) bemoaned
the rise of research that "demonstrates the effects of behavioral interventions at the
expense of discovering... actual controlling relationships" (p. 413).
In support of these concerns, Hayes et al. (1980) surveyed the first 10 volumes of JABA
and determined that its contents were growing less Analytical and Conceptual. Regarding
Analytical, they found a decrease over time in the percentage of JABA empirical articles
that described component analyses (which determine the features of an intervention
package that are responsible for behavior change) and parametric analyses (which map
dose-response relationships between interventions and behavior changes). Regarding

Conceptual, Hayes et al. described an increase over time in the percentage of JABA articles
that were purely methodological (e.g., development-of-procedures exercises that might
today be referred to as R&D; Johnston, 2000) or Technological (describing an intervention
without reference to behavior principles). There was a concomitant decrease in the
percentage of articles that today might be called "translational" (i.e., "shows an effort to
advance our basic understanding of some behavioral phenomena"; Hayes et al., p. 278).
It is difficult to view these trends as unrelated to BWR’s ardent focus on distinguishing
ABA from basic research. BWR presented seven dimensions but passionately defended
one core idea: that application means creating socially appreciated behavior change. The
dimensions of greatest concern during "The Crisis" (Conceptual and Analytical) were
two of the three that are most central to behavior analysis generally—that is, they are
dimensions along which ABA is particularly at risk for being confused with the experimental analysis of behavior. We suggest, therefore, that "The Crisis" arose when a newly
minted army of "apprentices" took BWR's rules of engagement very literally by
emphasizing social importance above all else, and by taking pains not to look too
much like the experimental analysis of behavior. As Hayes et al. (1980) commented:

The data show that applied behavior analysis is increasingly following Baer
et al.’s recommendation that applied studies directly manipulate the problem
behaviors, in natural settings, and with the troubled populations. However, this
definition of applied has been labeled a "structural" one because it is based
strictly upon the nature of the subjects, settings, and behavioral topography being
studied [rather than increasing] the applied workers' ability to predict, understand, and control socially important behavior in the settings and with the clients
they serve (p. 281).

"The Crisis" shows that BWR was less a final statement on ABA than a developmental marker in a continuing debate over what ABA should become. Prior to BWR's
publication, members of the social-validity camp had been impatient with conceptually
and methodologically cautious research-to-practice translation. Following an additional
decade or so of field successes, it was perhaps logical for the same individuals to
exhibit "less and less interest in conceptual questions" (Hayes et al., 1980, p. 289)—or,
stated more positively, to conclude that the technology of dissemination was sufficiently advanced that field practitioners had little need to understand the underlying
theoretical principles (e.g., see Baer, 1981; Wolf, 2001).8 To be clear, Baer et al. did
not advance this view explicitly and even acknowledged that

8 Regarding field workers who engaged in Conceptual matters "so briefly that it is easy to miss" (p. 89), Baer
(1981) advanced the analogy of the practicing physician, who he acknowledged did not operate much in the
realm of the Conceptual, but who he praised for the utility of his "very effective packages, his routine
algorithm for when to apply them, and his simple empirical willingness to try another of them without
amazement if the first choice did not work" (p. 87). Baer suggested that critics overestimated the Conceptual
complexity of many applied problems. To those working in the field, he said, "a huge amount of the
behavioral trouble that they can see in the world looks remarkably to them like the suddenly simple
consequence of unapplied positive reinforcement or misapplied positive reinforcement" (p. 88), so "only
the simplest, already validated, and most general of behavior-analytic principles are meant to be at issue in
most applications" (p. 89).

The field of applied behavior analysis will probably advance best if published
descriptions of its procedures... strive for relevance to principle... This can have
the effect of making a body of technology into a discipline rather than a collection
of tricks. (p. 96)

But BWR may have contributed to "The Crisis" by damning with faint praise. In
particular, its endorsement of the Conceptual dimension was cursory (one paragraph of
a 6.5-page article) and lukewarm (in the above quotation, note the autoclitics
"probably" and "can" rather than, say, "certainly" and "will"). This, combined with
the article's dismissive treatment of the Analytical conventions of laboratory research,
may have left the impression that the Conceptual and Analytical dimensions were
optional considerations or even—consistent with the pre-BWR views of some members
of the social-validity camp whom we quoted previously—inconvenient hurdles to the
development and dissemination of Technological interventions.
Eventually, more Conceptual and Analytical heads prevailed in ABA research. At
roughly the same time that "The Crisis" became a matter of public discussion, there
was a development that would dramatically inflect ABA's trajectory. Functional analysis methodology (Iwata, Dorsey, Slifer, Bauman, & Richman, 1982) provided a
relatively user-friendly way to examine socially important behavior in the context of
basic principles. Its procedures qualified as Technological (easily replicated) but were
explicitly Analytical in design (demonstrated clear experimental control), and in a
standardized way led directly to Conceptual insights (linking problem behavior to
behavior principles). Functional analysis thus provided a vivid illustration of how to
integrate the practice-friendly Technological dimension with dimensions that characterize behavior analysis as a whole (e.g., Mace & Critchfield, 2010; Wacker, 2000).
The Iwata et al. (1982) functional analysis report was reprinted in JABA in 1994.
Within a few years, that journal, which Hayes et al. (1980) had previously regarded as
part of the Technological problem, became part of the solution through shifts in
editorial policy that placed greater emphasis on the theoretical roots of application.
Beginning around the mid-1990s, ABA scholarship took on an increasingly translational tenor that continues to this day (e.g., Virues-Ortega, Hurtado-Parrado, Cox, &
Pear, 2014; for a brief history, see Mace & Critchfield, 2010; for more extensive
surveys, see DiGennaro Reed & Reed, 2015; Madden, 2013).

Section Conclusion

Darwin is reputed to have said that in scholarly discourse there are "lumpers" who
emphasize the similarities among things and "splitters" who emphasize differences.9 In
composing BWR, Baer et al. operated primarily as splitters. Although their spirited
defense of ABA as a unique enterprise makes sense in historical perspective, the
approach can be linked to both desirable and undesirable disciplinary outcomes (ABA's
dramatic growth and "The Crisis," respectively). With this mixed track record as

9 Darwin apparently did not coin these terms but is remembered for applying them to biologists who employed
relatively loose versus stringent criteria for distinguishing among species (see
http://ncse.com/blog/2014/11/whence-lumpers-splitters-0016004).

context, we now consider how well the BWR framework characterizes contemporary
ABA research.

BWR and Contemporary Applied Research

It is reasonable to suggest that investigations that incorporate all of the BWR dimensions (Fig. 1) remain as well positioned to contribute to ABA's social-validity mission
as they were 50 years ago. An investigation that is Analytical, Conceptual, and
Behavioral is good behavior analysis and therefore capable of advancing the understanding of a problem, whether practical (Applied) or theoretical. An intervention that
is Technological, Effective, and General shows how to use this understanding to create
socially valued change. In this general sense, the BWR framework holds up well as a
template for ABA research. We said "a template" rather than "the template" because,
upon careful consideration of the mission of applied research, two kinds of ambiguities
about the BWR framework can be identified.
The first ambiguity concerns how individual BWR dimensions are to be interpreted
when examining specific research activities. For illustrative purposes, consider the
dimension Behavioral in the context of the study of childhood accidents that cause
injury, death, loss of physical functioning, and substantial health care costs. Baer et al.
asserted that "the behavior of an individual is composed of physical events" (BWR, p.
93) and admonished against anything other than direct observation of those events.
Child accidents, however, pose considerable challenges to direct observation. They take
place rarely and unpredictably, almost always out of an investigator's scope of observation. Consequently, research aimed at identifying the antecedents and consequences
of accident-related behaviors often employs interviews with caregivers and other
people who have observed these behaviors (e.g., Alwash & McCarthy, 1987;
Peterson, Farmer, & Mori, 1987). Baer et al. strongly objected to using verbal reports
as a substitute for direct observation, so according to a strict application of the BWR
framework, this research must be considered uninformative, despite the fact that it has
led to Effective strategies for preventing injuries (e.g., Finney, Christophersen, Friman,
Kalnins, Maddux, Peterson, Roberts, & Wolraich, 1993). Or at least apparently effective strategies, because the primary dependent variable in intervention research is the
number of child injuries (e.g., see Kaung, Hausman, Fisher, Donaldson, Cox, Lugo, &
Wiskow, 2014), and injuries are not behavior per se but rather a byproduct of behaviors
and environments that place children at risk (see Johnston & Pennypacker, 1980, for a
critique of measuring behavior products instead of behavior). Thus, according to a strict
application of the BWR framework, this research can support no confident conclusions
about improvements in child welfare. Never mind that markers of "general
functioning," like child injuries, directly address problems as society conceives of
them (e.g., see Strosahl, Hayes, Bergan, & Romano, 1999) or that most observers
would consider an intervention to be pointless if it changed child behavior without
reducing injuries (e.g., Wolf, 1978). In short, by common sense, standard research
on child injuries qualifies as socially important, but within a strict application of
the BWR framework, it cannot be useful to ABA because it is insufficiently
Behavioral.
The preceding anticipates a second ambiguity about the BWR framework, which
concerns whether every study must exhibit all seven of its dimensions in order to

qualify as ABA. Although putatively "the label applied is not determined by the
research procedures used but by the interest which society shows in the problem being
studied" (BWR, p. 92), Baer et al. presented their framework as a conjoint set in which
six dimensions were described as necessary components of the seventh (Applied =
social importance). Thus:

An applied behavior analysis will make obvious the importance of the behavior
changed, its quantitative characteristics, the experimental manipulations which
analyze with clarity what was responsible for the change, the technologically
exact description of all procedures contributing to that change, the effectiveness
of those procedures in making sufficient change for value, and the generality of
that change (BWR, p. 97; italics in original)

This passage makes clear that, in the view of Baer et al., no investigation qualifies as
ABA if it lacks any one of the seven dimensions. Thus, just as child-accident research
is insufficiently Behavioral, thoughtful analyses of the variables controlling acts of
terrorism (Nevin, 2003) and consumer demand for indoor tanning services (Reed,
Partington, Kaplan, Roma, & Hursh, 2013) fall short on three dimensions (Effective,
General, and Technological) because they lack an intervention.
Similarly, Lovaas’ (1987) seminal randomized controlled trial on early intensive
autism intervention may be insufficiently Analytical because it employs a group-statistical experimental design. A common objection is that group-comparison research
is not, in BWR terminology, Analytical because behavioral functional relations always
are within-individual effects (for representative arguments see Bailey & Burch, 2002;
Branch & Pennypacker, 2013; Cooper et al., 2007; Hurtado-Parrado & Lopez-Lopez,
2015; Johnston & Pennypacker, 1980, 1986; and Sidman, 1960).10 The BWR article
did not directly disparage group-based designs, but Baer (1977) subsequently did and,
tellingly, when Baer et al. introduced the dimension Analytical, they mentioned only
single-subject designs. Despite the central role of randomized controlled trials in
contemporary public policy deliberations (e.g., Drake, Latimer, Leff, McHugo, &
Burns, 2004), this remains a common strategy for illustrating the ABA research agenda
(see Haegele & Hodge, 2015; Malott & Shane, 2014; Martin & Pear, 2015;
Miltenberger, 2016; Poling & Grossett, 1986). Overall, while the interpretation that
group designs are incompatible with ABA cannot be traced exclusively to BWR, it is
fair to call it a standard component of contemporary appeals to the BWR framework
(see Bailey & Burch, 2002; Cooper et al., 2007; Johnston & Pennypacker, 1986).
If the examples just mentioned do not qualify as Applied then a state of affairs exists in
which, in order to guarantee the BWR-defined purity of ABA, potentially informative work
on important problems must be excluded from consideration. We did not manufacture this
conundrum for dramatic effect. Throughout the history of ABA, journal editorial teams
have wrestled with the determination of what counts as Applied, and in the process, they
often have considered more than "the interest which society shows in the problem being

10 Although suspicions of group-based research run deep, clearly there are differing opinions about the value
of various designs (e.g., Madden, 2013). Our purpose in the present essay is not to debate the relative merits of
different designs, but merely to point out that when evidence on a topic of societal importance is lacking from
one source, it may be useful to examine evidence from other sources.

studied" (BWR, p. 92). Montrose Wolf's insistence that JABA would reject "laboratory
purported-analogues, in favor of the in-context observation and investigation of real-world
phenomena" (Todd Risley, quoted in Rutherford, 2009, p. 60), partially illustrates this
point. During ABA's formative years, it was observed that "some JABA reviewers will
now reject articles [focusing on socially important phenomena] because the result does not
appear to be of social significance to the clients" (Birnbrauer, 1979, p. 17), and "a common
lament is that ... applied journals consider [laboratory models of socially important
phenomena] as not applied enough and suggest a basic journal" (Hake, 1982, p. 23).
Because journals do not make public their peer review files, it is difficult to know
how relevant such concerns may be to contemporary ABA journals, and one might
hope that, in our enlightened present, journals no longer resist work that does not
conform to the BWR framework.11 For want of systematic data that might bear on this
hypothesis, we must rely on anecdote in the form of our own peer review experiences
over the past two decades or so. Though of unknown generality, these experiences
suggest that things have not changed much since ABA’s formative years. On multiple
occasions, a reviewer has remarked that our work was interesting and socially relevant
(topics included dermatological health, politics, and elite sport competition). The
primary objection to these studies was that they deviated from the BWR framework
along at least one dimension (e.g., by using descriptive rather than experimental
methods, or by employing measurement that was not deemed Behavioral). The recurring refrain, in the verbatim words of some of our reviewers, has been:

• not applied (Baer, Wolf, & Risley, 1968)
• not applied... in the Baer, Wolf, & Risley sense
• not reflecting [the journal's] emphasis on the Baer et al. (1968) dimensions of ABA
• not suitable for [the journal] because it does not meet the Baer et al. definition of
'applied'12

Sometimes our feedback has referenced one or more of the seven dimensions without
invoking BWR by name. At the risk of sounding like sore losers in the editorial game, we
illustrate with the following comments from an action editor of one prominent journal,
who wrote that the research "addresses an important question," commended the authors
"for exploring a relatively non-traditional research subject for applied behavior analysis,"
and acknowledged that "I take great interest in the work reported here." The decision
letter indicated that "the reviews are generally positive, the recommendations for revisions are doable, and the content of the manuscript might be of interest to readers." This
may sound like the preface to a slam-dunk acceptance, but instead the conclusion was
that "The manuscript does not meet the minimum requirements to be accepted for
11 A reviewer of the present article advanced this position.
12 Such reflexive invoking of the BWR framework probably first arose as a side effect of the source article
doubling as both "anthropologist's account" (Baer et al., 1987, p. 313) and journal mission statement. As a
result of the latter, the BWR "prescriptions, widely followed by contributors [who were] deeply interested in
having their papers accepted in JABA, became assumed as the normal way of doing research in the field"
(Carpintero Capell et al., 2014, p. 1727). To avoid overstating our case, however, we note that some of our
papers that were evaluated in this fashion eventually were published because one reviewer advocated for them
or because an action editor imposed a more expansive view of what counts as Applied. Clearly, differences of
opinion exist, but nevertheless, BWR-framed objections have arisen too often for us to view them as isolated
occurrences.

publication ... because the research relies exclusively on self-report measures [and
because] the presentation and analysis of the data are in aggregate form and rely
exclusively on the use of inferential statistics." Thus, a conceptually informative manuscript of transparent social significance was not deemed sufficiently Behavioral or
Analytical for inclusion in an Applied behavior analysis journal—although, we were
assured, "these data will be of great interest, and very appropriate, for a different journal
and audience." This is not the only time that we have seen the BWR framework used to
direct work on interesting topics away from behavior analysis journals.
If the translational movement in behavior analysis has demonstrated anything, it is
that no single means exists to advance the understanding of socially important behaviors (see Critchfield, Doepke, & Campbell, 2015; Mace & Critchfield, 2010; see also
Birnbrauer, 1979; Dietz, 1983; Fawcett, 1985, 1991; Hake, 1982; Moxley, 1989;
Stokes, 1997; Wacker, 1996, 2000, 2003). Although ABA could not progress without
research that reflects all seven BWR dimensions, it makes little sense to limit the
pursuit of social relevance that Baer et al. so valued only to topics that lend themselves
to study within the BWR framework.
This point is so crucial that we now bolster it by examining several further examples
of research that could be judged as inadequate based on one or more BWR dimensions.
These examples are intended to be illustrative rather than exhaustive, and the reader is
invited to identify further examples from personal experience. As contrasted with "pure
behavioral research" (Johnston & Pennypacker, 1986, p. 36) that incorporates a full
complement of traditionally expected features, these examples can be considered
"quasi-behavioral research" (p. 38) that intermingles some of those features with
features that are associated with other research traditions. The challenge we present
to the reader is not to determine whether any particular instance should count as "pure"
ABA research, but instead to decide holistically whether ABA, as a general enterprise,
would be more interesting, vibrant, intellectually engaged, and socially relevant with
such instances included within versus excluded from its core literature.13

Experimental Evaluation of Effective Interventions

Some clinical experiments, despite being well grounded in the Conceptual framework
of behavior principles and examining Effective, General, and Technological interventions, may raise questions concerning the extent to which they are Analytical and/or
Behavioral. For instance, clinical experiments have shown Technological voucher-based incentive interventions (Fig. 2, top panel) to be Effective in improving workplace
attendance by drug abusers with poor attendance records (e.g., Jones, Haug, Silverman,
Stitzer, & Svikis, 2001). The voucher system is derived from the Conceptual framework of behavioral economics theory. Measurement of attendance is strictly Behavioral
and there is evidence that attendance improvements are General. However, effectiveness often has been evaluated in large N designs that many behavior analysts would
regard as insufficiently Analytical.
The voucher system described above also reduces drug taking (Higgins, Budney,
Bickel, Foerg, Donham, & Badger, 1994; Jones et al., 2001; Lussier, Heil, Mongeon,

13 The astute reader will notice that some of our examples were published in applied behavior analysis
journals. More telling is that many were not.

Fig. 2 Examples of Effective interventions that can be questioned on at least one of the seven dimensions of
applied behavior analysis of Baer et al. (1968). Solid rectangles identify dimensions that are clearly reflected in
the research. Dashed rectangles identify dimensions about which some behavior analysts might have
concerns. See text for further explanation

Badger, & Higgins, 2006), as demonstrated in putatively non-Analytical group
design experiments (e.g., Lussier et al., 2006; Prendergast, Podus, Finney,
Greenwell, & Roll, 2006). Daily monetary incentives are contingent on providing
Greenwell, & Roll, 2006). Daily monetary incentives are contingent on providing
a drug-free urine sample so, according to the BWR framework, this research might
not be considered Behavioral because its primary dependent measure is a marker
of general functioning (the chemical composition of urine) that is a byproduct of
behavior and not behavior per se (Fig. 2, middle panel). A similar concern could
be raised about research on the effectiveness of Positive Behavioral Interventions
and Supports in which school office referrals were measured rather than behaviors
of individual students that lead to office referrals (e.g., Horner & Sugai, 2015; see
middle panel of Fig. 2), and about research on dental compliance in which
investigator-rated levels of plaque buildup were measured instead of the behaviors

to which plaque formation presumably was related (Iwata & Becksfort, 1981; not
shown in Fig. 2).
The potentially objectionable features of these examples reflect not poor scientific
judgment but rather topic-specific strategic investigatory decisions. For example, proxy
measures are used in lieu of direct behavioral observation when behaviors of interest
(like drug-taking, actions that get students into trouble at school, and dental hygiene
habits) are difficult to track and observe in the field on a moment-to-moment basis.
Moreover, the target audience for research on voucher systems, PBIS, and dental health
includes mainstream researchers, funding agencies, and policy makers for whom
markers of general functioning, assessed at the large-group level, will be most persuasive as outcome measures.
Some intervention research is Behavioral without being especially Conceptual
(Fig. 2, bottom panel). For example, a recent series of studies has examined the effects
of creating or renovating public parks on the activity levels of community members.
The interventions were defined well enough to support replication (Technological), and
when evaluated through well-validated Behavioral observation systems (e.g.,
McKenzie, Cohen, Sehgal, Williamson, & Golinelli, 2006), they were found to be
Effective and General across several settings (Cohen, Han, Derose, Williamson, Marsh,
& McKenzie, 2013; Cohen, Han, Isacoff, Shulaker, Williamson, Marsh, McKenzie,
Weir, & Bhatia, 2015; Cohen, Marsh, Williamson, Han, Derose, Golinelli, &
McKenzie, 2014). Yet some behavior analysts might consider this work to be insufficiently Conceptual because the independent variable (access to parks) was not
discussed in terms of behavior principles, and insufficiently Analytical because the
results were examined through large N designs that pooled the behavior of many
community members. Nevertheless, in light of an epidemic of obesity (Wang &
Baydoun, 2007) and the near-absence of a behavior analysis literature on how public
spaces can promote physical activity, the studies just described provide a start at
unpacking an important societal problem.

Experimental Evaluation of Ineffective Interventions

According to the BWR framework, an intervention experiment that does not produce
socially important behavior change "is not an obvious example of applied behavior
analysis" (BWR, p. 96), even if it is textbook perfect in other ways (Applied,
Conceptual, and Technological). This is relevant to the present discussion because
not all experiments aimed at socially important problems create socially important
behavior change. For instance, the Analytical, Behavioral, and Technological research
strategies of behavior analysis have been used to evaluate interventions with origins
outside of the Conceptual system of behavior analysis (Fig. 3, top panel). Example
topics of investigation include effects on task engagement and rates of problem
behaviors of hyperbaric chamber exposure (Lerman, Sansbury, Hovanetz, Wolever,
Garcia, O’Brien, & Adedipe, 2008), on motor skills of wearing ambient prism lenses
(Chok, Reed, Kennedy, & Bird, 2010), and on autism-related behaviors of facilitated
communication (e.g., Montee, Miltenberger, & Wittrock, 1995) and "therapeutic"
horseback riding (Jenkins & DiGenarro Reed, 2013). In none of these cases was the
intervention found to be Effective, in which case there were no effects that might prove
to be General.

Fig. 3 Examples of clinical interventions that are not Effective. Solid rectangles identify dimensions from
Baer et al. (1968) that are clearly reflected in the research. Dashed rectangles identify dimensions about which
some behavior analysts might have concerns. See text for further explanation

Made clear by these examples is that, although the BWR dimensions place a
premium on creating behavior change, sometimes there is value in knowing what does
not work (e.g., Normand, 2008). A topic is "socially important" if members of society
think it is. Non-behavioral therapies can be popular with consumers (Green, 1996;
Shute, 2010) and in this sense, they qualify as indisputably "important." From the
Conceptual perspective of a behavior analyst, alternative therapies may not seem
promising, but they are worthy of experimental attention in part because of their
capacity to waste consumers’ limited time and money (e.g., Lilienfeld, 2002;
Normand, 2008) or even to make existing problems worse (e.g., Mercer, Sarner, &
Rosa, 2003). Thus, Normand (2008) has asserted that "Detection of and protection
from pseudoscientific practices is an important service for those in need who have
limited ability to detect such foolery themselves" (p. 48).
But that is not all. Even interventions that are rooted in behavior principles
do not necessarily create socially meaningful behavior change in all settings
and for all kinds of clients. Baer et al. (1987) distinguished between "theoretical
failure" (which results from basing interventions on incorrect principles, as per
many alternative therapies) and "technological failure" (which results from
tailoring interventions poorly to the demands of a given situation), in the latter
case acknowledging that in the pursuit of Effective interventions, there will be
instructive failures:

Quite likely, technological failure is an expected and indeed important event in
the progress of any applied field, even those whose underlying theory is thoroughly valid. (p. 342, italics in original)

The publication of failures to achieve socially important behavior change (Fig. 3,
bottom panel) thus is consistent with the BWR emphasis on Analytical, and with the
general notion of science as self-correcting. It is, however, inconsistent with a strict,
conjoint-set reading of the BWR dimensions, which allows that only Effective

interventions can be of interest. That interpretation is especially illogical in the case of
experiments that show not just that an intervention failed but why:

...Failures teach... Surely our journals should begin to publish not only [reports
of] our field’s successes but also those of its failures done well enough to let us
see the possibility of better designs. (Baer et al., 1987, p. 325; italics added)

Recent examples of this approach have illustrated how promising approaches to
intervention were subverted by unexpected punishment contrast effects (Roane, Fisher,
& McDonough, 2003), by discriminative properties of reinforcement (Tiger & Hanley,
2005), and by events outside of the intervention setting (Critchfield & Fienup, 2013). Such
empirical evaluations of "failure" are part of what separates science from pseudoscience
(Normand, 2008) and thus are an essential component of any program of research.

Translational Studies Without a Clinical Intervention

Possibly more controversial than the examples described so far are studies that are
designed to shed light on socially important problems without attempting to create
socially valued behavior change.

Laboratory Models of Clinical Phenomena One such category of study involves
what Baer et al. derisively labeled “laboratory purported-analogues” (Todd Risley,
quoted in Rutherford, 2009, p. 60), that is, experiments that, for convenience of
observation and experimental control, model socially important phenomena outside
of clinical settings (Fig. 4, top panel). For example, several laboratory experiments have
explored the role of derived stimulus relations in such problems as false memory (e.g.,
Guinther & Dougher, 2010, 2014), consumer product preferences (Barnes-Holmes,
Keane, Barnes-Holmes, & Smeets, 2000), clinically relevant fear and avoidance (e.g.,
Augustson & Dougher, 1997; Dymond, Schlund, Roche, De Houwer, & Freegard,
2012), and social categorization and stereotyping (Lillis & Hayes, 2007; Roche &
Barnes, 1996). Similarly, laboratory simulations have been developed to study
gambling behavior (Dixon & Schreiber, 2002; Maclin, Dixon, & Hayes, 1999), demand
by pregnant women for nicotine and other drugs (e.g., Higgins, Reed, Redner, Skelly,
Zvorski, & Kurti, 2017), and behaviors that may contribute to child accidents (e.g.,
Cataldo, Finney, Richman, Riley, Hook, Brophy, & Nau, 1992).
Laboratory models do not examine only human behavior (Davey, 1983). Perhaps the
most familiar example of a laboratory model comes from behavioral pharmacology,
where animal drug self-administration experiments have proven exceptionally useful in
predicting human drug use and abuse in natural environments (e.g., Meisch & Carroll,
1987). But there are many other examples. For instance, some of the dynamics of
say-do correspondence have been examined in pigeons (da Silva & Lattal, 2010; Lattal
& Doepke, 2001), and clinically important predictions of behavioral momentum theory
have been tested first in rats (Mace et al., 2010, Experiment 2).
All of these cases fall outside of the BWR framework because the setting is artificial,
the participants may be selected at least partly for convenience, and the behavior under
study may bear only partial similarity to that seen in everyday circumstances.
Nevertheless, the relevant experiments may illuminate mechanisms that matter in
everyday settings and clinical interventions, including in ways that field studies and
theoretically driven basic research have not previously illuminated (e.g., Mace,
McComas, Mauro, Progar, Taylor, Ervin, & Zangrillo, 2010, Experiment 2). Such
studies therefore share many features with clinical functional analyses conducted under
analog circumstances, which despite not reproducing the exact behavior-environment
relations that exist in everyday circumstances provide valuable insights about behavior
control that generalize to those circumstances (Hanley, Iwata, & McCord, 2003).

Fig. 4 Examples of research on socially important topics that lack a clinical intervention. Solid rectangles
identify dimensions from Baer et al. (1968) that are clearly reflected in the research. Dashed rectangles
identify dimensions about which some behavior analysts might have concerns. See text for further explanation

Descriptive Analyses of Naturally Occurring Behaviors A second category of
non-intervention research involves the study of problems that are widely recognized as
socially important but for which field interventions (and experiments to evaluate them)
are beyond the practical reach of the investigator, leaving descriptive studies as the only
logical approach (Fig. 4, middle panel). Though anticipating this problem (“The
analysis of socially important problems becomes experimental only with difficulty;”
BWR, p. 92), Baer et al. apparently concluded that no investigation at all is better than
one employing descriptive methods (“a non-experimental analysis is a contradiction in
terms;” p. 92).
Subsequent investigators have not always agreed, in a number of instances preferring
a non-experimental analysis of important problems to no analysis at all. Previously,
we mentioned analyses of terrorist acts (Nevin, 2003) and of consumer demand for
indoor tanning (Reed et al., 2013). Other investigations have examined whether
matching theory can reveal “deviance” in youth conversation patterns (McDowell &
Caron, 2010a, 2010b); whether reinforcement principles shed light on the legislative
behavior of members of the US Congress (Critchfield, Haley, Sabo, Colbert, &
Macropoulis, 2003; Critchfield, Reed, & Jarmolowicz, 2015; Weisberg & Waldrop,
1972); whether public policy changes affected use of child-adapted automobile seating
(Seekins, Fawcett, Cohen, Elder, Jason, Schnelle, & Winett, 1988); and whether
behavior during elite sport competition is explained by various aspects of behavior
theory (e.g., Critchfield & Stilling, 2015; Mace, Lalli, Shea, & Nevin, 1992; Poling,
Weeden, Redner, & Foster, 2011; Reed, Critchfield, & Martens, 2006; Seniuk,
Williams, Reed, & Wright, 2015; Vollmer & Bourret, 2000).
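Several of the descriptive studies cited above (e.g., the sport-competition and conversation analyses) evaluate naturally occurring behavior against the generalized matching law. As a point of reference only, and not a formula quoted from any of those studies, the law is typically written in its logarithmic form:

```latex
% Generalized matching law (standard logarithmic form; shown for illustration).
%   B_1, B_2 : rates of two response alternatives
%   R_1, R_2 : rates of reinforcement obtained for each alternative
%   a        : sensitivity to relative reinforcement
%   b        : response bias
\log\!\left(\frac{B_1}{B_2}\right) = a \,\log\!\left(\frac{R_1}{R_2}\right) + \log b
```

Descriptive tests of this kind ask whether response allocation in the everyday environment (e.g., play selection in elite sport) tracks relative reinforcement in this orderly way, even though no experimental manipulation is possible.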
Studies like these examine face-valid topics of great social interest and—unlike the
narrative interpretation strategy employed often by Skinner (e.g., 1953)—test behavior
analytic interpretations against empirical evidence. Unlike most experiments, they can
operate at massive scales of analysis (e.g., for an entire branch of government or for
entire nations; Critchfield, Reed, & Jarmolowicz, 2015; Reed et al., 2013). The
standard objection to descriptive analyses is that they reveal only correlations and
therefore provide, at best, unsubstantiated clues about possible causal relations. Until
behavior analysts conduct experimental analyses of the relevant problems, however, the
reader must decide whether such clues are better than no empirical clues whatsoever.

Using Verbal Behavior as Proxy for Socially Important Behavior In the quest for
socially valuable behavior change, Baer et al. saw nothing quite so threatening as using
verbal responses as measurement proxy for other behavior:

A subject’s verbal description of his own non-verbal behavior usually would not
be accepted as a measure of his actual behavior unless it were independently
substantiated. Here there is little applied value in the demonstration that an
impotent man can be made to say that he is no longer impotent. The relevant
question is not what he can say, but what he can do. Application has not been
achieved until this question has been answered satisfactorily. (BWR, p. 93; see
Baer et al., 1987, pp. 316-317, for elaboration)

Yet many socially important behaviors, such as those implicated in child accidents,
drug use, and sexual relations, tend to occur outside of a researcher’s scope of
observation, creating a potentially uncomfortable choice between ignoring these be-
haviors and studying them with alternatives to direct observation.
The use of verbal reports for this purpose is illustrated in a series of studies using
structured interviews to collect information about naturally occurring alcohol consump-
tion and environmental events to which it might be related. The goal was to test
predictions of behavioral economic theory for such clinically significant behaviors as
spontaneous help-seeking and post-treatment relapse. A careful process of validation

showed that while self-reports did not necessarily indicate exact amounts of important
events like drinking, they correlated well with those amounts (e.g., Gladsjo, Tucker,
Hawkins, & Vuchinich, 1992). Based on self-reports, help-seeking and relapse were
found to covary with rates of alternative (i.e., non-alcohol-related) sources of reinforce-
ment, thereby linking alcohol use in the everyday environment to the findings of
experiments on incentive-based interventions (e.g., Tucker, Vuchinich, & Pukish,
1995; Tucker, Vuchinich, & Rippens, 2002).
This is not intervention research (Effective, General), but for present purposes the
pertinent issue is that, within the BWR framework, it is insufficiently Behavioral due to
reliance on self-report measurement. As Baer et al. alluded, the rub with self-reports is
that they can deviate from the reported behavior, but often overlooked is that this
problem is not unique to self-report measurement. The “reports” of external observers
also can deviate substantially from what actually occurs with target behavior and—as
BWR emphasized—are useful in research only under conditions that establish good
correspondence. By the same token, self-reports are inherently neither informative nor
uninformative. They are good measurement under some conditions and not under
others, and the trick, as with using external observers, is to identify and instantiate
the useful conditions (Critchfield & Epting, 1998; Critchfield, Tucker, & Vuchinich,
1998). This is a point that Baer (1981) subsequently came to appreciate when consid-
ering how to evaluate consumer satisfaction with behavioral interventions: B20 years
[of experience in the field] had shown a use for clients’ statements that things were or
were not better now…. Suddenly we were in the ironic position of needing truthful talk
about behavior from the very people we had earlier not trusted to talk truthfully about
behavior^ (pp. 257–258).
A different point of contention arises with research procedures that ask participants
to respond verbally to hypothetical scenarios. For example, delay discounting tasks
(Fig. 4, bottom panel) include a series of choices between unequal outcomes (often, but
not always, money amounts) in the format of a smaller amount available sooner versus
a larger amount available later (this usually takes place in the context of insufficiently
Analytical large N designs that typically do not include a clinical intervention and
therefore cannot be Technological, Effective, or General). From these choices can be
derived an “index of impulsivity” that varies across individuals and across selected
external contexts, and that also correlates with engagement in a wide variety of socially
important impulsive behaviors (for introductions, see Critchfield & Kollins, 2001;
Odum, 2011).
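To make the derivation of such an index concrete: in many studies (the specific model varies), indifference points from the choice series are fit with a hyperbolic discounting function (see Odum, 2011):

```latex
% Hyperbolic delay discounting function (a standard form; illustrative only).
%   V : subjective (discounted) value of the delayed outcome
%   A : amount of the outcome
%   D : delay to its receipt
%   k : free parameter estimated from the choice data
V = \frac{A}{1 + kD}
```

Steeper discounting (a larger fitted k) is what the “index of impulsivity” referred to in the text summarizes.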
As Odum (2011) has pointed out, behavior analysts object to delay discounting
research on at least two grounds. First, consistent with Baer et al.’s mistrust of verbal
data, there is the uncertain ontological status of “hypothetical verbal behavior,” which
is not the same thing as physical behavioral responses. Second, the concept of
“impulsivity” might suggest a personality construct, and historically, behavior analysts
have not looked fondly upon constructs that equate behavior patterns with an
unchanging trait. In light of these “shortcomings,” it might seem curious that both
Journal of the Experimental Analysis of Behavior and The Psychological Record have
recently devoted special issues to delay discounting (Bickel, MacKillop, Madden,
Odum, & Yi, 2015; Green, Myerson, & Critchfield, 2011). According to the logic often
employed in conjunction with the BWR framework, these projects undermine those
journals’ status as Behavioral.

Yet, as employed by most behavior analysts, “impulsivity” is merely a way of
summarizing behavior patterns in one domain (e.g., delay discounting tasks) that
predict patterns in others. As Odum notes:

Whatever delay discounting may be, its study has provided the field of behavior
analysis and other areas measures with robust generality and predictive validity
for a variety of significant human problems (p. 427)....Due to the scope and
impact of research in the area, the study of delay discounting can be considered as
one of the successes of the field of behavior analysis. Skinner (1938) maintained
that the appropriate level of analysis was one that produced orderly and
repeatable results.... Delay discounting is related to a host of maladaptive
behaviors, including drug abuse, gambling, obesity... poor college performance,
personal safety, and self care. (pp. 435–436)

Thus, standard critiques of hypothetical delay discounting tasks as not Behavioral
confuse construct validity with predictive validity. At issue is not whether verbal
responses are “true” but rather whether they provide insights into other objectively
verifiable and socially important behaviors and their eventual clinical modification.
They often do (Ayres, 2010; Bickel, Moody, & Higgins, 2016; Bickel & Vuchinich,
2000; Critchfield & Kollins, 2001; Odum, 2011).

Section Conclusion: ABA Research as a Fuzzy Concept

A great deal of interesting research on socially important problems is being conducted
without fully representing the dimensions of ABA that BWR specified. Even though
behavior analysts conducted much of the research just described, very different
procedures were employed across studies and research areas, and it is reasonable to
ask why.
We suggest that as behavior analysis extends to new domains of social importance, a
few key individuals modify normative rules of engagement based on trial-and-error
experience with new problems. As a result of this inductive process, it is a virtual
certainty that, for good and unavoidable reasons, emergent rules of engagement for
studying Social Problem X will differ from those for studying Social Problem Y. For
example, studies conducted in residential facilities may allow for long-term observation
of individual behavior within the context of elaborate, steady-state within-subject
experimental designs (e.g., Piazza, Hanley, & Fisher, 1996). The same luxuries may
not be available in everyday settings such as public schools and business organizations,
or for behaviors that occur too infrequently and unpredictably to be analyzed in terms
of individual-client response rates (e.g., suicidal actions; Richa, Fahed, Khoury, &
Mishara, 2014). In short, investigators tackling problems of societal importance that
have not been the focus of previous systematic investigation within applied behavior
analysis have little recourse but to embrace Baer et al.’s frequently overlooked assertion
that, “The label applied is determined not by the procedures used but by the interest
society shows in the problem being studied” (BWR, p. 92).
As a means of understanding ambiguous cases like those described here, it may be
productive to think about ABA research as a fuzzy concept rather than a conjoint set of
dimensions. A concept exists “when a group of [stimuli] gets the same response, when
they form a class the members of which are reacted to similarly” (Keller & Schoenfeld,
1950, p. 154). For instance, there are many different kinds of chairs, but we sit in all of
them and tact them with the same verbal label. In a fuzzy concept, exemplars of a class
have multiple defining stimulus features, but not all of these features must be present in
each exemplar (e.g., Herrnstein, 1990; Rosch & Mervis, 1975). For example, a chair
might be said to consist of (a) a surface for supporting the buttocks, (b) perched on
three or four legs, and supplemented by (c) a vertical surface for supporting the back
and (d) additional horizontal surfaces for supporting the arms. Although a “stadium
seat” lacks leg and arm supports, and a bar stool lacks the back and arm supports, both
are generally recognized as chairs. A beanbag chair, despite containing “pure” versions
of none of these features, also is recognized as a chair because it functions adequately
for sitting.
A concept implies “generalization within classes and discrimination between
classes” (Keller & Schoenfeld, 1950, p. 155, italics in original), so in a fuzzy
concept, there may be generalization along multiple stimulus dimensions. While
ABA research must reflect social relevance in some transparent fashion, Figs. 2,
3, and 4 make clear that ABA’s other defining features will be present to
varying degrees in different programs of investigation. A common finding in
the study of concept learning is that the more class-defining stimulus features
that are present in an exemplar, the more likely it is to be responded to as part
of the class (Rosch & Mervis, 1975). It is therefore understandable that
research incorporating all of the BWR dimensions is most readily recognized
as ABA while other research is sometimes viewed with ambivalence. But this
ambivalence is a property of observers, not of the relevant research. It makes
little sense to impose categorical distinctions (“applied” versus “not applied”)
on studies that, despite varying in degree of similarity to ABA research as
BWR described it, in all cases aim to advance the understanding of socially
important problems.
During the time of “The Crisis,” Iwata (1991) argued that the “too technological”
research that troubled other observers was a valid component of a discipline that by its
very nature should encompass a wide variety of research topics and styles. What he
wrote then is equally applicable to contemporary research that does not necessarily
meet every requirement of the BWR framework:

The most serious problem evident in applied behavior analysis today is not the
type of research being conducted; it is that not enough good research – of all
types – is being conducted.... To reduce the frequency of one type of research by
denigrating it or by punishing those who do it well seems foolish. (p. 424)

This is the case because opportunities exist

To conduct many varieties of empirical investigation that should qualify as
applied behavior analysis because they speak to the functional properties of
socially important behavior, even if they deviate from the field’s procedural
norms.... These are, to be sure, serious compromises in the modus operandi of
applied behavior analysis, and they should not be embraced lightly or routinely.
But neither should topics of considerable interest in the everyday world be
excluded from analysis. (Critchfield et al., 2003, pp. 482-482)

Limitations of Boundaries

Whatever else may be said of the BWR framework, its strict interpretation creates tidy
boundaries between ABA and other pursuits (as was Baer et al.’s intent). We have
attempted to raise the question of what is achieved, in the post-1968 world of
contemporary ABA, by drawing such distinctions. Once there might have existed
uncertainty about the goals to emphasize in extending behavior analysis to the
empirical study of everyday human behavior, and Baer et al.’s “Declaration of
Independence” was an attempt to ensure that the nascent field of ABA did not devolve
into something “too basic” for face-valid attention to important social problems or too
casual in approach to achieve behavior change when necessary. A half-century later,
with ABA’s traditions well established, neither concern seems terribly compelling.
One place where firm distinctions matter is in the development of a profession
(according to Google.com, “a paid occupation, especially one that involves prolonged
training and a formal qualification”). Members of a profession typically control who
may join as a means of protecting both access to a marketplace of consumers and the
quality of services delivered; credentialing is one mechanism for doing this (Critchfield,
2011b). In the case of ABA-as-profession, therefore, boundaries exclude unqualified
practitioners and help qualified practitioners find work. The BWR dimensions, as a
conjoint set, may be as good a means as any for determining who is qualified to
practice. But with one exception, to be noted shortly, the present discussion focuses on
research, not practice.
Research is a process of inquiry, not a guild, so what of firm distinctions between
one research tradition and another? Because our advocacy for “quasi-behavioral”
studies blurs the distinction between “pure” ABA research and other research
traditions, the reader will be justified in pointing out that not all methods are created
equal. Classical introspection, for instance, had fatal flaws that impeded scholarly
progress, and it was justifiably relegated to the scrap heap of investigatory tools (e.g.,
Boring, 1953). Are “quasi-behavioral” methods, like those of the studies described in
the previous section, similarly bankrupt? One perspective, which echoes a strict
embrace of the BWR conjoint framework, is that “The standards for pure behavioral
research are as uncompromising as the behavioral nature that dictates them… Their
violation may not doom an experiment to utter worthlessness, but it must suffer in
direct proportion to the trespass” (Johnston & Pennypacker, 1986, p. 32). In response
to this proposition, we merely repeat a theme from the last section. Do
“quasi-behavioral” studies like those we have described provide useful insights into
socially relevant behavior? And would including those studies in ABA’s core literature
help the field become more interesting, vibrant, intellectually engaged, and socially
relevant? The answers to these questions, rather than conceptual arguments, should
dictate our collective appraisal of “quasi-behavioral” methods.
A final way in which distinctions might matter is in delineating the mission of
academic journals. In penning the BWR article for this purpose, Baer et al. did not just
promote attention to social relevance; given the times in which they wrote, they also
promoted a particular means of achieving social relevance. If our hypothesis is
correct—if topics of societal importance are being marginalized in ABA because they
have not been examined within the BWR framework—then an interesting irony
emerges. The method that Baer et al. recommended for pursuing social relevance
now sometimes helps to limit the number of domains in which ABA, as depicted in
its core journals, is socially relevant.
For our money, applied journals, like the applied studies they feature, exist to
promote insights into problems of societal importance, not to promote a specific
method. Only academicians organize the world according to methods used to study
it, and when the goal is to understand a particular problem, firm methodological and
disciplinary boundaries rarely are productive. Many important scientific advances trace
to an investigator who gained insight by drawing on disparate influences (Root-
Bernstein, 1988). Pavlov learned how to measure salivation (and ultimately to describe
classical conditioning) by reading case reports of an American physician from the
French and Indian War era (Harre, 2009). Field’s (e.g., 1993) development of
therapeutic massage for preterm infants arose partly from an accidental encounter with
research about pup-rearing practices in rats. Several prominent behavior analysts have
similar stories of unexpected inspiration (Holdsambeck & Pennypacker, 2015, 2016).
We suggest that an essential purpose of problem-focused journals is to pull together
disparate influences that might yield the next important insight. This is not to propose
that, in a perfect world of journal operations, anything goes, only that the bar for what
pertains to a behavioral analysis should not be set so high as to discourage clearly
relevant work that does not fit a preconceived research mold.
Because insight is not the sole province of scientists, we conclude the present section
as it began, by focusing on practice. Baer et al. acknowledged, and observers during
“The Crisis” emphasized, that practitioners should understand both intervention
techniques (“a collection of tricks;” BWR, p. 96) and the behavior principles that guide
their design. Because most of today’s field workers are trained at the Masters level or
below (Shook & Favell, 2008), observers have worried that practitioner training may
include too little grounding in Conceptual foundations—creating a “Second Crisis” of
technological excess involving uncertain quality of implementation and dissemination
by practitioners who may be ill-equipped to link practice to basic principles that “have
the effect of making a body of technology into a discipline rather than a collection of
tricks” (BWR, p. 96). The stakes are considerable. Practitioners who do not understand
the field’s Conceptual foundations may not be “able to place their particular problem in
a more general context and thereby deal with it ... successfully” (Sidman, 2011, p. 973),
and any resulting ineffectiveness may undermine the public demand that now exists for
ABA services (Critchfield, Doepke, & Campbell, 2015; Dixon et al., 2015; Sidman,
2011).
How to establish the needed Conceptual skills during professional training is much
debated, but observers tend to agree on the importance of trainee engagement with the
field’s research foundations (Arena et al., 2015; Dixon, Reed, Smith, Belisle, &
Jackson, 2015; Hayes, 2015; Pritchard & Wine, 2015; Schlinger, 2010; Sidman,
2011). Iwata’s (1991) “First-Crisis” apprehension that “not enough good research –
of all types – is being conducted” (p. 424) seems relevant here because anecdotal
evidence suggests that principles-to-practice translation may be a unique repertoire that
does not emerge automatically from mastering either theory or practice (Critchfield,

2011a; Critchfield, Doepke, & Campbell, 2015; Critchfield & Reed, 2005; Mace &
Critchfield, 2010; Poling, Alling, & Fuqua, 1994; Virues-Ortega et al., 2014; Wacker,
1996, 2000, 2003). Moreover, it must be a generalized repertoire that maps onto a wide
variety of field circumstances (e.g., Chase & Wylie, 1985). We therefore suggest, in the
spirit of multiple exemplar training that produces generalized repertoires (e.g., Baer &
Sherman, 1964; Greer & Yuan, 2008; Herrnstein, Loveland, & Cable, 1976; Lechago,
Carr, Kisamore, & Grow, 2015; Rosales, Rehfeldt, & Lovett, 2011), that graduate
education for practitioners should involve plenty of opportunities to examine how
principles guide the understanding of everyday behavior. If the goal is for new
practitioners to recognize behavior principles in action everywhere they look, then
the best way to accomplish this is to show them behavior principles operating in as
many contexts as possible. Should a methodologically homogeneous ABA literature
discourage topical heterogeneity by omitting socially important behaviors that are hard
to study with preferred methods, then a limited range of exemplars is available for
consumption by future practitioners. The more limited the range of exemplars, the less
generalized the resulting translational repertoire is likely to become.

Looking Forward

As new discoveries are made, new truths disclosed, and manners and opinions
change with the change of circumstances, institutions must advance also, and
keep pace with the times. We might as well require a man to wear still the coat
which fitted him when a boy, as civilized society to remain ever under the
regimen of their barbarous14 ancestors. (Thomas Jefferson, letter to Samuel
Kercheval, July 12, 1816)

In almost 50 years since it was published, BWR has earned its iconic status by
helping to fuel the development of ABA research and practice. The debt that behavior
analysis owes to Baer et al. for systematizing useful rules of engagement should not be
understated. Yet much has happened since 1968, and a twenty-first century
“anthropologist’s account of the group calling its culture Applied Behavior Analysis”
(Baer et al., 1987, p. 313) might draw different conclusions than those advanced in
BWR. Among studies that employ or are relevant to a behavioral analysis of socially
important problems, many reflect the full BWR framework, but a substantial minority
do not. In considering BWR as the public face of contemporary ABA research, we are
reminded of the near-exclusive focus in 1960s television situation comedies on white,
heterosexual, middle-class, two-parent families that surely exist but do not define the
only model of a successful family. Similarly, although we cannot imagine ABA research
without BWR-style investigations that reflect all seven of the proposed dimensions, it is
misleading to suggest that this is the only possible approach to addressing socially
important problems. Risley (2001) acknowledged as much when he wrote:

14
Jefferson’s colorful language (“barbarous”) is reproduced here for historical accuracy, not to suggest any
disrespect for Baer, Wolf, or Risley. We employ this quote in the spirit of Hayes’ (2001) observation that, in
intellectual pursuits, one moves forward or stagnates.

Wolf and I intended the [BWR] article to be heuristic (“Some Current”) rather
than definitive (“Dimensions of Applied Behavior Analysis”). He and I assumed
that the enterprise of Applied Behavior Analysis would evolve – that findings
would condense into knowledge and technology, and that new problems and
opportunities would require and beget new research methodologies. (p. 270)

Too often, the BWR article functions as a conceptual and methodological bottleneck
through which work on socially important problems must pass in order to be embraced
by the ABA community. This is a curious state of affairs given that the article’s own
authors had differing predilections for levels of analyses. As Risley (2001) put it: “Baer
[was] mostly interested in explaining the world, and Wolf was mostly interested in fixing
the world. I think I was mostly interested in exploring the world” (p. 271). This is a fairly
apt description, we think, of the breadth of investigatory approaches described in the
present essay. One can never safely reconstruct the motivations of historical figures, but
we like to think that, if challenged during a spirited but friendly otherworldly bar
conversation, the ghosts of Baer, Wolf, and Risley might express reservations over the
homogeneity of approach that their iconic article has been used to support.
Because of the BWR article’s awkward fit with the activities of many contem-
porary behavior analysts who are examining socially important problems, it may
be necessary to reconsider BWR’s status as the definitive template for what ABA
is, or should be. The article clearly retains historical interest because it illuminates
the social and scientific dynamics of ABA’s formative era and provides potential
insight into the subsequent technological “Crisis.” We remain comfortable saying
that anyone who seeks to investigate socially important behavior should be aware
of the ideals expressed in the seven-dimension framework. But in contrast to
current practices in textbooks and scholarly journals, we do not recommend the
original BWR article as a primary frame of reference for this. Considering the
framework specifically as BWR discussed it admits polemical baggage that is both
dated and detrimental. For novices, the BWR article is a misleading gateway to
the field because its splitter overtones and cursory treatment of the Conceptual
dimension are out of step with the integrative translational mission of contempo-
rary behavior analysis. For those with more advanced skills, BWR’s conjoint-set
account of the seven dimensions undersells the potential of behavior analysis
research to shed light on a wide variety of social problems, discourages interest
in problems that do not readily fit into the framework, and supports a too narrow
conception of what belongs in Applied behavior analysis journals.
The time has come to replace BWR’s one-size-fits-all, conjoint-set perspective with
a simpler basis for evaluating ABA scholarly work: the extent to which an analysis has
the potential to advance our behavior-theory-driven understanding of socially important
problems. What is meant by “advancing understanding” will, of course, vary across
problem areas. For some problems that have been thoroughly investigated, standards
approximating the BWR conjoint set often may be appropriate. For problems that are
fairly new to ABA—and this could mean most problems that do not pertain to ABA’s
staple topics of autism and developmental disabilities—progress can take many forms
(though in most cases either the research question will be inspired by behavior theory or
regularities in socially important behavior will be examined, perhaps atheoretically, as a
precursor to a functional analysis). The methods employed to achieve this progress
should be appreciated for their heuristic capacity to advance understanding, not eval-
uated based on a single, predetermined model of investigation.

Acknowledgements All reviews and editorial decisions for this manuscript were handled independently by
Guest Associate Editor Mark Galizio.

Compliance with Ethical Standards

Conflict of Interest The authors declare that they have no conflicts of interest.

References

Alwash, R., & McCarthy, M. (1987). How do child accidents happen? Health Education Journal, 46, 169–
171. doi:10.1177/001789698704600410.
Arena, R., Chambers, S., Rhames, A., & Donahoe, K. (2015). The importance of research—a student
perspective. Behavior Analysis in Practice, 8, 152–153. doi:10.1007/s40617-015-0084-x.
Augustson, E. M., & Dougher, M. J. (1997). The transfer of avoidance evoking functions through stimulus
equivalence classes. Journal of Behavior Therapy and Experimental Psychiatry, 28, 181–191.
doi:10.1016/s0005-7916(97)00008-6.
Ayllon, T., & Michael, J. (1959). The psychiatric nurse as a behavioral engineer. Journal of the Experimental
Analysis of Behavior, 2, 323–334. doi:10.1901/jeab.1959.2-323.
Ayres, I. (2010). Carrots and sticks: unlock the power of incentives to get things done. New York: Bantam.
Azrin, N. H., & Lindsley, O. R. (1956). The reinforcement of cooperation between children. Journal of
Abnormal and Social Psychology, 52, 100–102. doi:10.1037/h0042490.
Baer, D. M. (1977). Perhaps it would be better not to know everything. Journal of Applied Behavior Analysis,
10, 167–172. doi:10.1901/jaba.1977.10-167.
Baer, D. M. (1981). A flight of behavior analysis. The Behavior Analyst, 4, 85–91.
Baer, D. M., & Sherman, J. A. (1964). Reinforcement control of generalized imitation in young children.
Journal of Experimental Child Psychology, 1, 37–49. doi:10.1016/0022-0965(64)90005-0.
Baer, D. M., Wolf, M. M., & Risley, T. R. (1968). Some current dimensions of applied behavior analysis.
Journal of Applied Behavior Analysis, 1, 91–97. doi:10.1901/jaba.1968.1-91.
Baer, D. M., Wolf, M. M., & Risley, T. R. (1987). Some still-current dimensions of applied behavior analysis.
Journal of Applied Behavior Analysis, 20, 313–327. doi:10.1901/jaba.1987.20-313.
Bailey, J. S., & Burch, M. R. (2002). Research methods in applied behavior analysis. Thousand Oaks, CA:
Sage. doi:10.4135/9781412985710.
Barnes-Holmes, D., Keane, J., Barnes-Holmes, Y., & Smeets, P. M. (2000). A derived transfer of emotive
functions as a means of establishing differential preferences for soft drinks. Psychological Record, 50,
493–511.
Baron, A., Perone, M., & Galizio, M. (1991). Analyzing the reinforcement process at the human level: can
application and behavioristic interpretation replace laboratory research? The Behavior Analyst, 14, 95–
105.
Bickel, W. K., MacKillop, J., Madden, G. J., Odum, A. L., & Yi, R. (2015). Experimental manipulations of
delay discounting and related processes: introduction to the special issue. Journal of the Experimental
Analysis of Behavior, 103, 1–9. doi:10.1002/jeab.133.
Bickel, W. K., Moody, L., & Higgins, S. T. (2016). Some current dimensions of the behavioral economics of
health-related behavior change. Preventive Medicine. doi:10.1016/j.ypmed.2016.06.002.
Bickel, W. K., & Vuchinich, R. E. (Eds.). (2000). Reframing health behavior change with behavioral
economics. Hove, UK: Psychology Press. doi:10.4324/9781410605061.
Birnbrauer, J. S. (1979). Applied behavior analysis, service and the acquisition of knowledge. The Behavior
Analyst, 2, 15–21.
Boring, E. G. (1953). A history of introspection. Psychological Bulletin, 50, 169–189. doi:10.1037/h0090793.
Branch, M. N., & Malagodi, E. F. (1980). Where have all the behaviorists gone? The Behavior Analyst, 3, 31–
39.
Branch, M. N., & Pennypacker, H. S. (2013). Generality and generalization of research findings. In G. J.
Madden, W. V. Dube, T. D. Hackenberg, G. P. Hanley, & K. A. Lattal (Eds.), APA handbook of behavior
analysis, Vol. 1: methods and principles (pp. 151–175). Washington, DC: American Psychological
Association.
Buskist, W., & Johnston, J. M. (1988). Laboratory lore and research practices in the experimental analysis of
human behavior. The Behavior Analyst, 11, 41–42.
McDowell, J. J., & Caron, M. L. (2010b). Matching in an undisturbed natural human environment. Journal of
the Experimental Analysis of Behavior, 93, 415–433. doi:10.1901/jeab.2010.93-415.
Carpintero Capell, H., Del Barrio, V., & Mababu, R. (2014). Applied psychology. The case of the Baer, Wolf
and Risley prescriptions for applied behavior analysis. Universitas Psychologica, 13(5), 1721–1728.
doi:10.11144/javeriana.upsy13-5.aptc.
Johnston, J. M., Carr, J. E., & Mellichamp, F. E. (in press). A history of the credentialing of applied behavior
analysts. The Behavior Analyst.
Cataldo, M. F., Finney, J. W., Richman, G. S., Riley, A. W., Hook, R. J., Brophy, C. J., & Nau, P. A. (1992).
Behavior of injured and uninjured children and their parents in a simulated hazardous setting. Journal of
Pediatric Psychology, 17, 73–80. doi:10.1093/jpepsy/17.1.73.
Chase, P. N., & Wylie, R. G. (1985). Doctoral training in behavior analysis: training generalized problem-
solving skills. The Behavior Analyst, 8, 159–176.
Chok, J. T., Reed, D. D., Kennedy, A., & Bird, F. L. (2010). A single-case experimental analysis of the effects
of ambient prism lenses for an adolescent with developmental disabilities. Behavior Analysis in Practice,
3(2), 42–51.
Cohen, D. A., Han, B., Derose, K. P., Williamson, A., Marsh, T., & McKenzie, T. L. (2013). Physical activity
in parks: a randomized controlled trial using community engagement. American Journal of Preventive
Medicine, 45, 590–597. doi:10.1016/j.amepre.2013.06.015.
Cohen, D. A., Han, B., Isacoff, J., Shulaker, B., Williamson, S., Marsh, T., McKenzie, T. L., Weir, M., &
Bhatia, R. (2015). Impact of park renovations on park use and park-based physical activity. Journal of
Physical Activity & Health, 12, 289–295. doi:10.1123/jpah.12.2.289.
Cohen, D. A., Marsh, T., Williamson, S., Han, B., Derose, K. P., Golinelli, D., & McKenzie, T. L. (2014). The
potential for pocket parks to increase physical activity. American Journal of Health Promotion, 28(sp3),
S19–S26. doi:10.4278/ajhp.130430-quan-213.
Cooper, J. O., Heron, T. E., & Heward, W. L. (2007). Applied behavior analysis (2nd ed.). Upper Saddle River,
NJ: Pearson.
Critchfield, T. S. (2011a). To a young basic scientist, about to embark on a program of translational research.
The Behavior Analyst, 34, 133–148.
Critchfield, T. S. (2011b). Interesting times: practice, science, and professional associations in behavior
analysis. The Behavior Analyst, 34, 297–310.
Critchfield, T. S., Doepke, K. J., & Campbell, R. L. (2015a). Origins of clinical innovations: why practice
needs science and how science reaches practice. In F. D. DiGennaro Reed & D. D. Reed (Eds.), Bridging
the gap between science and practice in autism service delivery (pp. 1–23). New York: Springer.
doi:10.1007/978-1-4939-2656-5_1.
Critchfield, T. S., & Epting, L. K. (1998). The trouble with babies and the value of bath water: complexities in
the use of verbal reports as data. The Analysis of Verbal Behavior, 15, 65–74.
Critchfield, T. S., & Fienup, D. M. (2013). A “happy hour” effect in translational stimulus relations research.
Experimental Analysis of Human Behavior Bulletin, 29, 2–7.
Critchfield, T. S., Haley, R., Sabo, B., Colbert, J., & Macropoulis, G. (2003). A half century of scalloping in
the work habits of the United States Congress. Journal of Applied Behavior Analysis, 36, 465–486.
doi:10.1901/jaba.2003.36-465.
Critchfield, T. S., & Kollins, S. H. (2001). Temporal discounting: basic research and the analysis of socially-
important behavior. Journal of Applied Behavior Analysis, 34, 101–122. doi:10.1901/jaba.2001.34-101.
Critchfield, T. S., & Reed, D. D. (2005). Conduits of translation in behavior-science bridge research. In J. E.
Burgos & E. Ribes (Eds.), Theory, basic and applied research, and technological applications in
behavior science: conceptual and methodological issues (pp. 45–84). Guadalajara: University of
Guadalajara Press.
Critchfield, T. S., Reed, D. D., & Jarmolowicz, D. P. (2015b). Historically low bill production in the United
States Congress: snapshot of a reinforcement-contingency system in transition. Psychological Record, 65,
161–176. doi:10.1007/s40732-014-0098-8.
Critchfield, T. S., & Stilling, S. T. (2015). A matching law analysis of risk tolerance and gain-loss framing in
football play selection. Behavior Analysis: Research and Practice, 15, 112–121. doi:10.1037/bar0000011.
Critchfield, T. S., Tucker, J. A., & Vuchinich, R. E. (1998). Self-report methods. In K. A. Lattal & M. Perone
(Eds.), Handbook of methods for the experimental analysis of human behavior (pp. 435–470). New York:
Plenum. doi:10.1007/978-1-4899-1947-2_14.
Cullen, C. (1981). The flight to the laboratory. The Behavior Analyst, 4, 81–83.
da Silva, S. P. D., & Lattal, K. A. (2010). Why pigeons say what they do: reinforcer magnitude and response
requirement effects on say responding in say-do correspondence. Journal of the Experimental Analysis of
Behavior, 93, 395–413. doi:10.1901/jeab.2010.93-395.
Darwin, C. (1859). On the origin of species by means of natural selection, or the preservation of favoured
races in the struggle for life. London: Murray. doi:10.5962/bhl.title.68064.
Davey, G. C. (1983). Animal models of human behavior: conceptual, evolutionary and neurobiological
perspectives. London: Wiley.
DiGennaro Reed, F. D., & Reed, D. D. (Eds.). (2015). Bridging the gap between science and practice in autism
service delivery. New York: Springer. doi:10.1007/978-1-4939-2656-5.
Deitz, S. M. (1978). Current status of applied behavior analysis: science versus technology. The American
Psychologist, 33, 805–814. doi:10.1037/0003-066x.33.9.805.
Deitz, S. M. (1982). Defining applied behavior analysis: an historical analogy. The Behavior Analyst, 5, 53–
64.
Deitz, S. M. (1983). Two correct definitions of “applied”. The Behavior Analyst, 6, 105–106.
Dinsmoor, J. A. (1991). The respective roles of human and nonhuman subjects in behavioral research. The
Behavior Analyst, 14, 117–121.
Dixon, M. R., Reed, D. D., Smith, T., Belisle, J., & Jackson, R. E. (2015). Research rankings of behavior
analytic graduate training programs and their faculty. Behavior Analysis in Practice, 8, 7–15. doi:10.1007
/s40617-015-0057-0.
Dixon, M. R., & Schreiber, J. B. (2002). Utilizing a computerized video poker simulation for the collection of
data on gambling behavior. Psychological Record, 52, 417–428.
Drake, R. E., Latimer, E. A., Leff, H. S., McHugo, G. J., & Burns, B. J. (2004). What is evidence? Child and
Adolescent Psychiatric Clinics of North America, 13, 717–728.
Dymond, S., Schlund, M. W., Roche, B., De Houwer, J., & Freegard, G. P. (2012). Safe from harm: learned,
instructed, and symbolic generalization pathways of human threat-avoidance. PloS One, 7(10), e47539.
doi:10.1371/journal.pone.0047539.
Etzel, B. C. (1987). Pigeons and children: what are the differences? Psychological Record, 37, 17–27.
Fawcett, S. B. (1985). On differentiation in applied behavior analysis. The Behavior Analyst, 8, 143–150.
Fawcett, S. B. (1991). Some values guiding community research and action. Journal of Applied Behavior
Analysis, 24, 621–636. doi:10.1901/jaba.1991.24-621.
Ferster, C. B., & Skinner, B. F. (1957). Schedules of reinforcement. New York: Appleton-Century-Crofts.
doi:10.1037/10627-000.
Field, T. M. (1993). The therapeutic effects of touch. In G. G. Brannigan & M. R. Merrens (Eds.), The
undaunted psychologist: adventures in research (pp. 3–11). Philadelphia: Temple University Press.
Finney, J. W., Christophersen, E. R., Friman, P. C., Kalnins, I. V., Maddux, J. E., Peterson, L., Roberts, M. C.,
& Wolraich, M. (1993). Society of Pediatric Psychology Task Force report: pediatric psychology and
injury control. Journal of Pediatric Psychology, 18, 499–526. doi:10.1093/jpepsy/18.4.499.
Gladsjo, J. A., Tucker, J. A., Hawkins, J. L., & Vuchinich, R. E. (1992). Adequacy of recall of drinking
patterns and event occurrences associated with natural recovery from alcohol problems. Addictive
Behaviors, 17, 347–358. doi:10.1016/0306-4603(92)90040-3.
Golgi, C. (1873). Sulla struttura della sostanza grigia del cervello. Gazzetta Medica Italiana Lombardia, 33,
244–246.
Green, G. (1996). Evaluating claims about treatments for autism. In C. Maurice, G. E. Green, & S. C. Luce
(Eds.), Behavioral intervention for young children with autism: a manual for parents and professionals
(pp. 15–28). Champaign, IL: Pro-Ed.
Green, L., Myerson, J., & Critchfield, T. S. (2011). Introduction to the special issue: translational research on
discounting. Psychological Record, 61, 523–526.
Greer, R. D., & Yuan, L. (2008). How kids learn to say the darnedest things: the effect of multiple exemplar
instruction on the emergence of novel verb usage. The Analysis of Verbal Behavior, 24, 103–121.
Guinther, P. M., & Dougher, M. J. (2010). Semantic false memories in the form of derived relational intrusions
following training. Journal of the Experimental Analysis of Behavior, 93, 329–347. doi:10.1901
/jeab.2010.93-329.
Guinther, P. M., & Dougher, M. J. (2014). Partial contextual control of semantic false memories in the form of
derived relational intrusions following training. Psychological Record, 64, 457–473. doi:10.1007/s40732-
014-0012-4.
Haegele, J. A., & Hodge, S. R. (2015). The applied behavior analysis research paradigm and single-subject
designs in adapted physical activity research. Adapted Physical Activity Quarterly, 32, 285–301.
doi:10.1123/apaq.2014-0211.
Hake, D. F. (1982). The basic-applied continuum and the possible evolution of human operant social and
verbal research. The Behavior Analyst, 5, 21–28.
Halpern, P. (2015). Einstein’s dice and Schrodinger’s cat: how two great minds battled quantum randomness
to create a unified theory of physics. New York: Basic Books. doi:10.1086/687134.
Hanley, G. P., Iwata, B. A., & McCord, B. E. (2003). Functional analysis of problem behavior: a review.
Journal of Applied Behavior Analysis, 36, 147–185. doi:10.1901/jaba.2003.36-147.
Harré, R. (2009). Pavlov's dogs and Schrödinger's cat: scenes from the living laboratory. Oxford: Oxford University Press.
Harzem, P. (2004). Behaviorism for new psychology: what was wrong with behaviorism and what is wrong
with it now. Behavior and Philosophy, 32, 5–12.
Harzem, P., & Williams, R. A. (1983). On searching for a science of human behavior. Psychological Record,
33, 565–574.
Hayes, S. C. (2001). The greatest dangers facing behavior analysis today. The Behavior Analyst Today, 2(2),
61–63. doi:10.1037/h0099914.
Hayes, L. J. (2015). There's a man goin' round taking names. Behavior Analysis in Practice, 8, 134–135.
doi:10.1007/s40617-015-0070-3.
Hayes, S. C., Rincover, A., & Solnick, J. V. (1980). The technical drift of applied behavior analysis. Journal of
Applied Behavior Analysis, 13, 275–285. doi:10.1901/jaba.1980.13-275.
Herrnstein, R. J. (1990). Levels of stimulus control: a functional approach. Cognition, 37, 133–166.
doi:10.1016/0010-0277(90)90021-b.
Herrnstein, R. J., Loveland, D. H., & Cable, C. (1976). Natural concepts in pigeons. Journal of Experimental
Psychology. Animal Behavior Processes, 2, 285–302. doi:10.1037//0097-7403.2.4.285.
Higgins, S. T., Budney, A. J., Bickel, W. K., Foerg, F. E., Donham, R., & Badger, G. J. (1994). Incentives
improve outcome in outpatient behavioral treatment of cocaine dependence. Archives of General
Psychiatry, 51, 568–576. doi:10.1001/archpsyc.1994.03950070060011.
Higgins, S. T., Reed, D. D., Redner, R., Skelly, J. M., Zvorski, & Kurti, A. N. (2017). Simulating demand for
cigarettes among pregnant women: a low-risk method for studying vulnerable populations. Journal of the
Experimental Analysis of Behavior, 107, 176–190. doi:10.1002/jeab.232.
Holdsambeck, R., & Pennypacker, H. S. (2015). Behavioral science: tales of inspiration, discovery, and
service. Cornwall-on-Hudson, NY: Sloan.
Holdsambeck, R., & Pennypacker, H. S. (2016). Behavioral science: tales of inspiration, discovery, and
service (Vol. II). Cornwall-on-Hudson, NY: Sloan.
Horner, R. H., & Sugai, G. (2015). School-wide PBIS: an example of applied behavior analysis implemented
at a scale of social importance. Behavior Analysis in Practice, 8, 80–85. doi:10.1007/s40617-015-0045-4.
Hurtado-Parrado, C., & Lopez-Lopez, W. (2015). Single-case research methods: history and suitability for a
psychological science in need of alternatives. Integrative Psychological & Behavioral Science, 49, 323–
349. doi:10.1007/s12124-014-9290-2.
Hutton, J. (1788). X. Theory of the earth; or an investigation of the laws observable in the composition,
dissolution, and restoration of land upon the globe. Transactions of the Royal Society of Edinburgh, 1(02),
209–304. doi:10.1017/s0080456800029227.
Iwata, B. A. (1991). Applied behavior analysis as technological science. Journal of Applied Behavior
Analysis, 24, 421–424. doi:10.1901/jaba.1991.24-421.
Iwata, B. A., & Becksfort, C. M. (1981). Behavioral research in preventive dentistry: educational and
contingency management approaches to the problem of patient compliance. Journal of Applied
Behavior Analysis, 14, 111–120. doi:10.1901/jaba.1981.14-111.
Iwata, B. A., Dorsey, M. F., Slifer, K. J., Bauman, K. E., & Richman, G. S. (1982). Toward a functional
analysis of self-injury. Analysis and Intervention in Developmental Disabilities, 2, 3–20. doi:10.1016
/0270-4684(82)90003-9.
Jenkins, S. R., & DiGennaro Reed, F. D. (2013). An experimental analysis of the effects of therapeutic
horseback riding on the behavior of children with autism. Research in Autism Spectrum Disorders, 7,
721–740. doi:10.1016/j.rasd.2013.02.008.
Johnston, J. M. (2000). Behavior analysis and the R&D paradigm. The Behavior Analyst, 23, 141–148.
Johnston, J. M., & Pennypacker, H. S. (1980). Strategies and tactics of behavioral research. Upper Saddle
River, NJ: Erlbaum.
Johnston, J. M., & Pennypacker, H. S. (1986). Pure versus quasi-behavioral research. In A. Poling & R. W.
Fuqua (Eds.), Research methods in applied behavior analysis: issues and advances (pp. 29–54). New
York: Plenum. doi:10.1007/978-1-4684-8786-2_3.
Jones, H. E., Haug, N., Silverman, K., Stitzer, M., & Svikis, D. (2001). The effectiveness of incentives in
enhancing treatment attendance and drug abstinence in methadone-maintained pregnant women. Drug
and Alcohol Dependence, 61, 297–306. doi:10.1016/s0376-8716(00)00152-6.
Kahng, S., Hausman, N. L., Fisher, A. B., Donaldson, J. M., Cox, J. R., Lugo, M., & Wiskow, K. M. (2015).
The safety of functional analyses of self-injurious behavior. Journal of Applied Behavior Analysis, 48,
107–114. doi:10.1002/jaba.168.
Keller, F. S., & Schoenfeld, W. S. (1950). Principles of psychology: a systematic text in the science of
behavior. New York: Appleton-Century-Crofts. doi:10.1037/11293-000.
Lattal, K. A., & Doepke, K. J. (2001). Correspondence as conditional stimulus control: insights from
experiments with pigeons. Journal of Applied Behavior Analysis, 34, 127–144. doi:10.1901
/jaba.2001.34-127.
Lechago, S. A., Carr, J. E., Kisamore, A. N., & Grow, L. L. (2015). The effects of multiple exemplar
instruction on the relation between listener and intraverbal categorization repertoires. The Analysis of
Verbal Behavior, 31, 76–95. doi:10.1007/s40616-015-0027-1.
Lerman, D. C., Sansbury, T., Hovanetz, A., Wolever, E., Garcia, A., O’Brien, E., & Adedipe, H. (2008). Using
behavior analysis to examine the outcomes of unproven therapies: an evaluation of hyperbaric oxygen
therapy for children with autism. Behavior Analysis in Practice, 1(2), 50–58.
Lilienfeld, S. O. (2002). The scientific review of mental health practice: our raison d’etre. The Scientific
Review of Mental Health Practice, 1, 1–9.
Lillis, J., & Hayes, S. C. (2007). Applying acceptance, mindfulness, and values to the reduction of prejudice: a
pilot study. Behavior Modification, 31, 389–411. doi:10.1177/0145445506298413.
Lindsley, O. R. (2002). Our Harvard pigeon, rat, dog, and human lab. Journal of the Experimental Analysis of
Behavior, 77, 385–387. doi:10.1901/jeab.2002.77-385.
Lindsley, O.R., & Lindsley, M. (1952). The reinforcing effect of auditory stimuli on operant behavior in the
human infant. Paper presented at the meeting of the Eastern Psychological Association, Atlantic City, NJ.
Lovaas, O. I. (1987). Behavioral treatment and normal educational and intellectual functioning in young
autistic children. Journal of Consulting and Clinical Psychology, 55, 3–9. doi:10.1037/0022-006x.55.1.3.
Lussier, J. P., Heil, S. H., Mongeon, J. A., Badger, G. J., & Higgins, S. T. (2006). A meta-analysis of voucher-
based reinforcement therapy for substance use disorders. Addiction, 101, 192–203. doi:10.1111/j.1360-
0443.2006.01311.x.
Mace, F. C., & Critchfield, T. S. (2010). Translational research in behavior analysis: historical traditions and
imperative for the future. Journal of the Experimental Analysis of Behavior, 93, 293–312. doi:10.1901
/jeab.2010.93-293.
Mace, F. C., Lalli, J. S., Shea, M. C., & Nevin, J. A. (1992). Behavioral momentum in college basketball.
Journal of Applied Behavior Analysis, 25, 657–663. doi:10.1901/jaba.1992.25-657.
Mace, F. C., McComas, J. J., Mauro, B. C., Progar, P. R., Taylor, B., Ervin, R., & Zangrillo, A. N. (2010).
Differential reinforcement of alternative behavior increases resistance to extinction: clinical demonstra-
tion, animal modeling, and clinical test of one solution. Journal of the Experimental Analysis of Behavior,
93, 349–367. doi:10.1901/jeab.2010.93-349.
Maclin, O. H., Dixon, M. R., & Hayes, L. J. (1999). A computerized slot machine simulation to investigate the
variables involved in gambling behavior. Behavior Research Methods, Instruments, & Computers, 31,
731–734. doi:10.3758/bf03195554.
Madden, G. J. (Ed.-in-Chief). (2012). APA handbook of behavior analysis (Vols. 1–2). Washington, DC:
APA Books.
Malott, R. W., & Shane, J. T. (2014). Principles of behavior (7th ed.). Boston: Pearson.
Marr, M. J. (1991). The speciation of behavior analysis: the unnatural selection of foxes and hedgehogs. The
Behavior Analyst, 14, 183–186.
Martin, G., & Pear, J. (2015). Behavior modification: what it is and how to do it (10th ed.). Boston: Pearson.
Mayer, G. R., Sulzer-Azaroff, B., & Wallace, M. (2012). Behavior analysis for lasting change (2nd ed.).
Cornwall-on-Hudson, NY: Sloan.
McDowell, J. J., & Caron, M. L. (2010a). Bias and undermatching in delinquent boys’ verbal behavior as a
function of their level of deviance. Journal of the Experimental Analysis of Behavior, 93, 471–483.
doi:10.1901/jeab.2010.93-471.
McKenzie, T. L., Cohen, D. A., Sehgal, A., Williamson, S., & Golinelli, D. (2006). System for Observing Play
and Recreation in Communities (SOPARC): reliability and feasibility measures. Journal of Physical
Activity & Health, 3, S208–S222. doi:10.1123/jpah.3.s1.s208.
Meisch, R. A., & Carroll, M. E. (1987). Oral drug self-administration: drugs as reinforcers. In M. A. Bozarth
(Ed.), Methods of assessing the reinforcing properties of abused drugs (pp. 143–160). New York:
Springer. doi:10.1007/978-1-4612-4812-5_7.
Mercer, J., Sarner, L., & Rosa, L. (2003). Attachment therapy on trial: the torture and death of Candace
Newmaker. Santa Barbara, CA: Greenwood Publishing Group.
Michael, J. (1980). Flight from behavior analysis. The Behavior Analyst, 3(2), 1–21.
Michael, J. (1984). Verbal behavior. Journal of the Experimental Analysis of Behavior, 42, 363–376.
doi:10.1901/jeab.1984.42-363.
Miltenberger, R. G. (2016). Behavior modification: principles and procedures. Boston: Cengage.
Montee, B. B., Miltenberger, R. G., & Wittrock, D. (1995). An experimental analysis of facilitated commu-
nication. Journal of Applied Behavior Analysis, 28, 189–200. doi:10.1901/jaba.1995.28-189.
Morris, E. K. (1991). Deconstructing “technological to a fault”. Journal of Applied Behavior Analysis, 24,
411–416. doi:10.1901/jaba.1991.24-411.
Morris, E. K., Altus, K. E., & Smith, N. G. (2013). A study in the founding of applied behavior analysis
through its publications. The Behavior Analyst, 36, 73–107.
Moxley, R. A. (1989). Some historical relationships between science and technology with implications for
behavior analysis. The Behavior Analyst, 12, 45–57.
Nevin, J. A. (2003). Retaliating against terrorists. Behavior and Social Issues, 12, 109–128.
doi:10.5210/bsi.v12i2.39.
Normand, M. P. (2008). Science, skepticism, and applied behavior analysis. Behavior Analysis in Practice,
1(2), 42–49.
Odum, A. L. (2011). Delay discounting: I’m a k, you’re a k. Journal of the Experimental Analysis of Behavior,
96, 427–439. doi:10.1901/jeab.2011.96-423.
Perone, M. (1985). On the impact of human operant research: asymmetrical patterns of cross-citation between
human and nonhuman research. The Behavior Analyst, 8, 185–189.
Perone, M., Galizio, M., & Baron, A. (1988). The relevance of animal-based principles in the laboratory study
of human operant conditioning. In G. Davey & C. Cullen (Eds.), Human operant conditioning and
behavior modification (pp. 59–85). Oxford: Wiley.
Peterson, L., Farmer, J., & Mori, L. (1987). Process analysis of injury situations: a complement to epidemi-
ological methods. Journal of Social Issues, 43, 33–44. doi:10.1111/j.1540-4560.1987.tb01293.x.
Piazza, C. C., Hanley, G. P., & Fisher, W. W. (1996). Functional analysis and treatment of cigarette pica.
Journal of Applied Behavior Analysis, 29, 437–450. doi:10.1901/jaba.1996.29-437.
Pierce, W. D., & Epling, W. F. (1980). What happened to analysis in applied behavior analysis? The Behavior
Analyst, 3, 1–9.
Poling, A., Alling, K., & Fuqua, R. W. (1994). Self- and cross-citations in the Journal of Applied Behavior
Analysis and the Journal of the Experimental Analysis of Behavior. Journal of Applied Behavior Analysis,
27, 729–731. doi:10.1901/jaba.1994.27-729.
Poling, A., & Grossett, D. (1986). Basic research designs in applied behavior analysis. In A. Poling & R. W.
Fuqua (Eds.), Research methods in applied behavior analysis: issues and advances (pp. 7–27). New
York: Plenum. doi:10.1007/978-1-4684-8786-2_2.
Poling, A., Picker, M., Grossett, D., Hall-Johnson, E., & Holbrook, M. (1981). The schism between
experimental and applied behavior analysis: is it real and who cares? The Behavior Analyst, 4(2), 93–102.
Poling, A., Weeden, M. A., Redner, R., & Foster, T. M. (2011). Switch hitting in baseball: apparent rule-
following, not matching. Journal of the Experimental Analysis of Behavior, 96, 283–289. doi:10.1901
/jeab.2011.96-283.
Prendergast, M., Podus, M., Finney, J., Greenwell, L., & Roll, J. (2006). Contingency management for
treatment of substance use disorders: a meta-analysis. Addiction, 101, 1546–1560.
Pritchard, J. K., & Wine, B. (2015). Icing on the cake: the role of research in practitioner training. Behavior
Analysis in Practice, 8, 140–141. doi:10.1007/s40617-015-0074-z.
Reed, D. D., Critchfield, T. S., & Martens, B. K. (2006). Operant choice in elite sport competition: the
matching law and play-calling in professional football. Journal of Applied Behavior Analysis, 39, 281–
297. doi:10.1901/jaba.2006.146-05.
Reed, D. D., Partington, S. W., Kaplan, B. A., Roma, P. G., & Hursh, S. R. (2013). Behavioral economic
analysis of demand for fuel in North America. Journal of Applied Behavior Analysis, 46, 651–655.
doi:10.1002/jaba.64.
Richa, S., Fahed, M., Khoury, E., & Mishara, B. (2014). Suicide in autism spectrum disorders. Archives of
Suicide Research, 18, 327–339. doi:10.1080/13811118.2013.824834.
Rider, D. P. (1991). The speciation of behavior analysis. The Behavior Analyst, 14, 171–181.
Risley, T. R. (2001). Do good, take data. In W. T. O’Donohue, D. A. Henderson, S. C. Hayes, J. E. Fisher, &
L. J. Hayes (Eds.), A history of the behavioral therapies: founders’ personal histories (pp. 267–287).
Reno, NV: Context Press.
Roane, H. S., Fisher, W. W., & McDonough, E. M. (2003). Progressing from programmatic to discovery
research: a case example with the overjustification effect. Journal of Applied Behavior Analysis, 36, 35–
46. doi:10.1901/jaba.2003.36-35.
Roche, B., & Barnes, D. (1996). Arbitrarily applicable relational responding and sexual categorization: a
critical test of the derived difference relation. Psychological Record, 46, 451–476.
Root-Bernstein, D. (1988). Setting the stage for discovery: breakthroughs depend on more than luck. The
Sciences, 28(3), 26–34. doi:10.1002/j.2326-1951.1988.tb03019.x.
Rosales, R., Rehfeldt, R. A., & Lovett, S. (2011). Effects of multiple exemplar training on the emergence of
derived relations in preschool children learning a second language. The Analysis of Verbal Behavior, 27,
61–74.
Rosch, E., & Mervis, C. B. (1975). Family resemblances: studies in the internal structure of categories.
Cognitive Psychology, 7, 573–605. doi:10.1016/0010-0285(75)90024-9.
Rutherford, A. (2009). Beyond the box: B.F. Skinner’s technology of behavior from laboratory to life, 1950s–
1970s. Toronto: University of Toronto Press.
Schlinger, H. D. (2010). Perspectives on the future of behavior analysis: introductory comments. The Behavior
Analyst, 33, 1–5.
Seekins, T., Fawcett, S. C., Cohen, S. H., Elder, J. P., Jason, L. A., Schnelle, J. F., & Winett, R. A. (1988).
Experimental evaluation of public policy: the case of state legislation for child passenger safety. Journal
of Applied Behavior Analysis, 21, 233–243. doi:10.1901/jaba.1988.21-233.
Seniuk, H. A., Williams, W. L., Reed, D. D., & Wright, J. W. (2015). An examination of matching with
multiple response alternatives in professional hockey. Behavior Analysis: Research and Practice, 15,
152–160. doi:10.1037/bar0000019.
Shook, G. L., & Favell, J. E. (2008). The Behavior Analyst Certification Board and the profession of behavior
analysis. Behavior Analysis in Practice, 1(1), 44–48.
Shute, N. (2010). Desperate for an autism cure. Scientific American, 303(4), 80–85. doi:10.1038
/scientificamerican1010-80.
Sidman, M. (1960). Tactics of scientific research. New York: Basic Books.
Sidman, M. (1995). Foreword I: K&S: then and now. In F. S. Keller & W. S. Schoenfeld, Principles of
psychology: a systematic text in the science of behavior (unpaginated preface), B. F. Skinner Foundation
Reprint Series. Cambridge, MA: B. F. Skinner Foundation.
Sidman, M. (2011). Can an understanding of basic research facilitate the effectiveness of practitioners?
Reflections and personal perspectives. Journal of Applied Behavior Analysis, 44, 973–991. doi:10.1901
/jaba.2011.44-973.
Skinner, B. F. (1938). The behavior of organisms: an experimental analysis. New York: Appleton-Century.
Skinner, B. F. (1948). Walden Two. New York: Macmillan.
Skinner, B. F. (1953). Science and human behavior. New York: Macmillan.
Skinner, B. F. (1956). A case history in scientific method. The American Psychologist, 11, 221–233.
doi:10.1037/h0047662.
Skinner, B. F. (1957). Verbal behavior. New York: Appleton-Century-Crofts. doi:10.1037/11256-000.
Skinner, B. F. (1959). Cumulative record. New York: Appleton-Century-Crofts.
Skinner, B. F. (1978). Reflections on behaviorism and society. Englewood Cliffs, NJ: Prentice-Hall.
Stokes, D. E. (1997). Pasteur’s quadrant: basic science and technological innovation. Washington, DC:
Brookings Institution Press.
Strosahl, K. D., Hayes, S. C., Bergan, J., & Romano, P. (1998). Assessing the field effectiveness of acceptance
and commitment therapy: an example of the manipulated training research method. Behavior Therapy, 29,
35–63. doi:10.1016/s0005-7894(98)80017-8.
Thorp, F. H. (1898). Outlines of industrial chemistry: a text-book for students. New York: Macmillan.
Tiger, J. H., & Hanley, G. P. (2005). An example of discovery research involving the transfer of stimulus
control. Journal of Applied Behavior Analysis, 38, 499–509. doi:10.1901/jaba.2005.139-04.
Tucker, J. A., Vuchinich, R. E., & Pukish, M. M. (1995). Molar environmental contexts surrounding recovery
from alcohol problems by treated and untreated problem drinkers. Experimental and Clinical
Psychopharmacology, 3, 195–204. doi:10.1037//1064-1297.3.2.195.
Tucker, J. A., Vuchinich, R. E., & Rippens, P. D. (2002). Predicting natural resolution of alcohol-related
problems: a prospective behavioral economic analysis. Experimental and Clinical Psychopharmacology,
10, 248–257. doi:10.1037//1064-1297.10.3.248.

Virues-Ortega, J., Hurtado-Parrado, C., Cox, A. D., & Pear, J. J. (2014). Analysis of the interaction between
experimental and applied behavior analysis. Journal of Applied Behavior Analysis, 47, 380–403.
doi:10.1002/jaba.124.
Vollmer, T. R., & Bourret, J. (2000). An application of the matching law to evaluate the allocation of two- and
three-point shots by college basketball players. Journal of Applied Behavior Analysis, 33, 137–150.
doi:10.1901/jaba.2000.33-137.
Wacker, D. P. (1996). Behavior analysis research in JABA: a need for studies that bridge basic and applied
research. Experimental Analysis of Human Behavior Bulletin, 14, 11–14.
Wacker, D. P. (2000). Building a bridge between research in experimental and applied behavior analysis. In J.
C. Leslie & D. Blackman (Eds.), Experimental and applied analysis of human behavior (pp. 205–212).
Reno, NV: Context Press.
Wacker, D. P. (2003). Bridge studies in behavior analysis: evolution and challenges in JABA. The Behavior
Analyst Today, 3, 405–411. doi:10.1037/h0099998.
Wang, Y., & Beydoun, M. A. (2007). The obesity epidemic in the United States—gender, age, socioeconomic,
racial/ethnic, and geographic characteristics: a systematic review and meta-regression analysis.
Epidemiologic Reviews, 29, 6–28. doi:10.1093/epirev/mxm007.
Watson, J. B. (1913). Psychology as the behaviorist views it. Psychological Review, 20(2), 158–177.
doi:10.1037/h0074428.
Weisberg, P., & Waldrop, P. B. (1972). Fixed-interval work habits of Congress. Journal of Applied Behavior
Analysis, 5, 93–97. doi:10.1901/jaba.1972.5-93.
Williams, C. D. (1959). The elimination of tantrum behavior by extinction procedures. Journal of Abnormal
and Social Psychology, 59, 269. doi:10.1037/h0046688.
Wolf, M. M. (1978). Social validity: the case for subjective measurement or how applied behavior analysis is
finding its heart. Journal of Applied Behavior Analysis, 11, 203–214. doi:10.1901/jaba.1978.11-203.
Wolf, M. M. (2001). Application of operant conditioning procedures to the behavior problems of an autistic
child: a 25-year follow-up and the development of the Teaching Family model. In W. T. O’Donohue, D.
A. Henderson, S. C. Hayes, J. E. Fisher, & L. J. Hayes (Eds.), A history of the behavioral therapies:
founders’ personal histories (pp. 289–293). Reno, NV: Context Press.
