Perspective

Published: 22 November 2022

Single case studies are a powerful tool for developing, testing and extending theories

  • Lyndsey Nickels, ORCID: orcid.org/0000-0002-0311-3524 (affiliations 1, 2)
  • Simon Fischer-Baum, ORCID: orcid.org/0000-0002-6067-0538 (affiliation 3)
  • Wendy Best, ORCID: orcid.org/0000-0001-8375-5916 (affiliation 4)

Nature Reviews Psychology volume 1, pages 733–747 (2022)


Psychology embraces a diverse range of methodologies. However, most rely on averaging group data to draw conclusions. In this Perspective, we argue that single case methodology is a valuable tool for developing and extending psychological theories. We stress the importance of single case and case series research, drawing on classic and contemporary cases in which cognitive and perceptual deficits provide insights into typical cognitive processes in domains such as memory, delusions, reading and face perception. We unpack the key features of single case methodology, describe its strengths, its value in adjudicating between theories, and outline its benefits for a better understanding of deficits and hence more appropriate interventions. The unique insights that single case studies have provided illustrate the value of in-depth investigation within an individual. Single case methodology has an important place in the psychologist’s toolkit and it should be valued as a primary research tool.




Acknowledgements

The authors thank all of those pioneers of and advocates for single case study research who have mentored, inspired and encouraged us over the years, and the many other colleagues with whom we have discussed these issues.

Author information

Authors and affiliations

School of Psychological Sciences & Macquarie University Centre for Reading, Macquarie University, Sydney, New South Wales, Australia

Lyndsey Nickels

NHMRC Centre of Research Excellence in Aphasia Recovery and Rehabilitation, Australia

Lyndsey Nickels

Psychological Sciences, Rice University, Houston, TX, USA

Simon Fischer-Baum

Psychology and Language Sciences, University College London, London, UK

Wendy Best

Contributions

L.N. led and was primarily responsible for the structuring and writing of the manuscript. All authors contributed to all aspects of the article.

Corresponding author

Correspondence to Lyndsey Nickels .

Ethics declarations

Competing interests

The authors declare no competing interests.

Peer review

Peer review information

Nature Reviews Psychology thanks Yanchao Bi, Rob McIntosh, and the other, anonymous, reviewer for their contribution to the peer review of this work.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article

Cite this article

Nickels, L., Fischer-Baum, S. & Best, W. Single case studies are a powerful tool for developing, testing and extending theories. Nat Rev Psychol 1 , 733–747 (2022). https://doi.org/10.1038/s44159-022-00127-y


Accepted : 13 October 2022

Published : 22 November 2022

Issue Date : December 2022

DOI : https://doi.org/10.1038/s44159-022-00127-y




Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards

Justin D. Smith


Address correspondence to Justin D. Smith, Child and Family Center, University of Oregon, 195 West 12th Avenue, Eugene, OR 97401-3408; [email protected]

Issue date 2012 Dec.

This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have precluded widespread implementation and acceptance of the SCED as a viable complementary methodology to the predominant group design. This article includes a description of the research design, measurement, and analysis domains distinctive to the SCED; a discussion of the results within the framework of contemporary standards and guidelines in the field; and a presentation of updated benchmarks for key characteristics (e.g., baseline sampling, method of analysis), and overall, it provides researchers and reviewers with a resource for conducting and evaluating SCED research. The results of the systematic review of 409 studies suggest that recently published SCED research is largely in accordance with contemporary criteria for experimental quality. Analytic method emerged as an area of discord. Comparison of the findings of this review with historical estimates of the use of statistical analysis indicates an upward trend, but visual analysis remains the most common analytic method and also garners the most support amongst those entities providing SCED standards. Although consensus exists along key dimensions of single-case research design and researchers appear to be practicing within these parameters, there remains a need for further evaluation of assessment and sampling techniques and data analytic methods.

Keywords: daily diary, single-case experimental design, systematic review, time-series

The single-case experiment has a storied history in psychology dating back to the field’s founders: Fechner (1889) , Watson (1925) , and Skinner (1938) . It has been used to inform and develop theory, examine interpersonal processes, study the behavior of organisms, establish the effectiveness of psychological interventions, and address a host of other research questions (for a review, see Morgan & Morgan, 2001 ). In recent years the single-case experimental design (SCED) has been represented in the literature more often than in past decades, as is evidenced by recent reviews ( Hammond & Gast, 2010 ; Shadish & Sullivan, 2011 ), but it still languishes behind the more prominent group design in nearly all subfields of psychology. Group designs are often professed to be superior because they minimize, although do not necessarily eliminate, the major internal validity threats to drawing scientifically valid inferences from the results ( Shadish, Cook, & Campbell, 2002 ). SCEDs provide a rigorous, methodologically sound alternative method of evaluation (e.g., Barlow, Nock, & Hersen, 2008 ; Horner et al., 2005 ; Kazdin, 2010 ; Kratochwill & Levin, 2010 ; Shadish et al., 2002 ) but are often overlooked as a true experimental methodology capable of eliciting legitimate inferences (e.g., Barlow et al., 2008 ; Kazdin, 2010 ). Despite a shift in the zeitgeist from single-case experiments to group designs more than a half century ago, recent and rapid methodological advancements suggest that SCEDs are poised for resurgence.

Basics of the SCED

Single case refers to the participant or cluster of participants (e.g., a classroom, hospital, or neighborhood) under investigation. In contrast to an experimental group design in which one group is compared with another, participants in single-subject experimental research provide their own control data for the purpose of comparison, in a within-subject rather than a between-subjects design. SCEDs typically involve a comparison between two experimental time periods, known as phases. This approach usually begins with a representative baseline phase that serves as a comparison for subsequent phases. In studies examining single subjects that are actually groups (i.e., classroom, school), there are additional threats to the internal validity of the results, as noted by Kratochwill and Levin (2010), which include setting or site effects.

The central goal of the SCED is to determine whether a causal or functional relationship exists between a researcher-manipulated independent variable (IV) and a meaningful change in the dependent variable (DV). SCEDs generally involve repeated, systematic assessment of one or more IVs and DVs over time. The DV is measured repeatedly across and within all conditions or phases of the IV. Experimental control in SCEDs includes replication of the effect either within or between participants ( Horner et al., 2005 ). Randomization is another way in which threats to internal validity can be experimentally controlled. Kratochwill and Levin (2010) recently provided multiple suggestions for adding a randomization component to SCEDs to improve the methodological rigor and internal validity of the findings.
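To make the logic of phase comparison and randomization concrete, consider the following minimal Python sketch. It is offered as an illustration only: the data stream, the intervention start point, and the minimum phase length are hypothetical and are not drawn from any study reviewed here. The sketch compares baseline and intervention phase means for a single participant and evaluates the observed shift with a start-point randomization test, one way of building in the randomization component described above.

    import numpy as np

    # Hypothetical AB (baseline/intervention) data for one participant:
    # repeated measurements of the dependent variable (DV) across sessions.
    dv = np.array([3, 4, 3, 5, 4, 4, 9, 8, 10, 9, 11, 10], dtype=float)
    actual_start = 6       # phase B (intervention) assumed to begin at session index 6
    min_phase_len = 3      # require at least 3 observations in each phase

    def mean_shift(series, start):
        """Difference between intervention-phase and baseline-phase means."""
        return series[start:].mean() - series[:start].mean()

    observed = mean_shift(dv, actual_start)

    # Start-point randomization test: had the intervention start point been chosen
    # at random from all admissible points, how unusual is the observed shift?
    candidate_starts = range(min_phase_len, len(dv) - min_phase_len + 1)
    shifts = np.array([mean_shift(dv, s) for s in candidate_starts])
    p_value = np.mean(shifts >= observed)

    print(f"observed shift = {observed:.2f}, randomization p = {p_value:.2f}")

A reversal (ABAB) or multiple baseline study extends the same logic by replicating the comparison within or across participants, settings, or behaviors.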

Examination of the effectiveness of interventions is perhaps the area in which SCEDs are most well represented ( Morgan & Morgan, 2001 ). Researchers in behavioral medicine and in clinical, health, educational, school, sport, rehabilitation, and counseling psychology often use SCEDs because they are particularly well suited to examining the processes and outcomes of psychological and behavioral interventions (e.g., Borckardt et al., 2008 ; Kazdin, 2010 ; Robey, Schultz, Crawford, & Sinner, 1999 ). Skepticism about the clinical utility of the randomized controlled trial (e.g., Jacobsen & Christensen, 1996 ; Wachtel, 2010 ; Westen & Bradley, 2005 ; Westen, Novotny, & Thompson-Brenner, 2004 ) has renewed researchers’ interest in SCEDs as a means to assess intervention outcomes (e.g., Borckardt et al., 2008 ; Dattilio, Edwards, & Fishman, 2010 ; Horner et al., 2005 ; Kratochwill, 2007 ; Kratochwill & Levin, 2010 ). Although SCEDs are relatively well represented in the intervention literature, it is by no means their sole home: Examples appear in nearly every subfield of psychology (e.g., Bolger, Davis, & Rafaeli, 2003 ; Piasecki, Hufford, Solham, & Trull, 2007 ; Reis & Gable, 2000 ; Shiffman, Stone, & Hufford, 2008 ; Soliday, Moore, & Lande, 2002 ). Aside from the current preference for group-based research designs, several methodological challenges have repressed the proliferation of the SCED.

Methodological Complexity

SCEDs undeniably present researchers with a complex array of methodological and research design challenges, such as establishing a representative baseline, managing the nonindependence of sequential observations (i.e., autocorrelation, serial dependence), interpreting single-subject effect sizes, analyzing the short data streams seen in many applications, and appropriately addressing the matter of missing observations. In the field of intervention research for example, Hser et al. (2001) noted that studies using SCEDs are “rare” because of the minimum number of observations that are necessary (e.g., 3–5 data points in each phase) and the complexity of available data analysis approaches. Advances in longitudinal person-based trajectory analysis (e.g., Nagin, 1999 ), structural equation modeling techniques (e.g., Lubke & Muthén, 2005 ), time-series forecasting (e.g., autoregressive integrated moving averages; Box & Jenkins, 1970 ), and statistical programs designed specifically for SCEDs (e.g., Simulation Modeling Analysis; Borckardt, 2006 ) have provided researchers with robust means of analysis, but they might not be feasible methods for the average psychological scientist.
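As one concrete illustration of the serial-dependence problem, the lag-1 autocorrelation of a short data stream can be estimated directly. The ten observations below are invented for illustration, and the estimator shown is the ordinary sample autocorrelation rather than any particular published correction.

    import numpy as np

    # Hypothetical short single-case data stream (e.g., ten baseline observations).
    y = np.array([5.0, 6.0, 5.5, 7.0, 6.5, 7.5, 7.0, 8.0, 7.5, 8.5])

    # Lag-1 autocorrelation: dependence between each observation and the next.
    # Conventional parametric tests assume values near zero; single-case data
    # streams frequently violate that assumption.
    y_centered = y - y.mean()
    lag1 = np.sum(y_centered[:-1] * y_centered[1:]) / np.sum(y_centered ** 2)
    print(f"lag-1 autocorrelation = {lag1:.2f}")

Positive autocorrelation of this kind inflates Type I error rates when successive observations are treated as independent, which is one reason the specialized analytic approaches listed above were developed.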

Application of the SCED has also expanded. Today, researchers use variants of the SCED to examine complex psychological processes and the relationship between daily and momentary events in peoples’ lives and their psychological correlates. Research in nearly all subfields of psychology has begun to use daily diary and ecological momentary assessment (EMA) methods in the context of the SCED, opening the door to understanding increasingly complex psychological phenomena (see Bolger et al., 2003 ; Shiffman et al., 2008 ). In contrast to the carefully controlled laboratory experiment that dominated research in the first half of the twentieth century (e.g., Skinner, 1938 ; Watson, 1925 ), contemporary proponents advocate application of the SCED in naturalistic studies to increase the ecological validity of empirical findings (e.g., Bloom, Fisher, & Orme, 2003 ; Borckardt et al., 2008 ; Dattilio et al., 2010 ; Jacobsen & Christensen, 1996 ; Kazdin, 2008 ; Morgan & Morgan, 2001 ; Westen & Bradley, 2005 ; Westen et al., 2004 ). Recent advancements and expanded application of SCEDs indicate a need for updated design and reporting standards.

This Review

Many current benchmarks in the literature concerning key parameters of the SCED were established well before recent advancements and innovations, such as the suggested minimum number of data points in the baseline phase(s), which remains a disputed area of SCED research (e.g., Center, Skiba, & Casey, 1986; Huitema, 1985; R. R. Jones, Vaught, & Weinrott, 1977; Sharpley, 1987). This article comprises (a) an examination of contemporary SCED methodological and reporting standards; (b) a systematic review of select design, measurement, and statistical characteristics of published SCED research during the past decade; and (c) a broad discussion of the critical aspects of this research to inform methodological improvements and study reporting standards. The reader will garner a fundamental understanding of what constitutes appropriate methodological soundness in single-case experimental research according to the established standards in the field, which can be used to guide the design of future studies, improve the presentation of publishable empirical findings, and inform the peer-review process. The discussion begins with the basic characteristics of the SCED, including an introduction to time-series, daily diary, and EMA strategies, and describes how current reporting and design standards apply to each of these areas of single-case research. Interwoven within this presentation are the results of a systematic review of SCED research published between 2000 and 2010 in peer-reviewed outlets and a discussion of the way in which these findings support, or differ from, existing design and reporting standards and published SCED benchmarks.

Review of Current SCED Guidelines and Reporting Standards

In contrast to experimental group comparison studies, which conform to generally well agreed upon methodological design and reporting guidelines, such as the CONSORT ( Moher, Schulz, Altman, & the CONSORT Group, 2001 ) and TREND ( Des Jarlais, Lyles, & Crepaz, 2004 ) statements for randomized and nonrandomized trials, respectively, there is comparatively much less consensus when it comes to the SCED. Until fairly recently, design and reporting guidelines for single-case experiments were almost entirely absent in the literature and were typically determined by the preferences of a research subspecialty or a particular journal’s editorial board. Factions still exist within the larger field of psychology, as can be seen in the collection of standards presented in this article, particularly in regard to data analytic methods of SCEDs, but fortunately there is budding agreement about certain design and measurement characteristics. A number of task forces, professional groups, and independent experts in the field have recently put forth guidelines; each has a relatively distinct purpose, which likely accounts for some of the discrepancies between them. In what is to be a central theme of this article, researchers are ultimately responsible for thoughtfully and synergistically combining research design, measurement, and analysis aspects of a study.

This review presents the more prominent, comprehensive, and recently established SCED standards. Six sources are discussed: (1) Single-Case Design Technical Documentation from the What Works Clearinghouse (WWC; Kratochwill et al., 2010 ); (2) the APA Division 12 Task Force on Psychological Interventions, with contributions from the Division 12 Task Force on Promotion and Dissemination of Psychological Procedures and the APA Task Force for Psychological Intervention Guidelines (DIV12; presented in Chambless & Hollon, 1998 ; Chambless & Ollendick, 2001 ), adopted and expanded by APA Division 53, the Society for Clinical Child and Adolescent Psychology ( Weisz & Hawley, 1998 , 1999 ); (3) the APA Division 16 Task Force on Evidence-Based Interventions in School Psychology (DIV16; Members of the Task Force on Evidence-Based Interventions in School Psychology. Chair: T. R. Kratochwill, 2003); (4) the National Reading Panel (NRP; National Institute of Child Health and Human Development, 2000 ); (5) the Single-Case Experimental Design Scale ( Tate et al., 2008 ); and (6) the reporting guidelines for EMA put forth by Stone & Shiffman (2002) . Although the specific purposes of each source differ somewhat, the overall aim is to provide researchers and reviewers with agreed-upon criteria to be used in the conduct and evaluation of SCED research. The standards provided by WWC, DIV12, DIV16, and the NRP represent the efforts of task forces. The Tate et al. scale was selected for inclusion in this review because it represents perhaps the only psychometrically validated tool for assessing the rigor of SCED methodology. Stone and Shiffman’s (2002) standards were intended specifically for EMA methods, but many of their criteria also apply to time-series, daily diary, and other repeated-measurement and sampling methods, making them pertinent to this article. The design, measurement, and analysis standards are presented in the later sections of this article and notable concurrences, discrepancies, strengths, and deficiencies are summarized.

Systematic Review Search Procedures and Selection Criteria

Search strategy.

A comprehensive search strategy of SCEDs was performed to identify studies published in peer-reviewed journals meeting a priori search and inclusion criteria. First, a computer-based PsycINFO search of articles published between 2000 and 2010 (search conducted in July 2011) was conducted that used the following primary key terms and phrases that appeared anywhere in the article (asterisks denote that any characters/letters can follow the last character of the search term): alternating treatment design, changing criterion design, experimental case*, multiple baseline design, replicated single-case design, simultaneous treatment design, time-series design. The search was limited to studies published in the English language and those appearing in peer-reviewed journals within the specified publication year range. Additional limiters of the type of article were also used in PsycINFO to increase specificity: The search was limited to include methodologies indexed as either quantitative study OR treatment outcome/randomized clinical trial and NOT field study OR interview OR focus group OR literature review OR systematic review OR mathematical model OR qualitative study.
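The boolean structure of this search can be summarized in a short illustrative snippet. The exact PsycINFO query syntax and field codes are not reproduced in the text, so the string assembled below is a schematic reconstruction of the search logic rather than the query actually submitted.

    # Schematic reconstruction of the search logic described above (illustrative only).
    design_terms = [
        "alternating treatment design", "changing criterion design",
        "experimental case*", "multiple baseline design",
        "replicated single-case design", "simultaneous treatment design",
        "time-series design",
    ]
    include_methods = ["quantitative study", "treatment outcome/randomized clinical trial"]
    exclude_methods = ["field study", "interview", "focus group", "literature review",
                       "systematic review", "mathematical model", "qualitative study"]

    query = (
        "(" + " OR ".join(f'"{term}"' for term in design_terms) + ")"
        + " AND (" + " OR ".join(include_methods) + ")"
        + " NOT (" + " OR ".join(exclude_methods) + ")"
    )
    print(query)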

Study selection

The author used a three-phase study selection, screening, and coding procedure to identify as many applicable studies as possible. Phase 1 consisted of the initial systematic search conducted using PsycINFO, which resulted in 571 articles. In Phase 2, titles and abstracts were screened: articles appearing to use a SCED were retained (451) for Phase 3, in which the author and a trained research assistant read each full-text article and entered the characteristics of interest into a database. At each phase of the screening process, studies that did not use a SCED, or that either self-identified as or were determined to be quasi-experimental, were dropped. Of the 571 original studies, 82 were determined to be quasi-experimental. The definition of a quasi-experimental design used in the screening procedure conforms to the descriptions provided by Kazdin (2010) and Shadish et al. (2002) regarding the necessary components of an experimental design. For example, reversal designs require a minimum of four phases (e.g., ABAB), and multiple baseline designs must demonstrate replication of the effect across at least three conditions (e.g., subjects, settings, behaviors). Sixteen studies did not have full text available in English, and five could not be obtained in full text; these were dropped. The remaining 59 articles that were not retained had been identified in the PsycINFO search by the specified keyword and methodology terms but were determined not to be SCED studies meeting the inclusion criteria. In total, 409 studies were selected for this review. The sources of the 409 reviewed studies are summarized in Table 1. A complete bibliography of the 571 studies appearing in the initial search, with the included studies marked, is available online as an Appendix or from the author.
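For readers reconciling these counts, the screening flow reduces to simple arithmetic; the figures are taken from the text above and the variable names are purely illustrative.

    identified          = 571  # Phase 1: initial PsycINFO results
    quasi_experimental  = 82   # dropped as quasi-experimental
    no_english_fulltext = 16   # full text not available in English
    unobtainable        = 5    # full text could not be obtained
    not_sced            = 59   # did not meet SCED inclusion criteria

    included = (identified - quasi_experimental - no_english_fulltext
                - unobtainable - not_sced)
    assert included == 409     # the 409 studies selected for this review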

Journal Sources of Studies Included in the Systematic Review (N = 409)

Note: Each of the following journal titles contributed 1 study unless otherwise noted in parentheses: Augmentative and Alternative Communication; Acta Colombiana de Psicología; Acta Comportamentalia; Adapted Physical Activity Quarterly (2); Addiction Research and Theory; Advances in Speech Language Pathology; American Annals of the Deaf; American Journal of Education; American Journal of Occupational Therapy; American Journal of Speech-Language Pathology; The American Journal on Addictions; American Journal on Mental Retardation; Applied Ergonomics; Applied Psychophysiology and Biofeedback; Australian Journal of Guidance & Counseling; Australian Psychologist; Autism; The Behavior Analyst; The Behavior Analyst Today; Behavior Analysis in Practice (2); Behavior and Social Issues (2); Behaviour Change (2); Behavioural and Cognitive Psychotherapy; Behaviour Research and Therapy (3); Brain and Language (2); Brain Injury (2); Canadian Journal of Occupational Therapy (2); Canadian Journal of School Psychology; Career Development for Exceptional Individuals; Chinese Mental Health Journal; Clinical Linguistics and Phonetics; Clinical Psychology & Psychotherapy; Cognitive and Behavioral Practice; Cognitive Computation; Cognitive Therapy and Research; Communication Disorders Quarterly; Developmental Medicine & Child Neurology (2); Developmental Neurorehabilitation (2); Disability and Rehabilitation: An International, Multidisciplinary Journal (3); Disability and Rehabilitation: Assistive Technology; Down Syndrome: Research & Practice; Drug and Alcohol Dependence (2); Early Childhood Education Journal (2); Early Childhood Services: An Interdisciplinary Journal of Effectiveness; Educational Psychology (2); Education and Training in Autism and Developmental Disabilities; Electronic Journal of Research in Educational Psychology; Environment and Behavior (2); European Eating Disorders Review; European Journal of Sport Science; European Review of Applied Psychology; Exceptional Children; Exceptionality; Experimental and Clinical Psychopharmacology; Family & Community Health: The Journal of Health Promotion & Maintenance; Headache: The Journal of Head and Face Pain; International Journal of Behavioral Consultation and Therapy (2); International Journal of Disability; Development and Education (2); International Journal of Drug Policy; International Journal of Psychology; International Journal of Speech-Language Pathology; International Psychogeriatrics; Japanese Journal of Behavior Analysis (3); Japanese Journal of Special Education; Journal of Applied Research in Intellectual Disabilities (2); Journal of Applied Sport Psychology (3); Journal of Attention Disorders (2); Journal of Behavior Therapy and Experimental Psychiatry; Journal of Child Psychology and Psychiatry; Journal of Clinical Psychology in Medical Settings; Journal of Clinical Sport Psychology; Journal of Cognitive Psychotherapy; Journal of Consulting and Clinical Psychology (2); Journal of Deaf Studies and Deaf Education; Journal of Educational & Psychological Consultation (2); Journal of Evidence-Based Practices for Schools (2); Journal of the Experimental Analysis of Behavior (2); Journal of General Internal Medicine; Journal of Intellectual and Developmental Disabilities; Journal of Intellectual Disability Research (2); Journal of Medical Speech-Language Pathology; Journal of Neurology, Neurosurgery & Psychiatry; Journal of Paediatrics and Child Health; Journal of Prevention and Intervention in the Community; Journal of Safety Research; 
Journal of School Psychology (3); The Journal of Socio-Economics; The Journal of Special Education; Journal of Speech, Language, and Hearing Research (2); Journal of Sport Behavior; Journal of Substance Abuse Treatment; Journal of the International Neuropsychological Society; Journal of Traumatic Stress; The Journals of Gerontology: Series B: Psychological Sciences and Social Sciences; Language, Speech, and Hearing Services in Schools; Learning Disabilities Research & Practice (2); Learning Disability Quarterly (2); Music Therapy Perspectives; Neurorehabilitation and Neural Repair; Neuropsychological Rehabilitation (2); Pain; Physical Education and Sport Pedagogy (2); Preventive Medicine: An International Journal Devoted to Practice and Theory; Psychological Assessment; Psychological Medicine: A Journal of Research in Psychiatry and the Allied Sciences; The Psychological Record; Reading and Writing; Remedial and Special Education (3); Research and Practice for Persons with Severe Disabilities (2); Restorative Neurology and Neuroscience; School Psychology International; Seminars in Speech and Language; Sleep and Hypnosis; School Psychology Quarterly; Social Work in Health Care; The Sport Psychologist (3); Therapeutic Recreation Journal (2); The Volta Review; Work: Journal of Prevention, Assessment & Rehabilitation.

Coding criteria amplifications

A comprehensive description of the coding criteria for each category in this review is available from the author by request. The primary coding criteria are described here and in later sections of this article.

Research design was classified into one of the types discussed later in the section titled Predominant Single-Case Experimental Designs on the basis of the authors’ stated design type. Secondary research designs were then coded when applicable (i.e., mixed designs). Distinctions between primary and secondary research designs were made based on the authors’ description of their study. For example, if an author described the study as a “multiple baseline design with time-series measurement,” the primary research design would be coded as being multiple baseline, and time-series would be coded as the secondary research design.

Observer ratings were coded as present when observational coding procedures were described and/or the results of a test of interobserver agreement were reported.

Interrater reliability for observer ratings was coded as present in any case in which percent agreement, alpha, kappa, or another appropriate statistic was reported, regardless of the amount of the total data that were examined for agreement.
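As an illustration of how such agreement statistics are computed, the sketch below derives percent agreement and Cohen's kappa for two hypothetical observers rating ten intervals. The ratings are invented and the formula is the standard two-category kappa; neither is taken from any reviewed study.

    import numpy as np

    # Hypothetical interval ratings from two observers (1 = behavior occurred).
    obs_a = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 1])
    obs_b = np.array([1, 0, 1, 0, 0, 1, 0, 1, 1, 1])

    # Percent agreement: proportion of intervals on which the observers agree.
    p_o = np.mean(obs_a == obs_b)

    # Cohen's kappa: agreement corrected for the chance agreement expected
    # from each observer's marginal rate of scoring an interval as 1.
    p_e = obs_a.mean() * obs_b.mean() + (1 - obs_a.mean()) * (1 - obs_b.mean())
    kappa = (p_o - p_e) / (1 - p_e)

    print(f"percent agreement = {p_o:.2f}, kappa = {kappa:.2f}")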

Daily diary, daily self-report, and EMA codes were given when authors explicitly described these procedures in the text by name. Coders did not infer the use of these measurement strategies.

The number of baseline observations was either taken directly from values reported in the text or counted from graphical displays of the data when this was judged to be a reliable approach. When the number of baseline data points could not be reliably determined from the graphical display of the data, the “unavailable” code was assigned. The “unavailable” code was also assigned when the number of observations was unreported or ambiguous, or when only a range was provided and thus no mean could be determined. Finally, because a number of studies reported only means, the mean number of baseline observations was calculated for each study prior to further descriptive statistical analyses.

The coding of the analytic method used in the reviewed studies is discussed later in the section titled Discussion of Review Results and Coding of Analytic Methods .

Results of the Systematic Review

Descriptive statistics of the design, measurement, and analysis characteristics of the reviewed studies are presented in Table 2 . The results and their implications are discussed in the relevant sections throughout the remainder of the article.

Table 2. Descriptive Statistics of Reviewed SCED Characteristics

Note. % refers to the proportion of reviewed studies that satisfied criteria for this code: For example, the percent of studies reporting observer ratings.

The categories in the “Research design” subsection are the primary designs identified by the authors.

Categories in the “Mixed designs” subsection are included in the “Research design” subsection. Only the 3 most prevalent mixed designs are reported.

One study of 624 subjects was excluded from the calculation of the number of subjects because it was a significant outlier.

Similarly, one study with 500 subjects and one study with 950 subjects were excluded from the number of subject analyses for the simultaneous condition and time-series designs, respectively. This resulted in only one simultaneous condition study, which is why no standard deviation or range is reported.

Because of reporting inconsistencies in the reviewed articles, the mean number of baseline observations for each study was first calculated and then combined and reported in this table.

In contrast to the results reported in text, the findings here are based on the total number of studies and are not divided into those that reported an analysis and those that did not. Visual and statistical analyses are not applicable to most studies using changing criterion designs. However, some authors reported using visual analysis methods.

Discussion of the Systematic Review Results in Context

The SCED is a very flexible methodology and has many variants. Those mentioned here are the building blocks from which other designs are then derived. For those readers interested in the nuances of each design, Barlow et al., (2008) ; Franklin, Allison, and Gorman (1997) ; Kazdin (2010) ; and Kratochwill and Levin (1992) , among others, provide cogent, in-depth discussions. Identifying the appropriate SCED depends upon many factors, including the specifics of the IV, the setting in which the study will be conducted, participant characteristics, the desired or hypothesized outcomes, and the research question(s). Similarly, the researcher’s selection of measurement and analysis techniques is determined by these factors.

Predominant Single-Case Experimental Designs

Alternating/simultaneous designs (6%; primary design of the studies reviewed).

Alternating and simultaneous designs involve an iterative manipulation of the IV(s) across different phases to show that changes in the DV vary systematically as a function of manipulating the IV(s). In these multielement designs, the researcher has the option to alternate the introduction of two or more IVs or present two or more IVs at the same time. In the alternating variation, the researcher is able to determine the relative impact of two different IVs on the DV, when all other conditions are held constant. Another variation of this design is to alternate IVs across various conditions that could be related to the DV (e.g., class period, interventionist). Similarly, the simultaneous design would occur when the IVs were presented at the same time within the same phase of the study.

Changing criterion design (4%)

Changing criterion designs are used to demonstrate a gradual change in the DV over the course of the phase involving the active manipulation of the IV. The criterion indicating that a change has occurred shifts in a step-wise manner as the participant responds to the presence of the manipulated IV. The changing criterion design is particularly useful in applied intervention research for a number of reasons. The IV is continuous and never withdrawn, unlike the strategy used in a reversal design. This is particularly important in situations where removal of a psychological intervention would be either detrimental or dangerous to the participant, or would be otherwise unfeasible or unethical. The multiple baseline design also does not withdraw intervention, but it requires replicating the effects of the intervention across participants, settings, or situations. A changing criterion design can be accomplished with one participant in one setting without withholding or withdrawing treatment.

Multiple baseline/combined series design (69%)

The multiple baseline or combined series design can be used to test within-subject change across conditions and often involves multiple participants in a replication context. The multiple baseline design is quite simple in many ways, essentially consisting of a number of repeated, miniature AB experiments or variations thereof. Introduction of the IV is staggered temporally across multiple participants or across multiple within-subject conditions, which allows the researcher to demonstrate that changes in the DV reliably occur only when the IV is introduced, thus controlling for the effects of extraneous factors. Multiple baseline designs can be used both within and across units (i.e., persons or groups of persons). When the baseline phase of each subject begins simultaneously, it is called a concurrent multiple baseline design. In a nonconcurrent variation, baseline periods across subjects begin at different points in time. The multiple baseline design is useful in many settings in which withdrawal of the IV would not be appropriate or when introduction of the IV is hypothesized to result in permanent change that would not reverse when the IV is withdrawn. The major drawback of this design is that the IV must be initially withheld for a period of time to ensure different starting points across the different units in the baseline phase. Depending upon the nature of the research questions, withholding an IV, such as a treatment, could be potentially detrimental to participants.

Reversal designs (17%)

Reversal designs are also known as introduction and withdrawal designs and are denoted as ABAB designs in their simplest form. As the name suggests, the reversal design involves collecting a baseline measure of the DV (the first A phase), introducing the IV (the first B phase), removing the IV while continuing to assess the DV (the second A phase), and then reintroducing the IV (the second B phase). This pattern can be repeated as many times as is necessary to demonstrate an effect or otherwise address the research question. Reversal designs are useful when the manipulation is hypothesized to result in changes in the DV that are expected to reverse or discontinue when the manipulation is not present. Maintenance of an effect is often necessary to uphold the findings of reversal designs. The demonstration of an effect is evident in reversal designs when improvement occurs during the first manipulation phase, compared to the first baseline phase, then reverts to or approaches original baseline levels during the second baseline phase when the manipulation has been withdrawn, and then improves again when the manipulation is then reinstated. This pattern of reversal, in which the manipulation is introduced and then withdrawn, is essential to attributing changes in the DV to the IV. However, maintenance of effects and a reversal design, in which the DV is hypothesized to reverse when the IV is withdrawn, are not incompatible (Kazdin, 2010). Maintenance is demonstrated by repeating introduction–withdrawal segments until improvement in the DV becomes permanent even when the IV is withdrawn. There is not always a need to demonstrate maintenance in all applications, nor is it always possible or desirable, but it is paramount in the learning and intervention research contexts.

Mixed designs (10%)

Mixed designs include a combination of more than one SCED (e.g., a reversal design embedded within a multiple baseline) or an SCED embedded within a group design (e.g., a randomized controlled trial comparing two groups of multiple baseline experiments). Mixed designs afford the researcher even greater flexibility in designing a study to address complex psychological hypotheses, while also capitalizing on the strengths of the various designs. See Kazdin (2010) for a discussion of the variations and utility of mixed designs.

Related Nonexperimental Designs

Quasi-experimental designs.

In contrast to the designs previously described, all of which constitute “true experiments” ( Kazdin, 2010 ; Shadish et al., 2002 ), in quasi-experimental designs the conditions of a true experiment (e.g., active manipulation of the IV, replication of the effect) are approximated and are not readily under the control of the researcher. Because the focus of this article is on experimental designs, quasi-experiments are not discussed in detail; instead the reader is referred to Kazdin (2010) and Shadish et al. (2002) .

Ecological and naturalistic single-case designs

For a single-case design to be experimental, there must be active manipulation of the IV, but in some applications, such as those that might be used in social and personality psychology, the researcher might be interested in measuring naturally occurring phenomena and examining their temporal relationships. Thus, the researcher will not use a manipulation. An example of this type of research might be a study about the temporal relationship between alcohol consumption and depressed mood, which can be measured reliably using EMA methods. Psychotherapy process researchers also use this type of design to assess dyadic relationship dynamics between therapists and clients (e.g., Tschacher & Ramseyer, 2009 ).

Research Design Standards

Each of the reviewed standards provides some degree of direction regarding acceptable research designs. The WWC provides the most detailed and specific requirements regarding design characteristics. The guidelines presented in Tables 3, 4, and 5 are consistent with the methodological rigor necessary to meet the WWC distinction “meets standards.” The WWC also provides less-stringent standards for a “meets standards with reservations” distinction. When minimum criteria in the design, measurement, or analysis sections of a study are not met, it is rated “does not meet standards” (Kratochwill et al., 2010). Many SCEDs are acceptable within the standards of DIV12, DIV16, the NRP, and the Tate et al. SCED scale. DIV12 specifies that replication occur across a minimum of three successive cases, which differs from the WWC specifications, which allow for three replications within a single-subject design that need not be across multiple subjects. DIV16 does not require, but seems to prefer, a multiple baseline design with a between-subject replication. Tate et al. (2008) state that the “design allows for the examination of cause and effect relationships to demonstrate efficacy” (p. 400). Determining whether or not a design meets this requirement is left up to the evaluator, who might then refer to one of the other standards or another source for direction.

Table 3. Research Design Standards and Guidelines

Table 4. Measurement and Assessment Standards and Guidelines

Table 5. Analysis Standards and Guidelines

The Stone and Shiffman (2002) standards for EMA are concerned almost entirely with the reporting of measurement characteristics and less so with research design. One way in which these standards differ from those of other sources is in the active manipulation of the IV. Many research questions in EMA, daily diary, and time-series designs are concerned with naturally occurring phenomena, and a researcher manipulation would run counter to this aim. The EMA standards become important when selecting an appropriate measurement strategy within the SCED. In EMA applications, as is also true in some other time-series and daily diary designs, researcher manipulation occurs as a function of the sampling interval in which DVs of interest are measured according to fixed time schedules (e.g., reporting occurs at the end of each day), random time schedules (e.g., the data collection device prompts the participant to respond at random intervals throughout the day), or on an event-based schedule (e.g., reporting occurs after a specified event takes place).
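To make the three prompting schedules concrete, the following sketch (in Python, with entirely hypothetical parameters; it is not part of the Stone and Shiffman standards) generates fixed, random, and event-based report times for a single day.

```python
# Hypothetical illustration of fixed, random, and event-based EMA prompting schedules.
import random
from datetime import datetime, timedelta

day_start = datetime(2011, 6, 1, 8, 0)   # assume a waking day of 08:00-22:00
day_length = timedelta(hours=14)

# Fixed time schedule: a single report at the end of each day
fixed_prompts = [day_start + day_length]

# Random time schedule: five prompts at random times across the waking day
random.seed(1)
random_prompts = sorted(
    day_start + timedelta(seconds=random.uniform(0, day_length.total_seconds()))
    for _ in range(5)
)

# Event-based schedule: entries are initiated by the participant when the target
# event occurs, so report times are simply logged as events happen.
event_prompts = []

print(fixed_prompts)
print(random_prompts)
```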

Measurement

The basic measurement requirement of the SCED is a repeated assessment of the DV across each phase of the design in order to draw valid inferences regarding the effect of the IV on the DV. In other applications, such as those used by personality and social psychology researchers to study various human phenomena ( Bolger et al., 2003 ; Reis & Gable, 2000 ), sampling strategies vary widely depending on the topic area under investigation. Regardless of the research area, SCEDs are most typically concerned with within-person change and processes and involve a time-based strategy, most commonly to assess global daily averages or peak daily levels of the DV. Many sampling strategies, such as time-series, in which reporting occurs at uniform intervals or on event-based, fixed, or variable schedules, are also appropriate measurement methods and are common in psychological research (see Bolger et al., 2003 ).

Repeated-measurement methods permit the natural, even spontaneous, reporting of information ( Reis, 1994 ), which reduces the biases of retrospection by minimizing the amount of time elapsed between an experience and the account of this experience ( Bolger et al., 2003 ). Shiffman et al. (2008) aptly noted that the majority of research in the field of psychology relies heavily on retrospective assessment measures, even though retrospective reports have been found to be susceptible to state-congruent recall (e.g., Bower, 1981 ) and a tendency to report peak levels of the experience instead of giving credence to temporal fluctuations ( Redelmeier & Kahneman, 1996 ; Stone, Broderick, Kaell, Deles-Paul, & Porter, 2000 ). Furthermore, Shiffman et al. (1997) demonstrated that subjective aggregate accounts were a poor fit to daily reported experiences, which can be attributed to reductions in measurement error resulting in increased validity and reliability of the daily reports.

The necessity of measuring at least one DV repeatedly means that the selected assessment method, instrument, and/or construct must be sensitive to change over time and be capable of reliably and validly capturing change. Horner et al. (2005) discuss the important features of outcome measures selected for use in these types of designs. Kazdin (2010) suggests that measures be dimensional, because dimensional measures can more readily detect effects than categorical and binary measures. Although using an established measure or scale, such as the Outcome Questionnaire System (M. J. Lambert, Hansen, & Harmon, 2010), provides empirically validated items for assessing various outcomes, most measure validation studies conducted on this type of instrument involve between-subject designs, which provides no guarantee that these measures are reliable and valid for assessing within-person variability. Borsboom, Mellenbergh, and van Heerden (2003) suggest that researchers adapting validated measures should consider whether the items they propose using have a factor structure within subjects similar to that obtained between subjects. This is one of the reasons that SCEDs often use observational assessments from multiple sources and report the interrater reliability of the measure. Self-report measures are acceptable practice in some circles, but generally additional assessment methods or informants are necessary to uphold the highest methodological standards. The results of this review indicate that the majority of studies include observational measurement (76.0%). Within those studies, nearly all (97.1%) reported interrater reliability procedures and results. The results within each design were similar, with the exception of time-series designs, which used observer ratings in only half of the reviewed studies.
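For example, percent agreement and Cohen's kappa for two observers' binary session codes can be computed as in this sketch (the ratings are hypothetical and purely illustrative).

```python
# Hypothetical two-observer ratings of the same sessions (1 = behavior present, 0 = absent).
import numpy as np

obs_a = np.array([1, 1, 0, 1, 0, 1, 1, 0, 0, 1])
obs_b = np.array([1, 0, 0, 1, 0, 1, 1, 0, 1, 1])

percent_agreement = np.mean(obs_a == obs_b)

# Cohen's kappa: chance-corrected agreement for two raters of a binary code
p_o = percent_agreement
p_e = obs_a.mean() * obs_b.mean() + (1 - obs_a.mean()) * (1 - obs_b.mean())
kappa = (p_o - p_e) / (1 - p_e)

print(f"agreement = {percent_agreement:.2f}, kappa = {kappa:.2f}")
```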

Time-series

Time-series designs are defined by repeated measurement of variables of interest over a period of time (Box & Jenkins, 1970). Time-series measurement most often occurs in uniform intervals; however, this is no longer a constraint of time-series designs (see Harvey, 2001). Although uniform interval reporting is not necessary in SCED research, repeated measures often occur at uniform intervals, such as once each day or each week, which constitutes a time-series design. The time-series design has been used in various basic science applications (Scollon, Kim-Prieto, & Diener, 2003) across nearly all subspecialties in psychology (e.g., Bolger et al., 2003; Piasecki et al., 2007; for a review, see Reis & Gable, 2000; Soliday et al., 2002). The basic time-series model for a two-phase (AB) data stream is presented in Equation 1:

y_i = μ + S·α_i + ε_i.   (1)

In this formula, μ represents the level (intercept) of the baseline phase in a two-phase data stream; α_i represents the step function of the data stream, taking the value 0 at times i = 1, 2, 3, …, n1 and 1 at times i = n1+1, n1+2, n1+3, …, n; S represents the change in level between the first and second phases; n1 is the number of observations in the baseline phase; n is the total number of data points in the data stream; i represents time; and ε_i = ρε_{i−1} + e_i, which expresses the lag-1 autoregressive relationship (ρ) among the errors of the data stream.

Time-series formulas become increasingly complex when seasonality and autoregressive processes are modeled in the analytic procedures, but these are rarely of concern for short time-series data streams in SCEDs. For a detailed description of other time-series design and analysis issues, see Borckardt et al. (2008) , Box and Jenkins (1970) , Crosbie (1993) , R. R. Jones et al. (1977) , and Velicer and Fava (2003) .
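As a purely illustrative sketch of Equation 1 (the data and parameter values are invented, not drawn from any reviewed study), the following Python code simulates a short AB data stream with lag-1 autoregressive errors and estimates the level change S while iteratively estimating ρ.

```python
# Simulating and fitting the two-phase (AB) level-change model of Equation 1
# with lag-1 autoregressive errors (all values are hypothetical).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n1, n = 10, 30               # 10 baseline observations, 30 observations in total
mu, S, rho = 4.0, 2.5, 0.4   # baseline level, level change, lag-1 autocorrelation

e = rng.normal(size=n)
eps = np.zeros(n)
for i in range(1, n):        # build the AR(1) error series
    eps[i] = rho * eps[i - 1] + e[i]

alpha = np.r_[np.zeros(n1), np.ones(n - n1)]   # step function: 0 in phase A, 1 in phase B
y = mu + S * alpha + eps

# GLSAR iteratively estimates the AR(1) error structure while fitting the regression
model = sm.GLSAR(y, sm.add_constant(alpha), rho=1)
results = model.iterative_fit(maxiter=10)
print(results.params)   # estimates of mu (constant) and S (step coefficient)
print(model.rho)        # estimated lag-1 autocorrelation of the errors
```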

Time-series and other repeated-measures methodologies also enable examination of temporal effects. Borckardt et al. (2008) and others have noted that time-series designs have the potential to reveal how change occurs, not simply if it occurs. This distinction is what most interested Skinner (1938), but it often falls outside the purview of today’s researchers in favor of group designs, which Skinner felt obscured the process of change. In intervention and psychopathology research, time-series designs can assess mediators of change (Doss & Atkins, 2006), treatment processes (Stout, 2007; Tschacher & Ramseyer, 2009), and relationships among psychological symptoms (e.g., Alloy, Just, & Panzarella, 1997; Hanson & Chen, 2010; Oslin, Cary, Slaymaker, Colleran, & Blow, 2009), and might be capable of revealing mechanisms of change (Kazdin, 2007, 2009, 2010). Between- and within-subject SCED designs with repeated measurements enable researchers to examine similarities and differences in the course of change, both during and as a result of manipulating an IV. Temporal effects have been largely overlooked in many areas of psychological science (Bolger et al., 2003): Examining temporal relationships is sorely needed to further our understanding of the etiology and amplification of numerous psychological phenomena.

Time-series studies were very infrequently found in this literature search (2%). Time-series studies traditionally occur in subfields of psychology in which single-case research is not often used (e.g., personality, physiological/biological). Recent advances in methods for collecting and analyzing time-series data (e.g., Borckardt et al., 2008 ) could expand the use of time-series methodology in the SCED community. One problem with drawing firm conclusions from this particular review finding is a semantic factor: Time-series is a specific term reserved for measurement occurring at a uniform interval. However, SCED research appears to not yet have adopted this language when referring to data collected in this fashion. When time-series data analytic methods are not used, the matter of measurement interval is of less importance and might not need to be specified or described as a time-series. An interesting extension of this work would be to examine SCED research that used time-series measurement strategies but did not label it as such. This is important because then it could be determined how many SCEDs could be analyzed with time-series statistical methods.

Daily diary and ecological momentary assessment methods

EMA and daily diary approaches represent methodological procedures for collecting repeated measurements in time-series and non-time-series experiments, which are also known as experience sampling. Presenting an in-depth discussion of the nuances of these sampling techniques is well beyond the scope of this paper. The reader is referred to the following review articles: daily diary ( Bolger et al., 2003 ; Reis & Gable, 2000 ; Thiele, Laireiter, & Baumann, 2002 ), and EMA ( Shiffman et al., 2008 ). Experience sampling in psychology has burgeoned in the past two decades as technological advances have permitted more precise and immediate reporting by participants (e.g., Internet-based, two-way pagers, cellular telephones, handheld computers) than do paper and pencil methods (for reviews see Barrett & Barrett, 2001 ; Shiffman & Stone, 1998 ). Both methods have practical limitations and advantages. For example, electronic methods are more costly and may exclude certain subjects from participating in the study, either because they do not have access to the necessary technology or they do not have the familiarity or savvy to successfully complete reporting. Electronic data collection methods enable the researcher to prompt responses at random or predetermined intervals and also accurately assess compliance. Paper and pencil methods have been criticized for their inability to reliably track respondents’ compliance: Palermo, Valenzuela, and Stork (2004) found better compliance with electronic diaries than with paper and pencil. On the other hand, Green, Rafaeli, Bolger, Shrout, & Reis (2006) demonstrated the psychometric data structure equivalence between these two methods, suggesting that the data collected in either method will yield similar statistical results given comparable compliance rates.

Daily diary/daily self-report and EMA measurement were somewhat rarely represented in this review, occurring in only 6.1% of the total studies. EMA methods had been used in only one of the reviewed studies. The recent proliferation of EMA and daily diary studies in psychology reported by others ( Bolger et al., 2003 ; Piasecki et al., 2007 ; Shiffman et al., 2008 ) suggests that these methods have not yet reached SCED researchers, which could in part have resulted from the long-held supremacy of observational measurement in fields that commonly practice single-case research.

Measurement Standards

As was previously mentioned, measurement in SCEDs requires the reliable assessment of change over time. As illustrated in Table 4 , DIV16 and the NRP explicitly require that reliability of all measures be reported. DIV12 provides little direction in the selection of the measurement instrument, except to require that three or more clinically important behaviors with relative independence be assessed. Similarly, the only item concerned with measurement on the Tate et al. scale specifies assessing behaviors consistent with the target of the intervention. The WWC and the Tate et al. scale require at least two independent assessors of the DV and that interrater reliability meeting minimum established thresholds be reported. Furthermore, WWC requires that interrater reliability be assessed on at least 20% of the data in each phase and in each condition. DIV16 expects that assessment of the outcome measures will be multisource and multimethod, when applicable. The interval of measurement is not specified by any of the reviewed sources. The WWC and the Tate et al. scale require that DVs be measured repeatedly across phases (e.g., baseline and treatment), which is a typical requirement of a SCED. The NRP asks that the time points at which DV measurement occurred be reported.

The baseline measurement represents one of the most crucial design elements of the SCED. Because subjects provide their own data for comparison, gathering a representative, stable sampling of behavior before manipulating the IV is essential to accurately inferring an effect. Some researchers have reported the typical length of the baseline period to range from 3 to 12 observations in intervention research applications (e.g., Center et al., 1986; Huitema, 1985; R. R. Jones et al., 1977; Sharpley, 1987); Huitema’s (1985) review of 881 experiments published in the Journal of Applied Behavior Analysis resulted in a modal number of three to four baseline points. Center et al. (1986) suggested five as the minimum number of baseline measurements needed to accurately estimate autocorrelation. Longer baseline periods suggest a greater likelihood of a representative measurement of the DVs, which has been found to increase the validity of the effects and reduce bias resulting from autocorrelation (Huitema & McKean, 1994). The results of this review are largely consistent with those of previous researchers: The mean number of baseline observations was found to be 10.22 (SD = 9.59), and 6 was the modal number of observations. Baseline data were available in 77.8% of the reviewed studies. Although the baseline assessment has tremendous bearing on the results of an SCED study, it was often difficult to locate the exact number of baseline data points, and the number of data points assessed across all phases of a study was similarly difficult to identify.

The WWC, DIV12, and DIV16 agree that a minimum of three data points during the baseline is necessary. However, to receive the highest rating by the WWC, five data points are necessary in each phase, including the baseline and any subsequent withdrawal baselines as would occur in a reversal design. DIV16 explicitly states that more than three points are preferred and further stipulates that the baseline must demonstrate stability (i.e., limited variability), absence of overlap between the baseline and other phases, absence of a trend, and that the level of the baseline measurement is severe enough to warrant intervention; each of these aspects of the data is important in inferential accuracy. Detrending techniques can be used to address baseline data trend. The integration option in ARIMA-based modeling and the empirical mode decomposition method ( Wu, Huang, Long, & Peng, 2007 ) are two sophisticated detrending techniques. In regression-based analytic methods, detrending can be accomplished by simply regressing each variable in the model on time (i.e., the residuals become the detrended series), which is analogous to adding a linear, exponential, or quadratic term to the regression equation.
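The regression-on-time detrending described above amounts to the following minimal sketch (the series is hypothetical).

```python
# Detrending by regressing a series on time: the residuals form the detrended series.
import numpy as np

y = np.array([3.0, 3.4, 4.1, 4.3, 5.0, 5.2, 5.9, 6.3])  # hypothetical baseline with a trend
t = np.arange(len(y))

slope, intercept = np.polyfit(t, y, deg=1)   # linear trend (use deg=2 for a quadratic term)
detrended = y - (intercept + slope * t)      # residuals = detrended series
print(detrended.round(2))
```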

NRP does not provide a minimum for data points, nor does the Tate et al. scale, which requires only a sufficient sampling of baseline behavior. Although the mean and modal number of baseline observations is well within these parameters, seven (1.7%) studies reported mean baselines of less than three data points.

Establishing a uniform minimum number of required baseline observations would provide researchers and reviewers with only a starting guide. The baseline phase is important in SCED research because it establishes a trend that can then be compared with that of subsequent phases. Although a minimum number of observations might be required to meet standards, many more might be necessary to establish a trend when there is variability and trends in the direction of the expected effect. The selected data analytic approach also has some bearing on the number of necessary baseline observations. This is discussed further in the Analysis section.

Reporting of repeated measurements

Stone and Shiffman (2002) provide a comprehensive set of guidelines for the reporting of EMA data, which can also be applied to other repeated-measurement strategies. Because the application of EMA is widespread and not confined to specific research designs, Stone and Shiffman intentionally place few restraints on researchers regarding selection of the DV and the reporter, which is determined by the research question under investigation. The methods of measurement, however, are specified in detail: Descriptions of prompting, recording of responses, participant-initiated entries, and the data acquisition interface (e.g., paper and pencil diary, PDA, cellular telephone) ought to be provided with sufficient detail for replication. Because EMA specifically, and time-series/daily diary methods similarly, are primarily concerned with the interval of assessment, Stone and Shiffman suggest reporting the density and schedule of assessment. The approach is generally determined by the nature of the research question and pragmatic considerations, such as access to electronic data collection devices at certain times of the day and participant burden. Compliance and missing data concerns are present in any longitudinal research design, but they are of particular importance in repeated-measurement applications with frequent measurement. When the research question pertains to temporal effects, compliance becomes paramount, and timely, immediate responding is necessary. For this reason, compliance decisions, rates of missing data, and missing data management techniques must be reported. The effect of missing data in time-series data streams has been the topic of recent research in the social sciences (e.g., Smith, Borckardt, & Nash, in press ; Velicer & Colby, 2005a , 2005b ). The results and implications of these and other missing data studies are discussed in the next section.

Analysis of SCED Data

Visual analysis.

Experts in the field generally agree about the majority of critical single-case experiment design and measurement characteristics. Analysis, on the other hand, is an area of significant disagreement, yet it has also received extensive recent attention and advancement. Debate regarding the appropriateness and accuracy of various methods for analyzing SCED data, the interpretation of single-case effect sizes, and other concerns vital to the validity of SCED results has been ongoing for decades, and no clear consensus has been reached. Visual analysis, following systematic procedures such as those provided by Franklin, Gorman, Beasley, and Allison (1997) and Parsonson and Baer (1978) , remains the standard by which SCED data are most commonly analyzed ( Parker, Cryer, & Byrns, 2006 ). Visual analysis can arguably be applied to all SCEDs. However, a number of baseline data characteristics must be met for effects obtained through visual analysis to be valid and reliable. The baseline phase must be relatively stable; free of significant trend, particularly in the hypothesized direction of the effect; have minimal overlap of data with subsequent phases; and have a sufficient sampling of behavior to be considered representative ( Franklin, Gorman, et al., 1997 ; Parsonson & Baer, 1978 ). The effect of baseline trend on visual analysis, and a technique to control baseline trend, are offered by Parker et al. (2006) . Kazdin (2010) suggests using statistical analysis when a trend or significant variability appears in the baseline phase, two conditions that ought to preclude the use of visual analysis techniques. Visual analysis methods are especially adept at determining intervention effects and can be of particular relevance in real-world applications (e.g., Borckardt et al., 2008 ; Kratochwill, Levin, Horner, & Swoboda, 2011 ).

However, visual analysis has its detractors. It has been shown to be inconsistent, can be affected by autocorrelation, and results in overestimation of effect (e.g., Matyas & Greenwood, 1990 ). Visual analysis as a means of estimating an effect precludes the results of SCED research from being included in meta-analysis, and also makes it very difficult to compare results to the effect sizes generated by other statistical methods. Yet, visual analysis proliferates in large part because SCED researchers are familiar with these methods and are not only generally unfamiliar with statistical approaches, but lack agreement about their appropriateness. Still, top experts in single-case analysis champion the use of statistical methods alongside visual analysis whenever it is appropriate to do so ( Kratochwill et al., 2011 ).
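A minimal, hypothetical sketch of the kind of graphical display that supports visual analysis, with the mean level of each phase overlaid on the data stream, is shown below.

```python
# Plotting an AB data stream with phase mean lines to aid visual analysis (hypothetical data).
import numpy as np
import matplotlib.pyplot as plt

baseline = np.array([4, 5, 4, 6, 5, 4])
treatment = np.array([7, 8, 8, 9, 8, 9, 10])
y = np.concatenate([baseline, treatment])
x = np.arange(1, len(y) + 1)

plt.plot(x, y, marker="o", color="black")
plt.axvline(len(baseline) + 0.5, linestyle="--", color="gray")      # phase change line
plt.hlines(baseline.mean(), 1, len(baseline), linestyles="dotted")  # baseline mean level
plt.hlines(treatment.mean(), len(baseline) + 1, len(y), linestyles="dotted")  # treatment mean level
plt.xlabel("Observation")
plt.ylabel("DV")
plt.show()
```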

Statistical analysis

Statistical analysis of SCED data consists generally of an attempt to address one or more of three broad research questions: (1) Does introduction/manipulation of the IV result in statistically significant change in the level of the DV (level-change or phase-effect analysis)? (2) Does introduction/manipulation of the IV result in statistically significant change in the slope of the DV over time (slope-change analysis)? and (3) Do meaningful relationships exist between the trajectory of the DV and other potential covariates? Level- and slope-change analyses are relevant to intervention effectiveness studies and other research questions in which the IV is expected to result in changes in the DV in a particular direction. Visual analysis methods are most adept at addressing research questions pertaining to changes in level and slope (Questions 1 and 2), most often using some form of graphical representation and standardized computation of a mean level or trend line within and between each phase of interest (e.g., Horner & Spaulding, 2010 ; Kratochwill et al., 2011 ; Matyas & Greenwood, 1990 ). Research questions in other areas of psychological science might address the relationship between DVs or the slopes of DVs (Question 3). A number of sophisticated modeling approaches (e.g., cross-lag, multilevel, panel, growth mixture, latent class analysis) may be used for this type of question, and some are discussed in greater detail later in this section. However, a discussion about the nuances of this type of analysis and all their possible methods is well beyond the scope of this article.

The statistical analysis of SCEDs is a contentious issue in the field. Not only is there no agreed-upon statistical method, but the practice of statistical analysis in the context of the SCED is viewed by some as unnecessary (see Shadish, Rindskopf, & Hedges, 2008 ). Traditional trends in the prevalence of statistical analysis usage by SCED researchers are revealing: Busk & Marascuilo (1992) found that only 10% of the published single-case studies they reviewed used statistical analysis; Brossart, Parker, Olson, & Mahadevan (2006) estimated that this figure had roughly doubled by 2006. A range of concerns regarding single-case effect size calculation and interpretation is discussed in significant detail elsewhere (e.g., Campbell, 2004 ; Cohen, 1994 ; Ferron & Sentovich, 2002 ; Ferron & Ware, 1995 ; Kirk, 1996 ; Manolov & Solanas, 2008 ; Olive & Smith, 2005 ; Parker & Brossart, 2003 ; Robey et al., 1999 ; Smith et al., in press ; Velicer & Fava, 2003 ). One concern is the lack of a clearly superior method across datasets. Although statistical methods for analyzing SCEDs abound, few studies have examined their comparative performance with the same dataset. The most recent studies of this kind, performed by Brossart et al. (2006) , Campbell (2004) , Parker and Brossart (2003) , and Parker and Vannest (2009) , found that the more promising available statistical analysis methods yielded moderately different results on the same data series, which led them to conclude that each available method is equipped to adequately address only a relatively narrow spectrum of data. Given these findings, analysts need to select an appropriate model for the research questions and data structure, being mindful of how modeling results can be influenced by extraneous factors.

The current standards unfortunately provide little guidance in the way of statistical analysis options. This article presents an admittedly cursory introduction to available statistical methods; many others are not covered in this review. The following articles provide more in-depth discussion and description of other methods: Barlow et al. (2008) ; Franklin et al., (1997) ; Kazdin (2010) ; and Kratochwill and Levin (1992 , 2010 ). Shadish et al. (2008) summarize more recently developed methods. Similarly, a Special Issue of Evidence-Based Communication Assessment and Intervention (2008, Volume 2) provides articles and discussion of the more promising statistical methods for SCED analysis. An introduction to autocorrelation and its implications for statistical analysis is necessary before specific analytic methods can be discussed. It is also pertinent at this time to discuss the implications of missing data.

Autocorrelation

Many repeated measurements within a single subject or unit create a situation that most psychological researchers are unaccustomed to dealing with: autocorrelated data, which is the nonindependence of sequential observations, also known as serial dependence. Basic and advanced discussions of autocorrelation in single-subject data can be found in Borckardt et al. (2008) , Huitema (1985) , and Marshall (1980) , and discussions of autocorrelation in multilevel models can be found in Snijders and Bosker (1999) and Diggle and Liang (2001) . Along with trend and seasonal variation, autocorrelation is one example of the internal structure of repeated measurements. In the social sciences, autocorrelated data occur most naturally in the fields of physiological psychology, econometrics, and finance, where each phase of interest has potentially hundreds or even thousands of observations that are tightly packed across time (e.g., electroencephalography actuarial data, financial market indices). Applied SCED research in most areas of psychology is more likely to have measurement intervals of day, week, or hour.

Autocorrelation is a direct result of the repeated-measurement requirements of the SCED, but its effect is most noticeable and problematic when one is attempting to analyze these data. Many commonly used data analytic approaches, such as analysis of variance, assume independence of observations and can produce spurious results when the data are nonindependent. Even statistically insignificant autocorrelation estimates are generally viewed as sufficient to cause inferential bias when conventional statistics are used (e.g., Busk & Marascuilo, 1988 ; R. R. Jones et al., 1977 ; Matyas & Greenwood, 1990 ). The effect of autocorrelation on statistical inference in single-case applications has also been known for quite some time (e.g., R. R. Jones et al., 1977 ; Kanfer, 1970 ; Kazdin, 1981 ; Marshall, 1980 ). The findings of recent simulation studies of single-subject data streams indicate that autocorrelation is a nontrivial matter. For example, Manolov and Solanas (2008) determined that calculated effect sizes were linearly related to the autocorrelation of the data stream, and Smith et al. (in press) demonstrated that autocorrelation estimates in the vicinity of 0.80 negatively affect the ability to correctly infer a significant level-change effect using a standardized mean differences method. Huitema and colleagues (e.g., Huitema, 1985 ; Huitema & McKean, 1994 ) argued that autocorrelation is rarely a concern in applied research. Huitema’s methods and conclusions have been questioned and opposing data have been published (e.g., Allison & Gorman, 1993 ; Matyas & Greenwood, 1990 ; Robey et al., 1999 ), resulting in abandonment of the position that autocorrelation can be conscionably ignored without compromising the validity of the statistical procedures. Procedures for removing autocorrelation in the data stream prior to calculating effect sizes are offered as one option: One of the more promising analysis methods, autoregressive integrated moving averages (discussed later in this article), was specifically designed to remove the internal structure of time-series data, such as autocorrelation, trend, and seasonality ( Box & Jenkins, 1970 ; Tiao & Box, 1981 ).
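For reference, the lag-1 autocorrelation discussed throughout this section can be estimated from a single data stream as in the following sketch (values are hypothetical).

```python
import numpy as np

def lag1_autocorrelation(y):
    """Estimate the lag-1 autocorrelation (serial dependence) of one data stream."""
    d = np.asarray(y, dtype=float) - np.mean(y)
    return np.sum(d[1:] * d[:-1]) / np.sum(d ** 2)

# Example: a short, positively autocorrelated stream (hypothetical values)
print(lag1_autocorrelation([3, 4, 4, 5, 6, 6, 7, 7, 8, 8]))
```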

Missing observations

Another concern inherent in repeated-measures designs is missing data. Daily diary and EMA methods are intended to reduce the risk of retrospection error by eliciting accurate, real-time information ( Bolger et al., 2003 ). However, these methods are subject to missing data as a result of honest forgetfulness, not possessing the diary collection tool at the specified time of collection, and intentional or systematic noncompliance. With paper and pencil diaries and some electronic methods, subjects might be able to complete missed entries retrospectively, defeating the temporal benefits of these assessment strategies ( Bolger et al., 2003 ). Methods of managing noncompliance through the study design and measurement methods include training the subject to use the data collection device appropriately, using technology to prompt responding and track the time of response, and providing incentives to participants for timely compliance (for additional discussion of this topic, see Bolger et al., 2003 ; Shiffman & Stone, 1998 ).

Even when efforts are made to maximize compliance during the conduct of the research, the problem of missing data is often unavoidable. Numerous approaches exist for handling missing observations in group multivariate designs (e.g., Horton & Kleinman, 2007; Ibrahim, Chen, Lipsitz, & Herring, 2005). Ragunathan (2004) and others concluded that full information and raw data maximum likelihood methods are preferable. Velicer and Colby (2005a, 2005b) established the superiority of maximum likelihood methods over listwise deletion, mean of adjacent observations, and series mean substitution in the estimation of various critical time-series data parameters. Smith et al. (in press) extended these findings regarding the effect of missing data on inferential precision. They found that managing missing data with the EM procedure (Dempster, Laird, & Rubin, 1977), a maximum likelihood algorithm, did not affect one’s ability to correctly infer a significant effect. However, lag-1 autocorrelation estimates in the vicinity of 0.80 resulted in insufficient power sensitivity (< 0.80), regardless of the proportion of missing data (10%, 20%, 30%, or 40%). 1 Although maximum likelihood methods have garnered some empirical support, methodological strategies that minimize missing data, particularly systematically missing data, take precedence over post-hoc statistical remedies.
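One concrete illustration of the maximum likelihood approach is a state-space model whose Kalman filter accommodates missing observations directly. The sketch below is hypothetical and is not the EM procedure examined by Smith et al., but it reflects the same principle of estimating parameters without deleting or imputing the missing points.

```python
# Maximum-likelihood estimation for a short AB data stream containing missing
# observations (NaN), handled internally by the Kalman filter (hypothetical data).
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

y = np.array([4.1, 3.8, np.nan, 4.4, 4.0, np.nan, 4.3,        # baseline, with gaps
              6.9, 7.2, np.nan, 6.5, 7.4, 7.0, np.nan, 7.1])  # manipulation phase, with gaps
phase = (np.arange(len(y)) >= 7).astype(float)                # step indicator for the IV

fit = SARIMAX(y, exog=phase, order=(1, 0, 0), trend="c").fit(disp=False)
print(fit.params)   # estimates include the level change and the AR(1) coefficient
```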

Nonnormal distribution of data

In addition to the autocorrelated nature of SCED data, typical measurement methods also present analytic challenges. Many statistical methods, particularly those involving model finding, assume that the data are normally distributed. This is often not satisfied in SCED research when measurements involve count data, observer-rated behaviors, and other, similar metrics that result in skewed distributions. Techniques are available to manage nonnormal distributions in regression-based analysis, such as zero-inflated Poisson regression ( D. Lambert, 1992 ) and negative binomial regression ( Gardner, Mulvey, & Shaw, 1995 ), but many other statistical analysis methods do not include these sophisticated techniques. A skewed data distribution is perhaps one of the reasons Kazdin (2010) suggests not using count, categorical, or ordinal measurement methods.
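For instance, when the DV is a session-by-session behavior count, a Poisson model (or a negative binomial model for overdispersed counts) can replace an ordinary linear model; the counts below are hypothetical.

```python
# Regression for count data: Poisson GLM with a phase indicator (hypothetical counts).
import numpy as np
import statsmodels.api as sm

counts = np.array([6, 5, 7, 6, 8, 2, 1, 3, 2, 1, 2, 0])   # e.g., problem behaviors per session
phase = (np.arange(len(counts)) >= 5).astype(float)        # 0 = baseline, 1 = intervention

poisson_fit = sm.GLM(counts, sm.add_constant(phase), family=sm.families.Poisson()).fit()
print(poisson_fit.params)  # log rate in baseline and log rate-ratio for the intervention phase
# For overdispersed counts, family=sm.families.NegativeBinomial() is one alternative.
```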

Available statistical analysis methods

Following is a basic introduction to the more promising and prevalent analytic methods for SCED research. Because there is little consensus regarding the superiority of any single method, the burden unfortunately falls on the researcher to select a method capable of addressing the research question and handling the data involved in the study. Some indications and contraindications are provided for each method presented here.

Multilevel and structural equation modeling

Multilevel modeling (MLM; e.g., Schmidt, Perels, & Schmitz, 2010) techniques represent the state of the art among parametric approaches to SCED analysis, particularly when synthesizing SCED results (Shadish et al., 2008). MLM and related latent growth curve and factor mixture methods in structural equation modeling (SEM; e.g., Lubke & Muthén, 2005; B. O. Muthén & Curran, 1997) are particularly effective for evaluating trajectories and slopes in longitudinal data and relating changes to potential covariates. MLM and related hierarchical linear models (HLM) can also illuminate the relationship between the trajectories of different variables under investigation and clarify whether or not these relationships differ amongst the subjects in the study. Time-series and cross-lag analyses can also be used in MLM and SEM (Chow, Ho, Hamaker, & Dolan, 2010; du Toit & Browne, 2007). However, they generally require sophisticated model-fitting techniques, making them difficult for many social scientists to implement. The structure (autocorrelation) and trend of the data can also complicate many MLM methods. The common, short data streams in SCED research and the small number of subjects also present problems for MLM and SEM approaches, which were developed for data with far more observations per subject when the number of subjects is small, and for far more subjects when the number of observations per subject is small. Still, MLM and related techniques arguably represent the most promising analytic methods.

A number of software options 2 exist for SEM. Popular statistical packages in the social sciences provide SEM options, such as PROC CALIS in SAS (SAS Institute Inc., 2008), the AMOS module (Arbuckle, 2006) of SPSS (SPSS Statistics, 2011), and the sem package for R (R Development Core Team, 2005), the use of which is described by Fox (2006). A number of stand-alone software options are also available for SEM applications, including Mplus (L. K. Muthén & Muthén, 2010) and Stata (StataCorp., 2011). Each of these programs also provides options for estimating multilevel/hierarchical models (for a review of using these programs for MLM analysis, see Albright & Marinova, 2010). Hierarchical linear and nonlinear modeling can also be accomplished using the HLM 7 program (Raudenbush, Bryk, & Congdon, 2011).
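Although none of the listed packages is illustrated in the reviewed studies themselves, the following hypothetical sketch shows the general shape of such a model: repeated observations nested within subjects, a staggered phase indicator as the within-subject predictor, and a random intercept per subject.

```python
# Multilevel model: repeated observations nested within subjects, with a phase
# indicator as the within-subject predictor (all data are simulated and hypothetical).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for subj in range(1, 6):                  # 5 subjects
    for t in range(20):                   # 20 observations each
        phase = int(t >= 8 + subj)        # staggered introduction of the IV
        y = 4 + 0.3 * subj + 2.0 * phase + rng.normal()
        rows.append({"subject": subj, "time": t, "phase": phase, "y": y})
data = pd.DataFrame(rows)

mlm = smf.mixedlm("y ~ time + phase", data, groups=data["subject"]).fit()
print(mlm.summary())
```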

Autoregressive moving averages (ARMA; e.g., Browne & Nesselroade, 2005 ; Liu & Hudack, 1995 ; Tiao & Box, 1981 )

Two primary points have been raised regarding ARMA modeling: length of the data stream and feasibility of the modeling technique. ARMA models generally require 30–50 observations in each phase when analyzing a single-subject experiment (e.g., Borckardt et al., 2008; Box & Jenkins, 1970), which is often difficult to satisfy in applied psychological research. However, ARMA models in an SEM framework, such as those described by du Toit and Browne (2001), are well suited for longitudinal panel data with few observations and many subjects. Autoregressive SEM models are also applicable under similar conditions. Model-fitting options are available in SPSS and R, and in SAS via PROC ARIMA.

ARMA modeling also requires considerable training in the method and rather advanced knowledge about statistical methods (e.g., Kratochwill & Levin, 1992 ). However, Brossart et al. (2006) point out that ARMA-based approaches can produce excellent results when there is no “model finding” and a simple lag-1 model, with no differencing and no moving average, is used. This approach can be taken for many SCED applications when phase- or slope-change analyses are of interest with a single, or very few, subjects. As already mentioned, this method is particularly useful when one is seeking to account for autocorrelation or other over-time variations that are not directly related to the experimental or intervention effect of interest (i.e., detrending). ARMA and other time-series analysis methods require missing data to be managed prior to analysis by means of options such as full information maximum likelihood estimation, multiple imputation, or the Kalman filter (see Box & Jenkins, 1970 ; Hamilton, 1994 ; Shumway & Stoffer, 1982 ) because listwise deletion has been shown to result in inaccurate time-series parameter estimates ( Velicer & Colby, 2005a ).
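Echoing the simple lag-1, no-differencing, no-moving-average approach noted above, a phase effect can be estimated roughly as follows (hypothetical data; this parallels the earlier state-space sketch but without missing observations).

```python
# Lag-1 autoregressive model (no differencing, no moving average) with a phase regressor.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

y = np.array([5.1, 4.8, 5.4, 5.0, 4.9, 5.2,             # baseline
              7.0, 7.4, 6.8, 7.1, 7.5, 7.2, 7.3])       # post-manipulation
phase = (np.arange(len(y)) >= 6).astype(float)

fit = ARIMA(y, exog=phase, order=(1, 0, 0), trend="c").fit()
print(fit.summary())   # includes the estimated phase (level-change) effect and AR(1) term
```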

Standardized mean differences

Standardized mean differences approaches include the common Cohen’s d , Glass’s Delta, and Hedge’s g that are used in the analysis of group designs. The computational properties of mean differences approaches to SCEDs are identical to those used for group comparisons, except that the results represent within-case variation instead of the variation between groups, which suggests that the obtained effect sizes are not interpretively equivalent. The advantage of the mean differences approach is its simplicity of calculation and also its familiarity to social scientists. The primary drawback of these approaches is that they were not developed to contend with autocorrelated data. However, Manolov and Solanas (2008) reported that autocorrelation least affected effect sizes calculated using standardized mean differences approaches. To the applied-research scientist this likely represents the most accessible analytic approach, because statistical software is not required to calculate these effect sizes. The resultant effect sizes of single subject standardized mean differences analysis must be interpreted cautiously because their relation to standard effect size benchmarks, such as those provided by Cohen (1988) , is unknown. Standardized mean differences approaches are appropriate only when examining significant differences between phases of the study and cannot illuminate trajectories or relationships between variables.
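Computationally, a within-case standardized mean difference between two phases reduces to the following (hypothetical observations).

```python
# Within-case standardized mean difference between baseline (A) and treatment (B) phases.
import numpy as np

a = np.array([4.0, 5.0, 4.0, 6.0, 5.0, 4.0])   # baseline observations
b = np.array([7.0, 8.0, 8.0, 9.0, 8.0, 9.0])   # treatment observations

# Pooled-standard-deviation variant (analogous in form to Cohen's d, but describing
# within-case variation rather than between-group variation)
s_pooled = np.sqrt(((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1)) /
                   (len(a) + len(b) - 2))
d = (b.mean() - a.mean()) / s_pooled
print(round(d, 2))
```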

Other analytic approaches

Researchers have offered other analytic methods to deal with the characteristics of SCED data. A number of methods for analyzing N-of-1 experiments have been developed. Borckardt’s Simulation Modeling Analysis (SMA; 2006) program provides a method for analyzing level- and slope-change in short (<30 observations per phase; see Borckardt et al., 2008), autocorrelated data streams that is statistically sophisticated, yet accessible and freely available to typical psychological scientists and clinicians. A replicated single-case time-series design conducted by Smith, Handler, & Nash (2010) provides an example of SMA application. The Singwin package, described in Bloom et al. (2003), is another easy-to-use parametric approach for analyzing single-case experiments. A number of nonparametric approaches have also been developed that emerged from the visual analysis tradition: Some examples include percent nonoverlapping data (Scruggs, Mastropieri, & Casto, 1987) and nonoverlap of all pairs (Parker & Vannest, 2009); however, these methods have come under scrutiny, and Wolery, Busick, Reichow, and Barton (2010) have suggested abandoning them altogether. Each of these methods appears to be well suited for managing specific data characteristics, but they should not be used to analyze data streams beyond their intended purpose until additional empirical research is conducted.
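As one example, nonoverlap of all pairs (NAP; Parker & Vannest, 2009) is the proportion of all baseline–treatment pairs in which the treatment-phase observation exceeds the baseline observation, with ties counted as one half; a minimal sketch follows (data are hypothetical).

```python
# Nonoverlap of all pairs (NAP): proportion of baseline-treatment pairs in which the
# treatment-phase value exceeds the baseline value, counting ties as one half.
import numpy as np

def nap(baseline, treatment):
    b = np.asarray(baseline, dtype=float)
    t = np.asarray(treatment, dtype=float)
    pairs = t[:, None] - b[None, :]                  # all pairwise differences
    return (np.sum(pairs > 0) + 0.5 * np.sum(pairs == 0)) / pairs.size

print(nap([4, 5, 4, 6, 5], [7, 8, 6, 9, 8, 9]))      # hypothetical AB data
```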

Combining SCED Results

Beyond the issue of single-case analysis is the matter of integrating and meta-analyzing the results of single-case experiments. SCEDs have been given short shrift in the majority of meta-analytic literature ( Littell, Corcoran, & Pillai, 2008 ; Shadish et al., 2008 ), with only a few exceptions ( Carr et al., 1999 ; Horner & Spaulding, 2010 ). Currently, few proven methods exist for integrating the results of multiple single-case experiments. Allison and Gorman (1993) and Shadish et al. (2008) present the problems associated with meta-analyzing single-case effect sizes, and W. P. Jones (2003) , Manolov and Solanas (2008) , Scruggs and Mastropieri (1998) , and Shadish et al. (2008) offer four different potential statistical solutions for this problem, none of which appear to have received consensus amongst researchers. The ability to synthesize and compare single-case effect sizes, particularly effect sizes garnered through group design research, is undoubtedly necessary to increase SCED proliferation.

Discussion of Review Results and Coding of Analytic Methods

The coding criteria for this review were quite stringent in terms of what was considered to be either visual or statistical analysis. For visual analysis to be coded as present, it was necessary for the authors to self-identify as having used a visual analysis method. In many cases, it could likely be inferred that visual analysis had been used, but it was often not specified. Similarly, statistical analysis was reserved for analytic methods that produced an effect size. 3 Analyses that involved comparing magnitude of change using raw count data or percentages were not considered rigorous enough. These two narrow definitions of visual and statistical analysis contributed to the high rate of unreported analytic method, shown in Table 1 (52.3%). A better representation of the use of visual and statistical analysis would likely be the percentage of studies within those that reported a method of analysis. Under these parameters, 41.5% used visual analysis and 31.3% used statistical analysis. Included in these figures are studies that included both visual and statistical methods (11%). These figures are slightly higher than the estimate of Brossart et al. (2006), who estimated that statistical analysis is used in about 20% of SCED studies. Visual analysis undoubtedly continues to be the most prevalent method, but there appears to be a trend toward increased use of statistical approaches, which is likely only to gain momentum as innovations continue.

Analysis Standards

The standards selected for inclusion in this review offer minimal direction in the way of analyzing the results of SCED research. Table 5 summarizes analysis-related information provided by the six reviewed sources for SCED standards. Visual analysis is acceptable to DIV12 and DIV16, along with unspecified statistical approaches. In the WWC standards, visual analysis is the acceptable method of determining an intervention effect, with statistical analyses and randomization tests permissible as a complementary or supporting method to the results of visual analysis methods. However, the authors of the WWC standards state, “As the field reaches greater consensus about appropriate statistical analyses and quantitative effect-size measures, new standards for effect demonstration will need to be developed” (Kratochwill et al., 2010, p. 16). The NRP and DIV12 seem to prefer statistical methods when they are warranted. The Tate et al. scale accepts only statistical analysis with the reporting of an effect size. Only the WWC and DIV16 provide guidance in the use of statistical analysis procedures: The WWC “recommends” nonparametric and parametric approaches, multilevel modeling, and regression when statistical analysis is used. DIV16 refers the reader to Wilkinson and the Task Force on Statistical Inference of the APA Board of Scientific Affairs (1999) for direction in this matter. Statistical analysis of daily diary and EMA methods is similarly unsettled. Stone and Shiffman (2002) ask for a detailed description of the statistical procedures used, in order for the approach to be replicated and evaluated. They provide direction for analyzing aggregated and disaggregated data. They also aptly note that because many different modes of analysis exist, researchers must carefully match the analytic approach to the hypotheses being pursued.

Limitations and Future Directions

This review has a number of limitations that leave the door open for future study of SCED methodology. Publication bias is a concern in any systematic review. This is particularly true for this review because the search was limited to articles published in peer-reviewed journals. This strategy was chosen in order to inform changes in the practice of reporting and of reviewing, but it also is likely to have inflated the findings regarding the methodological rigor of the reviewed works. Inclusion of book chapters, unpublished studies, and dissertations would likely have yielded somewhat different results.

A second concern is the stringent coding criteria in regard to the analytic methods and the broad categorization into visual and statistical analytic approaches. The selection of an appropriate method for analyzing SCED data is perhaps the murkiest area of this type of research. Future reviews that evaluate the appropriateness of selected analytic strategies and provide specific decision-making guidelines for researchers would be a very useful contribution to the literature. Although six sources of standards applicable to SCED research were reviewed in this article, five of them were developed almost exclusively to inform psychological and behavioral intervention research. The principles of SCED research remain the same in different contexts, but there is a need for non–intervention scientists to weigh in on these standards.

Finally, this article is a first step toward synthesizing the available SCED reporting guidelines; it does not resolve disagreements, nor does it purport to be a definitive source. In the future, an entity with the authority to construct such a document ought to convene and establish a foundational, adaptable, and agreed-upon set of guidelines that cuts across subspecialties and is applicable to many, if not all, areas of psychological research, although this is perhaps an idealistic goal. Preferences will undoubtedly continue to dictate what constitutes acceptable practice in each subspecialty of psychology, but uniformity along critical dimensions will help advance SCED research.

Conclusions

The first decade of the twenty-first century saw an upwelling of SCED research across nearly all areas of psychology. This article contributes updated benchmarks for the frequency with which SCED design and methodology characteristics are used, including the number of baseline observations, assessment and measurement practices, and data analytic approaches; most of these benchmarks are largely consistent with those previously reported. This review is also much broader than those of previous research teams and breaks down the characteristics of single-case research by predominant design. With the recent proliferation of SCEDs came a number of standards for the conduct and reporting of such research, and this article provides a much-needed synthesis of those standards, one that can inform the work of researchers, reviewers, and funding agencies conducting and evaluating single-case research and that reveals many areas of consensus alongside areas of significant disagreement. The question of where to go next is therefore highly relevant. Most design and measurement characteristics of the SCED are reasonably well established, and the results of this review suggest that general practice accords with existing standards and guidelines, at least in published peer-reviewed work. In general, the published literature appears to be meeting the basic design and measurement requirements needed to ensure adequate internal validity of SCED studies.

The lack of consensus regarding the superiority of any one analytic method stands out as an area of divergence. Given the current literature, researchers will need to select a method that matches the research design, hypotheses, and intended conclusions of the study, while also weighing the most up-to-date empirical support for the chosen analytic method, whether visual or statistical. In some cases, the number of observations and subjects will dictate which analytic methods can and cannot be used. For the true N-of-1 experiment there are relatively few sound analytic methods, and even fewer that are robust with short data streams (see Borckardt et al., 2008). As the number of observations and subjects increases, more sophisticated modeling techniques, such as MLM, SEM, and ARMA, become applicable. Trend in the data and autocorrelation further complicate the development of a clear algorithm for selecting a statistical analysis, and no such algorithm currently exists. Autocorrelation was rarely addressed or discussed in the articles reviewed, except when the selected statistical analysis demanded its consideration. Given the empirical evidence on the effects of autocorrelation on both visual and statistical analysis, researchers need to address it more explicitly. Missing-data considerations are similarly omitted when they are not required for analytic purposes. As newly devised statistical approaches mature and are compared for appropriateness in specific SCED applications, guidelines for statistical analysis will need to be revised. Similarly, empirically derived guidance, ideally in the form of a decision tree, should be developed to ensure that appropriate methods are applied given the characteristics of the data and the research questions being addressed. Researchers would also benefit from tutorials and comparative reviews of the available software packages; this is a needed area of future work. Powerful and reliable statistical analyses help move the SCED up the ladder of experimental designs and counter the view that the method is suited primarily to pilot studies and idiosyncratic research questions and situations.
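As one concrete illustration of the kinds of checks discussed above, the sketch below (Python, with made-up data) computes a lag-1 autocorrelation estimate and one simple nonoverlap-based effect size, nonoverlap of all pairs (NAP), for a short A-B data stream. It is a minimal sketch of two common ingredients of SCED analysis, not a substitute for the formal methods cited in this review, and the data values are invented for the example.

```python
import numpy as np

# Made-up A-B data stream for illustration: 6 baseline and 8 intervention observations.
baseline = np.array([3, 4, 3, 5, 4, 4], dtype=float)
intervention = np.array([6, 5, 7, 8, 7, 9, 8, 9], dtype=float)


def lag1_autocorrelation(series):
    """Estimate the lag-1 autocorrelation of a single data stream."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()
    return float(np.sum(x[:-1] * x[1:]) / np.sum(x * x))


def nonoverlap_of_all_pairs(a_phase, b_phase):
    """Nonoverlap of all pairs (NAP): the proportion of (A, B) pairs in which
    the B-phase value exceeds the A-phase value, counting ties as half."""
    pairs = len(a_phase) * len(b_phase)
    greater = sum(b > a for a in a_phase for b in b_phase)
    ties = sum(b == a for a in a_phase for b in b_phase)
    return (greater + 0.5 * ties) / pairs


full_stream = np.concatenate([baseline, intervention])
print("Lag-1 autocorrelation (full stream):", round(lag1_autocorrelation(full_stream), 2))
print("NAP effect size:", round(nonoverlap_of_all_pairs(baseline, intervention), 2))
```

With so few observations, both quantities are imprecise estimates, which is exactly why short N-of-1 data streams constrain the analytic options available.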

Another potential advance for SCED research lies in measurement. Currently, SCED research gives significant weight to observer ratings and seems to discourage other data collection methods, likely because of the SCED's origins in behavioral assessment and applied behavior analysis, which remain present-day strongholds. The dearth of EMA and diary-like sampling procedures in the SCED research reviewed, despite their growing prevalence in the broader psychological research arena, highlights an area for potential expansion. Observational measurement, although reliable and valid in many contexts, is time and resource intensive and is not feasible in all areas in which psychologists conduct research; numerous research questions appear to be stifled by this measurement constraint. SCED researchers developing updated standards should include guidelines for the appropriate measurement of non-observer-reported data. For example, the results of this review indicate that the reporting of repeated measurements, particularly the high-density streams produced by diary and EMA sampling strategies, ought to be spelled out more clearly, with specific attention to autocorrelation and trend in the data. If SCED researchers adopt self-reported assessment strategies as viable alternatives to observation, a set of standards explicitly identifying the necessary psychometric properties of the measures and the specific items used will be in order.
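A minimal reporting sketch along these lines is shown below (Python, with simulated diary data; the variable names and the 28-day schedule are illustrative assumptions, not part of any published standard). It summarizes sampling density, linear trend, and residual autocorrelation for a self-reported data stream, the kind of descriptive information that could accompany diary or EMA measurements before any effect estimation is attempted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated 28-day daily-diary stream: a mild upward trend plus noise.
days = np.arange(28)
ratings = 4.0 + 0.05 * days + rng.normal(0, 0.8, size=days.size)

# Sampling density: entries actually provided relative to those scheduled.
scheduled = 28
completed = ratings.size
density = completed / scheduled

# Linear trend: slope of ratings regressed on time.
slope, intercept = np.polyfit(days, ratings, deg=1)

# Lag-1 autocorrelation of the detrended residuals.
residuals = ratings - (slope * days + intercept)
res = residuals - residuals.mean()
lag1 = float(np.sum(res[:-1] * res[1:]) / np.sum(res * res))

print(f"Completed {completed}/{scheduled} scheduled entries (density = {density:.2f})")
print(f"Linear trend: {slope:.3f} rating points per day")
print(f"Lag-1 autocorrelation of detrended residuals: {lag1:.2f}")
```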

Along similar lines, SCED researchers could take a page from other areas of psychology that champion multimethod, multisource evaluation of primary outcomes. In this way, the long-standing tradition of observational assessment could be married to the technological methods of EMA and the daily diary, with the goal of strengthening the conclusions drawn from SCED research and enhancing the validity of self-reported outcome assessment. The results of this review indicate that the two rarely intersect today, and I urge SCED researchers to adopt additional assessment methods informed by time-series, daily diary, and EMA approaches. The EMA standards could serve as a jumping-off point for refined measurement and assessment reporting standards in the context of multimethod SCED research.

One limitation of the current SCED standards is their relatively narrow scope: with the exception of the Stone and Shiffman EMA reporting guidelines, the five other sources of standards were developed in the context of designing and evaluating intervention research. Although intervention research is likely to remain the SCED's primary emphasis, SCEDs are capable of addressing other pertinent questions in the psychological sciences, and the current standards only roughly approximate the salient crosscutting characteristics of the design. I propose developing broad SCED guidelines that address specific design, measurement, and analysis issues in a manner useful across applications, rather than focusing solely on intervention effects. Accomplishing this task would require methodology experts from across the subspecialties of psychology to convene; admittedly, this is no small task.

Perhaps funding agencies will also recognize the fiscal and practical advantages of SCED research in certain areas of psychology, such as intervention effectiveness, efficacy, and implementation research. A few exemplary studies using robust forms of SCED methodology are needed in the literature. Case-based methodologies will never supplant the group design as the gold standard in experimental applications, nor should that be the goal; instead, SCEDs provide a viable and valid alternative experimental methodology that could stimulate new areas of research and answer questions that group designs cannot. With the astonishing number of studies emerging every year that use single-case designs or explore methodological aspects of the design, we are poised to witness, and be part of, an upsurge in the sophisticated application of the SCED. The field will benefit when federal grant-awarding agencies and journal editors begin to use formal standards in making funding and publication decisions.

Last, for the practice of SCED research to continue and mature, graduate training programs must provide students with instruction in all areas of the SCED. This is particularly true of statistical analysis techniques, which are rarely taught in the departments of psychology and education where the vast majority of SCED studies appear to be conducted. It is a conundrum that the best available statistical methods are often described as inaccessible to the social science researchers who conduct this type of research; this need not be the case. To move the field forward, emerging scientists must be able to apply the most up-to-date research designs, measurement techniques, and analytic methods.

Acknowledgments

Research support for the author was provided by research training grant MH20012 from the National Institute of Mental Health, awarded to Elizabeth A. Stormshak. The author gratefully acknowledges Robert Horner and Laura Lee McIntyre, University of Oregon; Michael Nash, University of Tennessee; John Ferron, University of South Florida; the Action Editor, Lisa Harlow, and the anonymous reviewers for their thoughtful suggestions and guidance in shaping this article; Cheryl Mikkola for her editorial support; and Victoria Mollison for her assistance in the systematic review process.

Appendix. Results of Systematic Review Search and Studies Included in the Review

PsycINFO search conducted July 2011.

Primary key terms and phrases appearing ANYWHERE in the article (asterisks denote that any characters/letters can follow the last character of the search term):

Alternating treatment design

Changing criterion design

Experimental case*

Multiple baseline design

Replicated single-case design

Simultaneous treatment design

Time-series design

Methodological limiters:

Quantitative study OR treatment outcome/randomized clinical trial

NOT field study OR interview OR focus group OR literature review OR systematic review OR mathematical model OR qualitative study

Other limiters:

Publication range: 2000–2010

Published in peer-reviewed journals

Available in the English Language
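Rendered as a single Boolean query, the strategy above might look roughly like the sketch below (Python is used only to assemble the string). The exact field codes and operator syntax of the original PsycINFO search are not reported here, so this is an illustrative reconstruction of the search logic rather than the query that was actually run.

```python
# Illustrative reconstruction of the search logic only; the exact PsycINFO
# field codes and limiter syntax used in the original July 2011 search are
# not reported, so treat this as a schematic rather than the executed query.
key_terms = [
    '"alternating treatment design"',
    '"changing criterion design"',
    '"experimental case*"',
    '"multiple baseline design"',
    '"replicated single-case design"',
    '"simultaneous treatment design"',
    '"time-series design"',
]
included_methods = ['"quantitative study"', '"treatment outcome/randomized clinical trial"']
excluded_methods = ['"field study"', '"interview"', '"focus group"', '"literature review"',
                    '"systematic review"', '"mathematical model"', '"qualitative study"']

query = (
    "(" + " OR ".join(key_terms) + ")"
    + " AND (" + " OR ".join(included_methods) + ")"
    + " NOT (" + " OR ".join(excluded_methods) + ")"
)
print(query)
```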

Bibliography

(* indicates inclusion in study: N = 409)

  • *. Aburrous M, Hossain MA, Dahal K, Thabtah F. Experimental case studies for investigating e-banking phishing techniques and attack strategies. Cognitive Computation. 2010;2(3):242–253. doi: 10.1007/s12559-010-9042-7. [ DOI ] [ Google Scholar ]
  • Aguilar R, Caramés JM, Espinet A. Effects of neonatal handling on playfulness by means of reversal of the desire to play in rats (Rattus norvegicus) Journal of Comparative Psychology. 2009;123(4):347–356. doi: 10.1037/a0016437. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Ahearn WH. Using simultaneous presentation to increase vegetables consumption in a mildly selective child with autism. Journal of Applied Behavior Analysis. 2003;36(3):361–365. doi: 10.1901/jaba.2003.36-361. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Allday RA, Pakurar K. Effects of teacher greetings on student on-task behavior. Journal of Applied Behavior Analysis. 2007;40(2):317–320. doi: 10.1901/jaba.2007.86-06. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Allen-DeBoer RA, Malmgren KW, Glass M-E. Reading instruction for youth with emotional and behavioral disorders in a juvenile correctional facility. Behavioral Disorders. 2006;32(1):18–28. [ Google Scholar ]
  • Almeida FA, Smith-Ray RL, Van Den Berg R, Schriener P, Gonzales M, Onda P, Estabrooks PA. Utilizing a simple stimulus control strategy to increase physician referrals for physical activity promotion. Journal of Sport & Exercise Psychology. 2005;27(4):505–514. [ Google Scholar ]
  • Almer ED, Gramling AA, Kaplan SE. Impact of post-restatement actions taken by a firm on non-professional investors’ credibility perceptions. Journal of Business Ethics. 2008;80(1):61–76. doi: 10.1007/s10551-007-9442-0. [ DOI ] [ Google Scholar ]
  • *. Alvero AM, Rost K, Austin J. The safety observer effect: The effects of conducting safety observations. Journal of Safety Research. 2008;39(4):365–373. doi: 10.1016/j.jsr.2008.05.004. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Amato Zech NA, Hoff KE, Doepke KJ. Increasing on-task behavior in the classroom: Extension of self-monitoring strategies. Psychology in the Schools. 2006;43(2):211–221. doi: 10.1002/pits.20137. [ DOI ] [ Google Scholar ]
  • *. Andrews-Salvia M, Roy N, Cameron RM. Evaluating the effects of memory books for individuals with severe dementia. Journal of Medical Speech-Language Pathology. 2003;11(1):51–59. [ Google Scholar ]
  • *. Angermeier K, Schlosser RW, Luiselli JK, Harrington C, Carter B. Effects of iconicity on requesting with the Picture Exchange Communication System in children with autism spectrum disorder. Research in Autism Spectrum Disorders. 2008;2(3):430–446. doi: 10.1016/j.rasd.2007.09.004. [ DOI ] [ Google Scholar ]
  • *. Anglesea MM, Hoch H, Taylor BA. Reducing rapid eating in teenagers with autism: Use of a pager prompt. Journal of Applied Behavior Analysis. 2008;41(1):107–111. doi: 10.1901/jaba.2008.41-107. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Antonelli G, Arrichiello F, Chiaverini S. The null-space-based behavioral control for autonomous robotic systems. Intelligent Service Robotics. 2008;1(1):27–39. doi: 10.1007/s11370-007-0002-3. [ DOI ] [ Google Scholar ]
  • Apple AL, Billingsley F, Schwartz IS. Effects of video modeling alone and with self-management on compliment-giving behaviors of children with high-functioning ASD. Journal of Positive Behavior Interventions. 2005;7(1):33–46. doi: 10.1177/10983007050070010401. [ DOI ] [ Google Scholar ]
  • *. Arbona CB, Osma J, Garcia-Palacios A, Quero S, Baños RM. Treatment of flying phobia using virtual reality: Data from a 1-year follow-up using a multiple baseline design. Clinical Psychology & Psychotherapy. 2004;11(5):311–323. doi: 10.1002/cpp.404. [ DOI ] [ Google Scholar ]
  • *. Arco L, du Toit E. Effects of adding on-the-job feedback to conventional analog staff training in a nursing home. Behavior Modification. 2006;30(5):713–735. doi: 10.1177/0145445505281058. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Ardoin SP, McCall M, Klubnik C. Promoting generalization of oral reading fluency: Providing drill versus practice opportunities. Journal of Behavioral Education. 2007;16(1):55–70. [ Google Scholar ]
  • *. Arndorfer RE, Allen KD. Extending the efficacy of a thermal biofeedback treatment package to the management of tension-type headaches in children. Headache: The Journal of Head and Face Pain. 2001;41(2):183–192. doi: 10.1046/j.1526-4610.2001.111006183.x. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Auslander GK, Buchs A. Evaluating an activity intervention with hemodialysis patients in Israel. Social Work in Health Care. 2002;35(1–2):407–423. doi: 10.1300/J010v35n01_05. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Austin J, Weatherly NL, Gravina NE. Using task clarification, graphic feedback, and verbal feedback to increase closing-task completion in a privately owned restaurant. Journal of Applied Behavior Analysis. 2005;38(1):117–120. doi: 10.1901/jaba.2005.159-03. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Azrin NH, Brooks J, Kellen MJ, Ehle C, Vinas V. Speed of eating as a determinant of the bulimic desire to vomit. Child & Family Behavior Therapy. 2008;30(3):263–270. doi: 10.1080/07317100802275728. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Azrin NH, Kellen MJ, Brooks J, Ehle C, Vinas V. Relationship between rate of eating and degree of satiation. Child & Family Behavior Therapy. 2008;30(4):355–364. doi: 10.1080/07317100802483223. [ DOI ] [ Google Scholar ]
  • *. Bach AK, Barlow DH, Wincze JP. The enhancing effects of manualized treatment for erectile dysfunction among men using sildenafil: A preliminary investigation. Behavior Therapy. 2004;35(1):55–73. doi: 10.1016/s0005-7894(04)80004-2. [ DOI ] [ Google Scholar ]
  • Banda DR, Hart SL. Increasing peer-to-peer social skills through direct instruction of two elementary school girls with autism. Journal of Research in Special Educational Needs. 2010;10(2):124–132. doi: 10.1111/j.1471-3802.2010.01149.x. [ DOI ] [ Google Scholar ]
  • *. Banda DR, Hart SL, Liu-Gitz L. Impact of training peers and children with autism on social skills during center time activities in inclusive classrooms. Research in Autism Spectrum Disorders. 2010;4(4):619–625. doi: 10.1016/j.rasd.2009.12.005. [ DOI ] [ Google Scholar ]
  • *. Barbera ML, Kubina RM., Jr Using transfer procedures to teach tacts to a child with autism. Analysis of Verbal Behavior. 2005;21:155–161. doi: 10.1007/BF03393017. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Barry LM, Messer JJ. A practical application of self-management for students diagnosed with attention-deficit/hyperactivity disorder. Journal of Positive Behavior Interventions. 2003;5(4):238–248. doi: 10.1177/10983007030050040701. [ DOI ] [ Google Scholar ]
  • Bartels J, Douwes R, de Jong M, Pruyn A. Organizational identification during a merger: Determinants of employees’ expected identification with the new organization. British Journal of Management. 2006;17(Suppl 1):S49–S67. doi: 10.1111/j.1467-8551.2006.00478.x. [ DOI ] [ Google Scholar ]
  • *. Barton EE, Wolery M. Evaluation of e-mail feedback on the verbal behaviors of pre-service teachers. Journal of Early Intervention. 2007;30(1):55–72. doi: 10.1177/105381510703000105. [ DOI ] [ Google Scholar ]
  • *. Bass-Ringdahl SM. The relationship of audibility and the development of canonical babbling in young children with hearing impairment. Journal of Deaf Studies and Deaf Education. 2010;15(3):287–310. doi: 10.1093/deafed/enq013. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Batchelder A, McLaughlin TF, Weber KP, Derby KM, Gow T. The effects of hand-over-hand and a dot-to-dot tracing procedure on teaching an autistic student to write his name. Journal of Developmental and Physical Disabilities. 2009;21(2):131–138. doi: 10.1007/s10882-009-9131-2. [ DOI ] [ Google Scholar ]
  • Beautrais AL, Gibb SJ, Fergusson DM, Horwood LJ, Larkin GL. Removing bridge barriers stimulates suicides: An unfortunate natural experiment. Australian and New Zealand Journal of Psychiatry. 2009;43(6):495–497. doi: 10.1080/00048670902873714. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Beck KV, Miltenberger RG. Evaluation of a commercially available program and in situ training by parents to teach abduction-prevention skills to children. Journal of Applied Behavior Analysis. 2009;42(4):761–772. doi: 10.1901/jaba.2009.42-761. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Beeson PM, Egnor H. Combining treatment for written and spoken naming. Journal of the International Neuropsychological Society. 2006;12(6):816–827. doi: 10.1017/s1355617706061005. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Begeny JC, Martens BK. Assisting low-performing readers with a group-based reading fluency intervention. School Psychology Review. 2006;35(1):91–107. [ Google Scholar ]
  • *. Belfiore PJ, Fritts KM, Herman BC. The role of procedural integrity: Using self-monitoring to enhance Discrete Trial Instruction (DTI) Focus on Autism and Other Developmental Disabilities. 2008;23(2):95–102. doi: 10.1177/1088357607311445. [ DOI ] [ Google Scholar ]
  • *. Bell RJ, Skinner CH, Fisher LA. Decreasing putting yips in accomplished golfers via solution-focused guided imagery: A single-subject research design. Journal of Applied Sport Psychology. 2009;21(1):1–14. doi: 10.1080/10413200802443776. [ DOI ] [ Google Scholar ]
  • *. Benavides CA, Poulson CL. Task interspersal and performance of matching tasks by preschoolers with autism. Research in Autism Spectrum Disorders. 2009;3(3):619–629. doi: 10.1016/j.rasd.2008.12.001. [ DOI ] [ Google Scholar ]
  • *. Benedict EA, Horner RH, Squires JK. Assessment and implementation of positive behavior support in preschools. Topics in Early Childhood Special Education. 2007;27(3):174–192. doi: 10.1177/02711214070270030801. [ DOI ] [ Google Scholar ]
  • *. Bennett K, Brady MP, Scott J, Dukes C, Frain M. The effects of covert audio coaching on the job performance of supported employees. Focus on Autism and Other Developmental Disabilities. 2010;25(3):173–185. doi: 10.1177/1088357610371636. [ DOI ] [ Google Scholar ]
  • Berk RA, Sorenson SB, Wiebe DJ, Upchurch DM. The legalization of abortion and subsequent youth homicide: A time series analysis. Analyses of Social Issues and Public Policy (ASAP) 2003;3(1):45–64. doi: 10.1111/j.1530-2415.2003.00014.x. [ DOI ] [ Google Scholar ]
  • Bermúdez-Ornelas G, Hernández-Guzmán L. Tratamiento de una sesión de la fobia específica a las arañas en niños [One-session treatment of specific spider phobia in children] International Journal of Clinical and Health Psychology. 2008;8(3):779–791. [ Google Scholar ]
  • Birkan B, McClannahan LE, Krantz PJ. Effects of superimposition and background fading on the sight-word reading of a boy with autism. Research in Autism Spectrum Disorders. 2007;1(2):117–125. doi: 10.1016/j.rasd.2006.08.003. [ DOI ] [ Google Scholar ]
  • Björkman T, Hansson L. Case management for individuals with a severe mental illness: A 6-year follow-up study. International Journal of Social Psychiatry. 2007;53(1):12–22. doi: 10.1177/0020764006066849. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Blischak DM, Shah SD, Lombardino LJ, Chiarella K. Effects of phonemic awareness instruction on the encoding skills of children with severe speech impairment. Disability and Rehabilitation: An International, Multidisciplinary Journal. 2004;26(21–22):1295–1304. doi: 10.1080/09638280412331280325. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Bliss SL, Skinner CH, Adams R. Enhancing an English language learning fifth-grade student’s sight-word reading with a time-delay taped-words intervention. School Psychology Review. 2006;35(4):663–670. [ Google Scholar ]
  • *. Bloh C. Assessing transfer of stimulus control procedures across learners with autism. Analysis of Verbal Behavior. 2008;24:87–101. doi: 10.1007/BF03393059. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Boersma K, Linton S, Overmeer T, Jansson M, Vlaeyen J, de Jong J. Lowering fear-avoidance and enhancing function through exposure in vivo. A multiple baseline study across six patients with back pain. Pain. 2004;108(1–2):8–16. doi: 10.1016/j.pain.2003.03.001. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Borrero CSW, Vollmer TR. Experimental analysis and treatment of multiply controlled problem behavior: A systematic replication and extension. Journal of Applied Behavior Analysis. 2006;39(3):375–379. doi: 10.1901/jaba.2006.170-04. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Bosseler A, Massaro DW. Development and evaluation of a computer-animated tutor for vocabulary and language learning in children with autism. Journal of Autism and Developmental Disorders. 2003;33(6):653–672. doi: 10.1023/B:JADD.0000006002.82367.4f. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Botella C, Bretón-López J, Quero S, Baños R, García-Palacios A. Treating cockroach phobia with augmented reality. Behavior Therapy. 2010;41(3):401–413. doi: 10.1016/j.beth.2009.07.002. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Bowie SL, Barthelemy JJ, White G., Jr Federal welfare and housing policy at the crossroads: Outcomes from a rent incentive-based welfare-to-work initiative in a low-income, predominantly African American, urban public housing community. Journal of Human Behavior in the Social Environment. 2007;15(2–3):391–414. doi: 10.1300/J137v15n02_22. [ DOI ] [ Google Scholar ]
  • *. Bowman-Perrott LJ, Greenwood CR, Tapia Y. The efficacy of CWPT used in secondary alternative school classrooms with small teacher/pupil ratios and students with emotional and behavioral disorders. Education & Treatment of Children. 2007;30(3):65–87. [ Google Scholar ]
  • Boyer E, Miltenberger RG, Batsche C, Fogel V. Video modeling by experts with video feedback to enhance gymnastics skills. Journal of Applied Behavior Analysis. 2009;42(4):855–860. doi: 10.1901/jaba.2009.42-855. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Bradshaw W. Use of single-system research to evaluate the effectiveness of cognitive-behavioural treatment of schizophrenia. British Journal of Social Work. 2003;33(7):885–899. doi: 10.1093/bjsw/33.7.885. [ DOI ] [ Google Scholar ]
  • Brannick MT, Fabri PJ, Zayas-Castro J, Bryant RH. Evaluation of an error-reduction training program for surgical residents. Academic Medicine. 2009;84(12):1809–1814. doi: 10.1097/ACM.0b013e3181bf36b0. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Bray MA, Kehle TJ, Peck HL, Margiano SG, Dobson R, Peczynski K, Alric JM. Written emotional expression as an intervention for asthma: A replication. Journal of Applied School Psychology. 2005;22(1):141–165. doi: 10.1300/J370v22n01_08. [ DOI ] [ Google Scholar ]
  • *. Brenske S, Rudrud EH, Schulze KA, Rapp JT. Increasing activity attendance and engagement in individuals with dementia using descriptive prompts. Journal of Applied Behavior Analysis. 2008;41(2):273–277. doi: 10.1901/jaba.2008.41-273. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Bressi C, Lo Baido R, Manenti S, Frongia P, Guidotti B, Maggi L, Invernizzi G. Efficacia clinica della terapia familiare sistemica nella schizofrenia: Uno studio prospettico longitudinale [Schizophrenia and the clinical efficacy of systemic family therapy: A prospective longitudinal study] Rivista di Psichiatria. 2004;39(3):189–197. [ Google Scholar ]
  • Bressi C, Manenti S, Frongia P, Porcellana M, Invernizzi G. Systemic family therapy in schizophrenia: A randomized clinical trial of effectiveness. Psychotherapy and Psychosomatics. 2007;77(1):43–49. doi: 10.1159/000110059. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Buckley SD, Newchok DK. Differential impact of response effort within a response chain on use of mands in a student with autism. Research in Developmental Disabilities. 2005;26(1):77–85. doi: 10.1016/j.ridd.2004.07.004. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Burke RV, Andersen MN, Bowen SL, Howard MR, Allen KD. Evaluation of two instruction methods to increase employment options for young adults with autism spectrum disorders. Research in Developmental Disabilities. 2010;31(6):1223–1233. doi: 10.1016/j.ridd.2010.07.023. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Burns MK, Peters R, Noell GH. Using performance feedback to enhance implementation fidelity of the problem-solving team process. Journal of School Psychology. 2008;46(5):537–550. doi: 10.1016/j.jsp.2008.04.001. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Byrnes V. Getting a feel for the market: The use of privatized school management in Philadelphia. American Journal of Education. 2009;115(3):437–455. doi: 10.1086/597486. [ DOI ] [ Google Scholar ]
  • *. Calmels C, Berthoumieux C, d’Arripe-Longueville F. Effects of an imagery training program on selective attention of national softball players. The Sport Psychologist. 2004;18(3):272–296. [ Google Scholar ]
  • *. Calmels C, Holmes P, Berthoumieux C, Singer RN. The development of movement imagery vividness through a structured intervention in softball. Journal of Sport Behavior. 2004;27(4):307–322. [ Google Scholar ]
  • *. Camarata S, Yoder P, Camarata M. Simultaneous treatment of grammatical and speech-comprehensibility deficits in children with Down syndrome. Down Syndrome: Research & Practice. 2006;11(1):9–17. doi: 10.3104/reports.314. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Cancio EJ, West RP, Young KR. Improving mathematics homework completion and accuracy of students with EBD through self-management and parent participation. Journal of Emotional and Behavioral Disorders. 2004;12(1):9–22. doi: 10.1177/10634266040120010201. [ DOI ] [ Google Scholar ]
  • *. Cannon JE, Fredrick LD, Easterbrooks SR. Vocabulary instruction through books read in American Sign Language for English-language learners with hearing loss. Communication Disorders Quarterly. 2010;31(2):98–112. doi: 10.1177/1525740109332832. [ DOI ] [ Google Scholar ]
  • Cardman S, Ryan BP. Experimental analysis of the relationship between speaking rate and stuttering during mother-child conversation II. Journal of Developmental and Physical Disabilities. 2007;19(5):457–469. doi: 10.1007/s10882-007-9063-7. [ DOI ] [ Google Scholar ]
  • *. Carlson B, McLaughlin TF, Derby KM, Blecher J. Teaching preschool children with autism and developmental delays to write. Electronic Journal of Research in Educational Psychology. 2009;7(1):225–238. [ Google Scholar ]
  • Carlson JI, Luiselli JK, Slyman A, Markowski A. Choice-making as intervention for public disrobing in children with developmental disabilities. Journal of Positive Behavior Interventions. 2008;10(2):86–90. doi: 10.1177/1098300707312544. [ DOI ] [ Google Scholar ]
  • *. Carnahan C, Musti-Rao S, Bailey J. Promoting active engagement in small group learning experiences for students with autism and significant learning needs. Education & Treatment of Children. 2009;32(1):37–61. doi: 10.1353/etc.0.0047. [ DOI ] [ Google Scholar ]
  • *. Carrier MH, Côté G. Évaluation de l’efficacité d’un traitement cognitif-comportemental pour le trouble d’anxiété généralisée combiné à des strategiés de régulation des émotions et d’acceptation et d’engagement expérientiel [Evaluation of a cognitive-behavioral treatment for GAD combined with emotion regulation and acceptance-based strategies] European Review of Applied Psychology/Revue Européenne de Psychologie Appliquée. 2010;60(1):11–25. doi: 10.1016/j.erap.2009.06.002. [ DOI ] [ Google Scholar ]
  • *. Carter DR, Horner RH. Adding function-based behavioral support to First Step to Success: Integrating individualized and manualized practices. Journal of Positive Behavior Interventions. 2009;11(1):22–34. doi: 10.1177/1098300708319125. [ DOI ] [ Google Scholar ]
  • *. Carter DR, Norman RK. Class-wide positive behavior support in preschool: Improving teacher implementation through consultation. Early Childhood Education Journal. 2010;38(4):279–288. doi: 10.1007/s10643-010-0409-x. [ DOI ] [ Google Scholar ]
  • *. Casey AM, McWilliam RA. Graphical feedback to increase teachers’ use of incidental teaching. Journal of Early Intervention. 2008;30(3):251–268. doi: 10.1177/1053815108319038. [ DOI ] [ Google Scholar ]
  • *. Casey SD, Merical CL. The use of functional communication training without additional treatment procedures in an inclusive school setting. Behavioral Disorders. 2006;32(1):46–54. [ Google Scholar ]
  • *. Cass M, Cates D, Smith M, Jackson C. Effects of manipulative instruction on solving area and perimeter problems by students with learning disabilities. Learning Disabilities Research & Practice. 2003;18(2):112–120. doi: 10.1111/1540-5826.00067. [ DOI ] [ Google Scholar ]
  • *. Catania CN, Almeida D, Liu-Constant B, DiGennaro Reed FD. Video modeling to train staff to implement discrete-trial instruction. Journal of Applied Behavior Analysis. 2009;42(2):387–392. doi: 10.1901/jaba.2009.42-387. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Cautilli JD, Dziewolska H. Brief report: The use of opportunity to respond and practice to increase efficiency of the stepping reflex in a five-month-old infant. The Behavior Analyst Today. 2006;7(4):538–547. [ Google Scholar ]
  • Ceroni GB, Rucci P, Berardi D, Berti Ceroni F, Katon W. Case review vs. usual care in primary care patients with depression: A pilot study. General Hospital Psychiatry. 2002;24(2):71–80. doi: 10.1016/s0163-8343(01)00182-7. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Chaabane DBB, Alber-Morgan SR, DeBar RM. The effects of parent-implemented PECS training on improvisation of mands by children with autism. Journal of Applied Behavior Analysis. 2009;42(3):671–677. doi: 10.1901/jaba.2009.42-671. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Charlop MH, Malmberg DB, Berquist KL. An application of the Picture Exchange Communication System (PECS) with children with autism and a visually impaired therapist. Journal of Developmental and Physical Disabilities. 2008;20(6):509–525. doi: 10.1007/s10882-008-9112-x. [ DOI ] [ Google Scholar ]
  • Cho H, Wilke DJ. How has the Violence Against Women Act affected the response of the criminal justice system to domestic violence? Journal of Sociology and Social Welfare. 2005;32(4):125–139. [ Google Scholar ]
  • *. Choate ML, Pincus DB, Eyberg SM, Barlow DH. Parent-child interaction therapy for treatment of separation anxiety disorder in young children: A pilot study. Cognitive and Behavioral Practice. 2005;12(1):126–135. doi: 10.1016/s1077-7229(05)80047-1. [ DOI ] [ Google Scholar ]
  • *. Christian AH, Mills T, Simpson SL, Mosca L. Quality of cardiovascular disease preventive care and physician/practice characteristics. Journal of General Internal Medicine. 2006;21(3):231–237. doi: 10.1111/j.1525-1497.2006.00331.x. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Chu YH, Frongillo EA, Jones SJ, Kaye GL. Improving patrons’ meal selections through the use of point-of-selection nutrition labels. American Journal of Public Health. 2009;99(11):2001–2005. doi: 10.2105/ajph.2008.153205. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Ciemins EL. The effect of parity-induced copayment reductions on adolescent utilization of substance use services. Journal of Studies on Alcohol. 2004;65(6):731–735. doi: 10.15288/jsa.2004.65.731. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Ciemins EL, Blum L, Nunley M, Lasher A, Newman JM. The economic and clinical impact of an inpatient palliative care consultation service: A multifaceted approach. Journal of Palliative Medicine. 2007;10(6):1347–1355. doi: 10.1089/jpm.2007.0065. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Clarfield J, Stoner G. Research brief: The effects of computerized reading instruction on the academic performance of students identified with ADHD. School Psychology Review. 2005;34(2):246–254. [ Google Scholar ]
  • *. Clayton M, Helms B, Simpson C. Active prompting to decrease cell phone use and increase seat belt use while driving. Journal of Applied Behavior Analysis. 2006;39(3):341–349. doi: 10.1901/jaba.2006.153-04. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Clayton MC, Woodard C. The effect of response cards on participation and weekly quiz scores of university students enrolled in introductory psychology courses. Journal of Behavioral Education. 2007;16(3):250–258. doi: 10.1007/s10864-007-9038-x. [ DOI ] [ Google Scholar ]
  • *. Clyne C, Blampied NM. Training in emotion regulation as a treatment for binge eating: A preliminary study. Behaviour Change. 2004;21(4):269–281. doi: 10.1375/bech.21.4.269.66105. [ DOI ] [ Google Scholar ]
  • *. Codding RS, Livanis A, Pace GM, Vaca L. Using performance feedback to improve treatment integrity of classwide behavior plans: An investigation of observer reactivity. Journal of Applied Behavior Analysis. 2008;41(3):417–422. doi: 10.1901/jaba.2008.41-417. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Codding RS, Skowron J, Pace GM. Back to basics: Training teachers to interpret curriculum-based measurement data and create observable and measurable objectives. Behavioral Interventions. 2005;20(3):165–176. doi: 10.1002/bin.194. [ DOI ] [ Google Scholar ]
  • *. Codding RS, Smyth CA. Using performance feedback to decrease classroom transition time and examine collateral effects on academic engagement. Journal of Educational & Psychological Consultation. 2008;18(4):325–345. doi: 10.1080/10474410802463312. [ DOI ] [ Google Scholar ]
  • *. Coleman-Martin MB, Heller KW. Using a modified constant prompt-delay procedure to teach spelling to students with physical disabilities. Journal of Applied Behavior Analysis. 2004;37(4):469–480. doi: 10.1901/jaba.2004.37-469. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Collie-Akers V, Schultz JA, Carson V, Fawcett SB, Ronan M. Evaluating mobilization strategies with neighborhood and faith organizations to reduce risk for health disparities. Health Promotion Practice. 2009;10(2, Suppl):118S–127S. doi: 10.1177/1524839908331271. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Collins S, Higbee TS, Salzberg CL. The effects of video modeling on staff implementation of a problem-solving intervention with adults with developmental disabilities. Journal of Applied Behavior Analysis. 2009;42(4):849–854. doi: 10.1901/jaba.2009.42-849. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Conelea CA, Woods DW. Examining the impact of distraction on tic suppression in children and adolescents with Tourette syndrome. Behaviour Research and Therapy. 2008;46(11):1193–1200. doi: 10.1016/j.brat.2008.07.005. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Conroy MA, Asmus JM, Sellers JA, Ladwig CN. The use of an antecedent-based intervention to decrease stereotypic behavior in a general education classroom: A case study. Focus on Autism and Other Developmental Disabilities. 2005;20(4):223–230. doi: 10.1177/10883576050200040401. [ DOI ] [ Google Scholar ]
  • Cooper MD. Exploratory analyses of the effects of managerial support and feedback consequences on behavioral safety maintenance. Journal of Organizational Behavior Management. 2006;26(3):1–41. doi: 10.1300/J075v26n03_01. [ DOI ] [ Google Scholar ]
  • *. Cory L, Dattilo J, Williams R. Effects of a leisure education program on social knowledge and skills of youth with cognitive disabilities. Therapeutic Recreation Journal. 2006;40(3):144–164. [ Google Scholar ]
  • Coryn CLS, Schröter DC, Hanssen CE. Adding a time-series design element to the success case method to improve methodological rigor: An application for nonprofit program evaluation. American Journal of Evaluation. 2009;30(1):80–92. doi: 10.1177/1098214008326557. [ DOI ] [ Google Scholar ]
  • *. Craig-Unkefer LA, Kaiser AP. Increasing peer-directed social-communication skills of children enrolled in Head Start. Journal of Early Intervention. 2003;25(4):229–247. doi: 10.1177/105381510302500401. [ DOI ] [ Google Scholar ]
  • *. Crawley SH, Lynch P, Vannest K. The use of self-monitoring to reduce off-task behavior and cross-correlation examination of weekends and absences as an antecedent to off-task behavior. Child & Family Behavior Therapy. 2006;28(2):29–48. doi: 10.1300/J019v28n02_03. [ DOI ] [ Google Scholar ]
  • *. Creech J, Golden JA. Increasing Braille practice and reading comprehension in a student with visual impairment and moderate mental retardation: An initial study and follow-up. Journal of Developmental and Physical Disabilities. 2009;21(3):225–233. doi: 10.1007/s10882-009-9137-9. [ DOI ] [ Google Scholar ]
  • Crosland KA, Cigales M, Dunlap G, Neff B, Clark HB, Giddings T, Blanco A. Using staff training to decrease the use of restrictive procedures at two facilities for foster care children. Research on Social Work Practice. 2008;18(5):401–409. doi: 10.1177/1049731507314006. [ DOI ] [ Google Scholar ]
  • Crosland KA, Dunlap G, Sager W, Neff B, Wilcox C, Blanco A, Giddings T. The effects of staff training on the types of interactions observed at two group homes for foster care children. Research on Social Work Practice. 2008;18(5):410–420. doi: 10.1177/1049731507314000. [ DOI ] [ Google Scholar ]
  • *. Crozier S, Tincani MJ. Using a modified social story to decrease disruptive behavior of a child with autism. Focus on Autism and Other Developmental Disabilities. 2005;20(3):150–157. doi: 10.1177/10883576050200030301. [ DOI ] [ Google Scholar ]
  • *. Culig KM, Dickinson AM, Lindstrom-Hazel D, Austin J. Combining workstation design and performance management to increase ergonomically correct computer typing postures. Journal of Organizational Behavior Management. 2008;28(3):146–175. doi: 10.1080/01608060802251064. [ DOI ] [ Google Scholar ]
  • Da Fonte MA, Taber-Doughty T. The use of graphic symbols in infancy: How early can we start? Early Child Development and Care. 2010;180(4):417–439. doi: 10.1080/03004430802009141. [ DOI ] [ Google Scholar ]
  • *. Dallery J, Glenn IM, Raiff BR. An Internet-based abstinence reinforcement treatment for cigarette smoking. Drug and Alcohol Dependence. 2007;86(2–3):230–238. doi: 10.1016/j.drugalcdep.2006.06.013. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Dallery J, Meredith S, Glenn IM. A deposit contract method to deliver abstinence reinforcement for cigarette smoking. Journal of Applied Behavior Analysis. 2008;41(4):609–615. doi: 10.1901/jaba.2008.41-609. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Davis KM, Boon RT, Cihak DF, Fore C., III Power cards to improve conversational skills in adolescents with Asperger syndrome. Focus on Autism and Other Developmental Disabilities. 2010;25(1):12–22. doi: 10.1177/1088357609354299. [ DOI ] [ Google Scholar ]
  • de los Angeles Cruz-Almanza M, Gaona-Márquez L, Sánchez-Sosa JJ. Empowering women abused by their problem drinker spouses: Effects of a cognitive-behavioral intervention. Salud Mental. 2006;29(5):25–31. [ Google Scholar ]
  • *. DeQuinzio JA, Townsend DB, Poulson CL. The effects of forward chaining and contingent social interaction on the acquisition of complex sharing responses by children with autism. Research in Autism Spectrum Disorders. 2008;2(2):264–275. doi: 10.1016/j.rasd.2007.06.006. [ DOI ] [ Google Scholar ]
  • *. DeRosse P, Fields L. The contextually controlled, feature-mediated classification of symbols. Journal of the Experimental Analysis of Behavior. 2010;93(2):225–245. doi: 10.1901/jeab.2010.93-225. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Devlin P. Enhancing the job performance of employees with disabilities using the self-determined career development model. Education and Training in Developmental Disabilities. 2008;43(4):502–513. doi: 10.1352/1934-9556-49.4.221. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Digennaro-Reed FD, Codding R, Catania CN, Maguire H. Effects of video modeling on treatment integrity of behavioral interventions. Journal of Applied Behavior Analysis. 2010;43(2):291–295. doi: 10.1901/jaba.2010.43-291. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Dimling LM. Conceptually based vocabulary intervention: Second graders’ development of vocabulary words. American Annals of the Deaf. 2010;155(4):425–448. doi: 10.1353/aad.2010.0040. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Dixon MR, Holton B. Altering the magnitude of delay discounting by pathological gamblers. Journal of Applied Behavior Analysis. 2009;42(2):269–275. doi: 10.1901/jaba.2009.42-269. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Dodd S, Hupp SDA, Jewell JD, Krohn E. Using parents and siblings during a social story intervention for two children diagnosed with PDD-NOS. Journal of Developmental and Physical Disabilities. 2008;20(3):217–229. doi: 10.1007/s10882-007-9090-4. [ DOI ] [ Google Scholar ]
  • Dojo Y, Tanaka-Matsumi J, Inoue N. Effect of goal setting with a target behavior card on class preparation behavior of children in a regular classroom. The Japanese Journal of Behavior Analysis. 2004;19(2):148–160. [ Google Scholar ]
  • *. Donaldson JM, Normand MP. Using goal setting, self-monitoring, and feedback to increase calorie expenditure in obese adults. Behavioral Interventions. 2009;24(2):73–83. doi: 10.1002/bin.277. [ DOI ] [ Google Scholar ]
  • Dorminy KP, Luscre D, Gast DL. Teaching organizational skills to children with high functioning autism and Asperger’s syndrome. Education and Training in Developmental Disabilities. 2009;44(4):538–550. [ Google Scholar ]
  • *. Downs A, Downs RC, Rau K. Effects of training and feedback on discrete trial teaching skills and student performance. Research in Developmental Disabilities. 2008;29(3):235–246. doi: 10.1016/j.ridd.2007.05.001. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Drager KDR, Postal VJ, Carrolus L, Castellano M, Gagliano C, Glynn J. The effect of aided language modeling on symbol comprehension and production in 2 preschoolers with autism. American Journal of Speech-Language Pathology. 2006;15(2):112–125. doi: 10.1044/1058-0360(2006/012). [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Driscoll C, Carter M. The effects of social and isolate toys on the social interaction of preschool children with disabilities. Journal of Developmental and Physical Disabilities. 2009;21(4):279–300. doi: 10.1007/s10882-009-9142-z. [ DOI ] [ Google Scholar ]
  • *. Driscoll C, Carter M. The effects of spatial density on the social interaction of preschool children with disabilities. International Journal of Disability, Development and Education. 2010;57(2):191–206. doi: 10.1080/10349121003750836. [ DOI ] [ Google Scholar ]
  • *. Dufrene BA, Reisener CD, Olmi DJ, Zoder-Martell K, McNutt MR, Horn DR. Peer tutoring for reading fluency as a feasible and effective alternative in response to intervention systems. Journal of Behavioral Education. 2010;19(3):239–256. doi: 10.1007/s10864-010-9111-8. [ DOI ] [ Google Scholar ]
  • *. Dugas MJ, Ladouceur R. Treatment of GAD: Targeting intolerance of uncertainty in two types of worry. Behavior Modification. 2000;24(5):635–657. doi: 10.1177/0145445500245002. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Duhon GJ, House SE, Poncy BC, Hastings KW, McClurg SC. An examination of two techniques for promoting response generalization of early literacy skills. Journal of Behavioral Education. 2010;19(1):62–75. doi: 10.1007/s10864-010-9097-2. [ DOI ] [ Google Scholar ]
  • *. Earleywine M, Van Dam NT. Case studies in cannabis vaporization. Addiction Research & Theory. 2010;18(3):243–249. doi: 10.3109/16066350902974753. [ DOI ] [ Google Scholar ]
  • Easterbrooks SR, Stoner M. Using a visual tool to increase adjectives in the written language of students who are deaf or hard of hearing. Communication Disorders Quarterly. 2006;27(2):95–109. doi: 10.1177/15257401060270020701. [ DOI ] [ Google Scholar ]
  • *. Ebanks ME, Fisher WW. Altering the timing of academic prompts to treat destructive behavior maintained by escape. Journal of Applied Behavior Analysis. 2003;36(3):355–359. doi: 10.1901/jaba.2003.36-355. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Ebert KD, Kohnert K. Non-linguistic cognitive treatment for primary language impairment. Clinical Linguistics & Phonetics. 2009;23(9):647–664. doi: 10.1080/02699200902998770. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Eckman N, Williams KE, Riegel K, Paul C. Teaching chewing: A structured approach. American Journal of Occupational Therapy. 2008;62(5):514–521. doi: 10.5014/ajot.62.5.514. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Eikenhout N, Austin J. Using goals, feedback, reinforcement, and a performance matrix to improve customer service in a large department store. Journal of Organizational Behavior Management. 2004;24(3):27–62. doi: 10.1300/J075v24n03_02. [ DOI ] [ Google Scholar ]
  • Eikeseth S, Nesset R. Behavioral treatment of children with phonological disorder: The efficacy of vocal imitation and sufficient-response-exemplar training. Journal of Applied Behavior Analysis. 2003;36(3):325–337. doi: 10.1901/jaba.2003.36-325. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Eisen AR, Raleigh H, Neuhoff CC. The unique impact of parent training for separation anxiety disorder in children. Behavior Therapy. 2008;39(2):195–206. doi: 10.1016/j.beth.2007.07.004. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Elias NC, Goyos C, Saunders M, Saunders R. Teaching manual signs to adults with mental retardation using matching-to-sample procedures and stimulus equivalence. Analysis of Verbal Behavior. 2008;24:1–13. doi: 10.1007/BF03393053. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Embregts PJCM. Effectiveness of video feedback and self-management on inappropriate social behavior of youth with mild mental retardation. Research in Developmental Disabilities. 2000;21(5):409–423. doi: 10.1016/s0891-4222(00)00052-4. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Endo Y, Ohkubo K, Gomi Y, Noguchi M, Takahashi N, Takei S, Noro F. Application of interdependent group-oriented contingencies to cleaning behaviors of students in an elementary school: Effects of class-wide intervention and social validity. The Japanese Journal of Behavior Analysis. 2007;22(1):17–30. [ Google Scholar ]
  • *. Engel JM, Jensen MP, Schwartz L. Outcome of biofeedback-assisted relaxation for pain in adults with cerebral palsy: Preliminary findings. Applied Psychophysiology and Biofeedback. 2004;29(2):135–140. doi: 10.1023/B:APBI.0000026639.95223.6f. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Fabiano GA, Pelham WE., Jr Improving the effectiveness of behavioral classroom interventions for attention-deficit/hyperactivity disorder: A case study. Journal of Emotional and Behavioral Disorders. 2003;11(2):124–130. doi: 10.1177/106342660301100206. [ DOI ] [ Google Scholar ]
  • *. Facon B, Beghin M, Rivière V. The reinforcing effect of contingent attention on verbal perseverations of two children with severe visual impairment. Journal of Behavior Therapy and Experimental Psychiatry. 2007;38(1):23–28. doi: 10.1016/j.jbtep.2006.01.001. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Fazzio D, Martin GL, Arnal L, Yu DCT. Instructing university students to conduct discrete-trials teaching with children with autism. Research in Autism Spectrum Disorders. 2009;3(1):57–66. doi: 10.1016/j.rasd.2008.04.002. [ DOI ] [ Google Scholar ]
  • Feather JS, Ronan KR. Trauma-focused cognitive-behavioural therapy for abused children with posttraumatic stress disorder: A pilot study. New Zealand Journal of Psychology. 2006;35(3):132–145. [ Google Scholar ]
  • *. Feather JS, Ronan KR. Trauma-focused CBT with maltreated children: A clinic-based evaluation of a new treatment manual. Australian Psychologist. 2009;44(3):174–194. doi: 10.1080/00050060903147083. [ DOI ] [ Google Scholar ]
  • Fenstermacher K, Olympia D, Sheridan SM. Effectiveness of a computer-facilitated interactive social skills training program for boys with attention deficit hyperactivity disorder. School Psychology Quarterly. 2006;21(2):197–224. doi: 10.1521/scpq.2006.21.2.197. [ DOI ] [ Google Scholar ]
  • Ferguson H, Myles BS, Hagiwara T. Using a personal digital assistant to enhance the independence of an adolescent with Asperger syndrome. Education and Training in Developmental Disabilities. 2005;40(1):60–67. [ Google Scholar ]
  • *. Ferrara SLN. Reading fluency and self-efficacy: A case study. International Journal of Disability, Development and Education. 2005;52(3):215–231. doi: 10.1080/10349120500252858. [ DOI ] [ Google Scholar ]
  • *. Fienup DM, Doepke K. Evaluation of a changing criterion intervention to increase fluent responding with an elementary age student with autism. International Journal of Behavioral Consultation and Therapy. 2008;4(3):297–303. [ Google Scholar ]
  • *. Filter KJ, Horner RH. Function-based academic interventions for problem behavior. Education & Treatment of Children. 2009;32(1):1–19. doi: 10.1353/etc.0.0043. [ DOI ] [ Google Scholar ]
  • *. Finkel AS, Weber KP, Derby KM. Use of a Braille Exchange Communication System to improve articulation and acquire mands with a legally blind and developmentally disabled female. Journal of Developmental and Physical Disabilities. 2004;16(4):321–336. doi: 10.1007/s10882-004-0689-4. [ DOI ] [ Google Scholar ]
  • *. Finnigan E, Starr E. Increasing social responsiveness in a child with autism: A comparison of music and non-music interventions. Autism. 2010;14(4):321–348. doi: 10.1177/1362361309357747. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Fischer JL, Howard JS, Sparkman CR, Moore AG. Establishing generalized syntactical responding in young children with autism. Research in Autism Spectrum Disorders. 2010;4(1):76–88. doi: 10.1016/j.rasd.2009.07.009. [ DOI ] [ Google Scholar ]
  • *. Fleming CV, Wheeler GM, Cannella-Malone HI, Basbagill AR, Chung Y-C, Day KG. An evaluation of the use of eye gaze to measure preference of individuals with severe physical and developmental disabilities. Developmental Neurorehabilitation. 2010;13(4):266–275. doi: 10.3109/17518421003705706. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Flood WA, Wilder DA. The use of differential reinforcement and fading to increase time away from a caregiver in a child with separation anxiety disorder. Education & Treatment of Children. 2004;27(1):1–8. [ Google Scholar ]
  • *. Forquer LM, Johnson CM. Continuous white noise to reduce resistance going to sleep and night wakings in toddlers. Child & Family Behavior Therapy. 2005;27(2):1–10. doi: 10.1300/J019v27n02_01. [ DOI ] [ Google Scholar ]
  • *. Forquer LM, Johnson CM. Continuous white noise to reduce sleep latency and night wakings in college students. Sleep and Hypnosis. 2007;9(2):60–66. [ Google Scholar ]
  • *. Forrest K, Iuzzini J. A comparison of oral motor and production training for children with speech sound disorders. Seminars in Speech and Language. 2008;29(4):304–311. doi: 10.1055/s-0028-1103394. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Foxx RM, Schreck KA, Garito J, Smith A, Weisenberger S. Replacing the echolalia of children with autism with functional use of verbal labeling. Journal of Developmental and Physical Disabilities. 2004;16(4):307–320. doi: 10.1007/s10882-004-0688-5. [ DOI ] [ Google Scholar ]
  • *. France KG, Blampied NM. Modifications of systematic ignoring in the management of infant sleep disturbance: Efficacy and infant distress. Child & Family Behavior Therapy. 2005;27(1):1–16. doi: 10.1300/J019v27n01_01. [ DOI ] [ Google Scholar ]
  • *. Franzen K, Kamps D. The utilization and effects of positive behavior support strategies on an urban school playground. Journal of Positive Behavior Interventions. 2008;10(3):150–161. doi: 10.1177/1098300708316260. [ DOI ] [ Google Scholar ]
  • Frea WD, Arnold CL, Vittimberga GL. A demonstration of the effects of augmentative communication on the extreme aggressive behavior of a child with autism within an integrated preschool setting. Journal of Positive Behavior Interventions. 2001;3(4):194–198. doi: 10.1177/109830070100300401. [ DOI ] [ Google Scholar ]
  • *. Freeman P, Rees T, Hardy L. An intervention to increase social support and improve performance. Journal of Applied Sport Psychology. 2009;21(2):186–200. doi: 10.1080/10413200902785829. [ DOI ] [ Google Scholar ]
  • Fujita Y, Hasegawa Y. Weight control: Selection of low-calorie food. The Japanese Journal of Behavior Analysis. 2003;18(1):3–9. [ Google Scholar ]
  • *. Ganz JB, Bourgeois BC, Flores MM, Campos BA. Implementing visually cued imitation training with children with autism spectrum disorders and developmental delays. Journal of Positive Behavior Interventions. 2008;10(1):56–66. doi: 10.1177/1098300707311388. [ DOI ] [ Google Scholar ]
  • *. Ganz JB, Flores MM. Effects of the use of visual strategies in play groups for children with autism spectrum disorders and their peers. Journal of Autism and Developmental Disorders. 2008;38(5):926–940. doi: 10.1007/s10803-007-0463-4. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Ganz JB, Flores MM. The effectiveness of direct instruction for teaching language to children with autism spectrum disorders: Identifying materials. Journal of Autism and Developmental Disorders. 2009;39(1):75–83. doi: 10.1007/s10803-008-0602-6. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Ganz JB, Kaylor M, Bourgeois B, Hadden K. The impact of social scripts and visual cues on verbal communication in three children with autism spectrum disorders. Focus on Autism and Other Developmental Disabilities. 2008;23(2):79–94. doi: 10.1177/1088357607311447. [ DOI ] [ Google Scholar ]
  • *. Ganz JB, Sigafoos J. Self-monitoring: Are young adults with MR and autism able to utilize cognitive strategies independently? Education and Training in Developmental Disabilities. 2005;40(1):24–33. [ Google Scholar ]
  • *. Garrity ML, Luiselli JK, McCollum SA. Effects of a supervisory intervention on assessment of interobserver agreement by educational service providers. Behavioral Interventions. 2008;23(2):105–112. doi: 10.1002/bin.258. [ DOI ] [ Google Scholar ]
  • *. Gena A, Couloura S, Kymissis E. Modifying the affective behavior of preschoolers with autism using in-vivo or video modeling and reinforcement contingencies. Journal of Autism and Developmental Disorders. 2005;35(5):545–556. doi: 10.1007/s10803-005-0014-9. [ DOI ] [ PubMed ] [ Google Scholar ]
  • German DJ. A phonologically based strategy to improve word-finding abilities in children. Communication Disorders Quarterly. 2002;23(4):177–190. doi: 10.1177/15257401020230040301. [ DOI ] [ Google Scholar ]
  • *. Gilbertson D, Witt JC, Duhon G, Dufrene B. Using brief assessments to select math fluency and on-task behavior interventions: An investigation of treatment utility. Education & Treatment of Children. 2008;31(2):167–181. doi: 10.1353/etc.0.0023. [ DOI ] [ Google Scholar ]
  • *. Girling-Butcher RD, Ronan KR. Brief cognitive-behavioural therapy for children with anxiety disorders: Initial evaluation of a program designed for clinic settings. Behaviour Change. 2009;26(1):27–53. doi: 10.1375/bech.26.1.27. [ DOI ] [ Google Scholar ]
  • Glassman TJ, Dodd V, Miller EM, Braun RE. Preventing high-risk drinking among college students: A social marketing case study. Social Marketing Quarterly. 2010;16(4):92–110. doi: 10.1080/15245004.2010.522764. [ DOI ] [ Google Scholar ]
  • Goldkamp JS, Vîlcicã ER. Targeted enforcement and adverse system side effects: The generation of fugitives in Philadelphia. Criminology: An Interdisciplinary Journal. 2008;46(2):371–409. doi: 10.1111/j.1745-9125.2008.00113.x. [ DOI ] [ Google Scholar ]
  • Gómez S, López F, Martín CB, Barnes-Holmes Y, Barnes-Holmes D. Exemplar training and a derived transformation of functions in accordance with symmetry and equivalence. The Psychological Record. 2007;57(2):273–294. [ Google Scholar ]
  • *. Goodman JI, Brady MP, Duffy ML, Scott J, Pollard NE. The effects of “bug-in-ear” supervision on special education teachers’ delivery of learn units. Focus on Autism and Other Developmental Disabilities. 2008;23(4):207–216. doi: 10.1177/1088357608324713. [ DOI ] [ Google Scholar ]
  • *. Gorman DM, Huber JC., Jr Do medical cannabis laws encourage cannabis use? International Journal of Drug Policy. 2007;18(3):160–167. doi: 10.1016/j.drugpo.2006.10.001. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Gravina N, Austin J, Schoedtder L, Loewy S. The effects of self-monitoring on safe posture performance. Journal of Organizational Behavior Management. 2008;28(4):238–259. doi: 10.1080/01608060802454825. [ DOI ] [ Google Scholar ]
  • *. Gravina N, Lindstrom-Hazel D, Austin J. The effects of workstation changes and behavioral interventions on safe typing postures in an office. Work: Journal of Prevention, Assessment & Rehabilitation. 2007;29(3):245–253. [ PubMed ] [ Google Scholar ]
  • *. Gravina N, Wilder DA, White H, Fabian T. The effect of raffle odds on signing in at a treatment center for adults with mental illness. Journal of Organizational Behavior Management. 2004;24(4):31–42. doi: 10.1300/J075v24n04_02. [ DOI ] [ Google Scholar ]
  • *. Gregg MJ, Hrycaiko D, Mactavish JB, Martin GL. A mental skills package for Special Olympics athletes: A preliminary study. Adapted Physical Activity Quarterly. 2004;21(1):4–18. [ Google Scholar ]
  • *. Grey I, Healy O, Leader G, Hayes D. Using a Time Timer™ to increase appropriate waiting behavior in a child with developmental disabilities. Research in Developmental Disabilities. 2009;30(2):359–366. doi: 10.1016/j.ridd.2008.07.001. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Grissom T, Ward P, Martin B, Leenders NYJM. Physical activity in physical education: Teacher or technology effects. Family & Community Health: The Journal of Health Promotion & Maintenance. 2005;28(2):125–129. doi: 10.1097/00003727-200504000-00004. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Grskovic JA, Hall AM, Montgomery DJ, Vargas AU, Zentall SS, Belfiore PJ. Reducing time-out assignments for students with emotional/behavioral disorders in a self-contained classroom. Journal of Behavioral Education. 2004;13(1):25–36. doi: 10.1023/b:jobe.0000011258.06561.82. [ DOI ] [ Google Scholar ]
  • Gryiec M, Grandy S, McLaughlin TF. The effects of the copy, cover, and compare procedure in spelling with an elementary student with fetal alcohol syndrome. Journal of Precision Teaching & Celeration. 2004;20(1):2–8. [ Google Scholar ]
  • Gupta A, Naorem T. Cognitive retraining in epilepsy. Brain Injury. 2003;17(2):161–174. doi: 10.1080/0269905021000010195. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Gyovai LK, Cartledge G, Kourea L, Yurick A, Gibson L. Early reading intervention: Responding to the learning needs of young at-risk English language learners. Learning Disability Quarterly. 2009;32(3):143–162. [ Google Scholar ]
  • *. Haddad K, Tremayne P. The effects of centering on the free-throw shooting performance of young athletes. The Sport Psychologist. 2009;23(1):118–136. [ Google Scholar ]
  • *. Hall LJ, Grundon GS, Pope C, Romero AB. Training paraprofessionals to use behavioral strategies when educating learners with autism spectrum disorders across environments. Behavioral Interventions. 2010;25(1):37–51. [ Google Scholar ]
  • Halsteinli V. Treatment intensity in child and adolescent mental health services and health care reform in Norway, 1998–2006. Psychiatric Services. 2010;61(3):280–285. doi: 10.1176/appi.ps.61.3.280. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Hamilton RA, Scott D, MacDougall MP. Assessing the effectiveness of self-talk interventions on endurance performance. Journal of Applied Sport Psychology. 2007;19(2):226–239. doi: 10.1080/10413200701230613. [ DOI ] [ Google Scholar ]
  • *. Hansen DL, Morgan RL. Teaching grocery store purchasing skills to students with intellectual disabilities using a computer-based instruction program. Education and Training in Developmental Disabilities. 2008;43(4):431–442. [ Google Scholar ]
  • *. Hanser GA, Erickson KA. Integrated word identification and communication instruction for students with complex communication needs: Preliminary results. Focus on Autism and Other Developmental Disabilities. 2007;22(4):268–278. doi: 10.1177/10883576070220040901. [ DOI ] [ Google Scholar ]
  • *. Harding JW, Wacker DP, Berg WK, Winborn-Kemmerer L, Lee JF, Ibrahimovic M. Analysis of multiple manding topographies during functional communication training. Education & Treatment of Children. 2009;32(1):21–36. doi: 10.1353/etc.0.0045. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Hart JM, Fritz JM, Kerrigan DC, Saliba EN, Gansneder BM, Ingersoll CD. Quadriceps inhibition after repetitive lumbar extension exercise in persons with a history of low back pain. Journal of Athletic Training. 2006;41(3):264–269. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Hartley ET, Kehle TJ, Bray MA. Increasing student classroom participation through self-modeling. Journal of Applied School Psychology. 2002;19(1):51–63. [ Google Scholar ]
  • *. Hartnedy SL, Mozzoni MP, Fahoum Y. The effect of fluency training on math and reading skills in neuropsychiatric diagnosis children: A multiple baseline design. Behavioral Interventions. 2005;20(1):27–36. doi: 10.1002/bin.167. [ DOI ] [ Google Scholar ]
  • *. Haslam SA, Reicher S. Identity entrepreneurship and the consequences of identity failure: The dynamics of leadership in the BBC prison study. Social Psychology Quarterly. 2007;70(2):125–147. doi: 10.1177/019027250707000204. [ DOI ] [ Google Scholar ]
  • Hastie PA, Sharpe T. Introducing a changing-criterion design to hold students accountable in structured physical activity settings. Journal of Evidence-Based Practices for Schools. 2006;7(1):73–88. [ Google Scholar ]
  • Hastie PA, Sharpe T. Implementation guidelines: Introducing a changing-criterion design to hold students accountable in structured physical activity settings. Journal of Evidence-Based Practices for Schools. 2006;7(1):89–91. [ Google Scholar ]
  • *. Hayter S, Scott E, McLaughlin TF, Weber KP. The use of a modified direct instruction flashcard system with two high school students with developmental disabilities. Journal of Developmental and Physical Disabilities. 2007;19(4):409–415. doi: 10.1007/s10882-007-9059-3. [ DOI ] [ Google Scholar ]
  • Hayward D, Eikeseth S, Gale C, Morgan S. Assessing progress during treatment for young children with autism receiving intensive behavioural interventions. Autism. 2009;13(6):613–633. doi: 10.1177/1362361309340029. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Heering PW, Wilder DA. The use of dependent group contingencies to increase on-task behavior in two general education classrooms. Education & Treatment of Children. 2006;29(3):459–468. [ Google Scholar ]
  • *. Hernandez E, Hanley GP, Ingvarsson ET, Tiger JH. A preliminary evaluation of the emergence of novel mand forms. Journal of Applied Behavior Analysis. 2007;40(1):137–156. doi: 10.1901/jaba.2007.96-05. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Hetzroni OE, Tannous J. Effects of a computer-based intervention program on the communicative functions of children with autism. Journal of Autism and Developmental Disorders. 2004;34(2):95–113. doi: 10.1023/B:JADD.0000022602.40506.bf. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Higbee TS, Chang S-M, Endicott K. Noncontingent access to preferred sensory stimuli as a treatment for automatically reinforced stereotypy. Behavioral Interventions. 2005;20(3):177–184. doi: 10.1002/bin.190. [ DOI ] [ Google Scholar ]
  • *. Hillman HL, Miller LK. The effects of a spouse implemented contingency contract on asthma medication adherence. The Behavior Analyst Today. 2009;10(1):1–6. [ Google Scholar ]
  • *. Himle MB, Woods DW, Conelea CA, Bauer CC, Rice KA. Investigating the effects of tic suppression on premonitory urge ratings in children and adolescents with Tourette’s syndrome. Behaviour Research and Therapy. 2007;45(12):2964–2976. doi: 10.1016/j.brat.2007.08.007. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Hirasawa N, Fujiwara Y, Yamane M. Physical arrangements and staff implementation of function-based interventions in school and community settings. Japanese Journal of Special Education. 2009;46(6):435–446. [ Google Scholar ]
  • Hirayoshi S, Nakajima S. Do rats press the lever for longer than the required duration?: Differential reinforcement of response duration and response topography. The Japanese Journal of Behavior Analysis. 2003;18(2):99–107. [ Google Scholar ]
  • *. Hitchcock CH, Prater MA, Dowrick PW. Reading comprehension and fluency: Examining the effects of tutoring and video self-modeling on first-grade students with reading difficulties. Learning Disability Quarterly. 2004;27(2):89–103. doi: 10.2307/1593644. [ DOI ] [ Google Scholar ]
  • *. Holzer ML, Madaus JW, Bray MA, Kehle TJ. The test-taking strategy intervention for college students with learning disabilities. Learning Disabilities Research & Practice. 2009;24(1):44–56. doi: 10.1111/j.1540-5826.2008.01276.x. [ DOI ] [ Google Scholar ]
  • Hoskins S, Coleman M, McNeely D. Stress in careers of individuals with dementia and community mental health teams: An uncontrolled evaluation study. Journal of Advanced Nursing. 2005;50(3):325–333. doi: 10.1111/j.1365-2648.2005.03396.x. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Hough MS. Melodic intonation therapy and aphasia: Another variation on a theme. Aphasiology. 2010;24(6–8):775–786. doi: 10.1080/02687030903501941. [ DOI ] [ Google Scholar ]
  • Howard CD, Barrett AF, Frick TW. Anonymity to promote peer feedback: Pre-service teachers’ comments in asynchronous computer-mediated communication. Journal of Educational Computing Research. 2010;43(1):89–112. doi: 10.2190/EC.43.1.f. [ DOI ] [ Google Scholar ]
  • Høye A, Rezvy G, Hansen V, Olstad R. The effect of gender in diagnosing early schizophrenia: An experimental case simulation study. Social Psychiatry and Psychiatric Epidemiology. 2006;41(7):549–555. doi: 10.1007/s00127-006-0066-y. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Huitema BE. Analysis of interrupted time-series experiments using ITSE: A critique. Understanding Statistics. 2004;3(1):27–46. doi: 10.1207/s15328031us0301_2. [ DOI ] [ Google Scholar ]
  • *. Hundert JP. Training classroom and resource preschool teachers to develop inclusive class interventions for children with disabilities: Generalization to new intervention targets. Journal of Positive Behavior Interventions. 2007;9(3):159–173. doi: 10.1177/10983007070090030401. [ DOI ] [ Google Scholar ]
  • Hupp SDA, Allen KD. Using an audio cueing procedure to increase rate of parental attention during parent training. Child & Family Behavior Therapy. 2005;27(2):43–49. doi: 10.1300/J019v27n02_04. [ DOI ] [ Google Scholar ]
  • *. Ingersoll B, Dvortcsak A, Whalen C, Sikora D. The effects of a developmental, social-pragmatic language intervention on rate of expressive language production in young children with autistic spectrum disorders. Focus on Autism and Other Developmental Disabilities. 2005;20(4):213–222. doi: 10.1177/10883576050200040301. [ DOI ] [ Google Scholar ]
  • *. Ingersoll B, Gergans S. The effect of a parent-implemented imitation intervention on spontaneous imitation skills in young children with autism. Research in Developmental Disabilities. 2007;28(2):163–175. doi: 10.1016/j.ridd.2006.02.004. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Ingersoll B, Lalonde K. The impact of object and gesture imitation training on language use in children with autism spectrum disorder. Journal of Speech, Language, and Hearing Research. 2010;53(4):1040–1051. doi: 10.1044/1092-4388(2009/09-0043). [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Ingersoll B, Lewis E, Kroman E. Teaching the imitation and spontaneous use of descriptive gestures in young children with autism using a naturalistic behavioral intervention. Journal of Autism and Developmental Disorders. 2007;37(8):1446–1456. doi: 10.1007/s10803-006-0221-z. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Ingersoll B, Schreibman L. Teaching reciprocal imitation skills to young children with autism using a naturalistic behavioral approach: Effects on language, pretend play, and joint attention. Journal of Autism and Developmental Disorders. 2006;36(4):487–505. doi: 10.1007/s10803-006-0089-y. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Ingvarsson ET, Hanley GP. An evaluation of computer-based programmed instruction for promoting teachers’ greetings of parents by name. Journal of Applied Behavior Analysis. 2006;39(2):203–214. doi: 10.1901/jaba.2006.18-05. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Iqbal N, Caswell HL, Hare DJ, Pilkington O, Mercer S, Duncan S. Neuropsychological profiles of patients with juvenile myoclonic epilepsy and their siblings: A preliminary controlled experimental video-EEG case series. Epilepsy & Behavior. 2009;14(3):516–521. doi: 10.1016/j.yebeh.2008.12.025. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Ishida M. Effects of recasts on the acquisition of the aspectual form -te i-(ru) by learners of Japanese as a foreign language. Language Learning. 2004;54(2):311–394. doi: 10.1111/j.1467-9922.2004.00257.x. [ DOI ] [ Google Scholar ]
  • *. Ivy JW, Schreck KA. A behavioral approach to training day care workers. International Journal of Behavioral Consultation and Therapy. 2008;4(2):227–238. [ Google Scholar ]
  • *. Jameson JM, McDonnell J, Johnson JW, Riesen T, Polychronis S. A comparison of one-to-one embedded instruction in the general education classroom and one-to-one massed practice instruction in the special education classroom. Education & Treatment of Children. 2007;30(1):23–44. doi: 10.1353/etc.2007.0001. [ DOI ] [ Google Scholar ]
  • *. Jansson S, Söderlund A. A new treatment programme to improve balance in elderly people-an evaluation of an individually tailored home-based exercise programme in five elderly women with a feeling of unsteadiness. Disability and Rehabilitation: An International, Multidisciplinary Journal. 2004;26(24):1431–1443. doi: 10.1080/09638280400000245. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Jason LA, Braciszewski J, Olson BD, Ferrari JR. Increasing the number of mutual help recovery homes for substance abusers: Effects of government policy and funding assistance. Behavior and Social Issues. 2005;14(1):71–79. [ Google Scholar ]
  • Jindal-Snape D. Generalization and maintenance of social skills of children with visual impairments: Self-evaluation and the role of feedback. Journal of Visual Impairment & Blindness. 2004;98(8):470–483. [ Google Scholar ]
  • *. Jo Rodriguez B, Loman SL, Horner RH. A preliminary analysis of the effects of coaching feedback on teacher implementation fidelity of First Step to Success. Behavior Analysis in Practice. 2009;2(2):11–21. doi: 10.1007/BF03391744. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Jöbges M, Heuschkel G, Pretzel C, Illhardt C, Renner C, Hummelsheim H. Repetitive training of compensatory steps: A therapeutic approach for postural instability in Parkinson’s disease. Journal of Neurology, Neurosurgery & Psychiatry. 2004;75(12):1682–1687. doi: 10.1136/jnnp.2003.016550. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Johnston SS, Buchanan S, Davenport L. Comparison of fixed and gradual array when teaching sound-letter correspondence to two children with autism who use AAC. AAC: Augmentative and Alternative Communication. 2009;25(2):136–144. doi: 10.1080/07434610902921516. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Jones PH, Ryan BP. Experimental analysis of the relationship between speaking rate and stuttering during mother–child conversation. Journal of Developmental and Physical Disabilities. 2001;13(3):279–305. doi: 10.1023/a:1016610420533. [ DOI ] [ Google Scholar ]
  • *. Jung S, Sainato DM, Davis CA. Using high-probability request sequences to increase social interactions in young children with autism. Journal of Early Intervention. 2008;30(3):163–187. doi: 10.1177/1053815108317970. [ DOI ] [ Google Scholar ]
  • *. Kahng S, Boscoe JH, Byrne S. The use of escape contingency and a token economy to increase food acceptance. Journal of Applied Behavior Analysis. 2003;36(3):349–353. doi: 10.1901/jaba.2003.36-349. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Kalarchian MA, Marcus M, Levine MD, Haas GL, Greeno CG, Weissfeld LA, Qin L. Behavioral treatment of obesity in patients taking antipsychotic medications. Journal of Clinical Psychiatry. 2005;66(8):1058–1063. doi: 10.4088/JCP.v66n0815. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Kapoor VG, Bray MA, Kehle TJ. School-based intervention: Relaxation and guided imagery for students with asthma and anxiety disorder. Canadian Journal of School Psychology. 2010;25(4):311–327. doi: 10.1177/0829573510375551. [ DOI ] [ Google Scholar ]
  • *. Karmali I, Greer RD, Nuzzolo-Gomez R, Ross DE, Rivera-Valdes C. Reducing palilalia by presenting tact corrections to young children with autism. Analysis of Verbal Behavior. 2005;21:145–153. doi: 10.1007/BF03393016. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Kashinath S, Woods J, Goldstein H. Enhancing generalized teaching strategy use in daily routines by parents of children with autism. Journal of Speech, Language, and Hearing Research. 2006;49(3):466–485. doi: 10.1044/1092-4388(2006/036). [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Kay S, Harchik AF, Luiselli JK. Elimination of drooling by an adolescent student with autism attending public high school. Journal of Positive Behavior Interventions. 2006;8(1):24–28. doi: 10.1177/10983007060080010401. [ DOI ] [ Google Scholar ]
  • Keen B, Jacobs D. Racial threat, partisan politics, and racial disparities in prison admissions: A panel analysis. Criminology: An Interdisciplinary Journal. 2009;47(1):209–238. doi: 10.1111/j.1745-9125.2009.00143.x. [ DOI ] [ Google Scholar ]
  • *. Keen D, Brannigan KL, Cuskelly M. Toilet training for children with autism: The effects of video modeling category. Journal of Developmental and Physical Disabilities. 2007;19(4):291–303. doi: 10.1007/s10882-007-9044-x. [ DOI ] [ Google Scholar ]
  • *. Keller CL, Brady MP, Taylor RL. Using self-evaluation to improve student teacher interns’ use of specific praise. Education and Training in Developmental Disabilities. 2005;40(4):368–376. [ Google Scholar ]
  • Kellett S. The treatment of compulsive hoarding with object-affect fusion informed CBT: Initial experimental case evidence. Behavioural and Cognitive Psychotherapy. 2006;34(4):481–485. doi: 10.1017/s1352465806003006. [ DOI ] [ Google Scholar ]
  • *. Kelley C, Loy DP. Comparing the effects of aquatic and land-based exercise on the physiological stress response of women with fibromyalgia. Therapeutic Recreation Journal. 2008;42(2):103–118. [ Google Scholar ]
  • Kerler WA, III, Killough LN. The effects of satisfaction with a client’s management during a prior audit engagement, trust, and moral reasoning on auditors’ perceived risk of management fraud. Journal of Business Ethics. 2009;85(2):109–136. doi: 10.1007/s10551-008-9752-x. [ DOI ] [ Google Scholar ]
  • Killu K, Weber KP, McLaughlin TF. An evaluation of repeated readings across various counting periods of see to think, think to say, and think to write channels with a university student with learning disabilities. Journal of Precision Teaching & Celeration. 2001;17(2):39–57. [ Google Scholar ]
  • *. Kim S, Oah S, Dickinson AM. The impact of public feedback on three recycling-related behaviors in South Korea. Environment and Behavior. 2005;37(2):258–274. doi: 10.1177/0013916504267639. [ DOI ] [ Google Scholar ]
  • Kiran S, Roberts PM. Semantic feature analysis treatment in Spanish-English and French-English bilingual aphasia. Aphasiology. 2010;24(2):231–261. doi: 10.1080/02687030902958365. [ DOI ] [ Google Scholar ]
  • *. Kirby KC, Kerwin MLE, Carpenedo CM, Rosenwasser BJ, Gardner RS. Interdependent group contingency management for cocaine-dependent methadone maintenance patients. Journal of Applied Behavior Analysis. 2008;41(4):579–595. doi: 10.1901/jaba.2008.41-579. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Kleeberger V, Mirenda P. Teaching generalized imitation skills to a preschooler with autism using video modeling. Journal of Positive Behavior Interventions. 2010;12(2):116–127. doi: 10.1177/1098300708329279. [ DOI ] [ Google Scholar ]
  • *. Koegel RL, Openden D, Koegel LK. A systematic desensitization paradigm to treat hypersensitivity to auditory stimuli in children with autism in family contexts. Research and Practice for Persons with Severe Disabilities. 2004;29(2):122–134. doi: 10.2511/rpsd.29.2.122. [ DOI ] [ Google Scholar ]
  • *. Koegel RL, Shirotova L, Koegel LK. Antecedent stimulus control: Using orienting cues to facilitate first-word acquisition for nonresponders with autism. The Behavior Analyst. 2009;32(2):281–284. doi: 10.1007/BF03392190. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Koegel RL, Shirotova L, Koegel LK. Brief report: Using individualized orienting cues to facilitate first-word acquisition in non-responders with autism. Journal of Autism and Developmental Disorders. 2009;39(11):1587–1592. doi: 10.1007/s10803-009-0765-9. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Kohler FW, Greteman C, Raschke D, Highnam C. Using a buddy skills package to increase the social interactions between a preschooler with autism and her peers. Topics in Early Childhood Special Education. 2007;27(3):155–163. doi: 10.1177/02711214070270030601. [ DOI ] [ Google Scholar ]
  • Koopmans R, Olzak S. Discursive opportunities and the evolution of right-wing violence in Germany. American Journal of Sociology. 2004;110(1):198–230. doi: 10.1086/386271. [ DOI ] [ Google Scholar ]
  • *. Koul R, Corwin M, Hayes S. Production of graphic symbol sentences by individuals with aphasia: Efficacy of a computer-based augmentative and alternative communication intervention. Brain and Language. 2005;92(1):58–77. doi: 10.1016/j.bandl.2004.05.008. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Kourea L, Cartledge G, Musti-Rao S. Improving the reading skills of urban elementary students through total class peer tutoring. Remedial and Special Education. 2007;28(2):95–107. doi: 10.1177/07419325070280020801. [ DOI ] [ Google Scholar ]
  • Kramarski B, Hirsch C. Using computer algebra systems in mathematical classrooms. Journal of Computer Assisted Learning. 2003;19(1):35–45. doi: 10.1046/j.0266-4909.2003.00004.x. [ DOI ] [ Google Scholar ]
  • *. Kramer TJ, Caldarella P, Christensen L, Shatzer RH. Social and emotional learning in the kindergarten classroom: Evaluation of the Strong Start curriculum. Early Childhood Education Journal. 2010;37(4):303–309. doi: 10.1007/s10643-009-0354-8. [ DOI ] [ Google Scholar ]
  • Kumar R, Pati NC, Mohanty S. Efficacy of an errorless training procedure on acquisition and maintenance of money naming by mentally retarded. Social Science International. 2004;20(1):48–64. [ Google Scholar ]
  • Kuosma K, Hjerrild J, Pedersen PU, Hundrup YA. Assessment of the nutritional status among residents in a Danish nursing home: Health effects of a formulated food and meal policy. Journal of Clinical Nursing. 2008;17(17):2288–2293. doi: 10.1111/j.1365-2702.2007.02203.x. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Kwak L, Kremers SPJ, van Baak MA, Brug J. A poster-based intervention to promote stair use in blue- and white-collar worksites. Preventive Medicine: An International Journal Devoted to Practice and Theory. 2007;45(2–3):177–181. doi: 10.1016/j.ypmed.2007.05.005. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Ladd MV, Luiselli JK, Baker L. Continuous access to competing stimulation as intervention for self-injurious skin picking in a child with autism. Child & Family Behavior Therapy. 2009;31(1):54–60. doi: 10.1080/07317100802701400. [ DOI ] [ Google Scholar ]
  • *. Ladouceur R, Léger É, Dugas M, Freeston MH. Cognitive-behavioral treatment of generalized anxiety disorder (GAD) for older adults. International Psychogeriatrics. 2004;16(2):195–207. doi: 10.1017/s1041610204000274. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Lammi BM, Law M. The effects of Family-Centred Functional Therapy on the occupational performance of children with cerebral palsy. Canadian Journal of Occupational Therapy/Revue Canadienne D’Ergothérapie. 2003;70(5):285–297. doi: 10.1177/000841740307000505. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Lancioni GE, Singh NN, O’Reilly MF, Campodonico F, Piazzolla G, Scalini L, Oliva D. Impact of favorite stimuli automatically delivered on step responses of persons with multiple disabilities during their use of walker devices. Research in Developmental Disabilities. 2005;26(1):71–76. doi: 10.1016/j.ridd.2004.04.003. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Lane KL, Little MA, Redding-Rhodes J, Phillips A, Welsh MT. Outcomes of a teacher-led reading intervention for elementary students at risk for behavioural disorders. Exceptional Children. 2007;74(1):47–70. [ Google Scholar ]
  • *. Lane KL, Rogers LA, Parks RJ, Weisenbach JL, Mau AC, Merwin MT, Bergman WA. Function-based interventions for students who are nonresponsive to primary and secondary prevention efforts: Illustrations at the elementary and middle school levels. Journal of Emotional and Behavioral Disorders. 2007;15(3):169–183. doi: 10.1177/10634266070150030401. [ DOI ] [ Google Scholar ]
  • Lang R, Shogren KA, Machalicek W, Rispoli M, O’Reilly M, Baker S, Regester A. Video self-modeling to teach classroom rules to two students with Asperger’s. Research in Autism Spectrum Disorders. 2009;3(2):483–488. doi: 10.1016/j.rasd.2008.10.001. [ DOI ] [ Google Scholar ]
  • *. Lannie AL, Martens BK. Effects of task difficulty and type of contingency on students’ allocation of responding to math worksheets. Journal of Applied Behavior Analysis. 2004;37(1):53–65. doi: 10.1901/jaba.2004.37-53. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Lannie AL, Martens BK. Targeting performance dimensions in sequence according to the instructional hierarchy: Effects on children’s math work within a self-monitoring program. Journal of Behavioral Education. 2008;17(4):356–375. doi: 10.1007/s10864-008-9073-2. [ DOI ] [ Google Scholar ]
  • Lanovaz MJ, Fletcher SE, Rapp JT. Identifying stimuli that alter immediate and subsequent levels of vocal stereotypy: A further analysis of functionally matched stimulation. Behavior Modification. 2009;33(5):682–704. doi: 10.1177/0145445509344972. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Laushey KM, Heflin LJ, Shippen M, Alberto PA, Fredrick L. Concept mastery routines to teach social skills to elementary children with high functioning autism. Journal of Autism and Developmental Disorders. 2009;39(10):1435–1448. doi: 10.1007/s10803-009-0757-9. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Law S-P, Yeung O, Chiu KMY. Treatment for anomia in Chinese using an ortho-phonological cueing method. Aphasiology. 2008;22(2):139–163. doi: 10.1080/02687030701191358. [ DOI ] [ Google Scholar ]
  • Lawrence JM, Watkins ML, Ershoff D, Petitti DB, Chiu V, Postlethwaite D, Erickson JD. Design and evaluation of interventions promoting periconceptional multivitamin use. American Journal of Preventive Medicine. 2003;25(1):17–24. doi: 10.1016/s0749-3797(03)00097-7. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Leaf JB, Taubman M, Bloomfield S, Palos-Rafuse L, Leaf R, McEachin J, Oppenheim ML. Increasing social skills and pro-social behavior for three children diagnosed with autism through the use of a teaching package. Research in Autism Spectrum Disorders. 2009;3(1):275–289. doi: 10.1016/j.rasd.2008.07.003. [ DOI ] [ Google Scholar ]
  • *. LeBlanc LA, Carr JE, Crossett SE, Bennett CM, Detweiler DD. Intensive outpatient behavioral treatment of primary urinary incontinence of children with autism. Focus on Autism and Other Developmental Disabilities. 2005;20(2):98–105. doi: 10.1177/10883576050200020601. [ DOI ] [ Google Scholar ]
  • LeBlanc LA, Geiger KB, Sautter RA, Sidener TM. Using the Natural Language Paradigm (NLP) to increase vocalizations of older adults with cognitive impairments. Research in Developmental Disabilities. 2007;28(4):437–444. doi: 10.1016/j.ridd.2006.06.004. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Leblanc M-P, Ricciardi JN, Luiselli JK. Improving discrete trial instruction by paraprofessional staff through an abbreviated performance feedback intervention. Education & Treatment of Children. 2005;28(1):76–82. [ Google Scholar ]
  • *. Lebrecque J, Marchand A, Dugas MJ, Letarte A. Efficacy of cognitive-behavioral therapy for comorbid panic disorder with agoraphobia and generalized anxiety disorder. Behavior Modification. 2007;31(5):616–637. doi: 10.1177/0145445507301132. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Lee R, Sturmey P. The effects of lag schedules and preferred materials on variable responding in students with autism. Journal of Autism and Developmental Disorders. 2006;36(3):421–428. doi: 10.1007/s10803-006-0080-7. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Leew SV, Stein NG, Gibbard WB. Weighted vests’ effect on social attention for toddlers with autism spectrum disorders. Canadian Journal of Occupational Therapy/Revue Canadienne D’Ergothérapie. 2010;77(2):113–124. doi: 10.2182/cjot.2010.77.2.7. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Lehmann CM, Heagy CD. Effects of professional experience and group interaction on information requested in analyzing IT cases. Journal of Education for Business. 2008;83(6):347–354. doi: 10.3200/joeb.83.6.347-354. [ DOI ] [ Google Scholar ]
  • Lehmann I, Crimando W. Unintended consequences of state and federal antidiscrimination and family medical leave legislation on the employment rates of persons with disabilities. Rehabilitation Counseling Bulletin. 2008;51(3):159–169. doi: 10.1177/0034355207312111. [ DOI ] [ Google Scholar ]
  • *. Levingston HB, Neef NA, Cihon TM. The effects of teaching precurrent behaviors on children’s solution of multiplication and division word problems. Journal of Applied Behavior Analysis. 2009;42(2):361–367. doi: 10.1901/jaba.2009.42-361. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Li Q, Wang X-C, Cheng L-G. Multiple-baseline design on pretend game for the rectification of children’s aggressive behavior. Chinese Mental Health Journal. 2008;22(3):175–178. [ Google Scholar ]
  • *. Li Y. Recovering from spousal bereavement in later life: Does volunteer participation play a role. The Journals of Gerontology: Series B: Psychological Sciences and Social Sciences. 2007;62B(4):S257–S266. doi: 10.1093/geronb/62.4.s257. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Liberatore JS, Luyben PD. The effects of feedback and positive reinforcement on the on-task behavior of dancers. Journal of Prevention & Intervention in the Community. 2009;37(3):200–208. doi: 10.1080/10852350902976122. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Liddle HA, Rowe CL, Gonzalez A, Henderson CE, Dakof GA, Greenbaum PE. Changing provider practices, program environment, and improving outcomes by transporting multidimensional family therapy to an adolescent drug treatment setting. The American Journal on Addictions. 2006;15(Suppl 1):102–112. doi: 10.1080/10550490601003698. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Lien-Thorne S, Kamps D. Replication study of the first step to success early intervention program. Behavioral Disorders. 2005;31(1):18–32. [ Google Scholar ]
  • *. Lienemann TO, Graham S, Leader-Janssen B, Reid R. Improving the writing performance of struggling writers in second grade. The Journal of Special Education. 2006;40(2):66–78. doi: 10.1177/00224669060400020301. [ DOI ] [ Google Scholar ]
  • *. Lindsay P, Maynard I, Thomas O. Effects of hypnosis on flow states and cycling performance. The Sport Psychologist. 2005;19(2):164–177. [ Google Scholar ]
  • Liso DR. The effects of choice making on toy engagement in nonambulatory and partially ambulatory preschool students. Topics in Early Childhood Special Education. 2010;30(2):91–101. doi: 10.1177/0271121409344354. [ DOI ] [ Google Scholar ]
  • *. Loewy S, Bailey J. The effects of graphic feedback, goal-setting, and manager praise on customer service behaviors. Journal of Organizational Behavior Management. 2007;27(3):15–26. doi: 10.1300/J075v27n03_02. [ DOI ] [ Google Scholar ]
  • *. Loftin RL, Odom SL, Lantz JF. Social interaction and repetitive motor behaviors. Journal of Autism and Developmental Disorders. 2008;38(6):1124–1135. doi: 10.1007/s10803-007-0499-5. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Lohrmann S, Talerico J. Anchor the Boat: A classwide intervention to reduce problem behavior. Journal of Positive Behavior Interventions. 2004;6(2):113–120. doi: 10.1177/10983007040060020601. [ DOI ] [ Google Scholar ]
  • *. Loncola JA, Craig-Unkefer L. Teaching social communication skills to young urban children with autism. Education and Training in Developmental Disabilities. 2005;40(3):243–263. [ Google Scholar ]
  • López GC, Guzmán LH, Sierra AV, Meza V. Entrenamiento en habilidades de afrontamiento y competencia prosocial de jóvenes con historia de calle [Training in coping and pro-social competence abilities in homeless youngsters. Revista Mexicana de Psicología. 2003;20(2):201–209. [ Google Scholar ]
  • Lubitsh G, Doyle C, Valentine J. The impact of theory of constraints (TOC) in an NHS trust. Journal of Management Development. 2005;24(2):116–131. doi: 10.1108/02621710510579482. [ DOI ] [ Google Scholar ]
  • Luciano-Soriano MC, Molina-Cobos FJ, Gómez-Becerra I. Say-do-report training to change chronic behaviors in mentally retarded subjects. Research in Developmental Disabilities. 2000;21(5):355–366. doi: 10.1016/s0891-4222(00)00048-2. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Luk R, Ferrence R, Gmel G. The economic impact of a smoke-free bylaw on restaurant and bar sales in Ottawa, Canada. Addiction. 2006;101(5):738–745. doi: 10.1111/j.1360-0443.2006.01434.x. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Lyerla F, LeRouge C, Cooke DA, Turpin D, Wilson L. A nursing clinical decision support system and potential predictors of head-of-bed position for patients receiving mechanical ventilation. American Journal of Critical Care. 2010;19(1):39–47. doi: 10.4037/ajcc2010836. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Maag JW, Anderson JM. Effects of sound-field amplification to increase compliance of students with emotional and behavior disorders. Behavioral Disorders. 2006;31(4):378–393. [ Google Scholar ]
  • *. MacArthur CA, Lembo L. Strategy instruction in writing for adult literacy learners. Reading and Writing. 2009;22(9):1021–1039. doi: 10.1007/s11145-008-9142-x. [ DOI ] [ Google Scholar ]
  • *. Machalicek W, Shogren K, Lang R, Rispoli M, O’Reilly MF, Franco JH, Sigafoos J. Increasing play and decreasing the challenging behavior of children with autism during recess with activity schedules and task correspondence training. Research in Autism Spectrum Disorders. 2009;3(2):547–555. doi: 10.1016/j.rasd.2008.11.003. [ DOI ] [ Google Scholar ]
  • *. Madaus MMR, Kehle TJ, Madaus J, Bray MA. Mystery motivator as an intervention to promote homework completion and accuracy. School Psychology International. 2003;24(4):369–377. doi: 10.1177/01430343030244001. [ DOI ] [ Google Scholar ]
  • *. Mancil GR, Conroy MA, Haydon TF. Effects of a modified milieu therapy intervention on the social communicative behaviors of young children with autism spectrum disorders. Journal of Autism and Developmental Disorders. 2009;39(1):149–163. doi: 10.1007/s10803-008-0613-3. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Manuel JC, Sunseri MA, Olson R, Scolari M. A diagnostic approach to increase reusable dinnerware selection in a cafeteria. Journal of Applied Behavior Analysis. 2007;40(2):301–310. doi: 10.1901/jaba.2007.143-05. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Marcus A, Sinnott B, Bradley S, Grey I. Treatment of idiopathic toe-walking in children with autism using GaitSpot Auditory Speakers and simplified habit reversal. Research in Autism Spectrum Disorders. 2010;4(2):260–267. doi: 10.1016/j.rasd.2009.09.012. [ DOI ] [ Google Scholar ]
  • Martens BK, Eckert TL, Begeny JC, Lewandowski LJ, DiGennaro FD, Montarello SA, Fiese BH. Effects of a fluency-building program on the reading performance of low-achieving second and third grade students. Journal of Behavioral Education. 2007;16(1):39–54. [ Google Scholar ]
  • *. Martens BK, Gertz LE, de Lacy Werder CS, Rymanowski JL. Agreement between descriptive and experimental analyses of behavior under naturalistic test conditions. Journal of Behavioral Education. 2010;19(3):205–221. doi: 10.1007/s10864-010-9110-9. [ DOI ] [ Google Scholar ]
  • Martin N, Fink R, Laine M. Treatment of word retrieval deficits with contextual priming. Aphasiology. 2004;18(5–7):457–471. doi: 10.1080/02687030444000129. [ DOI ] [ Google Scholar ]
  • *. Martins MP, Harris SL. Teaching children with autism to respond to joint attention initiations. Child & Family Behavior Therapy. 2006;28(1):51–68. doi: 10.1300/J019v28n01_04. [ DOI ] [ Google Scholar ]
  • *. Marvin KL, Rapp JT, Stenske MT, Rojas NR, Swanson GJ, Bartlett SM. Response repetition as an error-correction procedure for sight-word reading: A replication and extension. Behavioral Interventions. 2010;25(2):109–127. doi: 10.1002/bin.299. [ DOI ] [ Google Scholar ]
  • *. Massaro DW, Light J. Improving the vocabulary of children with hearing loss. The Volta Review. 2004;104(3):141–174. [ Google Scholar ]
  • Mastel-Smith B, Binder B, Malecha A, Hersch G, Symes L, McFarlane J. Testing therapeutic life review offered by home care workers to decrease depression among home-dwelling older women. Issues in Mental Health Nursing. 2006;27(10):1037–1049. doi: 10.1080/01612840600943689. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Matchett DL, Burns MK. Increasing word recognition fluency with an English-language learner. Journal of Evidence-Based Practices for Schools. 2009;10(2):194–206. [ Google Scholar ]
  • Mausbach BT, Coon DW, Patterson TL, Grant I. Engagement in activities is associated with affective arousal in Alzheimer’s caregivers: A preliminary examination of the temporal relations. Behavior Therapy. 2008;39(4):366–374. doi: 10.1016/j.beth.2007.10.002. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Mauszycki SC, Wambaugh JL. The effects of rate control treatment on consonant production accuracy in mild apraxia of speech. Aphasiology. 2008;22(7–8):906–920. doi: 10.1080/02687030701800818. [ DOI ] [ Google Scholar ]
  • *. Mayfield KH, Vollmer TR. Teaching math skills to at-risk students using home-based peer tutoring. Journal of Applied Behavior Analysis. 2007;40(2):223–237. doi: 10.1901/jaba.2007.108-05. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Mazzotti VL, Test DW, Wood CL, Richter S. Effects of computer-assisted instruction on students’ knowledge of postschool options. Career Development for Exceptional Individuals. 2010;33(1):25–40. doi: 10.1177/0885728809338714. [ DOI ] [ Google Scholar ]
  • *. McCarthy PJ, Jones MV, Harwood CG, Davenport L. Using goal setting to enhance positive affect among junior multievent athletes. Journal of Clinical Sport Psychology. 2010;4(1):53–68. [ Google Scholar ]
  • *. McCartney EJ, Anderson CM, English CL. Effect of brief clinic-based training on the ability of caregivers to implement escape extinction. Journal of Positive Behavior Interventions. 2005;7(1):18–32. doi: 10.1177/10983007050070010301. [ DOI ] [ Google Scholar ]
  • *. McClellan CB, Cohen LL, Moffett K. Time out based discipline strategy for children’s non-compliance with cystic fibrosis treatment. Disability and Rehabilitation: An International, Multidisciplinary Journal. 2009;31(4):327–336. doi: 10.1080/09638280802051713. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. McCurdy BL, Lannie AL, Barnabas E. Reducing disruptive behavior in an urban school cafeteria: An extension of the Good Behavior Game. Journal of School Psychology. 2009;47(1):39–54. doi: 10.1016/j.jsp.2008.09.003. [ DOI ] [ Google Scholar ]
  • *. McDonnell J, Johnson JW, Polychronis S, Riesen T, Jameson M, Kercher K. Comparison of one-to-one embedded instruction in general education classes with small group instruction in special education classes. Education and Training in Developmental Disabilities. 2006;41(2):125–138. [ Google Scholar ]
  • *. McDougall D. The range-bound changing criterion design. Behavioral Interventions. 2005;20(2):129–137. doi: 10.1002/bin.189. [ DOI ] [ Google Scholar ]
  • *. McEwen SE, Polatajko HJ, Huijbregts MPJ, Ryan JD. Inter-task transfer of meaningful, functional skills following a cognitive-based treatment: Results of three multiple baseline design experiments in adults with chronic stroke. Neuropsychological Rehabilitation. 2010;20(4):541–561. doi: 10.1080/09602011003638194. [ DOI ] [ PubMed ] [ Google Scholar ]
  • McGoey KE, Schneider DL, Rezzetano KM, Prodan T, Tankersley M. Classwide intervention to manage disruptive behavior in the kindergarten classroom. Journal of Applied School Psychology. 2010;26(3):247–261. doi: 10.1080/15377903.2010.495916. [ DOI ] [ Google Scholar ]
  • *. McKee SA, Harris GT, Rice ME, Silk L. Effects of a Snoezelen room on the behavior of three autistic clients. Research in Developmental Disabilities. 2007;28(3):304–316. doi: 10.1016/j.ridd.2006.04.001. [ DOI ] [ PubMed ] [ Google Scholar ]
  • McNeil MR, Katz WF, Fossett TRD, Garst DM, Szuminsky NJ, Carter G, Lim KY. Effects of online augmented kinematic and perceptual feedback on treatment of speech movements in apraxia of speech. Folia Phoniatrica et Logopaedica. 2010;62(3):127–133. doi: 10.1159/000287211. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Mechling LC, Gustafson M. Comparison of the effects of static picture and video prompting on completion of cooking related tasks by students with moderate intellectual disabilities. Exceptionality. 2009;17(2):103–116. doi: 10.1080/09362830902805889. [ DOI ] [ Google Scholar ]
  • Mesmer EM, Duhon GJ, Hogan K, Newry B, Hommema S, Fletcher C, Boso M. Generalization of sight word accuracy using a common stimulus procedure: A preliminary investigation. Journal of Behavioral Education. 2010;19(1):47–61. doi: 10.1007/s10864-010-9103-8. [ DOI ] [ Google Scholar ]
  • *. Miguel CF, Petursdottir AI, Carr JE. The effects of multiple-tact and receptive-discrimination training on the acquisition of intraverbal behavior. Analysis of Verbal Behavior. 2005;21:27–41. doi: 10.1007/BF03393008. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Miller JA, Austin J, Rohn D. Teaching pedestrian safety skills to children. Environment and Behavior. 2004;36(3):368–385. doi: 10.1177/0013916503260880. [ DOI ] [ Google Scholar ]
  • Milne D, Westerman C. Evidence-based clinical supervision: Rationale and illustration. Clinical Psychology & Psychotherapy. 2001;8(6):444–457. doi: 10.1002/cpp.297. [ DOI ] [ Google Scholar ]
  • *. Morgan L, Goldstein H. Teaching mothers of low socioeconomic status to use decontextualized language during storybook reading. Journal of Early Intervention. 2004;26(4):235–252. doi: 10.1177/105381510402600401. [ DOI ] [ Google Scholar ]
  • Morrison JQ, Jones KM. The effects of positive peer reporting as a class-wide positive behavior support. Journal of Behavioral Education. 2007;16(2):111–124. doi: 10.1007/s10864-006-9005-y. [ DOI ] [ Google Scholar ]
  • *. Morrison L, Kamps D, Garcia J, Parker D. Peer mediation and monitoring strategies to improve initiations and social skills for students with autism. Journal of Positive Behavior Interventions. 2001;3(4):237–250. doi: 10.1177/109830070100300405. [ DOI ] [ Google Scholar ]
  • *. Mozingo DB, Smith T, Riordan MR, Reiss ML, Bailey JS. Enhancing frequency recording by developmental disabilities treatment staff. Journal of Applied Behavior Analysis. 2006;39(2):253–256. doi: 10.1901/jaba.2006.55-05. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Mruzek DW, Cohen C, Smith T. Contingency contracting with students with autism spectrum disorders in a public school setting. Journal of Developmental and Physical Disabilities. 2007;19(2):103–114. doi: 10.1007/s10882-007-9036-x. [ DOI ] [ Google Scholar ]
  • *. Mullane J, Corkum P. Case series: Evaluation of a behavioral sleep intervention for three children with attention-deficit/hyperactivity disorder and dyssomnia. Journal of Attention Disorders. 2006;10(2):217–227. doi: 10.1177/1087054706288107. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Müller K, Bütefisch CM, Seitz RJ, Hömberg V. Mental practice improves hand function after hemiparetic stroke. Restorative Neurology and Neuroscience. 2007;25(5–6):501–511. [ PubMed ] [ Google Scholar ]
  • *. Munroe-Chandler KJ, Hall CR, Fishburne GJ, Shannon V. Using cognitive general imagery to improve soccer strategies. European Journal of Sport Science. 2005;5(1):41–49. [ Google Scholar ]
  • *. Murphy KA, Theodore LA, Aloiso D, Alric-Edwards JM, Hughes TL. Interdependent group contingency and mystery motivators to reduce preschool disruptive behavior. Psychology in the Schools. 2007;44(1):53–63. doi: 10.1002/pits.20205. [ DOI ] [ Google Scholar ]
  • *. Murray LL, Ballard K, Karcher L. Linguistic specific treatment: Just for Broca’s aphasia? Aphasiology. 2004;18(9):785–809. doi: 10.1080/02687030444000273. [ DOI ] [ Google Scholar ]
  • *. Nakamura BJ, Pestle SL, Chorpita BF. Differential sequencing of cognitive-behavioral techniques for reducing child and adolescent anxiety. Journal of Cognitive Psychotherapy. 2009;23(2):114–135. doi: 10.1891/0889-8391.23.2.114. [ DOI ] [ Google Scholar ]
  • Nakonezny PA, Reddick R, Rodgers JL. Did divorces decline after the Oklahoma City bombing? Journal of Marriage and Family. 2004;66(1):90–100. doi: 10.1111/j.1741-3737.2004.00007.x. [ DOI ] [ Google Scholar ]
  • *. Naoi N, Yokoyama K, Yamamoto J-i. Matrix training for expressive and receptive two-word utterances in children with autism. Japanese Journal of Special Education. 2006;43(6):505–518. [ Google Scholar ]
  • *. Naoi N, Yokoyama K, Yamamoto J-i. Intervention for tact as reporting in children with autism. Research in Autism Spectrum Disorders. 2007;1(2):174–184. doi: 10.1016/j.rasd.2006.08.005. [ DOI ] [ Google Scholar ]
  • Nasar JL. Prompting drivers to stop for crossing pedestrians. Transportation Research Part F: Traffic Psychology and Behaviour. 2003;6(3):175–182. doi: 10.1016/s1369-8478(03)00024-x. [ DOI ] [ Google Scholar ]
  • *. Neddenriep CE, Skinner CH, Wallace MA, McCallum E. ClassWide peer tutoring: Two experiments investigating the generalized relationship between increased oral reading fluency and reading comprehension. Journal of Applied School Psychology. 2009;25(3):244–269. doi: 10.1080/15377900802487185. [ DOI ] [ Google Scholar ]
  • *. Neef NA, Bicard DF, Endo S, Coury DL, Aman MG. Evaluation of pharmacological treatment of impulsivity In children with attention deficit hyperactivity disorder. Journal of Applied Behavior Analysis. 2005;38(2):135–146. doi: 10.1901/jaba.2005.116-02. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Neef NA, Marckel J, Ferreri S, Jung S, Nist L, Armstrong N. Effects of modeling versus instructions on sensitivity to reinforcement schedules. Journal of Applied Behavior Analysis. 2004;37(3):267–281. doi: 10.1901/jaba.2004.37-267. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Nelson JS, Alber SR, Gordy A. Effects of systematic error correction and repeated readings on the reading accuracy and proficiency of second graders with disabilities. Education & Treatment of Children. 2004;27(3):186–198. [ Google Scholar ]
  • *. Newman B, Ten Eyck P. Self-management of initiations by students diagnosed with autism. Analysis of Verbal Behavior. 2005;21:117–122. doi: 10.1007/BF03393013. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Nguyen DM, Yu CT, Martin TL, Fregeau P, Pogorzelec C, Martin GL. Teaching object-picture matching to improve concordance between object and picture preferences for individuals with developmental disabilities: Pilot study. Journal on Developmental Disabilities. 2009;15(1):53–64. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Noda W, Tanaka-Matsumi J. Effect of a classroom-based behavioral intervention package on the improvement of children’s sitting posture in Japan. Behavior Modification. 2009;33(2):263–273. doi: 10.1177/0145445508321324. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Normand MP. Increasing physical activity through self-monitoring, goal setting, and feedback. Behavioral Interventions. 2008;23(4):227–236. doi: 10.1002/bin.267. [ DOI ] [ Google Scholar ]
  • *. Normand MP, Knoll ML. The effects of a stimulus-stimulus pairing procedure on the unprompted vocalizations of a young child diagnosed with autism. Analysis of Verbal Behavior. 2006;22:81–85. doi: 10.1007/BF03393028. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. O’Callaghan PM, Allen KD, Powell S, Salama F. The efficacy of noncontingent escape for decreasing children’s disruptive behavior during restorative dental treatment. Journal of Applied Behavior Analysis. 2006;39(2):161–171. doi: 10.1901/jaba.2006.79-05. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. O’Callaghan PM, Reitman D, Northup J, Hupp SDA, Murphy MA. Promoting social skills generalization with ADHD-diagnosed children in a sports setting. Behavior Therapy. 2003;34(3):313–330. doi: 10.1016/s0005-7894(03)80003-5. [ DOI ] [ Google Scholar ]
  • *. O’Reilly MF, O’Halloran M, Sigafoos J, Lancioni GE, Green V, Edrisinha C, Olive M. Evaluation of video feedback and self-management to decrease schoolyard aggression and increase pro-social behaviour in two students with behavioural disorders. Educational Psychology. 2005;25(2–3):199–206. doi: 10.1080/0144341042000301157. [ DOI ] [ Google Scholar ]
  • Okinaka T, Shimazaki T. Effects of self-recording and self-goal-setting on accuracy of first service in soft tennis. Japanese Journal of Behavior Analysis. 2010;24(2):43–47. [ Google Scholar ]
  • Okuda K. Behavioral consultation services for school-refusal students with high-functioning pervasive developmental disorders: Token economy and changing reinforcement criteria. The Japanese Journal of Behavior Analysis. 2005;20(1):2–12. [ Google Scholar ]
  • Okuyama T, Isawa S. Right-left discrimination from one’s own and another person’s viewpoint in children with autism: Higher-order conditional discrimination and generalization of viewpoint. Japanese Journal of Behavior Analysis. 2010;24(2):2–16. [ Google Scholar ]
  • *. Ota K. Self-recording and accuracy of writing responses by students with developmental disabilities. Japanese Journal of Behavior Analysis. 2010;24(2):17–29. [ Google Scholar ]
  • *. Ottone S, Ponzano F. Competition and cooperation in markets: The experimental case of a winner-take-all setting. The Journal of Socio-Economics. 2010;39(2):163–170. doi: 10.1016/j.socec.2009.10.001. [ DOI ] [ Google Scholar ]
  • *. Palmen A, Didden R, Korzilius H. Effectiveness of behavioral skills training on staff performance in a job training setting for high-functioning adolescents with autism spectrum disorders. Research in Autism Spectrum Disorders. 2010;4(4):731–740. doi: 10.1016/j.rasd.2010.01.012. [ DOI ] [ Google Scholar ]
  • Pampino RN, Jr, MacDonald JE, Mullin JE, Wilder DA. Weekly feedback vs. daily feedback: An application in retail. Journal of Organizational Behavior Management. 2003;23(2–3):21–43. doi: 10.1300/J075v23n02_03. [ DOI ] [ Google Scholar ]
  • *. Pan-Skadden J, Wilder DA, Sparling J, Severtson E, Donaldson J, Postma N, Neidert P. The use of behavioral skills training and in-situ training to teach children to solicit help when lost: A preliminary investigation. Education & Treatment of Children. 2009;32(3):359–370. doi: 10.1353/etc.0.0063. [ DOI ] [ Google Scholar ]
  • *. Pappas DN, Skinner CH, Skinner AL. Supplementing accelerated reading with classwide interdependent group-oriented contingencies. Psychology in the Schools. 2010;47(9):887–902. doi: 10.1002/pits.20512. [ DOI ] [ Google Scholar ]
  • *. Park S, Singer GHS, Gibson M. The functional effect of teacher positive and neutral affect on task performance of students with significant disabilities. Journal of Positive Behavior Interventions. 2005;7(4):237–246. doi: 10.1177/10983007050070040501. [ DOI ] [ Google Scholar ]
  • *. Pasiali V. The use of prescriptive therapeutic songs in a home-based environment to promote social skills acquisition by children with autism: Three case studies. Music Therapy Perspectives. 2004;22(1):11–20. [ Google Scholar ]
  • *. Patterson DL, van der Mars H. Distant interactions and their effects on children’s physical activity levels. Physical Education and Sport Pedagogy. 2008;13(3):277–294. doi: 10.1080/17408980701345808. [ DOI ] [ Google Scholar ]
  • *. Peach RK, Reuter KA. A discourse-based approach to semantic feature analysis for the treatment of aphasic word retrieval failures. Aphasiology. 2010;24(9):971–990. doi: 10.1080/02687030903058629. [ DOI ] [ Google Scholar ]
  • *. Peck HL, Bray MA, Kehle TJ. Relaxation and guided imagery: A school-based intervention for children with asthma. Psychology in the Schools. 2003;40(6):657–675. doi: 10.1002/pits.10127. [ DOI ] [ Google Scholar ]
  • *. Peck HL, Kehle TJ, Bray MA, Theodore LA. Yoga as an intervention for children with attention problems. School Psychology Review. 2005;34(3):415–424. [ Google Scholar ]
  • *. Pennington L, Miller N, Robson S, Steen N. Intensive speech and language therapy for older children with cerebral palsy: A systems approach. Developmental Medicine & Child Neurology. 2010;52(4):337–344. doi: 10.1111/j.1469-8749.2009.03366.x. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Peterson P, Carta JJ, Greenwood C. Teaching enhanced milieu language teaching skills to parents in multiple risk families. Journal of Early Intervention. 2005;27(2):94–109. doi: 10.1177/105381510502700205. [ DOI ] [ Google Scholar ]
  • *. Peterson SMP, Caniglia C, Royster AJ, Macfarlane E, Plowman K, Baird SJ, Wu N. Blending functional communication training and choice making to improve task engagement and decrease problem behaviour. Educational Psychology. 2005;25(2–3):257–274. doi: 10.1080/0144341042000301193. [ DOI ] [ Google Scholar ]
  • *. Petscher ES, Bailey JS. Effects of training, prompting, and self-monitoring on staff behavior in a classroom for students with disabilities. Journal of Applied Behavior Analysis. 2006;39(2):215–226. doi: 10.1901/jaba.2006.02-05. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Pétursdóttir A-L, McMaster K, McComas JJ, Bradfield T, Braganza V, Koch-McDonald J, Scharf H. Brief experimental analysis of early reading interventions. Journal of School Psychology. 2009;47(4):215–243. doi: 10.1016/j.jsp.2009.02.003. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Pétursdóttir A-L, Sigurdardóttir ZG. Increasing the skills of children with developmental disabilities through staff training in behavioral teaching techniques. Education and Training in Developmental Disabilities. 2006;41(3):264–279. [ Google Scholar ]
  • *. Pétursdóttir A-L, Carr JE, Lechago SA, Almason SM. An evaluation of intraverbal training and listener training for teaching categorization skills. Journal of Applied Behavior Analysis. 2008;41(1):53–68. doi: 10.1901/jaba.2008.41-53. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Phaneuf L, McIntyre LL. Effects of individualized video feedback combined with group parent training on inappropriate maternal behavior. Journal of Applied Behavior Analysis. 2007;40(4):737–741. doi: 10.1901/jaba.2007.737-741. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Phillips KJ, Mudford OC. Functional analysis skills training for residential caregivers. Behavioral Interventions. 2008;23(1):1–12. doi: 10.1002/bin.252. [ DOI ] [ Google Scholar ]
  • Pingzhi Y. A reversal design experiment on the game modification of preschool children’s social withdrawal. Psychological Science (China) 2004;27(1):231–233. [ Google Scholar ]
  • Pitman MJ. Functional analysis and treatment of socially stigmatizing ambulation. The Behavior Analyst Today. 2007;8(3):284–297. [ Google Scholar ]
  • *. Ploszay AJ, Gentner NB, Skinner CH, Wrisberg CA. The effects of multisensory imaging in conjunction with physical movement rehearsal on golf putting performance. Journal of Behavioral Education. 2006;15(4):249–257. doi: 10.1007/s10864-006-9034-6. [ DOI ] [ Google Scholar ]
  • Plow MA, Mathiowetz V, Lowe DA. Comparing individualized rehabilitation to a group wellness intervention for persons with multiple sclerosis. American Journal of Health Promotion. 2009;24(1):23–26. doi: 10.4278/ajhp.071211128. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Plumer PJ, Stoner G. The relative effects of classwide peer tutoring and peer coaching on the positive social behaviors of children with ADHD. Journal of Attention Disorders. 2005;9(1):290–300. doi: 10.1177/1087054705280796. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Porritt M, Burt A, Poling A. Increasing fiction writers’ productivity through an Internet-based intervention. Journal of Applied Behavior Analysis. 2006;39(3):393–397. doi: 10.1901/jaba.2006.134-05. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Pullen PC, Lane HB, Lloyd JW, Nowak R, Ryals J. Effects of explicit instruction on decoding of struggling first grade students: A data-based case study. Education & Treatment of Children. 2005;28(1):63–76. [ Google Scholar ]
  • *. Putnam RF, Handler MW, Ramirez-Platt CM, Luiselli JK. Improving student bus-riding behavior through a whole-school intervention. Journal of Applied Behavior Analysis. 2003;36(4):583–590. doi: 10.1901/jaba.2003.36-583. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Pyles MK, Hahn EJ. Smoke-free legislation and charitable gaming in Kentucky. Tobacco Control: An International Journal. 2009;18(1):60–62. doi: 10.1136/tc.2008.027532. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Quesnel C, Savard J, Simard S, Ivers H, Morin CM. Efficacy of cognitive-behavioral therapy for insomnia in women treated for nonmetastatic breast cancer. Journal of Consulting and Clinical Psychology. 2003;71(1):189–200. doi: 10.1037/0022-006x.71.1.189. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Quilty KM. Teaching paraprofessionals how to write and implement social stories for students with autism spectrum disorders. Remedial and Special Education. 2007;28(3):182–189. doi: 10.1177/07419325070280030701. [ DOI ] [ Google Scholar ]
  • *. Raghavendra P, Oaten R. Effects of speech and print feedback on spelling performance of a child with cerebral palsy using a speech generating device. Disability and Rehabilitation: Assistive Technology. 2007;2(5):299–308. doi: 10.1080/17483100701256388. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Raiff BR, Faix C, Turturici M, Dallery J. Breath carbon monoxide output is affected by speed of emptying the lungs: Implications for laboratory and smoking cessation research. Nicotine & Tobacco Research. 2010;12(8):834–838. doi: 10.1093/ntr/ntq090. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Rantz WG, Dickinson AM, Sinclair GA, Van Houten R. The effect of feedback on the accuracy of checklist completion during instrument flight training. Journal of Applied Behavior Analysis. 2009;42(3):497–509. doi: 10.1901/jaba.2009.42-497. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Rassau A, Arco L. Effects of chat-based on-line cognitive behavior therapy on study related behavior and anxiety. Behavioural and Cognitive Psychotherapy. 2003;31(3):377–381. doi: 10.1017/s1352465803003126. [ DOI ] [ Google Scholar ]
  • *. Rathel JM, Drasgow E, Christle CC. Effects of supervisor performance feedback on increasing preservice teachers’ positive communication behaviors with students with emotional and behavioral disorders. Journal of Emotional and Behavioral Disorders. 2008;16(2):67–77. doi: 10.1177/1063426607312537. [ DOI ] [ Google Scholar ]
  • *. Raviv T, Wadsworth ME. The efficacy of a pilot prevention program for children and caregivers coping with economic strain. Cognitive Therapy and Research. 2010;34(3):216–228. doi: 10.1007/s10608-009-9265-7. [ DOI ] [ Google Scholar ]
  • *. Raymer AM, Ciampitti M, Holliway B, Singletary F, Blonder LX, Ketterson T, Rothi LJG. Semantic-phonologic treatment for noun and verb retrieval impairments in aphasia. Neuropsychological Rehabilitation. 2007;17(2):244–270. doi: 10.1080/09602010600814661. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Rebsamen M, Boucheix J-M, Fayol M. Quality control in the optical industry: From a work analysis of lens inspection to a training programme, an experimental case study. Applied Ergonomics. 2010;41(1):150–160. doi: 10.1016/j.apergo.2009.07.004. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Regan KS, Mastropieri MA, Scruggs TE. Promoting expressive writing among students with emotional and behavioral disturbance via dialogue journals. Behavioral Disorders. 2005;31(1):33–50. [ Google Scholar ]
  • Reichle J, McComas J. Conditional use of a request for assistance. Disability and Rehabilitation: An International, Multidisciplinary Journal. 2004;26(21–22):1255–1262. doi: 10.1080/09638280412331280262. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Reichow B, Barton EE, Sewell JN, Good L, Wolery M. Effects of weighted vests on the engagement of children with developmental delays and autism. Focus on Autism and Other Developmental Disabilities. 2010;25(1):3–11. doi: 10.1177/1088357609353751. [ DOI ] [ Google Scholar ]
  • *. Reinhardt D, Theodore LA, Bray MA, Kehle TJ. Improving homework accuracy: Interdependent group contingencies and randomized components. Psychology in the Schools. 2009;46(5):471–488. doi: 10.1002/pits.20391. [ DOI ] [ Google Scholar ]
  • *. Reinke WM, Lewis-Palmer T, Martin E. The effect of visual performance feedback on teacher use of behavior-specific praise. Behavior Modification. 2007;31(3):247–263. doi: 10.1177/0145445506288967. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Renvall K, Laine M, Martin N. Treatment of anomia with contextual priming: Exploration of a modified procedure with additional semantic and phonological tasks. Aphasiology. 2007;21(5):499–527. doi: 10.1080/02687030701254248. [ DOI ] [ Google Scholar ]
  • *. Reynolds B, Dallery J, Shroff P, Patak M, Leraas K. A web-based contingency management program with adolescent smokers. Journal of Applied Behavior Analysis. 2008;41(4):597–601. doi: 10.1901/jaba.2008.41-597. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Ribes E, Vargas I, Luna D, Martínez C. Adquisición y transferencia de una discriminación condicional en una secuencia de cinco criterios distintos de ajuste funcional [Acquisition and transfer of a conditional discrimination in a sequence of five different functional adjustment criteria]. Acta Comportamentalia. 2009;17(3):299–331. [ Google Scholar ]
  • *. Riesen T, McDonnell J, Johnson JW, Polychronis S, Jameson M. A comparison of constant time delay and simultaneous prompting within embedded instruction in general education classes with students with moderate to severe disabilities. Journal of Behavioral Education. 2003;12(4):241–259. doi: 10.1023/a:1026076406656. [ DOI ] [ Google Scholar ]
  • *. Rimondini M, Del Piccolo L, Goss C, Mazzi M, Paccaloni M, Zimmermann C. The evaluation of training in patient-centred interviewing skills for psychiatric residents. Psychological Medicine: A Journal of Research in Psychiatry and the Allied Sciences. 2010;40(3):467–476. doi: 10.1017/s0033291709990730. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Rizvi SL, Linehan MM. The treatment of maladaptive shame in borderline personality disorder: A pilot study of “opposite action”. Cognitive and Behavioral Practice. 2005;12(4):437–447. doi: 10.1016/s1077-7229(05)80071-9. [ DOI ] [ Google Scholar ]
  • *. Rodriguez M, Wilder DA, Therrien K, Wine B, Miranti R, Daratany K, Rodriguez M. Use of the performance diagnostic checklist to select an intervention designed to increase the offering of promotional stamps at two sites of a restaurant franchise. Journal of Organizational Behavior Management. 2005;25(3):17–35. doi: 10.1300/J075v25n03_02. [ DOI ] [ Google Scholar ]
  • Romanowich P, Bourret J, Vollmer TR. Further analysis of the matching law to describe two- and three-point shot allocation by professional basketball players. Journal of Applied Behavior Analysis. 2007;40(2):311–315. doi: 10.1901/jaba.2007.119-05. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Rosales R, Stone K, Rehfeldt RA. The effects of behavioral skills training on implementation of the Picture Exchange Communication System. Journal of Applied Behavior Analysis. 2009;42(3):541–549. doi: 10.1901/jaba.2009.42-541. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Rosales R, Worsdell A, Trahan M. Comparison of methods for varying item presentation during noncontingent reinforcement. Research in Autism Spectrum Disorders. 2010;4(3):367–376. doi: 10.1016/j.rasd.2009.10.004. [ DOI ] [ Google Scholar ]
  • Rose HMS, Ludwig TD. Swimming pool hygiene: Self-monitoring, task clarification, and performance feedback increase lifeguard cleaning behaviors. Journal of Organizational Behavior Management. 2009;29(1):69–79. doi: 10.1080/01608060802660157. [ DOI ] [ Google Scholar ]
  • *. Ross SW, Horner RH. Bully prevention in positive behavior support. Journal of Applied Behavior Analysis. 2009;42(4):747–759. doi: 10.1901/jaba.2009.42-747. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Ruiz-Olivares R, Pino MJ, Herruzo J. Reduction of disruptive behaviors using an intervention based on the Good Behavior Game and the say-do-report correspondence. Psychology in the Schools. 2010;47(10):1046–1058. doi: 10.1002/pits.20523. [ DOI ] [ Google Scholar ]
  • *. Ryan S. The effects of a sound-field amplification system on managerial time in middle school physical education settings. Language, Speech, and Hearing Services in Schools. 2009;40(2):131–137. doi: 10.1044/0161-1461(2008/08-0038). [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Saecker LB, Skinner CH, Sager-Brown K, Roberts AS. Using responsiveness data and learning theories to modify interventions: Cover, copy, and compare to enhance number-writing accuracy. Journal of Evidence-Based Practices for Schools. 2009;10(2):171–187. [ Google Scholar ]
  • Sakamoto M, Muto T, Mochizuki A. Enhancing the self-determination of students with autism: Evaluation of a training package for teachers. The Japanese Journal of Behavior Analysis. 2003;18(1):25–37. [ Google Scholar ]
  • Salem S, Fazzio D, Arnal L, Fregeau P, Thomson K, Martin GL, Yu CT. A self-instructional package for teaching university students to conduct discrete-trials teaching with children with autism. Journal on Developmental Disabilities. 2009;15(1):21–29. [ Google Scholar ]
  • Sante AD, McLaughlin TF, Weber KP. The use and evaluation of a direct instruction flash card strategy on multiplication math facts mastery with two students with developmental disabilities and attention deficit hyperactivity disorder. Journal of Precision Teaching & Celeration. 2001;17(2):68–75. [ Google Scholar ]
  • *. Sarokoff RA, Sturmey P. The effects of instructions, rehearsal, modeling, and feedback on acquisition and generalization of staff use of discrete trial teaching and student correct responses. Research in Autism Spectrum Disorders. 2008;2(1):125–136. doi: 10.1016/j.rasd.2007.04.002. [ DOI ] [ Google Scholar ]
  • Scattone D, Tingstrom DH, Wilczynski SM. Increasing appropriate social interactions of children with autism spectrum disorders using social stories. Focus on Autism and Other Developmental Disabilities. 2006;21(4):211–222. doi: 10.1177/10883576060210040201. [ DOI ] [ Google Scholar ]
  • *. Schadler JJ, Jr, Wilder DA, Blakely E. Signaling stimulus presentation during treatment with noncontingent reinforcement: Visual versus vocal signals. Behavioral Interventions. 2009;24(2):107–116. doi: 10.1002/bin.279. [ DOI ] [ Google Scholar ]
  • *. Scheeler MC, Lee DL. Using technology to deliver immediate corrective feedback to preservice teachers. Journal of Behavioral Education. 2002;11(4):231–241. doi: 10.1023/a:1021158805714. [ DOI ] [ Google Scholar ]
  • *. Scherrer MD, Wilder DA. Training to increase safe tray carrying among cocktail servers. Journal of Applied Behavior Analysis. 2008;41(1):131–135. doi: 10.1901/jaba.2008.41-131. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Schilling DL, Washington K, Billingsley FF, Deitz J. Classroom seating for children with attention deficit hyperactivity disorder: Therapy balls versus chairs. American Journal of Occupational Therapy. 2003;57(5):534–541. doi: 10.5014/ajot.57.5.534. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Schindler HR, Horner RH. Generalized reduction of problem behavior of young children with autism: Building trans-situational interventions. American Journal on Mental Retardation. 2005;110(1):36–47. doi: 10.1352/0895-8017(2005)110<36:gropbo>2.0.co;2. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Schisler R, Joseph LM, Konrad M, Alber-Morgan S. Comparison of the effectiveness and efficiency of oral and written retelling and passage review as strategies for comprehending text. Psychology in the Schools. 2010;47(2):135–152. [ Google Scholar ]
  • *. Schneider N, Goldstein H. Social Stories™ improve the on-task behavior for children with language impairment. Journal of Early Intervention. 2009;31(3):250–264. [ Google Scholar ]
  • *. Schneider N, Goldstein H. Using social stories and visual schedules to improve socially appropriate behaviors in children with autism. Journal of Positive Behavior Interventions. 2010;12(3):149–160. [ Google Scholar ]
  • *. Schneider SL, Frens RA. Training four-syllable CV patterns in individuals with acquired apraxia of speech: Theoretical implications. Aphasiology. 2005;19(3–5):451–471. doi: 10.1080/02687030444000886. [ DOI ] [ Google Scholar ]
  • Schwebel DC, Summerlin AL, Bounds ML, Morrongiello BA. The Stamp-in-Safety program: A behavioral intervention to reduce behaviors that can lead to unintentional playground injury in a preschool setting. Journal of Pediatric Psychology. 2006;31(2):152–162. doi: 10.1093/jpepsy/jsj001. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Scott VB, Jr, Robare RD, Raines DB, Konwinski SJM, Chanin JA, Tolley RS. Emotive writing moderates the relationship between mood awareness and athletic performance in collegiate tennis players. North American Journal of Psychology. 2003;5(2):311–324. [ Google Scholar ]
  • Shaw R, Simms T. Reducing attention-maintained behavior through the use of positive punishment, differential reinforcement of low rates, and response marking. Behavioral Interventions. 2009;24(4):249–263. doi: 10.1002/bin.287. [ DOI ] [ Google Scholar ]
  • *. Shelton D, LeGros K, Norton L, Stanton-Cook S, Morgan J, Masterman P. Randomised controlled trial: A parent-based group education programme for overweight children. Journal of Paediatrics and Child Health. 2007;43(12):799–805. doi: 10.1111/j.1440-1754.2007.01150.x. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Shepard DS, Bail RN, Merritt CG. Cost-effectiveness of USAID’s regional program for family planning in West Africa. Studies in Family Planning. 2003;34(2):117–126. doi: 10.1111/j.1728-4465.2003.00117.x. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Sherriff KL, Wallis M, Chaboyer W. Nurses’ attitudes to and perceptions of knowledge and skills regarding evidence-based practice. International Journal of Nursing Practice. 2007;13(6):363–369. doi: 10.1111/j.1440-172X.2007.00651.x. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Shimizu H, Yoon S, McDonough CS. Teaching skills to use a computer mouse in preschoolers with developmental disabilities: Shaping moving a mouse and eye–hand coordination. Research in Developmental Disabilities. 2010;31(6):1448–1461. doi: 10.1016/j.ridd.2010.06.013. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Shin JH. Application of repeated-measures analysis of variance and hierarchical linear model in nursing research. Nursing Research. 2009;58(3):211–217. doi: 10.1097/NNR.0b013e318199b5ae. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Shipherd JC, Beck JG, Hamblen JL, Lackner JM, Freeman JB. A preliminary examination of treatment for posttraumatic stress disorder in chronic pain patients: A case study. Journal of Traumatic Stress. 2003;16(5):451–457. doi: 10.1023/a:1025754310462. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Sigafoos J, Drasgow E, Halle JW, O’Reilly M, Seely-York S, Edrisinha C, Andrews A. Teaching VOCA use as a communicative repair strategy. Journal of Autism and Developmental Disorders. 2004;34(4):411–422. doi: 10.1023/B:JADD.0000037417.04356.9c. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Sigmon SC, Higgins ST. Voucher-based contingent reinforcement of marijuana abstinence among individuals with serious mental illness. Journal of Substance Abuse Treatment. 2006;30(4):291–295. doi: 10.1016/j.jsat.2006.02.001. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Sigurdsson SO, Austin J. Using real-time visual feedback to improve posture at computer workstations. Journal of Applied Behavior Analysis. 2008;41(3):365–375. doi: 10.1901/jaba.2008.41-365. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Sigurdsson V, Foxall G, Saevarsson H. In-store experimental approach to pricing and consumer behavior. Journal of Organizational Behavior Management. 2010;30(3):234–246. doi: 10.1080/01608061.2010.499029. [ DOI ] [ Google Scholar ]
  • *. Sigurðardóttir ZG, Sighvatsson MB. Operant conditioning and errorless learning procedures in the treatment of chronic aphasia. International Journal of Psychology. 2006;41(6):527–540. doi: 10.1080/00207590500492625. [ DOI ] [ Google Scholar ]
  • *. Silverman AH, Haines AA, Davies WH, Parton E. A cognitive behavioral adherence intervention for adolescents with type 1 diabetes. Journal of Clinical Psychology in Medical Settings. 2003;10(2):119–127. doi: 10.1023/a:1023346222153. [ DOI ] [ Google Scholar ]
  • *. Simmons-Mackie NN, Kearns KP, Potechin G. Treatment of aphasia through family member training. Aphasiology. 2005;19(6):583–593. doi: 10.1080/02687030444000408. [ DOI ] [ Google Scholar ]
  • *. Simpson K, Keen D. Teaching young children with autism graphic symbols embedded within an interactive song. Journal of Developmental and Physical Disabilities. 2010;22(2):165–177. doi: 10.1007/s10882-009-9173-5. [ DOI ] [ Google Scholar ]
  • *. Simpson S, Bell L, Britton P, Mitchell D, Morrow E, Johnston AL, Brebner J. Does video therapy work? A single case series of bulimic disorders. European Eating Disorders Review. 2006;14(4):226–241. doi: 10.1002/erv.686. [ DOI ] [ Google Scholar ]
  • *. Singer-Dudek J, Greer RD. A long-term analysis of the relationship between fluency and the training and maintenance of complex math skills. The Psychological Record. 2005;55(3):361–376. [ Google Scholar ]
  • *. Singh NN, Lancioni GE, Joy SDS, Winton ASW, Sabaawi M, Wahler RG, Singh J. Adolescents with conduct disorder can be mindful of their aggressive behavior. Journal of Emotional and Behavioral Disorders. 2007;15(1):56–63. doi: 10.1177/10634266070150010601. [ DOI ] [ Google Scholar ]
  • *. Singh NN, Lancioni GE, Winton ASW, Adkins AD, Singh J, Singh AN. Mindfulness training assists individuals with moderate mental retardation to maintain their community placements. Behavior Modification. 2007;31(6):800–814. doi: 10.1177/0145445507300925. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Singh NN, Lancioni GE, Winton ASW, Curtis WJ, Wahler RG, Sabaawi M, McAleavey K. Mindful staff increase learning and reduce aggression in adults with developmental disabilities. Research in Developmental Disabilities. 2006;27(5):545–558. doi: 10.1016/j.ridd.2005.07.002. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Singh NN, Lancioni GE, Winton ASW, Singh AN, Adkins AD, Singh J. Mindful staff can reduce the use of physical restraints when providing care to individuals with intellectual disabilities. Journal of Applied Research in Intellectual Disabilities. 2009;22(2):194–202. doi: 10.1111/j.1468-3148.2008.00488.x. [ DOI ] [ Google Scholar ]
  • *. Sliminng EC, Montes PB, Bustos CF, Hoyuelos XP, Vio CG. Efectos de un programa combinado de técnicas de modificación conductual para la disminución de la conducta disruptiva y el aumento de la conducta prosocial en escolares Chilenos [Effects of a combined program of behavior modification techniques for decreasing disruptive behavior and increasing prosocial behavior in Chilean school children]. Acta Colombiana de Psicología. 2009;12(1):67–76. [ Google Scholar ]
  • *. Slowiak JM, Madden GJ, Mathews R. The effects of a combined task clarification, goal setting, feedback, and performance contingent consequence intervention package on telephone customer service in a medical clinic environment. Journal of Organizational Behavior Management. 2005;25(4):15–35. doi: 10.1300/J075v25n04_02. [ DOI ] [ Google Scholar ]
  • *. Smidt A, Balandin S, Reed V, Sigafoos J. A communication training programme for residential staff working with adults with challenging behaviour: Pilot data on intervention effects. Journal of Applied Research in Intellectual Disabilities. 2007;20(1):16–29. doi: 10.1111/j.1468-3148.2006.00336.x. [ DOI ] [ Google Scholar ]
  • *. Smith JD, Handler L, Nash MR. Therapeutic assessment for preadolescent boys with oppositional defiant disorder: A replicated single-case time-series design. Psychological Assessment. 2010;22(3):593–602. doi: 10.1037/a0019697. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Solberg KM, Hanley GP, Layer SA, Ingvarsson ET. The effects of reinforcer pairing and fading on preschoolers’ snack selections. Journal of Applied Behavior Analysis. 2007;40(4):633–644. doi: 10.1901/jaba.2007.633-644. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Sorenson SB, Wiebe DJ, Berk RA. Legalized abortion and the homicide of young children: An empirical investigation. Analyses of Social Issues and Public Policy (ASAP) 2002;2(1):239–256. doi: 10.1111/j.1530-2415.2002.00040.x. [ DOI ] [ Google Scholar ]
  • *. Sprague J, Perkins K. Direct and collateral effects of the First Step to Success program. Journal of Positive Behavior Interventions. 2009;11(4):208–221. doi: 10.1177/1098300708330935. [ DOI ] [ Google Scholar ]
  • *. Squires J, Wilder DA, Fixsen A, Hess E, Rost K, Curran R, Zonneveld K. The effects of task clarification, visual prompts, and graphic feedback on customer greeting and up-selling in a restaurant. Journal of Organizational Behavior Management. 2007;27(3):1–13. doi: 10.1300/J075v27n03_01. [ DOI ] [ Google Scholar ]
  • Stanton-Chapman TL, Denning CB, Jamison KR. Exploring the effects of a social communication intervention for improving requests and word diversity in preschoolers with disabilities. Psychology in the Schools. 2008;45(7):644–664. doi: 10.1002/pits.20315. [ DOI ] [ Google Scholar ]
  • *. Stanton-Chapman TL, Jamison KR, Denning CB. Building school communication skills in young children with disabilities: An intervention to promote peer social interactions in preschool settings. Early Childhood Services: An Interdisciplinary Journal of Effectiveness. 2008;2(4):225–251. [ Google Scholar ]
  • *. Stanton-Chapman TL, Kaiser AP, Vijay P, Chapman C. A multicomponent intervention to increase peer-directed communication in Head Start children. Journal of Early Intervention. 2008;30(3):188–212. doi: 10.1177/1053815108318746. [ DOI ] [ Google Scholar ]
  • *. Stapleton S, Adams M, Atterton L. A mobile phone as a memory aid for individuals with traumatic brain injury: A preliminary investigation. Brain Injury. 2007;21(4):401–411. doi: 10.1080/02699050701252030. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Staubitz JE, Cartledge G, Yurick AL, Lo Y-Y. Repeated reading for students with emotional or behavioral disorders: Peer- and trainer-mediated instruction. Behavioral Disorders. 2005;31(1):51–64. [ Google Scholar ]
  • *. Stichter JP, Hudson S, Sasso GM. The use of structural analysis to identify setting events in applied settings for students with emotional/behavioral disorders. Behavioral Disorders. 2005;30(4):403–420. [ Google Scholar ]
  • *. Stichter JP, Randolph JK, Kay D, Gage N. The use of structural analysis to develop antecedent-based interventions for students with autism. Journal of Autism and Developmental Disorders. 2009;39(6):883–896. doi: 10.1007/s10803-009-0693-8. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Stormont MA, Smith SC, Lewis TJ. Teacher implementation of precorrection and praise statements in Head Start classrooms as a component of a program-wide system of positive behavior support. Journal of Behavioral Education. 2007;16(3):280–290. doi: 10.1007/s10864-007-9040-3. [ DOI ] [ Google Scholar ]
  • Strand EA, Stoeckel R, Baas B. Treatment of severe childhood apraxia of speech: A treatment efficacy study. Journal of Medical Speech-Language Pathology. 2006;14(4):297–307. [ Google Scholar ]
  • *. Strong AC, Wehby JH, Falk KB, Lane KL. The impact of a structured reading curriculum and repeated reading on the performance of junior high students with emotional and behavioral disorders. School Psychology Review. 2004;33(4):561–581. [ Google Scholar ]
  • Stuart A, Frazier CL, Kalinowski J, Vos PW. The effect of frequency altered feedback on stuttering duration and type. Journal of Speech, Language, and Hearing Research. 2008;51(4):889–897. doi: 10.1044/1092-4388(2008/065). [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Taber-Doughty T, Shurr J, Brewer J, Kubik S. Standard care and telecare services: Comparing the effectiveness of two service systems with consumers with intellectual disabilities. Journal of Intellectual Disability Research. 2010;54(9):843–859. doi: 10.1111/j.1365-2788.2010.01314.x. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Tanol G, Johnson L, McComas J, Cote E. Responding to rule violations or rule following: A comparison of two versions of the Good Behavior Game with kindergarten students. Journal of School Psychology. 2010;48(5):337–355. doi: 10.1016/j.jsp.2010.06.001. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Tarbox RSF, Ghezzi PM, Wilson G. The effects of token reinforcement on attending in a young child with autism. Behavioral Interventions. 2006;21(3):155–164. doi: 10.1002/bin.213. [ DOI ] [ Google Scholar ]
  • *. Taylor BA, Hoch H. Teaching children with autism to respond to and initiate bids for joint attention. Journal of Applied Behavior Analysis. 2008;41(3):377–391. doi: 10.1901/jaba.2008.41-377. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Taylor BA, Hoch H, Potter B, Rodriguez A, Spinnato D, Kalaigian M. Manipulating establishing operations to promote initiations toward peers in children with autism. Research in Developmental Disabilities. 2005;26(4):385–392. doi: 10.1016/j.ridd.2004.11.003. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Taylor M, Gillies RM, Ashman AF. Cognitive training, conflict resolution and exercise: Effects on young adolescents’ well-being. Australian Journal of Guidance & Counselling. 2009;19(2):131–149. doi: 10.1375/ajgc.19.2.131. [ DOI ] [ Google Scholar ]
  • *. Taylor R, Iacono T. AAC and scripting activities to facilitate communication and play. Advances in Speech Language Pathology. 2003;5(2):79–93. doi: 10.1080/14417040510001669111. [ DOI ] [ Google Scholar ]
  • *. ter Kuile MM, Bulté I, Weijenborg PTM, Beekman A, Melles R, Onghena P. Therapist-aided exposure for women with lifelong vaginismus: A replicated single-case design. Journal of Consulting and Clinical Psychology. 2009;77(1):149–159. doi: 10.1037/a0014273. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Tereshko L, MacDonald R, Ahearn WH. Strategies for teaching children with autism to imitate response chains using video modeling. Research in Autism Spectrum Disorders. 2010;4(3):479–489. doi: 10.1016/j.rasd.2009.11.005. [ DOI ] [ Google Scholar ]
  • *. Theodore LA, DioGuardi RJ, Hughes TL, Aloiso D, Carlo M, Eccles D. A class-wide intervention for improving homework performance. Journal of Educational & Psychological Consultation. 2009;19(4):275–299. doi: 10.1080/10474410902888657. [ DOI ] [ Google Scholar ]
  • *. Thomas O, Maynard I, Hanton S. Intervening with athletes during the time leading up to competition: Theory to practice II. Journal of Applied Sport Psychology. 2007;19(4):398–418. doi: 10.1080/10413200701599140. [ DOI ] [ Google Scholar ]
  • *. Thompson CK, Kearns KP, Edmonds LA. An experimental analysis of acquisition, generalisation, and maintenance of naming behaviour in a patient with anomia. Aphasiology. 2006;20(12):1226–1244. doi: 10.1080/02687030600875655. [ DOI ] [ Google Scholar ]
  • *. Thorne S, Kamps D. The effects of a group contingency intervention on academic engagement and problem behavior of at-risk students. Behavior Analysis in Practice. 2008;1(2):12–18. doi: 10.1007/BF03391723. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Thothathiri M, Schwartz MF, Thompson-Schill SL. Selection for position: The role of left ventrolateral prefrontal cortex in sequencing language. Brain and Language. 2010;113(1):28–38. doi: 10.1016/j.bandl.2010.01.002. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Tiger JH, Hanley GP. An example of discovery research involving the transfer of stimulus control. Journal of Applied Behavior Analysis. 2005;38(4):499–509. doi: 10.1901/jaba.2005.139-04. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Tijs E, Matyas TA. Bilateral training does not facilitate performance of copying tasks in poststroke hemiplegia. Neurorehabilitation and Neural Repair. 2006;20(4):473–483. doi: 10.1177/1545968306287900. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Tincani M, Crozier S, Alazetta L. The Picture Exchange Communication System: Effects on manding and speech development for school-aged children with autism. Education and Training in Developmental Disabilities. 2006;41(2):177–184. [ Google Scholar ]
  • Tittelbach D, DeAngelis M, Sturmey P, Alvero AM. The effects of task clarification, feedback, and goal setting on student advisors’ office behaviors and customer service. Journal of Organizational Behavior Management. 2007;27(3):27–37. doi: 10.1300/J075v27n03_03. [ DOI ] [ Google Scholar ]
  • *. Todd T, Reid G, Butler-Kisber L. Cycling for students with ASD: Self-regulation promotes sustained physical activity. Adapted Physical Activity Quarterly. 2010;27(3):226–241. doi: 10.1123/apaq.27.3.226. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Tonon MA. A longitudinal study of the impact of village health education on environmental sanitation. International Quarterly of Community Health Education. 2007;28(2):109–126. doi: 10.2190/IQ.28.2.c. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Topbaş S, Ünal Ö. An alternating treatment comparison of minimal and maximal opposition sound selection in Turkish phonological disorders. Clinical Linguistics & Phonetics. 2010;24(8):646–668. doi: 10.3109/02699206.2010.486464. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Trembath D, Balandin S, Togher L, Stancliffe RJ. Peer-mediated teaching and augmentative and alternative communication for preschool-aged children with autism. Journal of Intellectual and Developmental Disability. 2009;34(2):173–186. doi: 10.1080/13668250902845210. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Trent JA, Kaiser AP, Wolery M. The use of responsive interaction strategies by siblings. Topics in Early Childhood Special Education. 2005;25(2):107–118. doi: 10.1177/02711214050250020101. [ DOI ] [ Google Scholar ]
  • Trent M, Judy SL, Ellen JM, Walker A. Use of an institutional intervention to improve quality of care for adolescents treated in pediatric ambulatory settings for pelvic inflammatory disease. Journal of Adolescent Health. 2006;39(1):50–56. doi: 10.1016/j.jadohealth.2005.08.008. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Trovato M, Slomine B, Pidcock F, Christensen J. Case study: The efficacy of donepezil hydrochloride on memory functioning in three adolescents with severe traumatic brain injury. Brain Injury. 2006;20(3):339–343. doi: 10.1080/02699050500487811. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Tsao L-L, Odom SL. Sibling-mediated social interaction intervention for young children with autism. Topics in Early Childhood Special Education. 2006;26(2):106–123. doi: 10.1177/02711214060260020101. [ DOI ] [ Google Scholar ]
  • *. Vallières A, Morin CM, Guay B. Sequential combinations of drug and cognitive behavioral therapy for chronic insomnia: An exploratory study. Behaviour Research and Therapy. 2005;43(12):1611–1630. doi: 10.1016/j.brat.2004.11.011. [ DOI ] [ PubMed ] [ Google Scholar ]
  • van der Sluis CK, Datema L, Saan I, Stant D, Dijkstra PU. Effects of a nurse practitioner on a multidisciplinary consultation team. Journal of Advanced Nursing. 2009;65(3):625–633. doi: 10.1111/j.1365-2648.2008.04916.x. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Van Houten R, Malenfant JEL. Effects of a driver enforcement program on yielding to pedestrians. Journal of Applied Behavior Analysis. 2004;37(3):351–363. doi: 10.1901/jaba.2004.37-351. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Van Houten R, Malenfant JEL, Austin J, Lebbon A. The effects of a seatbelt-gearshift delay prompt on the seatbelt use of motorists who do not regularly wear seatbelts. Journal of Applied Behavior Analysis. 2005;38(2):195–203. doi: 10.1901/jaba.2005.48-04. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Van Norman RK, Wood CL. Effects of prerecorded sight words on the accuracy of tutor feedback. Remedial and Special Education. 2008;29(2):96–107. doi: 10.1177/0741932507311634. [ DOI ] [ Google Scholar ]
  • *. Van Rie GL, Heflin LJ. The effect of sensory activities on correct responding for children with autism spectrum disorders. Research in Autism Spectrum Disorders. 2009;3(3):783–796. doi: 10.1016/j.rasd.2009.03.001. [ DOI ] [ Google Scholar ]
  • *. van Vonderen A. Effectiveness of immediate verbal feedback on trainer behaviour during communication training with individuals with intellectual disability. Journal of Intellectual Disability Research. 2004;48(3):245–251. doi: 10.1111/j.1365-2788.2003.00555.x. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. van Vonderen A, de Bresser A. The effect of supervisory feedback, self-recording, and graphic feedback on trainer behavior during one-to-one training. Behavioral Interventions. 2005;20(4):273–284. doi: 10.1002/bin.198. [ DOI ] [ Google Scholar ]
  • van Vonderen A, de Swart C, Didden R. Effectiveness of instruction and video feedback on staff’s use of prompts and children’s adaptive responses during one-to-one training in children with severe to profound intellectual disability. Research in Developmental Disabilities. 2010;31(3):829–838. doi: 10.1016/j.ridd.2010.02.008. [ DOI ] [ PubMed ] [ Google Scholar ]
  • van Vonderen A, Duker P, Didden R. Instruction and video feedback to improve staff’s trainer behaviour and response prompting during one-to-one training with young children with severe intellectual disability. Research in Developmental Disabilities. 2010;31(6):1481–1490. doi: 10.1016/j.ridd.2010.06.009. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. van Vonderen A, Duker P, Didden R. Professional development improves staff’s implementation of rehabilitation programmes for children with severe-to-profound intellectual disability. Developmental Neurorehabilitation. 2010;13(5):351–359. doi: 10.3109/17518423.2010.493916. [ DOI ] [ PubMed ] [ Google Scholar ]
  • VanDerHeyden AM, Witt JC, Gilbertson D. A multi-year evaluation of the effects of a response to intervention (RTI) model on identification of children for special education. Journal of School Psychology. 2007;45(2):225–256. doi: 10.1016/j.jsp.2006.11.004. [ DOI ] [ Google Scholar ]
  • *. VanWormer JJ. Pedometers and brief e-counseling: Increasing physical activity for overweight adults. Journal of Applied Behavior Analysis. 2004;37(3):421–425. doi: 10.1901/jaba.2004.37-421. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Vichi C, Andery MAPA, Glenn SS. A metacontingency experiment: The effects of contingent consequences on patterns of interlocking contingencies of reinforcement. Behavior and Social Issues. 2009;18(1):1–17. [ Google Scholar ]
  • *. Vidoni C, Ward P. Effects of fair play instruction on student social skills during a middle school sport education unit. Physical Education and Sport Pedagogy. 2009;14(3):285–310. doi: 10.1080/17408980802225818. [ DOI ] [ Google Scholar ]
  • *. Vintere P, Hemmes NS, Brown BL, Poulson CL. Gross-motor skill acquisition by preschool dance students under self-instruction procedures. Journal of Applied Behavior Analysis. 2004;37(3):305–322. doi: 10.1901/jaba.2004.37-305. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Wade CM, Ortiz C, Gorman BS. Two-session group parent training for bedtime noncompliance in Head Start preschoolers. Child & Family Behavior Therapy. 2007;29(3):23–55. doi: 10.1300/J019v29n03_03. [ DOI ] [ Google Scholar ]
  • Wagenaar AC, Maldonado-Molina MM, Erickson DJ, Ma L, Tobler AL, Komro KA. General deterrence effects of U.S. statutory DUI fine and jail penalties: Long-term follow-up in 32 states. Accident Analysis and Prevention. 2007;39(5):982–994. doi: 10.1016/j.aap.2007.01.003. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Walberg JL, Craig-Unkefer LA. An examination of the effects of a social communication intervention on the play behaviors of children with autism spectrum disorder. Education and Training in Autism and Developmental Disabilities. 2010;45(1):69–80. [ Google Scholar ]
  • *. Walker HC, Phillips DE, Boswell DB, Guthrie BL, Guthrie SL, Nicholas AP, Watts RL. Relief of acquired stuttering associated with Parkinson’s disease by unilateral left subthalamic brain stimulation. Journal of Speech, Language, and Hearing Research. 2009;52(6):1652–1657. doi: 10.1044/1092-4388(2009/08-0089). [ DOI ] [ PubMed ] [ Google Scholar ]
  • Wambaugh J, Nessler C. Modification of sound production treatment for apraxia of speech: Acquisition and generalisation effects. Aphasiology. 2004;18(5–7):407–427. doi: 10.1080/02687030444000165. [ DOI ] [ Google Scholar ]
  • Wambaugh JL, Mauszycki SC. Sound production treatment: Application with severe apraxia of speech. Aphasiology. 2010;24(6–8):814–825. doi: 10.1080/02687030903422494. [ DOI ] [ Google Scholar ]
  • *. Warnes E, Allen KD. Biofeedback treatment of paradoxical vocal fold motion and respiratory distress in an adolescent girl. Journal of Applied Behavior Analysis. 2005;38(4):529–532. doi: 10.1901/jaba.2005.26-05. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Watanabe T, Kanayama Y, Muto T. Improving communication between regular class teachers and teaching assistants: Increasing teachers’ comments by modifying communication cards. The Japanese Journal of Behavior Analysis. 2007;22(1):39–48. [ Google Scholar ]
  • Webb OJ, Eves FF. Promoting stair climbing: Intervention effects generalize to a subsequent stair ascent. American Journal of Health Promotion. 2007;22(2):114–119. doi: 10.4278/0890-1171-22.2.114. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Webb OJ, Eves FF. Effects of environmental changes in a stair climbing intervention: Generalization to stair descent. American Journal of Health Promotion. 2007;22(1):38–44. doi: 10.4278/0890-1171-22.1.38. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Wehby JH, Lane KL, Falk KB. An inclusive approach to improving early literacy skills of students with emotional and behavioral disorders. Behavioral Disorders. 2005;30(2):155–169. [ Google Scholar ]
  • *. Weiner JS. Peer-mediated conversational repair in students with moderate and severe disabilities. Research and Practice for Persons with Severe Disabilities. 2005;30(1):26–37. doi: 10.2511/rpsd.30.1.26. [ DOI ] [ Google Scholar ]
  • *. Weiskop S, Richdale A, Matthews J. Behavioural treatment to reduce sleep problems in children with autism or fragile X syndrome. Developmental Medicine & Child Neurology. 2005;47(2):94–104. doi: 10.1017/s0012162205000186. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. West EA. Effects of verbal cues versus pictorial cues on the transfer of stimulus control for children with autism. Focus on Autism and Other Developmental Disabilities. 2008;23(4):229–241. doi: 10.1177/1088357608324715. [ DOI ] [ Google Scholar ]
  • *. Westerlund D, Granucci EA, Gamache P, Clark HB. Effects of peer mentors on work-related performance of adolescents with behavioral and/or learning disabilities. Journal of Positive Behavior Interventions. 2006;8(4):244–251. doi: 10.1177/10983007060080040601. [ DOI ] [ Google Scholar ]
  • *. Whalen C, Schreibman L. Joint attention training for children with autism using behavior modification procedures. Journal of Child Psychology and Psychiatry. 2003;44(3):456–468. doi: 10.1111/1469-7610.00135. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Wheatley RK, West RP, Charlton CT, Sanders RB, Smith TG, Taylor MJ. Improving behavior through differential reinforcement: A praise note system for elementary school students. Education & Treatment of Children. 2009;32(4):551–571. doi: 10.1353/etc.0.0071. [ DOI ] [ Google Scholar ]
  • *. Whelan R, Barnes-Holmes D, Dymond S. The transformation of consequential functions in accordance with the relational frames of more-than and less-than. Journal of the Experimental Analysis of Behavior. 2006;86(3):317–335. doi: 10.1901/jeab.2006.113-04. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Wilder DA, Atwell J. Evaluation of a guided compliance procedure to reduce noncompliance among preschool children. Behavioral Interventions. 2006;21(4):265–272. doi: 10.1002/bin.222. [ DOI ] [ Google Scholar ]
  • Will KE, Sabo CS, Porter BE. Evaluation of the Boost ‘em in the Back Seat Program: Using fear and efficacy to increase booster seat use. Accident Analysis and Prevention. 2009;41(1):57–65. doi: 10.1016/j.aap.2008.09.007. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Williamson BD, Campbell-Whatley GD, Lo Y-Y. Using a random dependent group contingency to increase on-task behaviors of high school students with high incidence disabilities. Psychology in the Schools. 2009;46(10):1074–1083. doi: 10.1002/pits.20445. [ DOI ] [ Google Scholar ]
  • *. Winn BD, Skinner CH, Allin JD, Hawkins JA. Practicing school consultants can empirically validate interventions: A description and demonstration of the non-concurrent multiple-baseline design. Journal of Applied School Psychology. 2004;20(2):109–128. doi: 10.1300/J370v20n02_07. [ DOI ] [ Google Scholar ]
  • *. Wong CJ, Dillon EM, Sylvest C, Silverman K. Evaluation of a modified contingency management intervention for consistent attendance in therapeutic workplace participants. Drug and Alcohol Dependence. 2004;74(3):319–323. doi: 10.1016/j.drugalcdep.2003.12.013. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Wong CJ, Dillon EM, Sylvest CE, Silverman K. Contingency management of reliable attendance of chronically unemployed substance abusers in a therapeutic workplace. Experimental and Clinical Psychopharmacology. 2004;12(1):39–46. doi: 10.1037/1064-1297.12.1.39. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Wong ML, Chan R, Koh D. Long-term effects of condom promotion programmes for vaginal and oral sex on sexually transmitted infections among sex workers in Singapore. AIDS. 2004;18(8):1195–1199. doi: 10.1097/00002030-200405210-00013. [ DOI ] [ PubMed ] [ Google Scholar ]
  • *. Woods DW, Twohig MP. Using habit reversal to treat chronic vocal tic disorder in children. Behavioral Interventions. 2002;17(3):159–168. doi: 10.1002/bin.115. [ DOI ] [ Google Scholar ]
  • *. Woods J, Kashinath S, Goldstein H. Effects of embedding caregiver-implemented teaching strategies in daily routines on children’s communication outcomes. Journal of Early Intervention. 2004;26(3):175–193. doi: 10.1177/105381510402600302. [ DOI ] [ Google Scholar ]
  • Wragg JA, Whitehead RE. CBT for adolescents with psychosis: Investigating the feasibility & effectiveness of early intervention using a single case design. Behavioural and Cognitive Psychotherapy. 2004;32(3):313–329. doi: 10.1017/s1352465804001389. [ DOI ] [ Google Scholar ]
  • *. Wu H, Miller LK. A tutoring package to teach pronunciation of Mandarin Chinese characters. Journal of Applied Behavior Analysis. 2007;40(3):583–586. doi: 10.1901/jaba.2007.40-583. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Yi JI, Christian L, Vittimberga G, Lowenkron B. Generalized negatively reinforced manding in children with autism. Analysis of Verbal Behavior. 2006;22:21–33. doi: 10.1007/BF03393024. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Yon A, Scogin F. Behavioral activation as a treatment for geriatric depression. Clinical Gerontologist: The Journal of Aging and Mental Health. 2009;32(1):91–103. doi: 10.1080/07317110802478016. [ DOI ] [ Google Scholar ]
  • Youmans G, Holland A, Muñoz ML, Bourgeois M. Script training and automaticity in two individuals with aphasia. Aphasiology. 2005;19(3–5):435–450. doi: 10.1080/02687030444000877. [ DOI ] [ Google Scholar ]
  • *. Zens NK, Gillon GT, Moran C. Effects of phonological awareness and semantic intervention on word-learning in children with SLI. International Journal of Speech-Language Pathology. 2009;11(6):509–524. doi: 10.3109/17549500902926881. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Zeoli AM, Webster DW. Effects of domestic violence policies, alcohol taxes and police staffing levels on intimate partner homicide in large US cities. Injury Prevention. 2010;16(2):90–95. doi: 10.1136/ip.2009.024620. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Zimmerman RS, Palmgreen PM, Noar SM, Lustria MLA, Lu H-Y, Horosewski ML. Effects of a televised two-city safer sex mass media campaign targeting high-sensation-seeking and impulsive-decision-making young adults. Health Education & Behavior. 2007;34(5):810–826. doi: 10.1177/1090198107299700. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • *. Ziolkowski RA, Goldstein H. Effects of an embedded phonological awareness intervention during repeated book reading on preschool children with language delays. Journal of Early Intervention. 2008;31(1):67–90. doi: 10.1177/1053815108324808. [ DOI ] [ Google Scholar ]
  • Zisimopoulos DA. Enhancing multiplication performance in students with moderate intellectual disabilities using pegword mnemonics paired with a picture fading technique. Journal of Behavioral Education. 2010;19(2):117–133. doi: 10.1007/s10864-010-9104-7. [ DOI ] [ Google Scholar ]
  • *. Zuluaga CA, Normand MP. An evaluation of the high-probability instruction sequence with and without programmed reinforcement for compliance with high-probability instructions. Journal of Applied Behavior Analysis. 2008;41(3):453–457. doi: 10.1901/jaba.2008.41-453. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]

Autocorrelation estimates in this range can be caused by trends in the data streams, which complicates the detection of level-change effects. The Smith et al. (in press) study used a Monte Carlo simulation to control for trends in the data streams, but trends are likely to be present in real-world data with high lag-1 autocorrelation estimates.
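To see how a linear trend can inflate lag-1 autocorrelation in short data streams, the following Python sketch runs a small Monte Carlo simulation. It is a minimal illustration only, not the procedure used by Smith et al. (in press); the stream length, trend slope, and noise level are assumed values chosen for demonstration.

```python
import numpy as np

def lag1_autocorrelation(y):
    """Lag-1 autocorrelation of a single data stream."""
    y = np.asarray(y, dtype=float)
    d = y - y.mean()
    return np.sum(d[:-1] * d[1:]) / np.sum(d ** 2)

def mean_lag1_estimate(n_points=20, trend=0.0, noise_sd=1.0, n_reps=5000, seed=0):
    """Monte Carlo mean of the lag-1 autocorrelation estimate for
    white-noise streams with an optional linear trend added."""
    rng = np.random.default_rng(seed)
    t = np.arange(n_points)
    estimates = [
        lag1_autocorrelation(trend * t + rng.normal(0.0, noise_sd, n_points))
        for _ in range(n_reps)
    ]
    return float(np.mean(estimates))

# White noise alone yields lag-1 estimates that average slightly below zero
# in short streams; adding a modest linear trend inflates them substantially.
print(mean_lag1_estimate(trend=0.0))   # near zero (slightly negative)
print(mean_lag1_estimate(trend=0.15))  # clearly positive
```

With no trend, the lag-1 estimates for short white-noise streams average slightly below zero; adding even a modest trend pushes them well above zero, which is the pattern described above.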

The mention or exclusion of any statistical program or package in this article does not constitute an endorsement of its superiority over another. The author has no conflicts of interest in this regard.

It should be noted, however, that an explicitly reported effect size was often very difficult to locate, even in studies that used statistical analysis. Although reported effect sizes would likely have added little to the present review, their absence does hinder the inclusion of these results in meta-analyses.
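As a point of reference, a simple baseline-versus-intervention standardized mean difference, one of several single-case effect size indices discussed in the literature cited below, can be computed in a few lines. The sketch below uses hypothetical data and is meant only to show the kind of statistic whose absence from primary reports hinders meta-analysis; it is not a recommendation of this particular index.

```python
import numpy as np

def phase_mean_difference(baseline, intervention):
    """Standardized mean difference between intervention and baseline phases,
    scaled by the baseline standard deviation. This simple index ignores
    trend and autocorrelation, which can distort estimates in short streams."""
    baseline = np.asarray(baseline, dtype=float)
    intervention = np.asarray(intervention, dtype=float)
    return (intervention.mean() - baseline.mean()) / baseline.std(ddof=1)

# Hypothetical AB data stream; the values are illustrative only.
baseline_phase = [3, 4, 2, 3, 4, 3]
intervention_phase = [6, 7, 6, 8, 7, 7, 8]
print(round(phase_mean_difference(baseline_phase, intervention_phase), 2))
```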

  • Albright JJ, Marinova DM. Estimating multilevel models using SPSS, Stata, and SAS. Indiana University; 2010. Retrieved from http://www.iub.edu/%7Estatmath/stat/all/hlm/hlm.pdf . [ Google Scholar ]
  • Allison DB, Gorman BS. Calculating effect sizes for meta-analysis: The case of the single case. Behaviour Research and Therapy. 1993;31(6):621–631. doi: 10.1016/0005-7967(93)90115-B. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Alloy LB, Just N, Panzarella C. Attributional style, daily life events, and hopelessness depression: Subtype validation by prospective variability and specificity of symptoms. Cognitive Therapy and Research. 1997;21:321–344. doi: 10.1023/A:1021878516875. [ DOI ] [ Google Scholar ]
  • Arbuckle JL. Amos (Version 7.0) Chicago, IL: SPSS, Inc; 2006. [ Google Scholar ]
  • Barlow DH, Nock MK, Hersen M. Single case research designs: Strategies for studying behavior change. 3. New York, NY: Allyn and Bacon; 2008. [ Google Scholar ]
  • Barrett LF, Barrett DJ. An introduction to computerized experience sampling in psychology. Social Science Computer Review. 2001;19(2):175–185. doi: 10.1177/089443930101900204. [ DOI ] [ Google Scholar ]
  • Bloom M, Fisher J, Orme JG. Evaluating practice: Guidelines for the accountable professional. 4. Boston, MA: Allyn & Bacon; 2003. [ Google Scholar ]
  • Bolger N, Davis A, Rafaeli E. Diary methods: Capturing life as it is lived. Annual Review of Psychology. 2003;54:579–616. doi: 10.1146/annurev.psych.54.101601.145030. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Borckardt JJ. Simulation Modeling Analysis: Time series analysis program for short time series data streams (Version 8.3.3) Charleston, SC: Medical University of South Carolina; 2006. [ Google Scholar ]
  • Borckardt JJ, Nash MR, Murphy MD, Moore M, Shaw D, O’Neil P. Clinical practice as natural laboratory for psychotherapy research. American Psychologist. 2008;63(2):77–95. doi: 10.1037/0003-066X.63.2.77. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Borsboom D, Mellenbergh GJ, van Heerden J. The theoretical status of latent variables. Psychological Review. 2003;110(2):203–219. doi: 10.1037/0033-295X.110.2.203. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Bower GH. Mood and memory. American Psychologist. 1981;36(2):129–148. doi: 10.1037/0003-066x.36.2.129. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Box GEP, Jenkins GM. Time-series analysis: Forecasting and control. San Francisco, CA: Holden-Day; 1970. [ Google Scholar ]
  • Brossart DF, Parker RI, Olson EA, Mahadevan L. The relationship between visual analysis and five statistical analyses in a simple AB single-case research design. Behavior Modification. 2006;30(5):531–563. doi: 10.1177/0145445503261167. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Browne MW, Nesselroade JR. Representing psychological processes with dynamic factor models: Some promising uses and extensions of autoregressive moving average time series models. In: Maydeu-Olivares A, McArdle JJ, editors. Contemporary psychometrics: A festschrift for Roderick P McDonald. Mahwah, NJ: Lawrence Erlbaum Associates Publishers; 2005. pp. 415–452. [ Google Scholar ]
  • Busk PL, Marascuilo LA. Statistical analysis in single-case research: Issues, procedures, and recommendations, with applications to multiple behaviors. In: Kratochwill TR, Levin JR, editors. Single-case research design and analysis: New directions for psychology and education. Hillsdale, NJ, England: Lawrence Erlbaum Associates, Inc; 1992. pp. 159–185. [ Google Scholar ]
  • Busk PL, Marascuilo LA. Autocorrelation in single-subject research: A counterargument to the myth of no autocorrelation. Behavioral Assessment. 1988;10:229–242. [ Google Scholar ]
  • Campbell JM. Statistical comparison of four effect sizes for single-subject designs. Behavior Modification. 2004;28(2):234–246. doi: 10.1177/0145445503259264. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Carr EG, Horner RH, Turnbull AP, Marquis JG, Magito McLaughlin D, McAtee ML, Doolabh A. Positive behavior support for people with developmental disabilities: A research synthesis. Washington, DC: American Association on Mental Retardation; 1999. [ Google Scholar ]
  • Center BA, Skiba RJ, Casey A. A methodology for the quantitative synthesis of intra-subject design research. The Journal of Special Education. 1986;19:387–400. doi: 10.1177/002246698501900404. [ DOI ] [ Google Scholar ]
  • Chambless DL, Hollon SD. Defining empirically supported therapies. Journal of Consulting and Clinical Psychology. 1998;66(1):7–18. doi: 10.1037/0022-006X.66.1.7. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Chambless DL, Ollendick TH. Empirically supported psychological interventions: Controversies and evidence. Annual Review of Psychology. 2001;52:685–716. doi: 10.1146/annurev.psych.52.1.685. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Chow S-M, Ho M-hR, Hamaker EL, Dolan CV. Equivalence and differences between structural equation modeling and state-space modeling techniques. Structural Equation Modeling. 2010;17(2):303–332. doi: 10.1080/10705511003661553. [ DOI ] [ Google Scholar ]
  • Cohen J. Statistical power analysis for the behavioral sciences. 2. Hillsdale, NJ: Erlbaum; 1988. [ Google Scholar ]
  • Cohen J. The earth is round (p < .05). American Psychologist. 1994;49:997–1003. doi: 10.1037/0003-066X.49.12.997. [ DOI ] [ Google Scholar ]
  • Crosbie J. Interrupted time-series analysis with brief single-subject data. Journal of Consulting and Clinical Psychology. 1993;61(6):966–974. doi: 10.1037/0022-006X.61.6.966. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Dattilio FM, Edwards JA, Fishman DB. Case studies within a mixed methods paradigm: Toward a resolution of the alienation between researcher and practitioner in psychotherapy research. Psychotherapy: Theory, Research, Practice, Training. 2010;47(4):427–441. doi: 10.1037/a0021181. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Dempster A, Laird N, Rubin DB. Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society, Series B. 1977;39(1):1–38. [ Google Scholar ]
  • Des Jarlais DC, Lyles C, Crepaz N. Improving the reporting quality of nonrandomized evaluations of behavioral and public health interventions: the TREND statement. American Journal of Public Health. 2004;94(3):361–366. doi: 10.2105/ajph.94.3.361. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Diggle P, Liang KY. Analysis of longitudinal data. New York: Oxford University Press; 2001. [ Google Scholar ]
  • Doss BD, Atkins DC. Investigating treatment mediators when simple random assignment to a control group is not possible. Clinical Psychology: Science and Practice. 2006;13(4):321–336. doi: 10.1111/j.1468-2850.2006.00045.x. [ DOI ] [ Google Scholar ]
  • du Toit SHC, Browne MW. The covariance structure of a vector ARMA time series. In: Cudeck R, du Toit SHC, Sörbom D, editors. Structural equation modeling: Present and future. Lincolnwood, IL: Scientific Software International; 2001. pp. 279–314. [ Google Scholar ]
  • du Toit SHC, Browne MW. Structural equation modeling of multivariate time series. Multivariate Behavioral Research. 2007;42:67–101. doi: 10.1080/00273170701340953. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Fechner GT. Elemente der psychophysik [Elements of psychophysics] Leipzig, Germany: Breitkopf & Hartel; 1889. [ Google Scholar ]
  • Ferron J, Sentovich C. Statistical power of randomization tests used with multiple-baseline designs. The Journal of Experimental Education. 2002;70:165–178. doi: 10.1080/00220970209599504. [ DOI ] [ Google Scholar ]
  • Ferron J, Ware W. Analyzing single-case data: The power of randomization tests. The Journal of Experimental Education. 1995;63:167–178. [ Google Scholar ]
  • Fox J. TEACHER’S CORNER: Structural equation modeling with the sem package in R. Structural Equation Modeling: A Multidisciplinary Journal. 2006;13(3):465–486. doi: 10.1207/s15328007sem1303_7. [ DOI ] [ Google Scholar ]
  • Franklin RD, Allison DB, Gorman BS, editors. Design and analysis of single-case research. Mahwah, NJ: Lawrence Erlbaum Associates; 1997. [ Google Scholar ]
  • Franklin RD, Gorman BS, Beasley TM, Allison DB. Graphical display and visual analysis. In: Franklin RD, Allison DB, Gorman BS, editors. Design and analysis of single-case research. Mahway, NJ: Lawrence Erlbaum Associates, Publishers; 1997. pp. 119–158. [ Google Scholar ]
  • Gardner W, Mulvey EP, Shaw EC. Regression analyses of counts and rates: Poisson, overdispersed Poisson, and negative binomial models. Psychological Bulletin. 1995;118(3):392–404. doi: 10.1037/0033-2909.118.3.392. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Green AS, Rafaeli E, Bolger N, Shrout PE, Reis HT. Paper or plastic? Data equivalence in paper and electronic diaries. Psychological Methods. 2006;11(1):87–105. doi: 10.1037/1082-989X.11.1.87. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Hamilton JD. Time series analysis. Princeton, NJ: Princeton University Press; 1994. [ Google Scholar ]
  • Hammond D, Gast DL. Descriptive analysis of single-subject research designs: 1983–2007. Education and Training in Autism and Developmental Disabilities. 2010;45:187–202. [ Google Scholar ]
  • Hanson MD, Chen E. Daily stress, cortisol, and sleep: The moderating role of childhood psychosocial environments. Health Psychology. 2010;29(4):394–402. doi: 10.1037/a0019879. [ DOI ] [ PubMed ] [ Google Scholar ]
  • Harvey AC. Forecasting, structural time series models and the Kalman filter. Cambridge, MA: Cambridge University Press; 2001. [ Google Scholar ]
  • Horner RH, Carr EG, Halle J, McGee G, Odom S, Wolery M. The use of single-subject research to identify evidence-based practice in special education. Exceptional Children. 2005;71:165–179.
  • Horner RH, Spaulding S. Single-case research designs. In: Salkind NJ, editor. Encyclopedia of research design. Thousand Oaks, CA: Sage Publications; 2010.
  • Horton NJ, Kleinman KP. Much ado about nothing: A comparison of missing data methods and software to fit incomplete data regression models. The American Statistician. 2007;61(1):79–90. doi: 10.1198/000313007X172556.
  • Hser Y, Shen H, Chou C, Messer SC, Anglin MD. Analytic approaches for assessing long-term treatment effects. Evaluation Review. 2001;25(2):233–262. doi: 10.1177/0193841X0102500206.
  • Huitema BE. Autocorrelation in applied behavior analysis: A myth. Behavioral Assessment. 1985;7(2):107–118.
  • Huitema BE, McKean JW. Reduced bias autocorrelation estimation: Three jackknife methods. Educational and Psychological Measurement. 1994;54(3):654–665. doi: 10.1177/0013164494054003008.
  • Ibrahim JG, Chen M-H, Lipsitz SR, Herring AH. Missing-data methods for generalized linear models: A comparative review. Journal of the American Statistical Association. 2005;100(469):332–346. doi: 10.1198/016214504000001844.
  • Institute of Medicine. Reducing risks for mental disorders: Frontiers for preventive intervention research. Washington, DC: National Academy Press; 1994.
  • Jacobson NS, Christensen A. Studying the effectiveness of psychotherapy: How well can clinical trials do the job? American Psychologist. 1996;51:1031–1039. doi: 10.1037/0003-066X.51.10.1031.
  • Jones RR, Vaught RS, Weinrott MR. Time-series analysis in operant research. Journal of Applied Behavior Analysis. 1977;10(1):151–166. doi: 10.1901/jaba.1977.10-151.
  • Jones WP. Single-case time series with Bayesian analysis: A practitioner's guide. Measurement and Evaluation in Counseling and Development. 2003;36:28–39.
  • Kanfer FH. Self-monitoring: Methodological limitations and clinical applications. Journal of Consulting and Clinical Psychology. 1970;35(2):148–152. doi: 10.1037/h0029874.
  • Kazdin AE. Drawing valid inferences from case studies. Journal of Consulting and Clinical Psychology. 1981;49(2):183–192. doi: 10.1037/0022-006X.49.2.183.
  • Kazdin AE. Mediators and mechanisms of change in psychotherapy research. Annual Review of Clinical Psychology. 2007;3:1–27. doi: 10.1146/annurev.clinpsy.3.022806.091432.
  • Kazdin AE. Evidence-based treatment and practice: New opportunities to bridge clinical research and practice, enhance the knowledge base, and improve patient care. American Psychologist. 2008;63(3):146–159. doi: 10.1037/0003-066X.63.3.146.
  • Kazdin AE. Understanding how and why psychotherapy leads to change. Psychotherapy Research. 2009;19(4):418–428. doi: 10.1080/10503300802448899.
  • Kazdin AE. Single-case research designs: Methods for clinical and applied settings. 2nd ed. New York, NY: Oxford University Press; 2010.
  • Kirk RE. Practical significance: A concept whose time has come. Educational and Psychological Measurement. 1996;56:746–759. doi: 10.1177/0013164496056005002.
  • Kratochwill TR. Preparing psychologists for evidence-based school practice: Lessons learned and challenges ahead. American Psychologist. 2007;62:829–843. doi: 10.1037/0003-066X.62.8.829.
  • Kratochwill TR, Hitchcock J, Horner RH, Levin JR, Odom SL, Rindskopf DM, Shadish WR. Single-case designs technical documentation. 2010. Retrieved from What Works Clearinghouse website: http://ies.ed.gov/ncee/wwc/pdf/wwc_scd.pdf
  • Kratochwill TR, Levin JR. Single-case research design and analysis: New directions for psychology and education. Hillsdale, NJ: Lawrence Erlbaum Associates, Inc; 1992.
  • Kratochwill TR, Levin JR. Enhancing the scientific credibility of single-case intervention research: Randomization to the rescue. Psychological Methods. 2010;15(2):124–144. doi: 10.1037/a0017736.
  • Kratochwill TR, Levin JR, Horner RH, Swoboda C. Visual analysis of single-case intervention research: Conceptual and methodological considerations (WCER Working Paper No. 2011-6). 2011. Retrieved from University of Wisconsin–Madison, Wisconsin Center for Education Research website: http://www.wcer.wisc.edu/publications/workingPapers/papers.php
  • Lambert D. Zero-inflated Poisson regression, with an application to defects in manufacturing. Technometrics. 1992;34(1):1–14.
  • Lambert MJ, Hansen NB, Harmon SC. Outcome Questionnaire System (The OQ System): Development and practical applications in healthcare settings. In: Developing and delivering practice-based evidence. John Wiley & Sons, Ltd; 2010. pp. 139–154.
  • Littell JH, Corcoran J, Pillai VK. Systematic reviews and meta-analysis. New York: Oxford University Press; 2008.
  • Liu LM, Hudack GB. The SCA statistical system: Vector ARMA modeling of multiple time series. Oak Brook, IL: Scientific Computing Associates Corporation; 1995.
  • Lubke GH, Muthén BO. Investigating population heterogeneity with factor mixture models. Psychological Methods. 2005;10(1):21–39. doi: 10.1037/1082-989x.10.1.21.
  • Manolov R, Solanas A. Comparing N = 1 effect sizes in presence of autocorrelation. Behavior Modification. 2008;32(6):860–875. doi: 10.1177/0145445508318866.
  • Marshall RJ. Autocorrelation estimation of time series with randomly missing observations. Biometrika. 1980;67(3):567–570. doi: 10.1093/biomet/67.3.567.
  • Matyas TA, Greenwood KM. Visual analysis of single-case time series: Effects of variability, serial dependence, and magnitude of intervention effects. Journal of Applied Behavior Analysis. 1990;23(3):341–351. doi: 10.1901/jaba.1990.23-341.
  • Kratochwill TR (Chair), Members of the Task Force on Evidence-Based Interventions in School Psychology. Procedural and coding manual for review of evidence-based interventions. 2003. Retrieved July 18, 2011, from http://www.sp-ebi.org/documents/_workingfiles/EBImanual1.pdf
  • Moher D, Schulz KF, Altman DG, for the CONSORT Group. The CONSORT statement: Revised recommendations for improving the quality of reports of parallel-group randomized trials. Journal of the American Medical Association. 2001;285:1987–1991. doi: 10.1001/jama.285.15.1987.
  • Morgan DL, Morgan RK. Single-participant research design: Bringing science to managed care. American Psychologist. 2001;56(2):119–127. doi: 10.1037/0003-066X.56.2.119.
  • Muthén BO, Curran PJ. General longitudinal modeling of individual differences in experimental designs: A latent variable framework for analysis and power estimation. Psychological Methods. 1997;2(4):371–402. doi: 10.1037/1082-989x.2.4.371.
  • Muthén LK, Muthén BO. Mplus (Version 6.11). Los Angeles, CA: Muthén & Muthén; 2010.
  • Nagin DS. Analyzing developmental trajectories: A semiparametric, group-based approach. Psychological Methods. 1999;4(2):139–157. doi: 10.1037/1082-989x.4.2.139.
  • National Institute of Child Health and Human Development. Report of the National Reading Panel. Teaching children to read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction (NIH Publication No. 00-4769). Washington, DC: U.S. Government Printing Office; 2000.
  • Olive ML, Smith BW. Effect size calculations and single subject designs. Educational Psychology. 2005;25(2–3):313–324. doi: 10.1080/0144341042000301238.
  • Oslin DW, Cary M, Slaymaker V, Colleran C, Blow FC. Daily ratings measures of alcohol craving during an inpatient stay define subtypes of alcohol addiction that predict subsequent risk for resumption of drinking. Drug and Alcohol Dependence. 2009;103(3):131–136. doi: 10.1016/j.drugalcdep.2009.03.009.
  • Palermo TP, Valenzuela D, Stork PP. A randomized trial of electronic versus paper pain diaries in children: Impact on compliance, accuracy, and acceptability. Pain. 2004;107(3):213–219. doi: 10.1016/j.pain.2003.10.005.
  • Parker RI, Brossart DF. Evaluating single-case research data: A comparison of seven statistical methods. Behavior Therapy. 2003;34(2):189–211. doi: 10.1016/S0005-7894(03)80013-8.
  • Parker RI, Cryer J, Byrns G. Controlling baseline trend in single case research. School Psychology Quarterly. 2006;21(4):418–440. doi: 10.1037/h0084131.
  • Parker RI, Vannest K. An improved effect size for single-case research: Nonoverlap of all pairs. Behavior Therapy. 2009;40(4):357–367. doi: 10.1016/j.beth.2008.10.006.
  • Parsonson BS, Baer DM. The analysis and presentation of graphic data. In: Kratochwill TR, editor. Single subject research. New York, NY: Academic Press; 1978. pp. 101–166.
  • Parsonson BS, Baer DM. The visual analysis of data, and current research into the stimuli controlling it. In: Kratochwill TR, Levin JR, editors. Single-case research design and analysis: New directions for psychology and education. Hillsdale, NJ: Lawrence Erlbaum Associates, Inc; 1992. pp. 15–40.
  • Piasecki TM, Hufford MR, Solham M, Trull TJ. Assessing clients in their natural environments with electronic diaries: Rationale, benefits, limitations, and barriers. Psychological Assessment. 2007;19(1):25–43. doi: 10.1037/1040-3590.19.1.25.
  • R Development Core Team. R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing; 2005.
  • Raghunathan TE. What do we do with missing data? Some options for analysis of incomplete data. Annual Review of Public Health. 2004;25:99–117. doi: 10.1146/annurev.publhealth.25.102802.124410.
  • Raudenbush SW, Bryk AS, Congdon R. HLM 7: Hierarchical linear and nonlinear modeling. Scientific Software International, Inc; 2011.
  • Redelmeier DA, Kahneman D. Patients' memories of painful medical treatments: Real-time and retrospective evaluations of two minimally invasive procedures. Pain. 1996;66(1):3–8. doi: 10.1016/0304-3959(96)02994-6.
  • Reis HT. Domains of experience: Investigating relationship processes from three perspectives. In: Erber R, Gilmore R, editors. Theoretical frameworks in personal relationships. Mahwah, NJ: Erlbaum; 1994. pp. 87–110.
  • Reis HT, Gable SL. Event sampling and other methods for studying everyday experience. In: Reis HT, Judd CM, editors. Handbook of research methods in social and personality psychology. New York, NY: Cambridge University Press; 2000. pp. 190–222.
  • Robey RR, Schultz MC, Crawford AB, Sinner CA. Single-subject clinical-outcome research: Designs, data, effect sizes, and analyses. Aphasiology. 1999;13(6):445–473. doi: 10.1080/026870399402028.
  • Rossi PH, Freeman HE. Evaluation: A systematic approach. 5th ed. Thousand Oaks, CA: Sage; 1993.
  • SAS Institute Inc. The SAS system for Windows, Version 9. Cary, NC: SAS Institute Inc; 2008.
  • Schmidt M, Perels F, Schmitz B. How to perform idiographic and a combination of idiographic and nomothetic approaches: A comparison of time series analyses and hierarchical linear modeling. Journal of Psychology. 2010;218(3):166–174. doi: 10.1027/0044-3409/a000026.
  • Scollon CN, Kim-Prieto C, Diener E. Experience sampling: Promises and pitfalls, strengths and weaknesses. Assessing Well-Being. 2003;4:5–35. doi: 10.1007/978-90-481-2354-4_8.
  • Scruggs TE, Mastropieri MA. Summarizing single-subject research: Issues and applications. Behavior Modification. 1998;22(3):221–242. doi: 10.1177/01454455980223001.
  • Scruggs TE, Mastropieri MA, Casto G. The quantitative synthesis of single-subject research. Remedial and Special Education. 1987;8(2):24–33. doi: 10.1177/074193258700800206.
  • Shadish WR, Cook TD, Campbell DT. Experimental and quasi-experimental designs for generalized causal inference. Boston, MA: Houghton Mifflin; 2002.
  • Shadish WR, Rindskopf DM, Hedges LV. The state of the science in the meta-analysis of single-case experimental designs. Evidence-Based Communication Assessment and Intervention. 2008;3:188–196. doi: 10.1080/17489530802581603.
  • Shadish WR, Sullivan KJ. Characteristics of single-case designs used to assess treatment effects in 2008. Behavior Research Methods. 2011;43:971–980. doi: 10.3758/s13428-011-0111-y.
  • Sharpley CF. Time-series analysis of behavioural data: An update. Behaviour Change. 1987;4:40–45.
  • Shiffman S, Hufford M, Hickcox M, Paty JA, Gnys M, Kassel JD. Remember that? A comparison of real-time versus retrospective recall of smoking lapses. Journal of Consulting and Clinical Psychology. 1997;65:292–300. doi: 10.1037/0022-006X.65.2.292.a.
  • Shiffman S, Stone AA. Ecological momentary assessment: A new tool for behavioral medicine research. In: Krantz DS, Baum A, editors. Technology and methods in behavioral medicine. Mahwah, NJ: Erlbaum; 1998. pp. 117–131.
  • Shiffman S, Stone AA, Hufford MR. Ecological momentary assessment. Annual Review of Clinical Psychology. 2008;4:1–32. doi: 10.1146/annurev.clinpsy.3.022806.091415.
  • Shumway RH, Stoffer DS. An approach to time series smoothing and forecasting using the EM algorithm. Journal of Time Series Analysis. 1982;3(4):253–264. doi: 10.1111/j.1467-9892.1982.tb00349.x.
  • Skinner BF. The behavior of organisms. New York, NY: Appleton-Century-Crofts; 1938.
  • Smith JD, Borckardt JJ, Nash MR. Inferential precision in single-case time-series datastreams: How well does the EM procedure perform when missing observations occur in autocorrelated data? Behavior Therapy. (in press). doi: 10.1016/j.beth.2011.10.001.
  • Smith JD, Handler L, Nash MR. Therapeutic Assessment for preadolescent boys with oppositional-defiant disorder: A replicated single-case time-series design. Psychological Assessment. 2010;22(3):593–602. doi: 10.1037/a0019697.
  • Snijders TAB, Bosker RJ. Multilevel analysis: An introduction to basic and advanced multilevel modeling. Thousand Oaks, CA: Sage; 1999.
  • Soliday E, Moore KJ, Lande MB. Daily reports and pooled time series analysis: Pediatric psychology applications. Journal of Pediatric Psychology. 2002;27(1):67–76. doi: 10.1093/jpepsy/27.1.67.
  • SPSS Inc. SPSS Statistics (Version 20.0.0). Chicago, IL: SPSS Inc; 2011.
  • StataCorp. Stata Statistical Software: Release 12. College Station, TX: StataCorp LP; 2011.
  • Stone AA, Broderick JE, Kaell AT, Delespaul PAEG, Porter LE. Does the peak-end phenomenon observed in laboratory pain studies apply to real-world pain in rheumatoid arthritics? Journal of Pain. 2000;1:212–217. doi: 10.1054/jpai.2000.7568.
  • Stone AA, Shiffman S. Capturing momentary, self-report data: A proposal for reporting guidelines. Annals of Behavioral Medicine. 2002;24:236–243. doi: 10.1207/S15324796ABM2403_09.
  • Stout RL. Advancing the analysis of treatment process. Addiction. 2007;102:1539–1545. doi: 10.1111/j.1360-0443.2007.01880.x.
  • Tate RL, McDonald S, Perdices M, Togher L, Schultz R, Savage S. Rating the methodological quality of single-subject designs and N-of-1 trials: Introducing the Single-Case Experimental Design (SCED) Scale. Neuropsychological Rehabilitation. 2008;18(4):385–401. doi: 10.1080/09602010802009201.
  • Thiele C, Laireiter A-R, Baumann U. Diaries in clinical psychology and psychotherapy: A selective review. Clinical Psychology & Psychotherapy. 2002;9(1):1–37. doi: 10.1002/cpp.302.
  • Tiao GC, Box GEP. Modeling multiple time series with applications. Journal of the American Statistical Association. 1981;76:802–816.
  • Tschacher W, Ramseyer F. Modeling psychotherapy process by time-series panel analysis (TSPA). Psychotherapy Research. 2009;19(4):469–481. doi: 10.1080/10503300802654496.
  • Velicer WF, Colby SM. A comparison of missing-data procedures for ARIMA time-series analysis. Educational and Psychological Measurement. 2005a;65(4):596–615. doi: 10.1177/0013164404272502.
  • Velicer WF, Colby SM. Missing data and the general transformation approach to time series analysis. In: Maydeu-Olivares A, McArdle JJ, editors. Contemporary psychometrics: A festschrift to Roderick P. McDonald. Hillsdale, NJ: Lawrence Erlbaum; 2005b. pp. 509–535.
  • Velicer WF, Fava JL. Time series analysis. In: Schinka J, Velicer WF, Weiner IB, editors. Research methods in psychology. Vol. 2. New York, NY: John Wiley & Sons; 2003.
  • Wachtel PL. Beyond “ESTs”: Problematic assumptions in the pursuit of evidence-based practice. Psychoanalytic Psychology. 2010;27(3):251–272. doi: 10.1037/a0020532.
  • Watson JB. Behaviorism. New York, NY: Norton; 1925.
  • Weisz JR, Hawley KM. Finding, evaluating, refining, and applying empirically supported treatments for children and adolescents. Journal of Clinical Child Psychology. 1998;27:206–216. doi: 10.1207/s15374424jccp2702_7.
  • Weisz JR, Hawley KM. Procedural and coding manual for identification of beneficial treatments. Washington, DC: American Psychological Association, Society for Clinical Psychology, Division 12, Committee on Science and Practice; 1999.
  • Westen D, Bradley R. Empirically supported complexity. Current Directions in Psychological Science. 2005;14:266–271. doi: 10.1111/j.0963-7214.2005.00378.x.
  • Westen D, Novotny CM, Thompson-Brenner HK. The empirical status of empirically supported psychotherapies: Assumptions, findings, and reporting in controlled clinical trials. Psychological Bulletin. 2004;130:631–663. doi: 10.1037/0033-2909.130.4.631.
  • Wilkinson L, Task Force on Statistical Inference. Statistical methods in psychology journals: Guidelines and explanations. American Psychologist. 1999;54:694–704. doi: 10.1037/0003-066X.54.8.594.
  • Wolery M, Busick M, Reichow B, Barton EE. Comparison of overlap methods for quantitatively synthesizing single-subject data. The Journal of Special Education. 2010;44(1):18–28. doi: 10.1177/0022466908328009.
  • Wu Z, Huang NE, Long SR, Peng C-K. On the trend, detrending, and variability of nonlinear and nonstationary time series. Proceedings of the National Academy of Sciences. 2007;104(38):14889–14894. doi: 10.1073/pnas.0701020104.

A systematic review of applied single-case research published between 2016 and 2018: Study designs, randomization, data aspects, and data analysis

  • Published: 26 October 2020
  • Volume 53, pages 1371–1384 (2021)

  • René Tanious & Patrick Onghena

Single-case experimental designs (SCEDs) have become a popular research methodology in educational science, psychology, and beyond. The growing popularity has been accompanied by the development of specific guidelines for the conduct and analysis of SCEDs. In this paper, we examine recent practices in the conduct and analysis of SCEDs by systematically reviewing applied SCEDs published over a period of three years (2016–2018). Specifically, we were interested in which designs are most frequently used, how common randomization in the study design is, which data aspects applied single-case researchers analyze, and which analytical methods are used. The systematic review of 423 studies suggests that the multiple baseline design continues to be the most widely used design and that the difference in central tendency (level) is by far the most popular data aspect in SCED effect evaluation. Visual analysis paired with descriptive statistics is the most frequently used method of data analysis. However, inferential statistical methods and the inclusion of randomization in the study design are not uncommon. We discuss these results in light of the findings of earlier systematic reviews and suggest future directions for the development of SCED methodology.

Introduction

In single-case experimental designs (SCEDs) a single entity (e.g., a classroom) is measured repeatedly over time under different manipulations of at least one independent variable (Barlow et al., 2009 ; Kazdin, 2011 ; Ledford & Gast, 2018 ). Experimental control in SCEDs is demonstrated by observing changes in the dependent variable(s) over time under the different manipulations of the independent variable(s). Over the past few decades, the popularity of SCEDs has risen continuously as reflected in the number of published SCED studies (Shadish & Sullivan, 2011 ; Smith, 2012 ; Tanious et al., 2020 ), the development of domain-specific reporting guidelines (e.g., Tate et al., 2016a , 2016b ; Vohra et al., 2016 ), and guidelines on the quality of conduct and analysis of SCEDs (Horner, et al., 2005 ; Kratochwill et al., 2010 , 2013 ).

The What Works Clearinghouse guidelines

In educational science in particular, the US Department of Education has released a highly influential policy document through its What Works Clearinghouse (WWC) panel (Kratochwill et al., 2010). The WWC guidelines contain recommendations for the conduct and visual analysis of SCEDs. The panel recommended visually analyzing six data aspects of SCEDs: level, trend, variability, overlap, immediacy of the effect, and consistency of data patterns. However, given the subjective nature of visual analysis (e.g., Harrington, 2013; Heyvaert & Onghena, 2014; Ottenbacher, 1990), Kratochwill and Levin (2014) later called the formation of a panel for recommendations on the statistical analysis of SCEDs "the highest imminent priority" (p. 232, emphasis in original) on the agenda of SCED methodologists. Furthermore, Kratochwill and Levin, both members of the original panel, contended that advocating for design-specific randomization schemes in line with the recommendations by Edgington (1975, 1980) and Levin (1994) would constitute an important contribution to the development of updated guidelines.

Developments outside the WWC guidelines

Prior to the publication of updated guidelines, important progress had already been made in the development of SCED-specific statistical analyses and design-specific randomization schemes not summarized in the 2010 version of the WWC guidelines. Specifically, three interrelated areas can be distinguished: effect size calculation, inferential statistics, and randomization procedures. Note that this list includes effect size calculation even though the 2010 WWC guidelines contain some recommendations for it, albeit with the caveat that further research is "badly needed" (p. 23) to develop novel effect size measures comparable to those used in group studies. In the following paragraphs, we give a brief overview of the developments in each area.

Effect size measures

The effect size measures mentioned in the 2010 version of the WWC guidelines mainly concern the data aspect overlap: percentage of non-overlapping data (Scruggs, Mastropieri, & Casto, 1987), percentage of all non-overlapping data (Parker et al., 2007), and percentage of data points exceeding the median (Ma, 2006). Other overlap-based effect size measures are discussed in Parker et al. (2011). Furthermore, the 2010 guidelines discuss multilevel models, regression models, and a standardized effect size measure proposed by Shadish et al. (2008) for comparing results between participants in SCEDs. In later years, this measure has been further developed for other designs and meta-analyses (Hedges et al., 2012; Hedges et al., 2013; Shadish et al., 2014). Without mentioning any specific measures, the guidelines further mention effect sizes that compare the different conditions within a single unit and standardize by dividing by the within-phase variance. These effect size measures quantify the data aspect level. Beretvas and Chung (2008), for example, proposed to subtract the mean of the baseline phase from the mean of the intervention phase and subsequently divide by the pooled within-case standard deviation. Other proposals for quantifying the data aspect level include the slope and level change procedure, which corrects for baseline trend (Solanas et al., 2010), and the mean baseline reduction, which is calculated by subtracting the mean of treatment observations from the mean of baseline observations and subsequently dividing by the mean of the baseline phase (O'Brien & Repp, 1990).

Efforts have also been made to quantify the other four data aspects. For an overview of the available effect size measures per data aspect, the interested reader is referred to Tanious et al. (2020). Examples of quantifications for the data aspect trend include the split-middle technique (Kazdin, 1982) and ordinary least squares (Kromrey & Foster-Johnson, 1996), but many more proposals exist (see, e.g., Manolov, 2018, for an overview and discussion of different trend techniques). Fewer proposals exist for variability, immediacy, and consistency. The WWC guidelines recommend using the standard deviation for within-phase variability. Another option is the use of stability envelopes as suggested by Lane and Gast (2014). It should be noted, however, that neither of these methods is an effect size measure because they are assessed within a single phase. For the assessment of between-phase variability changes, Kromrey and Foster-Johnson (1996) recommend using variance ratios. More recently, Levin et al. (2020) recommended the median absolute deviation for the assessment of variability changes. The WWC guidelines recommend subtracting the mean of the last three baseline data points from the mean of the first three intervention data points to assess immediacy. Michiels et al. (2017) proposed the immediate treatment effect index, extending this logic to ABA and ABAB designs. For consistency of data patterns, only one measure currently exists, based on the Manhattan distance between data points from experimentally similar phases (Tanious et al., 2019).
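To make two of the level-based quantifications above concrete, the following sketch computes a standardized mean difference in the spirit of Beretvas and Chung (2008) and a mean baseline reduction in the spirit of O'Brien and Repp (1990) for a hypothetical AB data set; the numbers and variable names are purely illustrative and are not taken from any of the reviewed studies.

import numpy as np

# Hypothetical AB data: baseline (A) and intervention (B) observations
baseline = np.array([7, 8, 6, 9, 7, 8], dtype=float)
intervention = np.array([4, 3, 5, 3, 2, 3], dtype=float)

# Standardized mean difference: (intervention mean - baseline mean),
# divided by the pooled within-phase standard deviation
n_a, n_b = len(baseline), len(intervention)
pooled_sd = np.sqrt(((n_a - 1) * baseline.var(ddof=1) +
                     (n_b - 1) * intervention.var(ddof=1)) / (n_a + n_b - 2))
smd = (intervention.mean() - baseline.mean()) / pooled_sd

# Mean baseline reduction: (baseline mean - intervention mean),
# divided by the baseline mean
mbr = (baseline.mean() - intervention.mean()) / baseline.mean()

print(round(smd, 2), round(mbr, 2))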

Inferential statistics

Inferential statistics are not summarized in the 2010 version of the WWC guidelines. However, inferential statistics do have a long and rich history in debates surrounding the methodology and data analysis of SCEDs. Excellent review articles detailing and explaining the available methods for analyzing data from SCEDs are available in Manolov and Moeyaert ( 2017 ) and Manolov and Solanas ( 2018 ). In situations in which results are compared across participants within or between studies, multilevel models have been proposed. The 2010 guidelines do mention multilevel models, but with the indication that more thorough investigation was needed before their use could be recommended. With few exceptions, such as the pioneering work by Van den Noortgate and Onghena ( 2003 , 2008 ), specific proposals for multilevel analysis of SCEDs had long been lacking. Not surprisingly, the 2010 WWC guidelines gave new impetus for the development of multilevel models for meta-analyzing SCEDs. For example, Moeyaert, Ugille, et al. ( 2014b ) and Moeyaert, Ferron, et al. ( 2014a ) discuss two-level and three-level models for combining results across single cases. Baek et al. ( 2016 ) suggested a visual analytical approach for refining multilevel models for SCEDs. Multilevel models can be used descriptively (i.e., to find an overall treatment effect size), inferentially (i.e., to obtain a p value or confidence interval), or a mix of both.
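As a rough sketch of this idea, and not of the specific models proposed by Van den Noortgate and Onghena or Moeyaert and colleagues, the fragment below fits a simple two-level model (measurements nested within cases) with a random intercept and a random phase effect using the statsmodels package; the data frame, column names, and values are hypothetical.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical stacked data from three single cases:
# one row per measurement occasion, phase coded 0 = baseline, 1 = intervention
data = pd.DataFrame({
    "case":  ["c1"] * 8 + ["c2"] * 8 + ["c3"] * 8,
    "phase": ([0] * 4 + [1] * 4) * 3,
    "score": [7, 8, 6, 9, 4, 3, 5, 3,
              6, 7, 7, 6, 3, 2, 4, 3,
              9, 8, 9, 8, 5, 4, 5, 6],
})

# Two-level model: fixed phase effect, random intercept and phase slope per case
model = smf.mixedlm("score ~ phase", data, groups=data["case"], re_formula="~phase")
result = model.fit(reml=True)
print(result.summary())  # the 'phase' coefficient estimates the average treatment effect

With only three cases the random-effects estimates will be unstable; the sketch is meant to show the model structure rather than a recommended sample size or model specification.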

Randomization

One concept that is closely linked to inferential statistics is randomization. In the context of SCEDs, randomization refers to the random assignment of measurements to treatment levels (Onghena & Edgington, 2005 ). Randomization, when ethically and practically feasible, can reduce the risk of bias in SCEDs and strengthen the internal validity of the study (Tate et al., 2013 ). To incorporate randomization into the design, specific randomization schemes are needed, as previously stated (Kratochwill & Levin, 2014 ). In alternation designs, randomization can be introduced by randomly alternating the sequence of conditions, either unrestricted or restricted (e.g., maximum of two consecutive measurements under the same condition) (Onghena & Edgington, 1994 ). In phase designs (e.g., ABAB), multiple baseline designs, and changing criterion designs, where no rapid alternation of treatments takes place, it is possible to randomize the moment of phase change after a minimum number of measurements has taken place in each phase (Marascuilo & Busk, 1988 ; Onghena, 1992 ). In multiple baseline designs, it is also possible to predetermine different baseline phase lengths for each tier and then randomly allocate participants to different baseline phase lengths (Wampold & Worsham, 1986 ). Randomization tests use the randomization actually present in the design for quantifying the probability of the observed effect occurring by chance. These tests are among the earliest data analysis techniques specifically proposed for SCEDs (Edgington, 1967 , 1975 , 1980 ).
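For illustration, a minimal randomization test for an AB phase design in which the moment of phase change was randomly selected might look as follows; the data, the minimum phase length, and the choice of the difference in phase means as test statistic are assumptions made for this example rather than prescriptions from the cited proposals.

import numpy as np

def ab_randomization_test(y, start, min_len=3):
    """Randomization test for an AB design whose intervention start point
    was randomly drawn from all admissible start points."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    # All start points leaving at least min_len observations in each phase
    admissible = list(range(min_len, n - min_len + 1))

    def stat(k):
        return y[k:].mean() - y[:k].mean()  # difference in phase means

    observed = stat(start)
    # p value: proportion of admissible assignments at least as extreme as observed
    count = sum(abs(stat(k)) >= abs(observed) for k in admissible)
    return observed, count / len(admissible)

# Hypothetical series: six baseline and six intervention measurements, start point 6
effect, p = ab_randomization_test([7, 8, 6, 9, 7, 8, 4, 3, 5, 3, 2, 3], start=6)
print(effect, p)  # here: an effect of about -4.17 with p = 2/7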

The main aim of the present paper is to systematically review the methodological characteristics of recently published SCEDs with an emphasis on the data aspects put forth in the WWC guidelines. Specific research questions are:

What is the frequency of the various single-case design options?

How common is randomization in the study design?

Which data aspects do applied researchers include in their analysis?

What is the frequency of visual and statistical data analysis techniques?

For systematic reviews of SCEDs predating the publication of the WWC guidelines, the interested reader is referred to Hammond and Gast ( 2010 ), Shadish and Sullivan ( 2011 ), and Smith ( 2012 ).

Justification for publication period selection

The present systematic review deals with applied SCED studies published in the period from 2016 to 2018. The reasons for the selection of this period are threefold: relevance, sufficiency, and feasibility. In terms of relevance, there is a noticeable lack of recent systematic reviews dealing with the methodological characteristics of SCEDs in spite of important developments in the field. Apart from the previously mentioned reviews predating the publication of the 2010 WWC guidelines, only two reviews can be mentioned that were published after the WWC guidelines. Solomon ( 2014 ) reviewed indicators of violations of normality and independence in school-based SCED studies until 2012. More recently, Woo et al. ( 2016 ) performed a content analysis of SCED studies published in American Counseling Association journals between 2003 and 2014. However, neither of these reviews deals with published SCEDs in relation to specific guidelines such as WWC. In terms of sufficiency, a three-year period can give sufficient insight into recent trends in applied SCEDs. In addition, it seems reasonable to assume a delay between the publication of guidelines such as WWC and their impact in the field. For example, several discussion articles regarding the WWC guidelines were published in 2013. Wolery ( 2013 ) and Maggin et al. ( 2013 ) pointed out perceived weaknesses in the WWC guidelines, which in turn prompted a reply by the original authors (Hitchcock et al., 2014 ). Discussions like these can help increase the exposure of the guidelines among applied researchers. In terms of feasibility, it is important to note that we did not set any specification on the field of study for inclusion. Therefore, the period of publication had to remain feasible and manageable to read and code all included publications across all different study fields (education, healthcare, counseling, etc.).

Data sources

We performed a broad search of the English-language SCED literature using PubMed and Web of Science. The choice for these two search engines was based on Gusenbauer and Haddaway (2019), who assessed the eligibility of 26 search engines for systematic reviews. Gusenbauer and Haddaway came to the conclusion that PubMed and Web of Science could be used as primary search engines in systematic reviews, as they fulfilled all necessary requirements such as functionality of Boolean operators and reproducibility of search results in different locations and at different times. We selected only these two of all eligible search engines to keep the size of the project manageable and to prevent excessive overlap between the results. Table 1 gives an overview of the search terms we used and the number of hits per search query. This list does not exclude duplicates between the search terms and between the two search engines. For all designs containing the term "randomized" (e.g., randomized block design), we used the Boolean operator AND to specify that the search results must also contain either the term "single-case" or "single-subject" (i.e., a query of the form "randomized block design" AND ("single-case" OR "single-subject")). An initial search for randomized designs without these specifications yielded well over 1000 results per search query.

Study selection

We specifically searched for studies published between 2016 and 2018. We used the date of first online publication to determine whether an article met this criterion (i.e., articles that were published online during this period, even if not yet published in print). Initially, the abstracts and article information of all search results were scanned for general exclusion criteria. In a first step, all articles that fell outside the date range of interest were excluded, as well as articles for which the full text was not available or only available against payment. We only included articles written in English. In a second step, all duplicate articles were deleted. From the remaining unique search results, all articles that did not use any form of single-case experimentation were excluded. Such studies include for example non-experimental forms of case studies. Lastly, all articles not reporting any primary empirical data were excluded from the final sample. Thus, purely methodological articles were discarded. Methodological articles were defined as articles that were within the realm of SCEDs but did not report any empirical data or reported only secondary empirical data. Generally, these articles propose new methods for analyzing SCEDs or perform simulation studies to test existing methods. Similarly, commentaries, systematic reviews, and meta-analyses were excluded from the final sample, as such articles do not contain primary empirical data. In line with systematic review guidelines (Staples & Niazi, 2007 ), the second author verified the accuracy of the selection process. Ten articles were randomly selected from an initial list of all search results for a joint discussion between the authors, and no disagreements about the selection emerged. Figure 1 presents the study attrition diagram.

Fig. 1 Study attrition diagram

Coding criteria

For all studies, the basic design was coded first. For coding the design, we followed the typology presented in Onghena and Edgington (2005) and Tate et al. (2016a) with four overarching categories: phase designs, alternation designs, multiple baseline designs, and changing criterion designs. For each of these categories, different design options exist. Common variants of phase designs include, for example, AB and ABAB, but other forms also exist, such as ABC. Within the alternation designs category, the main variants are the completely randomized design, the alternating treatments design, and the randomized block design. Multiple baseline designs can be conducted across participants, behaviors, or settings. They can be either concurrent, meaning that all participants start the study at the same time, or non-concurrent. Changing criterion designs can employ either a single-value criterion or a range-bound criterion. In addition to these four overarching categories, we added a design category called hybrid. The hybrid category consists of studies using several design strategies combined, for example a multiple baseline study with an integrated alternating treatments design. For articles reporting more than one study, each study was coded separately. For coding the basic design, we followed the authors' original description of the study.

Randomization was coded as a dichotomous variable, i.e., either present or not present. In order to be coded as present, some form of randomization had to be present in the design itself, as previously defined in the randomization section. Studies with a fixed order of treatments or phase change moments with randomized stimulus presentation, for example, were coded as randomization not present.

Data aspect

A major contribution of the WWC guidelines was the establishment of six data aspects for the analysis of SCEDs: level, trend, variability, overlap, immediacy, and consistency. Following the guidelines, these data aspects can be defined operationally as follows. Level is the mean score within a phase. The straight line best fitting the data within a phase refers to the trend. The standard deviation or range in a phase represents the data aspect variability. The proportion of data points overlapping between adjacent phases is the data aspect overlap. The immediacy of an effect is assessed by a comparison of the last three data points of an intervention with the first three data points of the subsequent intervention. Finally, consistency is assessed by comparing data patterns from experimentally similar interventions. In multiple baseline designs, consistency can be assessed horizontally (within series) when more than one phase change is present, and vertically (across series) by comparing experimentally similar phases across participants, behaviors, or settings. Studies could, of course, report more than one data aspect or none at all. For studies reporting more than one data aspect, each data aspect was coded separately.
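Purely for illustration, the sketch below computes simplified versions of five of the six data aspects for a hypothetical AB data set (consistency is omitted because it requires repeated, experimentally similar phases); the data and the exact operationalizations are assumptions for the example, not the coding rules applied in the review.

import numpy as np

baseline = np.array([7, 8, 6, 9, 7, 8], dtype=float)      # phase A (hypothetical)
intervention = np.array([4, 3, 5, 3, 2, 3], dtype=float)  # phase B (hypothetical)

# Level: mean score within each phase
level_a, level_b = baseline.mean(), intervention.mean()

# Trend: slope of the best-fitting straight line within each phase
trend_a = np.polyfit(np.arange(len(baseline)), baseline, 1)[0]
trend_b = np.polyfit(np.arange(len(intervention)), intervention, 1)[0]

# Variability: standard deviation within each phase
sd_a, sd_b = baseline.std(ddof=1), intervention.std(ddof=1)

# Overlap: proportion of intervention points falling within the baseline range
overlap = np.mean((intervention >= baseline.min()) & (intervention <= baseline.max()))

# Immediacy: mean of the first three intervention points minus
# the mean of the last three baseline points
immediacy = intervention[:3].mean() - baseline[-3:].mean()

print(level_a, level_b, trend_a, trend_b, sd_a, sd_b, overlap, immediacy)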

Data analysis

The data analysis methods were coded directly from the authors’ description in the “data analysis” section. If no such section was present, the data analysis methods were coded according to the presentation of the results. Generally, two main forms of data analysis for SCEDs can be distinguished: visual and statistical analysis. In the visual analytical approach, a time series graph of the dependent variable under the different experimental conditions is analyzed to determine treatment effectiveness. The statistical analytical approach can be roughly divided into two categories: descriptive and inferential statistics. Descriptive statistics summarize the data without quantifying the uncertainty in the description. Examples of descriptive statistics include means, standard deviations, and effect sizes. Inferential statistics imply an inference from the observed results to unknown parameter values and quantify the uncertainty for doing so, for example, by providing p values and confidence intervals.

Number of participants

Finally, for each study we coded the number of participants, counting only participants who appeared in the results section. Participants who dropped out prematurely and whose data were not analyzed were not counted.

General results

For each coding category, the interrater agreement was calculated as the number of agreements divided by the sum of agreements and disagreements, based on ten randomly selected articles (e.g., nine agreements and one disagreement yield 9 / (9 + 1) = 90%). The interrater agreement was as follows: design (90%), analysis (60%), data aspect (80%), randomization (100%), and number of participants (80%). Given the initial moderate agreement for analysis, the two authors discussed discrepancies and then reanalyzed a new sample of ten randomly selected articles. The interrater reliability for analysis then increased to 90%.

In total, 406 articles were included in the final sample, which represented 423 studies. One hundred thirty-eight of the 406 articles (34.00%) were published in 2016, 150 articles (36.95%) were published in 2017, and 118 articles (29.06%) were published in 2018. Out of the 423 studies, the most widely used form of SCEDs was the multiple baseline design, which accounted for 49.65% ( N  = 210) of the studies included in the final sample. Across all studies and designs, the median number of participants was three (IQR = 3). The most popular data analysis technique across all studies was visual analysis paired with descriptive statistics, which was used in 48.94% ( N  = 207) of the studies. The average number of data aspects analyzed per study was 2.61 ( SD =  1.63). The most popular data aspect across all designs and studies was level (83.45%, N =  353). Overall, 22.46% ( N  = 95) of the 423 studies included randomization in the design. However, these results vary between the different designs. In the following sections, we therefore present a summary of the results per design. A detailed overview of all the results per design can be found in Table 2 .

Results per design

Phase designs

Phase designs accounted for 25.53% ( N  = 108) of the studies included in the systematic review. The median number of participants for phase designs was three (IQR = 4). Visual analysis paired with descriptive statistics was the most popular data analysis method for phase designs (40.74%, N  = 44), and the majority of studies analyzed several data aspects (54.62%, N  = 59); 20.37% ( N  = 22) did not report any of the six data aspects. The average number of data aspects analyzed in phase designs was 2.02 ( SD =  2.07). Level was the most frequently analyzed data aspect for phase designs (73.15%, N  = 79). Randomization was very uncommon in phase designs and was included in only 5.56% ( N  = 6) of the studies.

Alternation designs

Alternation designs accounted for 14.42% ( N  = 61) of the studies included in the systematic review. The median number of participants for alternation designs was three (IQR = 1). More than half of the alternation design studies used visual analysis paired with descriptive statistics (57.38%, N  = 35). The majority of alternation design studies analyzed several data aspects (75.41%, N  = 46), while 11.48% ( N  = 7) did not report which data aspect was the focus of analysis. The average number of data aspects analyzed in alternation designs was 2.38 ( SD =  2.06). The most frequently analyzed data aspect for alternation designs was level (85.25%, N =  52). Randomization was used in the majority of alternation designs (59.02%, N  = 36).

Multiple baseline designs

Multiple baseline designs, by a large margin the most prevalent design, accounted for nearly half of all studies (49.65%, N  = 210) included in the systematic review. The median number of participants for multiple baseline designs was four (IQR = 4). A total of 49.52% ( N  = 104) of multiple baseline studies were analyzed using visual analysis paired with descriptive statistics, and the vast majority (80.95%, N  = 170) analyzed several data aspects, while only 7.14% ( N  = 15) did not report any of the six data aspects. The average number of data aspects analyzed in multiple baseline designs was 3.01 ( SD =  1.61). The most popular data aspect was level, which was analyzed in 87.62% ( N =  184) of all multiple baseline designs. Randomization was not uncommon in multiple baseline designs (20.00%, N  = 42).

Changing criterion designs

Changing criterion designs accounted for 1.42% ( N  = 6) of the studies included in the systematic review. The median number of participants for changing criterion designs was three (IQR = 0); 66.67% ( N =  4) of changing criterion designs were analyzed using visual analysis paired with descriptive statistics. Half of the changing criterion designs analyzed several data aspects ( N =  3), and one study (16.67%) did not report any data aspect. The average number of data aspects analyzed in changing criterion designs was 1.83 ( SD =  1.39). The most popular data aspect was level (83.33%, N  = 5). None of the changing criterion design studies included randomization in the design.

Hybrid designs

Hybrid designs accounted for 8.98% ( N  = 38) of the studies included in the systematic review. The median number of participants for hybrid designs was three (IQR = 2). A total of 52.63% ( N  = 20) of hybrid designs were analyzed with visual analysis paired with descriptive statistics, and the majority of studies analyzed several data aspects (73.68%, N  = 28); 10.53% ( N  = 4) did not report any of the six data aspects. The average number of data aspects considered for analysis was 2.55 ( SD =  2.02). The most popular data aspect was level (86.84%, N  = 33). Hybrid designs showed the second highest proportion of studies including randomization in the study design (28.95%, N  = 11).

Results per data aspect

Out of the 423 studies included in the systematic review, 72.34% ( N =  306) analyzed several data aspects, 16.08% ( N =  68) analyzed one data aspect, and 11.58% ( N =  49) did not report any of the six data aspects.

Level

Across all designs, level was by far the most frequently analyzed data aspect (83.45%, N = 353). Remarkably, nearly all studies that analyzed more than one data aspect included the data aspect level (96.73%, N = 296). Similarly, for studies analyzing only one data aspect, there was a strong prevalence of level (83.82%, N = 57). For studies that only analyzed level, the most common form of analysis was visual analysis paired with descriptive statistics (54.39%, N = 31).

Trend

Trend was the third most popular data aspect. It was analyzed in 45.39% (N = 192) of all studies included in the systematic review. There were no studies in which trend was the only data aspect analyzed, meaning that trend was always analyzed alongside other data aspects, making it difficult to isolate the analytical methods specifically used to analyze trend.

Variability

The data aspect variability was analyzed in 59.10% ( N =  250) of the studies, making it the second most prominent data aspect. A total of 80.72% ( N =  247) of all studies analyzing several data aspects included variability. However, variability was very rarely the only data aspect analyzed. Only 3.3% ( N =  3) of the studies analyzing only one data aspect focused on variability. All three studies that analyzed only variability did so using visual analysis.

Overlap

The data aspect overlap was analyzed in 35.70% (N = 151) of all studies and was thus the fourth most analyzed data aspect. Nearly half of all studies analyzing several data aspects included overlap (47.08%, N = 144). For studies analyzing only one data aspect, overlap was the second most common data aspect after level (10.29%, N = 7). The most common mode of analysis for these studies was descriptive statistics paired with inferential statistics (57.14%, N = 4).

Immediacy

The immediacy of the effect was assessed in 28.61% (N = 121) of the studies, making it the second least analyzed data aspect; 39.22% (N = 120) of the studies analyzing several data aspects included immediacy. Only one study analyzed immediacy as the sole data aspect, and this study used visual analysis.

Consistency

Consistency was analyzed in 9.46% ( N =  40) of the studies and was thus by far the least analyzed data aspect. It was analyzed in 13.07% ( N =  40) of the studies analyzing several data aspects and was never the focus of analysis for studies analyzing only one data aspect.

Several data aspects

As stated previously, 72.34% (N = 306) of all studies analyzed several data aspects. For these studies, the average number of data aspects analyzed was 3.39 (SD = 1.18). The most popular data analysis technique for several data aspects was visual analysis paired with descriptive statistics (56.54%, N = 173).

Not reported

As mentioned previously, 11.58% ( N =  49) did not report any of the six data aspects. For these studies, the most prominent analytical technique was visual analysis alone (61.22%, N =  30). Of all studies not reporting any of the six data aspects, the highest proportion was phase designs (44.90%, N =  22).

Results per analytical method

Visual analysis

Visual analysis, without the use of any descriptive or inferential statistics, was the analytical method used in 16.78% (N = 71) of all included studies. Of all studies using visual analysis, the largest share were multiple baseline design studies (45.07%, N = 32). The largest proportion of studies using visual analysis did not report any data aspect (42.25%, N = 30), closely followed by several data aspects (40.85%, N = 29). Randomization was present in 20.53% (N = 16) of all studies using visual analysis.

Descriptive statistics

Descriptive statistics, without the use of visual analysis, was the analytical method used in 3.78% ( N =  16) of all included studies. The most common designs for studies using descriptive statistics were phase designs and multiple baseline designs (both 43.75%, N =  7). Half of the studies using descriptive statistics (50.00%, N =  8) analyzed the data aspect level, and 37.5% ( N =  6) analyzed several data aspects. One study (6.25%) using descriptive statistics included randomization.

Inferential statistics

Inferential statistics, without the use of visual analysis, was the analytical method used in 2.84% (N = 12) of all included studies. The majority of studies using inferential statistics were phase designs (58.33%, N = 7) and did not report any of the six data aspects (58.33%, N = 7). Of the remaining studies, three (25.00%) reported several data aspects, and two (16.67%) analyzed the data aspect level. Two studies (16.67%) using inferential statistical analysis included randomization.

Descriptive and inferential statistics

Descriptive statistics combined with inferential statistics, but without the use of visual analysis, accounted for 5.67% ( N  = 24) of all included studies. The majority of studies using this combination of analytical methods were multiple baseline designs (62.5%, N =  15), followed by phase designs (33.33%, N =  8). There were no alternation or hybrid designs using descriptive and inferential statistics. Most of the studies using descriptive and inferential statistics analyzed several data aspects (41.67%, N =  10), followed by the data aspect level (29.17%, N =  7); 16.67% ( N =  4) of the studies using descriptive and inferential statistics included randomization.

Visual and descriptive statistics

As mentioned previously, visual analysis paired with descriptive statistics was the most popular analytical method. This method was used in nearly half (48.94%, N  = 207) of all included studies. The majority of these studies were multiple baseline designs (50.24%, N =  104), followed by phase designs (21.25%, N =  44). This method of analysis was prevalent across all designs. Nearly all of the studies using this combination of analytical methods analyzed either several data aspects (83.57%, N =  173) or level only (14.98%, N =  31). Randomization was present in 19.81% ( N =  41) of all studies using visual and descriptive analysis.

Visual and inferential statistics

Visual analysis paired with inferential statistics accounted for 2.60% ( N  = 11) of the included studies. The largest proportion of these studies were phase designs (45.45%, N  = 5), followed by multiple baseline designs and hybrid designs (both 27.27%, N =  3). This combination of analytical methods was thus not used in alternation or changing criterion designs. The majority of studies using visual analysis and inferential statistics analyzed several data aspects (72.73%, N =  8), while 18.18% ( N =  2) did not report any data aspect. One study (9.10%) included randomization.

Visual, descriptive, and inferential statistics

A combination of visual analysis, descriptive statistics, and inferential statistics was used in 18.44% ( N =  78) of all included studies. The majority of the studies using this combination of analytical methods were multiple baseline designs (56.41%, N =  44), followed by phase designs (23.08%, N =  18). This analytical approach was used in all designs except changing criterion designs. Nearly all studies using a combination of these three analytical methods analyzed several data aspects (97.44%, N =  76). These studies also showed the highest proportion of randomization (38.46%, N =  30).

None of the above

A small proportion of studies did not use any of the above analytical methods (0.95%, N =  4). Three of these studies (75%) were phase designs and did not report any data aspect. One study (25%) was a multiple baseline design that analyzed several data aspects. Randomization was not used in any of these studies.

Discussion

To our knowledge, the present article is the first systematic review of SCEDs specifically looking at the frequency of the six data aspects in applied research. The systematic review has shown that level is by a large margin the most widely analyzed data aspect in recently published SCEDs. The second most popular data aspect from the WWC guidelines was variability, which was usually assessed alongside level (e.g., a combination of mean and standard deviation or range). The fact that these two data aspects are routinely assessed in group studies may indicate a lack of familiarity with SCED-specific analytical methods among applied researchers, but this remains speculative. Phase designs showed the highest proportion of studies not reporting any of the six data aspects and the second lowest average number of data aspects analyzed, with only changing criterion designs lower. This was an unexpected finding given that the WWC guidelines were developed specifically in the context of (and with examples of) phase designs. The multiple baseline design showed the highest number of data aspects analyzed and, at the same time, the lowest proportion of studies not analyzing any of the six data aspects.

These findings regarding the analysis and reporting of the six data aspects need more contextualization. The selection of data aspects for the analysis depends on the research questions and the expected data pattern. For example, if the aim of the intervention is a gradual change over time, then trend becomes more important. If the aim of the intervention is a change in level, then it is important to also assess trend (to verify that the change in level is not just a continuation of a baseline trend) and variability (to assess whether the apparent change in level is merely a product of excessive variability). In addition, assessing consistency can add information on whether the change in level is consistent over several repetitions of experimental conditions (e.g., in phase designs). Similarly, if an abrupt change in level of the target behavior is expected after changing experimental conditions, then immediacy becomes a more relevant data aspect in addition to trend, variability, and level. The important point here is that the research team often has an idea of the expected data pattern and should choose the data aspects to analyze accordingly. The strong prevalence of level found in the present review could be indicative of a failure to assess other data aspects that may be relevant to demonstrate experimental control over an independent variable.
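
To make these data aspects concrete, the following sketch computes level, trend, variability, and immediacy for a hypothetical AB comparison. This is a minimal illustration in Python with made-up data; the least-squares slope used for trend and the three-point window used for immediacy are illustrative choices on our part, not prescriptions from the WWC guidelines.

```python
import numpy as np

# Hypothetical scores from a simple AB comparison (illustrative data only).
baseline = np.array([4, 5, 4, 6, 5, 4])        # phase A
intervention = np.array([7, 8, 9, 9, 10, 9])   # phase B

def phase_summary(scores):
    """Within-phase quantification of level, trend, and variability."""
    time = np.arange(len(scores))
    slope = np.polyfit(time, scores, 1)[0]      # least-squares trend
    return {"level": scores.mean(),
            "trend": slope,
            "variability": scores.std(ddof=1)}

summary_a = phase_summary(baseline)
summary_b = phase_summary(intervention)

# Immediacy: contrast the last three baseline points with the first three
# intervention points (the three-point window is an arbitrary choice here).
immediacy = intervention[:3].mean() - baseline[-3:].mean()

print("Phase A:", summary_a)
print("Phase B:", summary_b)
print("Change in level:", summary_b["level"] - summary_a["level"])
print("Immediacy:", immediacy)
```

Reporting such a summary for every phase, rather than the level alone, makes it easier for readers to judge whether an apparent level change reflects trend or variability instead of an intervention effect.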

In line with the findings of earlier systematic reviews (Hammond & Gast, 2010; Shadish & Sullivan, 2011; Smith, 2012), the multiple baseline design continues to be the most frequently used design, and despite the advancement of sophisticated statistical methods for the analysis of SCEDs, two thirds of all studies still relied on visual analysis alone or visual analysis paired with descriptive statistics. A comparison with the findings of Shadish and Sullivan further reveals that the number of participants included in SCEDs has remained steady over the past decade at around three to four participants. The relatively small number of changing criterion designs in the present findings is partly due to the fact that changing criterion designs were often combined with other designs and thus coded in the hybrid category, although we did not formally quantify this. This finding is supported by the results of Shadish and Sullivan, who found that changing criterion designs are more often used as part of hybrid designs than as a standalone design. Hammond and Gast even excluded the changing criterion design from their review due to its low prevalence: they found a total of six changing criterion designs published over a period of 35 years. It should be noted, however, that the low prevalence of changing criterion designs is not indicative of the value of this design.

Regarding randomization, the results cannot be interpreted against earlier benchmarks, as neither Smith nor Shadish and Sullivan nor Hammond and Gast quantified the proportion of randomized SCEDs. Overall, randomization in the study design was not uncommon. However, the proportion of randomized SCEDs differed greatly between designs. The results showed that alternating treatments designs have the highest proportion of studies including randomization. This result was to be expected given that alternating treatments designs are particularly suited to incorporate randomization. In fact, when Barlow and Hayes (1979) first introduced the alternating treatments design, they emphasized randomization as an important part of the design: “Among other considerations, each design controls for sequential confounding by randomizing the order of treatment […]” (p. 208). Moreover, alternating treatments designs could draw on already existing randomization procedures, such as the randomized block procedure proposed by Edgington (1967). The different design options for alternating treatments designs (e.g., randomized block design) and accompanying randomization procedures are discussed in detail in Manolov and Onghena (2018). For multiple baseline designs, a staggered introduction of the intervention is needed. Proposals to randomize the order of the introduction of the intervention have been around since the 1980s (Marascuilo & Busk, 1988; Wampold & Worsham, 1986). These randomization procedures have their counterparts in group studies, where participants are randomly assigned to treatments or to different blocks of treatments. Other randomization procedures for multiple baseline designs are discussed in Levin et al. (2018); these include the restricted Marascuilo–Busk procedure proposed by Koehler and Levin and the randomization test procedure proposed by Revusky. For phase designs and changing criterion designs, the incorporation of randomization is less evident. For phase designs, Onghena (1992) proposed a method to randomly determine the moment of phase change between two successive phases. However, this method is rather uncommon and has no counterpart in group studies. Specific randomization schemes for changing criterion designs have only very recently been proposed (Ferron et al., 2019; Manolov et al., 2020; Onghena et al., 2019), and it remains to be seen how common they will become in applied SCEDs.
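
For illustration, the sketch below generates two of the randomization schemes mentioned above: a randomly determined phase-change moment for an AB phase design (in the spirit of Onghena, 1992) and a random assignment of participants to fixed, staggered start points in a multiple baseline design (in the spirit of Wampold & Worsham, 1986). The number of measurements, the minimum phase length, and the staggered start points are hypothetical values chosen only for this example.

```python
import random

random.seed(2020)  # fixed seed so the illustration is reproducible

# (a) AB phase design: randomly determine the moment of phase change,
# requiring at least five measurements per phase (illustrative restriction).
n_measurements = 20
min_phase_length = 5
admissible_starts = list(range(min_phase_length + 1,
                               n_measurements - min_phase_length + 2))
intervention_start = random.choice(admissible_starts)
print("Admissible start points:", admissible_starts)
print("Randomly selected intervention start:", intervention_start)

# (b) Multiple baseline design: keep the staggered start points fixed but
# randomly assign participants to them.
participants = ["P1", "P2", "P3", "P4"]
staggered_starts = [6, 9, 12, 15]   # hypothetical, pre-specified start points
random.shuffle(participants)
schedule = dict(zip(participants, staggered_starts))
print("Start point per participant:", schedule)
```

Documenting the admissible options and the realized random choice in this way also makes the scheme available later for a matching randomization test.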

Implications for SCED research

The results of the systematic review have several implications for SCED research regarding methodology and analyses. An important finding of the present study is that the frequency of use of randomization differs greatly between designs. For example, while phase designs were found to be the second most popular design, randomization is used very infrequently for this design type. Multiple baseline designs, as the most frequently used design, showed a higher percentage of randomized studies, but only every fifth study used randomization. Given that randomization in the study design increases internal and statistical conclusion validity irrespective of the design, it seems paramount to further stress the importance of including randomization beyond alternating treatments designs. Another implication concerns the analysis of specific data aspects. While level was by a large margin the most popular data aspect, it is important to stress that conclusions based on only one data aspect may be misleading. This seems particularly relevant for phase designs, which were found to contain the highest proportion of studies not reporting any of the six data aspects and the lowest proportion of studies analyzing several data aspects (apart from changing criterion designs, which accounted for only a very small proportion of the included studies). A final implication concerns the use of analytical methods, in particular the triangulation of different methods. Half of the included studies used visual analysis paired with descriptive statistics. These methods should of course not be discarded, as they generate important information about the data, but they cannot quantify the uncertainty of a possible intervention effect. Therefore, triangulation of visual analysis, descriptive statistics, and inferential statistics should form an important part of future guidelines on SCED analysis.
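
As a sketch of what such triangulation could look like in practice, the following Python fragment combines a descriptive between-phase mean difference with a randomization-test p-value for a hypothetical AB series in which the intervention point was randomly selected; visual analysis of the plotted series would complete the picture. The data, the minimum phase length, and the choice of the mean difference as test statistic are illustrative assumptions, not a prescribed procedure.

```python
import numpy as np

# Hypothetical AB series; the intervention was (randomly) started at point 11.
scores = np.array([5, 4, 6, 5, 5, 4, 6, 5, 4, 5,      # phase A
                   8, 9, 8, 10, 9, 9, 10, 8, 9, 10])  # phase B
observed_start = 11        # 1-indexed first intervention measurement
min_phase_length = 5       # illustrative restriction on the randomization scheme

def mean_difference(series, start):
    """Descriptive effect: B-phase mean minus A-phase mean for a given start point."""
    return series[start - 1:].mean() - series[:start - 1].mean()

observed_effect = mean_difference(scores, observed_start)

# Inferential step: recompute the statistic for every start point the
# randomization scheme could have produced and locate the observed value.
admissible = range(min_phase_length + 1, len(scores) - min_phase_length + 2)
reference = [mean_difference(scores, s) for s in admissible]
p_value = np.mean([abs(effect) >= abs(observed_effect) for effect in reference])

print(f"Mean difference (descriptive): {observed_effect:.2f}")
print(f"Randomization test p-value (inferential): {p_value:.3f}")
# Visual analysis (plotting the series with a line at the phase change)
# would complete the triangulation.
```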

Reflections on updated WWC guidelines

Updated WWC guidelines were recently published, after the present systematic review had been conducted (What Works Clearinghouse, 2020a, 2020c). Two major changes in the updated guidelines are of direct relevance to the present systematic review: (a) the removal of visual analysis for demonstrating intervention effectiveness and (b) the recommendation of a design-comparable effect size measure for demonstrating intervention effects (D-CES; Pustejovsky et al., 2014; Shadish et al., 2014). This marks a clear shift away from visual analysis towards statistical analysis of SCED data, especially compared to the 2010 guidelines. These changes have prompted responses from the public, and the What Works Clearinghouse (2020b) published a statement addressing the concerns. Several concerns relate to the removal of visual analysis. In response to a request that visual analysis be reinstated, the panel clearly states that “visual analysis will not be used to characterize study findings” (p. 3). Another point from the public concerned the analysis of studies for which no effect size can be calculated (e.g., due to unavailability of raw data). Even in these instances, the panel does not recommend visual analysis; rather, “the WWC will extract raw data from those graphs for use in effect size computation” (p. 4). In light of the present findings, these statements are particularly noteworthy. Given that the present review found a strong continued reliance on visual analysis, it remains to be seen if and how the updated WWC guidelines impact the analyses conducted by applied SCED researchers.

Another update of relevance in the recent guidelines concerns the use of design categories. While the 2010 guidelines were demonstrated with the example of a phase design, the updated guidelines include quality rating criteria for each major design option. Given that the present results indicate a very low prevalence of the changing criterion design in applied studies, its inclusion in the updated guidelines may increase its prominence. For changing criterion designs, the updated guidelines recommend that “the reversal or withdrawal (AB) design standards should be applied to changing criterion designs” (What Works Clearinghouse, 2020c, p. 80). With phase designs already being the second most popular design choice, this recommendation could further facilitate the uptake of the changing criterion design.

While other guidelines on conduct and analysis (e.g., Tate et al., 2013), as well as members of the 2010 What Works Clearinghouse panel (Kratochwill & Levin, 2014), have clearly highlighted the added value of randomization in the design, the updated guidelines do not include randomization procedures for SCEDs. Regarding changes between experimental conditions, the updated guidelines merely state that “the independent variable is systematically manipulated, with the researcher determining when and how the independent variable conditions change” (What Works Clearinghouse, 2020c, p. 82). While the frequency of use of randomization differs considerably between designs, the present review has shown that, overall, randomization is not uncommon. Including randomization procedures in the updated guidelines would therefore have offered guidance to applied researchers wishing to incorporate randomization into their SCEDs, and might have further contributed to the popularity of randomization.

Limitations and future research

One limitation of the current study concerns the databases used. SCEDs published in journals that are not indexed in these databases may not have been included in our sample. A similar limitation concerns the search terms used in the systematic search. In this systematic review, we focused on the common names “single-case” and “single-subject.” However, as Shadish and Sullivan (2011) note, SCEDs go by many names. They list several less common alternative terms: intrasubject replication design (Gentile et al., 1972), n-of-1 design (Center et al., 1985-86), intrasubject experimental design (White et al., 1989), one-subject experiment (Edgington, 1980), and individual organism research (Michael, 1974). Even though these terms date back to the 1970s and 1980s, a few authors may still use them to describe their SCED studies, and studies using these terms may not have come up during the systematic search. It should furthermore be noted that we followed the original description provided by the authors for the coding of the design and analysis in order to reduce bias. We therefore made no judgments regarding the correctness or accuracy of the authors’ naming of the design and analysis techniques.

The systematic review offers several avenues for future research. The first avenue may be to explore in more depth the reasons for the unequal distribution of data aspects. As the systematic review has shown, level is assessed far more often than the other five data aspects. While level is an important data aspect, assessing it in isolation from other data aspects can lead to erroneous conclusions. Gaining an understanding of the reasons for the prevalence of level, for example through author interviews or questionnaires, may help to improve the quality of data analysis in applied SCEDs.

In a similar vein, a second avenue of future research may explore why randomization is much more prevalent in some designs than in others. Apart from the aforementioned differences in randomization procedures between designs, it may be of interest to gain a better understanding of the reasons applied researchers have for randomizing their SCEDs. As the incorporation of randomization enhances the internal validity of the study design, promoting the inclusion of randomization for designs other than alternation designs will help advance the credibility of SCEDs in the scientific community. Searching the method sections of the articles that used randomization may be a first step towards understanding why applied researchers use randomization. Such a text search may reveal how the authors discuss randomization and which reasons they give for randomizing. A related question is how the randomization was actually carried out: for example, was it carried out a priori or in a restricted way that takes the evolving data pattern into account? A deeper understanding of the reasons for randomizing and the mechanics of randomization may be gained through author interviews or questionnaires.

A third avenue of future research may explore in detail the specifics of the inferential analytical methods used to analyze SCED data. Within the scope of the present review, we only distinguished between visual analysis, descriptive statistics, and inferential statistics. However, deeper insight into the inferential analysis methods and their application to SCED data may help to understand the viewpoint of applied researchers. This may be achieved through a literature review of articles that use inferential analysis. Research questions for such a review may include: Which inferential methods do applied SCED researchers use, and how frequently? Are these methods adapted to SCED methodology? And how do applied researchers justify their choice of an inferential method? Similar questions may also be answered for effect size measures understood as descriptive statistics: for example, why do applied researchers choose a particular effect size measure over a competing one, and are these effect size measures adapted to SCED research?

Finally, future research may go into greater detail about the descriptive statistics used in SCEDs. In the present review, we distinguished between two major categories: descriptive and inferential statistics. Effect sizes that were not accompanied by a standard error, confidence limits, or the result of a significance test were coded in the descriptive statistics category. Effect sizes do, however, go beyond merely summarizing the data: they quantify the treatment effect between experimental conditions, in contrast to within-phase quantifications such as the mean and standard deviation. Therefore, future research may examine in greater detail the use of effect sizes separately from other descriptive statistics such as the mean and standard deviation. Such research could focus in depth on the exact methods used to quantify each data aspect, either as a simple quantification (e.g., mean or range) or as an effect size measure (e.g., standardized mean difference or variance ratios).
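
To illustrate the distinction, the following sketch contrasts within-phase quantifications with two simple between-phase effect sizes for hypothetical data. Note that the naive standardized mean difference computed here is not the design-comparable effect size of Pustejovsky et al. (2014); the example is only meant to show the difference between summarizing a phase and quantifying a treatment effect.

```python
import numpy as np

# Hypothetical phase data (illustrative only).
phase_a = np.array([4, 5, 4, 6, 5, 4])
phase_b = np.array([7, 8, 9, 9, 10, 9])

# Within-phase quantifications: each condition is summarized separately.
within_phase = {
    "A": {"mean": phase_a.mean(), "sd": phase_a.std(ddof=1)},
    "B": {"mean": phase_b.mean(), "sd": phase_b.std(ddof=1)},
}

# Between-phase effect sizes: the treatment effect is quantified across conditions.
pooled_sd = np.sqrt((phase_a.var(ddof=1) + phase_b.var(ddof=1)) / 2)
smd = (phase_b.mean() - phase_a.mean()) / pooled_sd   # naive standardized mean difference
variance_ratio = phase_b.var(ddof=1) / phase_a.var(ddof=1)

print("Within-phase summaries:", within_phase)
print(f"Standardized mean difference: {smd:.2f}")
print(f"Variance ratio (B/A): {variance_ratio:.2f}")
```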

Notes

The What Works Clearinghouse panel (2020a, 2020c) has recently released an updated version of the guidelines. We discuss the updated guidelines in light of the present findings in the Discussion section.

As holds true for most single-case designs, the same design is often described with different terms. For example, Ledford and Gast (2018) call these designs combination designs, and Moeyaert et al. (2020) call them combined designs. Given that this is a purely terminological question, it is hard to argue in favor of one term over the other. We do, however, prefer the term hybrid, given that it emphasizes that neither of the constituent designs remains in its pure form. For example, a multiple baseline design with alternating treatments is not just a combination of a multiple baseline design and an alternating treatments design; it is rather a hybrid of the two. This term is also found in recent literature (e.g., Pustejovsky & Ferron, 2017; Swan et al., 2020).

For the present systematic review, we strictly followed the data aspects as outlined in the 2010 What Works Clearinghouse guidelines. While the assessment of consistency of effects is an important data aspect, this data aspect is not described in the guidelines. Therefore, we did not code it in the present review.

References

Baek, E. K., Petit-Bois, M., Van den Noortgate, W., Beretvas, S. N., & Ferron, J. M. (2016). Using visual analysis to evaluate and refine multilevel models of single-case studies. The Journal of Special Education, 50 , 18-26. https://doi.org/10.1177/0022466914565367 .


Barlow, D. H., & Hayes, S. C. (1979). Alternating Treatments Design: One Strategy for Comparing the Effects of Two Treatments in a Single Subject. Journal of Applied Behavior Analysis, 12 , 199-210. https://doi.org/10.1901/jaba.1979.12-199 .


Barlow, D. H., Nock, M. K., & Hersen, M. (2009). Single case experimental designs: Strategies for studying behavior change (3rd ed.). Pearson.

Beretvas, S. N., & Chung, H. (2008). A review of meta-analyses of single-subject experimental designs: Methodological issues and practice. Evidence-Based Communication Assessment and Intervention, 2 , 129-141. https://doi.org/10.1080/17489530802446302 .

Center, B. A., Skiba, R. J., & Casey, A. (1985-86). A Methodology for the Quantitative Synthesis of Intra-Subject Design research. Journal of Special Education, 19 , 387–400. https://doi.org/10.1177/002246698501900404 .

Edgington, E. S. (1967). Statistical inference from N=1 experiments. The Journal of Psychology, 65 , 195-199. https://doi.org/10.1080/00223980.1967.10544864 .


Edgington, E. S. (1975). Randomization tests for one-subject operant experiments. The Journal of Psychology, 90 , 57-68. https://doi.org/10.1080/00223980.1975.9923926 .

Edgington, E. S. (1980). Random assignment and statistical tests for one-subject experiments. Journal of Educational Statistics, 5 , 235-251.

Ferron, J., Rohrer, L. L., & Levin, J. R. (2019). Randomization procedures for changing criterion designs. Behavior Modification https://doi.org/10.1177/0145445519847627 .

Gentile, J. R., Roden, A. H., & Klein, R. D. (1972). An analysis-of-variance model for the intrasubject replication design. Journal of Applied Behavior Analysis, 5 , 193-198. https://doi.org/10.1901/jaba.1972.5-193 .

Gusenbauer, M., & Haddaway, N. R. (2019). Which academic search systems are suitable for systematic Reviews or meta-analyses? Evaluating retrieval qualities of Google Scholar, PubMed and 26 other Resources. Research Synthesis Methods https://doi.org/10.1002/jrsm.1378 .

Hammond, D., & Gast, D. L. (2010). Descriptive analysis of single subject research designs: 1983—2007. Education and Training in Autism and Developmental Disabilities, 45 , 187-202.


Harrington, M. A. (2013). Comparing visual and statistical analysis in single-subject studies. Open Access Dissertations , Retrieved from http://digitalcommons.uri.edu/oa_diss .

Hedges, L. V., Pustejovsky, J. E., & Shadish, W. R. (2012). A standardized mean difference effect size for single case designs. Research Synthesis Methods, 3 , 224-239. https://doi.org/10.1002/jrsm.1052 .

Hedges, L. V., Pustejovsky, J. E., & Shadish, W. R. (2013). A standardized mean difference effect size for multiple baseline designs across individuals. Research Synthesis Methods, 4 , 324-341. https://doi.org/10.1002/jrsm.1086 .

Heyvaert, M., & Onghena, P. (2014). Analysis of single-case data: Randomization tests for measures of effect size. Neuropsychological Rehabilitation, 24 , 507-527. https://doi.org/10.1080/09602011.2013.818564 .

Hitchcock, J. H., Horner, R. H., Kratochwill, T. R., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2014). The What Works Clearinghouse single-case design pilot standards: Who will guard the guards? Remedial and Special Education, 35 , 145-152. https://doi.org/10.1177/0741932513518979 .

Horner, R. H., Carr, E. G., Halle, J., McGee, G., Odom, S., & Wolery, M. (2005). The use of single-subject research to identify evidence-based practice in special education. Exceptional Children, 71 , 165-179. https://doi.org/10.1177/001440290507100203 .

Kazdin, A. E. (1982). Single-case research designs: Methods for clinical and applied settings. Oxford University Press.

Kazdin, A. E. (2011). Single-case research designs: Methods for clinical and applied settings (2nd ed.). Oxford University Press.

Kratochwill, T. R., Hitchcock, J., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2010). Single-case designs technical documentation. Retrieved from What Works Clearinghouse: https://files.eric.ed.gov/fulltext/ED510743.pdf

Kratochwill, T. R., Hitchcock, J., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2013). Single-case intervention research design standards. Remedial and Special Education, 34 , 26-38. https://doi.org/10.1177/0741932512452794 .

Kratochwill, T. R., & Levin, J. R. (2014). Meta- and statistical analysis of single-case intervention research data: Quantitative gifts and a wish list. Journal of School Psychology, 52 , 231-235. https://doi.org/10.1016/j.jsp.2014.01.003 .

Kromrey, J. D., & Foster-Johnson, L. (1996). Determining the efficacy of intervention: The use of effect sizes for data analysis in single-subject research. The Journal of Experimental Education, 65 , 73-93. https://doi.org/10.1080/00220973.1996.9943464 .

Lane, J. D., & Gast, D. L. (2014). Visual analysis in single case experimental design studies: Brief review and guidelines. Neuropsychological Rehabilitation, 24 , 445-463. https://doi.org/10.1080/09602011.2013.815636 .

Ledford, J. R., & Gast, D. L. (Eds.) (2018). Single case research methodology: Applications in special education and behavioral sciences (3rd ed.). Routledge.

Levin, J. R. (1994). Crafting educational intervention research that's both credible and creditable. Educational Psychology Review, 6 , 231-243. https://doi.org/10.1007/BF02213185 .

Levin, J. R., Ferron, J. M., & Gafurov, B. S. (2018). Comparison of randomization-test procedures for single-case multiple-baseline designs. Developmental Neurorehabilitation, 21 , 290-311. https://doi.org/10.1080/17518423.2016.1197708 .

Levin, J. R., Ferron, J. M., & Gafurov, B. S. (2020). Investigation of single-case multiple-baseline randomization tests of trend and variability. Educational Psychology Review . https://doi.org/10.1007/s10648-020-09549-7 .

Ma, H.-H. (2006). Quantitative synthesis of single-subject researches: Percentage of data points exceeding the median. Behavior Modification, 30 , 598-617. https://doi.org/10.1177/0145445504272974 .

Maggin, D. M., Briesch, A. M., & Chafouleas, S. M. (2013). An application of the What Works Clearinghouse standards for evaluating single-subject research: Synthesis of the self-management literature base. Remedial and Special Education, 34 , 44-58. https://doi.org/10.1177/0741932511435176 .

Manolov, R. (2018). Linear trend in single-case visual and quantitative analyses. Behavior Modification, 42 , 684-706. https://doi.org/10.1177/0145445517726301 .

Manolov, R., & Moeyaert, M. (2017). Recommendations for choosing single-case data analytical techniques. Behavior Therapy, 48 , 97-114. https://doi.org/10.1016/j.beth.2016.04.008 .

Manolov, R., & Onghena, P. (2018). Analyzing data from single-case alternating treatments designs. Psychological Methods, 23 , 480-504. https://doi.org/10.1037/met0000133 .

Manolov, R., & Solanas, A. (2018). Analytical options for single-case experimental designs: Review and application to brain impairment. Brain Impairment, 19 , 18-32. https://doi.org/10.1017/BrImp.2017.17 .

Manolov, R., Solanas, A., & Sierra, V. (2020). Changing Criterion Designs: Integrating Methodological and Data Analysis Recommendations. The Journal of Experimental Education, 88 , 335-350. https://doi.org/10.1080/00220973.2018.1553838 .

Marascuilo, L., & Busk, P. (1988). Combining statistics for multiple-baseline AB and replicated ABAB designs across subjects. Behavioral Assessment, 10 , 1-28.

Michael, J. (1974). Statistical inference for individual organism research: Mixed blessing or curse? Journal of Applied Behavior Analysis, 7 , 647-653. https://doi.org/10.1901/jaba.1974.7-647 .

Michiels, B., Heyvaert, M., Meulders, A., & Onghena, P. (2017). Confidence intervals for single-case effect size measures based on randomization test inversion. Behavior Research Methods, 49 , 363-381. https://doi.org/10.3758/s13428-016-0714-4 .

Moeyaert, M., Akhmedjanova, D., Ferron, J. M., Beretvas, S. N., & Van den Noortgate, W. (2020). Effect size estimation for combined single-case experimental designs. Evidence-Based Communication Assessment and Intervention, 14 , 28-51. https://doi.org/10.1080/17489539.2020.1747146 .

Moeyaert, M., Ferron, J. M., Beretvas, S. N., & Van den Noortgate, W. (2014a). From a single-level analysis to a multilevel analysis of single-case experimental designs. Journal of School Psychology, 52 , 191-211. https://doi.org/10.1016/j.jsp.2013.11.003 .

Moeyaert, M., Ugille, M., Ferron, J. M., Beretvas, S. N., & Van den Noortgate, W. (2014b). Three-level analysis of single-case experimental data: Empirical validation. The Journal of Experimental Education, 82 , 1-21. https://doi.org/10.1080/00220973.2012.745470 .

O’Brien, S., & Repp, A. C. (1990). Reinforcement-based reductive procedures: A review of 20 years of their use with persons with severe or profound retardation. Journal of the Association for Persons with Severe Handicaps, 15 , 148–159. https://doi.org/10.1177/154079699001500307 .

Onghena, P. (1992). Randomization tests for extensions and variations of ABAB single-case experimental designs: A rejoinder. Behavioral Assessment, 14 , 153-172.

Onghena, P., & Edgington, E. S. (1994). Randomization tests for restricted alternating treatment designs. Behaviour Research and Therapy, 32 , 783-786. https://doi.org/10.1016/0005-7967(94)90036-1 .

Onghena, P., & Edgington, E. S. (2005). Customization of pain treatments: Single-case design and analysis. The Clinical Journal of Pain, 21 , 56-68. https://doi.org/10.1097/00002508-200501000-00007 .

Onghena, P., Tanious, R., De, T. K., & Michiels, B. (2019). Randomization tests for changing criterion designs. Behaviour Research and Therapy, 117 , 18-27. https://doi.org/10.1016/j.brat.2019.01.005 .

Ottenbacher, K. J. (1990). When is a picture worth a thousand p values? A comparison of visual and quantitative methods to analyze single subject data. The Journal of Special Education, 23 , 436-449. https://doi.org/10.1177/002246699002300407 .

Parker, R. I., Hagan-Burke, S., & Vannest, K. (2007). Percentage of all non-overlapping data (PAND): An alternative to PND. The Journal of Special Education, 40 , 194-204. https://doi.org/10.1177/00224669070400040101 .

Parker, R. I., Vannest, K. J., & Davis, J. L. (2011). Effect Size in Single-Case Research: A Review of Nine Nonoverlap Techniques. Behavior Modification, 35 , 303-322. https://doi.org/10.1177/0145445511399147 .

Pustejovsky, J. E., & Ferron, J. M. (2017). Research synthesis and meta-analysis of single-case designs. In J. M. Kaufmann, D. P. Hallahan, & P. C. Pullen (Eds.), Handbook of Special Education (pp. 168-185). New York: Routledge.


Pustejovsky, J. E., Hedges, L. V., & Shadish, W. R. (2014). Design-comparable effect sizes in multiple baseline designs: A general modeling framework. Journal of Educational and Behavioral Statistics, 39 , 368-393. https://doi.org/10.3102/1076998614547577 .

Scruggs, T. E., Mastropieri, M. A., & Casto, G. (1987). The quantitative synthesis of single-subject research: Methodology and validation. Remedial and Special Education, 8 , 24-33. https://doi.org/10.1177/074193258700800206 .

Shadish, W. R., Hedges, L. V., & Pustejovsky, J. E. (2014). Analysis and meta-analysis of single-case designs with a standardized mean difference statistic: A primer and applications. Journal of School Psychology, 52 , 123–147. https://doi.org/10.1016/j.jsp.2013.11.005 .

Shadish, W. R., Rindskopf, D. M., & Hedges, L. V. (2008). The state of the science in the meta-analysis of single-case experimental designs. Evidence-Based Communication Assessment and Intervention, 2 , 188-196. https://doi.org/10.1080/17489530802581603 .

Shadish, W. R., & Sullivan, K. J. (2011). Characteristics of single-case designs used to assess intervention effects in 2008. Behavior Research Methods, 43 , 971-980. https://doi.org/10.3758/s13428-011-0111-y .

Smith, J. D. (2012). Single-case experimental designs: A systematic review of published research and current standards. Psychological Methods, 17 , 510-550. https://doi.org/10.1037/a0029312 .

Solanas, A., Manolov, R., & Onghena, P. (2010). Estimating slope and level change in N=1 designs. Behavior Modification, 34 , 195-218. https://doi.org/10.1177/0145445510363306 .

Solomon, B. G. (2014). Violations of school-based single-case data: Implications for the selection and interpretation of effect sizes. Behavior Modification, 38 , 477-496. https://doi.org/10.1177/0145445513510931 .

Staples, M., & Niazi, M. (2007). Experiences using systematic review guidelines. The Journal of Systems and Software, 80 , 1425-1437. https://doi.org/10.1016/j.jss.2006.09.046 .

Swan, D. M., Pustejovsky, J. E., & Beretvas, S. N. (2020). The impact of response-guided designs on count outcomes in single-case experimental design baselines. Evidence-Based Communication Assessment and Intervention, 14 , 82-107. https://doi.org/10.1080/17489539.2020.1739048 .

Tanious, R., De, T. K., Michiels, B., Van den Noortgate, W., & Onghena, P. (2019). Consistency in single-case ABAB phase designs: A systematic review. Behavior Modification https://doi.org/10.1177/0145445519853793 .

Tanious, R., De, T. K., Michiels, B., Van den Noortgate, W., & Onghena, P. (2020). Assessing consistency in single-case A-B-A-B phase designs. Behavior Modification, 44 , 518-551. https://doi.org/10.1177/0145445519837726 .

Tate, R. L., Perdices, M., Rosenkoetter, U., McDonald, S., Togher, L., Shadish, W. R., … Vohra, S. (2016b). The Single-Case Reporting guideline In BEhavioural Interventions (SCRIBE) 2016: Explanation and Elaboration. Archives of Scientific Psychology, 4 , 1-9. https://doi.org/10.1037/arc0000026 .

Tate, R. L., Perdices, M., Rosenkoetter, U., Shadish, W. R., Vohra, S., Barlow, D. H., … Wilson, B. (2016a). The Single-Case Reporting guideline In BEhavioural interventions (SCRIBE) 2016 statement. Aphasiology, 30 , 862-876. https://doi.org/10.1080/02687038.2016.1178022 .

Tate, R. L., Perdices, M., Rosenkoetter, U., Wakim, D., Godbee, K., Togher, L., & McDonald, S. (2013). Revision of a method quality rating scale for single-case experimental designs and n-of-1 trials: The 15-item Risk of Bias in N-of-1 Trials (RoBiNT) Scale. Neuropsychological Rehabilitation, 23 , 619-638. https://doi.org/10.1080/09602011.2013.824383 .

Van den Noortgate, W., & Onghena, P. (2003). Hierarchical linear models for the quantitative integration of effect sizes in single-case research. Behavior Research Methods, Instruments, & Computers, 35 , 1-10. https://doi.org/10.3758/bf03195492 .

Van den Noortgate, W., & Onghena, P. (2008). A multilevel meta-analysis of single-subject experimental design studies. Evidence-Based Communication Assessment and Intervention, 2 , 142-151. https://doi.org/10.1080/17489530802505362 .

Vohra, S., Shamseer, L., Sampson, M., Bukutu, C., Schmid, C. H., Tate, R., … Group, TC (2016). CONSORT extension for reporting N-of-1 trials (CENT) 2015 statement. Journal of Clinical Epidemiology, 76 , 9–17. https://doi.org/10.1016/j.jclinepi.2015.05.004 .

Wampold, B., & Worsham, N. (1986). Randomization tests for multiple-baseline designs. Behavioral Assessment, 8 , 135-143.

What Works Clearinghouse. (2020a). Procedures Handbook (Version 4.1). Retrieved from Institute of Education Sciences: https://ies.ed.gov/ncee/wwc/Docs/referenceresources/WWC-Procedures-Handbook-v4-1-508.pdf

What Works Clearinghouse. (2020b). Responses to comments from the public on updated version 4.1 of the WWC Procedures Handbook and WWC Standards Handbook. Retrieved from Institute of Education Sciences: https://ies.ed.gov/ncee/wwc/Docs/referenceresources/SumResponsePublicComments-v4-1-508.pdf

What Works Clearinghouse. (2020c). Standards Handbook, version 4.1. Retrieved from Institute of Education Sciences: https://ies.ed.gov/ncee/wwc/Docs/referenceresources/WWC-Standards-Handbook-v4-1-508.pdf

White, D. M., Rusch, F. R., Kazdin, A. E., & Hartmann, D. P. (1989). Applications of meta-analysis in individual-subject research. Behavioral Assessment, 11 , 281-296.

Wolery, M. (2013). A commentary: Single-case design technical document of the What Works Clearinghouse. Remedial and Special Education , 39-43. https://doi.org/10.1177/0741932512468038 .

Woo, H., Lu, J., Kuo, P., & Choi, N. (2016). A content analysis of articles focusing on single-case research design: ACA journals between 2003 and 2014. Asia Pacific Journal of Counselling and Psychotherapy, 7 , 118-132. https://doi.org/10.1080/21507686.2016.1199439 .


Author information

Authors and Affiliations

Faculty of Psychology and Educational Sciences, Methodology of Educational Sciences Research Group, KU Leuven, Tiensestraat 102, Box 3762, B-3000, Leuven, Belgium

René Tanious & Patrick Onghena


Corresponding author

Correspondence to René Tanious.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

(DOCX 110 kb)


About this article

Tanious, R., & Onghena, P. A systematic review of applied single-case research published between 2016 and 2018: Study designs, randomization, data aspects, and data analysis. Behav Res 53, 1371–1384 (2021). https://doi.org/10.3758/s13428-020-01502-4


Accepted: 09 October 2020

Published: 26 October 2020

Issue Date: August 2021

DOI: https://doi.org/10.3758/s13428-020-01502-4


Keywords

  • Single-case experimental designs
  • Visual analysis
  • Statistical analysis
  • Data aspects
  • Systematic review