
Design-Based Research

In educational settings, design-based research (DBR) is a research approach that uses iterative design to develop knowledge that improves educational practices. This chapter provides a brief overview of the origin, paradigms, outcomes, and processes of DBR. In these sections we explain that (a) DBR originated because some researchers believed that traditional research methods failed to improve classroom practices, (b) DBR positions researchers as agents of change and research subjects as collaborators, (c) DBR produces both new designs and new theories, and (d) DBR consists of an iterative process of design and evaluation to develop knowledge.

Origin of DBR

DBR originated as researchers like Allan Collins (1990) and Ann Brown (1992) recognized that educational research often failed to improve classroom practices. They perceived that much of educational research was conducted in controlled, laboratory-like settings. They believed that this laboratory research was not as helpful as possible for practitioners.

Proponents of DBR claim that educational research is often detached from practice (The Design-Based Research Collective, 2003). At least two problems arise from this detachment: (a) practitioners do not benefit from researchers’ work, and (b) research results may be inaccurate because they fail to account for context (The Design-Based Research Collective, 2003).

Practitioners do not benefit from researchers’ work if the research is detached from practice. Practitioners benefit from research when they can see how it informs and improves their own designs and practices. Some practitioners believe that educational research is often too abstract or sterilized to be useful in real contexts (The Design-Based Research Collective, 2003).

Not only is lack of relevance a problem, but research results can also be inaccurate by failing to account for context. Findings and theories based on lab results may not accurately reflect what happens in real-world educational settings.

Conversely, a problem that arises from an overemphasis on practice is that while individual practices may improve, the general body of theory and knowledge does not increase. Scholars like Collins (1990) and Brown (1992) believed that the best way to conduct research would be to achieve the right balance between theory-building and practical impact.

Paradigms of DBR

Proponents of DBR believe that conducting research in context, rather than in a controlled laboratory setting, and iteratively designing interventions yield authentic and useful knowledge. Sasha Barab and Kurt Squire (2004) state that the goal of DBR is to “directly impact practice while advancing theory that will be of use to others” (p. 8). This implies “a pragmatic philosophical underpinning, one in which the value of a theory lies in its ability to produce changes in the world” (p. 6). The aims of DBR and the roles of researchers and subjects are informed by this philosophical underpinning.

Aims of DBR

Traditional, experimental research is conducted by theorists focused on isolating variables to test and refine theory. DBR is conducted by designers focused on (a) understanding contexts, (b) designing effective systems, and (c) making meaningful changes for the subjects of their studies (Barab & Squire, 2004; Collins, 1990). Traditional methods of research generate refined understandings of how the world works, which may indirectly affect practice. DBR, by contrast, builds intentionality into the research process to refine both theory and practice (Collins et al., 2004).

Role of DBR Researcher

In DBR, researchers assume the roles of “curriculum designers, and implicitly, curriculum theorists” (Barab & Squire, 2004, p. 2). As curriculum designers, DBR researchers come into their contexts as informed experts with the purpose of creating and then “test[ing] and refin[ing] educational designs based on principles derived from prior research” (Collins et al., 2004, p. 15). These educational designs may include curricula, practices, software, or tangible objects beneficial to the learning process (Barab & Squire, 2004). As curriculum theorists, DBR researchers also come into their research contexts with the purpose of refining extant theories about learning (Brown, 1992).

This duality of roles gives DBR researchers a greater sense of responsibility and accountability within the field. Traditional, experimental researchers isolate themselves from the subjects of their study (Barab & Squire, 2004). This separation is seen as a virtue, allowing researchers to make dispassionate observations as they test and refine their understandings of the world around them. In comparison, design-based researchers “bring agendas to their work,” see themselves as necessary agents of change, and hold themselves accountable for the work they do (Barab & Squire, 2004, p. 2).

Role of DBR Subjects

Within DBR, research subjects are seen as key contributors and collaborators in the research process. Classic experimentalism views the subjects of research as things to be observed or experimented on, suggesting a unidirectional relationship between researcher and research subject. The role of the research subject is to be available and genuine so that the researcher can make meaningful observations and collect accurate data. In contrast, design-based researchers view the subjects of their research (e.g., students, teachers, schools) as “co-participants” (Barab & Squire, 2004, p. 3) and “co-investigators” (Collins, 1990, p. 4). Research subjects are seen as necessary in “helping to formulate the questions,” “making refinements in the designs,” “evaluating the effects of...the experiment,” and “reporting the results of the experiment to other teachers and researchers” (Collins, 1990, pp. 4–5). Research subjects are co-workers with the researcher in iteratively pushing the study forward.

Outcomes of DBR

DBR develops knowledge through this collaborative, iterative research process. The knowledge developed by DBR can be separated into two categories: (a) tangible, practical outcomes and (b) intangible, theoretical outcomes.

Tangible Outcomes

A major goal of design-based research is producing meaningful interventions and practices. Within educational research these interventions may “involve the development of technological tools [and] curricula” (Barab & Squire, 2004, p. 1). But more than just producing meaningful educational products for a specific context, DBR aims to produce meaningful, effective educational products that can be transferred and adapted (Barab & Squire, 2004). As expressed by Brown (1992), “an effective intervention should be able to migrate from our experimental classroom to average classrooms operated by and for average students and teachers” (p. 143).

Intangible Outcomes

It is important to recognize that DBR is not only concerned with improving practice but also aims to advance theory and understanding (Collins et al., 2004). DBR’s emphasis on the importance of context enhances the knowledge claims of the research. “Researchers investigate cognition in context...with the broad goal of developing evidence-based claims derived from both laboratory-based and naturalistic investigations that result in knowledge about how people learn” (Barab & Squire, 2004, p. 1). This new knowledge about learning then drives future research and practice.

Process of DBR

A hallmark of DBR is the iterative nature of its interventions. As each iteration progresses, researchers refine and rework the intervention, drawing on the research methods that best fit the context. This flexibility allows the end result to take precedence over any fixed procedure. While each researcher may use different methods, McKenney and Reeves (2012) outlined three core processes of DBR: (a) analysis and exploration, (b) design and construction, and (c) evaluation and reflection. To put these ideas in context, we will refer to a DBR study completed by Siko and Barbour (2016) regarding the use of PowerPoint games in the classroom.

DBR Cycle

Analysis and Exploration

Analysis is a critical aspect of DBR and must be used throughout the entire process. At the start of a DBR project, it is critical to understand and define which problem will be addressed. In collaboration with practitioners, researchers seek to understand all aspects of a problem. Additionally, they “seek out and learn from how others have viewed and solved similar problems” (McKenney & Reeves, 2012, p. 85). This analysis helps to provide an understanding of the context within which to execute an intervention.

Since theories cannot account for the variety of variables in a learning situation, exploration is needed to fill the gaps. DBR researchers can draw from a number of disciplines and methodologies as they execute an intervention. The decision of which methodologies to use should be driven by the research context and goals.

Siko and Barbour (2016) used the DBR process to address a gap they found in research regarding the effectiveness of having students create their own PowerPoint games to review for a test. In analyzing existing research, they found studies that stated teaching students to create their own PowerPoint games did not improve content retention. Siko and Barbour wanted to “determine if changes to the implementation protocol would lead to improved performance” (Siko & Barbour, 2016, p. 420). They chose to test their theory in three different phases and adapt the curriculum following each phase.

Design and Construction

Informed by the analysis and exploration, researchers design and construct interventions, which may be a specific technology or “less concrete aspects such as activity structures, institutions, scaffolds, and curricula” (Design-Based Research Collective, 2003, pp. 5–6). This process involves laying out a variety of options for a solution and then creating the idea with the most promise.

In their design, Siko and Barbour planned three phases, each comparing a control group and a treatment group. In each phase, t-tests would compare the two groups’ scores on two unit tests. They worked with teachers to set aside time for playing PowerPoint games as well as a discussion of what makes games successful. The first implementation was a control phase that replicated past research and established a baseline. Once that phase was finished, they began to evaluate.
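
To make this analysis concrete, here is a minimal sketch of the kind of independent-samples t-test described above, assuming hypothetical unit test scores; the data and variable names are illustrative, not Siko and Barbour's.

```python
# Minimal sketch: comparing control and treatment unit test scores with a t-test.
# Scores below are invented for illustration only.
from scipy import stats

control_scores = [72, 65, 80, 58, 77, 69, 74, 63]    # control group's unit test scores
treatment_scores = [78, 70, 85, 66, 81, 75, 79, 71]  # game-creation group's unit test scores

t_stat, p_value = stats.ttest_ind(treatment_scores, control_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # a small p-value suggests a group difference
```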

Evaluation and Reflection

Researchers can evaluate their designs both before and after use. The cyclical process involves careful, constant evaluation for each iteration so that improvements can be made. While tests and quizzes are a standard way of evaluating educational progress, interviews and observations also play a key role, as they allow for better understanding of how teachers and students might see the learning situation.

Reflection allows the researcher to make connections between actions and results. Researchers must take the time to analyze which changes led to success or failure so that theory and practice at large can benefit. Collins (1990) states:

It is important to analyze the reasons for failure and to take steps to fix them. It is critical to document the nature of the failures and the attempted revisions, as well as the overall results of the experiment, because this information informs the path to success. (p. 5)

As researchers reflect on each change they made, they find what is most useful to the field at large, whether it be a failure or a success.

After evaluating results of the first phase, Siko and Barbour revisited the literature on instructional games. Based on that research, they first tried extending the length of time students spent creating the games. They also discovered that the students struggled to design effective test questions, so the researchers worked with teachers to spend more time explaining how to ask good questions. As they explored these options, the researchers saw unit test scores improve.

Reflection on how the study was conducted allowed the researchers to properly place their experiences within the context of existing research. They recognized that while they found positive impacts as a result of their intervention, the study had a number of limitations. This is an important realization, as it keeps readers from misinterpreting the scope of the findings.

This chapter has provided a brief overview of the origin, paradigms, outcomes, and processes of design-based research (DBR). We explained that (a) DBR originated because some researchers believed that traditional research methods failed to improve classroom practices, (b) DBR positions researchers as agents of change and research subjects as collaborators, (c) DBR produces both new designs and new theories, and (d) DBR consists of an iterative process of design and evaluation to develop knowledge.

Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. Journal of the Learning Sciences, 13(1), 1–14.

Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences, 2(2), 141–178.

Collins, A. (1990). Toward a design science of education (Report No. 1). Washington, DC: Center for Technology in Education.

Collins, A., Joseph, D., & Bielaczyc, K. (2004). Design research: Theoretical and methodological issues. Journal of the Learning Sciences, 13(1), 15–42.

McKenney, S., & Reeves, T. C. (2012). Conducting educational design research. New York, NY: Routledge.

Siko, J. P., & Barbour, M. K. (2016). Building a better mousetrap: How design-based research was used to improve homemade PowerPoint games. TechTrends, 60(5), 419–424.

The Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32(1), 5–8.

This content is provided to you freely by BYU Open Learning Network.

Access it online or download it at https://open.byu.edu/education_research/design_based_research.

Design-Based Research: A Methodology to Extend and Enrich Biology Education Research

  • Emily E. Scott
  • Mary Pat Wenderoth
  • Jennifer H. Doherty

*Address correspondence to: Emily E. Scott ([email protected]).

Department of Biology, University of Washington, Seattle, WA 98195

Recent calls in biology education research (BER) have recommended that researchers leverage learning theories and methodologies from other disciplines to investigate the mechanisms by which students develop sophisticated ideas. We suggest design-based research from the learning sciences is a compelling methodology for achieving this aim. Design-based research investigates the “learning ecologies” that move student thinking toward mastery. These “learning ecologies” are grounded in theories of learning, produce measurable changes in student learning, generate design principles that guide the development of instructional tools, and are enacted using extended, iterative teaching experiments. In this essay, we introduce readers to the key elements of design-based research, using our own research into student learning in undergraduate physiology as an example of design-based research in BER. Then, we discuss how design-based research can extend work already done in BER and foster interdisciplinary collaborations among cognitive and learning scientists, biology education researchers, and instructors. We also explore some of the challenges associated with this methodological approach.

INTRODUCTION

There have been recent calls for biology education researchers to look toward other fields of educational inquiry for theories and methodologies to advance, and expand, our understanding of what helps students learn to think like biologists (Coley and Tanner, 2012; Dolan, 2015; Peffer and Renken, 2016; Lo et al., 2019). These calls include the recommendations that biology education researchers ground their work in learning theories from the cognitive and learning sciences (Coley and Tanner, 2012) and begin investigating the underlying mechanisms by which students develop sophisticated biology ideas (Dolan, 2015; Lo et al., 2019). Design-based research from the learning sciences is one methodology that seeks to do both by using theories of learning to investigate how “learning ecologies”—that is, complex systems of interactions among instructors, students, and environmental components—support the process of student learning (Brown, 1992; Cobb et al., 2003; Collins et al., 2004; Peffer and Renken, 2016).

The purpose of this essay is twofold. First, we want to introduce readers to the key elements of design-based research, using our research into student learning in undergraduate physiology as an example of design-based research in biology education research (BER). Second, we will discuss how design-based research can extend work already done in BER and explore some of the challenges of its implementation. For a more in-depth review of design-based research, we direct readers to the following references: Brown (1992), Barab and Squire (2004), and Collins et al. (2004), as well as commentaries by Anderson and Shattuck (2012) and McKenney and Reeves (2013).

WHAT IS DESIGN-BASED RESEARCH?

Design-based research is a methodological approach that aligns with research methods from the fields of engineering or applied physics, where products are designed for specific purposes (Brown, 1992; Joseph, 2004; Middleton et al., 2008; Kelly, 2014). Consequently, investigators using design-based research approach educational inquiry much as an engineer develops a new product: First, the researchers identify a problem that needs to be addressed (e.g., a particular learning challenge that students face). Next, they design a potential “solution” to the problem in the form of instructional tools (e.g., reasoning strategies, worksheets; e.g., Reiser et al., 2001) that theory and previous research suggest will address the problem. Then, the researchers test the instructional tools in a real-world setting (i.e., the classroom) to see if the tools positively impact student learning. As testing proceeds, researchers evaluate the instructional tools with emerging evidence of their effectiveness (or lack thereof) and progressively revise the tools—in real time—as necessary (Collins et al., 2004). Finally, the researchers reflect on the outcomes of the experiment, identifying the features of the instructional tools that were successful at addressing the initial learning problem, revising those aspects that were not helpful to learning, and determining how the research informed the theory underlying the experiment. This leads to another research cycle of designing, testing, evaluating, and reflecting to refine the instructional tools in support of student learning. We have characterized this iterative process in Figure 1 after Sandoval (2014). Though we have portrayed four discrete phases to design-based research, there is often overlap of the phases as the research progresses (e.g., testing and evaluating can occur simultaneously).

FIGURE 1. The four phases of design-based research experienced in an iterative cycle (A). We also highlight the main features of each phase of our design-based research project investigating students’ use of flux in physiology (B).

Design-based research has no specific requirements for the form that instructional tools must take or the manner in which the tools are evaluated (Bell, 2004; Anderson and Shattuck, 2012). Instead, design-based research has what Sandoval (2014) calls “epistemic commitments” that inform the major goals of a design-based research project as well as how it is implemented. These epistemic commitments are:

1) Design-based research should be grounded in theories of learning (e.g., constructivism, knowledge-in-pieces, conceptual change) that both inform the design of the instructional tools and are improved upon by the research (Cobb et al., 2003; Barab and Squire, 2004). This makes design-based research more than a method for testing whether or not an instructional tool works; it also investigates why the design worked and how it can be generalized to other learning environments (Cobb et al., 2003).

2) Design-based research should aim to produce measurable changes in student learning in classrooms around a particular learning problem (Anderson and Shattuck, 2012; McKenney and Reeves, 2013). This requirement ensures that theoretical research into student learning is directly applicable, and impactful, to students and instructors in classroom settings (Hoadley, 2004).

3) Design-based research should generate design principles that guide the development and implementation of future instructional tools (Edelson, 2002). This commitment makes the research findings broadly applicable for use in a variety of classroom environments.

4) Design-based research should be enacted using extended, iterative teaching experiments in classrooms. By observing student learning over an extended period of time (e.g., throughout an entire term or across terms), researchers are more likely to observe the full effects of how the instructional tools impact student learning compared with short-term experiments (Brown, 1992; Barab and Squire, 2004; Sandoval and Bell, 2004).

HOW IS DESIGN-BASED RESEARCH DIFFERENT FROM AN EXPERIMENTAL APPROACH?

Many BER studies employ experimental approaches that align with traditional scientific methods of experimentation, such as using treatment versus control groups, randomly assigning treatments to different groups, replicating interventions across multiple spatial or temporal periods, and using statistical methods to guide the kinds of inferences that arise from an experiment. While design-based research can similarly employ these strategies for educational inquiry, there are also some notable differences in its approach to experimentation (Collins et al., 2004; Hoadley, 2004). In this section, we contrast the differences between design-based research and what we call “experimental approaches,” although both paradigms represent a form of experimentation.

The first difference between an experimental approach and design-based research regards the role participants play in the experiment. In an experimental approach, the researcher is responsible for making all the decisions about how the experiment will be implemented and analyzed, while the instructor facilitates the experimental treatments. In design-based research, both researchers and instructors are engaged in all stages of the research, from conception to reflection (Collins et al., 2004). In BER, a third condition frequently arises wherein the researcher is also the instructor. In this case, if the research questions being investigated produce generalizable results that have the potential to impact teaching broadly, then this is consistent with a design-based research approach (Cobb et al., 2003). However, when the research questions are self-reflective about how a researcher/instructor can improve his or her own classroom practices, this aligns more closely with “action research,” which is another methodology used in education research (see Stringer, 2013).

A second difference between experimental research and design-based research is the form that hypotheses take and the manner in which they are investigated (Collins et al., 2004; Sandoval, 2014). In experimental approaches, researchers develop a hypothesis about how a specific instructional intervention will impact student learning. The intervention is then tested in the classroom(s) while controlling for other variables that are not part of the study in order to isolate the effects of the intervention. Sometimes, researchers designate a “control” situation that serves as a comparison group that does not experience the intervention. For example, Jackson et al. (2018) were interested in comparing peer- and self-grading of weekly practice exams to determine whether they were equally effective forms of deliberate practice for students in a large-enrollment class. To test this, the authors (including two authors of this essay, J.H.D. and M.P.W.) designed an experiment in which lab sections of students in a large lecture course were randomly assigned to either a peer-grading or self-grading treatment so they could isolate the effects of each intervention. In design-based research, a hypothesis is conceptualized as the “design solution” rather than a specific intervention; that is, design-based researchers hypothesize that the designed instructional tools, when implemented in the classroom, will create a learning ecology that improves student learning around the identified learning problem (Edelson, 2002; Bell, 2004). For example, Zagallo et al. (2016) developed a laboratory curriculum (i.e., the hypothesized “design solution”) for molecular and cellular biology majors to address the learning problem that students often struggle to connect scientific models and empirical data. This curriculum entailed: focusing instruction around a set of target biological models; developing small-group activities in which students interacted with the models by analyzing data from scientific papers; using formative assessment tools for student feedback; and providing students with a set of learning objectives they could use as study tools. They tested their curriculum in a novel, large-enrollment course of upper-division students over several years, making iterative changes to the curriculum as the study progressed.
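
As an illustration of the random-assignment step in the Jackson et al. (2018) example above, here is a minimal sketch that assigns lab sections to the two grading treatments; the section labels and seed are hypothetical, not from the study.

```python
# Minimal sketch: randomly assigning lab sections to peer- or self-grading.
# Section labels are invented for illustration.
import random

random.seed(42)  # fixed seed so the example assignment is reproducible
sections = ["Lab-01", "Lab-02", "Lab-03", "Lab-04", "Lab-05", "Lab-06"]
random.shuffle(sections)

half = len(sections) // 2
assignment = {section: "peer-grading" for section in sections[:half]}
assignment.update({section: "self-grading" for section in sections[half:]})
print(assignment)
```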

By framing the research approach as an iterative endeavor of progressive refinement rather than a test of a particular intervention when all other variables are controlled, design-based researchers recognize that: 1) classrooms, and classroom experiences, are unique at any given time, making it difficult to truly “control” the environment in which an intervention occurs or establish a “control group” that differs only in the features of an intervention; and 2) many aspects of a classroom experience may influence the effectiveness of an intervention, often in unanticipated ways, which should be included in the research team’s analysis of an intervention’s success. Consequently, the research team is less concerned with controlling the research conditions—as in an experimental approach—and instead focuses on characterizing the learning environment (Barab and Squire, 2004). This involves collecting data from multiple sources as the research progresses, including how the instructional tools were implemented, aspects of the implementation process that failed to go as planned, and how the instructional tools or implementation process was modified. These characterizations can provide important insights into what specific features of the instructional tools, or the learning environment, were most impactful to learning (DBR Collective, 2003).

A third difference between experimental approaches and design-based research is when the instructional interventions can be modified. In experimental research, the intervention is fixed throughout the experimental period, with any revisions occurring only after the experiment has concluded. This is critical for ensuring that the results of the study provide evidence of the efficacy of a specific intervention. By contrast, design-based research takes a more flexible approach that allows instructional tools to be modified in situ as they are being implemented (Hoadley, 2004; Barab, 2014). This flexibility allows the research team to modify instructional tools or strategies that prove inadequate for collecting the evidence necessary to evaluate the underlying theory and ensures a tight connection between interventions and a specific learning problem (Collins et al., 2004; Hoadley, 2004).

Finally, and importantly, experimental approaches and design-based research differ in the kinds of conclusions they draw from their data. Experimental research can “identify that something meaningful happened; but [it is] not able to articulate what about the intervention caused that story to unfold” (Barab, 2014, p. 162). In other words, experimental methods are robust for identifying where differences in learning occur, such as between groups of students experiencing peer- or self-grading of practice exams (Jackson et al., 2018) or receiving different curricula (e.g., Chi et al., 2012). However, these methods are not able to characterize the underlying learning process or mechanism involved in the different learning outcomes. By contrast, design-based research has the potential to uncover mechanisms of learning, because it investigates how the nature of student thinking changes as students experience instructional interventions (Shavelson et al., 2003; Barab, 2014). According to Sandoval (2014), “Design research, as a means of uncovering causal processes, is oriented not to finding effects but to finding functions, to understanding how desired (and undesired) effects arise through interactions in a designed environment” (p. 30). In Zagallo et al. (2016), the authors found that their curriculum supported students’ data-interpretation skills, because it stimulated students’ spontaneous use of argumentation, during which group members coconstructed evidence-based claims from the data provided. Students also worked collaboratively to decode figures and identify data patterns. These strategies were identified from the researchers’ qualitative data analysis of in-class recordings of small-group discussions, which allowed them to observe what students were doing to support their learning. Because design-based research is focused on characterizing how learning occurs in classrooms, it can begin to answer the kinds of mechanistic questions others have identified as central to advancing BER (National Research Council [NRC], 2012; Dolan, 2015; Lo et al., 2019).

DESIGN-BASED RESEARCH IN ACTION: AN EXAMPLE FROM UNDERGRADUATE PHYSIOLOGY

To illustrate how design-based research could be employed in BER, we draw on our own research that investigates how students learn physiology. We will characterize one iteration of our design-based research cycle (Figure 1), emphasizing how our project uses Sandoval’s four epistemic commitments (i.e., theory driven, practically applied, generating design principles, implemented in an iterative manner) to guide our implementation.

Identifying the Learning Problem

Understanding physiological phenomena is challenging for students, given the wide variety of contexts (e.g., cardiovascular, neuromuscular, respiratory; animal vs. plant) and scales involved (e.g., using molecular-level interactions to explain organism functioning; Wang, 2004; Michael, 2007; Badenhorst et al., 2016). To address these learning challenges, Modell (2000) identified seven “general models” that undergird most physiology phenomena (i.e., control systems, conservation of mass, mass and heat flow, elastic properties of tissues, transport across membranes, cell-to-cell communication, molecular interactions). Instructors can use these models as a “conceptual framework” to help students build intellectual coherence across phenomena and develop a deeper understanding of physiology (Modell, 2000; Michael et al., 2009). This approach aligns with theoretical work in the learning sciences that indicates that providing students with conceptual frameworks improves their ability to integrate and retrieve knowledge (National Academies of Sciences, Engineering, and Medicine, 2018).

Before the start of our design-based project, we had been using Modell’s (2000) general models to guide our instruction. In this essay, we will focus on how we used the general models of mass and heat flow and transport across membranes in our instruction. These two models together describe how materials flow down gradients (e.g., pressure gradients, electrochemical gradients) against sources of resistance (e.g., tube diameter, channel frequency). We call this flux reasoning. We emphasized the fundamental nature and broad utility of flux reasoning in lecture and lab and frequently highlighted when it could be applied to explain a phenomenon. We also developed a conceptual scaffold (the Flux Reasoning Tool) that students could use to reason about physiological processes involving flux.

Although these instructional approaches had improved students’ understanding of flux phenomena, we found that students often demonstrated little commitment to using flux broadly across physiological contexts. Instead, they considered flux to be just another fact to memorize and applied it to narrow circumstances (e.g., they would use flux to reason about ions flowing across membranes—the context where flux was first introduced—but not the bulk flow of blood in a vessel). Students also struggled to integrate the various components of flux (e.g., balancing chemical and electrical gradients, accounting for variable resistance). We saw these issues reflected in students’ lower-than-hoped-for scores on the cumulative final exam of the course. From these experiences, and from conversations with other physiology instructors, we identified a learning problem to address through design-based research: How do students learn to use flux reasoning to explain material flows in multiple physiology contexts?

The process of identifying a learning problem usually emerges from a researcher’s own experiences (in or outside a classroom) or from previous research that has been described in the literature (Cobb et al., 2003). To remain true to Sandoval’s first epistemic commitment, a learning problem must advance a theory of learning (Edelson, 2002; McKenney and Reeves, 2013). In our work, we investigated how conceptual frameworks based on fundamental scientific concepts (i.e., Modell’s general models) could help students reason productively about physiology phenomena (Modell, 2000; National Academies of Sciences, Engineering, and Medicine, 2018). Our specific theoretical question was: Can we characterize how students’ conceptual frameworks around flux change as they work toward robust ideas? Sandoval’s second epistemic commitment states that a learning problem must aim to improve student learning outcomes. The practical significance of our learning problem was: Does using the concept of flux as a foundational idea for instructional tools increase students’ learning of physiological phenomena?

We investigated our learning problem in an introductory biology course at a large R1 institution. The introductory course is the third in a biology sequence that focuses on plant and animal physiology. The course typically serves between 250 and 600 students in their sophomore or junior years each term. Classes have the following average demographics: 68% male, 21% from lower-income situations, 12% from an underrepresented minority, and 26% first-generation college students.

Design-Based Research Cycle 1, Phase 1: Designing Instructional Tools

The first phase of design-based research involves developing instructional tools that address both the theoretical and practical concerns of the learning problem (Edelson, 2002; Wang and Hannafin, 2005). These instructional tools can take many forms, such as specific instructional strategies, classroom worksheets and practices, or technological software, as long as they embody the underlying learning theory being investigated. They must also produce classroom experiences or materials that can be evaluated to determine whether learning outcomes were met (Sandoval, 2014). Indeed, this alignment between theory, the nature of the instructional tools, and the ways students are assessed is central to ensuring rigorous design-based research (Hoadley, 2004; Sandoval, 2014). Taken together, the instructional tools instantiate a hypothesized learning environment that will advance both the theoretical and practical questions driving the research (Barab, 2014).

In our work, the theoretical claim that instruction based on fundamental scientific concepts would support students’ flux reasoning was embodied in our instructional approach by being the central focus of all instructional materials, which included: a revised version of the Flux Reasoning Tool (Figure 2); case study–based units in lecture that explicitly emphasized flux phenomena in real-world contexts (Windschitl et al., 2012; Scott et al., 2018; Figure 3); classroom activities in which students practiced using flux to address physiological scenarios; links to online videos describing key flux-related concepts; constructed-response assessment items that cued students to use flux reasoning in their thinking; and pretest/posttest formative assessment questions that tracked student learning (Figure 4).

FIGURE 2. The Flux Reasoning Tool given to students at the beginning of the quarter.

FIGURE 3. An example flux case study that is presented to students at the beginning of the neurophysiology unit. Throughout the unit, students learn how ion flows into and out of cells, as mediated by chemical and electrical gradients and various ion/molecular channels, sends signals throughout the body. They use this information to better understand why Jaime experiences persistent neuropathy. Images from: uz.wikipedia.org/wiki/Fayl:Blausen_0822_SpinalCord.png and commons.wikimedia.org/wiki/File:Figure_38_01_07.jpg.

FIGURE 4. An example flux assessment question about ion flows given in a pre-unit/post-unit formative assessment in the neurophysiology unit.

Phase 2: Testing the Instructional Tools

In the second phase of design-based research, the instructional tools are tested by implementing them in classrooms. During this phase, the instructional tools are placed “in harm’s way … in order to expose the details of the process to scrutiny” (Cobb et al., 2003, p. 10). In this way, researchers and instructors test how the tools perform in real-world settings, which may differ considerably from the design team’s initial expectations (Hoadley, 2004). During this phase, if necessary, the design team may make adjustments to the tools as they are being used to account for these unanticipated conditions (Collins et al., 2004).

We implemented the instructional tools during the Autumn and Spring quarters of the 2016–2017 academic year. Students were taught to use the Flux Reasoning Tool at the beginning of the term in the context of the first case study unit focused on neurophysiology. Each physiology unit throughout the term was associated with a new concept-based case study (usually about flux) that framed the context of the teaching. Embedded within the daily lectures were classroom activities in which students could practice using flux. Students were also assigned readings from the textbook and videos related to flux to watch during each unit. Throughout the term, students took five exams that each contained some flux questions as well as some pre- and post-unit formative assessment questions. During Winter quarter, we conducted clinical interviews with students who would take our course in the Spring term (i.e., “pre” data) as well as students who had just completed our course in Autumn (i.e., “post” data).

Phase 3: Evaluating the Instructional Tools

The third phase of a design-based research cycle involves evaluating the effectiveness of instructional tools using evidence of student learning (Barab and Squire, 2004; Anderson and Shattuck, 2012). This can be done using products produced by students (e.g., homework, lab reports), attitudinal gains measured with surveys, participation rates in activities, interview testimonials, classroom discourse practices, and formative assessment or exam data (e.g., Reiser et al., 2001; Cobb et al., 2003; Barab and Squire, 2004; Mohan et al., 2009). Regardless of the source, evidence must be in a form that supports a systematic analysis that could be scrutinized by other researchers (Cobb et al., 2003; Barab, 2014). Also, because design-based research often involves multiple data streams, researchers may need to use both quantitative and qualitative analytical methods to produce a rich picture of how the instructional tools affected student learning (Collins et al., 2004; Anderson and Shattuck, 2012).

In our work, we used the quality of students’ written responses on exams and formative assessment questions to determine whether students improved their understanding of physiological phenomena involving flux. For each assessment question, we analyzed a subset of students’ pretest answers to identify overarching patterns in students’ reasoning about flux, characterized these overarching patterns, and then ordinated the patterns into different levels of sophistication. These became our scoring rubrics, which identified five different levels of student reasoning about flux. We used the rubrics to code the remainder of students’ responses, with a code designating the level of student reasoning associated with a particular reasoning pattern. We used this ordinal rubric format because it would later inform our theoretical understanding of how students build flux conceptual frameworks (see phase 4). This also allowed us to both characterize the ideas students held about flux phenomena and identify the frequency distribution of those ideas in a class.

By analyzing changes in the frequency distributions of students’ ideas across the rubric levels at different time points in the term (e.g., pre-unit vs. post-unit), we could track both the number of students who gained more sophisticated ideas about flux as the term progressed and the quality of those ideas. If the frequency of students reasoning at higher levels increased from pre-unit to post-unit assessments, we could conclude that our instructional tools as a whole were supporting students’ development of sophisticated flux ideas. For example, on one neuromuscular ion flux assessment question in the Spring of 2017, we found that relatively more students were reasoning at the highest levels of our rubric (i.e., levels 4 and 5) on the post-unit test compared with the pre-unit test. This meant that more students were beginning to integrate sophisticated ideas about flux (i.e., they were balancing concentration and electrical gradients) in their reasoning about ion movement.
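
A minimal sketch of this pre/post rubric-level analysis follows, assuming hypothetical coded responses; the rubric levels and counts below are invented for illustration and are not the study's data.

```python
# Minimal sketch: comparing pre- vs. post-unit distributions of rubric levels.
# Each list holds one coded rubric level (1-5) per student response; values are invented.
from collections import Counter

pre_levels = [1, 2, 2, 3, 1, 2, 3, 4, 2, 1]    # levels coded from pre-unit responses
post_levels = [3, 4, 4, 5, 2, 4, 5, 4, 3, 2]   # levels coded from post-unit responses

def share_at_or_above(levels, threshold=4):
    """Fraction of responses at or above a rubric level (e.g., levels 4-5)."""
    return sum(1 for level in levels if level >= threshold) / len(levels)

print("pre-unit distribution: ", sorted(Counter(pre_levels).items()))
print("post-unit distribution:", sorted(Counter(post_levels).items()))
print(f"levels 4-5: pre {share_at_or_above(pre_levels):.0%}, "
      f"post {share_at_or_above(post_levels):.0%}")
```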

To help validate this finding, we drew on three additional data streams: 1) from in-class group recordings of students working with flux items, we noted that students increasingly incorporated ideas about gradients and resistance when constructing their explanations as the term progressed; 2) from plant assessment items in the latter part of the term, we began to see students using flux ideas unprompted; and 3) from interviews, we observed that students who had already taken the course used flux ideas in their reasoning.

Through these analyses, we also noticed an interesting pattern when comparing the pre-unit test data for Spring 2017 with the frequency distribution of students’ responses from a previous term (Autumn 2016). In Spring 2017, 42% of students reasoned at level 4 or 5 on the pre-unit test, indicating these students already had sophisticated ideas about ion flux before they took the pre-unit assessment. This was surprising, considering only 2% of students reasoned at these levels for this item on the Autumn 2016 pre-unit test.

Phase 4: Reflecting on the Instructional Tools and Their Implementation

The final phase of a design-based research cycle involves a retrospective analysis that addresses the epistemic commitments of this methodology: How was the theory underpinning the research advanced by the research endeavor (theoretical outcome)? Did the instructional tools support student learning about the learning problem (practical outcome)? What were the critical features of the design solution that supported student learning (design principles)? (Cobb et al., 2003; Barab and Squire, 2004).

Theoretical Outcome (Epistemic Commitment 1).

Reflecting on how a design-based research experiment advances theory is critical to our understanding of how students learn in educational settings (Barab and Squire, 2004; Mohan et al., 2009). In our work, we aimed to characterize how students’ conceptual frameworks around flux change as they work toward robust ideas. To do this, we drew on learning progression research as our theoretical framing (NRC, 2007; Corcoran et al., 2009; Duschl et al., 2011; Scott et al., 2019). Learning progression frameworks describe empirically derived patterns in student thinking that are ordered into levels representing cognitive shifts in the ways students conceive a topic as they work toward mastery (Gunckel et al., 2012). We used our ion flux scoring rubrics to create a preliminary five-level learning progression framework (Table 1). The framework describes how students’ ideas about flux often start with teleological-driven accounts at the lowest level (i.e., level 1), shift to focusing on driving forces (e.g., concentration gradients, electrical gradients) in the middle levels, and arrive at complex ideas that integrate multiple interacting forces at the higher levels. We further validated these reasoning patterns with our student interviews. However, our flux conceptual framework was largely based on student responses to our ion flux assessment items. Therefore, to further validate our learning progression framework, we needed a greater diversity of flux assessment items that investigated student thinking more broadly (i.e., about bulk flow, water movement) across physiological systems.

Practical Outcome (Epistemic Commitment 2).

In design-based research, learning theories must “do real work” by improving student learning in real-world settings (DBR Collective, 2003). Therefore, design-based researchers must reflect on whether or not the data they collected show evidence that the instructional tools improved student learning (Cobb et al., 2003; Sharma and McShane, 2008). We determined whether our flux-based instructional approach aided student learning by analyzing the kinds of answers students provided to our assessment questions. Specifically, we considered students who reasoned at level 4 or above as demonstrating productive flux reasoning. Because almost half of students were reasoning at level 4 or 5 on the post-unit assessment after experiencing the instructional tools in the neurophysiology unit (in Spring 2017), we concluded that our tools supported student learning in physiology. Additionally, we noticed that students used language in their explanations that directly tied to the Flux Reasoning Tool (Figure 2), which instructed them to use arrows to indicate the magnitude and direction of gradient-driving forces. For example, in a posttest response to our ion flux item (Figure 4), one student wrote:

Ion movement is a function of concentration and electrical gradients. Which arrow is stronger determines the movement of K+. We can make the electrical arrow bigger and pointing in by making the membrane potential more negative than Ek [i.e., potassium’s equilibrium potential]. We can make the concentration arrow bigger and pointing in by making a very strong concentration gradient pointing in.

Given that almost half of students reasoned at level 4 or above, and that students used language from the Flux Reasoning Tool, we concluded that using fundamental concepts was a productive instructional approach for improving student learning in physiology and that our instructional tools aided student learning. However, some students in the 2016–2017 academic year continued to apply flux ideas more narrowly than intended (i.e., for ion and simple diffusion cases, but not water flux or bulk flow). This suggested that students had developed nascent flux conceptual frameworks after experiencing the instructional tools but could use more support to realize the broad applicability of this principle. Also, although our cross-sectional interview approach demonstrated how students’ ideas, overall, could change after experiencing the instructional tools, it did not provide information about how a student developed flux reasoning.

Reflecting on practical outcomes also means interpreting any learning gains in the context of the learning ecology. This reflection allowed us to identify whether there were particular aspects of the instructional tools that were better at supporting learning than others (DBR Collective, 2003). Indeed, this was critical for understanding why 42% of students scored at level 4 or 5 on the pre-unit ion assessment in the Spring of 2017, while only 2% of students scored at those levels in Autumn of 2016. When we reviewed notes of the Spring 2017 implementation scheme, we saw that the pretest was due at the end of the first day of class, after students had been exposed to ion flux ideas in class and in a reading/video assignment about ion flow, which may be one reason for the students’ high performance on the pretest. Consequently, we could not tell whether students’ initial high performance was due to their learning from the activities in the first day of class or to other factors we did not measure. It also indicated we needed to close pretests before the first day of class for a more accurate measure of students’ incoming ideas and of the effectiveness of the instructional tools employed at the beginning of the unit.

Design Principles (Epistemic Commitment 3).

Although design-based research is enacted in local contexts (i.e., a particular classroom), its purpose is to inform learning ecologies that have broad applications to improve learning and teaching (Edelson, 2002; Cobb et al., 2003). Therefore, design-based research should produce design principles that describe characteristics of learning environments that researchers and instructors can use to develop instructional tools specific to their local contexts (e.g., Edelson, 2002; Subramaniam et al., 2015). Consequently, the design principles must balance specificity with adaptability so they can be used broadly to inform instruction (Collins et al., 2004; Barab, 2014).

From our first cycle of design-based research, we developed the following design principles:

1) Key scientific concepts should provide an overarching framework for course organization. This way, the individual components that make up a course, like instructional units, activities, practice problems, and assessments, all reinforce the centrality of the key concept.

2) Instructional tools should explicitly articulate the principle of interest, with specific guidance on how that principle is applied in context. This stresses the applied nature of the principle and that it is more than a fact to be memorized.

3) Instructional tools need to show specific instances of how the principle is applied in multiple contexts to combat students’ narrow application of the principle to a limited number of contexts.

Design-Based Research Cycle 2, Phase 1: Redesign and Refine the Experiment

The last “epistemic commitment” Sandoval (2014) articulated was that design-based research be an iterative process with an eye toward continually refining the instructional tools, based on evidence of student learning, to produce more robust learning environments. By viewing educational inquiry as formative research, design-based researchers recognize the difficulty in accounting for all variables that could impact student learning, or the implementation of the instructional tools, a priori (Collins et al., 2004). Robust instructional designs are the products of trial and error, which are strengthened by a systematic analysis of how they perform in real-world settings.

To continue to advance our work investigating student thinking using the principle of flux, we began a second cycle of design-based research that continued to address the learning problem of helping students reason with fundamental scientific concepts. In this cycle, we largely focused on broadening the number of physiological systems that had accompanying formative assessment questions (i.e., beyond ion flux), collecting student reasoning from a more diverse population of students (e.g., upper division, allied health, community college), and refining and validating the flux learning progression with both written and interview data from the same students through time. We developed a suite of constructed-response flux assessment questions that spanned neuromuscular, cardiovascular, respiratory, renal, and plant physiological contexts and asked students about several kinds of flux: ion movement, diffusion, water movement, and bulk flow (29 total questions; available at beyondmultiplechoice.org). This would provide us with rich qualitative data that we could use to refine the learning progression. We decided to administer written assessments and conduct interviews in a pretest/posttest manner at the beginning and end of each unit, both as a way to increase our data about student reasoning and to provide students with additional practice using flux reasoning across contexts.

From this second round of designing instructional tools (i.e., broader range of assessment items), testing them in the classroom (i.e., administering the assessment items to diverse student populations), evaluating the tools (i.e., developing learning progression–aligned rubrics across phenomena from student data, tracking changes in the frequency distribution of students across levels through time), and reflecting on the tools’ success, we would develop a more thorough and robust characterization of how students use flux across systems that could better inform our creation of new instructional tools to support student learning.

HOW CAN DESIGN-BASED RESEARCH EXTEND AND ENRICH BER?

While design-based research has primarily been used in educational inquiry at the K–12 level (see Reiser et al., 2001; Mohan et al., 2009; Jin and Anderson, 2012), other science disciplines at undergraduate institutions have begun to employ this methodology to create robust instructional approaches (e.g., Szteinberg et al., 2014, in chemistry; Hake, 2007, and Sharma and McShane, 2008, in physics; Kelly, 2014, in engineering). Our own work, as well as that by Zagallo et al. (2016), provides two examples of how design-based research could be implemented in BER. Below, we articulate some of the ways incorporating design-based research into BER could extend and enrich this field of educational inquiry.

Design-Based Research Connects Theory with Practice

One critique of BER is that it does not draw heavily enough on learning theories from other disciplines like cognitive psychology or the learning sciences to inform its research (Coley and Tanner, 2012; Dolan, 2015; Peffer and Renken, 2016; Davidesco and Milne, 2019). For example, there has been considerable work in BER developing concept inventories as formative assessment tools that identify concepts students often struggle to learn (e.g., Marbach-Ad et al., 2009; McFarland et al., 2017; Summers et al., 2018). However, much of this work is detached from a theoretical understanding of why students hold misconceptions in the first place, what the nature of their thinking is, and the learning mechanisms that would move students to a more productive understanding of domain ideas (Alonzo, 2011). Using design-based research to understand the basis of students’ misconceptions would ground these practical learning problems in a theoretical understanding of the nature of student thinking (e.g., see Coley and Tanner, 2012, 2015; Gouvea and Simon, 2018) and the kinds of instructional tools that would best support the learning process.

Design-Based Research Fosters Collaborations across Disciplines

Recently, there have been multiple calls across science, technology, engineering, and mathematics education fields to increase collaborations between BER and other disciplines so as to increase the robustness of science education research at the collegiate level (Coley and Tanner, 2012; NRC, 2012; Talanquer, 2014; Dolan, 2015; Peffer and Renken, 2016; Mestre et al., 2018; Davidesco and Milne, 2019). Engaging in design-based research provides both a mechanism and a motivation for fostering interdisciplinary collaborations, as it requires the design team to have theoretical knowledge of how students learn, domain knowledge of practical learning problems, and instructional knowledge of how to implement instructional tools in the classroom (Edelson, 2002; Hoadley, 2004; Wang and Hannafin, 2005; Anderson and Shattuck, 2012). For example, in our current work, our research team consists of two discipline-based education learning scientists from an R1 institution, two physiology education researchers/instructors (one from an R1 institution, the other from a community college), several physiology disciplinary experts/instructors, and a K–12 science education expert.

Design-based research collaborations have several distinct benefits for BER. First, learning or cognitive scientists could provide theoretical and methodological expertise that may be unfamiliar to biology education researchers with traditional science backgrounds (Lo et al., 2019). This would both improve the rigor of the research project and provide biology education researchers with the opportunity to explore ideas and methods from other disciplines. Second, collaborations between researchers and instructors could help increase the implementation of evidence-based teaching practices by instructors/faculty who are not education researchers and would benefit from support while shifting their instructional approaches (Eddy et al., 2015). This may be especially true for community college and primarily undergraduate institution faculty, who often do not have access to the same kinds of resources that researchers and instructors at research-intensive institutions do (Schinske et al., 2017). Third, making instructors an integral part of a design-based research project ensures they are well versed in the theory and learning objectives underlying the instructional tools they are implementing in the classroom. This can improve the fidelity of implementation of the instructional tools because the instructors understand the tools' theoretical and practical purposes; low implementation fidelity has been cited as one reason for the mixed results on the impact of active learning across biology classes (Andrews et al., 2011; Borrego et al., 2013; Lee et al., 2018; Offerdahl et al., 2018). It also gives instructors agency to make informed adjustments to the instructional tools during implementation that improve their practical applications while remaining true to the goals of the research (Hoadley, 2004).

Design-Based Research Invites Using Mixed Methods to Analyze Data

The diverse nature of the data that are often collected in design-based research can require both qualitative and quantitative methodologies to produce a rich picture of how the instructional tools and their implementation influenced student learning (Anderson and Shattuck, 2012). Using mixed methods may be less familiar to biology education researchers who were primarily trained in quantitative methods as biologists (Lo et al., 2019). However, according to Warfa (2016, p. 2), “Integration of research findings from quantitative and qualitative inquiries in the same study or across studies maximizes the affordances of each approach and can provide better understanding of biology teaching and learning than either approach alone.” Although the number of BER studies using mixed methods has increased over the past decade (Lo et al., 2019), engaging in design-based research could further this trend through its collaborative nature of bringing social scientists together with biology education researchers to share research methodologies from different fields. By leveraging qualitative and quantitative methods, design-based researchers unpack “mechanism and process” by characterizing the nature of student thinking rather than “simply reporting that differences did or did not occur” (Barab, 2014, p. 158), which is important for continuing to advance our understanding of student learning in BER (Dolan, 2015; Lo et al., 2019).
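As a toy illustration of this kind of integration, the sketch below quantizes qualitative interview codes into relative frequencies that can then be compared across course sections. The codes and data are hypothetical; a real mixed-methods analysis would also establish inter-rater reliability and apply appropriate statistical tests.

```python
from collections import Counter

# Hypothetical qualitative codes applied to student interview excerpts.
section_a = ["flux", "teleological", "flux", "flux", "surface"]
section_b = ["teleological", "surface", "teleological", "flux", "surface"]

def code_frequencies(codes):
    """Convert a list of qualitative codes into relative frequencies."""
    counts = Counter(codes)
    return {code: count / len(codes) for code, count in counts.items()}

print("Section A:", code_frequencies(section_a))
print("Section B:", code_frequencies(section_b))
# The qualitative strand characterizes what the reasoning looks like;
# the quantitative strand asks how its prevalence varies across contexts.
```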

CHALLENGES TO IMPLEMENTING DESIGN-BASED RESEARCH IN BER

As with any methodological approach, there can be challenges to implementing design-based research. Here, we highlight three that may be relevant to BER.

Collaborations Can Be Difficult to Maintain

While collaborations between researchers and instructors offer many affordances (as discussed earlier), the reality of connecting researchers across departments and institutions can be challenging. For example, Peffer and Renken (2016) noted that different traditions of scholarship can present barriers to collaboration when there is not mutual respect for the methods and ideas that are part and parcel of each discipline. Additionally, Schinske et al. (2017) identified several constraints community college faculty face in engaging in BER, such as limited time or support (e.g., infrastructural, administrative, and peer support), which could also impact their ability to form the kinds of collaborations inherent in design-based research. Moreover, the iterative nature of design-based research requires these collaborations to persist for an extended period of time. Attending to these challenges is an important part of forming the design team and identifying the different roles researchers and instructors will play in the research.

Design-Based Research Experiments Are Resource Intensive

The focus of design-based research on studying learning ecologies to uncover mechanisms of learning requires that researchers collect multiple data streams through time, which often necessitates significant temporal and financial resources (Collins et al., 2004; O’Donnell, 2004). Consequently, researchers must weigh both practical and methodological considerations when formulating their experimental design. For example, investigating learning mechanisms requires that researchers collect data at a frequency that will capture changes in student thinking (Siegler, 2006). However, researchers may be constrained in the number of data-collection events they can plan, depending on the instructor’s ability to facilitate in-class collection events or solicit student participation in extracurricular activities (e.g., interviews); the cost of technological devices to record student conversations; the time and logistical considerations needed to schedule and conduct student interviews; the financial resources available to compensate student participants; and the financial and temporal costs associated with analyzing large amounts of data.
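Because these constraints interact, it can help to estimate the resource costs of a proposed data-collection plan before committing to it. The sketch below tallies person-hours and participant compensation for a hypothetical interview schedule; every number is a placeholder to be replaced with local values.

```python
# Hypothetical planning parameters -- substitute your own values.
students_per_event = 12      # interviewees per collection event
events = 4                   # e.g., pre/post across two course units
hours_per_interview = 1.0    # contact time per interview
transcription_ratio = 4.0    # transcription hours per interview hour
coding_ratio = 3.0           # analysis hours per interview hour
compensation = 20.00         # dollars per participant per interview

interviews = students_per_event * events
person_hours = interviews * hours_per_interview * (
    1 + transcription_ratio + coding_ratio)
cost = interviews * compensation

print(f"{interviews} interviews -> ~{person_hours:.0f} person-hours "
      f"and ${cost:,.2f} in compensation")
```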

Identifying learning mechanisms also requires in-depth analyses of qualitative data as students experience various instructional tools (e.g., microgenetic methods; Flynn et al., 2006; Siegler, 2006). The high intensity of these in-depth analyses often limits the number of students who can be evaluated in this way, which must be balanced with the kinds of generalizations researchers wish to make about the effectiveness of the instructional tools (O’Donnell, 2004). Because of the large variety of data streams that could be collected in a design-based research experiment—and the resources required to collect and analyze them—it is critical that the research team identify a priori how specific data streams, and the methods of their analysis, will provide the evidence necessary to address the theoretical and practical objectives of the research (see the following section on experimental rigor; Sandoval, 2014). These are critical management decisions because of the need for a transparent, systematic analysis of the data that others can scrutinize to evaluate the validity of the claims being made (Cobb et al., 2003).
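One lightweight way to record this a priori alignment, so that collaborators can scrutinize it before any data are collected, is a simple structured plan linking each data stream to its analysis and to the claim it is meant to support. The entries below are illustrative placeholders, not a prescribed format.

```python
# A hypothetical evidence plan: each data stream is tied in advance
# to an analysis method and to the claim it should help warrant.
evidence_plan = [
    {
        "data_stream": "written constructed-response assessments (pre/post)",
        "analysis": "learning progression-aligned rubric scoring",
        "claim": "students' flux reasoning shifts toward higher levels",
    },
    {
        "data_stream": "semi-structured student interviews",
        "analysis": "qualitative coding of reasoning patterns",
        "claim": "mechanisms by which the tools change student reasoning",
    },
]

for entry in evidence_plan:
    print(f"{entry['data_stream']}\n  analysis: {entry['analysis']}"
          f"\n  supports: {entry['claim']}\n")
```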

Concerns with Experimental Rigor

The nature of design-based research, with its use of narrative to characterize, rather than control, experimental environments, has drawn concerns about the rigor of this methodological approach. Some have challenged its ability to produce evidence-based warrants to support its claims about learning that can be replicated and critiqued by others (Shavelson et al., 2003; Hoadley, 2004). This is a valid concern that design-based researchers, and indeed all education researchers, must address to ensure their research meets established standards for education research (NRC, 2002).

One way design-based researchers address this concern is by “specifying theoretically salient features of a learning environment design and mapping out how they are predicted to work together to produce desired outcomes” (Sandoval, 2014, p. 19). Through this process, researchers explicitly show, before they begin the work, how their theory of learning is embodied in the instructional tools to be tested, the specific data the tools will produce for analysis, and what outcomes will be taken as evidence for success. Moreover, by allowing instructional tools to be modified during the testing phase as needed, design-based researchers acknowledge that it is impossible to anticipate all aspects of the classroom environment that might impact the implementation of instructional tools, “as dozens (if not millions) of factors interact to produce the measurable outcomes related to learning” (Hoadley, 2004, p. 204; DBR Collective, 2003). Consequently, modifying instructional tools midstream to account for these unanticipated factors can ensure that they retain their methodological alignment with the underlying theory and predicted learning outcomes, so that inferences drawn from the design experiment accurately reflect what was being tested (Edelson, 2002; Hoadley, 2004). Indeed, Barab (2014) states, “the messiness of real-world practice must be recognized, understood, and integrated as part of the theoretical claims if the claims are to have real-world explanatory value” (p. 153).
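Sandoval’s (2014) conjecture map can itself be written down as a small data structure that makes the predicted chain from design features to outcomes explicit before the study begins. The content below is a hypothetical map in the spirit of the flux work described earlier, not one taken from Sandoval’s paper.

```python
# A minimal conjecture map (after Sandoval, 2014): design features are
# predicted to evoke mediating processes that yield the target outcomes.
conjecture_map = {
    "high_level_conjecture":
        "principle-based instruction supports flux reasoning",
    "embodiment": [  # theoretically salient design features
        "formative assessment items spanning physiological systems",
        "in-class practice applying flux across contexts",
    ],
    "mediating_processes": [  # observable processes the design should evoke
        "students explain phenomena in terms of gradients and flux",
    ],
    "outcomes": [  # what will count as evidence of success
        "higher learning-progression levels on posttest rubric scores",
    ],
}

for stage in ("embodiment", "mediating_processes", "outcomes"):
    for item in conjecture_map[stage]:
        print(f"{stage}: {item}")
```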

CONCLUSIONS

In summary, design-based research can extend and enrich BER by:

  • providing a methodology that integrates theories of learning with practical experiences in classrooms,

  • using a range of analytical approaches that allow researchers to uncover the underlying mechanisms of student thinking and learning,

  • fostering interdisciplinary collaborations among researchers and instructors, and

  • characterizing learning ecologies that account for the complexity involved in student learning.

By employing this methodology from the learning sciences, biology education researchers can enrich our current understanding of what is required to help biology students achieve their personal and professional aims during their college experience. It can also stimulate new ideas for biology education that can be discussed and debated in our research community as we continue to explore and refine how best to serve the students who pass through our classroom doors.

1 “Epistemic commitment” is defined as engaging in certain practices that generate knowledge in an agreed-upon way.

ACKNOWLEDGMENTS

We thank the UW Biology Education Research Group (BERG) for feedback on drafts of this essay, as well as Dr. L. Jescovich for last-minute analyses. This work was supported by a National Science Foundation award (NSF DUE 1661263/1660643). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the NSF. All procedures were conducted in accordance with approval from the Institutional Review Board at the University of Washington (52146) and the New England Independent Review Board (120160152).

  • Alonzo, A. C. (2011). Learning progressions that support formative assessment practices. Measurement, 9(2/3), 124–129.
  • Anderson, T., & Shattuck, J. (2012). Design-based research: A decade of progress in education research? Educational Researcher, 41(1), 16–25.
  • Andrews, T. M., Leonard, M. J., Colgrove, C. A., & Kalinowski, S. T. (2011). Active learning not associated with student learning in a random sample of college biology courses. CBE—Life Sciences Education, 10(4), 394–405.
  • Badenhorst, E., Hartman, N., & Mamede, S. (2016). How biomedical misconceptions may arise and affect medical students’ learning: A review of theoretical perspectives and empirical evidence. Health Professions Education, 2(1), 10–17.
  • Barab, S. (2014). Design-based research: A methodological toolkit for engineering change. In The Cambridge handbook of the learning sciences (2nd ed., pp. 151–170). Cambridge University Press. https://doi.org/10.1017/CBO9781139519526.011
  • Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. Journal of the Learning Sciences, 13(1), 1–14.
  • Bell, P. (2004). On the theoretical breadth of design-based research in education. Educational Psychologist, 39(4), 243–253.
  • Borrego, M., Cutler, S., Prince, M., Henderson, C., & Froyd, J. E. (2013). Fidelity of implementation of research-based instructional strategies (RBIS) in engineering science courses. Journal of Engineering Education, 102(3), 394–425.
  • Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences, 2(2), 141–178.
  • Chi, M. T. H., Roscoe, R. D., Slotta, J. D., Roy, M., & Chase, C. C. (2012). Misconceived causal explanations for emergent processes. Cognitive Science, 36(1), 1–61.
  • Cobb, P., Confrey, J., diSessa, A., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9–13.
  • Coley, J. D., & Tanner, K. D. (2012). Common origins of diverse misconceptions: Cognitive principles and the development of biology thinking. CBE—Life Sciences Education, 11(3), 209–215.
  • Coley, J. D., & Tanner, K. (2015). Relations between intuitive biological thinking and biological misconceptions in biology majors and nonmajors. CBE—Life Sciences Education, 14(1). https://doi.org/10.1187/cbe.14-06-0094
  • Collins, A., Joseph, D., & Bielaczyc, K. (2004). Design research: Theoretical and methodological issues. Journal of the Learning Sciences, 13(1), 15–42.
  • Corcoran, T., Mosher, F. A., & Rogat, A. D. (2009). Learning progressions in science: An evidence-based approach to reform (CPRE Research Report No. RR-63). Philadelphia, PA: Consortium for Policy Research in Education.
  • Davidesco, I., & Milne, C. (2019). Implementing cognitive science and discipline-based education research in the undergraduate science classroom. CBE—Life Sciences Education, 18(3), es4.
  • Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32(1), 5–8.
  • Dolan, E. L. (2015). Biology education research 2.0. CBE—Life Sciences Education, 14(4), ed1.
  • Duschl, R., Maeng, S., & Sezen, A. (2011). Learning progressions and teaching sequences: A review and analysis. Studies in Science Education, 47(2), 123–182.
  • Eddy, S. L., Converse, M., & Wenderoth, M. P. (2015). PORTAAL: A classroom observation tool assessing evidence-based teaching practices for active learning in large science, technology, engineering, and mathematics classes. CBE—Life Sciences Education, 14(2), ar23.
  • Edelson, D. C. (2002). Design research: What we learn when we engage in design. Journal of the Learning Sciences, 11(1), 105–121.
  • Flynn, E., Pine, K., & Lewis, C. (2006). The microgenetic method—Time for change? The Psychologist, 19(3), 152–155.
  • Gouvea, J. S., & Simon, M. R. (2018). Challenging cognitive construals: A dynamic alternative to stable misconceptions. CBE—Life Sciences Education, 17(2), ar34.
  • Gunckel, K. L., Mohan, L., Covitt, B. A., & Anderson, C. W. (2012). Addressing challenges in developing learning progressions for environmental science literacy. In Alonzo, A. C., & Gotwals, A. W. (Eds.), Learning progressions in science: Current challenges and future directions (pp. 39–75). Rotterdam: SensePublishers. https://doi.org/10.1007/978-94-6091-824-7_4
  • Hake, R. R. (2007). Design-based research in physics education research: A review. In Kelly, A. E., Lesh, R. A., & Baek, J. Y. (Eds.), Handbook of design research methods in mathematics, science, and technology education (p. 24). New York: Routledge.
  • Hoadley, C. M. (2004). Methodological alignment in design-based research. Educational Psychologist, 39(4), 203–212.
  • Jackson, M., Tran, A., Wenderoth, M. P., & Doherty, J. H. (2018). Peer vs. self-grading of practice exams: Which is better? CBE—Life Sciences Education, 17(3), es44. https://doi.org/10.1187/cbe.18-04-0052
  • Jin, H., & Anderson, C. W. (2012). A learning progression for energy in socio-ecological systems. Journal of Research in Science Teaching, 49(9), 1149–1180.
  • Joseph, D. (2004). The practice of design-based research: Uncovering the interplay between design, research, and the real-world context. Educational Psychologist, 39(4), 235–242.
  • Kelly, A. E. (2014). Design-based research in engineering education. In Cambridge handbook of engineering education research (pp. 497–518). New York, NY: Cambridge University Press. https://doi.org/10.1017/CBO9781139013451.032
  • Lee, C. J., Toven-Lindsey, B., Shapiro, C., Soh, M., Mazrouee, S., Levis-Fitzgerald, M., & Sanders, E. R. (2018). Error-discovery learning boosts student engagement and performance, while reducing student attrition in a bioinformatics course. CBE—Life Sciences Education, 17(3), ar40. https://doi.org/10.1187/cbe.17-04-0061
  • Lo, S. M., Gardner, G. E., Reid, J., Napoleon-Fanis, V., Carroll, P., Smith, E., & Sato, B. K. (2019). Prevailing questions and methodologies in biology education research: A longitudinal analysis of research in CBE—Life Sciences Education and at the Society for the Advancement of Biology Education Research. CBE—Life Sciences Education, 18(1), ar9.
  • Marbach-Ad, G., Briken, V., El-Sayed, N. M., Frauwirth, K., Fredericksen, B., Hutcheson, S., … Smith, A. C. (2009). Assessing student understanding of host pathogen interactions using a concept inventory. Journal of Microbiology & Biology Education, 10(1), 43–50.
  • McFarland, J. L., Price, R. M., Wenderoth, M. P., Martinková, P., Cliff, W., Michael, J., … Wright, A. (2017). Development and validation of the homeostasis concept inventory. CBE—Life Sciences Education, 16(2), ar35.
  • McKenney, S., & Reeves, T. C. (2013). Systematic review of design-based research progress: Is a little knowledge a dangerous thing? Educational Researcher, 42(2), 97–100.
  • Mestre, J. P., Cheville, A., & Herman, G. L. (2018). Promoting DBER-cognitive psychology collaborations in STEM education. Journal of Engineering Education, 107(1), 5–10.
  • Michael, J. A. (2007). What makes physiology hard for students to learn? Results of a faculty survey. Advances in Physiology Education, 31(1), 34–40.
  • Michael, J. A., Modell, H., McFarland, J., & Cliff, W. (2009). The “core principles” of physiology: What should students understand? Advances in Physiology Education, 33(1), 10–16.
  • Middleton, J., Gorard, S., Taylor, C., & Bannan-Ritland, B. (2008). The “compleat” design experiment: From soup to nuts. In Kelly, A. E., Lesh, R. A., & Baek, J. Y. (Eds.), Handbook of design research methods in education: Innovations in science, technology, engineering, and mathematics learning and teaching (pp. 21–46). New York, NY: Routledge.
  • Modell, H. I. (2000). How to help students understand physiology? Emphasize general models. Advances in Physiology Education, 23(1), S101–S107.
  • Mohan, L., Chen, J., & Anderson, C. W. (2009). Developing a multi-year learning progression for carbon cycling in socio-ecological systems. Journal of Research in Science Teaching, 46(6), 675–698.
  • National Academies of Sciences, Engineering, and Medicine. (2018). How people learn II: Learners, contexts, and cultures. Washington, DC: National Academies Press. https://doi.org/10.17226/24783
  • National Research Council (NRC). (2002). Scientific research in education. Washington, DC: National Academies Press. https://doi.org/10.17226/10236
  • NRC. (2007). Taking science to school: Learning and teaching science in grades K–8. Washington, DC: National Academies Press. https://doi.org/10.17226/11625
  • NRC. (2012). Discipline-based education research: Understanding and improving learning in undergraduate science and engineering. Washington, DC: National Academies Press. https://doi.org/10.17226/13362
  • NRC. (2018). How people learn II: Learners, contexts, and cultures. Washington, DC: National Academies Press. https://doi.org/10.17226/24783
  • O’Donnell, A. M. (2004). A commentary on design research. Educational Psychologist, 39(4), 255–260.
  • Offerdahl, E. G., McConnell, M., & Boyer, J. (2018). Can I have your recipe? Using a fidelity of implementation (FOI) framework to identify the key ingredients of formative assessment for learning. CBE—Life Sciences Education, 17(4), es16.
  • Peffer, M., & Renken, M. (2016). Practical strategies for collaboration across discipline-based education research and the learning sciences. CBE—Life Sciences Education, 15(4), es11.
  • Reiser, B. J., Smith, B. K., Tabak, I., Steinmuller, F., Sandoval, W. A., & Leone, A. J. (2001). BGuILE: Strategic and conceptual scaffolds for scientific inquiry in biology classrooms. In Carver, S. M., & Klahr, D. (Eds.), Cognition and instruction: Twenty-five years of progress (pp. 263–305). Mahwah, NJ: Lawrence Erlbaum.
  • Sandoval, W. (2014). Conjecture mapping: An approach to systematic educational design research. Journal of the Learning Sciences, 23(1), 18–36.
  • Sandoval, W. A., & Bell, P. (2004). Design-based research methods for studying learning in context: Introduction. Educational Psychologist, 39(4), 199–201.
  • Schinske, J. N., Balke, V. L., Bangera, M. G., Bonney, K. M., Brownell, S. E., Carter, R. S., … Corwin, L. A. (2017). Broadening participation in biology education research: Engaging community college students and faculty. CBE—Life Sciences Education, 16(2), mr1.
  • Scott, E., Anderson, C. W., Mashood, K. K., Matz, R. L., Underwood, S. M., & Sawtelle, V. (2018). Developing an analytical framework to characterize student reasoning about complex processes. CBE—Life Sciences Education, 17(3), ar49. https://doi.org/10.1187/cbe.17-10-0225
  • Scott, E., Wenderoth, M. P., & Doherty, J. H. (2019). Learning progressions: An empirically grounded, learner-centered framework to guide biology instruction. CBE—Life Sciences Education, 18(4), es5. https://doi.org/10.1187/cbe.19-03-0059
  • Sharma, M. D., & McShane, K. (2008). A methodological framework for understanding and describing discipline-based scholarship of teaching in higher education through design-based research. Higher Education Research & Development, 27(3), 257–270.
  • Shavelson, R. J., Phillips, D. C., Towne, L., & Feuer, M. J. (2003). On the science of education design studies. Educational Researcher, 32(1), 25–28.
  • Siegler, R. S. (2006). Microgenetic analyses of learning. In Damon, W., & Lerner, R. M. (Eds.), Handbook of child psychology (pp. 464–510). Hoboken, NJ: Wiley. https://doi.org/10.1002/9780470147658.chpsy0211
  • Stringer, E. T. (2013). Action research. Thousand Oaks, CA: Sage.
  • Subramaniam, M., Jean, B. S., Taylor, N. G., Kodama, C., Follman, R., & Casciotti, D. (2015). Bit by bit: Using design-based research to improve the health literacy of adolescents. JMIR Research Protocols, 4(2), e62.
  • Summers, M. M., Couch, B. A., Knight, J. K., Brownell, S. E., Crowe, A. J., Semsar, K., … Batzli, J. (2018). EcoEvo-MAPS: An ecology and evolution assessment for introductory through advanced undergraduates. CBE—Life Sciences Education, 17(2), ar18.
  • Szteinberg, G., Balicki, S., Banks, G., Clinchot, M., Cullipher, S., Huie, R., … Sevian, H. (2014). Collaborative professional development in chemistry education research: Bridging the gap between research and practice. Journal of Chemical Education, 91(9), 1401–1408.
  • Talanquer, V. (2014). DBER and STEM education reform: Are we up to the challenge? Journal of Research in Science Teaching, 51(6), 809–819.
  • Wang, F., & Hannafin, M. J. (2005). Design-based research and technology-enhanced learning environments. Educational Technology Research and Development, 53(4), 5–23.
  • Wang, J.-R. (2004). Development and validation of a two-tier instrument to examine understanding of internal transport in plants and the human circulatory system. International Journal of Science and Mathematics Education, 2(2), 131–157.
  • Warfa, A.-R. M. (2016). Mixed-methods design in biology education research: Approach and uses. CBE—Life Sciences Education, 15(4), rm5.
  • Windschitl, M., Thompson, J., Braaten, M., & Stroupe, D. (2012). Proposing a core set of instructional practices and tools for teachers of science. Science Education, 96(5), 878–903.
  • Zagallo, P., Meddleton, S., & Bolger, M. S. (2016). Teaching real data interpretation with models (TRIM): Analysis of student dialogue in a large-enrollment cell and developmental biology course. CBE—Life Sciences Education, 15(2), ar17.

Submitted: 18 November 2019 Revised: 3 March 2020 Accepted: 25 March 2020

© 2020 E. E. Scott et al. CBE—Life Sciences Education © 2020 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).


Kimberly Christensen and Richard E. West

Design-Based Research (DBR) is one of the most exciting evolutions in research methodology of our time, as it allows the knowledge designers gain through their intimate connection with their work to be combined with the knowledge derived from research. These two sources of knowledge can inform each other, leading to improved design interventions as well as improved local and generalizable theory. However, these positive outcomes are not easily attained, as DBR is also a difficult method to implement well. The good news is that we can learn much from other disciplines that are also seeking effective strategies for intertwining design and research. In this chapter, we will review the history of DBR as well as Interdisciplinary Design Research (IDR) and then discuss potential implications for our field.

Shared Origins with IDR

These two types of design research, both DBR and IDR, share a common genesis in the design revolution of the 1960s, when designers, researchers, and scholars sought to elevate design from mere practice to an independent scholarly discipline, with its own research and distinct theoretical and methodological underpinnings. A scholarly focus on design methods, they argued, would foster the development of design theories, which would in turn improve the quality of design and design practice (Margolin, 2010). Research on design methods, termed design research, would be the foundation of this new discipline.

Design research had existed in primitive form—as market research and process analysis—since before the turn of the 20th century, and, although it had served to improve processes and marketing, it had not been applied as scientific research. John Chris Jones, Bruce Archer, and Herbert Simon were among the first to shift the focus from research for design (e.g., research with the intent of gathering data to support product development) to research on design (e.g., research exploring the design process). Their efforts framed the initial development of design research and science.

John Chris Jones

An engineer, Jones (1970) felt that the design process was ambiguous and often too abstruse to discuss effectively. One solution, he offered, was to define and discuss design in terms of its methods. By identifying and discussing design methods, researchers would be able to create transparency in the design process, combating perceptions of design as more or less mysteriously inspired. This discussion of design methods, Jones proposed, would in turn raise the level of discourse and practice in design.

Bruce Archer

Archer, also an engineer, worked with Jones and likewise supported the adoption of research methods from other disciplines. Archer (1965) proposed that applying systematic methods would improve the assessment of design problems and foster the development of effective solutions. Archer recognized, however, that improved practice alone would not enable design to achieve disciplinary status. In order to become a discipline, design required a theoretical foundation to support its practice. Archer (1981) advocated that design research was the primary means by which theoretical knowledge could be developed. He suggested that the application of systematic inquiry, such as existed in engineering, would yield knowledge about not only product and practice, but also the theory that guided each.

Herbert Simon

It was the multidisciplinary social scientist Simon, however, who issued the clarion call for transforming design into design science (Buchanan, 2007; Collins, 1992; Collins, Joseph, & Bielaczyc, 2004; Cross, 1999; Cross, 2007; Friedman, 2003; Jonas, 2007; Willemien, 2009). In The Sciences of the Artificial, Simon (1969) reasoned that the rigorous inquiry and discussion surrounding naturally occurring processes and phenomena was just as necessary for man-made products and processes. He particularly called for “[bodies] of intellectually tough, analytic, partly formalizable, partly empirical, teachable doctrine about the design process” (p. 132). This call for more scholarly discussion and practice resonated with designers across the design and engineering disciplines (Buchanan, 2007; Cross, 1999; Cross, 2007; Friedman, 2003; Jonas, 2007; Willemien, 2009). IDR sprang directly from this early movement and has continued to gain momentum, producing an interdisciplinary body of research encompassing efforts in engineering, design, and technology.

Years later, in the 1980s, Simon’s work inspired the first DBR efforts in education (Collins et al., 2004). Much of the DBR literature attributes its beginnings to the work of Ann Brown and Allan Collins (Cobb, Confrey, diSessa, Lehrer, & Schauble, 2003; Collins et al., 2004; Kelly, 2003; McCandliss, Kalchman, & Bryant, 2003; Oh & Reeves, 2010; Reeves, 2006; Shavelson, Phillips, Towne, & Feuer, 2003; Tabak, 2004; van den Akker, 1999). Their work, focusing on research and development in authentic contexts, drew heavily on research approaches and development practices in the design sciences, including the work of early design researchers such as Simon (Brown, 1992; Collins, 1992; Collins et al., 2004). However, over generations of research, this connection has been all but forgotten, and DBR, although similarly inspired by the early efforts of Simon, Archer, and Jones, has developed into an isolated and discipline-specific body of design research, independent from its interdisciplinary cousin.

Current Issues in DBR

The initial obstacle to engaging in DBR is understanding what DBR is. What do we call it? What does it entail? How do we do it? Many of the current challenges facing DBR concern these questions. Specifically, three issues influence how DBR is identified, implemented, and discussed. First, the proliferation of terminology among scholars, and the inconsistent use of these terms, has created a sprawling body of literature, with various splinter DBR groups hosting scholarly conversations regarding their particular brand of DBR. Second, DBR as a field is characterized by a lack of definition, in terms of its purpose, its characteristics, and the steps or processes of which it is comprised. Third, the one consistent element of DBR across the field is an unwieldy set of considerations incumbent upon the researcher.

Because it is so difficult to define and conceptualize DBR, it is similarly difficult to replicate authentically. The lack of scholarly agreement on the characteristics and outcomes that define DBR deprives the field of a structure by which DBR studies can be identified and evaluated and, ultimately, limits the degree to which the field can progress. The following sections identify and explore the three greatest challenges facing DBR today: proliferation of terms, lack of definition, and competing demands.

Proliferation of terminology

One of the most challenging characteristics of DBR is the quantity and use of terms that identify DBR in the research literature. Seven terms are commonly associated with DBR: design experiments, design research, design-based research, formative research, development research, developmental research, and design-based implementation research.

Synonymous terms

Collins and Brown first termed their efforts design experiments (Brown, 1992; Collins, 1992). Subsequent literature stemming from or relating to Collins’ and Brown’s work used design research and design experiments synonymously (Anderson & Shattuck, 2012; Collins et al., 2004). Design-based research was introduced to distinguish DBR from other research approaches. Sandoval and Bell (2004) best summarized this as follows:

We have settled on the term design-based research over the other commonly used phrases “design experimentation,” which connotes a specific form of controlled experimentation that does not capture the breadth of the approach, or “design research,” which is too easily confused with research design and other efforts in design fields that lack in situ research components. (p. 199)

Variations by discipline

Terminology across disciplines refers to DBR approaches as formative research, development research, design experiments, and developmental research. According to van den Akker (1999), the use of DBR terminology also varies by educational sub-discipline, with areas such as (a) curriculum, (b) learning and instruction, (c) media and technology, and (d) teacher education and didactics favoring specific terms that reflect the focus of their research (Figure 1).

Figure 1. Variations in DBR terminology across educational sub-disciplines.

Lack of definition

This variation across disciplines, with design researchers tailoring design research to address discipline-specific interests and needs, has created a lack of definition in the field overall. In addition, in the literature, DBR has been conceptualized at various levels of granularity. Here, we will discuss three existing approaches to defining DBR: (a) statements of the overarching purpose, (b) lists of defining characteristics, and (c) models of the steps or processes involved.

General purpose

In the literature, scholars and researchers have made multiple attempts to isolate the general purpose of design research in education, with each offering a different insight and definition. According to van den Akker (1999), design research is distinguished from other research efforts by its simultaneous commitment to (a) developing a body of design principles and methods that are based in theory and validated by research and (b) offering direct contributions to practice. This position was supported by Sandoval and Bell (2004), who suggested that the general purpose of DBR was to address the “tension between the desire for locally usable knowledge, on the one hand, and scientifically sound, generalizable knowledge on the other” (p. 199). Cobb et al. (2003) particularly promoted the theory-building focus, asserting “design experiments are conducted to develop theories, not merely to empirically tune ‘what works’” (p. 10). Shavelson et al. (2003) recognized the importance of developing theory but emphasized that the testing and building of instructional products was an equal focus of design research rather than the means to a theoretical end.

The aggregate of these definitions suggests that the purpose of DBR involves theoretical and practical design principles and active engagement in the design process. However, DBR continues to vary in its prioritization of these components, with some focusing largely on theory, others emphasizing practice or product, and many examining neither but all using the same terms.

Specific characteristics

Another way to define DBR is by identifying the key characteristics that both unite and define the approach. Unlike other research approaches, DBR can take the form of multiple research methodologies, both qualitative and quantitative, and thus cannot be recognized strictly by its methods. Identifying characteristics, therefore, concern the research process, context, and focus. This section will discuss the original characteristics of DBR, as introduced by Brown and Collins, and then identify the seven most common characteristics suggested by DBR literature overall.

Brown’s concept of DBR. Brown (1992) defined design research as having five primary characteristics that distinguished it from typical design or research processes. First, a design is engineered in an authentic, working environment. Second, the development of research and the design are influenced by a specific set of inputs: classroom environment, teachers and students as researchers, curriculum, and technology. Third, the design and development process includes multiple cycles of testing, revision, and further testing. Fourth, the design research process produces an assessment of the design’s quality as well as the effectiveness of both the design and its theoretical underpinnings. Finally, the overall process should make contributions to existing learning theory.

Collins’s concept of DBR. Collins (1990, 1992) posed a similar list of design research characteristics. Collins echoed Brown’s specifications of authentic context, cycles of testing and revision, and design and process evaluation. Additionally, Collins provided greater detail regarding the characteristics of the design research processes—specifically, that design research should include the comparison of multiple sample groups, be systematic in both its variation within the experiment and in the order of revisions (i.e., by testing the innovations most likely to succeed first), and involve an interdisciplinary team of experts including not just the teacher and designer, but technologists, psychologists, and developers as well. Unlike Brown, however, Collins did not refer to theory building as an essential characteristic.

Current DBR characteristics. The DBR literature that followed expanded, clarified, and revised the design research characteristics identified by Brown and Collins. The range of DBR characteristics discussed in the field currently is broad but can be distilled to seven most frequently referenced identifying characteristics of DBR: design driven, situated, iterative, collaborative, theory building, practical, and productive.

Design driven.  All literature identifies DBR as focusing on the evolution of a design (Anderson & Shattuck, 2012; Brown, 1992; Cobb et al., 2003; Collins, 1992; Design-Based Research Collective, 2003). While the design can range from an instructional artifact to an intervention, engagement in the design process is what yields the experience, data, and insight necessary for inquiry.

Situated.  Recalling Brown’s (1992) call for more authentic research contexts, nearly all definitions of DBR situate the aforementioned design process in a real-world context, such as a classroom (Anderson & Shattuck, 2012; Barab & Squire, 2004; Cobb et al., 2003).

Iterative. Literature also appears to agree that a DBR process does not consist of a linear design process, but rather multiple cycles of design, testing, and revision (Anderson & Shattuck, 2012; Barab & Squire, 2004; Brown, 1992; Design-Based Research Collective, 2003; Shavelson et al., 2003). These iterations must also represent systematic adjustment of the design, with each adjustment and subsequent testing serving as a miniature experiment (Barab & Squire, 2004; Collins, 1992).

Collaborative.  While the literature may not always agree on the roles and responsibilities of those engaged in DBR, collaboration between researchers, designers, and educators appears to be key (Anderson & Shattuck, 2012; Barab & Squire, 2004; McCandliss et al., 2003). Each collaborator enters the project with a unique perspective and, as each engages in research, forms a role-specific view of phenomena. These perspectives can then be combined to create a more holistic view of the design process, its context, and the developing product.

Theory building.  Design research focuses on more than creating an effective design; DBR should produce an intimate understanding of both design and theory (Anderson & Shattuck, 2012; Barab & Squire, 2004; Brown, 1992; Cobb et al., 2003; Design-Based Research Collective, 2003; Joseph, 2004; Shavelson et al., 2003). According to Barab & Squire (2004), “Design-based research requires more than simply showing a particular design works but demands that the researcher . . . generate evidence-based claims about learning that address contemporary theoretical issues and further the theoretical knowledge of the field” (p. 6). DBR needs to build and test theory, yielding findings that can be generalized to both local and broad theory (Hoadley, 2004).

Practical.  While theoretical contributions are essential to DBR, the results of DBR studies “must do real work” (Cobb et al., 2003, p. 10) and inform instructional, research, and design practice (Anderson & Shattuck, 2012; Barab & Squire, 2004; Design-Based Research Collective, 2003; McCandliss et al., 2003).

Productive.  Not only should design research produce theoretical and practical insights, but also the design itself must produce results, measuring its success in terms of how well the design meets its intended outcomes (Barab & Squire, 2004; Design-Based Research Collective, 2003; Joseph, 2004; McCandliss et al., 2003).

Steps and processes

A third way to define DBR is to identify the steps or processes involved in implementing it. The sections below illustrate the steps outlined by Collins (1990) and Brown (1992), as well as models by Bannan-Ritland (2003) and Reeves (2006) and an aggregate model presented by Anderson and Shattuck (2012).

Collins’s design experimentation steps.  In his technical report, Collins (1990) presented an extensive list of 10 steps in design experimentation (Figure 2). While Collins’s model provides a guide for experimentally testing and developing new instructional programs, it does not include multiple iterative stages or any evaluation of the final product. Because Collins was interested primarily in development, research was not given much attention in his model.

Brown’s design research example.  The example of design research Brown (1992) included in her article was limited and less clearly delineated than Collins’s model (Figure 2). Brown focused on the development of educational interventions, including additional testing with minority populations. Similar to Collins, Brown also omitted any summative evaluation of intervention quality or effectiveness and did not specify the role of research through the design process.

Bannan-Ritland’s DBR model.  Bannan-Ritland (2003) reviewed design process models in fields such as product development, instructional design, and engineering to create a more sophisticated model of design-based research. In its simplest form, Bannan-Ritland’s model comprises multiple processes subsumed under four broad stages: (a) informed exploration, (b) enactment, (c) evaluation of local impact, and (d) evaluation of broad impact. Unlike Collins and Brown, Bannan-Ritland dedicated large portions of the model to evaluation in terms of the quality and efficacy of the final product as well as the implications for theory and practice.

Reeves’s development research model.  Reeves (2006) provided a simplified model consisting of just four steps (Figure 2). By condensing DBR into just a few steps, Reeves highlighted what he viewed as the most essential processes, ending with a general reflection on both the process and product generated in order to develop theoretical and practical insights.

Anderson and Shattuck’s aggregate model.  Anderson and Shattuck (2012) reviewed design-based research abstracts over the past decade and, from their review, presented an eight-step aggregate model of DBR (Figure 2). As an aggregate of DBR approaches, this model was their attempt to unify approaches across DBR literature, and includes similar steps to Reeves’s model. However, unlike Reeves, Anderson and Shattuck did not include summative reflection and insight development.

Comparison of models. In Figure 2, we compare all of these models side by side.
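Although the models differ in their number of steps, they share an iterative core of designing, enacting, analyzing, and revising. The sketch below is a schematic of that shared skeleton only; the stub functions are placeholders for real design-team work, and no single model is encoded faithfully.

```python
# Schematic skeleton of the design-test-revise core shared by the models
# compared above. Each helper is a stub standing in for real activity.

def enact(design):
    return f"classroom data from {design}"        # situated testing

def analyze(data, theory):
    return f"findings ({data} read through {theory})"

def revise(artifact, findings):
    return f"revised {artifact}"                  # informed adjustment

def dbr_cycle(problem, theory, cycles=3):
    design = f"initial design for {problem}"      # informed exploration
    for _ in range(cycles):                       # each pass: a mini-experiment
        findings = analyze(enact(design), theory)
        design = revise(design, findings)         # practical contribution
        theory = revise(theory, findings)         # theoretical contribution
    return design, theory                         # product and principles

print(dbr_cycle("a classroom learning problem", "an initial local theory"))
```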


Competing demands and roles

The third challenge facing DBR is the variety of roles researchers are expected to fulfill, often acting simultaneously as project managers, designers, and evaluators. Because most individuals can focus on only one task at a time, these competing demands on researchers’ resources and attention can be challenging to balance, and excess focus on one role can easily jeopardize the others. The literature has recognized four major roles that a DBR professional must perform simultaneously: researcher, project manager, theorist, and designer.

Researcher as researcher

Planning and carrying out research already involves multiple considerations, such as controlling variables and limiting bias. The nature of DBR, with its collaboration and situated experimentation and development, innately intensifies some of these issues (Hoadley, 2004). While simultaneously designing the intervention, a design-based researcher must also ensure that high-quality research is accomplished, per the typical standards of quality associated with quantitative or qualitative methods.

However, research is even more difficult in DBR because the nature of the method leads to several challenges. First, it can be difficult to control the many variables at play in authentic contexts (Collins et al., 2004). Many researchers may feel torn between being able to (a) isolate critical variables or (b) study the comprehensive, complex nature of the design experience (van den Akker, 1999). Second, because many DBR studies are qualitative, they produce large amounts of data, resulting in demanding data collection and analysis (Collins et al., 2004). Third, according to Anderson and Shattuck (2012), the combination of demanding data analysis and highly invested roles of the researchers leaves DBR susceptible to multiple biases during analysis. Perhaps best expressed by Barab and Squire (2004), “if a researcher is intimately involved in the conceptualization, design, development, implementation, and researching of a pedagogical approach, then ensuring that researchers can make credible and trustworthy assertions is a challenge” (p. 10). Additionally, the assumption of multiple roles invests much of the design and research in a single person, diminishing the likelihood of replicability (Hoadley, 2004). Finally, it is impossible to document or account for all discrete decisions made by the collaborators that influenced the development and success of the design (Design-Based Research Collective, 2003).

Quality research, though, was never meant to be easy! Despite these challenges, DBR has still been shown to be effective in simultaneously developing theory through research as well as interventions that can benefit practice—the two simultaneous goals of any instructional designer.

Researcher as project manager

The collaborative nature of DBR lends the approach one of its greatest strengths: multiple perspectives. While this can be a benefit, collaboration between researchers, developers, and practitioners needs to be highly coordinated (Collins et al., 2004), because it is difficult to manage interdisciplinary teams and maintain a productive, collaborative partnership (Design-Based Research Collective, 2003).

Researcher as theorist

For many researchers in DBR, the development or testing of theory is a foundational component and primary focus of their work. However, the iterative and multi-tasking nature of a DBR process may not be well-suited to empirically testing or building theory. According to Hoadley (2004), “the treatment’s fidelity to theory [is] initially, and sometimes continually, suspect” (p. 204). This suggests that researchers, despite intentions to test or build theory, may not design or implement their solution in alignment with theory or provide enough control to reliably test the theory in question.

Researcher as designer

Because DBR is simultaneously attempting to satisfy the needs of both design and research, there is a tension between the responsibilities of the researcher and the responsibilities of the designer (van den Akker, 1999). Any design decision inherently alters the research. Similarly, research decisions place constraints on the design. Skilled design-based researchers seek to balance these competing demands effectively.

What we can learn from IDR

IDR has been encumbered by issues similar to those that currently exist in DBR. While IDR is by no means a perfect field and is still working to hone and clarify its methods, it has been developing for two decades longer than DBR. The history of IDR, and efforts in the field to address similar issues, can yield possibilities and insights for the future of DBR. The following sections address efforts in IDR to define the field that hold potential for application in DBR, including how professionals in IDR have focused their efforts to increase unity and worked to define sub-approaches more clearly.

Defining Approaches

Similar to DBR, IDR has been subject to competing definitions as varied as the fields in which design research has been applied (i.e., product design, engineering, manufacturing, information technology, etc.) (Findeli, 1998; Jonas, 2007; Schneider, 2007). Typically, IDR scholars have focused on the relationship between design and research, as well as the underlying purpose, to define the approach. This section identifies three defining conceptualizations of IDR—the prepositional approach trinity, Cross’s -ologies, and Buchanan’s strategies of productive science—and discusses possible implications for DBR.

The approach trinity

One way of defining different purposes of design research is by identifying the preposition in the relationship between research and design: research into design, research for design, and research through design (Buchanan, 2007; Cross, 1999; Findeli, 1998; Jonas, 2007; Schneider, 2007).

Jonas (2007) identified research into design as the most prevalent—and straightforward—form of IDR. This approach separates research from design practice; the researcher observes and studies design practice from without, commonly addressing the history, aesthetics, theory, or nature of design (Schneider, 2007). Research into design generally yields little or no contribution to broader theory (Findeli, 1998).

Research for design applies to complex, sophisticated projects, where the purpose of research is to foster product research and development, such as in market and user research (Findeli, 1998; Jonas, 2007). Here, the role of research is to build and improve the design, not contribute to theory or practice.

According to Jonas’s (2007) description, research through design bears the strongest resemblance to DBR and is where researchers work to shape their design (i.e., the research object) and establish connections to broader theory and practice. This approach begins with the identification of a research question and carries through the design process experimentally, improving design methods and finding novel ways of controlling the design process (Schneider, 2007). According to Findeli (1998), because this approach adopts the design process as the research method, it helps to develop authentic theories of design.

Cross’s -ologies

Cross (1999) conceived of IDR approaches based on the early drive toward a science of design and identified three bodies of scientific inquiry: epistemology, praxiology, and phenomenology. Design epistemology primarily concerns what Cross termed "designerly ways of knowing," or how designers think and communicate about design (Cross, 1999; Cross, 2007). Design praxiology deals with practices and processes in design, or how to develop and improve artifacts and the processes used to create them. Design phenomenology examines the form, function, configuration, and value of artifacts, such as what makes a cell phone attractive to a user or how changes in a software interface affect users' activities within the application.

Buchanan’s strategies of productive science

Like Cross, Buchanan (2007) viewed IDR through the lens of design science and identified four research strategies that frame design inquiry: design science, dialectic inquiry, rhetorical inquiry, and productive science (Figure 2). Design science focuses on designing and decision-making, addressing human and consumer behavior. According to Buchanan (2007), dialectic inquiry examines the "social and cultural context of design; typically [drawing] attention to the limitations of the individual designer in seeking sustainable solutions to problems" (p. 57). Rhetorical inquiry focuses on the design experience as well as the designer's process to create products that are usable, useful, and desirable. Productive science studies how the potential of a design is realized through the refinement of its parts, including materials, form, and function. Buchanan (2007) conceptualized a form of design research—what he termed design inquiry—that includes elements of all four strategies, looking at the designer, the design, the design context, and the refinement process as a holistic experience.

Implications for DBR

While the literature has yet to accept any single approach to defining types of IDR, it may still be helpful for DBR to consider similar ways of limiting and defining sub-approaches in the field. The challenges brought on by collaboration, multiple researcher roles, and insufficient focus on the design product could be relieved by identifying distinct approaches to DBR. This idea is not new. Sandoval and Bell (2004) opposed the unification of DBR across educational disciplines (such as developmental psychology, cognitive science, and instructional design), but they did not suggest any potential alternatives. Adopting an IDR approach, such as the approach trinity, could serve both to unite studies across DBR and to clearly distinguish the purpose of each approach and its primary functions. Research into design could focus on the design process and yield valuable insights on design thinking and practice. Research for design could focus on the development of an effective product, a focus missing from many DBR approaches. Research through design would use the design process as a vehicle to test and develop theory, narrowing the set of considerations expected of any single study. Any approach to dividing or defining DBR efforts could help to limit the focus of a study, preventing the diffusion of researcher efforts and findings.

In this chapter we have reviewed the historical development of both design-based research and interdisciplinary design research in an effort to identify strategies in IDR that could benefit DBR development. Following are a few conclusions, leading to recommendations for the DBR field.

Improve interdisciplinary collaboration

Overall, one key advantage that IDR has had—and that DBR presently lacks—is communication and collaboration with other fields. Because DBR has remained so isolated, only rarely referencing or exploring approaches from other design disciplines, it can only evolve within the constraints of educational inquiry. IDR’s ability to conceive solutions to issues in the field is derived, in part, from a wide variety of disciplines that contribute to the body of research. Engineers, developers, artists, and a range of designers interpose their own ideas and applications, which are in turn adopted and modified by others. Fostering collaboration between DBR and IDR, while perhaps not the remedy to cure all scholarly ills, could yield valuable insights for both fields, particularly in terms of refining methodologies and promoting the development of theory.

Simplify terminology and improve consistency in use

As we identified in this chapter, a major issue facing DBR is the proliferation of terminology among scholars and inconsistency in its usage. From IDR comes the useful acknowledgement that there can be research into design, for design, and through design (Buchanan, 2007; Cross, 1999; Findeli, 1998; Jonas, 2007; Schneider, 2007). This framework was useful for scholars in our conversations at the conference. A resulting recommendation, then, is that in published works scholars begin articulating which of these approaches they are using in a particular study. This can simplify the demands on DBR researchers: instead of feeling obligated to do all three in every paper, they can emphasize one. It will also allow DBR researchers to communicate their work more clearly to IDR scholars.

Describe DBR process in publications

Authors often publish DBR studies using the same format as traditional research studies, making it difficult to recognize DBR research and to learn how other DBR scholars mitigate the challenges we have discussed in this chapter. Our recommendation is that DBR scholars publish the messy findings resulting from their work and pull back the curtain to show how they balanced competing concerns to arrive at their results. We believe it would help if DBR scholars adopted more common frameworks for publishing studies. In our review of the literature, we identified the following characteristics as those most frequently used to identify DBR:

  • DBR is design driven and intervention focused
  • DBR is situated within an actual teaching/learning context
  • DBR is iterative
  • DBR is collaborative between researchers, designers, and practitioners
  • DBR builds theory but also needs to be practical and result in useful interventions

One recommendation is that DBR scholars adopt these characteristics and make them explicit in every published paper so that DBR articles can be recognized by readers and better aggregated to show the value of DBR over time. One suggestion is that DBR scholars adopt these characteristics as subheadings in their methodology sections. In addition to discussing data collection and data analysis, they would also discuss Design Research Type (research into, for, or through design), Description of the Design Process and Product, Design and Learning Context, Design Collaborations, and, explicitly, the Design Iterations, perhaps by listing each iteration and then the data collection and analysis for each.

In the concluding sections, in addition to discussing research results, scholars would discuss Applications to Theory (perhaps divided into Local Theory and Outcomes and Transferable Theory and Findings) and Applications for Practice. Studies too large for a single paper could be broken up, with different papers reporting on different iterations but using this same language and formatting to make it easier to connect the ideas across papers.

Not all papers would report both local and transferable theory (the latter being more evident in later iterations), so it would be sufficient to indicate that local theory and outcomes were developed, along with some ideas for transferable theory to be developed in future iterations. The important thing would be to refer to each of these main characteristics in every paper so that scholars can recognize the work as DBR, situate it appropriately, and know what to look for in terms of quality during the review process.

Application Exercises

  • According to the authors, what are the major issues facing DBR, and what can be done to address them?
  • Imagine you have designed a new learning app for use in public schools. How would you go about testing it using design-based research?

Anderson, T., & Shattuck, J. (2012). Design-based research: A decade of progress in education research? Educational Researcher, 41 (1), 16–25.

Archer, L.B. (1965). Systematic method for designers. In N. Cross (ed.), Developments in design methodology. London, England: John Wiley, 1984, pp. 57–82.

Archer, L. B. (1981). A view of the nature of design research. In R. Jacques & J.A. Powell (Eds.), Design: Science: Method (pp. 36-39). Guilford, England: Westbury House.

Bannan-Ritland, B. (2003). The role of design in research: The integrative learning design framework. Educational Researcher, 32 (1), 21–24. doi:10.3102/0013189X032001021

Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. The Journal of the Learning Sciences, 13 (1), 1–14.

Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. The Journal of the Learning Sciences, 2 (2), 141–178.

Buchanan, R. (2007). Strategies of design research: Productive science and rhetorical inquiry. In R. Michel (Ed.), Design research now (pp. 55–66). Basel, Switzerland: Birkhäuser Verlag AG.

Cobb, P., Confrey, J., diSessa, A., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32 (1), 9–13. doi:10.3102/0013189X032001009

Collins, A. (1990). Toward a design science of education (Technical Report No. 1).

Collins, A. (1992). Toward a design science of education. In E. Scanlon & T. O’Shea (Eds.), New directions in educational technology. Berlin, Germany: Springer-Verlag.

Collins, A., Joseph, D., & Bielaczyc, K. (2004). Design research: Theoretical and methodological issues. The Journal of the Learning Sciences, 13 (1), 15–42.

Cross, N. (1999). Design research: A disciplined conversation. Design Issues, 15 (2), 5–10. doi:10.2307/1511837

Cross, N. (2007). Forty years of design research. Design Studies, 28 (1), 1–4. doi:10.1016/j.destud.2006.11.004

Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32 (1), 5–8. doi:10.3102/0013189X032001005

Findeli, A. (1998). A quest for credibility: Doctoral education and research in design at the University of Montreal. Doctoral Education in Design, Ohio, 8–11 October 1998.

Friedman, K. (2003). Theory construction in design research: Criteria, approaches, and methods. Design Studies, 24 (6), 507–522.

Hoadley, C. M. (2004). Methodological alignment in design-based research. Educational Psychologist, 39 (4), 203–212.

Jonas, W. (2007). Design research and its meaning to the methodological development of the discipline. In R. Michel (Ed.), Design research now (pp. 187–206). Basel, Switzerland: Birkhäuser Verlag AG.

Jones, J. C. (1970). Design methods: Seeds of human futures. New York, NY: John Wiley & Sons Ltd.

Joseph, D. (2004). The practice of design-based research: uncovering the interplay between design, research, and the real-world context. Educational Psychologist, 39 (4), 235–242.

Kelly, A. E. (2003). Theme issue: The role of design in educational research. Educational Researcher, 32 (1), 3–4. doi:10.3102/0013189X032001003

Margolin, V. (2010). Design research: Towards a history. Presented at the Design Research Society Annual Conference on Design & Complexity, Montreal, Canada. Retrieved from http://www.drs2010.umontreal.ca/data/PDF/080.pdf

McCandliss, B. D., Kalchman, M., & Bryant, P. (2003). Design experiments and laboratory approaches to learning: Steps toward collaborative exchange. Educational Researcher, 32 (1), 14–16. doi:10.3102/0013189X032001014

Michel, R. (Ed.). (2007). Design research now. Basel, Switzerland: Birkhäuser Verlag AG

Oh, E., & Reeves, T. C. (2010). The implications of the differences between design research and instructional systems design for educational technology researchers and practitioners. Educational Media International, 47 (4), 263–275.

Reeves, T. C. (2006). Design research from a technology perspective. In J. van den Akker, K. Gravemeijer, S. McKenney, & N. Nieveen (Eds.), Educational design research (Vol. 1, pp. 52–66). London, England: Routledge.

Reigeluth, C. M., & Frick, T. W. (1999). Formative research: A methodology for creating and improving design theories. In C. Reigeluth (Ed.), Instructional-design theories and models. A new paradigm of instructional theory (Vol. 2) (pp. 633–651), Mahwah, NJ: Lawrence Erlbaum Associates.

Richey, R. C., & Nelson, W. A. (1996). Developmental research. In D. Jonassen (Ed.), Handbook of research for educational communications and technology (pp. 1213–1245), London, England: Macmillan.

Sandoval, W. A., & Bell, P. (2004). Design-based research methods for studying learning in context: Introduction. Educational Psychologist, 39 (4), 199–201.

Schneider, B. (2007). Design as practice, science and research. In R. Michel (Ed.), Design research now (pp. 207–218). Basel, Switzerland: Birkhäuser Verlag AG.

Shavelson, R. J., Phillips, D. C., Towne, L., & Feuer, M. J. (2003). On the science of education design studies. Educational Researcher, 32 (1), 25–28. doi:10.3102/0013189X032001025

Simon, H. A. (1969). The sciences of the artificial. Cambridge, MA: The MIT Press.

Tabak, I. (2004). Reconstructing context: Negotiating the tension between exogenous and endogenous educational design. Educational Psychologist, 39 (4), 225–233.

van den Akker, J. (1999). Principles and methods of development research. In J. van den Akker, R. M. Branch, K. Gustafson, N. Nieveen, & T. Plomp (Eds.), Design approaches and tools in education and training (pp. 1–14). Norwell, MA: Kluwer Academic Publishers.

van den Akker, J., & Plomp, T. (1993). Development research in curriculum: Propositions and experiences. Paper presented at the annual meeting of the American Educational Research Association, April 12–14, Atlanta, GA.

Walker, D.F., (1992). Methodological issues in curriculum research, In Jackson, P. (Ed.), Handbook of research on curriculum (pp. 98–118). New York, NY: Macmillan.

Walker, D., & Bresler, L. (1993). Development research: Definitions, methods, and criteria. Paper presented at the annual meeting of the American Educational Research Association, April 12–16, Atlanta, GA.

Willemien, V. (2009). Design: One, but in different forms. Design Studies, 30 (3), 187–223. doi:10.1016/j.destud.2008.11.004

Further Video Resource

Rick West at DBRX

Video available at  http://bit.ly/WestDBRX

Foundations of Learning and Instructional Design Technology Copyright © 2018 by Kimberly Christensen and Richard E. West is licensed under a Creative Commons Attribution 4.0 International License , except where otherwise noted.

Aims of DBR

Traditional, experimental research is conducted by theorists focused on isolating variables to test and refine theory. DBR is conducted by designers focused on (a) understanding contexts, (b) designing effective systems, and (c) making meaningful changes for the subjects of their studies (Barab & Squire, 2004; Collins, 1990). Traditional methods of research generate refined understandings of how the world works, which may indirectly affect practice. In DBR there is an intentionality in the research process to both refine theory and practice (Collins et al., 2004).

Role of DBR Researcher

In DBR, researchers assume the roles of "curriculum designers, and implicitly, curriculum theorists" (Barab & Squire, 2004, p. 2). As curriculum designers, DBR researchers come into their contexts as informed experts with the purpose of creating, "test[ing] and refin[ing] educational designs based on principles derived from prior research" (Collins et al., 2004, p. 15). These educational designs may include curricula, practices, software, or tangible objects beneficial to the learning process (Barab & Squire, 2004). As curriculum theorists, DBR researchers also come into their research contexts with the purpose of refining extant theories about learning (Brown, 1992).

This duality of roles contributes to a greater sense of responsibility and accountability within the field. Traditional, experimental researchers isolate themselves from the subjects of their study (Barab & Squire, 2004). This separation is seen as a virtue, allowing researchers to make dispassionate observations as they test and refine their understandings of the world around them. In comparison, design-based researchers "bring agendas to their work," seeing themselves as necessary agents of change who are accountable for the work they do (Barab & Squire, 2004, p. 2).

Role of DBR Subjects

Within DBR, research subjects are seen as key contributors and collaborators in the research process. Classic experimentalism views the subjects of research as things to be observed or experimented on, suggesting a unidirectional relationship between researcher and research subject. The role of the research subject is to be available and genuine so that the researcher can make meaningful observations and collect accurate data. In contrast, design-based researchers view the subjects of their research (e.g., students, teachers, schools) as “co-participants” (Barab & Squire, 2004, p. 3) and “co-investigators” (Collins, 1990, p. 4). Research subjects are seen as necessary in “helping to formulate the questions,” “making refinements in the designs,” “evaluating the effects of...the experiment,” and “reporting the results of the experiment to other teachers and researchers” (Collins, 1990, pp. 4-5). Research subjects are co-workers with the researcher in iteratively pushing the study forward.

Outcomes of DBR

DBR educational research develops knowledge through this collaborative, iterative research process. The knowledge developed by DBR can be separated into two categories: (a) tangible, practical outcomes and (b) intangible, theoretical outcomes.

Tangible Outcomes

A major goal of design-based research is producing meaningful interventions and practices. Within educational research these interventions may "involve the development of technological tools [and] curricula" (Barab & Squire, 2004, p. 1). But more than just producing meaningful educational products for a specific context, DBR aims to produce meaningful, effective educational products that can be transferred and adapted (Barab & Squire, 2004). As expressed by Brown (1992), "an effective intervention should be able to migrate from our experimental classroom to average classrooms operated by and for average students and teachers" (p. 143).

Intangible Outcomes

It is important to recognize that DBR is not only concerned with improving practice but also aims to advance theory and understanding (Collins et al., 2004). DBR's emphasis on the importance of context enhances the knowledge claims of the research. "Researchers investigate cognition in context...with the broad goal of developing evidence-based claims derived from both laboratory-based and naturalistic investigations that result in knowledge about how people learn" (Barab & Squire, 2004, p. 1). This new knowledge about learning then drives future research and practice.

Process of DBR

A hallmark of DBR is the iterative nature of its interventions. As each iteration progresses, researchers refine and rework the intervention, drawing on a variety of research methods that best fit the context. This flexibility allows the end result to take precedence over rigid adherence to any single process. While each researcher may use different methods, McKenney and Reeves (2012) outlined three core processes of DBR: (a) analysis and exploration, (b) design and construction, and (c) evaluation and reflection. To put these ideas in context, we will refer to a recent DBR study completed by Siko and Barbour (2016) regarding the use of PowerPoint games in the classroom.

Figure: The Iterative Process of Design-Based Research

Analysis and Exploration

Analysis is a critical aspect of DBR and must be used throughout the entire process. At the start of a DBR project, it is critical to understand and define which problem will be addressed. In collaboration with practitioners, researchers seek to understand all aspects of a problem. Additionally, they "seek out and learn from how others have viewed and solved similar problems" (McKenney & Reeves, 2012, p. 85). This analysis helps to provide an understanding of the context within which to execute an intervention.

Since theories cannot account for the variety of variables in a learning situation, exploration is needed to fill the gaps. DBR researchers can draw from a number of disciplines and methodologies as they execute an intervention. The decision of which methodologies to use should be driven by the research context and goals.

Siko and Barbour (2016) used the DBR process to address a gap they found in research regarding the effectiveness of having students create their own PowerPoint games to review for a test. In analyzing existing research, they found studies that stated teaching students to create their own PowerPoint games did not improve content retention. Siko and Barbour wanted to “determine if changes to the implementation protocol would lead to improved performance” (Siko & Barbour, 2016, p. 420). They chose to test their theory in three different phases and adapt the curriculum following each phase.

Design and Construction

Informed by the analysis and exploration, researchers design and construct interventions, which may be a specific technology or “less concrete aspects such as activity structures, institutions, scaffolds, and curricula” (Design-Based Research Collective, 2003, pp. 5–6). This process involves laying out a variety of options for a solution and then creating the idea with the most promise.

In their design, Siko and Barbour planned to observe three phases with a control group and a test group. In each phase, t-tests would be used to compare scores on two unit tests for each group. They worked with teachers to implement time for playing PowerPoint games as well as a discussion of what makes games successful. The first implementation was a control phase that replicated past research and established a baseline. Once that phase was finished, they began to evaluate.
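
To make this kind of phase comparison concrete, here is a minimal sketch of an independent-samples t-test using SciPy. The scores, group sizes, and variable names are hypothetical illustrations, not data from Siko and Barbour's study.

```python
# Minimal sketch: comparing unit test scores between a hypothetical
# control group and test group with an independent-samples t-test.
from scipy import stats

control_scores = [72, 68, 75, 80, 66, 71, 78, 74]  # hypothetical scores
test_scores = [79, 83, 76, 88, 81, 77, 85, 80]     # hypothetical scores

t_statistic, p_value = stats.ttest_ind(control_scores, test_scores)
print(f"t = {t_statistic:.2f}, p = {p_value:.3f}")
# A p-value below 0.05 would suggest the difference in mean scores
# is unlikely to be due to chance alone.
```

In a DBR study, a test like this would be repeated for each phase, with the results feeding back into the next design iteration rather than ending the inquiry.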

Evaluation and Reflection

Researchers can evaluate their designs both before and after use. The cyclical process involves careful, constant evaluation for each iteration so that improvements can be made. While tests and quizzes are a standard way of evaluating educational progress, interviews and observations also play a key role, as they allow for better understanding of how teachers and students might see the learning situation.

Reflection allows the researcher to make connections between actions and results. Researchers must take the time to analyze what changes allowed them to have success or failure so that theory and practice at large can be benefited. Collins (1990) states:

It is important to analyze the reasons for failure and to take steps to fix them. It is critical to document the nature of the failures and the attempted revisions, as well as the overall results of the experiment, because this information informs the path to success. (p. 5)

As researchers reflect on each change they made, they find what is most useful to the field at large, whether it be a failure or a success.

After evaluating results of the first phase, Siko and Barbour revisited the literature of instructional games. Based on that research, they first tried extending the length of time students spent creating the games. They also discovered that the students struggled to design effective test questions, so the researchers tried working with teachers to spend more time explaining how to ask good questions. As they explored these options, researchers were able to see unit test scores improve.

Reflection on how the study was conducted allowed the researchers to properly situate their experiences within the context of existing research. They recognized that while their intervention had positive impacts, the study had a number of limitations. This is an important realization, and it helps readers avoid misinterpreting the scope of the findings.

This chapter has provided a brief overview of the origin, paradigms, outcomes, and processes of Design-Based Research (DBR). We explained that (a) DBR originated because some researchers believed that traditional research methods failed to improve classroom practices, (b) DBR places researchers as agents of change and research subjects as collaborators, (c) DBR produces both new designs and theories, and (d) DBR consists of an iterative process of design and evaluation to develop knowledge.

Barab, S., & Squire, K. (2004). Design-based research: putting a stake in the ground. Journal of the Learning Sciences, 13(1), 1–14.

Brown, A. L. (1992). Design experiments: theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences, 2(2), 141–178.

Collins, A. (1990). Toward a design science of education (Report No. 1). Washington, DC: Center for Technology in Education.

Collins, A., Joseph, D., & Bielaczyc, K. (2004). Design research: Theoretical and methodological issues. Journal of the Learning Sciences, 13(1), 15–42.

McKenney, S., & Reeves, T. C. (2012). Conducting educational design research. New York, NY: Routledge.

Siko, J. P., & Barbour, M. K. (2016). Building a better mousetrap: how design-based research was used to improve homemade PowerPoint games. TechTrends, 60(5), 419–424.

The Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32(1), 5–8.

Design-Based Research and Interventions

Design-Based Research (DBR) is a research methodology used by researchers in the learning sciences. DBR is a concentrated, collaborative and participatory approach to educational inquiry. The basic process of DBR involves developing solutions or interventions to problems (Anderson & Shattuck, 2012). An “Intervention” is any interference that would modify a process or situation. Interventions are thus intentionally implemented change strategies (Sundell & Olsson, 2017). Data analysis takes the form of iterative comparisons. The purpose of this research perspective is to generate new theories and frameworks for conceptualising learning and instruction.

One positive aspect of DBR is that it can bring researchers and practitioners together to design context-based solutions to educational problems, which carries deep-rooted meaning for practitioners about the relationship between educational theory and practice. DBR assumes a timeframe that allows for several rounds of review and iteration. It might be seen as a long-term and intensive approach to educational inquiry that is not well suited to doctoral work, but there are increasingly examples of it being used in that context (Goff & Getenet, 2017).

DBR provides a significant methodological approach for understanding and addressing problems of practice, particularly in the educational context, where a long-standing criticism of educational research is that it is often divorced from everyday reality (Design-Based Research Collective, 2003). DBR is about balancing practice and theory, meaning the researcher must act as both practitioner and researcher. DBR allows data to be collected in multiple ways and encourages the development of meaningful relationships with the data and the participants. DBR can also be used as a practical way to engage with real-life issues in education.

DBR & Interventions: GO-GN Insights

Roberts (2019) used a design-based research (DBR) approach to examine how secondary students expanded their learning from formal to informal learning environments using the open learning design intervention (OLDI) framework to support the development of open educational practices (OEP).

"We took some methods and research classes in my EdD program. I took design-based research (DBR) and found it confusing and overwhelming. As such, I decided to take an extra course on case study research because it seemed to speak to me the most. In my mind I thought I could compare and contrast a variety of secondary school teachers integrating open ed practices. Through my initial exploration, I discovered that in my school district (30,000+ students), there are many teachers using OEP, but they were not interested in working 'with' me; they wanted me to watch and observe them teach – then write about it. I began to understand that not only did I want to consider focusing my research on an emerging pedagogy (OEP), I also realized that I wanted to consider newer participatory methods. I did not think of DBR in this context when I took the initial course.

"I knew I wanted to work with a teacher and complete some kind of intervention in order to support them in thinking about and actually integrating OEP. DBR was suggested to me multiple times, but I kept pushing it away. At the same time many of my supervisory committee and my peers did not think I should even consider DBR. I discovered that many researchers don't know about it and are fearful of it. As I learned, when you do choose DBR, it is kind of like being an open learner in that you believe in the philosophy behind the DBR process. You just 'are' a DBR researcher and educator.

"It took many hours of reflection, reading about different examples of DBR, going to workshops and webinars about DBR in order to really see the possible benefits of DBR (collaborative, iterative, responsive, flexible, balanced between theory/practice, and relationship-based) to get me to take the plunge…" (Verena Roberts)

Useful references for design-based research: Anderson & Shattuck (2012); Design-Based Research Collective (2003); Goff & Getenet (2017); Sundell & Olsson (2017).

Research Methods Handbook Copyright © 2020 by Rob Farrow; Francisco Iniesto; Martin Weller; and Rebecca Pitt is licensed under a Creative Commons Attribution 4.0 International License , except where otherwise noted.

Design-Based Research Methods (DBR)

Summary: Design-Based Research is a lens or set of analytical techniques that balances the positivist and interpretivist paradigms and attempts to bridge theory and practice in education. A blend of empirical educational research with the theory-driven design of learning environments, DBR is an important methodology for understanding how, when, and why educational innovations work in practice; DBR methods aim to uncover the relationships between educational theory, designed artefact, and practice.

Originators: A. Brown [1], A. Collins [2], the DBR Collective [3], and others

Keywords: design experiments, iterative, interventionist, theory-building, theory-driven

In recent years, educators have been trying to narrow the chasm between research and practice. Part of the challenge is that research detached from practice "may not account for the influence of contexts, the emergent and complex nature of outcomes, and the incompleteness of knowledge about which factors are relevant for prediction" [3].

According to Collins et al., design-based research (also known as design experiments) is intended to address several needs and issues central to the study of learning [4]. These include the following:

  • The need to address theoretical questions about the nature of learning in context
  • The need for approaches to studying learning phenomena in real-world situations rather than the laboratory
  • The need to go beyond narrow measures of learning
  • The need to derive research findings from formative evaluation

Characteristics of design-based research experiments include:

  • addressing complex problems in real, authentic contexts in collaboration with practitioners
  • integrating known and hypothetical design principles to render plausible solutions
  • conducting rigorous and reflective inquiry to test and refine innovative learning environments
  • pursuing the intertwined goals of (1) designing learning environments and (2) developing theories of learning
  • carrying out research and development through continuous cycles of design, enactment, analysis, and redesign
  • producing research on designs that leads to sharable theories that help communicate relevant implications to practitioners and other educational designers
  • accounting for how designs function in authentic settings
  • relying on methods that can document and connect processes of enactment to outcomes of interest [3]

Design-based research vs. traditional evaluation

The following excerpt highlights the difference between the goals and contributions that design-based research methods can offer and those of traditional evaluation:

"In traditional evaluation, an intervention (e.g. a textbook, an instructional program, a policy) is measured against a set of standards. During formative evaluation, iterative cycles of development, implementation, and study allow the designer to gather information about how an intervention is or is not succeeding in ways that might lead to better design. Then the intervention is 'frozen', and the rigorous summative evaluation begins…. Like formative evaluation, design-based research uses mixed methods to analyze an intervention's outcomes and refine the intervention. Unlike evaluation research, design-based research views a successful innovation as a joint product of the designed intervention and the context. Hence, design-based research goes beyond perfecting a particular product. The intention of design-based research…is to inquire more broadly into the nature of learning in a complex system and to refine generative or predictive theories of learning. Models of successful innovation can be generated through such work—models, rather than particular artifacts or programs, are the goal" [3].

For more information, see:

  • Cobb, P., Confrey, J., diSessa, A., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9–13.
  • Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. The Journal of the Learning Sciences, 2(2), 141–178.
  • Collins, A. (1992). Towards a design science of education. In E. Scanlon & T. O'Shea (Eds.), New directions in educational technology (pp. 15–22). Berlin: Springer.
  • Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32(1), 5–8.
  • Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. The Journal of the Learning Sciences, 13(1), 1–14.

A systematic literature review of design-based research from 2004 to 2013

Lanqin Zheng

Published 24 June 2015; Volume 2, pages 399–420.

Design-based research (DBR), which blends designing learning environments with developing theories, has proliferated in recent years. In order to gain insights into DBR, 162 studies related to DBR published from 2004 to 2013 were selected and reviewed. The major findings indicated that most of the studies focused on designing, developing, and redesigning learning environments through interventions; however, how interventions were revised was often not specified in detail. The testing of interventions was also found to remain dependent on the measurement of cognitive outcomes. Furthermore, most DBR studies conducted only one cycle of iteration. This review not only identifies the progress of DBR but also provides future directions for researchers and practitioners.

Introduction

Design-based research (DBR) has emerged as a new research methodology since the beginning of this century. Situated in real contexts, DBR focuses on examining a particular intervention through continuous iteration of design, enactment, analysis, and redesign (Brown 1992; Cobb et al. 2003; Collins 1992). The intervention can be an instructional approach, a type of assessment, a learning activity, or a technological intervention—that is, testing the effectiveness of a particular learning environment or tool (Anderson and Shattuck 2012). With the aim of designing learning environments and developing theories, DBR explicates how designs work in real settings and how to better understand the teaching and learning issues involved (The Design-Based Research Collective 2003). As an emerging paradigm, DBR highlights how design principles evolve through multiple iterations, as well as what kinds of intervention lead to improved outcomes. By linking processes to outcomes in particular contexts, DBR can produce a better understanding of interventions as well as improved theoretical accounts (The Design-Based Research Collective 2003).

DBR has been used increasingly in the educational field, especially in K-12 contexts with technological interventions (Anderson and Shattuck 2012). Although the promising benefits of DBR are acknowledged in the field of education, many critiques have been raised in previous studies. It has been doubted that researchers can produce reliable and faithful statements in DBR, because the researchers themselves are involved in the design, development, and implementation of interventions (Barab and Squire 2004); it is thus difficult to achieve high research validity in DBR. Furthermore, it is impossible to replicate an intervention in other settings because DBR is contextually dependent (The Design-Based Research Collective 2003; Fishman et al. 2004; Hoadley 2002). There are therefore clear gaps between the expectations and the application of DBR. This situation raises the question of how DBR has been adopted and realized in education research over the past decade. Has DBR been most effective in particular learning domains or research settings? What kinds of methods were utilized in DBR? How do researchers design and implement interventions in DBR? To answer these questions, this study conducted a systematic review of existing studies to gain insights into the research issues of DBR and to provide useful references for educators and practitioners.

Previous studies have attempted to analyze the methodology, progress, and issues of DBR. For example, Anderson and Shattuck (2012) reviewed the characteristics and progress of DBR by analyzing the abstracts of the 47 most-cited papers from 2002 to 2011. McKenney and Reeves (2013) suggested that in-depth analysis of the full text of DBR studies should be conducted in order to provide sufficient evidence for assessing a decade's progress. However, little research has thoroughly analyzed the demographics, research methodology, interventions, and research outcomes of the field. Therefore, this study aims to provide an overview of DBR through a systematic analysis of 162 studies selected from a database of 219 Social Sciences Citation Index (SSCI) educational journals from 2004 to 2013.

As Noyons and van Raan (1998) reported, separating published papers into two periods can provide insights into the variation of a particular topic. Several studies have analyzed such variation by splitting data into different time periods. For example, Tsai et al. (2011) examined the variation in science learning by analyzing 228 empirical studies from 2000–2004 and 2005–2009. Kinshuk et al. (2013) analyzed highly cited educational technology papers from 2003–2006 and 2007–2010. Zheng et al. (2014) investigated the research topics of computer-supported collaborative learning by analyzing 706 papers published during 2003–2007 and 2008–2012. Thus, the present study conducts an in-depth review of the demographics, research methodology, interventions, and research outcomes of DBR, comparing the first 5 years (2004–2008) with the second 5 years (2009–2013). The purpose of this review is twofold. First, the authors investigate the status quo of DBR from 2004 to 2013. Second, they explore the variations between the first 5 years (2004–2008) and the second 5 years (2009–2013) in demographics, research methods, intervention characteristics, and research outcomes based on the selected studies. The research questions addressed in this study are as follows:

What are the demographics of the selected studies from 2004 to 2013, and how did they vary between the first 5 years and the second 5 years?

What research methodologies were adopted in the selected studies from 2004 to 2013, and how did they vary between the two periods?

What kinds of interventions were adopted in DBR from 2004 to 2013, and how did they vary between the two periods?

What outcomes were measured in DBR from 2004 to 2013, and how did they vary between the two periods?

Methodology

This study adopted a content analysis method to review research papers on DBR published from 2004 to 2013. This section describes the details of the paper selection process, the coding scheme, and inter-rater reliability.

Paper selection processes

In order to conduct a systematic literature review on DBR, this study selected papers relevant to DBR from a database of 219 education and educational research SSCI-indexed journals from 2004 to 2013. More specifically, the paper identification process proceeded in three stages. In the first stage, 479 papers related to DBR were selected using keyword and paper-title searches within the 219 journals. The search terms included "design research" and its synonyms (viz., "design-based research," "design based research," "design experiment," "design experiments," "development research," "developmental research," and "formative research"). In the second stage, the authors selected papers based on the following six criteria:

First, only papers that were categorized as “articles” in the SSCI database were analyzed in this study. So non-research publications such as “book reviews,” “editorials,” and “letters” were excluded from this study.

Second, conceptual papers closely related to DBR were included so as to produce a comprehensive understanding of DBR.

Third, the studies had to adopt a DBR method to conduct the empirical study.

Fourth, the measured outcome variable(s) in the empirical study had to relate to student outcomes (cognitive outcomes, attitude, or psychomotor skills).

Fifth, the empirical study had to follow an appropriate methodology (Jitendra et al. 2011): the research sample groups, settings, learning domains, data sources, and data analysis procedures had to be specified.

Sixth, the paper had to be written in English and published from 2004 to 2013.

Papers failing to satisfy any of these criteria were excluded from the literature review. The search and identification process ultimately yielded 162 articles.
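
As an illustration of the first-stage screening, the sketch below filters a small set of hypothetical paper records against the search terms listed above. Matching only on titles is a simplifying assumption; the actual search also used keywords within the SSCI journal database.

```python
# Minimal sketch of keyword/title screening for the first selection stage.
SEARCH_TERMS = [
    "design-based research", "design based research", "design research",
    "design experiment", "design experiments", "development research",
    "developmental research", "formative research",
]

# Hypothetical records standing in for the SSCI journal database.
papers = [
    {"title": "A design-based research study of scaffolding in K-12 science"},
    {"title": "Survey research on teacher attitudes toward technology"},
]

def matches(paper):
    """Keep a paper if any search term appears in its title."""
    title = paper["title"].lower()
    return any(term in title for term in SEARCH_TERMS)

first_stage = [p for p in papers if matches(p)]
print(len(first_stage))  # -> 1
```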

Coding scheme

To answer the four research questions above, a coding scheme was developed for reviewing DBR over the past decade. To address the first research question, concerning demographics, the categories included research sample group, research settings, and research learning domains. To analyze research methodology in DBR, we focused on the research methods and data sources adopted in the selected studies. To answer the third research question, regarding intervention characteristics, the categories included intervention type, revision of intervention, iteration frequency, and iteration duration. To explore which outcomes were assessed, we adopted the coding scheme proposed by Wang et al. (2014), namely cognitive outcomes, attitude, psychomotor skills, integrated, and others. Some of these categories—research sample groups, research settings, research learning domains, research methods, data sources, and measured outcomes—have also been applied in other reviews (Hsu et al. 2012; Wang et al. 2014). The following sections illustrate the details of each sub-dimension.

Research sample groups

Research sample groups were classified into one of the following sub-categories: (1) preschool, (2) primary school, (3) junior and senior high school, (4) higher education, (5) vocational education, (6) teachers, (7) mixed group, and (8) non-specified.

Research settings

Research settings refer to the contexts in which the research was mainly conducted. Research settings were coded as follows: (1) face-to-face classroom, (2) workplace, (3) distance learning setting, (4) blended learning setting, and (5) non-specified. If a study took place in a workplace combined with a distance learning setting, it was coded as workplace.

Research learning domains

Research learning domains were classified into the following sub-categories: (1) natural science (including science, mathematics, physics, chemistry, biology, geography, and environment science), (2) social science (including politics, education, psychology, and linguistics), (3) engineering and technological science (including engineering and computer science), (4) medical science, (5) mixed learning domain, and (6) non-specified.

Research method

Research method was coded as follows: (1) qualitative, (2) quantitative, and (3) mixed qualitative and quantitative. Qualitative methods are those in which investigators use narratives, ethnographies, case studies, and so on to develop knowledge. Quantitative methods are those in which investigators adopt experiments, surveys, and so on to develop knowledge (Creswell 2009).

Data sources

Within DBR, multiple data sources can be used to analyze the outcomes of an intervention and to refine it (Cobb et al. 2003; The Design-Based Research Collective 2003; Wang and Hannafin 2005). In this study, the data sources were coded as follows: (1) process data, including video and audio records, log data, and think-aloud protocols; (2) outcome data, including tests and various kinds of artifacts; (3) miscellaneous data, including questionnaires, interview data, and notes (such as field notes, journals, written reflections, and observation records); and (4) non-specified.

Intervention type

The intervention type was coded as follows: (1) instructional method (such as collaborative learning or project-based instruction), (2) scaffolding (conceptual, procedural, or metacognitive scaffolding), (3) integrated teaching models (such as knowledge-building activities), (4) technological intervention (namely, testing the effectiveness of a learning environment or a particular tool), and (5) other models or methods (such as a professional development model or a heuristic task analysis method).

Revision of intervention

Revision of intervention refers to whether the intervention was revised and whether the revision was specified. In terms of revision, interventions were coded as (1) revised or (2) no revision. With respect to specifying how the intervention was revised, studies were coded as (1) reported or (2) no report.

Iteration frequency

Iteration frequency refers to the number of times the intervention was implemented over the course of the study. In this study, iteration frequency was coded as once, twice, three times, four times, five times, or more than five times.

Iteration duration

Iteration duration refers to how long the intervention was conducted over the whole study. This time span can range from several days to several years.

Measured outcomes

The measured outcomes refer to the crucial variables investigated. Three major domains were selected as the measured outcomes for this study, namely cognitive outcomes, attitude, and psychomotor skills. In addition, studies that measured multiple kinds of variables were categorized as "integrated," and measured outcomes that did not belong to these categories were classified as "others." Therefore, the measured outcomes were coded as follows: (1) cognitive outcomes, (2) attitude, (3) psychomotor skills, (4) integrated, and (5) others.

Inter-rater reliability

Three raters manually and independently coded all of the articles based on the aforementioned schemes. Percent agreement was used to calculate inter-rater reliability. The agreement rate between coders was above 0.9, which is regarded as indicating reliable and stable results (Landis and Koch 1977). The three raters resolved all discrepancies through face-to-face discussion.
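
The percent-agreement statistic itself is straightforward. Below is a minimal sketch assuming hypothetical codes from two of the three raters on ten articles; with three raters, the pairwise agreement rates would typically be averaged.

```python
# Minimal sketch: percent agreement between two raters on ten articles.
rater_a = ["qual", "quant", "mixed", "qual", "qual",
           "quant", "mixed", "qual", "quant", "qual"]
rater_b = ["qual", "quant", "mixed", "qual", "quant",
           "quant", "mixed", "qual", "quant", "qual"]

agreements = sum(a == b for a, b in zip(rater_a, rater_b))
percent_agreement = agreements / len(rater_a)
print(percent_agreement)  # -> 0.9
```

Percent agreement is simple but does not correct for chance agreement; statistics such as Cohen's kappa are often reported alongside it for that reason.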

Demographics of the selected studies

Table 1 shows the descriptive data for the demographics of the selected studies in the first 5 years (2004–2008) and the second 5 years (2009–2013).

As shown in Table 1, researchers most often selected the higher education group in both periods, while preschool samples were the least selected in both periods. Additionally, the most significant increase was found in the vocational education sample group (χ² = 1.97, p < 0.05) and the most significant decrease in the junior and senior high school group (χ² = 2.01, p < 0.05) between the two periods. No significant differences were found in the other sample groups.

Table 1 also shows that most of the research was conducted in face-to-face classrooms. However, there was a significant decrease in face-to-face classroom settings (χ² = 2.51, p < 0.05) between the first 5 years and the second 5 years, and a significant increase in distance learning settings (χ² = 4.73, p < 0.05). Blended learning settings and workplaces both grew across the two periods, but neither difference was significant (blended learning: χ² = 0.53, p > 0.05; workplace: χ² = 1.85, p > 0.05).

In DBR, researchers selected different learning domains to investigate how interventions function across cycles. During the past decade, natural science was selected most often and medical science least often. No significant differences were found between the two periods in natural science (χ² = 0.68, p > 0.05), social science (χ² = 1.24, p > 0.05), engineering and technological science (χ² = 0.06, p > 0.05), medical science (χ² = 0.01, p > 0.05), or mixed learning domains (χ² = 1.22, p > 0.05).
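
The period comparisons above rest on chi-square tests. As an illustration, the sketch below runs one such test on a 2×2 table of invented counts (studies in a given category vs. not, for each 5-year period); the numbers are hypothetical, not the review's data.

```python
# Minimal sketch: chi-square test of whether a category's share of
# studies differs between the two five-year periods.
from scipy.stats import chi2_contingency

# Rows: 2004-2008 and 2009-2013. Columns: in category, not in category.
table = [[30, 40],   # hypothetical counts for 2004-2008
         [18, 74]]   # hypothetical counts for 2009-2013

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}, dof = {dof}")
# p < 0.05 would indicate a significant shift between the periods.
```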

Research methodology

Table 2 shows the descriptive results for research methods and data sources. With respect to research method, qualitative methods were adopted the most and quantitative methods the least. However, there were no significant differences in qualitative methods (χ² = 0.25, p > 0.05), quantitative methods (χ² = 0.006, p > 0.05), or combined qualitative and quantitative methods (χ² = 0.26, p > 0.05) between the two periods.

With regard to data sources, researchers collected various kinds of data. Miscellaneous data (such as interview data, questionnaires, and various kinds of notes) were utilized the most over the past decade. Process data increased from 11.82 to 14.54 %, while outcome data decreased from 27.27 to 18.44 %. However, none of these changes was significant: process data (χ² = 0.25, p > 0.05), outcome data (χ² = 0.61, p > 0.05), and miscellaneous data (χ² = 0.50, p > 0.05).

Intervention

Table 3 shows the descriptive data for intervention type, iteration frequency and duration, and revision of intervention. Over the past decade, technological interventions were the major type of intervention used in DBR, although no significant difference was found for this type between the two periods (χ² = 0.46, p > 0.05). Among the other intervention types, the most significant increase was found for scaffold interventions (χ² = 5.37, p < 0.05), while instructional-method interventions showed a non-significant decrease (χ² = 2.25, p > 0.05). Although integrated teaching models and other models increased (from 15.56 to 16.24 %), no significant differences were found for these two types of intervention.

Over the past decade, 74.07 % of DBR studies revised the intervention. Across the two 5-year periods, there was a growing tendency toward not revising the intervention and a decreasing trend in revising it; however, neither change was significant (revision: χ² = 1.35, p > 0.05; no revision: χ² = 1.08, p > 0.05). We further examined whether studies reported in detail how the intervention was revised. As shown in Table 3, 60.49 % of DBR studies in the past decade reported what had been revised. Between the two periods, however, there was a significant decrease in studies reporting the revision (χ² = 3.08, p < 0.05) and a significant increase in studies not reporting it (χ² = 2.02, p < 0.05).

In terms of iteration frequency, 50 % of DBR studies conducted only one cycle over the past decade. There were slight increases in the iteration frequencies of once, twice, and four times, and slight decreases in the frequencies of three and five times. However, no significant difference was found in any iteration frequency between the two 5-year periods.

Most DBR studies spent less than 1 year (42.6 %) or exactly 1 year (25.93 %) designing and testing an intervention; 15.43 % spent 2 years, and only a small proportion (4.32 %) ran for more than 3 years. As shown in Table 3, iteration durations of 2 years, 3 years, and more than 3 years decreased from the first 5 years to the second 5 years, while shorter durations (1 month, 6 months, and 1 year) slightly increased. However, no significant difference was found in any iteration duration between the two periods.

Among the 162 studies, most focused on measuring cognitive outcomes (see Table 4). Some studies examined integrated skills, such as problem solving and inquiry abilities. Only a few studies measured learners' attitudes or psychomotor skills. However, there was a significant increase in attitude measurement between the first 5 years and the second 5 years (χ² = 2.77, p < 0.05). No significant differences were found in the measurement of cognitive outcomes (χ² = 0.21, p > 0.05), psychomotor skills (χ² = 0.01, p > 0.05), integrated skills (χ² = 0.26, p > 0.05), or others (χ² = 0.11, p > 0.05).

The study presented in this paper describes the status of DBR over the past decade based on 162 selected SSCI papers. The demographics of the selected studies revealed that higher education was the most commonly sampled group in DBR. In terms of research settings, distance learning settings significantly increased and face-to-face classroom settings significantly decreased between the two periods. DBR was also most common in the natural science learning domain.

With regard to research methodology, most researchers selected qualitative methods to conduct DBR. This result is consistent with prior research indicating that DBR can be descriptive and explanatory in nature (McKenney and Reeves 2012). In terms of data sources, miscellaneous data such as interview data, questionnaires, and various kinds of notes were adopted in most DBR studies. This is in line with previous studies reporting that DBR is typically conducted using multiple forms of data (Dede 2004; Wang and Hannafin 2005). Furthermore, multivocal analysis in DBR has been called for in order to obtain trustworthy and credible conclusions (Fujita 2013).

Results of the present study indicated that most researchers tested technological interventions by designing, developing, implementing, and revising particular technological tools. The results also revealed that although most studies revised the intervention, there was a significant decrease in specifying how the intervention was revised, indicating a tendency among researchers not to report the details of design and intervention revisions. In addition, most studies tested the intervention through only one cycle, and the iteration duration was 1 year or less in most DBR. This can be explained by the finding of Anderson and Shattuck (2012) that multiple iterations and cycles often go beyond the time and resources available to researchers.

With respect to the measured outcomes, the results indicated that the effectiveness of designs and interventions was mostly captured by measuring learners' cognitive outcomes, such as learning achievements, conceptual change, and artifacts. Very few studies measured learners' attitudes or psychomotor skills.

The theoretical and practical implications for future research are as follows. First, this study suggests that more effort is needed to make DBR sound and reliable, because good research requires objectivity, reliability, and validity (Norris 1997). Second, the findings suggest that multiple iterations are required in DBR in order to refine theories, methods, or tools. Third, caution should be taken when generalizing the results of DBR, because findings are drawn from local contexts (The Design-Based Research Collective 2003). Fourth, design activities that can yield very interesting outcomes have received little attention in DBR (Reimann 2011); it is suggested that the design itself, and how the design functions, be emphasized in future studies. Finally, educational research is expected to create useful knowledge and provide scientific claims (Lagemann 2002; National Research Council 2002); therefore, DBR needs to be improved so as to produce useful and replicable knowledge in the future.

This study thoroughly examined the research sample groups, settings, learning domains, research methods, data sources, intervention types, revision of interventions, iteration frequency, iteration duration, and measured outcomes in DBR. The main conclusion is that technological interventions dominate most DBR studies. However, there is a tendency among researchers not to report the details of how the intervention was revised, and only one cycle of iteration was conducted in most studies. In addition, most DBR studies adopted a qualitative approach and miscellaneous data sources.

This study contributes to a better understanding of the status of DBR. The analysis of variation between the two periods (2004–2008 and 2009–2013) can suggest directions for potential research topics. Furthermore, this study proposes the need for new approaches that emphasize the design process and highlight the value of replicability in research.

The results of this study are subject to several constraints. First, this study analyzed only demographics, research methods, interventions, and research outcomes; it would be valuable to analyze thoroughly how designs function and evolve across cycles. Second, only journal articles published from 2004 to 2013 were examined; future studies should extend the data sources to conduct a more deliberate analysis. Finally, it would also be useful to analyze highly cited papers with strong influence and valuable contributions in the field of DBR, which could provide important insights into future directions for educators and researchers.

Anderson, T., & Shattuck, J. (2012). Design-based research: A decade of progress in education research? Educational Researcher, 41(1), 16–25.

Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. Journal of the Learning Sciences, 13(1), 1–14.

Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences, 2(2), 141–178.

Cobb, P., Confrey, J., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9–13.

Collins, A. (1992). Towards a design science of education. In E. Scanlon & T. O'Shea (Eds.), New directions in educational technology (pp. 15–22). Berlin: Springer.

Creswell, J. W. (2009). Research design: Qualitative, quantitative, and mixed methods approaches. Thousand Oaks, CA: Sage Publications.

Dede, C. (2004). If design-based research is the answer, what is the question? A commentary on Collins, Joseph, and Bielaczyc; diSessa and Cobb; and Fishman, Marx, Blumenthal, Krajcik, and Soloway in the JLS special issue on design-based research. Journal of the Learning Sciences, 13(1), 105–114.

Fishman, B., Marx, R. W., Blumenfeld, P., Krajcik, J., & Soloway, E. (2004). Creating a framework for research on systemic technology innovations. Journal of the Learning Sciences, 13(1), 43–76.

Fujita, N. (2013). Critical reflections on multivocal analysis and implications for design-based research. In D. D. Suthers et al. (Eds.), Productive multivocality in the analysis of group interactions (pp. 435–455). New York: Springer.

Hoadley, C. P. (2002). Creating context: Design-based research in creating and understanding CSCL. In G. Stahl (Ed.), Proceedings of the conference on computer support for collaborative learning: Foundations for a CSCL community (pp. 453–462). Psychology Press.

Hsu, Y.-C., Ho, H. N. J., Tsai, C.-C., Hwang, G.-J., Chu, H.-C., Wang, C.-Y., & Chen, N.-S. (2012). Research trends in technology-based learning from 2000 to 2009: A content analysis of publications in selected journals. Educational Technology and Society, 15(2), 354–370.

Jitendra, A. K., Burgess, C., & Gajria, M. (2011). Cognitive strategy instruction for improving expository text comprehension of students with learning disabilities: The quality of evidence. Exceptional Children, 77, 135–159.

Kinshuk, Huang, H.-W., Sampson, D., & Chen, N.-S. (2013). Trends in educational technology through the lens of the highly cited articles published in the Journal of Educational Technology and Society. Educational Technology and Society, 16(2), 3–20.

Lagemann, E. C. (2002). An elusive science: The troubling history of education research. Chicago: University of Chicago Press.

Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics, 33(1), 159–174.

McKenney, S., & Reeves, T. C. (2012). Conducting educational design research. London: Routledge.

McKenney, S., & Reeves, T. C. (2013). Systematic review of design-based research progress: Is a little knowledge a dangerous thing? Educational Researcher, 42(2), 97–100.

National Research Council. (2002). Scientific research in education. Washington, DC: National Academy Press.

Norris, N. (1997). Error, bias and validity in qualitative research. Educational Action Research, 5(1), 172–176.

Noyons, E. C. M., & van Raan, A. F. J. (1998). Monitoring science developments from a dynamic perspective: Self-organized structuring to map neural network research. Journal of the American Society for Information Science and Technology, 49(1), 68–81.

Reimann, P. (2011). Design-based research. In L. Markauskaite et al. (Eds.), Methodological choice and design: Scholarship, policy and practice in social and educational research (pp. 37–50). New York: Springer.

The Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32(1), 5–8.

Tsai, C.-C., Wu, Y.-T., Lin, Y.-C., & Liang, J.-C. (2011). Research regarding science learning in Asia: An analysis of selected science education journals. The Asia-Pacific Education Researcher, 20(2), 352–363.

Wang, C.-Y., Wu, H.-K., Lee, S. W.-Y., Hwang, F.-K., Chang, H.-Y., Wu, Y.-T., et al. (2014). A review of research on technology-assisted school science laboratories. Educational Technology and Society, 17(2), 307–320.

Wang, F., & Hannafin, M. J. (2005). Design-based research and technology-enhanced learning environments. Educational Technology Research and Development, 53(4), 5–23.

Zheng, L., Huang, R., & Yu, J. (2014). Identifying computer-supported collaborative learning (CSCL) research in selected journals published from 2003 to 2012: A content analysis of research topics and issues. Educational Technology and Society, 17(4), 335–351.


Acknowledgments

This study was supported by the 2014 Beijing youth project of the Twelfth Five-Year Plan for Educational Science, "The empirical study on improving the self-regulated learning abilities in smart learning environment" (CJA14185).

Author information

Authors and affiliations

School of Educational Technology, Faculty of Education, Beijing Normal University, Beijing, 100875, China

Lanqin Zheng


Corresponding author

Correspondence to Lanqin Zheng.

See Table  5 .


About this article

Zheng, L. (2015). A systematic literature review of design-based research from 2004 to 2013. Journal of Computers in Education, 2, 399–420. https://doi.org/10.1007/s40692-015-0036-z


Received: 16 April 2015

Revised: 11 May 2015

Accepted: 10 June 2015

Published: 24 June 2015

Issue date: December 2015



Keywords

  • Design-based research
  • Literature review

IMAGES

  1. Design-Based Research: A Methodology to Extend and Enrich Biology

    design based research (dbr)

  2. Design-based research (DBR) process.

    design based research (dbr)

  3. The four phases of Design-Based Research adapted from McKenney and

    design based research (dbr)

  4. Two Perspectives on Design-Based Research

    design based research (dbr)

  5. PPT

    design based research (dbr)

  6. Design-based research (DBR) process.

    design based research (dbr)

VIDEO

  1. Praktisi Mengajar Model Model Pembelajaran Inovatif [Kupas tuntas: Project Based Learning]

  2. SHRINGAR 9372879532

  3. complicated form but super easy to draw #design #drawing #ring #satisfying #jewelry #3d #sketch #diy

  4. जवाब दो...... 🤯 brain test quiz || #upscmotivationlsong #upsc #shorts

  5. Stars With Big Size Size is no problem for Acting

  6. Ranking EVERY Battle Card Background in Brawl Stars!

COMMENTS

  1. Design-based research

    Design-based research (DBR) is a type of research methodology used by researchers in the learning sciences, which is a sub-field of education. The basic process of DBR involves developing solutions (called "interventions") to problems. Then, the interventions are put to use to test how well they work. The iterations may then be adapted and re ...

  2. Full article: Design-based research: What it is and why it matters to

    Design-based research methods. DBR attempts to understand the world by trying to change it, making it an interventionist research method. However, DBR problematizes the designed nature of interventions, recognizing that the intended design is different from what may be enacted in a complex social context, ...

  3. Design-Based Research

    In an educational setting, design-based research is a research approach that engages in iterative designs to develop knowledge that improves educational practices. This chapter will provide a brief overview of the origin, paradigms, outcomes, and processes of design-based research (DBR). In these sections we explain that (a) DBR originated ...

  4. (PDF) Design-Based Research

    Design-based research (DBR) 1 projects share some basic characteristics, namely the situatedness in real educational contexts, the focus on the design and testing of a significant intervention ...

  5. Design-Based Research: Definition, Characteristics, Application and

    Design-based research (DBR) that blends designing learning environments and developing theories has proliferated in recent years. In order to gain insights into DBR, 162 studies related to DBR ...

  6. Design-Based Research: A Decade of Progress in Education Research

    Design-based research (DBR) evolved near the beginning of the 21st century and was heralded as a practical research methodology that could effectively bridge the chasm between research and practice in formal education. In this article, the authors review the characteristics of DBR and analyze the five most cited DBR articles from each year of ...

  7. 9

    Design-based research (DBR) is a methodology used to study learning in environments that are designed and systematically changed by the researcher. The goal of DBR is to engage the close study of learning as it unfolds within a particular context that contains one or more theoretically inspired innovations and then to develop new theories, artifacts, and practices that can be used to inform ...

  8. An Introduction to Design-Based Research with an Example From

    Educational design-based research (DBR) can be characterized as research in which the design of educational materials (e.g., computer tools, learning activities, or a professional development program) is a crucial part of the research. That is, the design of learning environments is interwoven with the testing or developing of theory.

  9. Design-Based Research

    Following The Design-Based Research Collective , the term DBR encompasses a paradigm that has different names in the literature, including 'design experiment s' (Brown , 1992; Collins , 1992), 'design research' (Edelson , 2002; Lesh , Kelly , & Yoon , 2008), and 'development research' (van den Akker , 1999). DBR is the ...

  10. Design-Based Research

    Design-based research (DBR) was proposed as design experiments in articles by Brown and Collins . And now, it is a type of research methodology commonly used by researchers in the learning sciences. Design-based research is a systemic approach to the planning and implementing of innovations that emphasize an iterative approach to design with ...

  11. Design-Based Research: A Methodology to Extend and Enrich Biology

    In design-based research, learning theories must "do real work" by improving student learning in real-world settings (DBR Collective, 2003). Therefore, design-based researchers must reflect on whether or not the data they collected show evidence that the instructional tools improved student learning ( Cobb et al. , 2003 ; Sharma and McShane ...

  12. PDF Design-Based Research: An Emerging Paradigm for Educational Inquiry

    Design-based research (Brown, 1992; Collins, 1992) is an emerging paradigm for the study of learning in context through the systematic design and study of instructional strategies and tools. We argue that design-based research can help create and extend knowledge about developing, enacting, and sustaining in-

  13. The Development of Design-Based Research

    Design-Based Research (DBR) is one of the most exciting evolutions in research methodology of our time, as it allows for the potential knowledge gained through the intimate connections designers have with their work to be combined with the knowledge derived from research. These two sources of knowledge can inform each other, leading to improved ...

  14. Design-Based Research

    Design-Based Research. In an educational setting, design-based research is a research approach that engages in iterative designs to develop knowledge that improves educational practices. This chapter will provide a brief overview of the origin, paradigms, outcomes, and processes of design-based research (DBR).

  15. 3 Design-Based Research and Interventions

    Design-Based Research (DBR) is a research methodology used by researchers in the learning sciences. DBR is a concentrated, collaborative and participatory approach to educational inquiry. The basic process of DBR involves developing solutions or interventions to problems (Anderson & Shattuck, 2012). An "Intervention" is any interference ...

  16. 10

    Design-based research (DBR) is used to study learning in environments which are designed and systematically changed by the researcher. The goal of DBR is to use the close study of a single learning environment, usually as it passes through multiple iterations and as it occurs in naturalistic contexts, to develop new theories, artifacts, and practices that can be generalized to other schools ...

  17. Design-based research process: Problems, phases, and applications

    We build on existing efforts by. defining DBR as an iterative process of 6 phases: focus, understand, define, conceiv e, build, and test, in which other scientific processes are recursively nested ...

  18. PDF Using Design-Based Research in Higher Education Innovation

    DBR, Design-Based Implementation Research (DBIR) is focused on building organizational or system capacity for implementing, scaling, and sustaining educational innovations. DBIR's research focus extends to the identification and design of organizational routines and processes that support collaborative design and productive adaptation of core ...

  19. Design-Based Research Methods (DBR)

    Summary: Design-Based Research is a lens or set of analytical techniques that balances the positivist and interpretivist paradigms and attempts to bridge theory and practice in education. A blend of empirical educational research with the theory-driven design of learning environments, DBR is an important methodology for understanding how, when, and why educational innovations work in practice ...

  20. A systematic literature review of design-based research from 2004 to

    Design-based research (DBR) that blends designing learning environments and developing theories has proliferated in recent years. In order to gain insights into DBR, 162 studies related to DBR published from 2004 to 2013 were selected and reviewed. The major findings indicated that most of the studies focused on designing, developing, and redesigning learning environments through interventions ...

  21. [PDF] Design-Based Research

    574. In an educational setting, design-based research is a research approach that engages in iterative designs to develop knowledge that improves educational practices. This chapter will provide a brief overview of the origin, paradigms, outcomes, and processes of design-based research (DBR). In these sections we explain that (a) DBR originated ...

  22. PDF Design-Based Research (DBR) in educational enquiry and

    Abstract. This article discusses educational design-based research (DBR) as an emerging. paradigm/methodology in educational enquiry that can be used as a mixed-method, problem-oriented research framework, and thus can act as an alternative to other. traditional. paradigms/methodologies. prominent.

  23. Sustainability

    Design-based research [18,19,20,21], can be seen as the 'engineering' arm of educational research. DBR seeks to develop new knowledge through the iterative design and improvement of real-world educational technologies, resources, practices and so on . In this project, DBR was being used to explore how a set of iVR equipment donated to our ...

  24. A Design-Based Research (DBR) Framework to Guide Curriculum Design

    DBR is described as an authentic and ethically-based approach to curriculum design, and a pragmatic research. methodology for deali ng with real world lea rning contexts (Amiel & Reeves, 2008 ...