Feature

Fads or Facts? Sifting Through the Evidence to Find What Really Works

Moira Konrad, PhD, Caitlin J. Criss, MA, and Alana Oif Telesman, MA
The Ohio State University, Columbus, OH, USA

Intervention in School and Clinic, 2019, Vol. 54(5), 272–279
Hammill Institute on Disabilities 2019
DOI: 10.1177/1053451218819234
Article reuse guidelines: sagepub.com/journals-permissions
isc.sagepub.com

Corresponding author: Moira Konrad, PhD, The Ohio State University, A358 PAES Building, 305 Annie & John Glenn Ave., Columbus, OH 43210, USA.

Abstract

Despite the requirement that teachers implement evidence-based instruction in their classrooms, a significant research-to-practice gap persists. Far too often, teachers resort to quick fixes found through online searches or rely on conventional wisdom to make instructional decisions. This is no surprise, as identifying evidence-based interventions can be time-consuming, overwhelming, and confusing. Indeed, claims of practices being evidence based are ubiquitous, even for practices that clearly lack evidence to support their efficacy. In addition, once an evidence-based practice is selected, the process for implementing it and evaluating its effectiveness can be an additional challenge. The purposes of this article are to distinguish between an evidence-based practice as an instructional strategy and evidence-based education as a problem-solving process and to assist teachers in identifying, implementing, and evaluating evidence-based practices in their classrooms.

Keywords: evidence-based practice, evidence-based education, high-incidence disabilities, data-based decision making
Despite advances in research and increasing knowledge of empirically supported practices, there continues to be a significant research-to-practice gap for practitioners in schools. Unfortunately, special education research knowledge is not frequently incorporated into the classroom (Hott, Berkeley, Fairfield, & Shora, 2017). These gaps may exist for a multitude of reasons, including but not limited to (a) clear separateness between research and practice communities, (b) lack of practical interventions that can be easily incorporated into the classroom, and (c) lack of ongoing opportunities for professional development or collaboration between researchers and practitioners (Van Ingen, Alvarez McHatton, & Vomvoridi-Ivanovic, 2016).
With growing caseloads, increasing paperwork demands, and pressure from parents and administrators to produce results, teachers are looking for effective strategies. Far too often, teachers gravitate toward quick-fix practices that are popular and claim to produce results. These promised new ways claim to deliver beneficial outcomes that are less stressful, lower in cost, easier to implement, and less risky than previous interventions (Schreck, Russell, & Vargas, 2013). New fads are implemented and tossed aside, with no actual student improvement. With increasing demands and limited time, teachers do not always look to the research to determine whether these interventions are worthwhile and scientifically validated.
Increases in diagnoses, especially in autism, have led to increases in the number of treatments that are not supported by quality research (Zane, Davis, & Rosswurm, 2008). The use of these treatments wastes time that could be better spent supporting student growth and progress. Examples of current trends that do not have solid research backing include (a) mindfulness exercises to improve academic performance (Burke, 2010), (b) use of tinted lenses to help students with dyslexia (Hyatt, Stephenson, & Carter, 2009), (c) sensory integration therapy for students with autism (Lang et al., 2012), (d) instruction tailored to meet students' learning styles (Willingham, Hughes, & Dobolyi, 2015), and (e) use of high-tech educational apps, which are often not designed with principles of effective instruction in mind (Boone & Higgins, 2007). Practitioners' adoption of these practices illustrates that well-meaning
teachers are often confused about where to start or how to select an appropriate intervention.

The purposes of this article are to (a) distinguish between evidence-based practice as a specific treatment or strategy and evidence-based education as a problem-solving process and (b) describe steps and resources to assist teachers as they adopt an evidence-based approach to teaching: an approach that will yield improved academic and social outcomes for their students with high-incidence disabilities.
Evidence-Based Practice or Evidence-Based Education?
The term evidence-based practice has become ubiquitous (Cook & Cook, 2013; Detrich, 2008); it seems everything is evidence based now. Even those who have solid criteria for identifying a practice as evidence based are not in agreement about what it means, and, far more problematic, publishers and even researchers use the term to describe practices that have not been found to be effective. Adding to the confusion are the myriad other terms used to describe recommended practices: research-based practice, empirically supported treatment, best practice, scientifically based research, and high-leverage practice, just to name a few.
Detrich and Lewis (2012) suggested that it is limiting to use the term evidence-based practice to describe a specific teaching practice that has met a specific set of evidentiary criteria. A broader, more flexible definition of evidence-based practice, according to Detrich (2008), is teaching that includes three steps: (a) identifying a practice, (b) implementing the practice, and (c) evaluating that practice. However, Cook and Cook (2013) noted that using the term evidence-based practice in this way may be confusing to educators and suggested the term evidence-based education as an alternative. The term evidence-based education is used throughout this article to refer to an approach to teaching and problem solving in the classroom.
Evidence-Based Education: A Problem-Solving Process
As educators work to close achievement gaps, it is critical that they identify, implement, and evaluate their practice. This involves more than simply finding a practice that has been deemed effective. It involves both a scientific approach to education and professional wisdom (Cook, Tankersley, & Harjusola-Webb, 2008). Rather than simply taking a practice and blindly implementing it in their classrooms, teachers should use their knowledge and previous experiences to make judgments about how to best incorporate this practice in their unique classrooms. What follows is a description of this process. The checklist provided in Figure 1 can help teachers as they embrace evidence-based education.

Step Zero: Determine your broad goal(s).
• Have you analyzed your current environment (e.g., classroom climate, family concerns, student population, professional development opportunities and initiatives, philosophical approaches, financial resources)?
• Are you searching for an evidence-based practice that will benefit all your students in a classwide setting, or are you targeting a specific student or small group of students who needs intervention in a specific area?
• Are you trying to find a practice to meet a need, or are you trying to figure out if a practice you've heard about or read about is effective?

Step One: Identify an evidence-based practice.
• Use a reliable source (see Table 1) to identify a practice that fits your needs and setting (i.e., those identified in Step Zero).
  o Start with What Works Clearinghouse or Best Evidence Encyclopedia, and use the filters or topic links to identify practices that match your needs.
  o If your goal is to determine effectiveness of a practice you've heard about, be sure to evaluate it using resources in Table 1, and look for signs of pseudoscience (Travers, 2017).
  o Look for overlapping/corroborating evidence (i.e., more than one source documenting its effectiveness), if possible. Pay particular attention to CEC's high-leverage practices.
• Gather student outcome data (baseline data) in the area(s) you're targeting for intervention. Be sure the skills and behaviors you're targeting are meaningful.
• Set a more specific objective/learning outcome.

Step Two: Implement the practice.
• Be sure you are equipped to implement the practice with fidelity. Do you have the proper training, administrative support, time, space, human resources, materials, and technology? If not, seek out needed resources.
• Begin implementation. Be sure to implement the most critical components of the intervention (e.g., those that overlap with high-leverage practices) with fidelity. You may want to develop a checklist of implementation steps or use one that already exists (see Table 2 for resources to assist).
• Collect frequent data on outcomes that are meaningful. See Table 3 for resources to assist with data collection.

Step Three: Evaluate the practice.
• Analyze data, and make instructional decisions accordingly (see Table 3 for resources to assist with data analysis).
• If students are not progressing or are progressing too slowly, consider the following:
  o Are you implementing with fidelity? Has someone else observed to collect fidelity data?
  o Are students accessing your instruction, or are there other interventions (e.g., classroom management tactics or academic interventions targeting prerequisite skills) that need to be in place?
  o Do you need to change intensity, group size, opportunities to respond, feedback, or setting?
• If students are exceeding expectations, consider increasing expectations.

Figure 1. Checklist for implementing evidence-based education.
Step 1: Identify the Practice

Identifying an evidence-based practice (EBP) can be frustrating and time-consuming; sifting through the numerous resources, books, and online search engines can be overwhelming. Fortunately, there are resources (see Table 1) teachers and other practitioners can keep in their toolkits to identify effective practices for their classroom (e.g., meta-analyses, expert panels, and clearinghouses). It should be noted that these three types of resources are not mutually exclusive; an expert panel, for instance, may conduct a meta-analysis and/or develop and oversee a clearinghouse. All three of these resources can guide educators in their selection and evaluation of EBPs.
Meta-Analysis and Meta-Meta-Analysis

Using statistical analysis to synthesize findings from multiple studies into a comprehensive review is called meta-analysis. Educators can refer to meta-analyses to determine if a specific intervention has been shown to work with students similar to those in their classrooms or compare various interventions to identify the ones with the most robust effects. Some meta-analyses that have been conducted in the field of special education include Therrien's (2004) review of repeated reading, Kroesbergen and Van Luit's (2003) review of math interventions, Gillespie and Graham's (2014) review of writing interventions, and Murawski and Swanson's (2001) review of coteaching. For more examples of meta-analyses and discussion of how to use meta-analysis as a guide for intervention selection, see Banda and Therrien (2008).
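To make the mechanics concrete, the sketch below shows the core arithmetic of a simple fixed-effect meta-analysis: each study's effect size is weighted by the inverse of its sampling variance, so more precise studies count more toward the pooled estimate. The effect sizes and variances are hypothetical illustrations, not values drawn from the reviews cited above.

```python
# Minimal sketch of fixed-effect meta-analysis pooling.
# Effect sizes (Cohen's d) and variances are hypothetical.
effects = [0.62, 0.80, 0.45, 0.91]    # d from four invented studies
variances = [0.04, 0.09, 0.05, 0.12]  # sampling variance of each d

# Inverse-variance weights: more precise studies get more weight.
weights = [1 / v for v in variances]
pooled_d = sum(w * d for w, d in zip(weights, effects)) / sum(weights)

# Standard error and 95% confidence interval of the pooled estimate.
se = (1 / sum(weights)) ** 0.5
print(f"Pooled d = {pooled_d:.2f}, "
      f"95% CI [{pooled_d - 1.96 * se:.2f}, {pooled_d + 1.96 * se:.2f}]")
```

Published meta-analyses layer considerable machinery on top of this core (random-effects models, heterogeneity statistics, publication-bias checks), which is one reason consumers should read them critically rather than treat a single pooled number as definitive.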
Hattie (2009) took meta-analysis further by conducting a comprehensive meta-meta-analysis (i.e., an analysis of meta-analyses) to identify and compare influences related to learning outcomes. Using information gathered from over 1,200 meta-analyses across the field of education, Hattie determined the efficacy of these influences based on their outcomes and ranked them according to their effectiveness. In a 2017 update, he listed 256 influences in order of most effective to least effective. Many of the strategies Hattie found to be effective can be applied broadly across different subjects, interventions, and teaching methods.
Expert Panels

When looking for effective practices, teachers can also use information from an expert panel: a group of individuals from the field who work collaboratively to outline effective practices. The Council for Exceptional Children (CEC) frequently convenes expert panels to disseminate information to educators. For example, CEC and the Center for Collaboration for Effective Educator Development, Accountability, and Reform (CEEDAR; McLeskey et al., 2017) convened an expert panel of practitioners, researchers, and advocates to outline effective practices for special education teachers and teacher candidates, which resulted in publication of High-Leverage Practices in Special Education. The high-leverage practices (HLPs) include 22 effective practices for school-age special education teachers organized into four categories: collaboration, assessment, social/emotional/behavioral practices, and instruction (McLeskey et al., 2017). Other examples of expert panels include the National Reading Panel, the National Mathematics Advisory Panel, and the National Standards Project.
Clearinghouses

Educators can also access national clearinghouses, which house findings from literature reviews conducted to determine the effectiveness of interventions. Often clearinghouse reports will present both the quality and quantity of research to determine if there is enough evidence to rate intervention effectiveness. National clearinghouses may focus on a specific population (e.g., the National Clearinghouse on Autism Evidence and Practice) or include a more comprehensive list of educational interventions for all students (e.g., What Works Clearinghouse [WWC]). The Results First Clearinghouse Database (see Table 1) contains data from eight different national clearinghouses to identify EBPs for educators and policy makers. As of June 2018, the Results First Clearinghouse included information on over 2,800 programs focused on education and social change. Users can search the database by intervention name, age, setting, and effectiveness. Practitioners can also search for interventions they are currently using to determine their effectiveness.
Identifying Evidence-Based Practices

No clearinghouse, expert panel, or meta-analysis is the be-all and end-all. Expert panels may be biased, meta-analyses are often fraught with methodological problems (Lipsey & Wilson, 2001), and clearinghouses may be both biased and compromised by methodological issues and may be
characterized by somewhat arbitrary criteria. Indeed, although the WWC has been widely adopted as the preferred source for identifying EBPs, it is not above criticism. For instance, Engelmann (2008) noted that although there have been over 90 studies documenting the effectiveness of Reading Mastery (see https://www.nifdi.org/programs/reading/reading-mastery), many published in peer-reviewed journals, the WWC has indicated that no Reading Mastery studies met their evidentiary standards. This is quite problematic and confusing for educators and underscores the importance of professional judgment and evidence-based education as an approach, not merely a collection of practices that have been deemed effective by one organization. However, despite their limitations, these comprehensive reviews represent a good place to start. As Wilczynski (2012) noted, "Although there are inherent risks associated with practice guidelines, consumers must not lose sight of the fact that the greatest risk comes from the failure to use the best available evidence as the basis for making important decisions" (p. 309). But educators should be cautious.
Because of the limitations of expert panels, clearinghouses, and meta-analyses, it is important to look for overlapping/corroborating evidence (i.e., more than one source documenting a practice's effectiveness). For example, although repeated reading has not qualified as an EBP according to the WWC, it is a common intervention for reading fluency. Before dismissing this practice as ineffective, teachers should look to other sources, such as expert panels and meta-analyses, to corroborate these conclusions. In Hattie's 2018 updated list of Factors Related to Student Achievement, he found that repeated reading had an effect size of d = 0.75, indicating a strong, positive effect. With two differing judgments, each teacher must use professional wisdom to determine whether or not to adopt this practice. And, more importantly, the teacher must not assume the practice will work but rather should approach implementation scientifically by collecting and monitoring data.
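For readers unfamiliar with effect sizes, Cohen's d (the metric behind Hattie's rankings) expresses the difference between two group means in pooled standard deviation units. The standard formula, not specific to this article, is

\[ d = \frac{\bar{X}_{\text{intervention}} - \bar{X}_{\text{comparison}}}{SD_{\text{pooled}}} \]

so d = 0.75 means the average student receiving the intervention scored about three quarters of a pooled standard deviation above the comparison-group mean. Hattie treats d = 0.40 as the hinge point above which an influence merits attention, which is why 0.75 reads as a strong effect.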
Although deep discussion of these ideas is beyond the scope of this article, it is important to note that a practice should not be considered evidence based until the following criteria have been met: (a) adequate experimental research designs, (b) sufficient quantity of studies, (c) high-quality research, and (d) robust effects on meaningful outcomes (Cook & Cook, 2013). Also beyond the scope of this article is how to identify practices that are backed by pseudoscience: practices that are actually ineffective, and in some cases harmful. For a discussion of pseudoscience, including why people adopt unproven practices and how to spot pseudoscientific claims, see Travers (2017).

Table 1. Resources to Assist in Identifying and Evaluating Evidence-Based Practices.

High-Leverage Practices
www.highleveragepractices.org (complementary resources: http://ceedar.education.ufl.edu/wp-content/uploads/2017/12/HLPs-and-EBPs-A-Promising-Pair.pdf)
An expert panel of educators, researchers, and advocates identified 22 effective practices (high-leverage practices) for special educators. Practices relate to collaboration; assessment; social, emotional, and behavioral practices; and instruction.

What Works Clearinghouse
https://ies.ed.gov/ncee/wwc/
WWC is a free search engine educators can use to learn about the effectiveness of interventions based on evidence. Teachers can filter searches by topic, outcome, age/grade, name, population, and setting.

Best Evidence Encyclopedia
http://www.bestevidence.org/
Provides educators and researchers information about the strength of evidence supporting a variety of programs available for students in grades K–12.

Results First Clearinghouse
http://www.pewtrusts.org/en/research-and-analysis/data-visualizations/2015/results-first-clearinghouse-database
Focused on social change and created to provide policy makers a source for identifying evidence-based interventions, the Results First Clearinghouse compiles results from eight different clearinghouses. Educators can search by topic, setting, overall rating, and individual clearinghouse.

John Hattie's Visible Learning
https://visible-learning.org/ (complementary resources: www.evidencedbasedteaching.org.au)
These 256 influences on student learning were identified, and ranked, through a study of nearly 1,200 meta-analyses. Teachers can identify the most effective influential practices by searching according to rank, name, effectiveness, and focus.

The IRIS Center
Developed by experts in education, including researchers and educators, the IRIS Center shares information on a variety of topics such as accommodations, diversity, and transition. Teachers can search by topic and type of resource.

EBI Network
http://ebi.missouri.edu/ (complementary resources: www.interventioncentral.com)
Website developed to provide practitioners with guidance for choosing and implementing evidence-based practices. Resources include intervention briefs, implementation videos, and guidance for linking intervention and assessment.

NTACT
https://transitionta.org/effectivepractices
The National Technical Assistance Center on Transition has evaluated a range of secondary transition practices and rated them as evidence based, research based, promising, or unestablished. Target skills include academic, vocational, and life skills.
What If There Are No Evidence-Based Practices?

Despite advances the scientific field has made in identifying practices that are effective, there are many unanswered questions. When a teacher identifies a problem for which there is no clear EBP, what should she do? Although there may be no specific intervention to implement, teachers can still incorporate evidence-based strategies (e.g., HLPs) that have broad enough utility that they can apply to virtually all academic and social behavior. For example, if a teacher is struggling to find an EBP to teach a specific skill, she can incorporate HLPs such as explicit and intensive instruction (HLPs 16 and 20), scaffolding (HLP 15), active student responding (HLP 18), and feedback (HLP 22). Even if she hasn't selected a specific EBP, applying these strategies will likely benefit her students. See Table 1 for website addresses.
Step 2: Implement the Practice

Selecting an evidence-based teaching practice does not guarantee that practice will be effective. This is only the beginning. "An evidence-based practice is one thing, implementation of that practice is another thing altogether" (Fixsen, Blase, Horner, & Sugai, 2009, p. 5). Detrich (2008) noted that an evidence-based approach must include not only identifying and implementing the intervention but evaluating it as well. Teachers should use their professional wisdom and expertise to adapt EBPs to the unique needs of their students and to harness their own strengths as educators (Cook et al., 2008). In addition, teachers should consider their environment and capacity within their school when selecting EBPs for their classroom. For example, some interventions may not be feasible to implement in smaller schools with less flexibility in schedules and staff.

Adapting EBPs to match factors such as student population, school culture, and practitioner needs may contribute to higher student achievement (Webster-Stratton, Reinke, Herman, & Newcomer, 2011). However, it is still important to understand how to implement EBPs with fidelity (Harn, Parisi, & Stoolmiller, 2013). If the intervention is not implemented with adherence to the procedures that were found effective in the research, the teacher cannot be confident about what is contributing to student growth or lack thereof. Furthermore, implementing programs with fidelity ensures that teachers do not simply discard practices without using sound reasoning and data-based rationales. Jones (2009) found that when special educators implemented EBPs, they were used for only a short period of time. When using an EBP, it is critical for practitioners to understand how to best fit it into their classrooms. However, before making adaptations to the practice, teachers must be aware of the critical elements that should not be modified in order to maintain the integrity of the intervention (Harn et al., 2013).
There are many fidelity checklists available to teachers to determine what key ingredients are necessary to contribute to student success. These elements represent sound principles of instruction and should not be modified, because they are imperative for the success of the EBP. For instance, Beecher, Abbott, Petersen, and Greenwood (2017) provided a checklist for implementing high-quality early literacy instruction. Similarly, Walker, Clancy, Tsai, and Cheney (2013) described a process for engaging teachers in program evaluation to improve their delivery of services to students with emotional or behavioral disorders.
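Fidelity data from such checklists are usually summarized as the percentage of steps implemented as designed. The sketch below shows that arithmetic; the checklist steps are hypothetical, not drawn from any published instrument.

```python
# Minimal sketch of scoring implementation fidelity from an
# observation checklist. The steps below are hypothetical.
checklist = {
    "States the learning objective": True,
    "Models the skill": True,
    "Provides guided practice": True,
    "Gives immediate, specific feedback": False,
    "Provides independent practice": True,
}

# Fidelity = percentage of steps implemented as designed.
fidelity = 100 * sum(checklist.values()) / len(checklist)
print(f"Fidelity: {fidelity:.0f}% of steps implemented")  # 80%
```

What counts as acceptable fidelity varies by intervention, so teachers should consult the program's manual or research base rather than assume a universal cutoff.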
There are also many free or low-cost resources available to support teachers as they implement a new practice. For instance, the WWC has several practice guides (e.g., Teaching Secondary Students to Write Effectively) based on research reviews, expert opinions, and previous experiences of practitioners. These guides summarize helpful strategies and examples that teachers can implement in their classrooms. They may also highlight student work samples, include webinars and other training materials, and suggest ways to overcome implementation obstacles in the classroom. See Table 2 for additional implementation resources.
While implementing an EBP, it is critical to collect frequent data by directly measuring the skills being targeted for improvement. Good measures for monitoring student progress should be valid, reliable, repeatable, and meaningful. Meaningful skills and behaviors are those that (a) allow the students to access other skills and opportunities, (b) align with grade-level/district standards and IEP goals, (c) address concerns of students' families, (d) promote independence and generalization, and (e) promote success and self-efficacy. Table 3 provides resources for data collection and monitoring progress.
Step 3: Evaluate the Practice

Data collection is necessary, but not sufficient, for evaluating an EBP and making subsequent data-based decisions. Teachers must carefully analyze their student data to judge the effectiveness of an EBP. Table 3, in addition to providing data collection tools, provides some resources to assist with data analysis. For teachers using curriculum-based measurement (e.g., Aimsweb), the process generally includes graphing baseline fluency data, setting a goal, and creating a goal line or aim line connecting baseline data to the goal.
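As a concrete illustration of that aim-line logic, the sketch below builds a goal line from a baseline score and a goal, then applies one common decision rule: if several consecutive scores fall below the aim line, consider changing the intervention. The numbers, the 10-week window, and the four-point rule are illustrative assumptions, not prescriptions from this article.

```python
# Minimal sketch of aim-line-based progress monitoring.
# All numbers are hypothetical.
baseline = 40   # e.g., median words read correctly per minute
goal = 70       # goal for the end of the monitoring period
weeks = 10

# Aim line: the score expected each week if growth stays on track.
slope = (goal - baseline) / weeks
aim_line = [baseline + slope * week for week in range(weeks + 1)]

scores = [40, 41, 43, 44, 46]  # weekly scores collected so far

# Decision rule: four consecutive scores below the aim line
# suggest the intervention (or its implementation) needs a change.
recent = scores[-4:]
expected = aim_line[len(scores) - 4:len(scores)]
if all(s < e for s, e in zip(recent, expected)):
    print("Data fall below the aim line; consider an instructional change.")
else:
    print("Growth is on track; continue the current intervention.")
```

With these hypothetical data the rule fires, since the student is gaining only about 1.5 words per week against an aim line rising 3 per week; the resources in Table 3 offer more complete graphing and decision-making tools.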
Table 3. Free Resources to Assist With Evaluating Effectiveness of Evidence-Based Practices.

Intervention Central
https://www.interventioncentral.org/
Resources for progress monitoring, including charts, data collection forms, and customizable assessments and forms.

Center on Response to Intervention
https://rti4success.org/
Resources and training materials for progress monitoring, data collection, and making data-based decisions.

RTI Action Network
http://rtinetwork.org/essential/assessment
Resources for using curriculum-based measurement for progress monitoring and data-based decision making.

National Center on Intensive Intervention
https://charts.intensiveintervention.org/chart/progress-monitoring
Comprehensive list of progress monitoring tools and reviews of their validity, reliability, and effectiveness.

Autism Classroom News and Resources
Data sheets and other resources for measuring academic and social behavior (not just for autism!).

DIBELS
https://dibels.uoregon.edu
Progress-monitoring resources and information on screening, benchmarking, and evaluating student growth.

Recommendations for Data-Based Decision Making
https://www.mayinstitute.org/pdfs/presentations/2016PBIS%20C1-4%20EvidenceBasedInstructionalImprovementChecklist.pdf
Checklists of quantitative and qualitative changes to instruction teachers can try when students are not meeting goals; strategies linked to hypotheses about why students are not responding.
Table 2. Resources to Assist With Implementation of Evidence-Based Practices (EBPs).

The IRIS Center
Focus area: general evidence-based practices
Free resource with information and training materials for a range of EBPs.

Center on Instruction
http://www.centeroninstruction.org/
Focus area: general evidence-based practices
Practitioner guides, professional development materials, tools for educators, and examples.

WWC Practice Guides
https://ies.ed.gov/ncee/wwc/PracticeGuides
Focus area: general evidence-based practices
Guides include actionable steps for using EBPs in a range of subject and skill areas and suggestions for overcoming implementation challenges.

Explicit Instruction Checklists
http://yostscience.weebly.com/uploads/2/1/7/5/2175283/explicit_instruction_checklist__reflection_copy.pdf
http://resources.buildingrti.utexas.org/documents/Explicit_Systematic_Instruction_Tool.pdf
Focus area: explicit instruction
Checklists to help teachers implement explicit instruction with fidelity; forms include guides for self-reflection as well as those that can be used by observers to provide teachers with feedback on fidelity of implementation.

Ok, Kim, Kang, and Bryant (2016)
Focus area: technology (educational apps)
Article provides guidelines to help teachers and parents evaluate the quality of educational apps for children with learning disabilities.

Beecher, Abbott, Petersen, and Greenwood (2017)
Focus area: literacy
Article presents an early literacy implementation checklist teachers can use to assess their implementation of EBPs that support literacy.

Center on Instruction Guide for Teaching Math
http://www.centeroninstruction.org/files/Mathematics%20Instruction%20LD%20Guide%20for%20Teachers.pdf
Focus area: math
Guide reports findings from two research syntheses on effective math intervention and provides seven specific teaching recommendations.

Evidence-Based Practices for Writing Instruction
http://ceedar.education.ufl.edu/wp-content/uploads/2014/09/IC-5_FINAL_08-31-14.pdf
Focus area: written expression
Guide features a matrix that can (a) guide teachers' planning and (b) help teachers evaluate their writing curriculum.

Walker, Clancy, Tsai, and Cheney (2013); MacSuga and Simonsen (2011)
Focus area: social behavior and classroom management
Articles describe EBPs for addressing challenging behavior and social skills; provide checklists to help teachers evaluate their use of EBPs.