Professional learning results in equitable and excellent outcomes for all students when educators create expectations and build capacity for use of evidence; leverage evidence, data, and research from multiple sources to plan educator learning; and measure and report the impact of professional learning.
Educators turn to evidence at all stages of planning, monitoring, and assessing professional learning. They use a combination of relevant data and research findings as well as data from evaluations of their own system’s professional learning to make decisions about professional learning policies, resources, plans, and goals. The use of data, evidence, and research is a recursive process; their use informs decisions about planning professional learning, and impact results inform the next planning decisions.
Educators understand that data are information points, while evidence is the collection of data assembled to contribute to a better understanding of a situation or inform a decision. Data can be quantitative (counts and percentages), qualitative (interview responses and written artifacts), formative, and summative. Educators use research about professional learning, including evaluation studies, to better understand the components of any professional learning and its potential impacts.
Educators have varying responsibilities related to using data and evidence depending on their roles. System leaders establish guidelines and structures for collecting and using data, establish policies for evaluations of professional learning, and sustain a culture favorable to using evidence to inform planning. They regularly assess and document the impact of professional learning and monitor investments of time and resources.
Facilitators of professional learning consistently use formative and summative data, evidence, and research in their day-to-day work of planning and assessing learning, progress, and impact. They support educators at all levels in their own capacity-building related to the use of data, evidence, and research.
School leaders and teachers examine school, classroom, and student-level data to inform and assess the professional learning they facilitate and experience and make improvements in their individual and team practices.
Here are the main constructs of the Evidence standard.
Educators create expectations and build capacity to prioritize evidence use.
Educators establish expectations and strategies for using evidence throughout the process of planning, implementing, and assessing professional learning. Through their expectations and practices, system leaders establish a culture of transparency about using valid and reliable data as a tool that improves learning and guides choices. Educators at every level openly and consistently insist that professional learning be guided by evidence so that it leads to sustained impact on educator and student learning.
Educators responsible for planning and providing resources for professional learning support widescale evidence use by procuring relevant data and evidence platforms. Educators at the system level become expert in using multiple types of data and supporting educators at every level to use the data that make sense for their roles and responsibilities.
Educators build data and evidence expertise for themselves and their peers. While data and evidence inform professional learning design, professional learning focused on data use builds educators’ capacity to analyze and make meaning from multiple sources of information and appreciate which data and research are relevant, useful, and high quality.
Educators understand limitations related to their data, such as how survey fatigue can influence educator self-report data or how the design of a research study can leave out traditionally underrepresented populations. Educators learn to weigh data differently depending on whether they are looking for progress on a rubric or an Innovation Configuration map or looking for year-over-year improvements.
Educators determine what data are available and relevant to track progress against their defined benchmarks and goals. They also periodically widen their view to pay attention to other data that might indicate unforeseen changes in the system or an emerging challenge.
Educators are strategic and purposeful about the data they collect and their plan to use their analyses as evidence, and they balance the need to collect data with an awareness of other demands on their colleagues.
In making choices about what data to use, educators may need to prioritize quality and relevance of data rather than quantity or volume, especially because of the time and effort required to collect and analyze data and respond to requests for information.
Educators model a commitment to using evidence by sharing the results of their own learning and actively participating in others’ efforts to collect and analyze data about the impact of professional learning. As appropriate, they publicize goals, benchmarks, and impact results to increase collective responsibility and coherence across a system.
Educators use multiple sources of evidence to plan professional learning.
Educators collect, analyze, and interpret a range of evidence to develop the goals, content, and design of professional learning. Educators examine student data to establish student learning goals and use student and educator data to understand educators’ discipline-specific, pedagogical, or curriculum implementation needs.
Quantitative and qualitative data are important types of data to use, as are formative and summative data at the classroom, school, and system levels about both educators and students. Relevant information includes student assessment scores, indicators of student engagement, project evaluation data, educators’ perception of the quality of professional learning, teacher work products, and notes from observations. Research reports are useful because they contain data from relevant studies.
Together these different sources of data become evidence that informs decisions about planning and expected outcomes for professional learning. Educators consider individual, school, and system evidence to guide decisions about the allocation of human resources, funding, technology and material resources, and time.
Educators collect and analyze data that provide information relevant to progress against their goals, making sure the data are at the appropriate level (classroom, school, or system) and of sufficient scope to be helpful (across multiple schools, for instance).
Educators might examine disaggregated student data about disciplinary or gifted referrals to reveal discrepancies in students’ access to learning but collect qualitative data about collaborative conversations to better understand the impact of professional learning on instruction and pedagogical content knowledge.
And while educator-level data might provide information about educators’ experience, content knowledge, and self-reported efficacy, school and system data provide insights into teacher attrition, retention, out-of-field placements, experience with current curricula, and student access to effective teachers.
Educators consult research about school or classroom improvement strategies to understand what aspects or components of professional learning are likely to have positive impacts on educator and student outcomes. They consider whether and how the findings might apply to their context. Educators understand that, while research findings can’t always be replicated, evidence about what works for educators and students can inform their own initiatives and set expectations for investments of time and resources.
Leaders of professional learning periodically scan the landscape for new relevant studies about professional learning, innovations in research methodologies, and new knowledge about implementation of pilot programs in schools and systems. To the extent possible, they link professional learning to a research base to create an evidence-based rationale for their approaches and define expected outcomes.
Educators use evidence to measure and report impact.
Educators monitor whether professional learning is on track to meet educator and student needs so that they can improve individual and collaborative instructional practices along the way.
Gathering evidence to monitor learning varies in form, depending on its purpose. Educators might collect data about participation and processes as a potential measure of engagement, while coaches, mentors, peers, and supervisors might assess progress toward established goals through qualitative data analysis.
Knowing how many teachers participated in professional learning offers one piece of evidence, while notes from observations collected in the classroom offer additional evidence about what was learned and the application of that knowledge.
At another level, educators collaborating in inquiry cycles build in evidence use as part of their learning process, using monitoring strategies to inform their ongoing work and make any necessary adjustments.
Educators commit to an evaluation at the outset of planning professional learning and set expectations among participants for data collection and measures of progress. They articulate a theory about how intended changes will lead to improvements and establish measurable benchmarks along the way from start to finish.
Educators establish clear, relevant measures and benchmarks and share them among all stakeholders. Educators gather data about learning processes and outcomes and collaboratively discuss what they find to ensure they are moving toward their goals.
Leaders of professional learning develop evaluation plans that define short- and long-term outcomes, actions to take to achieve those goals, expectations for the type and degree of changes that will result, resources and supports to be mobilized, indicators of success, and the type of data that will be collected to determine whether professional learning has had its intended effect.
Assessing professional learning includes measuring impact on educator knowledge, practice, and beliefs, and, where possible, on student-level measures. Evaluations often use the KASAB (knowledge, attitudes, skills, aspirations, and behaviors) framework to define the expected outcomes for educators. Evaluations can also examine the content, relevance, and facilitation of professional learning.
Educators embrace formative and summative evaluation as essential for measuring the quality, effectiveness, and impact of professional learning, recognizing the importance of systematically gathering data about achievement related to goals and benchmarks (impact); coherence, alignment, and engagement of participants (quality); and progress related to building knowledge or sustaining implementation (effectiveness).
Data collection plans are tied tightly to the purpose of the evaluation and the professional learning itself, and these plans vary depending on whether educators seek to impact educator beliefs, classroom practices, or student outcomes.
Collecting and analyzing data about the impact and effectiveness of professional learning is critical to reaching student, educator, and system goals. Educators are committed to documenting investments of time and resources to demonstrate how professional learning impacts educators and students. Evidence of impact and effectiveness then informs decisions about continuing, adjusting, or scaling back an initiative or professional learning approach and also informs planning for future professional learning.
Links to other standards
Educators use the Standards for Professional Learning together to inspire and drive improvement. Each of the 11 standards connects to the other standards to support a high-functioning learning system. Here are some of the ways the Evidence standard connects to other standards:
Considering a range of evidence is one aspect of planning professional learning to meet educator needs. The Learning Designs standard delves into the full range of factors to be considered for designing professional learning to address individual, team, or schoolwide challenges in ways that align to learner characteristics and learning purposes.
The means to establish and maintain a culture that supports evidence use and transparency about data are addressed in the Leadership standard.
Educators seeking to ensure that all learners are being considered and supported in professional learning can look to the Equity Drivers standard and its description of using disaggregated and longitudinal data.