Outcomes, Not Outputs

Moving STEM Education Programs towards an Impact-Based Measurement Model 

Science, Technology, Engineering, and Math (STEM) education is getting a lot of attention these days, in large part because employers in need of such skills are finding them in short supply, while governments and philanthropists see such skill sets as a path to increased incomes and opportunities for individuals, families, and communities overall. But it’s a long road from improved classroom programming to an improved 21st century workforce. How do you know whether one actually has an impact on the other?

As with any large-scale challenge, a common view of what success (impact) looks like is key to achieving it. So too is establishing a common way to track performance, so successes and failures can be easily identified, shared, and used to drive continuous improvement across the sector. Unfortunately, many individual educational programs perceive impact measurement as impractical—too complex, too resource intensive—and instead default to measuring inputs (for example, how much time or money invested in an educational program) or outputs (for example, how many resources were provided to how many students).

In over 13 years of providing social impact measurement tools and services, we at True Impact have found that a modeling approach, one that maps out the expected interim and end outcomes of a program in logic-model form and then quantifies performance using "best available data," is an effective and practical way to measure impact, regardless of an organization's capacity.

We use a Reach-Prepared-Mobilized-Impact model, which logically maps the core interim and end outcomes of virtually any program. Starting with Reach, first quantify the number of beneficiaries being served by the program. Of those individuals served, then determine how many successfully gain the skills, knowledge, access to resources, or motivation needed to achieve the program's goals. This is the Prepared number. Next, of those beneficiaries successfully prepared, ask how many took action or changed their behavior. This is the Mobilized number. Finally, of those beneficiaries who successfully mobilized, identify how many realized the type of improvement in individual or social wellbeing that is the stated goal of the program. This is the social impact.
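The four stages form a simple funnel, where each stage is a subset of the one before it. A minimal sketch of that arithmetic follows; every count and conversion rate below is invented purely for illustration.

```python
# Hypothetical funnel following the Reach -> Prepared -> Mobilized -> Impact
# model. All numbers are made up for illustration; a real program would
# measure or estimate each stage from its own data.

reach = 1000             # students served by the program
prepared_rate = 0.60     # share who gain the targeted skills/knowledge/motivation
mobilized_rate = 0.40    # share of prepared students who take action
impact_rate = 0.25       # share of mobilized students who realize the end goal

prepared = round(reach * prepared_rate)        # 600
mobilized = round(prepared * mobilized_rate)   # 240
impact = round(mobilized * impact_rate)        # 60

print(f"Reach: {reach}, Prepared: {prepared}, "
      f"Mobilized: {mobilized}, Impact: {impact}")
```

Because each stage filters the previous one, the final impact number can never exceed any earlier stage; that constraint is a quick sanity check on reported figures.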

This structure helps guide programs to accurately distinguish activities from outcomes, and, in the absence of perfect information, create more accurate estimates.

Applying the Model (What to measure)
Imagine, for example, a group of educators and scientists working together to design a classroom activity to teach 10th graders about solar wind. The goal of this program is to excite students and help them better understand states of matter, increase their engagement with science, and encourage them to pursue science-related career paths. Although the program team understands the science and the pedagogy, they are concerned they lack the skills and resources to evaluate the program. Here is how the model might be applied to their program:

  • Reach 
    The first step of the model is determining the number of learners that engage in the activity or participate in the program or initiative. Here, relevant indicators might be the number of students exposed to the solar wind curriculum or to a solar science exhibit at a museum.
  • Preparation 
    Organizations will then identify the number of learners who gain science-related knowledge, skills, or motivation as a result of program participation. For this stage, indicators might be the number of students who better understand concepts related to solar wind, such as states of matter, solar eruptions, and the structure of the solar system; or who learn how to gather scientific data; or who express increased interest in science.
  • Mobilization
This step entails determining the number of learners who take action to increase their engagement with STEM. In our solar wind example, indicators could include improved science grades or increased participation in science-related activities, such as visiting science museums or joining science-related interest groups.
  • Social Impact
    The purpose of this program is ultimately to excite and engage students around science. Reasonable indicators of achievement could include the number of learners enrolling in college with a STEM major; getting a job or promotion that uses STEM knowledge; becoming a mentor or advocate for STEM education; or even, longer term, contributing to society through scientific discovery or other achievements. Any of these measures could be a social impact of the solar wind activity; the specific one chosen would depend on the program goals, as defined by the educators and scientists involved in its development.

Monitoring and Evaluation (How to measure)
Assessing effectiveness is a key component of any educational program, but “perfect” evaluation is not easy. Fortunately, reasonable opportunities to generate data exist in virtually any circumstance.

  • Formal Evaluation
    Ideally, the research model will incorporate formal program evaluation and testing. For some programs, this would include assessing baseline STEM achievement for all participants (or a randomly selected subset) and a control group, tracking changes as participants complete the activity, and gathering ongoing data as they internalize the learning, apply it, and continue with their education and career. Other models may lack specific aspects of a full experiment (such as a control group, or randomization), but still include systematic data collection and evaluation components.
  • Proxy Data
Many programs lack the resources or design structures necessary for formal evaluation. For example, if the solar wind activity is shared on a website with teachers, data cannot be collected directly from students. But the developers of the program could request feedback from all participating teachers, or from a sample whose responses can be generalized. Teachers might report the number of students in their classrooms who engage in the solar wind activity, providing a measure of reach. Teachers' reports of student reaction to the activity, such as increased motivation or understanding, provide a measure of preparation.
  • Extrapolation
    Assessing mobilization and social impact can be more challenging, but opportunities to generate proxy data for those stages remain. Beyond tracking a sample of students instead of the entire population, the program could look to research conducted in similar programs and use the results to generate forecasts.
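As a sketch of this extrapolation step, suppose the program has measured its Prepared count directly (say, via teacher surveys) and borrows later-stage conversion rates from published research on comparable programs. The specific counts and benchmark rates below are hypothetical placeholders, not real research findings.

```python
# Forecasting later funnel stages from a measured early stage, using
# conversion rates borrowed from research on similar programs.
# All figures below are hypothetical, for illustration only.

measured_prepared = 450           # observed via teacher-reported surveys

# Hypothetical benchmark rates from studies of comparable STEM programs:
benchmark_mobilized_rate = 0.35   # prepared students who take action
benchmark_impact_rate = 0.20      # mobilized students who realize the goal

est_mobilized = measured_prepared * benchmark_mobilized_rate   # 157.5
est_impact = est_mobilized * benchmark_impact_rate             # 31.5

print(f"Estimated mobilized: {est_mobilized:.1f}")
print(f"Estimated impact: {est_impact:.1f}")
```

Labeling these figures as estimates derived from external benchmarks, rather than direct measurements, is exactly the kind of transparency about data sources the article calls for.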
  • Speculation
If no research is available, a last option is informed speculation. Even a well-considered estimate based on transparent, logical assumptions is a step toward evaluating individual programs and enabling benchmarking against others. In all cases, transparency about data sources is crucial to ensuring the data are properly contextualized as to their level of precision.

In short, this simple and flexible framework allows any organization, regardless of its sophistication or available resources, to determine the social impact of its STEM investments. If embraced across all STEM stakeholders—including funders and practitioners from the private, public, and social sectors—it offers a practical “shared language” for identifying what’s working and what’s not across the sector, to drive continuous improvement in achieving our shared goals.



This article is part of a series on “solvable problems” within the context of the UN Sustainable Development Goals. The Global Engagement Forum: Live takes place this October 10–11, 2018, bringing together leaders from across the private, public, and social sectors to co-create solutions and partnerships to address four urgent, yet solvable problems—closing the skills gap in STEM, reducing post-harvest food loss, ending energy poverty, and eliminating marine debris and ocean plastics.
