A Recipe for…Nothing

To my friends in the education community, I have a simple, sure-fire way for you NOT to succeed in your SEL work, or at least to leave yourself only a very small chance of success. Let’s call this “standard practice”:

  1. Buy an evidence-based SEL program or curriculum.
  2. Invest in some up-front PD about the program.
  3. Spend staff time talking about SEL.
  4. After the start of implementation, assume everything is going great.

What’s wrong with this model?

Changing systems and behaviors is hard. And implementing even a well-crafted, evidence-based SEL program requires both systems and behavior to change. When new initiatives like SEL programs are introduced, most teachers continue doing what they have always done, which doesn’t include using the new SEL program.

The problem is not teachers; it’s human nature. When a new initiative is launched, a small percentage of early adopters will jump on it. The rest will lag behind or never really start. That seems to be what often happens in districts that use “standard practice” to implement SEL programs: the program never really gets off the ground.

Why Spotty Social Emotional Learning (SEL) Implementation Matters

So what, you say? Why does it matter if implementation is spotty?

Districts that use the standard practices described above often find that they spent a lot of time and money on SEL, but very little changes in the classrooms. At the same time, no one has a great handle on where implementation is happening well and where it is not. As a result, resources (like instructional coaching) cannot be deployed where they are most needed to support consistent and high-quality implementation. It should come as no surprise then that student outcomes don’t change. Under standard practices, too often, districts and communities conclude that SEL programs don’t work and should be abandoned. In fact, the program never had a chance to make a difference because implementation was inconsistent.

So, to the question, why does it matter if implementation is spotty? It matters because time, money, and opportunities to support students with effective practices all go to waste.

How Assessment Data Can Support Consistent and High-Quality SEL Practices

What can district leaders do to prevent this all-too-common situation? One of the problems with standard practice is that there’s no way to know what happens in classrooms after implementation begins, or what difference it is making. As a result, it’s not possible to know where to focus coaching and other ongoing PD resources to support consistent and high-quality practices. What’s needed is information about what’s happening and what impact it’s having, so coaching and PD resources can be deployed where needed to improve implementation.

Imagine that, instead of standard practice, a district decided to use this model:

  1. Incorporate SEL into the strategic plan and commit to measurable social and emotional outcomes.
  2. Select an evidence-based SEL program designed to teach the competencies described in the strategic plan.
  3. Provide initial PD for the program’s use and ongoing instructional coaching focused on ensuring that the program lessons are taught consistently and well and that teachers reinforce social and emotional competencies throughout the day.
  4. Assess student competencies and perhaps climate before or at the beginning of the SEL program.
  5. Use findings from the competence and climate assessment data to focus SEL instruction to build on strengths and address needs.
  6. Periodically assess implementation—the extent to which teachers are teaching program lessons consistently and well and reinforcing social and emotional competencies throughout the day.
  7. Use implementation data to direct PD and coaching resources to the teachers who need help implementing the program with greater consistency and quality.
  8. Assess student competencies after a period of instruction to evaluate progress.

What does this model do that is different from standard practice? It shines a light on what is and is not happening in classrooms, giving educational leaders an opportunity to focus coaching and other PD resources where they are needed most, and to evaluate change in practice as a result of those efforts. And it shines a light on student competencies in ways that can focus instruction and evaluate progress.

This model also provides the data needed to determine whether better implementation produces greater social and emotional gains. If it does, then as districts work to support consistent and high-quality SEL practice, they can feel confident that strong implementation is leading to the intended benefits. That argues for continued effort to bolster implementation rather than a premature conclusion that the initiative is not working, reducing the risk of throwing out the proverbial baby with the bathwater.

How and When to Use this SEL Practice Model

This sounds great, you say, but how can we pull off this kind of practice model? xSEL Labs offers a suite of assessments and ongoing support to help you put these ideas into practice. And we are inviting like-minded districts to join cross-district PLCs focused on data-informed SEL practices. (Let us know if you are interested.) This offers the benefits of shared learning and economies of scale. We’d like to work with as many districts as possible to put these ideas into practice.

But Why Bother?

Maybe you’re not convinced of the value of assessment—after all, SEL assessment costs time, money, and the intangibles of a new learning curve.

What is it worth to you to ensure that your considerable investment in SEL programs, PD, and staff time does not become a sunk cost? Is it worth investing a little more to ensure that your large SEL investment doesn’t go to waste?

I’d wager that the cost of assessments designed to shine a light on practice and the outcomes those practices are intended to produce is considerably smaller than the substantial waste arising from inconsistent implementation under standard practice.

The cost of not assessing is higher than you probably think.