When things go wrong on a project or change implementation, we may gather our team together to debrief. We identify lessons and commit to practicing them to avoid complications and failures in the future. While research indicates debriefs are an effective method for improving team performance, you may wonder, like I do, if there are ways we can sharpen our foresight, not just our hindsight.
In search of haystacks
In my experience, when challenges arise on implementations, the cause is often neither surprising nor unavoidable. Debriefs and after-action reviews provide learning, for sure, but not so much about the needle in the haystack no one knew was there until they sat on it. (Ouch!) Rather, hindsight often highlights the haystacks that were obvious, but everyone pretended not to see! Things like: "We had a process in place, but we stopped using it...I guess things pretty much fell apart after that." Or, "No one knew who was responsible, but no one said anything because we thought someone else would..." Or, "We had a gap in our stakeholder engagement efforts, but it was hard to get face time with those people, and we had to move forward, so we just skipped over them...."
Such "haystacks" commonly relate to the big themes, or best principles, of effective implementation. When we ignore them, we are asking for trouble. When we acknowledge them and use them consistently as a guide, such best principles can help us sharpen our foresight, avoid avoidable issues, and increase our chances of success.
Below, I discuss four best principles I've identified from research and practical experience in implementation. Because everything's better with a story, I present these principles using a case study from an education implementation in California, excerpted from my book, The Implementer's Starter Kit.
Principles in Action — An implementation case study*
Money in search of a solution
In the mid-1990s, the economy in California was booming. As a result, the state found itself with a budget surplus, much of which it was required by law to spend on education. At the same time, standardized test scores indicated lagging student performance, particularly among elementary school students and among low-income and minority students. In response, the Governor of California secured passage of legislation that aimed to increase student literacy scores by reducing the size of classes in kindergarten through third grade throughout the state.
This initiative in California was inspired, at least in part, by a pilot program in Tennessee called Project STAR. Project STAR had garnered national attention for improving student outcomes through the adoption of smaller class sizes.
The class-size reduction program in California was voluntary and offered to all schools throughout the state. At its inception, the program provided schools with $650 for each student in a class of 20 or fewer students, as well as one-time facility grants for additional classrooms. The dollar amount provided was the same for all schools, regardless of the size of their classes at the start of the program. By the end of the first year, nearly 90% of first-grade students in California public schools were in classes of reduced size, with almost 60% of second graders in such classes. The program also proved to be wildly popular with teachers and parents alike.
Getting things done or getting results?
So, it seems, the California program got things done. Class sizes were smaller; teachers and parents were happy.
But, wait! The purpose of the program was to improve student literacy outcomes. Specifically, "to increase student achievement, particularly in reading and mathematics, by decreasing the size of K-3 classes to 20 or fewer students per certificated teacher." You may wonder, how did the program do in those terms?
Unfortunately, after spending $2.5B on implementation and teacher training over two years, the California effort demonstrated no direct impact on student literacy outcomes. (After six years, some improvement in student achievement was seen, but a clear link with reduced classroom size could not be identified.)
Why was a program that was successful in improving student outcomes in Tennessee less successful in California? There are probably many reasons, but when we get into the details, it seems at least part of the cause may have been related to a failure to adhere to best principles of effective implementation. Let’s look at a few examples.
Principle: Engage with stakeholders at all levels and plan before acting.
When the California legislation was passed in July 1996, school administrators were caught off guard. They faced a daunting deadline to reduce class sizes by October — that’s just three months. Consider the practical challenges this posed for school administrators. If you want smaller class sizes, but have the same number of students, you have to increase the number of available teachers and classrooms. In fact, implementing the program required securing “18,000 additional classrooms and 12,000 new teachers” in a period of three months. This proved to be an impossible task.
Further, throughout the life of the program, school administrators and state legislators struggled to align their perspectives. For example, these groups disagreed on things such as: Was this a voluntary program or essentially a mandate? Did the state commit to fully fund costs related to the program — and how were such costs to be defined? Where did schools have the flexibility to modify the program and where did they not?
Principle: Evaluate potential solutions based on technical appropriateness and capacity to implement.
The California effort focused largely on copying “the what” of the Tennessee program, i.e., the innovation of small class sizes. It does not appear, however, that California legislators studied how this innovation was implemented in Tennessee, including assessing the full set of resources implementation would require given the California context. (For example, at the outset of the program, California schools struggled with overcrowding, while the schools involved in Tennessee had additional classroom space available.) Nor did the leaders of the California program appear to question whether the innovation of small class sizes was the most cost-effective means to increase literacy scores in the context of California schools. Perhaps reducing class sizes was a solution to lagging student outcomes, rather than the solution to that challenge.
California was not alone in embracing class-size reduction as a solution to its achievement challenges. Many states chose to implement this innovation, although many of them reportedly did not study its efficacy and appropriateness before implementing it.
Principle: Clearly define what you are implementing and implement it consistently.
Although the Tennessee and California programs were both labeled “class-size reduction” efforts, what was actually implemented in each state was quite different. In California, districts were given the discretion to implement the program to any degree they wanted. For example, one school could reduce class sizes in all K-3 classrooms, while another could choose to do so in only a few grades. Notably, at its start, the California program involved 150 times the number of students that were part of the Tennessee effort. Therefore, although class sizes were reduced, the definition of “small” in California was still larger than it had been in the Tennessee program. Further, due to a lack of available, qualified staff and space, some classes in California were led by uncertified teachers or held in spaces not intended to be used as classrooms — such as gyms, libraries, and labs.
Should one assume that two programs that were similar in name only would produce similarly promising results? Likely not.
Principle: Provide training and ongoing support to end-users.
Some states that evaluated the effectiveness of their class-size reduction programs found that significant outcomes could result from this intervention. However, these outcomes were dependent on factors beyond class size, including a supply of qualified teachers and suitable classroom space, as well as rigorous curricula and professional development for teachers. It seems you have to offer a variety of supports to help teachers actively put good practices in place if you want those practices to have an impact on kids.
Does any of this seem familiar to you?
The basics of good implementation are commonly overlooked, even in highly public, high-stakes efforts such as this one in California. (Recently, challenges related to the implementation of another politically popular education program have also hit the news.) Unfortunately, as this case demonstrates, quick fix or ‘just do it’ approaches to implementation are a risky proposition if you are looking to achieve results beyond popularity.
Even if you have no professional experience in government or education, if you’ve worked in an organization, you can probably name a few implementations that fell short for some of the reasons highlighted in this case: enthusiasm about a solution without a firm analysis of its effectiveness or of the problem it is meant to solve; failure to consider the full costs and requirements of implementation; fuzzy definitions of what is being implemented; and a lack of engagement with impacted stakeholders, just to name a few. Imagine how different things might have been if a few trained implementers had been involved in this effort!
Turning hindsight into foresight
Can you identify learning from this case study that you can proactively apply in your work? Can you suggest other "best principles" from your implementation experience (good & bad) that we should add to this list? If so, please share them in the comments section — we'll all benefit from your contributions.
*All facts and figures related to this case study were drawn from:
Bohrnstedt, G. W., & Stecher, B. M. (2002). What we have learned about class size reduction in California. Sacramento, CA: California Department of Education.
Stecher, B. M., & Bohrnstedt, G. W. (Eds.). (2002). Class size reduction in California: Findings from 1999–00 and 2000–01. Sacramento, CA: California Department of Education. See classize.org.