ETHICS INACTION: Lessons for project leaders from the Flint water crisis.

For the past few months, I’ve been closely following the news about Flint, Michigan, where poor decisions, wrong action, and inaction resulted in lead-contaminated water flowing into people’s homes for months before corrective action was taken.  This is devastating for many reasons, not the least of which is that even minuscule amounts of lead can dangerously and permanently impair children’s development.  My interest in Flint is partly that of a concerned citizen — this is a deeply troubling case of environmental injustice — but as a project leader, it’s also professional.  

There is something we can learn from this.

Although I, perhaps like you, have never and likely will never do work related to local water systems, I have been involved in the implementation of many large-scale organizational initiatives over the last 20 years. When they go wrong, the impacts may not be life or death, but they are not minor and are often long-lasting. Beyond economic loss, fallout from failed projects can include widespread employee stress, demotivation, and an erosion of trust.  For this reason, I encourage anyone involved in project or strategy implementation efforts to take an interest in what happened in Flint. 

The big lesson from the Flint crisis is that, as is often the case with large-scale failures, it was not inevitable. The opportunity is to identify where and how the trajectory could have been changed, and to actively integrate that learning into our own implementation management practices. 

What happened in Flint, exactly? 

If you review what happened in Flint, which I’ll briefly summarize, I think you’ll agree it was an avoidable crisis. Flint, Michigan is a U.S. city in extreme economic distress, with aging infrastructure. To save money, government officials, led by an emergency manager, switched the city’s water from a treated to an untreated source. The switch happened in April 2014. At that time, officials failed to put necessary corrosion controls in place. As a result, the untreated water corroded the lead pipes in the city’s water system, and lead leached into Flint’s water supply. Problems were detected at least as early as October 2014, when General Motors stopped using Flint’s water because it found corrosion on new car parts.  However, it was only a year later, in October 2015, after months of citizen complaints, presentations of data from academics and physicians, and repeated denials from various government authorities, that officials admitted to making a mistake and formally urged citizens to stop using Flint’s water. (For more detailed summaries see here or here.)

As noted by the Virginia Tech researcher who helped raise the alarm about Flint, “No one gets up in the morning thinking they’re going to poison some kids and destroy a city.”  I agree. But I still find myself asking why, if this crisis couldn’t be avoided completely, it wasn’t at least addressed in a more timely manner.  In a recent interview on National Public Radio (NPR), Gina McCarthy, the head of the Environmental Protection Agency (EPA), said something that struck me as the beginning of an answer.

"Well, EPA's role in the Clean Water Act is not unlike the role of a parent with a grown child. You know, basically, the Safe Drinking Water Act says that the states make the decisions. They do the implementing. They do the enforcing. And it specifically creates a hurdle for EPA to intervene, just like a parent cannot intervene in a grown child's life unless they really mess up."

If this is the world view that informs the EPA’s actions, it does help to explain (not absolve) their failures related to Flint. It’s a cautious, bureaucratic view that focuses more on rules than public health.  It's a view that could give rise to the shoulder-shrug, hands in the air, “Whaddya gonna do?” response we saw from the EPA in this case.

The other thing that struck me about McCarthy’s remark, and one of the reasons I’m writing this article, is that I’ve used a similar rationalization myself. I can remember attending meetings for a massive project, which eventually failed, and not saying anything about clear weaknesses in the assumptions being made by the project's leaders...because it wasn’t my role.  We could argue that the project would have failed whether I spoke up or not, or that, surely, I would have said something if it had been a life-or-death scenario.  What I learned through my research for this post is that while we could argue those points, we would be wrong to do so. 

When ethical failures happen to (good) people. 

From the perspective of behavioral ethics, we should evaluate our actions (e.g., failure to speak up) without regard to the outcome (e.g., failure or success of the project).  Secondly, there is no reason to think that we’ll act when the stakes are high, particularly if we fail to do so when they are not (relatively speaking).  This certainly was the case in Flint, where only a few ‘heroes’ spoke out.  While these heroes should be celebrated, what’s frustrating is that there should not have been a need for heroism in Flint.  All that was needed was technical and ethical competence.  I can’t explain why the officials didn’t have the technical competence required, but I’ve learned that ethical competence is an area that most of us need help with.   

As humans, we are fallible.  There are common missteps in ethical decision-making that we are all vulnerable to. This doesn’t mean we should be excused for our ethical failings. We should be held accountable. It does mean, however, that unethical behavior is not limited to bad actors.  Any of us could find ourselves in a situation where we are left wondering:  “How did I not see that? How did I let this happen? Why didn't I do something?”  

As the world and our work become increasingly complex and fast-paced, it is easier than you may think to stray into unethical territory through action or inaction. Before you find yourself there, you may want to build awareness of common ethical challenges and integrate strategies to counteract them into your management routine.    

Mind the Gap: Common risks to ethical judgment. 

Below, I summarize risks to ethical decision-making taken from my reading, along with examples of how these risks played out in the Flint, Michigan water crisis. Because awareness only gets you so far, in the subsequent section, I summarize several strategies that leaders can use to better navigate ethical dilemmas. 


Source: The New York Times

We have a difficult time being objective when we have a vested interest in the outcome. This can affect how we assess, seek, use, and remember information.  As noted by Max Bazerman and colleagues in their research on ‘motivated blindness’, people tend to ignore or dismiss relevant information, and those who provide it, when the information contradicts their beliefs or interests.  For this reason, it’s particularly important to seek out opinions and information that run counter to our prevailing view.



Goals and performance rewards can spur us to greater levels of achievement. (See more in my post on goal setting.) However, even goals set with good intentions can encourage unethical behavior. This is especially true when goals are extremely challenging or when decisions are framed purely in the context of the goal, without consideration of ethical dimensions.  Researchers note that when managers see undesirable behavior, they should review incentive systems and identify alternative or balancing goals to avoid encouraging unethical choices.



When we are one or two steps removed from the source of unethical actions, we hold ourselves, and others tend to hold us, less responsible. This distance can cause us to be blind to, or not to seek, important information and/or to unconsciously “delegate unethical actions.” Assessing potential actions as if we are the ones taking them — and facing their consequences —  may help to avoid this bias.





Source: The New York Times

When we accept minor ethical violations, we become more likely to accept more widespread or serious violations.  By the time it becomes a “no-brainer” to act, it may be far too late. Therefore, as noted by psychologist Daniel Ariely: “…the first act of dishonesty might be particularly important in shaping the way a person looks at himself and his actions from that point on — and because of that, the first dishonest act is the most important one to prevent.”




There are times when people simply lie.  More often, we may share a more self-serving version of the truth.  Although 'spin' may be common practice, when we manipulate information we can easily wander into unethical territory.  If we are on the receiving end of deception, we are not necessarily off the hook.  If we suspect something isn't right, and we do not seek additional information or voice concerns, we may not be fulfilling our responsibilities.  




You Can Make It If You Try: Strategies that support ethical action. 

While it’s important to be aware of ethical pitfalls, awareness alone won’t help you to avoid them. There are strategies, however, that can help project leaders and managers better navigate ethical dilemmas.  Those outlined below are based on an ethical decision-making model put forth by researchers at the University of Oklahoma, which focuses on how leaders successfully “make sense” of complex ethical situations.

Keep your cool.

Both positive and negative emotions can impact our ethical judgment.  Therefore, it’s wise to keep our emotions from driving our actions. To do this, we can use methods ranging from relaxation and meditation to simply sleeping on it.  Reappraising the situation in a less emotional state may lead us to different conclusions. This does not mean we should rely solely on logic and/or data. Research indicates that we should aim for a balanced approach informed by logic and rules, as well as intuition. 

Remember that time when?

Reviewing prior experiences for learning and alternatives that can be applied to the current situation helps us to make a more comprehensive assessment.  It’s particularly important to focus on the process or action that was undertaken previously, rather than the outcome. It may also be worthwhile to seek an outside point of view to develop alternative approaches.

Use your crystal ball.

When we predict potential outcomes of a particular action, it can help us uncover biases or gaps in our thinking. It’s a good idea to identify a variety of potential outcomes to be sure we are truly testing, not just confirming, our thinking. However, the idea is to focus on the most critical aspects of the situation to avoid getting overwhelmed by possibilities. Finally, it can be useful to ask the question, “Why not?” to ensure we look at all sides of an issue, not just those that align with our current point of view.

Connect the dots.

When navigating ethically contentious situations, leaders are called on to seek, use and share information in order to come to a resolution.  We should be aware that context (time pressures, organizational culture, goals) can impact both how we gather information as well as how we synthesize it to ‘make sense’ of the ethical situation. Some research indicates that taking a broad perspective, considering implications not just for oneself, but also others in one’s organization or beyond it, leads to better information synthesis.  

The journey of a thousand miles...

Among the many things I've learned in trying to understand what happened in Flint is that, as distressing as it is, it is not that exceptional.  As it turns out, we all generally think we are more ethical than we really are.  It takes concerted effort and practice to build competence in ethical decision-making.  But surely, it's worth it.  As noted by behavioral ethics researcher Ann Tenbrunsel:

"The goal [of such an effort] is not to get you to recognize, ‘Gosh, I'm not really good'; it's to get you to want to use this information to be the person that you think you are, that you want to be, that you can become."


Selected References and Additional Resources

For direct quotes and references to research, citations have been provided within the article above, using hyperlinks.  Below are additional or particularly useful resources that informed this article, which you may be interested in as further reading. 

Behavioral Ethics

Ethics Unwrapped is a web-based resource on behavioral ethics provided by the McCombs School of Business at the University of Texas at Austin.  It provides overviews of ethical concepts as well as case studies.  Hint: Content is largely provided via videos; if you prefer reading, scroll to the bottom of each concept page to find a transcript. 

Max Bazerman is a professor at Harvard whose research includes behavioral ethics.  If you have access to Harvard Business Review, you can find a series of articles by him and his collaborators, which are a bit easier to digest than those in academic journals.  Additionally, if you want to go deeper, you may be interested in books by Bazerman or Daniel Ariely, a psychology professor from Duke. 

Chase Thiel and his collaborators provide an ethical decision-making model in this article, which appeared in the Journal of Business Ethics.  Because the model is based on a wide body of research in the field, the article is also a useful summary of research findings on various aspects of ethical decision-making.  


The Detroit Free Press has a special section on the Flint water crisis, which provides a trove of information. This timeline is particularly useful, as it is more detailed than others I reviewed. 

The New York Times has been covering the water crisis in Flint since before it was thought of as a crisis.  This article from March 2015 is helpful for context.  Additionally, this summary provides a high-level overview of key missteps that led to the crisis in Flint.