
One of the “myths” about ethics is that people who do unethical things are somehow different from the rest of us: that they are psychopaths or other kinds of individuals who simply do not care about doing the right thing. In other words, the myth is that people themselves are the problem. While there certainly are people who do not care about doing the right thing, they thankfully tend to be a very small slice of humanity, and an even smaller slice of those who choose public service as a career.

If so, with all the other important tasks agency leaders juggle, can we assume that ethical behavior is a given and that public agency team members will do the right thing?

Unfortunately, both experience and science say the answer is “no.” The science of behavioral ethics offers interesting insights into why even well-intentioned individuals fall short of their intentions to act ethically. The reason is often that ethical missteps are situational: an unfortunate combination of circumstances and how we as humans react to those circumstances.

This article explains what we as management professionals can learn from that science and what it means for encouraging ourselves and others to act on our ethical intentions. It does so through the lens of three public administration tragedies. (These tragedies are part of the “experience” element of the equation.)

A Bit of Context: Behavioral Psychology

Behavioral ethics is an extension of psychologist Daniel Kahneman’s research into how humans really process information and decide what to do. This research was so groundbreaking that it earned Kahneman the Nobel Prize in economics in 2002. Dr. Kahneman summarizes his findings in his bestselling book, Thinking, Fast and Slow. He invites us to think of the human brain as operating in two modes.

  • The first is “system 1” thinking, which we do with little effort. In fact, this system operates virtually automatically, often drawing on intuition and emotion. This kind of “thinking” occurs quickly (hence the moniker “fast” thinking).
  • The second kind of thinking is “system 2,” which is slower, more effortful, and more deliberative.

What are the practical implications of the two systems? Dr. Kahneman’s answer would be that system 1 thinking is “error-prone.” Put another way, when we are operating in system 1 mode, our brains tend to react in ways that cause us to make decisions that do not serve us well or do not match our good intentions.

This includes decisions with ethical dimensions. Scholars Bazerman and Tenbrunsel observe that system 1 thinking often dominates this kind of decision-making. The net result is that what they describe as our “want” (emotional and self-interested) selves can override our “should” selves. In fact, system 1 “want self” thinking can cause us to miss the ethical dimensions of a situation entirely. This is why the scholars’ book on the subject is called Blind Spots.

The practical takeaway: sometimes intuition can serve one well, such as when one gets an uncomfortable “gut” feeling about a contemplated course of action. That “gut feeling” may be signaling that the action could be inconsistent with one’s values.

However, the reverse is not always true. The absence of a “gut feeling” does not necessarily mean that there is no ethical issue. Let’s discuss the kinds of dynamics that can block those signals.

Brain Dynamics to Be Aware of

Decision-making experiments indicate that humans are subject to a number of cognitive tendencies that affect how we react to situations. For example, we often rely on mental shortcuts to make decisions more easily. Situational factors, like being overworked, can play a role as well. Finally, even when we think we are deciding something rationally, we may actually be reasoning backward (rationalizing) to justify a decision made emotionally (in system 1, by our “want selves”).

While not exhaustive, the following are examples of these dynamics that are especially likely to present themselves in the workplace. They are presented in the context of case studies involving seemingly well-intentioned public agency staff making decisions they likely came to regret.

The descriptions of behavioral ethics concepts draw on a helpful website developed by the University of Texas’ McCombs School of Business: ethicsunwrapped.utexas.edu. The school developed the website to help the business community understand the implications of behavioral ethics for private sector ethics efforts.

The Veterans Affairs Scandal

Self-Serving Bias and Unrealistic Goals

The 2014 Department of Veterans Affairs (VA) scandal illustrates the power (and corrosive effect) of self-serving bias. Self-serving bias is the tendency people have to process information in ways that support their own self-interest or pre-existing views. (This is one of the drivers of the “want self” that Bazerman and Tenbrunsel describe.)

Self-serving bias can also discourage people from speaking up if they fear adverse consequences.

In the VA scandal, performance incentives at the agency rewarded work units that met specified targets for seeing patients in a timely manner. But because the agency was understaffed, employees falsified records about how quickly patients were in fact being seen. The falsified records obscured the staffing problem, with the tragic result that patients died while waiting for appointments.

A video on the University of Texas site (see the bottom of the page for a transcript of the narration) delves deeper into the role that unrealistic targets and workloads can play in eroding employees’ commitment to values and ethical action. It notes that when employees feel mistreated, they are more likely to mistreat others (customers or, in the case of public agencies, the public or agency clients) and engage in other misconduct, such as the lying involved in the records falsification.

The Space Shuttle Disasters

Status Quo Bias, Framing, and More Unrealistic Goals

Status quo bias is the human tendency toward inaction. One manifestation is moral muteness, which occurs when people remain silent while observing unethical behavior. A related dynamic is diffusion of responsibility, in which people do not act because those around them are not acting. They conclude that action is either not appropriate or that someone else will take it if it is.

The National Aeronautics and Space Administration (NASA) learned in a very tragic and public way what happens when leaders do not receive the information necessary to make a good decision. In the Challenger disaster, engineers had warned mid-level NASA management that cold temperatures forecast for the launch day posed a serious hazard to the space shuttle and its seven crew members. Sadly, the engineers’ warnings proved prophetic when the shuttle broke apart shortly after launch, killing all seven crew members in front of a nation and its schoolchildren as the television cameras rolled.

There are a number of lessons to be learned (and behavioral ethics concepts illustrated) in what happened. NASA reportedly had an overall organizational culture that discouraged passing unwelcome information up the chain of command (a culture that sadly contributed to another deadly disaster years later, when the space shuttle Columbia broke apart on reentry into the Earth’s atmosphere). The contractor’s engineers reportedly assumed that their concerns would be communicated up the chain of command, which proved untrue (a variation on moral muteness/status quo bias). With no one speaking up, the countdown to launch (the status quo) continued, with tragic consequences and damage to the public’s and others’ trust in NASA’s competency.

In addition, the private sector contractor’s management team overrode the engineers’ recommendation to postpone the launch. The engineers’ unit leader on the management team was reportedly told to “take off his engineer’s hat” and look at the situation from a business perspective. This is an example of how framing a decision too narrowly can obscure its ethical dimensions. Research indicates that how an issue is framed powerfully influences the factors people take into consideration. Research also demonstrates that when we are laser-focused on making one thing happen, we block out sensory and other inputs.

Moreover, the NASA mid-level managers who pressured the contractor to override the engineers’ concerns were themselves under budgetary pressures that led to an unrealistic launch schedule, not unlike the pressures involved in the VA scandal.

The takeaway is that our tendency toward inaction is one of the reasons organizational leaders need to take affirmative steps to encourage people to speak up when they have concerns about proposed actions (sometimes called fostering a “speak-up culture”).

The Blindsided County Administrator

Conformity, Obedience to Authority and Overconfidence Biases, and the Importance of Organizational Culture

Conformity bias describes our human tendency to take behavioral cues from our surroundings. In other words, we respond to social norms and pressures. The good news is that if those around us are acting in positive and pro-social ways (actively looking for and evaluating the values dimensions of their workplace actions), we will likely follow suit.

The reverse is unfortunately true as well: if others are cheating or acting unethically, we may be tempted to do likewise. A related concept is groupthink, which describes a desire to maintain group loyalty, even when it conflicts with one’s personal standards. In short, as social animals, we are strongly motivated to go along to get along.

The leadership takeaway from this insight about human nature relates to the importance of being intentional about shaping organizational culture. Organizational culture has been described as “the way we do things around here”—the norms, values, and traditions that shape how employees behave and do their work.

We are also strongly influenced by what is sometimes referred to as the “tone at the top” of our work units or organizations. Obedience to authority bias is the tendency to comply with a superior’s wishes or directions, even when they conflict with one’s own judgment. People tend to respect and follow those they perceive to have legitimate authority. The dynamic extends to actions people take to please those in authority, even without expressly being directed or asked to act unethically.

In short, leaders send signals, whether intentionally or not, about what matters. This case study (based on an actual situation and used in public administration textbooks) is illustrative. The organization’s leader ensured that his agency had a code of ethics, and he was a member of ICMA. He trusted his management team and reportedly had a hands-off leadership style. The administrator also reportedly took pride in his organization’s performance in getting work accomplished efficiently. He was well regarded by both his board and the community.

That all unraveled, unfortunately, when the local newspaper reported that a 55-year-old public works employee took some $15,000 in gifts from a company whose contract the employee helped supervise. This may be an example of self-serving bias insofar as the longtime public servant accepted nice gestures from those trying to curry favor. (The employee may also have rationalized, as humans do, that a genuine relationship motivated the gestures or that his good work made him deserving of them.)

The situation got worse, however, for the administrator. As is often the case when scandal hits, closer scrutiny revealed other procurement improprieties that seemed to be efforts to get around procurement process requirements. What might have caused the employees to think that would be okay?

We do not conclusively know, of course, but a lesson going forward is the importance of leaders, particularly those in the public sector, communicating that how the work gets accomplished is just as important as, or even more important than, simply getting the work done. A tool for doing this is an organization’s values statement, ideally incorporating the core values of trustworthiness, fairness, and responsibility (including the responsibility to adhere to laws and rules designed to promote fairness and trust in the agency’s processes).

There is a strong consensus in the literature that leaders ignore organizational culture at their peril. Conversely, effective leaders—particularly those concerned about ethics—are intentional in shaping organizational culture to promote values-based decision-making. They work to keep themselves and others in their organization from falling prey to our human tendency toward overconfidence bias. Overconfidence bias is a tendency people have to be more confident in their abilities, including the ability to act ethically, than is objectively reasonable. As in Lake Wobegon, we all consider ourselves to be above average (in fact, well above average) when it comes to ethicality.

The cruel irony is that overconfidence bias can cause us to neglect to look for the ethical dimensions of a decision based on a flawed assumption that we will always act ethically. This is one of the reasons that scholars counsel ethics educators to help learners understand our shared human cognitive tendencies. Even more alarming, research indicates that leaders in particular fall prey to this overconfidence dynamic, often with ethically and professionally disastrous results.

Finance Officers Embrace Behavioral Ethics in Recent Ethics Initiatives

The Government Finance Officers Association (GFOA) recently launched an effort to “reinvent” its approach to ethics, expressly embracing behavioral ethics concepts. The organization represents more than 20,000 finance officials in the United States and Canada.

In deciding to do so, GFOA concluded that the real impediments to public finance professionals’ ethical behavior are situational factors and organizational pressures. GFOA developed a values-based code of ethics and is developing training programs and implementation guides to support its members in embracing the code.

Conclusion

In terms of leadership takeaways, several have already been identified, including:

  • Make sure performance targets are realistic.
  • Encourage “sharing up” of important and even unwelcome information (foster a “speak-up culture”).
  • Mind your messaging: getting work done is important in public service, but so is how that work gets done.

Having an organizational code of ethics and demonstrating one’s commitment to ethics by being a member of ICMA are important first steps. However, these steps must be reinforced by leadership actions and messages that underscore the importance of including ethics and values (“is this the right thing to do?”) as an important criterion in decision-making. Such efforts can have a powerful effect on organizational culture, which in turn shapes employee behavior.

Leaders are also well advised to model the behavior of slowing down (engaging in system 2 thinking) in decision-making and actively looking for the values dimensions of situations. Sometimes the analysis involves competing “right values,” which in turn leads to an analysis of which value is most important in a given situation. Another reality is that appearances matter when it comes to public service ethics issues. This means that erring on the side of caution and avoiding even the appearance of impropriety is often a wise default.

Is it an administrator’s job to concern themselves with organizational ethics? The American Society for Public Administration unequivocally says “yes.” Principle 7 of the organization’s code of ethics encourages people in public service to “promote ethical organizations.” While the code’s accompanying practices offer ideas on how to do this, an additional practice is to understand what the science says about human nature and ethical decision-making. This reduces the likelihood of falling prey to ethical blind spots. It can also create positive situational dynamics that support others in doing the same.

The benefits of doing so include fostering an organization that operates in ways that promote public trust and confidence in your agency. More practically, it can mean keeping your job, which means continuing to do the good work you are already doing.

JOANNE SPEERS, MPP, JD, trains and consults on public service ethics as principal of S2 (as in “System 2”) Ethics Strategies. She previously served as chief executive of the Institute for Local Government, where she developed and directed its ethics program, and has also taught ethics as an adjunct professor at the graduate school level.

