This is the first installment of a series of posts on bias by lawyer, author, and educator Jim Peterson. Jim spent 19 years as in-house counsel at Arthur Andersen. His international practice focuses on the accounting profession’s litigation and disputes, practice quality, and regulatory issues.
Trustworthy, high-quality service lies at the heart of the accountants’ claim to public confidence and professional legitimacy. Rigorous standards and high expectations are set for independence of judgment, skepticism, and ethical behavior.
Yet, to the frustration of the profession, its leaders, and its public critics, undesirable behaviors continue to erupt:
- Large-scale financial failures and scandals that prompt the popular outcry, “Where were the auditors?”, revealing problems that have recurred for decades.
- Corruption of regulatory inspection and oversight through falsification and back-dating of documents and misuse of confidential agency information.
- Answer-sharing and other cheating on professional examinations.
- Individual violations of trading rules, use of inside information, and breaches of independence rules.
- Doubts about the profession’s ability to recruit and retain the growing number of properly qualified and skilled personnel it needs, amid concerns about academic preparation and workplace environments.
All this despite intense attention and massive investments of time, money, and energy in education, training, messaging from senior leaders, and the development of tools and methodology.
The challenges may seem intractable. But if issues spanning judgment, skepticism, ethical behavior, and operating priorities really are so hard, are there ways to improve?
Although the search for simple solutions to complex questions can lead to faulty analysis and wrong answers, a more optimistic perspective is proposed here. Fortunately, we have a half-century of research in the cognitive sciences to open a window into the complexities.
The premise is that accountants share common characteristics with their fellow humans, for better or worse. They are not special, differently equipped, or uniquely skilled, and they are no less subject to the influences and pressures that expose the frailties and weaknesses of the human condition.
Which means the news is not all bad: the opportunities identified in that research can be adapted to the accounting profession’s particular needs.
Biases, Pressures, and Human Frailty
A rich body of relevant literature sets the stage. A sampling for those interested:
- The ground-breaking 1974 article by Amos Tversky and Daniel Kahneman, “Judgment Under Uncertainty: Heuristics and Biases” (Science, Vol. 185)
- Kahneman’s own encyclopedic book, “Thinking, Fast and Slow” (2011)
- Contributors such as Philip Tetlock, “Superforecasting: The Art and Science of Prediction” (2015), and Richard Thaler, co-author with Cass Sunstein of “Nudge” (2009 and later editions) and author of “Misbehaving: The Making of Behavioral Economics” (2015)
- A generation of sharp economists and writers, including the University of Chicago’s Steven Levitt with journalist Stephen Dubner, e.g., “Freakonomics” (2005), and the UK’s Tim Harford, e.g., “The Undercover Economist” (2005) and “How to Make the World Add Up” (2021).
Greatly simplified, all modern humans carry the legacy of our prehistoric ancestors’ DNA. An early hunter, armed with a spear and hearing rustling in the underbrush, faced a critical moment that required an instant reaction: if it were an animal he could kill, the tribe would eat that day, but if it were a tiger or a wild boar, he was in deadly peril. A stranger emerging from the forest with outstretched hand: was it offered in friendship, or did it hold a weapon?
For our ancient predecessors, the fast instincts of “fight or flight” and “friend or foe” were essential to the tribe’s safety and survival, whether against hostile adversaries or vicious predators. Their judgments carried life-or-death consequences for the hunters, but the tribe survived provided they were mostly right, most of the time.
Those same triggers still serve in many simple cases today. When a traffic light changes from green to yellow, no analytic energy goes into a driver’s non-rational “go or no-go” reaction. Likewise, low-salience consumer choices at a Starbucks counter or a grocery shelf are mostly simple, fast, and undemanding.
But these examples also show that our DNA has not kept pace. The very simplicity of our ancestors’ instinctive reactions makes them maladapted to complex modern challenges. Decisions such as how to allocate an investment portfolio, how to calculate a reserve provision, where to open a new office, or which candidate to hire are better made with research, deliberation, and consultation.
Like the ancient tribes, today’s society still operates, for better or worse, as programmed for overall survival under normal conditions. This means that decision-making often defaults to oversimplification or resists recognizing the consequential exceptions:
- The very essence of the long-standard auditor’s report, expressed in non-technical terms, conveys a message filled with multiple qualifications — that the client’s information “is generally right, for the most part, over this time period, so far as we can tell within the limits of our work.”
- Likewise, practices and programs for personnel hiring, evaluation, promotion, and management are designed on the assumption that those affected will behave as expected: acting honestly and in good faith, and responding appropriately to guidance and leadership.
For both of these, the general rules work just fine. Until, under conditions that may seem unusual or unpredictable, they don’t.
The dissonance emerges in both performance and management, because it is the exceptional cases, not the normal and expected ones, that cause the problems (e.g., the white-collar fraudster for whom an audit is an inconvenience to be evaded, the abusive supervisor, the rogue employee).
And that’s when the consequences kick in. Reactions to this bad behavior, from the profession, its clients, and the general public, can be put in the terms used to discipline unruly children: “when you do right, nobody remembers; when you do wrong, nobody forgets.”
The Ubiquity of Human Bias
The literature has cataloged an entire vocabulary of human bias and error that pervades the accountants’ environment as much as any other area of business or professional services. The irresistible human impulse to simplify complex problems leads to narrowed questions that, by default, are often answered with little time and energy, and too frequently answered incorrectly.
A few examples from a long list will illustrate:
| Bias | In General | In the Profession |
|---|---|---|
| The Induction Fallacy (Mistakenly believing that past experience is a complete guide to the future) | It’s only the flu | The CFO has never misled me before |
| The Planning Fallacy (Overconfidently disregarding the likelihood that unforeseen changes will affect a project’s execution) | The new opera house will be finished on time and on budget | Our audit report is on schedule; we don’t expect any disruptions |
| The Anchoring Effect (Overweighting quantitative information that is visible although not necessarily complete or relevant) | “Buy three tires, get the fourth free” sounds like a bargain | The provision for receivables feels light; a 5% bump should do it |
| The Optimism Bias (Ignoring evidence inconsistent with a desired outcome or narrative) | Our troops will occupy Kyiv within two weeks | Client growth was 15% last year; 20% this year is a lock |
| The Inertia Effect (Continuing past practice, whether or not conditions have changed) | NASA shuttles have launched for years; the Challenger is good to go | Our sample scope was fine last year; SALY (same as last year) |
| The Bias of Representativeness (Extending an assumption based on limited information) | Dressed like that, he surely doesn’t belong here | We don’t send recruiters to schools with that ranking |
These examples, illustrative and by no means comprehensive, are common throughout the profession, as they are in society as a whole. And they share a common theme: they all arise out of the natural default to quick, simple, and intuitive reactions inherited from our ancestors, reactions ill-suited to modern life’s complexities.
The guidance of the research is best summarized by Richard Thaler (Misbehaving, pp. 355-56):
It is time for everyone — from economists to bureaucrats to teachers to corporate leaders — to recognize that they live in a world of humans and to adopt the same data-driven approach to their jobs and lives that good scientists use.
Good leaders must create environments in which employees feel that making evidence-based decisions will always be rewarded, no matter what outcome occurs. The ideal organization encourages everyone to observe, collect data, and speak up.
A word of caution: perfection is not within the grasp of fallible humans with our in-built limitations. “Zero defects” is not achievable, nor does awareness of our sources of bias assure consistent use of the tools available. But the deliberate aspiration to improve, the best we can do, holds promise.