The Dunning-Kruger effect can lead to technical debt

Cropped detail from Charles Robert Darwin, a painting by John Collier (1850-1934). The painting was given to the National Portrait Gallery, London, in 1896. Darwin writes, in The Descent of Man (1871): “… ignorance more frequently begets confidence than does knowledge …,” which is the essence of the Dunning-Kruger effect. Image courtesy WikiQuote.

The Dunning-Kruger effect [Kruger 1999] can lead to the formation or persistence of technical debt in two ways. First, it can cause technologists or their managers to overestimate their ability to maintain the resource focus needed to retire technical debt in a timely fashion. Second, it can cause senior managers to be reluctant to accede to the resource requests that technologists and their managers make in support of technical debt management programs.

Kruger and Dunning conducted experiments that yielded results consistent with the following four principles (paraphrasing):

  1. Incompetent individuals, compared to their more competent peers, tend to dramatically overestimate their own ability and performance
  2. Incompetent individuals, compared to their more competent peers, tend to be less able to gain insight into their own true levels of performance
  3. Incompetent individuals can gain insight into their shortcomings, but, paradoxically, this insight comes about by gaining competence
  4. Incompetent individuals, compared to their more competent peers, are less able to recognize competence when they see it

The first three principles lead to distorted assessments of one’s own capabilities. The fourth principle leads to distorted assessments of the capabilities of others.

How the Dunning-Kruger effect affects teams

As an example of distorted self-assessment, consider a team, or its managers, who must retire some types of technical debt in the course of enhancing or repairing an asset. Such a task plan seems at first to offer efficiencies: the engineers can readily make both kinds of changes in one go. Metaphorically, if we must go to the store for milk, we can also pick up bread while we’re there, rather than making two trips.

However, modifying an existing complex technological asset is unlike shopping for bread and milk. The two kinds of modifications, debt retirement and asset enhancement or repair, might seem at first to be separable, and often they are. But when they aren’t separable and we undertake the two tasks together, testing and debugging can become extremely complicated, because defects in one kind of modification can interact with defects in the other. An experienced team and its managers might be more likely to anticipate these difficulties, while an inexperienced team and its managers might be more likely to underestimate them, as a consequence of the Dunning-Kruger effect. Budget and schedule overruns are possible consequences of underestimating the complexity of the problem.
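
To make the interaction concrete, here is a minimal, hypothetical sketch in Python. The scenario, the function names, and the numbers are invented for illustration; the point is only that when a debt retirement change and a behavioral enhancement land in the same modification, a failing test admits several competing explanations, whereas separating the changes keeps each step testable against a single hypothesis.

    # Hypothetical pricing function carrying technical debt:
    # the tax rate is buried in the code as a magic number.
    def total_price_original(quantity, unit_price):
        return quantity * unit_price * 1.07


    TAX_RATE = 0.07  # debt retirement step: give the magic number a name


    # Combined change: debt retirement and an enhancement (a volume
    # discount) land in the same modification.
    def total_price_combined(quantity, unit_price):
        subtotal = quantity * unit_price
        if quantity >= 100:  # new behavior, added at the same time
            subtotal *= 0.90
        return subtotal * (1 + TAX_RATE)


    # If a regression test fails now, the defect could lie in the extracted
    # constant, in the discount logic, or in their interaction: three
    # hypotheses to investigate instead of one.

    # Separated changes: retire the debt first and confirm that behavior is
    # unchanged, then add the enhancement as its own, separately tested step.
    def total_price_refactored(quantity, unit_price):
        return quantity * unit_price * (1 + TAX_RATE)


    def total_price_enhanced(quantity, unit_price):
        subtotal = quantity * unit_price
        if quantity >= 100:
            subtotal *= 0.90
        return subtotal * (1 + TAX_RATE)

Even in this toy case, anyone diagnosing a failure in the combined version must reason about two changes at once. In a complex asset with many interacting components, the number of competing explanations grows much faster, and that diagnostic effort is exactly what inexperienced teams and their managers tend to underestimate.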

How the Dunning-Kruger effect affects decision makers

As an example of the fourth principle above, the Dunning-Kruger effect can cause some decision makers to discount the warnings and resource requests of engineers and their managers. Decision makers who are unsophisticated in matters related to technical debt must nevertheless assess the validity of those requests, and in making these assessments they may be at a disadvantage for a number of reasons. Examples:

  • Decision makers might hold mistaken beliefs about technical debt. For example, many believe that the main cause of technical debt is poor decision-making by engineering managers, and others believe that technical debt results from the slovenly work habits of engineers. Those who hold such beliefs might be reluctant to allocate yet more resources to engineers to address technical debt.
  • If the advocates for technical debt management resources aren’t fully informed about the strategic direction of the enterprise, their requests might be inconsistent with enterprise strategy. Decision makers might notice that portions of those proposals don’t take enterprise strategy properly into account, and then, as a result of a cognitive bias [Kahneman 2011] known as the halo effect [Thorndike 1920], they might tend to discount even the valid portions of the technologists’ proposals.
  • Decision makers might be affected by unrealistic optimism [Weinstein 1996], also known as optimism bias, a cognitive bias that can cause them to discount the sometimes-vivid warnings of technologists about the unfavorable consequences of failing to provide technical debt management resources.

Last words

Investigations of organizational behavior relative to technical debt and the Dunning-Kruger effect could be fruitful. For example, what is the degree of correlation between an organization’s technical debt burden and the incidence of rejected or severely curtailed proposals for technical debt management resources? Also rewarding would be a survey of the nearly 200 known cognitive biases, to determine which of them might be most likely to affect decision-making about technical debt.

References

[Kahneman 2011] Daniel Kahneman. Thinking, Fast and Slow. New York: Macmillan, 2011.

[Kruger 1999] Justin Kruger and David Dunning. “Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments,” Journal of Personality and Social Psychology, 77:6, 1121-1134, 1999.

[Thorndike 1920] Edward L. Thorndike. “A constant error in psychological ratings,” Journal of Applied Psychology, 4:1, 25-29, 1920. doi:10.1037/h0071663

The first report of the halo effect. Thorndike found unexpected correlations between the ratings of various attributes of soldiers given by their commanding officers. Although the halo effect was thus defined only for rating personal attributes, it has since been observed in assessing the attributes of other entities, such as brands.

[Weinstein 1996] Neil D. Weinstein and William M. Klein. “Unrealistic Optimism: Present and Future,” Journal of Social and Clinical Psychology, 15:1, 1-8, 1996. doi:10.1521/jscp.1996.15.1.1
