Three cognitive biases


Technical debt arises in enterprise assets through the effects of two classes of drivers: obsolescence and decision-making. When technologies advance, or new technologies arise, or laws or regulations evolve or are introduced, existing assets or assets under development can sometimes be left behind. That’s how obsolescence produces technical debt. Debt driven mainly by decision-making is more difficult to describe, but anything that biases decisions away from strictly rational results presents risk. Three cognitive biases likely have strong effects on technical debt formation and persistence.

Photo of Daniel Ellsberg, speaking at a press conference in New York City in 1972. Best known for his role in releasing the Pentagon Papers, Dr. Ellsberg made important contributions to decision theory while at the RAND Corporation. Photo by Bernard Gotfryd, courtesy U.S. Library of Congress.
Decision-making produces technical debt as the people of the enterprise make choices in design, development, and resource acquisition or allocation. Typically, both obsolescence and decision-making contribute to producing any particular instance of technical debt, though either obsolescence or decision-making might be more important than the other in any given instance.

Managing debt driven principally by obsolescence isn’t difficult, but I’ll leave that topic for another time. For now, let’s focus on decision-making. The contribution of engineering decisions to technical debt formation is already widely accepted. Indeed, many believe that all, or most, technical debt arises from faulty decisions by engineers. While some engineering decisions are indeed faulty, the current scale of technical debt is so large that it casts doubt on the idea that engineering decisions are the only decisions contributing to technical debt formation. Investigating how resource allocation decisions might contribute to technical debt formation is certainly worthwhile.

In this post, I propose three examples illustrating how resource allocation decisions might contribute to technical debt formation and persistence. Each example illustrates how people make faulty decisions while believing they’re proceeding objectively and rationally. In each case, what causes the problem is a phenomenon called cognitive bias, though each example in this post illustrates the action of a different cognitive bias.

Loss aversion

The cognitive bias known as loss aversion, first identified by Amos Tversky and Daniel Kahneman [Kahneman 1984], is the tendency to prefer avoiding losses over acquiring gains of equivalent or even greater size. A decision-maker affected by loss aversion might conclude that it’s better to avoid losing $5 than to find $5, or even $10. In this way, loss aversion skews decisions toward options that protect or enhance existing revenue streams, even if those options increase operating expenses, and even if the increase in operating expenses exceeds the value of the revenue being protected.
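
To see how strongly this bias can skew a simple comparison, here’s a minimal sketch in Python. It assumes a prospect-theory-style value function with illustrative parameters from the general literature (a loss-aversion factor near 2.25 and curvature near 0.88); none of these figures come from this post, and a real analysis would calibrate them, if it used them at all.

    # Illustrative only: common textbook parameters for a prospect-theory value function.
    LOSS_AVERSION = 2.25   # losses are felt roughly 2.25 times as strongly as gains
    CURVATURE = 0.88       # diminishing sensitivity to larger amounts

    def subjective_value(amount):
        """Perceived value of a gain (amount > 0) or a loss (amount < 0)."""
        if amount >= 0:
            return amount ** CURVATURE
        return -LOSS_AVERSION * ((-amount) ** CURVATURE)

    # Objectively, finding $10 outweighs avoiding a $5 loss.
    print(round(subjective_value(10), 2))   # 7.59
    print(round(subjective_value(-5), 2))   # -9.27: the loss looms larger than the gain

In this caricature, the felt impact of losing $5 exceeds the felt value of finding $10, which is how an apparently careful decision-maker can still prefer loss avoidance over a larger gain.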

Retiring technical debt usually entails deferring revenue in the short term, for two reasons. First, we must turn the attention of some part of the engineering organization to debt retirement, away from whatever else they would have been doing. Assuming they would have been maintaining or enhancing existing products or services, this redirection of attention can reduce or defer revenue. Second, during the debt retirement effort, some work might require short-term interruptions of revenue streams while the assets being revised are installed or tested.

Thus, debt retirement efforts often do reduce revenue — or reduce revenue increases — in the short term, and some decision-makers can perceive that effect as a loss.

However, the long-term effects of debt retirement can be gains, and those gains can be considerable. Typically, by retiring an asset’s technical debt, we reduce the difficulty (read: time required, effort, cost, and risk) of future maintenance and enhancement efforts involving the asset. We also reduce the probability of debt contagion.

Since these long-term effects of debt retirement are ongoing, their impact on the enterprise can be significant. But unless one has experience dealing with the consequences of technical debt, recognizing the value of retiring it can be difficult. When loss aversion is in play, intuitive comparisons of (a) a short-term revenue loss or delay with (b) the long-term benefits of debt retirement are biased in favor of not retiring the debt.

Insulating decisions about debt retirement from the effects of loss aversion bias requires objective mathematical modeling of revenue losses and operating cost benefits for all options under consideration. Those models must also account for uncertainty, which makes them inherently ambiguous. And that leads us to consider our next cognitive bias, the ambiguity effect.
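
To make that concrete, here’s a small Monte Carlo sketch in Python comparing an option that retires debt now against an option that defers it. Every figure, distribution, and the discount rate is a hypothetical placeholder; a real model would draw them from enterprise data and from the estimation and metrics practices discussed below.

    import random

    def npv(cash_flows, rate=0.10):
        """Net present value of yearly cash flows, year 0 first."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

    def retire_now():
        # Short-term revenue hit in year 0, then lower operating costs for 5 years.
        revenue_hit = -random.uniform(80_000, 150_000)
        yearly_savings = random.uniform(40_000, 90_000)
        return npv([revenue_hit] + [yearly_savings] * 5)

    def defer():
        # No hit now, but ongoing carrying costs of the debt for 5 years.
        yearly_carrying_cost = -random.uniform(30_000, 70_000)
        return npv([0] + [yearly_carrying_cost] * 5)

    TRIALS = 10_000
    retire_mean = sum(retire_now() for _ in range(TRIALS)) / TRIALS
    defer_mean = sum(defer() for _ in range(TRIALS)) / TRIALS
    print(f"Retire now, mean NPV: {retire_mean:,.0f}")
    print(f"Defer,      mean NPV: {defer_mean:,.0f}")

Even a toy model like this makes the comparison explicit, which is the point: it gives decision-makers something other than intuition, where loss aversion operates, to react to.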

The ambiguity effect

The cognitive bias known as the ambiguity effect causes us to prefer options whose probability of a desirable outcome is relatively well known over options whose probability of a desirable outcome is less well known, even when the more ambiguous option has the higher expected value. The effect was first described by Daniel Ellsberg [Ellsberg 1961].

Consider a choice between allocating resources to a new development project and allocating resources to a technical debt retirement project. In most enterprises, decision-makers, project champions, project sponsors, and project managers are all familiar with new development projects; all are less familiar with debt retirement. It’s reasonable to suppose that decision-makers confronted with such a choice will regard the probability of a positive outcome as less well known for debt retirement than for new development.
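
One way to caricature that situation: suppose, hypothetically, that the decision-maker values the ambiguous option by the least favorable probability in its plausible range (a pessimistic “maxmin” rule) rather than by a best estimate. The payoff and probabilities below are invented for illustration.

    def known_probability_value(payoff, p):
        """Option whose probability of success is well established."""
        return payoff * p

    def ambiguous_value(payoff, p_range):
        """Ambiguous option valued pessimistically at the low end of its plausible range."""
        p_low, _ = p_range
        return payoff * p_low

    PAYOFF = 1_000_000
    new_development = known_probability_value(PAYOFF, 0.60)   # 600,000
    debt_retirement = ambiguous_value(PAYOFF, (0.40, 0.90))   # 400,000

    # The debt retirement option loses, even though the midpoint of its plausible
    # range (0.65) exceeds the well-known probability (0.60).
    print(new_development, debt_retirement)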

Because of the ambiguity effect, resource allocation decisions are likely to be biased against technical debt retirement, and in favor of maintenance or new development.

But there’s more. Most projects, of any kind, encounter trouble from time to time. When that happens, the urge to reallocate organizational resources can be powerful. Troubled projects might receive more resources if they’re viewed as important to the organization. If so, those resources often come from other projects. The ambiguity effect biases these resource reallocation decisions in a way analogous to initial resource allocation decisions, as described above. In other words, because of the ambiguity effect, when projects encounter trouble, debt retirement projects are less likely to be able to retain previously allocated resources than are maintenance or new development projects.

The availability heuristic

The availability heuristic is a method humans use to evaluate the validity or effectiveness of decisions, concepts, methods, or propositions [Tversky 1973]. According to the heuristic, if we recognize the item being evaluated as familiar, or related to something with which we are familiar, we’re more likely to regard it as valid or workable. And when making comparisons between two alternative decisions, concepts, methods, or propositions, we’re likely to assess more favorably the decision, concept, method, or proposition with which we’re more familiar, all other things being equal.

In organizations where decision-makers have more experience evaluating maintenance or development project proposals than they have with technical debt retirement proposals, the availability heuristic acts to reduce the relative assessed favorability of technical debt retirement proposals. It does this in three ways.

First, in most organizations, technical debt retirement projects are less familiar to decision-makers than maintenance or development projects are. On that ground alone, technical debt retirement proposals are at a disadvantage.

But the second effect of the availability heuristic is more important, because the heuristic also extends to the consequences of the decision, concept, method, or proposition under consideration. To grasp the value of a maintenance or development project, one must understand how it will affect the users of the assets being developed or maintained. Likewise, to grasp the value of a technical debt retirement project, one must understand how the presence of the technical debt hampers the enterprise in its attempts to achieve its objectives. One must also understand how retiring the technical debt might confer advantages in future engineering efforts. Usually, understanding the consequences of maintenance or development projects is more “available” to decision-makers than understanding the consequences of technical debt retirement projects. Even more dramatic is the difference between understanding the consequences of not funding a maintenance or development project and understanding the consequences of not funding a technical debt retirement project.

Finally, much of the benefit of a technical debt retirement project is indirect. That is, although there is some direct benefit in terms of the assets from which the debt has been retired, the most dramatic benefits are manifested in projects that follow the debt retirement project, and which depend on the assets that have been relieved of debt. Sometimes, those follow-on projects are known at the time decision-makers are considering funding the debt retirement project. Sometimes those follow-on projects have yet to be specified or even recognized. In either case, they are less “available” to decision-makers because those follow-on projects are indirect beneficiaries.

These three effects of the availability heuristic cause decisions about resource allocations to tend to favor maintenance or development projects over debt retirement projects.

Mitigating the risks of these three cognitive biases

Over time, as everyone becomes more familiar with technical debt retirement projects, these effects may wane somewhat. But waiting for that to happen isn’t exactly what one might call risk mitigation. For one thing, familiarity grows only if one is motivated and pays attention. As busy as decision-makers in modern organizations are, depending on them to actively enhance their own familiarity with technical debt retirement projects is probably not the safest course.

An effective program of actively mitigating the risks of these three cognitive biases probably should focus on four areas.

Familiarity

Do what you can to increase decision-maker familiarity with the concept of technical debt, and with the consequences of carrying existing technical debt. Conventional presentation-based training will help, but interactive, experiential training is far more effective. Participants must actually experience the consequences of technical debt in a well-designed and professionally facilitated simulation of a problem-solving task. A faithful simulation should include estimation, changing and ambiguous requirements, and team composition volatility.

Retrospectives

Retrospectives (also known as after-action reviews, post mortems, debriefings, or lessons-learned sessions) are meetings convened to review processes that just completed pieces of work [Kerth 2001]. Typically, attendance is restricted to the project team members. To maintain psychological safety and to encourage truth telling, attendance by enterprise decision-makers is not recommended, unless the organizational culture includes appropriate safeguards. In any case, a section of the retrospective dedicated to investigating the causes and consequences of technical debt in the context of the current project can ensure capture of relevant knowledge and experience.

Mathematical modeling practice

Mathematical modeling is one path to creating a more objective foundation for decisions. It’s essential for improving estimation quality. Also helpful are high quality effort data and metrics data related to the formation and lifetime of technical debt. Reviews of estimates and projections during retrospectives can help improve their quality over time.
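
For example, a retrospective might routinely compute a simple estimation-quality metric such as the mean magnitude of relative error (MMRE) over recently completed work items. A minimal sketch, with invented data:

    # Effort in person-days: what was estimated vs. what was actually spent.
    estimates = [120, 300, 80, 450]
    actuals = [150, 310, 130, 600]

    relative_errors = [abs(actual - estimate) / actual
                       for estimate, actual in zip(estimates, actuals)]
    mmre = sum(relative_errors) / len(relative_errors)
    print(f"Mean magnitude of relative error: {mmre:.0%}")   # about 22% for this data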

Metrics development

Determining the effects of risk mitigation failures provides important guidance for corrective action. Developing metrics that reveal these failures is therefore essential to managing cognitive bias risk. I’ll be suggesting some valuable metrics in a future post.

Last words

These three cognitive biases are by no means the only cognitive biases that can affect the formation or persistence of technical debt. Of the more than 200 identified cognitive biases, those most likely to be relevant are those that affect decision-making. Watch this space for links to posts about additional cognitive biases and their effects on technical debt formation or persistence.


References

[Ellsberg 1961] Daniel Ellsberg. “Risk, ambiguity, and the Savage axioms,” The Quarterly Journal of Economics 75:4, 643-669, 1961.

[Kahneman 1984] Daniel Kahneman, Amos Tversky, and Michael S. Pallak. “Choices, values, and frames,” American Psychologist 39:4, 341-350, 1984.

[Kerth 2001] Norman L. Kerth. Project Retrospectives: A Handbook for Team Reviews. New York: Dorset House, 2001.

[Tversky 1973] Amos Tversky and Daniel Kahneman. “Availability: A heuristic for judging frequency and probability,” Cognitive Psychology 5:2, 207-232, 1973.


Confirmation bias and technical debt

Confirmation bias is a cognitive bias [Kahneman 2011]. It’s the human tendency to favor and seek only information that confirms our preconceptions, or to avoid information that disconfirms them. For example, the homogeneity of cable news channel audiences, and the alignment between preconceptions of the audience and the slant of the newscast for that channel, are results of confirmation bias.

Computer-generated image of the third stage ignition, sending the Mars Climate Orbiter (MCO) to Mars in December, 1998. The spacecraft eventually broke up in the Martian atmosphere as a result of what is now often called the “metric mix-up.” The team at Lockheed Martin that constructed the spacecraft and wrote its software used Imperial units for computing thrust data. But the team at JPL that was responsible for flying the spacecraft was using metric units. The mix-up was discovered after the loss of the spacecraft by the investigation panel established by NASA.
One of the many operational changes deployed as a result of this loss was increased use of reviews and inspections. While we do not know why reviews and inspections weren’t as thorough before the loss of the MCO as they are now, one possibility is the effects of confirmation bias in assessing the need for reviews and inspections. Image courtesy Engineering Multimedia, Inc., and U.S. National Aeronautics and Space Administration
Confirmation bias causes technical debt by biasing the information on which decision-makers base decisions involving technical debt. Most people in these roles have objectives they regard as having priority over eliminating technical debt. This causes them to bias their searches for information about technical debt in favor of information that directly supports achieving those primary objectives. They tend, for example, to discount warnings of technical debt issues, to underfund technical debt assessments, and to set aside advice about avoiding debt formation in current projects.

For example, anyone determined to find reasons to be skeptical of the need to manage technical debt need only execute a few Google searches. Searching for “there is no such thing as technical debt” yields about 300,000 results at this writing; “technical debt is a fraud” about 1.6 million; and “technical debt is a bad metaphor” about 3.7 million. Compare this to “technical debt,” which yields only 1.6 million. A skeptic wouldn’t even have to read any of these pages to come away convinced that technical debt is at best a controversial concept. This is, of course, specious reasoning, if it’s reasoning at all. But it does serve to illustrate the potential for confirmation bias to prevent or limit rational management of technical debt.

Detecting confirmation bias in oneself is extraordinarily difficult because confirmation bias causes us to (a) decide not to search for data that would reveal confirmation bias; and (b) if data somehow becomes available, to disregard or to seek alternative explanations for it if that data tends to confirm the existence of confirmation bias. Moreover, another cognitive bias known as the bias blind spot causes individuals to see the existence and effects of cognitive biases much more in others than in themselves [Pronin 2002].

Still, the enterprise would benefit from monitoring the possible existence and effects of confirmation bias in decisions with respect to allocating resources to managing technical debt. Whenever decisions are made, we must manage confirmation bias risk.

References

[Ellsberg 1961] Daniel Ellsberg. “Risk, ambiguity, and the Savage axioms,” The Quarterly Journal of Economics 75:4, 643-669, 1961.

[Kahneman 1984] Daniel Kahneman, Amos Tversky, and Michael S. Pallak. “Choices, values, and frames,” American Psychologist 39:4, 341-350, 1984.

[Kahneman 2011] Daniel Kahneman. Thinking, Fast and Slow. New York: Macmillan, 2011.

[Kerth 2001] Norman L. Kerth. Project Retrospectives: A Handbook for Team Reviews. New York: Dorset House, 2001.

[Pronin 2002] Emily Pronin, Daniel Y. Lin, and Lee Ross. “The bias blind spot: Perceptions of bias in self versus others,” Personality and Social Psychology Bulletin 28:3, 369-381, 2002.

[Tversky 1973] Amos Tversky and Daniel Kahneman. “Availability: A heuristic for judging frequency and probability,” Cognitive Psychology 5:2, 207-232, 1973.


Unrealistic optimism: the planning fallacy and the n-person prisoner’s dilemma


In a 1977 report, Daniel Kahneman and Amos Tversky identify one particular cognitive bias [Kahneman 2011], the planning fallacy, which afflicts planners [Kahneman 1977] [Kahneman 1979]. They discuss two types of information planners use. Singular information is specific to the case at hand; distributional information is drawn from similar past efforts. The planning fallacy is the tendency of planners to pay too little attention to distributional evidence and too much to singular evidence, even when the singular evidence is scanty or questionable. Failing to harvest lessons from the distributional evidence, which is inherently more diverse than singular evidence, the planners tend to underestimate cost and schedule. So for any given project, there’s an inherent tendency in human behavior to promise lower costs, faster delivery, and greater benefits than anyone can reasonably expect.
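
The corrective procedure Kahneman and Tversky propose amounts to tempering the singular estimate with distributional evidence. Here’s a minimal sketch of that idea in Python; the reference-class data and the choice of the median uplift are hypothetical, not a prescription from their report.

    import statistics

    # Distributional evidence: ratios of actual cost to estimated cost
    # for past projects judged similar to this one.
    past_overrun_ratios = [1.15, 1.40, 1.05, 1.60, 1.25, 1.35, 1.50]

    singular_estimate = 900_000   # this project's bottom-up estimate

    typical_overrun = statistics.median(past_overrun_ratios)
    adjusted_estimate = singular_estimate * typical_overrun

    print(f"Singular estimate:   {singular_estimate:,.0f}")
    print(f"Median past overrun: {typical_overrun:.2f}x")
    print(f"Adjusted estimate:   {adjusted_estimate:,.0f}")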

Aerial view of Hoover Dam, September 2017. Under construction from 1931 to 1936, the dam was built for $48.8M ($639M in 2016 dollars) under a fixed-price contract. It was completed two years ahead of schedule. Apparently the planning fallacy doesn’t act inevitably. 112 men died in incidents associated with its construction. 42 more died of a condition diagnosed as pneumonia, but which is now thought to have been carbon monoxide poisoning due to poor ventilation in the dam’s diversion tunnels during construction. There’s little doubt that unrealistic optimism affects not only projections of budget and schedule, but also projections of risks, including deaths. Photo (cc) Mariordo (Mario Roberto Durán Ortiz), courtesy Wikimedia Commons.
But the problem is exacerbated by a dynamic described by Boehm et al. [Boehm 2016], who observe that because organizational resources are finite, project sponsors compete with each other for resources. They’re compelled by this competition to be unrealistically optimistic about their objectives, costs, and schedules. Although Boehm et al. call this mechanism the “Conspiracy of Optimism,” possibly facetiously, it isn’t actually a conspiracy. Rather, it’s a variant of the N-Person Prisoner’s Dilemma [Hamburger 1973].

Unrealistic optimism creates budget shortfalls and schedule pressures, both of which contribute to conditions favorable for creating non-strategic technical debt. And the kinds of technical debt produced by this mechanism, or by any mechanism associated with schedule or budget pressure, tend to be subtle: they’re the types least likely to become evident in the short term. For example, technical debt that makes a particular kind of enhancement more difficult in the next project, such as an awkward architectural choice, is more likely to appear than technical debt in the form of a copy of some code that should have been replaced by a utility routine. Copies of code are more easily discovered and more likely to be retired in the short term, if not in the current project. Awkward architecture is harder to identify, and is therefore more likely to survive into the intermediate or long term.

In other words, the forms of technical debt most likely to be generated are those that are the most benign in the short term, and which are therefore more likely to escape notice. If noticed, they’re more likely to be forgotten unless carefully documented, an action that’s unlikely to be taken under conditions of schedule and budget pressure. In this way, the non-strategic technical debt created as a result of unrealistic optimism is more likely than most technical debt to eventually become legacy technical debt.

Policymakers can assist in addressing the consequences of unrealistic optimism by advocating for education about it. They can also advocate for changes in incentive structures and performance management systems to include organizational standards with respect to realism in promised benefits, costs, and schedules.

References

[Boehm 2016] Barry Boehm, Celia Chen, Kamonphop Srisopha, Reem Alfayez, and Lin Shiy. “Avoiding Non-Technical Sources of Software Maintenance Technical Debt,” USC Course notes, Fall 2016.

[Ellsberg 1961] Daniel Ellsberg. “Risk, ambiguity, and the Savage axioms,” The Quarterly Journal of Economics 75:4, 643-669, 1961.

[Hamburger 1973] H. Hamburger. “N-person Prisoner’s Dilemma,” Journal of Mathematical Sociology 3, 27-48, 1973. doi:10.1080/0022250X.1973.9989822

[Kahneman 1977] Daniel Kahneman and Amos Tversky. “Intuitive Prediction: Biases and Corrective Procedures,” Technical Report PTR-1042-7746, Defense Advanced Research Projects Agency, June 1977.

[Kahneman 1979] Daniel Kahneman and Amos Tversky. “Intuitive Prediction: Biases and Corrective Procedures,” Management Science 12, 313-327, 1979.

[Kahneman 1984] Daniel Kahneman, Amos Tversky, and Michael S. Pallak. “Choices, values, and frames,” American Psychologist 39:4, 341-350, 1984.

[Kahneman 2011] Daniel Kahneman. Thinking, Fast and Slow. New York: Macmillan, 2011.

[Kerth 2001] Norman L. Kerth. Project Retrospectives: A Handbook for Team Reviews. New York: Dorset House, 2001.

[Pronin 2002] Emily Pronin, Daniel Y. Lin, and Lee Ross. “The bias blind spot: Perceptions of bias in self versus others,” Personality and Social Psychology Bulletin 28:3, 369-381, 2002.

[Tversky 1973] Amos Tversky and Daniel Kahneman. “Availability: A heuristic for judging frequency and probability,” Cognitive Psychology 5:2, 207-232, 1973.


Feature bias: unbalanced concern for capability vs. sustainability

Enterprise decision-makers affected by feature bias harbor distorted views of the importance of new capability development relative to technical debt management. This tendency likely arises because customers are relatively sensitive to features and relatively unaware of sustainability. Whatever the cause, customers are more attracted to features than to indicators of sound technical debt management and other product sustainability practices, and that asymmetry puts decision-makers at risk of feature bias: unbalanced concern for capability vs. sustainability.

Alaska crude oil production, 1990-2015. This chart [Yen 2015] displays Alaska crude oil produced and shipped through the Trans Alaska Pipeline System (TAPS) from 1990 to 2015. Production dropped by 75% in that period, and the decline is projected to continue. In January 2018, in response to pressure from Alaskan government officials and the energy industry, the U.S. Congress passed legislation that opened the Arctic National Wildlife Refuge to oil exploration, despite the threat that exploration poses to ecological sustainability. If we regard TAPS as a feature of the U.S. energy production system, we can view its excess capacity as a source of feature bias, creating pressure on decision-makers to add features to the U.S. energy system instead of acting to enhance the sustainability of Alaskan and global environmental systems [Wight 2017].
Changes in cost accounting could mitigate some of this feature bias by projecting total MICs (the metaphorical interest charges on technical debt) more accurately, based on historical data and sound estimation. I’ll explore possible accounting changes later in this post, and in future posts; meanwhile, let’s explore the causes and consequences of the distorted perspective I call feature bias.

For products or services offered for sale outside the enterprise, the sales and marketing functions of the enterprise represent the voice of the customer [Gaskin 1991]. But customers are generally unaware of product or service attributes that determine maintainability, extensibility, or cybersecurity — all factors that affect the MICs for technical debt. On the other hand, customers are acutely aware of capabilities — or missing or defective capabilities — in products or services. Customer comments and requests, therefore, are unbalanced in favor of capabilities as compared to maintainability, extensibility, cybersecurity, and other attributes related to sustainability. The sales and marketing functions tend to accurately transmit this unbalanced perspective to decision-makers and technologists.

An analogous mechanism prevails with respect to infrastructure and the internal customers of that infrastructure. Internal customers tend to be more concerned with capabilities — and missing capabilities — than they are with sustainability of the processes and systems that deliver those capabilities. Thus, pressure from internal customers on the developers and maintainers of infrastructure elements tends to emphasize capability at the expense of sustainability. The result of this imbalance is pressure to allocate excessive resources to capability enhancement, compared to activities that improve maintainability, extensibility, or cybersecurity, and which therefore would aid in controlling or reducing technical debt and its MICs.

Nor is this the only consequence of feature bias. It provides unrelenting pressure for increasing numbers of features, despite the threats to architectural coherence and overall usability that such “featuritis” or “featurism” present. Featurism leads, ultimately, to feature bloat, and to difficulties for users, who can’t find what they need among the clutter of features that are often too numerous to document. For example, in Microsoft Word, many users are unaware that Shift+F5 moves the insertion point and cursor to the point in the active document that was last edited, even if the document has just been freshly loaded into Word. Useful, but obscure.

Feature bias, it must be noted, is subject to biases itself. The existing array of features appeals to a certain subset of all potential customers. Because it is that subset that’s most likely to request repair of existing features, or to suggest additional features, the pressure for features tends to be biased in favor of the needs of the most vociferous segments of the existing user base. That is, systems experience pressure to evolve to better meet the needs of existing users, in preference to meeting the needs of other stakeholders or potential stakeholders who might be even more important to the enterprise than are the existing users. This bias in feature bias presents another risk that can affect decision-makers.

Organizations can take steps to mitigate the risk of feature bias. An example of such a measure might be the use of focus groups to study how education in sustainability issues affects customers’ perspectives relative to feature bias. Educating decision makers about feature bias can also reduce this risk.

At the enterprise scale, awareness of feature bias would be helpful, but awareness alone is unlikely to counter its detrimental effects, which include underfunding of technical debt management efforts. Eliminating the source of feature bias is extraordinarily difficult, because customers and potential customers aren’t subject to enterprise policy. Feature bias and feature bias bias are therefore givens. To mitigate the effects of feature bias, we must adopt policies that compel decision-makers to consider the need to deal with technical debt. One possible corrective action might be improvement of accounting practices for MICs, based, in part, on historical data. For example, since there’s a high probability that any project might produce new technical debt, it might be prudent to fund the retirement of that debt, in the form of reserves, when we fund the project. And if we know that a project has encountered some newly recognized form of technical debt, it might be prudent to reserve resources to retire that debt as soon as possible. Ideas such as these can rationalize resource allocations with respect to technical debt.
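
As a sketch of the reserve idea, a funding decision could set aside a debt-retirement reserve sized from historical data. The probability and fraction below are invented placeholders; an enterprise would substitute figures from its own project history.

    def debt_retirement_reserve(project_budget,
                                p_new_debt=0.8,        # historical share of projects that leave new debt
                                retire_fraction=0.12): # typical retirement cost as a fraction of budget
        """Expected cost of retiring the technical debt a project of this size tends to create."""
        return project_budget * p_new_debt * retire_fraction

    budget = 2_000_000
    print(f"Reserve to fund alongside the project: {debt_retirement_reserve(budget):,.0f}")   # 192,000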

These two examples illustrate what’s necessary if we want to mitigate the effects of feature bias. They also illustrate just how difficult such a task will be.

References

[Boehm 2016] Barry Boehm, Celia Chen, Kamonphop Srisopha, Reem Alfayez, and Lin Shiy. “Avoiding Non-Technical Sources of Software Maintenance Technical Debt,” USC Course notes, Fall 2016.

[Ellsberg 1961] Daniel Ellsberg. “Risk, ambiguity, and the Savage axioms,” The Quarterly Journal of Economics 75:4, 643-669, 1961.

[Gaskin 1991] Steven P. Gaskin, Abbie Griffin, John R. Hauser, Gerald M. Katz, and Robert L. Klein. “Voice of the Customer,” Marketing Science 12:1, 1-27, 1991.

[Hamburger 1973] H. Hamburger. “N-person Prisoner’s Dilemma,” Journal of Mathematical Sociology 3, 27-48, 1973. doi:10.1080/0022250X.1973.9989822

[Kahneman 1977] Daniel Kahneman and Amos Tversky. “Intuitive Prediction: Biases and Corrective Procedures,” Technical Report PTR-1042-7746, Defense Advanced Research Projects Agency, June 1977.

[Kahneman 1979] Daniel Kahneman and Amos Tversky. “Intuitive Prediction: Biases and Corrective Procedures,” Management Science 12, 313-327, 1979.

[Kahneman 1984] Daniel Kahneman, Amos Tversky, and Michael S. Pallak. “Choices, values, and frames,” American Psychologist 39:4, 341-350, 1984.

[Kahneman 2011] Daniel Kahneman. Thinking, Fast and Slow. New York: Macmillan, 2011.

[Kerth 2001] Norman L. Kerth. Project Retrospectives: A Handbook for Team Reviews. New York: Dorset House, 2001.

[Pronin 2002] Emily Pronin, Daniel Y. Lin, and Lee Ross. “The bias blind spot: Perceptions of bias in self versus others,” Personality and Social Psychology Bulletin 28:3, 369-381, 2002.

[Tversky 1973] Amos Tversky and Daniel Kahneman. “Availability: A heuristic for judging frequency and probability,” Cognitive Psychology 5:2, 207-232, 1973.

[Wight 2017] Philip Wight. “How the Alaska Pipeline Is Fueling the Push to Drill in the Arctic Refuge,” YaleE360, Yale School of Forestry & Environmental Studies, November 16, 2017.

[Yen 2015] Terry Yen and Laura Singer. “Oil exploration in the U.S. Arctic continues despite current price environment,” Today in Energy blog, U.S. Energy Information Administration, June 12, 2015.


The Dunning-Kruger effect can lead to technical debt


The Dunning-Kruger effect [Kruger 1999] can lead to formation or persistence of technical debt in two ways. First, it can cause technologists or their managers to overestimate their ability to maintain the resource focus needed for retiring technical debt in a timely fashion. Second, it can cause senior managers to be reluctant to accede to resource requests of technologists and their managers in support of technical debt management programs.

Cropped detail from Charles Robert Darwin, a painting by John Collier (1850-1934), given to the National Portrait Gallery, London, in 1896. Darwin writes, in The Descent of Man (1871): “… ignorance more frequently begets confidence than does knowledge …” which is the essence of the Dunning-Kruger effect. Image courtesy WikiQuote.

Kruger and Dunning conducted experiments that yielded results consistent with the following four principles (paraphrasing):

  1. Incompetent individuals, compared to their more competent peers, tend to dramatically overestimate their own ability and performance
  2. Incompetent individuals, compared to their more competent peers, tend to be less able to gain insight into their own true levels of performance
  3. Incompetent individuals can gain insight about their shortcomings, but, paradoxically, this comes about by gaining competence
  4. Incompetent individuals, compared to their more competent peers, are less able to recognize competence when they see it

The first three principles lead to distorted assessments of one’s own capabilities. The fourth principle leads to distorted assessments of the capabilities of others.

As an example of distorted self-assessment, consider a team or its managers who must undertake retirement of some types of technical debt in the course of enhancing or repairing an asset. Such a task plan seems at first to offer efficiencies, because the engineers can readily make both kinds of changes at one go. Metaphorically, if we must go to the store for milk, we can also pick up bread while we are there, rather than making two trips.

However, modifying an existing complex technological asset is unlike shopping for bread and milk. The two kinds of modifications — debt retirement and asset enhancement or repair — might seem at first to be separable, and often they are. But if they are not separable, and the two tasks are undertaken together, testing and debugging can become extremely complicated, because of interactions between defects in the two kinds of modifications. Under some circumstances, an experienced team and its managers might be more likely to anticipate these difficulties. An inexperienced team and its managers might be more likely to underestimate the difficulties, as a consequence of the Dunning-Kruger effect. Budget and schedule overruns are possible consequences of underestimating the complexity of the problem.

As an example of the fourth principle above, the Dunning-Kruger effect can cause some decision-makers to discount the warnings and resource requests of engineers and their managers. Decision-makers who are unsophisticated in matters related to technical debt must nevertheless assess the validity of the requests for resources. In making these assessments, these decision-makers may be disadvantaged for a number of reasons, including the following:

  • Decision-makers might hold any of a number of mistaken beliefs about technical debt. For example, many believe that the main causes of technical debt are poor decisions by engineering managers. And others believe that technical debt is the result of slovenly work habits of engineers. Those who hold such beliefs might be reluctant to allocate yet more resources to engineers to address the problem of technical debt.
  • If the advocates of resources for technical debt management are not fully informed about the strategic direction of the enterprise, their requests might be inconsistent with enterprise strategy. As a result of a cognitive bias [Kahneman 2011] known as the halo effect [Thorndike 1920], decision-makers might tend to discount valid portions of the technologists’ proposals, because some portions of those proposals don’t take enterprise strategy into account properly.
  • Decision-makers might be affected by unrealistic optimism [Weinstein 1996], also known as optimism bias. It’s a cognitive bias that can cause them to discount the sometimes-vivid warnings of technologists about the unfavorable consequences of failing to provide technical debt management resources.

Investigations of the degree of correlation between burdens of technical debt and the incidence of rejected or severely curtailed proposals for resources to support technical debt management programs could determine the significance of the Dunning-Kruger effect relative to the problem of technical debt. Also rewarding would be a survey of the nearly 200 known cognitive biases, to determine which of them might be most likely to affect decision-making relative to technical debt, and how best to mitigate the risks they present.
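
A minimal sketch of the correlation study suggested above, with invented survey data (the statistics.correlation function requires Python 3.10 or later):

    import statistics

    # Hypothetical data for several business units: an assessed technical debt burden
    # (arbitrary index) and the fraction of debt-management proposals rejected or
    # severely curtailed.
    debt_burden = [12, 30, 45, 60, 75, 90]
    rejected_fraction = [0.10, 0.25, 0.20, 0.45, 0.55, 0.60]

    r = statistics.correlation(debt_burden, rejected_fraction)   # Pearson's r
    print(f"Correlation between debt burden and proposal rejection: {r:.2f}")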

References

[Boehm 2016] Barry Boehm, Celia Chen, Kamonphop Srisopha, Reem Alfayez, and Lin Shiy. “Avoiding Non-Technical Sources of Software Maintenance Technical Debt,” USC Course notes, Fall 2016.

[Ellsberg 1961] Daniel Ellsberg. “Risk, ambiguity, and the Savage axioms,” The Quarterly Journal of Economics 75:4, 643-669, 1961.

[Gaskin 1991] Steven P. Gaskin, Abbie Griffin, John R. Hauser, Gerald M. Katz, and Robert L. Klein. “Voice of the Customer,” Marketing Science 12:1, 1-27, 1991.

[Hamburger 1973] H. Hamburger. “N-person Prisoner’s Dilemma,” Journal of Mathematical Sociology 3, 27-48, 1973. doi:10.1080/0022250X.1973.9989822

[Kahneman 1977] Daniel Kahneman and Amos Tversky. “Intuitive Prediction: Biases and Corrective Procedures,” Technical Report PTR-1042-7746, Defense Advanced Research Projects Agency, June 1977.

[Kahneman 1979] Daniel Kahneman and Amos Tversky. “Intuitive Prediction: Biases and Corrective Procedures,” Management Science 12, 313-327, 1979.

[Kahneman 1984] Daniel Kahneman, Amos Tversky, and Michael S. Pallak. “Choices, values, and frames,” American Psychologist 39:4, 341-350, 1984.

[Kahneman 2011] Daniel Kahneman. Thinking, Fast and Slow. New York: Macmillan, 2011.

[Kerth 2001] Norman L. Kerth. Project Retrospectives: A Handbook for Team Reviews. New York: Dorset House, 2001.

[Kruger 1999] J. Kruger and D. Dunning. “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments,” Journal of Personality and Social Psychology 77:6, 1121-1134, 1999.

[Pronin 2002] Emily Pronin, Daniel Y. Lin, and Lee Ross. “The bias blind spot: Perceptions of bias in self versus others,” Personality and Social Psychology Bulletin 28:3, 369-381, 2002.

[Thorndike 1920] E.L. Thorndike. “A constant error in psychological ratings,” Journal of Applied Psychology 4:1, 25-29, 1920. doi:10.1037/h0071663

[Tversky 1973] Amos Tversky and Daniel Kahneman. “Availability: A heuristic for judging frequency and probability,” Cognitive Psychology 5:2, 207-232, 1973.

[Weinstein 1996] Neil D. Weinstein and William M. Klein. “Unrealistic Optimism: Present and Future,” Journal of Social and Clinical Psychology 15:1, 1-8, 1996. doi:10.1521/jscp.1996.15.1.1

[Wight 2017] Philip Wight. “How the Alaska Pipeline Is Fueling the Push to Drill in the Arctic Refuge,” YaleE360, Yale School of Forestry & Environmental Studies, November 16, 2017.

[Yen 2015] Terry Yen and Laura Singer. “Oil exploration in the U.S. Arctic continues despite current price environment,” Today in Energy blog, U.S. Energy Information Administration, June 12, 2015.
