Synergy between the reification error and confirmation bias

Last updated on December 11th, 2018 at 10:34 am

In deciding whether to undertake technical debt retirement projects, organizations risk making inappropriate decisions because of a synergy between the reification error and confirmation bias. Together, these two errors of thought make it difficult to commit appropriate levels of resources. And when resources are committed, there is a tendency to underestimate costs, which elevates the incidence of failure among technical debt retirement projects.

The reification error and confirmation bias

As explained elsewhere in this blog, the reification error is an error of reasoning in which we treat an abstraction such as technical debt as if it were a real, concrete, physical thing, which, of course, it is not. (See “Metrics for technical debt management: the basics”)

And confirmation bias is a cognitive bias that causes us to favor and seek only information that confirms our preconceptions, or to avoid information that disconfirms them. (See “Confirmation bias and technical debt”)

How the reification error affects management

The reification error might be responsible, in part, for a widely used management practice that often appears in the exploratory stages of undertaking projects. Let’s start with an illustration from the physical world.

A feedback loop that now provides budgetary control in most organizations.

In the physical world, when we want cherries, we go to the produce section of the market and check the price per pound or kilo. Then we decide how many pounds or kilos we want. If the price is high, we might decide that fewer cherries will suffice. If the price is low, we might purchase more cherries. We have in mind a total cost target, and we adjust the weight of the cherries to meet the target, given the price. In the physical world, we can often adjust what we purchase to match our ability to pay for it.

Retiring technical debt doesn’t work like that, in part because technical debt is an abstraction. But we try anyway; here’s how it goes. Management decides to retire a particular class of technical debt, and asks an engineer to work up an estimate of the cost. If Management has a target in mind, sometimes they reveal it; sometimes not. The estimate comes back as Total ± Uncertainty. Management decides that’s too high—or that the Uncertainty is too great—and asks the engineer to find a way to do it for less, with less Uncertainty, perhaps by being clever or by doing less.

Management—the “customer” in this scenario—makes this request, in part, based on the belief that it’s possible to adjust the work to meet a (possibly unstated) target, in analogy to buying cherries in the produce department. That thinking is an example of the reification error. In this dynamic, we rarely take into account the fact that retiring technical debt isn’t exactly like buying cherries.

How confirmation bias affects engineering estimates

Now back to the interaction between Management and the estimator. The engineer now suspects that Management does have a target in mind. Some engineers ask what the target is; some don’t. In any case, the engineer comes back with a lower estimate, which might still be too high. This process repeats until Management either decides against retiring the debt, or accepts the lowest Total ± Uncertainty, hoping for a final cost that isn’t too high.

In adjusting their estimates, engineers have a conflict of interest that can compromise their objectivity through the action of confirmation bias. Engineers are usually highly motivated to gain Management approval of a technical debt retirement project, because the debt in question depresses engineering productivity and induces frustration. And because engineers typically sense that approval is contingent on producing an estimate that’s low enough, they form a preconception: the work can be done for less. They then have an incentive to convince themselves that the budget and schedule adjustments they make are reasonable. Under the influence of confirmation bias, they seek justifications for the belief that lowering costs and tightening schedules is reasonable, and they avoid seeking evidence that their adjustments might not be feasible. That’s the confirmation bias in action.

How synergy between the reification error and confirmation bias comes about

So, because of the reification error, Management tends to believe that the work needed to retire technical debt of a particular kind is more adjustable than it actually is. And because of confirmation bias, engineers tend to believe that they can do the work for a cost and within a schedule that Management is willing to permit. Too often, the synergy between the two errors of thinking provides a foundation for disaster.

Why this synergy creates conditions for disaster in technical debt retirement projects

Management usually interprets estimates as commitments. Engineers do not. Management also usually forgets or ignores the upside Uncertainty. So typically, when Management finally accepts an estimate, the engineering team finds that it has made a commitment to deliver the work for the cost Total, with zero upside Uncertainty, even though few engineering teams are ever explicitly asked to commit to that. An analogous problem occurs with schedule.
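To see what dropping the upside Uncertainty costs, consider a minimal sketch. It assumes, purely for illustration, a symmetric triangular cost distribution centered on Total with half-width Uncertainty; real debt retirement projects often have longer upside tails, which makes matters worse.

```python
import random

def overrun_probability(total, uncertainty, trials=100_000):
    """Estimate how often actual cost exceeds a "commitment" of Total with zero
    upside Uncertainty, assuming a symmetric triangular cost distribution on
    [total - uncertainty, total + uncertainty] with its mode at total."""
    overruns = 0
    for _ in range(trials):
        cost = random.triangular(total - uncertainty, total + uncertainty, total)
        if cost > total:
            overruns += 1
    return overruns / trials

# Hypothetical estimate: 400 ± 150 (in, say, thousands of dollars)
print(overrun_probability(400, 150))  # about 0.5: the "commitment" is missed roughly half the time
```

Under any symmetric distribution centered on Total, treating Total as the commitment builds in roughly even odds of an “overrun” before any of the other risks discussed below come into play.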

By ignoring the Uncertainty, Management (the buyer) transfers the uncertainty risk to the project team. That strategy might work to some extent with conventional development or maintenance projects, where we can adjust scope and risk before the work begins. But for technical debt retirement projects, this practice creates problems for two reasons.

Adjusting the scope of debt retirement projects is difficult

First, with technical debt retirement we’re less able to adjust scope. To retire a class of technical debt, we must retire it in toto. If we retire only a portion of a class of technical debt, we leave the asset in a mixed state that can actually increase MICs. So it’s usually best to retire the entirety of any class of technical debt, leaving the asset in a uniform state.

Debt retirement efforts are notoriously unpredictable

Second, the work involved in retiring a particular class of technical debt is more difficult to predict than is the work involved in more conventional projects. (See “Useful projections of MPrin might not be attainable”) Often, we must work with older assets, or older portions of younger assets. The people who built them aren’t always available, and documentation can be sparse or unreliable. Moreover, it’s notoriously difficult to predict with accuracy when affected assets must be temporarily withdrawn from production—and for how long—to support the technical debt retirement effort. Revenue stream interruptions, which can be significant portions of total costs, can be difficult to schedule or predict. Thus, technical debt retirement projects tend to be riskier than other kinds of projects. They have wider uncertainty bands. Ignoring the Uncertainty, or trying to transfer responsibility for it to the project team, is foolhardy.

A strategy for reducing the effects of this synergy

To intervene in this dynamic, we must find a way to limit how the consequences of the reification error and the consequences of confirmation bias interact. That limits the ability of one phenomenon to reinforce the other. The task is well suited to Donella Meadows’ concept of leverage points [Meadows 1999], which I discussed in an earlier post, “Leverage points for technical debt management.”

In that post, I summarized Meadows’ idea that to alter the behavior of a complex system, one can intervene at one or more of 12 categories of leverage points. These are elements in the system that govern the behavior of the people and institutions that make up the system. There, I sketched the use of Leverage Point #9, Delays, to alter the levels of technical debt in an enterprise.

In this post, I’ll sketch the use of interventions at Leverage Point #8, which Meadows calls, “The strength of negative feedback loops, relative to the impacts they are trying to correct against.”

Our strategy is to adjust the feedback loops that now govern these resource allocation decisions. To see how, let’s first examine the loop that’s already in place.

A feedback loop that now provides budgetary control in most organizations

One feedback loop at issue in this case, illustrated above, provides budgetary control. It influences managers who might otherwise overrun their budgets by triggering some sort of organizational intervention when they do. And it expands the portfolios of managers who handle their budgets responsibly. Presumably, that’s why managers press estimators to find approaches that cost less. The feedback loop to which managers are exposed causes them to establish another feedback loop, involving the engineer/estimator and later the engineering team, to hold down estimates, and later actual expenditures.

We can use a diagram of effects [Weinberg 1992] to illustrate the feedback mechanism commonly used to control the performance of managers who are responsible for portfolios of project budgets. In the diagram, the oval blobs represent quantities indicated by their respective captions. Each of these quantities is assumed to be measurable, though their precise values and the way we measure them are unimportant for our rather qualitative argument.

Notice that arrows connect the blobs. The arrows represent the effect of changes in the value represented by one blob on the value represented by another. The blob at the base of the arrow is the effector quantity. The blob at the point of the arrow is the affected quantity. Thus, the arrow running from the blob labeled “Actual Spend” to the blob labeled “Overspend” expresses the idea that a positive (or negative) change in the amount of actual spending on projects causes a positive (or negative) change in Overspend. When a change in the effector quantity causes a like-signed change in the affected quantity, we say that their relationship is covariant.

Because increases in Budget Authority tend to decrease Overspend, all other things being equal, the relationship between Budget Authority and Overspend is contravariant. We represent a contravariant relationship between the effector quantity and the affected quantity as an arrow with a filled circle on it.

Finally, notice that the arrow from Overspend (effector) to Promotion Probability (affected) has a filled Delta on it. This represents the idea that as Overspend increases, it negatively affects the probability that the manager will be promoted at some point in the future. The Delta indicates a delayed effect; that the Delta is filled indicates a contravariant relationship. (An unfilled Delta would indicate a delayed covariant effect.)

This diagram, which contains a loop connecting Budget Authority, Overspend, and Promotion Probability, has the potential to “run away.” That is, as we go around the loop, we find self-reinforcement, because the loop has an even number of contravariant relationships. It works as follows:

As Overspend increases, after a delay, the Probability of Promotion decreases. This causes reductions in Budget Authority because, presumably, the organization has reduced faith in the manager’s performance. Reductions in Budget Authority make Overspend more likely, and round and round we go.

Similarly:

As Overspend decreases, after a delay, the Probability of Promotion increases. This causes increases in Budget Authority because, presumably, the organization has increased faith in the manager’s performance. Increases in Budget Authority make Overspend less likely, and round and round we go.
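The qualitative behavior of the loop can be sketched in a few lines of code. The update rules and coefficients below are invented solely to show how the even number of contravariant links, plus the delay, produces self-reinforcement; this is not a model of any real organization.

```python
# Toy simulation of the Budget Authority -> Overspend -> Promotion Probability loop.
# All quantities, update rules, and coefficients are illustrative, not empirical.

budget_authority = 100.0
promotion_prob = 0.5
spending_need = 110.0            # what the manager's projects actually cost, held fixed here
overspend_history = [0.0, 0.0]   # the Delta: promotion reacts to Overspend two periods back

for period in range(8):
    overspend = max(0.0, spending_need - budget_authority)      # covariant with actual spend
    overspend_history.append(overspend)

    delayed_overspend = overspend_history[-3]                    # delayed, contravariant link
    promotion_prob = max(0.0, min(1.0, promotion_prob - 0.02 * delayed_overspend))
    budget_authority = 80.0 + 60.0 * promotion_prob              # covariant with promotion prospects

    print(f"period {period}: overspend={overspend:5.1f}  "
          f"promotion_prob={promotion_prob:.2f}  budget_authority={budget_authority:6.1f}")
```

Run in the “down” direction, the loop drives Overspend up and Promotion Probability and Budget Authority down, period after period; the clamping to sensible ranges in the sketch stands in for the “other effects” mentioned next.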

Fortunately, other effects usually intervene when these self-reinforcing phenomena get too large, but that’s beyond the scope of this argument. For now, all we need observe is that managers who manage their budgets effectively tend to rise in the organization; those who don’t, don’t.

The result is that managers seek to limit spending so as to avoid overspending their budget authority. And that’s one reason why they push engineers to produce lower estimates for technical debt retirement projects.

How this feedback loop overlooks important drivers of technical debt formation

To break the connection between the managers’ reification error and the engineers’ confirmation bias, our intervention must cause the managers and the engineers to make calculations differently. We can accomplish this by requiring that they consider more than the mere cost of retiring the class of technical debt under consideration. They must estimate the consequences of not retiring that technical debt, and they must also estimate costs beyond the cost of retiring the debt. In what follows, I’ll use the shorthand TDBCR to mean the class of Technical Debt Being Considered for Retirement.

Specifically, the estimates that are now typically generated for such projects cover only the cost of performing the work required to retire the TDBCR. It’s then left to Management to decide whether, when, and to what extent to commit resources to execute the project. The primary consideration is the effect on the decision-maker’s budget, and the consequences for achieving the goals for which the decision-maker is responsible.

Since the retirement project can provide benefits beyond the manager’s own portfolio, failing to undertake the project can have negative consequences for the enterprise: consequences for which the manager arguably ought to be held accountable, but usually isn’t. That’s the heart of the problem. So let’s look at some examples of considerations that must be taken into account.

Adjustments that would be needed in these feedback loops to gain control of technical debt

A resource allocation decision for a technical debt retirement project involves considerations beyond the cost of retiring the debt. A responsible decision about undertaking such a project is possible only if other kinds of estimates are also generated and made available. Here are some examples (a sketch following the list suggests how they might be combined):

  • The effects of retiring TDBCR on the cost of executing any other development or maintenance efforts contemplated or already underway
  • The effects of retiring TDBCR on revenue and market share for all existing assets that directly produce revenue and which could be affected by retiring TDBCR
  • The revenue that would be generated (and timing thereof) by any new products or services that would be enabled by retiring TDBCR
  • The effects of retiring TDBCR on the cost of executing other technical debt retirement efforts
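As a sketch of what a fuller comparison might look like, the toy calculation below nets the cost of retiring the TDBCR against the four categories of estimates above. Every line item and figure is hypothetical; the point is only that the decision quantity is an enterprise-level net, not the retirement cost alone.

```python
# Illustrative figures only, in thousands of dollars.
retirement_cost = 400           # the Total from the engineering estimate
retirement_uncertainty = 150    # the upside Uncertainty, which should not be dropped

benefits_of_retiring = {
    "reduced cost of other development and maintenance efforts": 220,
    "protected revenue and market share on affected assets":     180,
    "revenue from newly enabled products or services":           250,
    "reduced cost of other debt retirement efforts":              60,
}

total_benefit = sum(benefits_of_retiring.values())
net_best = total_benefit - retirement_cost
net_worst = total_benefit - (retirement_cost + retirement_uncertainty)

print(f"Net enterprise benefit of retiring the TDBCR: {net_worst} to {net_best}")
# The retirement cost (400) looks like a pure expense within one manager's budget;
# the enterprise-level net (here 160 to 310) tells a different story.
```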

And these items might not be related to anything for which the decision-maker is responsible. That’s the core of the problem we now face: the feedback loop we now use to influence the decision-maker excludes considerations that are nonetheless affected by the decision-maker’s choices. Until we install feedback loops that cause the decision-maker to weigh these consequences, or until we make these decisions at levels that encompass them, the effects of the decision-maker’s choices are uncontrolled, and the decisions might not be optimal for the enterprise.

References

[McConnell 2006] Steve McConnell. Software Estimation: Demystifying the Black Art. Microsoft Press, 2006.

[Meadows 1999] Donella H. Meadows. “Leverage Points: Places to Intervene in a System,” Hartland, VT: The Sustainability Institute, 1999. Retrieved June 2, 2018.

[Weinberg 1992] Gerald M. Weinberg. Quality Software Management Volume 1: Systems Thinking. New York: Dorset House, 1992. This volume contains a description of the “diagram of effects.”


The resilience error and technical debt

Last updated on October 4th, 2018 at 02:09 pm

I’ve mentioned the reification error in a previous post (see “Metrics for technical debt management: the basics”), but I haven’t explored its dual, the resilience error. Let me correct that oversight now.

The future USS Zumwalt (DDG 1000) in sea trials, December 2015
The future USS Zumwalt (DDG 1000) is underway for the first time conducting at-sea tests and trials in the Atlantic Ocean Dec. 7, 2015. The first of the Zumwalt class of US Navy guided missile destroyers, it is designed to be stealthy, and to be supported by a minimal crew. After the program experienced explosive cost growth, the class has been downsized from 32 ships to three, and complement increased from 95 to over 140 to reduce capital costs. The three vessels on order now have significantly reduced missions. As one might expect, the causes of these troubles are much debated. But it’s possible that the resilience error plays a role. Before the first of a new class of ships goes to sea, it exists as an abstraction—a collection of concepts, plans, promises, and technologies, tried and untried. Many elements of this collection have never inter-operated with other elements. The first ship represents the first opportunity to see how all the elements work together. Although troubles often appear even before the ship is fully assembled, anticipating all troubles is extraordinarily difficult.

Reification risk is the risk that an error of reasoning known as the reification error might affect decisions—in this case, decisions regarding technical debt. The reification error [Levy 2009] [Gould 1996] (also called the reification fallacy, concretism, or the fallacy of misplaced concreteness [Whitehead 1948]) is an error of reasoning in which we treat an abstraction as if it were a real, concrete, physical thing. Reification is useful in some applications, such as object-oriented programming and design.

But when we reify in the domain of logical reasoning, troubles can arise. For example, we can encounter trouble when we think of “measuring” technical debt. Strictly speaking, we cannot measure technical debt. It isn’t a real, physical thing that can be measured. What we can do is estimate the cost of retiring technical debt, but estimates are only approximations. And in the case of technical debt, the approximations are usually fairly rough—they have wide uncertainty bands. That’s one way for trouble to enter the scene. When we regard the estimate as if it were a measurement, we tend to think of it as more certain than it actually is. Technical debt retirement projects then overrun their budgets and schedules, and chaos reigns.

For example, if we think we’ve measured the MPrin of a class of technical debt, rather than that we’ve estimated it, we’re more likely to believe that one measurement will suffice, and that it will be valid for a long time (or indefinitely). On the other hand, if we think we’ve estimated the MPrin of a class of technical debt, we’re more likely to believe that obtaining a second independent estimate would be wise, and that the estimate we do have might not be valid for long. These are just some of the consequences of the reification error.
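For instance, once we treat the MPrin figure as an estimate with an uncertainty band, a second independent estimate becomes worth obtaining, because combining the two narrows the band. A minimal sketch, under the strong simplifying assumption that the two estimates are independent and their uncertainties behave like standard deviations:

```python
def combine_estimates(e1, u1, e2, u2):
    """Inverse-variance weighting of two independent estimates.
    e1, e2: estimate midpoints; u1, u2: their uncertainties, treated as std devs."""
    w1, w2 = 1.0 / u1**2, 1.0 / u2**2
    combined = (w1 * e1 + w2 * e2) / (w1 + w2)
    combined_uncertainty = (1.0 / (w1 + w2)) ** 0.5
    return combined, combined_uncertainty

# Hypothetical MPrin estimates for one class of debt, in engineer-weeks:
print(combine_estimates(120, 40, 150, 30))  # (139.2, 24.0): narrower than either estimate alone
```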

The resilience error

If the reification error is risky because it entails regarding an abstraction as a real, physical thing, we might postulate the existence of a resilience error that’s risky because it entails regarding an abstraction as more resilient, pliable, adaptable, or extensible than it actually is.

When we commit the resilience error with respect to an abstraction, we adopt the belief—usually without justification, and possibly outside our awareness—that if we change the abstraction without fully investigating the consequences, the familiar properties of the original will carry over, suitably modified, to the new form. Or we assume, incorrectly, that the abstraction will accommodate any changes we make to its environment.

Sometimes we benefit when we modify abstractions; often, though, we encounter unintended and unpleasant consequences. For example, unless we examine our modifications carefully, the implications of a modification might conflict with one or more of the abstraction’s fundamental assumptions.

Examples of the resilience error

Perhaps a (ahem) concrete example will illustrate. Consider the steel hull of an ocean liner. We can manufacture it more cheaply if we can devise a way to use less steel. So one approach is to remove a small portion of the bottom of the hull, say, a circular piece one meter in diameter. We send some people into the ship to do the work, and they return with panicky reports of water coming in. But the ship seems fine, so we reject the reports. Even a day later, all seems well. But by the end of the second day, the trouble is obvious. The ship is sinking.

The problem in our example is that the circular hole in the hull violated a fundamental assumption about how ship hulls work: they work by keeping all water out of the ship. We had extended the idea of hull to make it lighter, but in doing so, we encountered some unintended consequences because our extension violated a fundamental property of hulls.

Now for a less fanciful example.

Consider the fictitious company Alpha Properties LLC, which manages small condominium associations (from 25 to 100 units). Things have been going swimmingly at Alpha Properties, and they’ve decided to expand to handle large condominium associations. Their financial accounting software has worked well, and their employees have become quite expert in its use. Alpha management has heard good reports about the software from other management companies that deal with large client associations. So Alpha decides to use the same software for its larger accounts too. But things don’t work out so well.

The software is fine, but the processes used by the staff are cumbersome and slow. For example, setting up a new association requires too much manual data entry. For a 100-unit association, client setup isn’t a burden, but for a 900-unit association it’s simply unmanageable.

This is a fine example of the resilience error. When we make this error, we fail to appreciate how an abstraction can encapsulate assumptions that cause difficulties when we try to extend it or apply it in a new or altered context. In this example, the abstraction is Alpha’s process for signing up a new client association. The altered context (signing up a large new client) violates an assumption internal to that process.

How the resilience error leads to technical debt

In many cases, the resilience error is at the heart of the causes of technical debt. It works like this. We have an asset that works perfectly well for one set of applications or in one set of contexts. We want to apply that asset in a new way, which might (or might not) require some minor extensions. When we try it, we find that the asset incorporates some assumptions about the application or the context, and one or more of those assumptions are violated by the new application or the new context. Scrambling, we find some quick fixes that can get things working again, but those fixes usually aren’t well designed or easily maintained. The result is a trail of technical debt.

Acquiring a company is like that. Before the acquisition, we think we’ll be able to merge the IT operations to save some operating expenses. When we actually try it, though, merging them proves to be far more expensive than we imagined. Ah, the resilience error.

What makes this situation so difficult is that often we’re unable to anticipate what assumptions we might be about to violate. That’s why we make the resilience error.

Spotting difficulties with adapting to new applications and new contexts isn’t so difficult with physical entities. For example, we can see in advance that a square peg won’t fit into a round hole. But with abstractions, we can’t always see the problems in advance. Piloting, prototypes, games, and simulations can help us avoid some trouble, but not all.

References

[Gould 1996] Stephen Jay Gould. The mismeasure of man (Revised & Expanded edition). W. W. Norton & Company, 1996.

[Levy 2009] David A. Levy. Tools of Critical Thinking: Metathoughts for Psychology (second edition). Long Grove, Illinois: Waveland Press, 2009.

[McConnell 2006] Steve McConnell. Software Estimation: Demystifying the Black Art. Microsoft Press, 2006.

[Meadows 1999] Donella H. Meadows. “Leverage Points: Places to Intervene in a System,” Hartland, VT: The Sustainability Institute, 1999. Retrieved June 2, 2018.

[Weinberg 1992] Gerald M. Weinberg. Quality Software Management Volume 1: Systems Thinking. New York: Dorset House, 1992. This volume contains a description of the “diagram of effects.”

[Whitehead 1948] Alfred North Whitehead. Science and the Modern World. New York: Pelican Mentor (MacMillan), 1948 [1925].


Three cognitive biases

Last updated on September 16th, 2018 at 03:58 pm

Technical debt arises in enterprise assets through the effects of two classes of drivers: obsolescence and decision-making. When technologies advance, or new technologies arise, or laws or regulations evolve or are introduced, existing assets or assets under development can sometimes be left behind. That’s how obsolescence produces technical debt. Debt driven mainly by decision-making is more difficult to describe, but anything that biases decisions away from strictly rational results presents risk. Three cognitive biases likely have strong effects on technical debt formation and persistence.

Photo of Daniel Ellsberg, speaking at a press conference in New York City in 1972
Photo of Daniel Ellsberg, speaking at a press conference in New York City in 1972. Best known for his role in releasing the Pentagon Papers, Dr. Ellsberg made important contributions to decision theory while at the RAND Corporation. Photo by Bernard Gotfryd, courtesy U.S. Library of Congress.
Decision-making produces technical debt as the people of the enterprise make choices in design, development, and resource acquisition or allocation. Typically, both obsolescence and decision-making contribute to producing any particular instance of technical debt, though either obsolescence or decision-making might be more important than the other in any given instance.

Managing debt driven principally by obsolescence isn’t difficult, but I’ll leave that topic for another time. For now, let’s focus on decision-making. Already widely accepted is the contribution of engineering decisions to technical debt formation. Indeed, many believe that all — or most — technical debt arises as a result of faulty decisions by engineers. While some engineering decisions are indeed faulty, the current scale of technical debt is so large as to create doubt about the idea that the only decisions contributing to technical debt formation are engineering decisions. Investigating how resource allocation decisions might contribute to technical debt formation is certainly worthwhile.

In this post, I propose three examples illustrating how resource allocation decisions might contribute to technical debt formation and persistence. Each example illustrates how people make faulty decisions while believing they’re proceeding objectively and rationally. In each case, what causes the problem is a phenomenon called cognitive bias, though each example in this post illustrates the action of a different cognitive bias.

Loss aversion

The cognitive bias known as loss aversion, first identified by Amos Tversky and Daniel Kahneman [Kahneman 1984], is the tendency to prefer options that avoid losses over options that lead to equivalent or even greater gains. A decision-maker affected by loss aversion might conclude that it’s better to not lose $5 than to find $5, or even $10. In this way, loss aversion skews decisions toward options that protect or enhance existing revenue streams, even if those options increase operating expenses, and even if the increases in operating expenses exceed the value of the revenue being protected.
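A worked example makes the asymmetry concrete. Prospect theory models loss aversion with a value function that weights losses more heavily than gains; a loss-aversion coefficient of roughly 2 to 2.5 is commonly cited. The sketch below uses a simple linear version, for illustration only.

```python
LOSS_AVERSION = 2.25  # a commonly cited rough estimate of the loss-aversion coefficient

def felt_value(outcome):
    """Linear prospect-theory-style value function: losses loom larger than gains."""
    return outcome if outcome >= 0 else LOSS_AVERSION * outcome

# "Not losing $5" versus "finding $10":
print(0 - felt_value(-5))  # avoiding the $5 loss feels like +11.25
print(felt_value(10))      # the $10 gain feels like +10.0
# With a coefficient above 2, avoiding a $5 loss can feel better than a $10 gain.
```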

Retiring technical debt usually entails deferring revenue in the short term, for two classes of reasons. First, we must turn the attention of some part of the engineering organization to debt retirement, away from whatever they were doing. Assuming they would have been working on maintaining or enhancing existing products or services, this redirection can reduce or defer revenue. Second, during the debt retirement operation, some work might require short-term interruptions of revenue streams, to install or test the assets being revised.

Thus, debt retirement efforts often do reduce revenue — or reduce revenue increases — in the short term, and some decision-makers can perceive that effect as a loss.

However, the long-term effects of debt retirement can be gains, and those gains can be considerable. Typically, by retiring an asset’s technical debt, we reduce the difficulty (read: time required, effort, cost, and risk) of future maintenance and enhancement efforts involving the asset. We also reduce the probability of debt contagion.

Since these long-term effects of debt retirement are ongoing, their impact on the enterprise can be significant. But unless one is experienced with dealing with the consequences of technical debt, recognizing the value of retiring technical debt can be difficult. When loss aversion is in play, intuitive comparisons of the effects of (a) a short-term revenue loss or delay to (b) a long-term benefit of debt retirement are biased in favor of not retiring technical debt.
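A toy comparison shows how the bias plays out. The figures, the discount rate, and the horizon below are all hypothetical; a real model would attach uncertainty ranges to each input.

```python
def npv(cash_flows, rate):
    """Present value of a list of yearly cash flows, year 0 first."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

rate = 0.10
short_term_loss = [-300]             # revenue deferred during the retirement project
recurring_gain  = [0] + [120] * 5    # reduced MICs over the following five years

loss_pv = npv(short_term_loss, rate)   # -300.0
gain_pv = npv(recurring_gain, rate)    # about +455
print(loss_pv + gain_pv)               # net about +155: attractive on paper

# A loss-averse reading weights the loss at roughly 2.25x, so the -300 "feels" like
# about -675, and the apparently attractive project is rejected.
```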

Insulating decisions about debt retirement from the effects of loss aversion bias requires objective mathematical modeling of revenue losses and operating cost benefits for all options under consideration. Those models must also account for uncertainty, which makes them inherently ambiguous. And that leads us to consider our next cognitive bias, the ambiguity effect.

The ambiguity effect

The cognitive bias known as the ambiguity effect causes us to prefer options for which the probability of a desirable outcome is relatively better known, over options for which the probability of a desirable outcome is less well known, even if the expected value of that more ambiguous outcome exceeds the expected value of the less ambiguous outcome. The effect was first described by Daniel Ellsberg [Ellsberg 1961].

Consider a choice between allocating resources to a new development project and allocating resources to a technical debt retirement project. In most enterprises, decision-makers are familiar with new development projects, as are project champions, sponsors, and project managers. All parties are less familiar with debt retirement. It’s reasonable to suppose that, when confronted with such a choice, decision-makers see the probability of a positive outcome for debt retirement as less well known than the probability of a positive outcome for the new development project.
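A stylized example, loosely in the spirit of Ellsberg’s thought experiments and with invented numbers, shows why this matters:

```python
# Illustrative figures only.
payoff = 1_000_000                      # value of a successful outcome, hypothetical

p_new_development = 0.60                # familiar option: probability feels well known
p_debt_retirement_range = (0.45, 0.95)  # unfamiliar option: probability feels ambiguous
p_debt_retirement_mean = sum(p_debt_retirement_range) / 2   # 0.70

print(p_new_development * payoff)       # 600,000 expected value
print(p_debt_retirement_mean * payoff)  # 700,000 expected value
# The ambiguous option has the higher expected value here, yet it feels riskier.
```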

Because of the ambiguity effect, resource allocation decisions are likely to be biased against technical debt retirement, and in favor of maintenance or new development.

But there’s more. Most projects, of any kind, encounter trouble from time to time. When that happens, the urge to reallocate organizational resources can be powerful. Troubled projects might receive more resources if they’re viewed as important to the organization. If so, those resources often come from other projects. The ambiguity effect biases these resource reallocation decisions in a way analogous to initial resource allocation decisions, as described above. In other words, because of the ambiguity effect, when projects encounter trouble, debt retirement projects are less likely to be able to retain previously allocated resources than are maintenance or new development projects.

The availability heuristic

The availability heuristic is a method humans use to evaluate the validity or effectiveness of decisions, concepts, methods, or propositions [Tversky 1973]. According to the heuristic, if we recognize the item being evaluated as familiar, or related to something with which we are familiar, we’re more likely to regard it as valid or workable. And when making comparisons between two alternative decisions, concepts, methods, or propositions, we’re likely to assess more favorably the decision, concept, method, or proposition with which we’re more familiar, all other things being equal.

In organizations where decision-makers have more experience evaluating maintenance or development project proposals than they have with technical debt retirement proposals, the availability heuristic acts to reduce the relative assessed favorability of technical debt retirement proposals. It does this in three ways.

First, in most organizations, technical debt retirement projects are less familiar to decision-makers than are maintenance or development projects. On that ground alone the technical debt retirement project proposals are at a disadvantage.

But the second effect of the availability heuristic is more important, because it extends to the consequences of the decision, concept, method, or proposition under consideration. To grasp the value of a maintenance or development project, one must understand how it will affect the users of the assets being developed or maintained. Likewise, to grasp the value of a technical debt retirement project, one must understand how the presence of the technical debt hampers the enterprise in its attempts to achieve its objectives. One must also understand how retiring the technical debt might confer advantages on future engineering efforts. Usually, understanding the consequences of maintenance or development projects is more “available” to decision-makers than is understanding the consequences of technical debt retirement projects. Even more dramatic is the difference between understanding the consequences of not funding a maintenance or development project and understanding the consequences of not funding a technical debt retirement project.

Finally, much of the benefit of a technical debt retirement project is indirect. That is, although there is some direct benefit in terms of the assets from which the debt has been retired, the most dramatic benefits are manifested in projects that follow the debt retirement project, and which depend on the assets that have been relieved of debt. Sometimes, those follow-on projects are known at the time decision-makers are considering funding the debt retirement project. Sometimes those follow-on projects have yet to be specified or even recognized. In either case, they are less “available” to decision-makers because those follow-on projects are indirect beneficiaries.

These three effects of the availability heuristic cause decisions about resource allocations to tend to favor maintenance or development projects over debt retirement projects.

Mitigating the risks of these three cognitive biases

Over time, as everyone becomes more familiar with technical debt retirement projects, these effects may wane somewhat. But waiting for that to happen isn’t exactly what one might call risk mitigation. For one thing, familiarity grows only if one is motivated and pays attention. As busy as decision-makers in modern organizations are, depending on them to actively cultivate familiarity with technical debt retirement projects is probably not the safest course.

An effective program of actively mitigating the risks of these three cognitive biases probably should focus on four areas.

Familiarity

Do what you can to increase decision-maker familiarity with the concept of technical debt, and with the consequences of carrying existing technical debt. Conventional presentation-based training will help, but interactive, experiential training is far more effective. Participants must actually experience the consequences of technical debt in a well-designed and professionally facilitated simulation of a problem-solving task. A faithful simulation should include estimation, changing and ambiguous requirements, and team composition volatility.

Retrospectives

Retrospectives (also known as after-action reviews, post mortems, debriefings, or lessons-learned sessions) are meetings convened to review the process used in a recently completed piece of work [Kerth 2001]. Typically, attendance is restricted to the project team members. To maintain psychological safety and to encourage truth telling, attendance by enterprise decision-makers is not recommended, unless the organizational culture includes appropriate safeguards. In any case, a section of the retrospective dedicated to investigating the causes and consequences of technical debt in the context of the current project can ensure capture of relevant knowledge and experience.

Mathematical modeling practice

Mathematical modeling is one path to creating a more objective foundation for decisions. It’s essential for improving estimation quality. Also helpful are high quality effort data and metrics data related to the formation and lifetime of technical debt. Reviews of estimates and projections during retrospectives can help improve their quality over time.
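As a sketch of the kind of review that helps, the following computes a simple calibration signal from past estimate/actual pairs collected at retrospectives. The data and the choice of metrics are illustrative.

```python
# Hypothetical (estimate, actual) cost pairs gathered at retrospectives.
history = [(200, 260), (90, 85), (400, 610), (150, 190), (320, 300)]

ratios = [actual / estimate for estimate, actual in history]
mean_ratio = sum(ratios) / len(ratios)                                 # systematic bias
mean_abs_pct_error = sum(abs(r - 1.0) for r in ratios) / len(ratios)   # spread

print(f"mean actual/estimate ratio: {mean_ratio:.2f}")
print(f"mean absolute % error: {mean_abs_pct_error:.0%}")
# A ratio persistently above 1.0 suggests estimates are systematically optimistic;
# tracking both numbers over time shows whether estimation quality is improving.
```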

Metrics development

Determining the effects of risk mitigation failures provides important guidance for corrective action. Developing metrics that reveal these failures is therefore essential to managing cognitive bias risk. I’ll be suggesting some valuable metrics in a future post.

Last words

These three cognitive biases are by no means the only ones that can affect the formation or persistence of technical debt. Of the more than 200 identified cognitive biases, those most likely to be relevant are those that affect decision-making. Watch this space for links to posts about additional cognitive biases and their effects on technical debt formation or persistence.


References

[Ellsberg 1961] Daniel Ellsberg. “Risk, ambiguity, and the Savage axioms.” The Quarterly Journal of Economics, 643-669, 1961. Retrieved August 17, 2018.

[Gould 1996] Stephen Jay Gould. The mismeasure of man (Revised & Expanded edition). W. W. Norton & Company, 1996.

[Kahneman 1984] Daniel Kahneman, Amos Tversky, and Michael S. Pallak. “Choices, values, and frames,” American Psychologist 39:4, 341-350, 1984. Retrieved August 8, 2017.

[Kerth 2001] Norman L. Kerth. Project Retrospectives: A Handbook for Team Reviews. New York: Dorset House, 2001.

[Levy 2009] David A. Levy. Tools of Critical Thinking: Metathoughts for Psychology (second edition). Long Grove, Illinois: Waveland Press, 2009.

[McConnell 2006] Steve McConnell. Software Estimation: Demystifying the Black Art. Microsoft Press, 2006.

[Meadows 1999] Donella H. Meadows. “Leverage Points: Places to Intervene in a System,” Hartland, VT: The Sustainability Institute, 1999. Retrieved June 2, 2018.

[Tversky 1973] Amos Tversky and Daniel Kahneman. “Availability: A heuristic for judging frequency and probability.” Cognitive Psychology 5:2, 207-232, 1973. Retrieved August 9, 2018.

[Weinberg 1992] Gerald M. Weinberg. Quality Software Management Volume 1: Systems Thinking. New York: Dorset House, 1992. This volume contains a description of the “diagram of effects.”

[Whitehead 1948] Alfred North Whitehead. Science and the Modern World. New York: Pelican Mentor (MacMillan), 1948 [1925].


Managing technical debt

Last updated on August 24th, 2019 at 05:57 pm

Managing technical debt is something few organizations now do, and fewer do well. Several issues make managing technical debt difficult, and they’re discussed elsewhere in this blog. This thread explores tactics for dealing with those issues from a variety of initial conditions. For example, tactics that work well for an organization that already has its technical debt under control, and wants to keep it that way, might not work at all for an organization that’s just beginning to address a vast portfolio of runaway technical debt. The needs of these two organizations differ. The approaches they must take might then also differ.

A jumble of jigsaw puzzle pieces. Managing technical debt can be like solving a puzzle.
A jumble of jigsaw puzzle pieces. Where do we begin? With these puzzles, we usually begin with two assumptions: (a) we have all the pieces, and (b) they fit together to make a coherent whole. These assumptions might not be valid for the puzzle of technical debt in any given organization.

The first three posts in this thread illustrate the differences among organizations at different stages of developing technical debt management practices. In “Leverage points for technical debt management,” I begin to address the needs of strategists working in an organization just beginning to manage its technical debt, and asking the question, “Where do we begin?” In “Undercounting nonexistent debt items,” I offer an observation about a risk that accompanies most attempts to assess the volume of outstanding technical debt. Such assessments are frequently undertaken at early stages of the technical debt management effort. In “Crowdsourcing debt identification,” I discuss a method for maintaining the contents of a database of technical debt items. Data maintenance is something that might be undertaken in the context of a more advanced technical debt management program.

Whatever approach is adopted, it must address factors that include technology, business objectives, politics, culture, psychology, and organizational behavior. So what you’ll find in this thread are insights, observations, and recommendations that address one or more of the issues related to these fields. “Demodularization can help control technical debt” considers mostly technical strategies. “Undercounting nonexistent debt items” explores a psychological phenomenon. “Leverage points for technical debt management” considers the organization as a system and discusses tactics for altering it. And “Legacy debt incurred intentionally” explores how existing technical debt can grow as long as it remains outstanding.

Accounting issues also play a role. “Metrics for technical debt management: the basics” is a basic discussion of measurement issues. “Accounting for technical debt” looks into the matter of accounting for technical debt financially. And “Three cognitive biases” is a study of how technical debt is affected by the way we think about it.


References

[Ellsberg 1961] Daniel Ellsberg. “Risk, ambiguity, and the Savage axioms.” The Quarterly Journal of Economics, 643-669, 1961. Retrieved August 17, 2018.

[Gould 1996] Stephen Jay Gould. The mismeasure of man (Revised & Expanded edition). W. W. Norton & Company, 1996.

[Kahneman 1984] Daniel Kahneman, Amos Tversky, and Michael S. Pallak. “Choices, values, and frames,” American Psychologist 39:4, 341-350, 1984. Retrieved August 8, 2017.

[Kerth 2001] Norman L. Kerth. Project Retrospectives: A Handbook for Team Reviews. New York: Dorset House, 2001.

[Levy 2009] David A. Levy. Tools of Critical Thinking: Metathoughts for Psychology (second edition). Long Grove, Illinois: Waveland Press, 2009.

[McConnell 2006] Steve McConnell. Software Estimation: Demystifying the Black Art. Microsoft Press, 2006.

[Meadows 1999] Donella H. Meadows. “Leverage Points: Places to Intervene in a System,” Hartland, VT: The Sustainability Institute, 1999. Retrieved June 2, 2018.

[Tversky 1973] Amos Tversky and Daniel Kahneman. “Availability: A heuristic for judging frequency and probability.” Cognitive Psychology 5:2, 207-232, 1973. Retrieved August 9, 2018.

[Weinberg 1992] Gerald M. Weinberg. Quality Software Management Volume 1: Systems Thinking. New York: Dorset House, 1992. This volume contains a description of the “diagram of effects.”

[Whitehead 1948] Alfred North Whitehead. Science and the Modern World. New York: Pelican Mentor (MacMillan), 1948 [1925].

Confirmation bias and technical debt

Confirmation bias is a cognitive bias [Kahneman 2011]. It’s the human tendency to favor and seek only information that confirms our preconceptions, or to avoid information that disconfirms them. For example, the homogeneity of cable news channel audiences, and the alignment between preconceptions of the audience and the slant of the newscast for that channel, are results of confirmation bias.

Third stage ignition, sending the Mars Climate Orbiter (MCO) to Mars in December, 1998
Computer-generated image of the third stage ignition, sending the Mars Climate Orbiter (MCO) to Mars in December, 1998. The spacecraft eventually broke up in the Martian atmosphere as a result of what is now often called the “metric mix-up.” The team at Lockheed Martin that constructed the spacecraft and wrote its software used Imperial units for computing thrust data. But the team at JPL that was responsible for flying the spacecraft was using metric units. The mix-up was discovered after the loss of the spacecraft by the investigation panel established by NASA.
One of the many operational changes deployed as a result of this loss was increased use of reviews and inspections. While we do not know why reviews and inspections weren’t as thorough before the loss of the MCO as they are now, one possibility is the effects of confirmation bias in assessing the need for reviews and inspections. Image courtesy Engineering Multimedia, Inc., and U.S. National Aeronautics and Space Administration
Confirmation bias causes technical debt by biasing the information on which decision makers base decisions involving technical debt. Most people in these roles have objectives they regard as having priority over eliminating technical debt. This causes them to bias their searches for information about technical debt in favor of information that would support directly the achievement of those primary objectives. They tend, for example, to discount warnings of technical debt issues, to underfund technical debt assessments, and to set aside advice regarding avoiding debt formation in current projects.

For example, anyone determined to find reasons to be skeptical of the need to manage technical debt need only execute a few Google searches. Searching for there is no such thing as technical debt yields about 300,000 results at this writing; technical debt is a fraud, about 1.6 million; and technical debt is a bad metaphor, about 3.7 million. Compare this to technical debt, which yields only 1.6 million. A skeptic wouldn’t even have to read any of these pages to come away convinced that technical debt is at best a controversial concept. This is, of course, specious reasoning, if it’s reasoning at all. But it does serve to illustrate the potential for confirmation bias to prevent or limit rational management of technical debt.

Detecting confirmation bias in oneself is extraordinarily difficult because confirmation bias causes us to (a) decide not to search for data that would reveal confirmation bias; and (b) if data somehow becomes available, to disregard or to seek alternative explanations for it if that data tends to confirm the existence of confirmation bias. Moreover, another cognitive bias known as the bias blind spot causes individuals to see the existence and effects of cognitive biases much more in others than in themselves [Pronin 2002].

Still, the enterprise would benefit from monitoring the possible existence and effects of confirmation bias in decisions with respect to allocating resources to managing technical debt. Whenever decisions are made, we must manage confirmation bias risk.

References

[Ellsberg 1961] Daniel Ellsberg. “Risk, ambiguity, and the Savage axioms.” The Quarterly Journal of Economics, 643-669, 1961. Retrieved August 17, 2018.

[Gould 1996] Stephen Jay Gould. The mismeasure of man (Revised & Expanded edition). W. W. Norton & Company, 1996.

[Kahneman 1984] Daniel Kahneman, Amos Tversky, and Michael S. Pallak. “Choices, values, and frames,” American Psychologist 39:4, 341-350, 1984. Retrieved August 8, 2017.

[Kahneman 2011] Daniel Kahneman. Thinking, Fast and Slow. New York: Macmillan, 2011.

[Kerth 2001] Norman L. Kerth. Project Retrospectives: A Handbook for Team Reviews. New York: Dorset House, 2001.

[Levy 2009] David A. Levy. Tools of Critical Thinking: Metathoughts for Psychology (second edition). Long Grove, Illinois: Waveland Press, 2009.

[McConnell 2006] Steve McConnell. Software Estimation: Demystifying the Black Art. Microsoft Press, 2006.

[Meadows 1999] Donella H. Meadows. “Leverage Points: Places to Intervene in a System,” Hartland, VT: The Sustainability Institute, 1999. Retrieved June 2, 2018.

[Pronin 2002] Emily Pronin, Daniel Y. Lin, and Lee Ross. “The bias blind spot: Perceptions of bias in self versus others.” Personality and Social Psychology Bulletin 28:3, 369-381, 2002. Retrieved July 10, 2017.

[Tversky 1973] Amos Tversky and Daniel Kahneman. “Availability: A heuristic for judging frequency and probability.” Cognitive Psychology 5:2, 207-232, 1973. Retrieved August 9, 2018.

[Weinberg 1992] Gerald M. Weinberg. Quality Software Management Volume 1: Systems Thinking. New York: Dorset House, 1992. This volume contains a description of the “diagram of effects.”

[Whitehead 1948] Alfred North Whitehead. Science and the Modern World. New York: Pelican Mentor (MacMillan), 1948 [1925].


Unrealistic optimism: the planning fallacy and the n-person prisoner’s dilemma

Last updated on September 20th, 2018 at 03:51 pm

In a 1977 report, Daniel Kahneman and Amos Tversky identified a particular cognitive bias [Kahneman 2011] that afflicts planners: the planning fallacy [Kahneman 1977] [Kahneman 1979]. They discuss two types of information planners use. Singular information is specific to the case at hand; distributional information is drawn from similar past efforts. The planning fallacy is the tendency of planners to pay too little attention to distributional evidence and too much to singular evidence, even when the singular evidence is scanty or questionable. Failing to harvest lessons from the distributional evidence, which is inherently more diverse than singular evidence, planners tend to underestimate cost and schedule. So for any given project, there’s an inherent tendency to promise lower costs, faster delivery, and greater benefits than anyone can reasonably expect.
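The corrective procedures Kahneman and Tversky propose amount to letting distributional evidence adjust the singular estimate, an approach that later came to be called reference class forecasting. A minimal sketch of that idea, with invented figures:

```python
from statistics import median

# Hypothetical distributional evidence: actual/estimated cost ratios from comparable past projects.
past_overrun_ratios = [1.4, 1.1, 1.9, 1.3, 1.6, 1.2, 2.1, 1.5]

singular_estimate = 500   # the "inside view" estimate for the project at hand

adjustment = median(past_overrun_ratios)   # 1.45 for this data
print(singular_estimate * adjustment)      # 725.0: plan for roughly 45% more than the inside view
```

The distributional adjustment doesn’t replace the singular analysis; it disciplines it.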

Aerial view of Hoover Dam, September 2017
Aerial view of Hoover Dam, September 2017. Under construction from 1931 to 1936, the dam was built for $48.8M ($639M in 2016 dollars) under a fixed-price contract. It was completed two years ahead of schedule. Apparently the planning fallacy doesn’t act inevitably. 112 men died in incidents associated with its construction. 42 more died of a condition diagnosed as pneumonia, but which is now thought to have been carbon monoxide poisoning due to poor ventilation in the dam’s diversion tunnels during construction. There’s little doubt that unrealistic optimism affects not only projections of budget and schedule, but also projections of risks, including deaths. Photo (cc) Mariordo (Mario Roberto Durán Ortiz), courtesy Wikimedia Commons.
But the problem is exacerbated by a dynamic described by Boehm et al. [Boehm 2016], who observe that because organizational resources are finite, project sponsors compete with each other for resources. They’re compelled by this competition to be unrealistically optimistic about their objectives, costs, and schedules. Although Boehm et al. call this mechanism the “Conspiracy of Optimism,” possibly facetiously, it isn’t actually a conspiracy. Rather, it’s a variant of the N-Person Prisoner’s Dilemma [Hamburger 1973].

Unrealistic optimism creates budget shortfalls and schedule pressures, both of which contribute to conditions favorable for creating non-strategic technical debt. And the kinds of technical debt produced by this mechanism, or any mechanism associated with schedule or budget pressure, tend to be subtle — they’re the types least likely to become evident in the short term. For example, technical debt that might make a particular kind of enhancement more difficult in the next project is more likely to appear than technical debt in the form of a copy of some code that should have been replaced by a utility routine. Copies of code are more easily discovered and more likely to be retired in the short term, if not in the current project. Awkward architecture might be more difficult to identify, and is therefore more likely to survive in the intermediate or long term.

In other words, the forms of technical debt most likely to be generated are those that are the most benign in the short term, and which are therefore more likely to escape notice. If noticed, they’re more likely to be forgotten unless carefully documented, an action that’s unlikely to be taken under conditions of schedule and budget pressure. In this way, the non-strategic technical debt created as a result of unrealistic optimism is more likely than most technical debt to eventually become legacy technical debt.

Policymakers can assist in addressing the consequences of unrealistic optimism by advocating for education about it. They can also advocate for changes in incentive structures and performance management systems to include organizational standards with respect to realism in promised benefits, costs, and schedules.

References

[Boehm 2016] Barry Boehm, Celia Chen, Kamonphop Srisopha, Reem Alfayez, and Lin Shiy. “Avoiding Non-Technical Sources of Software Maintenance Technical Debt,” USC course notes, Fall 2016. Retrieved July 25, 2017.

[Ellsberg 1961] Daniel Ellsberg. “Risk, ambiguity, and the Savage axioms.” The Quarterly Journal of Economics, 643-669, 1961. Retrieved August 17, 2018.

[Gould 1996] Stephen Jay Gould. The mismeasure of man (Revised & Expanded edition). W. W. Norton & Company, 1996.

[Hamburger 1973] Henry Hamburger. “N-person Prisoner’s Dilemma,” Journal of Mathematical Sociology 3, 27–48, 1973. doi:10.1080/0022250X.1973.9989822

[Kahneman 1977] Daniel Kahneman and Amos Tversky. “Intuitive Prediction: Biases and Corrective Procedures,” Technical Report PTR-1042-7746, Defense Advanced Research Projects Agency, June 1977. Retrieved September 19, 2017.

[Kahneman 1979] Daniel Kahneman and Amos Tversky. “Intuitive Prediction: Biases and Corrective Procedures,” Management Science 12, 313-327, 1979.

[Kahneman 1984] Daniel Kahneman, Amos Tversky, and Michael S. Pallak. “Choices, values, and frames,” American Psychologist 39:4, 341-350, 1984. Retrieved August 8, 2017.

[Kahneman 2011] Daniel Kahneman. Thinking, Fast and Slow. New York: Macmillan, 2011.

[Kerth 2001] Norman L. Kerth. Project Retrospectives: A Handbook for Team Reviews. New York: Dorset House, 2001.

[Levy 2009] David A. Levy. Tools of Critical Thinking: Metathoughts for Psychology (second edition). Long Grove, Illinois: Waveland Press, 2009.

[McConnell 2006] Steve McConnell. Software Estimation: Demystifying the Black Art. Microsoft Press, 2006.

[Meadows 1999] Donella H. Meadows. “Leverage Points: Places to Intervene in a System,” Hartland, VT: The Sustainability Institute, 1999. Retrieved June 2, 2018.

[Pronin 2002] Emily Pronin, Daniel Y. Lin, and Lee Ross. “The bias blind spot: Perceptions of bias in self versus others.” Personality and Social Psychology Bulletin 28:3, 369-381, 2002. Retrieved July 10, 2017.

[Tversky 1973] Amos Tversky and Daniel Kahneman. “Availability: A heuristic for judging frequency and probability.” Cognitive Psychology 5:2, 207-232, 1973. Retrieved August 9, 2018.

[Weinberg 1992] Gerald M. Weinberg. Quality Software Management Volume 1: Systems Thinking. New York: Dorset House, 1992. This volume contains a description of the “diagram of effects.”

[Whitehead 1948] Alfred North Whitehead. Science and the Modern World. New York: Pelican Mentor (MacMillan), 1948 [1925].

Other posts in this thread

Feature bias: unbalanced concern for capability vs. sustainability

Enterprise decision-makers affected by feature bias harbor distorted views of the importance of new capability development relative to technical debt management. The tendency likely arises from customers’ relative sensitivity to features and their relative lack of awareness of sustainability. Whatever the cause, customers are more attracted to features than to indicators of sound technical debt management and other product sustainability practices, and that attraction puts decision-makers at risk of feature bias: unbalanced concern for capability vs. sustainability.

Alaska crude oil production 1990-2015
Alaska crude oil production 1990-2015. This chart [Yen 2015] displays Alaska crude oil produced and shipped through the Trans Alaska Pipeline System (TAPS) from 1990 to 2015. Production had dropped by 75% in that period, and the decline is projected to continue. In January 2018, in response to pressure from Alaskan government officials and the energy industry, the U.S. Congress passed legislation that opened the Arctic National Wildlife Refuge to oil exploration, despite the threat to ecological sustainability that exploration poses. If we regard TAPS as a feature of the U.S. energy production system, we can view its excess capacity as a source of feature bias bias, creating pressure on decision-makers to add features to the U.S. energy system instead of acting to enhance the sustainability of Alaskan and global environmental systems [Wight 2017].
Changes in cost accounting could mitigate some of this feature bias by projecting total MICs more accurately, based on historical data and sound estimation. I’ll explore possible accounting changes later in this post, and in future posts; meanwhile, let’s explore the causes and consequences of the distorted perspective I call feature bias.

For products or services offered for sale outside the enterprise, the sales and marketing functions of the enterprise represent the voice of the customer [Gaskin 1991]. But customers are generally unaware of product or service attributes that determine maintainability, extensibility, or cybersecurity — all factors that affect the MICs for technical debt. On the other hand, customers are acutely aware of capabilities — or missing or defective capabilities — in products or services. Customer comments and requests, therefore, are unbalanced in favor of capabilities as compared to maintainability, extensibility, cybersecurity, and other attributes related to sustainability. The sales and marketing functions tend to accurately transmit this unbalanced perspective to decision-makers and technologists.

An analogous mechanism prevails with respect to infrastructure and the internal customers of that infrastructure. Internal customers tend to be more concerned with capabilities, and missing capabilities, than with the sustainability of the processes and systems that deliver those capabilities. Thus, pressure from internal customers on the developers and maintainers of infrastructure elements tends to emphasize capability at the expense of sustainability. The result of this imbalance is pressure to allocate resources to capability enhancement at the expense of activities that improve maintainability, extensibility, or cybersecurity, activities that would help control or reduce technical debt and its MICs.

Nor is this the only consequence of feature bias. It exerts unrelenting pressure for ever more features, despite the threats to architectural coherence and overall usability that such “featuritis” or “featurism” presents. Featurism ultimately leads to feature bloat, and to difficulties for users, who can’t find what they need among a clutter of features often too numerous to document. For example, in Microsoft Word, many users are unaware that Shift+F5 moves the insertion point to the location in the active document that was last edited, even if the document has just been freshly loaded into Word. Useful, but obscure.

Feature bias, it must be noted, is subject to biases itself. The existing array of features appeals to a certain subset of all potential customers. Because it is that subset that’s most likely to request repair of existing features, or to suggest additional features, the pressure for features tends to be biased in favor of the needs of the most vociferous segments of the existing user base. That is, systems experience pressure to evolve to better meet the needs of existing users, in preference to meeting the needs of other stakeholders or potential stakeholders who might be even more important to the enterprise than are the existing users. This bias in feature bias presents another risk that can affect decision-makers.

Organizations can take steps to mitigate the risk of feature bias. One such measure might be using focus groups to study how education in sustainability issues affects customers’ perspectives relative to feature bias. Educating decision-makers about feature bias can also reduce this risk.

At the enterprise scale, awareness of feature bias would be helpful, but awareness alone is unlikely to counter its detrimental effects, which include underfunding of technical debt management efforts. Eliminating the source of feature bias is extraordinarily difficult, because customers and potential customers aren’t subject to enterprise policy. Feature bias and feature bias bias are therefore givens. To mitigate the effects of feature bias, we must adopt policies that compel decision-makers to consider the need to deal with technical debt. One possible corrective action is improving accounting practices for MICs, based, in part, on historical data. For example, since there’s a high probability that any project will produce new technical debt, it might be prudent to fund the retirement of that debt, in the form of reserves, when we fund the project. And if we know that a project has encountered some newly recognized form of technical debt, it might be prudent to reserve resources to retire that debt as soon as possible. Ideas such as these can rationalize resource allocations with respect to technical debt.

These two examples illustrate what’s necessary if we want to mitigate the effects of feature bias. They also illustrate just how difficult such a task will be.
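Here’s a minimal sketch of the first of those two ideas: funding a debt-retirement reserve when the project is funded, with the reserve sized from historical data. All of the figures and field names below are invented for illustration; they aren’t prescribed accounting practice.

```python
# A minimal sketch of funding a technical-debt-retirement reserve at project
# start, sized from historical data. All figures are invented for illustration.

historical_projects = [
    # (original project budget, cost later spent retiring debt the project created)
    (400_000,  60_000),
    (250_000,  30_000),
    (900_000, 180_000),
]

# Historical ratio of debt-retirement cost to project budget.
total_budget = sum(budget for budget, _ in historical_projects)
total_debt_cost = sum(debt for _, debt in historical_projects)
debt_ratio = total_debt_cost / total_budget

new_project_budget = 500_000
reserve = round(new_project_budget * debt_ratio)

print(f"Historical debt ratio: {debt_ratio:.1%}")
print(f"Suggested retirement reserve for a {new_project_budget:,} project: {reserve:,}")
```

The point of the sketch isn’t the particular formula; it’s that the reserve is set by policy and data rather than left to the feature-biased judgment of the moment.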

References

[Boehm 2016] Barry Boehm, Celia Chen, Kamonphop Srisopha, Reem Alfayez, and Lin Shiy. “Avoiding Non-Technical Sources of Software Maintenance Technical Debt,” USC Course notes, Fall 2016. Available: here; Retrieved: July 25, 2017.

[Ellsberg 1961] Daniel Ellsberg. “Risk, ambiguity, and the Savage axioms,” The Quarterly Journal of Economics, 643-669, 1961. Available: here; Retrieved: August 17, 2018.

[Gaskin 1991] Steven P. Gaskin, Abbie Griffin, John R. Hauser, Gerald M. Katz, and Robert L. Klein. “Voice of the Customer,” Marketing Science 12:1, 1-27, 1991.

[Gould 1996] Stephen Jay Gould. The Mismeasure of Man (Revised & Expanded edition). W. W. Norton & Company, 1996.

[Hamburger 1973] Henry Hamburger. “N-person Prisoner’s Dilemma,” Journal of Mathematical Sociology 3, 27–48, 1973. doi:10.1080/0022250X.1973.9989822

[Kahneman 1977] Daniel Kahneman and Amos Tversky. “Intuitive Prediction: Biases and Corrective Procedures,” Technical Report PTR-1042-7746, Defense Advanced Research Projects Agency, June 1977. Available: here; Retrieved: September 19, 2017.

[Kahneman 1979] Daniel Kahneman and Amos Tversky. “Intuitive Prediction: Biases and Corrective Procedures,” Management Science 12, 313-327, 1979.

[Kahneman 1984] Daniel Kahneman, Amos Tversky, and Michael S. Pallak. “Choices, values, and frames,” American Psychologist 39:4, 341-350, 1984. Available: here; Retrieved: August 8, 2017.

[Kahneman 2011] Daniel Kahneman. Thinking, Fast and Slow. New York: Macmillan, 2011.

[Kerth 2001] Norman L. Kerth. Project Retrospectives: A Handbook for Team Reviews. New York: Dorset House, 2001.

[Levy 2009] David A. Levy. Tools of Critical Thinking: Metathoughts for Psychology (second edition). Long Grove, Illinois: Waveland Press, Inc., 2009.

[McConnell 2006] Steve McConnell. Software Estimation: Demystifying the Black Art. Microsoft Press, 2006.

[Meadows 1999] Donella H. Meadows. “Leverage Points: Places to Intervene in a System,” Hartland, VT: The Sustainability Institute, 1999. Available: here; Retrieved: June 2, 2018.

[Pronin 2002] Emily Pronin, Daniel Y. Lin, and Lee Ross. “The bias blind spot: Perceptions of bias in self versus others,” Personality and Social Psychology Bulletin 28:3, 369-381, 2002. Available: here; Retrieved: July 10, 2017.

[Tversky 1973] Amos Tversky and Daniel Kahneman. “Availability: A heuristic for judging frequency and probability,” Cognitive Psychology 5:2, 207-232, 1973. Available: here; Retrieved: August 9, 2018.

[Weinberg 1992] Gerald M. Weinberg. Quality Software Management Volume 1: Systems Thinking. New York: Dorset House, 1992.
This volume contains a description of the “diagram of effects” used to explain how obstacles can induce toxic conflict.

[Whitehead 1948] Alfred North Whitehead. Science and the Modern World. New York: Pelican Mentor (MacMillan), 1948 [1925].

[Wight 2017] Philip Wight. “How the Alaska Pipeline Is Fueling the Push to Drill in the Arctic Refuge,” YaleE360, Yale School of Forestry & Environmental Studies, November 16, 2017. Available: here; Retrieved: February 8, 2018.

[Yen 2015] Terry Yen and Laura Singer. “Oil exploration in the U.S. Arctic continues despite current price environment,” Today in Energy blog, U.S. Energy Information Administration, June 12, 2015. Available: here; Retrieved: February 8, 2018.

Other posts in this thread

The Dunning-Kruger effect can lead to technical debt

Last updated on May 31st, 2018 at 07:43 am

The Dunning-Kruger effect [Kruger 1999] can lead to the formation or persistence of technical debt in two ways. First, it can cause technologists or their managers to overestimate their ability to maintain the resource focus needed to retire technical debt in a timely fashion. Second, it can cause senior managers to be reluctant to accede to the resource requests that technologists and their managers make in support of technical debt management programs.

Cropped detail from Charles Robert Darwin, a painting by John Collier
Cropped detail from Charles Robert Darwin, a painting by John Collier (1850-1934), given to the National Portrait Gallery, London, in 1896. Darwin writes, in The Descent of Man (1871): “… ignorance more frequently begets confidence than does knowledge …” which is the essence of the Dunning-Kruger effect. Image courtesy WikiQuote.

Kruger and Dunning conducted experiments that yielded results consistent with the following four principles (paraphrasing):

  1. Incompetent individuals, compared to their more competent peers, tend to dramatically overestimate their own ability and performance
  2. Incompetent individuals, compared to their more competent peers, tend to be less able to gain insight into their own true levels of performance
  3. Incompetent individuals can gain insight about their shortcomings, but, paradoxically, this comes about by gaining competence
  4. Incompetent individuals, compared to their more competent peers, are less able to recognize competence when they see it

The first three principles lead to distorted assessments of one’s own capabilities. The fourth principle leads to distorted assessments of the capabilities of others.

As an example of distorted self-assessment, consider a team or its managers who must undertake retirement of some types of technical debt in the course of enhancing or repairing an asset. Such a task plan seems at first to offer efficiencies, because the engineers can readily make both kinds of changes at one go. Metaphorically, if we must go to the store for milk, we can also pick up bread while we are there, rather than making two trips.

However, modifying an existing complex technological asset is unlike shopping for bread and milk. The two kinds of modifications — debt retirement and asset enhancement or repair — might seem at first to be separable, and often they are. But if they are not separable, and the two tasks are undertaken together, testing and debugging can become extremely complicated, because of interactions between defects in the two kinds of modifications. Under some circumstances, an experienced team and its managers might be more likely to anticipate these difficulties. An inexperienced team and its managers might be more likely to underestimate the difficulties, as a consequence of the Dunning-Kruger effect. Budget and schedule overruns are possible consequences of underestimating the complexity of the problem.

As an example of the fourth principle above, the Dunning-Kruger effect can cause some decision-makers to discount the warnings and resource requests of engineers and their managers. Decision-makers who are unsophisticated in matters related to technical debt must nevertheless assess the validity of the requests for resources. In making these assessments, these decision-makers may be disadvantaged for a number of reasons, including the following:

  • Decision-makers might hold any of a number of mistaken beliefs about technical debt. For example, many believe that the main causes of technical debt are poor decisions by engineering managers; others believe that technical debt results from the slovenly work habits of engineers. Those who hold such beliefs might be reluctant to allocate yet more resources to engineers to address the problem of technical debt.
  • If the advocates of resources for technical debt management are not fully informed about the strategic direction of the enterprise, their requests might be inconsistent with enterprise strategy. As a result of a cognitive bias [Kahneman 2011] known as the halo effect [Thorndike 1920], decision-makers might tend to discount valid portions of the technologists’ proposals, because some portions of those proposals don’t take enterprise strategy into account properly.
  • Decision-makers might be affected by unrealistic optimism [Weinstein 1996], also known as optimism bias. It’s a cognitive bias that can cause them to discount the sometimes-vivid warnings of technologists about the unfavorable consequences of failing to provide technical debt management resources.

Investigating the correlation between technical debt burdens and the incidence of rejected or severely curtailed proposals for technical debt management resources could help determine how significant the Dunning-Kruger effect is for the problem of technical debt. Also rewarding would be a survey of the nearly 200 known cognitive biases, to determine which of them are most likely to affect decision-making about technical debt, and how best to mitigate the risks they present.
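As a sketch of the first investigation suggested above, here is what the core calculation might look like, using invented observations (debt burden expressed as a percentage of asset replacement cost, paired with the fraction of debt-management funding requests rejected or curtailed):

```python
# A sketch of the proposed correlation study, with invented data.
# Requires Python 3.10+ for statistics.correlation.
from statistics import correlation

# (debt burden as % of asset replacement cost,
#  fraction of debt-management funding requests rejected or curtailed)
observations = [
    (5, 0.10), (12, 0.25), (20, 0.30), (35, 0.55), (50, 0.70), (65, 0.80),
]

debt_burden = [burden for burden, _ in observations]
rejection_rate = [rate for _, rate in observations]

# A strong positive correlation would be consistent with the mechanism
# described above; it would not, by itself, prove it.
print(f"Pearson r = {correlation(debt_burden, rejection_rate):.2f}")
```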

References

[Boehm 2016] Barry Boehm, Celia Chen, Kamonphop Srisopha, Reem Alfayez, and Lin Shiy. “Avoiding Non-Technical Sources of Software Maintenance Technical Debt,” USC Course notes, Fall 2016. Available: here; Retrieved: July 25, 2017.

[Ellsberg 1961] Daniel Ellsberg. “Risk, ambiguity, and the Savage axioms,” The Quarterly Journal of Economics, 643-669, 1961. Available: here; Retrieved: August 17, 2018.

[Gaskin 1991] Steven P. Gaskin, Abbie Griffin, John R. Hauser, Gerald M. Katz, and Robert L. Klein. “Voice of the Customer,” Marketing Science 12:1, 1-27, 1991.

[Gould 1996] Stephen Jay Gould. The Mismeasure of Man (Revised & Expanded edition). W. W. Norton & Company, 1996.

[Hamburger 1973] Henry Hamburger. “N-person Prisoner’s Dilemma,” Journal of Mathematical Sociology 3, 27–48, 1973. doi:10.1080/0022250X.1973.9989822

[Kahneman 1977] Daniel Kahneman and Amos Tversky. “Intuitive Prediction: Biases and Corrective Procedures,” Technical Report PTR-1042-7746, Defense Advanced Research Projects Agency, June 1977. Available: here; Retrieved: September 19, 2017.

[Kahneman 1979] Daniel Kahneman and Amos Tversky. “Intuitive Prediction: Biases and Corrective Procedures,” Management Science 12, 313-327, 1979.

[Kahneman 1984] Daniel Kahneman, Amos Tversky, and Michael S. Pallak. “Choices, values, and frames,” American Psychologist 39:4, 341-350, 1984. Available: here; Retrieved: August 8, 2017.

[Kahneman 2011] Daniel Kahneman. Thinking, Fast and Slow. New York: Macmillan, 2011.

[Kerth 2001] Norman L. Kerth. Project Retrospectives: A Handbook for Team Reviews. New York: Dorset House, 2001.

[Kruger 1999] Justin Kruger and David Dunning. “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments,” Journal of Personality and Social Psychology 77:6, 1121-1134, 1999.

[Levy 2009] David A. Levy. Tools of Critical Thinking: Metathoughts for Psychology (second edition). Long Grove, Illinois: Waveland Press, Inc., 2009.

[McConnell 2006] Steve McConnell. Software Estimation: Demystifying the Black Art. Microsoft Press, 2006.

[Meadows 1999] Donella H. Meadows. “Leverage Points: Places to Intervene in a System,” Hartland, VT: The Sustainability Institute, 1999. Available: here; Retrieved: June 2, 2018.

[Pronin 2002] Emily Pronin, Daniel Y. Lin, and Lee Ross. “The bias blind spot: Perceptions of bias in self versus others,” Personality and Social Psychology Bulletin 28:3, 369-381, 2002. Available: here; Retrieved: July 10, 2017.

[Thorndike 1920] Edward L. Thorndike. “A constant error in psychological ratings,” Journal of Applied Psychology 4:1, 25-29, 1920. doi:10.1037/h0071663
The first report of the halo effect. Thorndike found unexpected correlations between the ratings of various attributes of soldiers given by their commanding officers. Although the halo effect was thus defined only for rating personal attributes, it has since been observed in assessing the attributes of other entities, such as brands. Available: here; Retrieved: December 29, 2017.

[Tversky 1973] Amos Tversky and Daniel Kahneman. “Availability: A heuristic for judging frequency and probability,” Cognitive Psychology 5:2, 207-232, 1973. Available: here; Retrieved: August 9, 2018.

[Weinberg 1992] Gerald M. Weinberg. Quality Software Management Volume 1: Systems Thinking. New York: Dorset House, 1992.
This volume contains a description of the “diagram of effects” used to explain how obstacles can induce toxic conflict.

[Weinstein 1996] Neil D. Weinstein and William M. Klein. “Unrealistic Optimism: Present and Future,” Journal of Social and Clinical Psychology 15:1, 1-8, 1996. doi:10.1521/jscp.1996.15.1.1

[Whitehead 1948] Alfred North Whitehead. Science and the Modern World. New York: Pelican Mentor (MacMillan), 1948 [1925].

[Wight 2017] Philip Wight. “How the Alaska Pipeline Is Fueling the Push to Drill in the Arctic Refuge,” YaleE360, Yale School of Forestry & Environmental Studies, November 16, 2017. Available: here; Retrieved: February 8, 2018.

[Yen 2015] Terry Yen and Laura Singer. “Oil exploration in the U.S. Arctic continues despite current price environment,” Today in Energy blog, U.S. Energy Information Administration, June 12, 2015. Available: here; Retrieved: February 8, 2018.
