
I Already Know Where the Problems Are in My Code

by Konstantin Sokolov, June 2nd, 2021

Too Long; Didn't Read

Incompetence, hubris and opportunism are commonplace in software development. We constantly produce opinions that we defend “by definition” against everything and everyone. The insidious thing about opinions is that, contrary to popular belief, they are almost immune to facts. In software development, one very often gets away with mistakes at first and can far too easily develop a false sense of security. And yet developers still consider themselves infallible experts who know better than everyone else.

If you’re in the software development industry, you’ve probably heard this phrase more than once: from colleagues, suppliers, customers or business partners. But most often, of course, from yourself. You have said it out loud, consciously thought it or implicitly assumed it. Why? Because it is what you truly believe. Anything else would be a personal disaster, wouldn’t it? And while I am 100% convinced that I am unfamiliar with the majority of the problems in my code, I too often catch myself thinking the opposite.

We Are Never Right

Recently I read a highly recommended article in which Reto U. Schneider summarizes scientific findings from psychology and neurobiology that outline our difficult relationship with our own opinions. In short: our brains always strive to hold on to existing opinions at all costs — regardless of the relationship of those opinions to reality.
“Opinions are often born not from facts, analysis and intelligence, but from incompetence, hubris and opportunism.”
— Reto U. Schneider

While reading the article, I couldn’t stop thinking about software development. Probably in part because I’ve been programming since I was 13 and think about this stuff way too often anyway. But also because writing code has a lot to do with opinions.

Why? Well, a piece of code is ultimately just our opinion about the best way to implement a requirement. For almost any problem, there are simply too many possible solutions to objectively settle on a single correct one. And so we as software developers constantly produce opinions that we defend “by definition” against everything and everyone (that’s just the way our brains want it).

So do opinions only arise from the variety of solutions?

Not at all! Incompetence, hubris and opportunism are also commonplace in software development.

Or don’t we copy-paste half-baked snippets from Stack Overflow just because they have the most upvotes? Decide by gut feeling where test coverage is worthwhile and where it is not? And sometimes adopt the latest technology (after 2–3 YouTube videos) that we don’t quite understand, just because it’s “in” at the moment? Or is it always just the others who do that?

The insidious thing about opinions is that, contrary to popular belief, they are almost immune to facts. We identify so strongly with our opinions that we perceive a change of opinion as a betrayal of our present and former selves.

“We don’t have opinions, we are our opinions”
— Reto U. Schneider

The brain automatically tries to avoid this unpleasant emotional state (also called cognitive dissonance) by rearranging the world as it suits itself. These automatisms lead to systematic errors (so-called cognitive biases) in our perception of reality, to which we are all subject. One of the best known is confirmation bias — the tendency to selectively choose information so that it confirms our opinions (Wikipedia lists many more).

An Example From Software Development Please?

Hand on heart: how much time has to pass before you can admit to yourself completely unreservedly and without shame that a piece of code you wrote was total crap? For me it is 1–2 years. Before that, I will be convinced that there were valid reasons to do it that way and not otherwise. Don’t get me wrong, it’s not about deliberate lying. We just can’t think outside of the box of what we believe to be true — it’s the only reality accessible to us.

The same may apply to other professions as well. In software development, however, one very often gets away with it at first and can far too easily develop a false sense of security. This only became clear to me towards the end of my studies, when a transistor blew up while I was soldering a circuit. It banged loudly and, worst of all, I had to go to the store and buy a new one. When developing software, on the other hand, we can click “compile” and “run” as many times as we like and bend things until they somehow work — it’s just SOFT-ware. Without that real fear breathing down our necks, the motivation for mental effort decreases, and the brain takes advantage of this whenever it can: thinking burns calories, and brains love to save resources.

Actually, Developers Should Have Despaired Long Ago

In view of the number of mistakes and wrong decisions they make. Anyone still harboring illusions about this should take a look at their bug tracker. What other industry keeps such careful lists of its own failures? And despite all this, we all consider ourselves infallible experts who know better than everyone else.

“Subjective confidence in a judgment is not a reasoned evaluation of the probability that this judgment is correct.”
— Daniel Kahneman.

This plausible truth is actually an insurmountable hurdle for us: everything we do, we do out of confidence. Because one thing is clear: we do not write sub-optimal code intentionally. We are always convinced of its correctness. Most of the time, we are doomed to trust our intuitions. We are our intuitions. That’s how we cope with life. Otherwise, we would go crazy from constant doubt.

As Soon as It Comes to Ourselves, We Talk Bullshit

The number of memes about everything that goes wrong in software development is mind-boggling. And so one might be inclined to believe that the industry would be more sensitive to its own error-proneness. But far from it — apparently developers are only human. We willingly and quite rationally accept and acknowledge the problems of our industry, our company, our colleagues and our closest friends.

However, as soon as it comes to ourselves, we start talking bullshit.

The belief stated in the title is the fundamental problem behind many misguided developments in software engineering. Another one that we also encounter in one form or another in customer projects and conversations is the following:
“Analysis tools only confirm what I know anyway, the rest are false positives”

Well, it is perfectly understandable to believe such a thing about your own code. It is the result of another cognitive bias that Daniel Kahneman calls the WYSIATI (What You See Is All There Is) principle:

“You cannot help dealing with the limited information you have as if it were all there is to know. You build the best possible story from the information available to you, and if it is a good story, you believe it. Paradoxically, it is easier to construct a coherent story when you know little […]. Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.”
— Daniel Kahneman.

For the most part, we only know about the problematic code locations that we have either deliberately created ourselves (as technical debt) or where errors that fall under our purview occur more frequently. That is exactly the limited information that is available to us.

If analysis tools raise red flags in exactly those places, we are happy about the confirmation (confirmation bias). The rest we meet with skepticism and rejection (disconfirmation bias — our tendency to scrutinize contradicting information particularly critically and to find reasons against it).

In Fact, Such a Belief Is Always Wrong!

First, because errors are only one of many indicators of inadequacy. Code locations that have not yet caused any errors but eat up a lot of development time because they are difficult to understand, for example, are also highly problematic (and they escape our attention more often because they lack the salience of an error that has already occurred). Second, because we know nothing about what we don’t know. In other words, the absence of knowledge about problems is not an indicator of the absence of the problems themselves. Such knowledge, however, hardly helps us to change our beliefs.
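To make the first point more tangible, here is a minimal sketch of a data-based heuristic (not the author’s tooling and not any specific product) that ranks files by how often they have changed, using nothing but the project’s git history. Frequently changed files are candidates for those “silently expensive” locations, even if they have never shown up in a bug tracker. The `git log --numstat` invocation and the ranking logic are illustrative assumptions, not something described in this article.

```python
# Rough hotspot sketch: rank files by change frequency in the git history.
# Assumes it is run inside a git repository; files that change most often
# are candidates for "silently expensive" code, even without any bug reports.
import subprocess
from collections import Counter

def change_frequencies(repo_path: str = ".") -> Counter:
    """Count how many commits touched each file, via `git log --numstat`."""
    log = subprocess.run(
        ["git", "log", "--numstat", "--format="],
        cwd=repo_path, capture_output=True, text=True, check=True,
    ).stdout
    counts: Counter = Counter()
    for line in log.splitlines():
        parts = line.split("\t")
        if len(parts) == 3:  # numstat lines look like "<added>\t<deleted>\t<path>"
            counts[parts[2]] += 1
    return counts

if __name__ == "__main__":
    for path, changes in change_frequencies().most_common(10):
        print(f"{changes:5d} changes  {path}")
```

The ranking is of course only a crude proxy; the point is simply that even a trivial, data-driven view can surface locations that our gut feeling would never nominate.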

Such Self-Overestimations Sometimes Take On Adventurous Proportions

Some time ago, we got into a conversation with a technical manager at the TNW conference. He listened to what we had to say about data-based software analytics, nodded, and described the approaches as interesting. At least for the future, because at the moment, he said, he still had quite a good overview of his software. Bear in mind: the good man was responsible for projects in more than 60 repositories.

Does the Relationship Between Developers and Managers Make Realistic Reflection Difficult?

Recognizing one’s own fallibility is in itself a very difficult undertaking. But could it be that the way of working in software projects additionally exacerbates this inherent problem?

That managers, on the one hand, presuppose a first-time-right principle to an extent that is unjustified for software, so that developers are forced to hide their wrong decisions for fear of being put at a disadvantage? And that developers, on the other hand (also unjustifiably), tacitly deny managers any realistic understanding of how volatile architectural decisions are, and therefore see feigned infallibility as the only legitimate survival strategy?

Our project experience, as well as conversations with managers and developers, suggests that, unfortunately, both are true: the relationship is indeed suboptimal in both directions, and more often than necessary characterized by mutual recriminations.

However, this topic is too extensive to be covered in just one section and deserves a separate post.

What Helps?

If you believe the research: unfortunately, not much. We have hardly any opportunities to escape our biases. Most of the time, we use our minds to explain away the contradictions to our beliefs instead of learning from them.

Nonetheless, I don’t think we are completely without a chance. There are a few knobs we can turn.

1. We Should Keep Reminding Ourselves How Our Brains Work

That’s our only chance to trick them, every now and then, into doing what we want them to do, and not the other way around. Here are two book recommendations that have nothing to do with software: Daniel Kahneman’s “Thinking, Fast and Slow” and Robert Sapolsky’s “Behave”.

2. Say Goodbye to the Idea That Developers Are Artists

Unfortunately, this myth is still rehashed far too often. We are no more or less artists than tile setters and insurance agents. Any activity can be performed with a level of excellence that borders on art. Software development too — but it is no exception among other professions in this regard.

No, we are not artists, we are committed to “software engineering”. And as engineers, we have to learn to be less arbitrary and more disciplined.

This also means less trust in gut feeling and more openness to data-based findings that could also disprove our opinions. That’s the contribution we’re trying to make at Cape of Good Code. And indeed, Daniel Kahneman shows in his book that the simplest algorithms often provide better judgments than experts.
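Kahneman’s claim refers to a well-documented finding: crude, equal-weight formulas applied consistently tend to match or beat expert intuition on many judgment tasks. As a hedged illustration only (the indicators, numbers and module names below are invented for this sketch and come neither from the article nor from any specific tool), such a “simplest possible algorithm” for deciding which modules deserve a closer look could be as small as this:

```python
# Equal-weight scoring sketch: standardize each indicator, then just sum them.
# All indicators and numbers are invented for illustration only.
from statistics import mean, pstdev

modules = {
    "billing":   {"changes_last_year": 120, "lines_of_code": 4800, "authors": 9},
    "reporting": {"changes_last_year": 15,  "lines_of_code": 2100, "authors": 2},
    "auth":      {"changes_last_year": 60,  "lines_of_code": 900,  "authors": 5},
}

def equal_weight_scores(data: dict) -> dict:
    """Score each module as the sum of z-scores over all indicators."""
    indicators = next(iter(data.values())).keys()
    scores = {name: 0.0 for name in data}
    for ind in indicators:
        values = [row[ind] for row in data.values()]
        mu, sigma = mean(values), pstdev(values) or 1.0
        for name, row in data.items():
            scores[name] += (row[ind] - mu) / sigma
    return scores

if __name__ == "__main__":
    ranked = sorted(equal_weight_scores(modules).items(), key=lambda kv: -kv[1])
    for name, score in ranked:
        print(f"{name:10s} {score:+.2f}")
```

There is no claim that these particular indicators are the right ones; the point, as in Kahneman’s examples, is that a dumb but consistent formula is not swayed by mood, recency or attachment to one’s own code.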

3. And Most Importantly, We Should Not Succumb to the Illusion That a Finding Is Only Worth Something if It Is 100% Correct

Small improvements make a big difference. Being a little more right is better than groping completely in the dark. Robert Sapolsky writes about this in the last chapter of his 800-page book:

“On any big, important issue it seems like 51 percent of the scientific studies conclude one thing, and 49 percent conclude the opposite. And so on. Eventually it can seem hopeless that you can actually fix something, can make things better. But we have no choice but to try.”

And there is actually nothing to add to this.

This article first appeared on the Cape of Good Code blog.


