Learn how to be a wildly successful small business programmer

Is Uncle Bob serious?

Robert C. Martin (Uncle Bob) has been banging on the “software professionalism” drum for years and I’ve been nodding my head with every beat. As a profession, I believe software developers need to up their game big time. However, I’ve become concerned about Uncle Bob’s approach. I reached my breaking point the other day when I read his blog post titled Tools are not the Answer.

He took issue with a recent article in The Atlantic: The Coming Software Apocalypse. Let me see if I can summarize the theses of the two articles.

The Atlantic:

We are writing more and more software for safety-critical applications and the software has become so complex that programmers are unable to exhaustively test or comprehend all the possible inputs, states, and interactions that the software can experience. We are attempting to build systems that are beyond our ability to intellectually manage.

We need new ways of helping software developers write software that functions correctly (and is safe) in the face of all this complexity. The current methods of producing safety-critical software are especially dangerous to society because when software contains defects we can’t observe them the way we can observe that a tire is flat; they’re invisible.

Uncle Bob:

The cause:

  1. Too many programmer (sic) take sloppy short-cuts under schedule pressure.
  2. Too many other programmers think it’s fine, and provide cover.

And the obvious solution:

  1. Raise the level of software discipline and professionalism.
  2. Never make excuses for sloppy work.

Does Uncle Bob’s argument even pass the sniff test?

Safety-critical software systems, which are the topic of the Atlantic article, are held to shockingly high quality standards. The kind of requirements analysis, planning, design, coding, testing, documentation, verification, and regulatory compliance that goes into these systems is miles beyond what any normal organization would consider for an e-commerce website or mobile app, for example.

Read They Write the Right Stuff and tell me if you think Uncle Bob’s on the right track (note this article was written 21 years ago and the state-of-the-art has advanced significantly). Does it sound like the NASA programmers just need more discipline and professionalism coupled with never making excuses for sloppy work?

What does an expert in safety-critical systems from MIT have to say?

Dr. Nancy Leveson was quoted several times in the Atlantic article but Uncle Bob completely ignored those parts.

So let’s review an excerpt from one of her talks:

I’ve been doing this for thirty-six years. I’ve read hundreds of accident reports and many of them have software in them. And every someone (sic) that software was related, it was a requirements problem. It was not a coding problem. So that’s the first really important thing. Everybody’s working on coding and testing and they’re not working on the requirements, which is the problem. (emphasis added)

She can’t say it much clearer than that. Did I mention that she’s an expert? Did I mention that she works on all kinds of important projects, including classified government weapons programs?

How about Dr. John C. Knight?

In his paper Safety Critical Systems: Challenges and Directions, Dr. Knight describes many challenges of building safety-critical systems but developer discipline and professionalism are not among them. This is as close as he gets:

Development time and effort for safety-critical systems are so extreme with present technology that building the systems that will be demanded in the future will not be possible in many cases. Any new software technology in this field must address both the cost and time issues. The challenge here is daunting because a reduction of a few percent is not going to make much of an impact. Something like an order of magnitude is required.

Developing safety-critical systems is extremely slow, which adds to cost. But QA practices virtually ensure that the delivered software functions as specified in the requirements. Uncle Bob could possibly argue that some projects are slow because the developers on those projects are undisciplined and unprofessional. But a claim like that requires evidence, and Uncle Bob offers none.

Yes, tools are part of the answer (but not the whole answer)

My goodness, we need more and better tools. When I first started programming, all I had was a text editor with basic syntax highlighting. I used to FTP into the production server to upload my code and run it; I didn’t have a development environment.

Better tools have helped me become a better programmer

Later I moved to Eclipse and thought I was stupid for not doing this sooner. Eclipse caught all kinds of errors I missed with the basic text editor. It just highlighted them like a misspelled word in a word processor. Brilliant.

A couple of years later I adopted Subversion as my VCS and I thought I was stupid for not doing this sooner. I could see all the history for my project, I could make changes and revert them. It was awesome.

Ditto for:

  • code reviews/pull requests/Jira
  • advanced IDEs with integrated static analysis, automated refactoring tools, automatic code formatting, and unit tests that run at the push of a button
  • Git/Bitbucket/GitHub
  • TDD
  • property-based testing (QuickCheck family)
  • virtual machines
  • frameworks
  • open source libraries
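
Property-based testing, for instance, flips the usual approach: instead of hand-picking example inputs, you state a property that must hold for all inputs and let the tool hammer your code with generated ones. Here’s a minimal standard-library-only sketch of the idea (the function and property are my own illustrations; real tools like QuickCheck or Hypothesis also shrink failing inputs to a minimal counterexample):

```python
import random

def run_length_encode(s):
    """Encode a string as (char, count) pairs."""
    out = []
    for ch in s:
        if out and out[-1][0] == ch:
            out[-1] = (ch, out[-1][1] + 1)
        else:
            out.append((ch, 1))
    return out

def run_length_decode(pairs):
    return "".join(ch * n for ch, n in pairs)

def property_holds(s):
    # The property: decoding an encoding always returns the original input.
    return run_length_decode(run_length_encode(s)) == s

# Generate many random inputs instead of hand-picking a few test cases.
random.seed(0)
for _ in range(1000):
    s = "".join(random.choice("ab") for _ in range(random.randrange(20)))
    assert property_holds(s), f"property failed for {s!r}"
```

One property plus a generator often covers more of the input space than dozens of hand-written example tests.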

It’s been nearly twenty years since I started programming and my tools have changed significantly in that time. I can only imagine how the tools that become available in the next twenty years will change how we write and deliver code.

Let’s look at some possibilities.

Better static analyzers

My static analyzers still don’t understand my code and can only pick up simple mistakes. They flag tons of false positives. They can be slow on large code bases. And I’d love to have one static analyzer that did everything I wanted instead of four or five. Writing custom rules is also time consuming. There’s plenty of room for improvement there.
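
To give a feel for why custom rules are tedious, here’s a toy analyzer written with Python’s `ast` module. It catches exactly one shallow mistake (comparing to `None` with `==` instead of `is`), and even that takes real code; understanding deeper properties of a program is far harder:

```python
import ast

CODE = """
def find(users, name):
    for u in users:
        if u == None:      # should be 'u is None'
            continue
        if u.name == name:
            return u
    return None
"""

class NoneComparisonChecker(ast.NodeVisitor):
    """Flag 'x == None' / 'x != None', which should use 'is' / 'is not'."""
    def __init__(self):
        self.warnings = []

    def visit_Compare(self, node):
        for op, comparator in zip(node.ops, node.comparators):
            if isinstance(op, (ast.Eq, ast.NotEq)) and \
               isinstance(comparator, ast.Constant) and comparator.value is None:
                self.warnings.append(
                    f"line {node.lineno}: comparison to None should use 'is'/'is not'")
        self.generic_visit(node)

checker = NoneComparisonChecker()
checker.visit(ast.parse(CODE))
for w in checker.warnings:
    print(w)
```

Multiply that effort by every rule you want, and it’s easy to see why we end up juggling several off-the-shelf analyzers instead.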

Correct by construction techniques

Then there are “correct by construction” techniques. I watched this video. He had me at “a provable absence of runtime errors”. So I got a book on SPARK (a subset of Ada) and started learning. Wow, you might be able to write highly reliable and correct software in SPARK, but it’s going to be a slow process (aka expensive).

Is this the future? I don’t know, but if it were easier to program in SPARK it might have a better chance in safety-critical software circles. It would also be interesting if someone developed formal method capabilities for my favorite programming language that were accurate and easy to use. “No need to write tests for this module, the prover says it’s mathematically sound,” yes please.
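
The core idea in SPARK is that subprograms carry contracts (preconditions and postconditions) that a prover checks statically, for every possible input. As a rough analogue, here’s the same idea in Python with runtime assertions (the function and limits are my own illustration; note the crucial difference that assertions only catch violations on inputs you actually run, while SPARK proves them ahead of time):

```python
def saturating_add(a: int, b: int, limit: int) -> int:
    """Add a and b, clamping the result to limit."""
    # Precondition (a SPARK 'Pre' aspect would be proven statically):
    assert a >= 0 and b >= 0 and limit >= 0
    result = min(a + b, limit)
    # Postcondition (a SPARK 'Post' aspect):
    assert 0 <= result <= limit
    return result

print(saturating_add(200, 100, 255))  # clamps to 255
```

Getting a prover to discharge contracts like these for nontrivial code is exactly the slow, expensive part I ran into.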

October 23, 2020 update:
I recently programmed a sumobot in Ada/SPARK to help me get a feel for the languages. And I think you’ll either love Ada/SPARK or hate it. If you love the speed and flexibility of python, you’ll probably hate Ada/SPARK. But if you care about low defect rates and high quality, you’ll love the features of Ada/SPARK that enable you to achieve those goals.

Software to track each requirement to the code that implements it and the tests that prove that it was implemented correctly

I watched a video where the presenter was talking about the difficulty her team has with tracking thousands of requirements to specific code and test cases and back for regulatory compliance purposes in safety-critical systems. The task became much more difficult as they tried to keep everything in sync while the requirements, tests, and code changed as the project progressed. That team and every team like them needs better tools. And, eventually, I’d love to see that kind of thing built into the IDE for my favorite programming language, if it was easy to use.

Formal specification languages/model checkers

Then there are formal specification languages to consider. The Atlantic article mentions TLA+ but there are others. Now imagine that these languages were easy to use. Imagine that you had a tool that could help you construct a formal specification iteratively, coaching you along to make sure you covered every case. And when you were done, you could get it to generate some or all of the code for you. Plus, if you got stuck you could just find the answer on StackOverflow. Cool? Hell, yes!
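
What a model checker buys you is exhaustive exploration: it enumerates every reachable state of your model and checks a property in each one, instead of sampling a few executions like a test does. Here’s a tiny explicit-state checker in Python (my own sketch, not TLA+) that finds the classic bug in a check-then-set lock, where both processes can end up in the critical section:

```python
from collections import deque

# Each process: 0 = idle, 1 = saw the lock free, 2 = in critical section.
# The flaw: checking the lock and taking it are two separate steps.
def update(state, i, pc):
    pcs, lock = state
    new_pcs = list(pcs)
    new_pcs[i] = pc
    return (tuple(new_pcs), lock)

def successors(state):
    pcs, lock = state
    for i in range(2):
        pc = pcs[i]
        if pc == 0 and not lock:                    # check the lock
            yield update(state, i, 1)
        elif pc == 1:                               # take lock, enter CS
            yield (update(state, i, 2)[0], True)
        elif pc == 2:                               # leave CS, release lock
            yield (update(state, i, 0)[0], False)

def check(init, unsafe):
    """Breadth-first search of the entire state space for an unsafe state."""
    seen, frontier = {init}, deque([init])
    while frontier:
        s = frontier.popleft()
        if unsafe(s):
            return s                                # counterexample found
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return None                                     # property holds everywhere

# Unsafe: both processes in the critical section at once.
bad = check(((0, 0), False), lambda s: s[0] == (2, 2))
print("counterexample state:", bad)
```

Real model checkers like TLC do essentially this, plus clever state-space reductions, over specs with astronomically more states. The hard part is writing the model; tooling that coached you through it would be a big deal.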

And more…

I’m sure we can brainstorm dozens of new or improved tools in the comments that would help us write better, more correct code at a lower cost.

Why increased discipline and professionalism are not the answer

The fundamental problem is that even the brightest among us don’t have the intellectual capacity to understand and reason about all the things that could happen in the complex interacting systems we are trying to build. It’s not an issue of discipline or professionalism. These systems can exhibit emergent behavior, or behave “correctly” but in ways their designers never foresaw.

That’s why Dr. Leveson’s book is so important. Instead of trying to figure out all those states and behaviors we “just” have to specify the states and behaviors that are not safe and prevent the software from getting into those states. Well, it’s more complicated than that but that’s a part of it.
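
In code, that inversion looks like a gatekeeper between the controller and the actuator: rather than enumerating every safe behavior, you enumerate the unsafe states and refuse any command that would enter one. A hypothetical sketch (the device, names, and limits here are all illustrative, not from any real system or from Leveson’s book):

```python
# Hypothetical infusion-pump sketch: list the unsafe states and block
# any command that would reach one, whatever the controller asked for.
MAX_SINGLE_DOSE = 10.0    # units (illustrative limit)
MAX_HOURLY_TOTAL = 25.0   # units (illustrative limit)

def is_unsafe(dose, delivered_last_hour):
    """True if delivering this dose would put the system in an unsafe state."""
    return (dose < 0
            or dose > MAX_SINGLE_DOSE
            or delivered_last_hour + dose > MAX_HOURLY_TOTAL)

def command_dose(dose, delivered_last_hour):
    """Gatekeeper between the controller logic and the hardware."""
    if is_unsafe(dose, delivered_last_hour):
        return "REJECTED"   # hold the last safe state, raise an alarm, etc.
    return "DELIVER"

print(command_dose(5.0, 10.0))   # within limits
print(command_dose(5.0, 22.0))   # would exceed the hourly total
```

The safe-state set is usually much smaller and easier to specify than the set of all correct behaviors, which is the leverage Leveson’s approach offers.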

Conclusion

I’m all for increasing software professionalism and discipline but Uncle Bob’s wrong about how to prevent “The Coming Software Apocalypse” in safety-critical software systems. Experts in the field don’t rank programmer professionalism and discipline anywhere near the top of their priorities for preventing losses.

More programmer discipline and professionalism can’t hurt but we also need ways of taming complexity, better tools, ways to increase our productivity, ways to reason about emergent behavior, research on what actually works for developing safety-critical software systems, new and better techniques for all aspects of the software development process, especially better ways of getting the requirements right, and so much more.

I know there are tons of programmers churning out low-quality code. But organizations building safety-critical systems have processes in place to prevent the vast majority of that code from making it into their systems. So if the software apocalypse comes to pass you can be pretty sure it won’t be because some programmer thought he could take a short-cut and get away with it.

What do you think? Agree or disagree? I’d love to hear your thoughts.

Additional resources

Blog post: Safety-Critical Software: 15 things every developer should know

Here’s a video of Uncle Bob’s software professionalism talk: https://youtu.be/BSaAMQVq01E

Nancy Leveson’s book Engineering a Safer World is so important that she released it in its entirety for free: https://www.dropbox.com/s/dwl3782mc6fcjih/8179.pdf?dl=0

Excellent video on safety-critical systems: https://youtu.be/E0igfLcilSk

Excellent video on “correct by construction” techniques: https://youtu.be/03mUs5NlT6U

9 Comments

  1. Btara

    Somehow got here when was searching for Uncle Bob. I’ve read his response to the Atlantic article and I admit that I didn’t read the Atlantic article he was responding to. It’s nice to have a counter opinion, so thank you for making this article.

    After reading your response I would say I’m still in favor of Uncle Bob’s viewpoint in the end, though I see how better tools lead to better development and the ability to produce more complex work. But the reason why I incline more to Uncle Bob’s viewpoint is because in the end the producers of code are us, humans. In the end the ability to maximize the tools for better coding depends on our ability as humans to code, abstract, design, and think about problems. The betterment of our ability to develop, whether through professionalism, discipline, better education etc. IS part of the solution. It has to be, so long as we humans are the ones creating these complex systems

    One thing I want to take note of is that you said “Experts in the field don’t rank programmer professionalism and discipline anywhere near the top of their priorities for preventing losses” and I think that is what Uncle Bob is claiming: not enough people care about professionalism and discipline, with the premise that those elements are central to the betterment of the industry. At least where I am working I clearly see this trend, from higher ups all the way down to the developers themselves. There is so much mention of tools, techniques, technological advancements, AI, blockchain etc. but so little talk about the qualities and discipline of developers themselves

    Again, thank you for the post. Was surely thought provoking

    • Blaine Osepchuk

      Thanks for reading and taking the time to reply. I’m so glad you found it thought provoking.

      Don’t get me wrong, I’m strongly in favor of increasing the level of professionalism in our industry. I wrote about it twice recently:

      https://smallbusinessprogramming.com/great-power-comes-great-responsibility/
      https://smallbusinessprogramming.com/software-security-is-hopelessly-broken/

      But you really need to watch the Leveson video (https://youtu.be/WBktiCyPLo4?t=21m20s). She’s an expert (maybe THE expert in safety critical systems) and she’s saying professionalism isn’t the problem that’s causing losses. There are enough checks and balances in place in safety critical software development to compensate for unprofessional programming practices.

      So, while I’m sure safety critical software development could be a lot cheaper if programmers behaved more professionally, the danger is coming from the complexity of these systems. These systems have so many possible states when they interact with other systems and/or the environment that we are failing to recognize dangerous potential states and make sure they are handled correctly. Each one of these unanticipated states is basically a missing requirement.

      Cheers.

      • Btara

        No problem

        We can both agree, I think, that professionalism and good tooling are important factors towards a safer product/service. Where we differ I think is how we weight their importance relative to each other.

        I agree that a lot of products and services these days are very complex. Not even just in safety critical systems but in everyday products we use like social media, or some website that integrates with a bunch of other websites. Inherently I think the complexity arises from the desire to create better products, with more features and more conveniences and higher goals than before.

        I am sure great tools will help in ensuring a safer product (in various senses of safe including social, economical, political, environmental, physical, mental etc.), but I would hold to the belief that only with the discipline and desire to tread carefully around complexities (obvious or hidden) will we be ready to serve the complex needs of society.

        Thank you for your reply and the links you shared. I’ll bookmark and check them out.

        Cheers

      • Daganev

        If the main problem is “Requirements”, then you need more discipline and professionalism to notice and find the errors in those requirements / dangerous states.

        You need the “brainless rituals” and disciplines to become second nature so that you have the mental time and energy to even notice the potentially bad states, and then prevent them from happening.

        • Blaine Osepchuk

          Yes, I agree. Changing focus from the small to the large repeatedly is one of the things that makes programming difficult.

          On the other hand, even devices with seemingly simple requirements in physical computing systems can get absurdly complex when you try to cover all possible edge cases.

          Take a look at this talk on medical device security (https://youtu.be/bA3xCpYLA34). Kevin Fu goes into detail on some of the challenges of making pacemakers secure.

          Then start to think of edge cases he didn’t mention. Gunshot wound? Physical damage to the device from a car accident? EMP pulse? What happens if two programmer devices try to program the pacemaker at the same time? Does the device have error correcting RAM to deal with bit flips from stray cosmic rays? Does the device protect against doctor error by preventing unsafe settings? What happens if the programming device malfunctions during an upgrade to the pacemaker? Will it be bricked? What happens if you have one of these devices in your chest and someone tried to use an AED on you? Could a sensor malfunction trick the device into giving you a fatal shock?

          Thanks for your comment.

  2. Joe

    You might be misunderstanding Uncle Bob a bit. Yes, he thinks that professionalism and discipline are important, but the reason is not only to decrease the chances of something very bad happening. It doesn’t really matter what we do, sooner or later something really bad will happen. But when it does, it is not the fault of the profession as a whole but rather mistakes of an individual/team/company, because the profession had guidelines and procedures in place.

    • Blaine Osepchuk

      Interesting point of view. Can you direct me to the parts of the article where Uncle Bob says that? I’m not seeing it.

      • David Peer

        You can see this viewpoint in his lectures on the Clean Coders site.

        • Blaine Osepchuk

          I haven’t watched his clean coder videos but I wouldn’t be surprised by that. He’s quite passionate.

          Thanks for your comment, David
