
How All This Applies to Software Development

When will they ever learn?
When will they ever learn?
          from "Where Have All the Flowers Gone?" by Pete Seeger

Somehow, in the early days, software development/programming was considered an art form.  There were no trained programmers, because the entire field was of such recent origin.  So programmers came from all kinds of backgrounds, including English, history and other liberal arts specialties. Ada Lovelace, often cited as the first programmer, was a mathematician.  Additionally, programming for early computers mostly involved fairly small programs that one person could understand thoroughly.  So it is no wonder that, as programs grew to gigantic size and got far beyond the capability of one person (or even a few persons) to create and understand, projects for developing large programs ran into serious problems.  Cleverness in programming was admired, but it was then found that when the creating programmer left, the code left behind was inscrutable to those who had to maintain it.  I even remember a case where the rapid conversion of numbers from binary to binary-coded-decimal (BCD) was accomplished by making use of the floating point hardware in the machine!  Clever, but impossible to fathom for anyone other than the original programmer -- unless extensive and detailed documentation was also provided, which was rarely the case.

So it was recognized that maintainable, large programs required using straightforward, common procedures and needed to be thoroughly documented.  A disciplined approach to the entire project was needed.  A number of philosophical approaches to accomplishing this were proposed and explored.  But no universal approach was successful and, somehow, all this discipline was incompatible with the "art form" approach of many programmers, who just couldn't bring themselves to stick to the standards and requirements of a common approach.  This was recognized decades ago.

Then why are more recent software projects still getting into classic trouble?  It still seems to be happening, with large projects from fighter jets to financial systems being held up by incomplete or error-prone software.  This problem should have been eliminated, or at least controlled, long ago.  Some of it comes from the inherent culture clash between the artful, young programmers and the disciplined approach large projects require.  And some of it comes from finding that the disciplined approach is much slower and more expensive to carry out.  So there are natural pressures to take short-cuts.

So what can be done about all this?  Clearly, there are organizations that know how to do it, because lots of good software products have been completed.  But I have the distinct impression that many organizations still flounder about in classic, historical fashion.

I don't have a magical solution.  But after my years of experience and the many scars on my back from software project difficulties, I have a few thoughts to pass on.

First, consider the analogous process for creating a mechanical and/or electronic product (see the section on Development and Termination).  Aside from some exploratory models, the products are first designed and then constructed, including the prototype, from the documented design -- from the design prints or their computer equivalent.  That is, they are "built to print".  Documentation of the design is never done after the product has been built (other than some corrections).  With software, it is tempting, and often the actuality, to do the documentation after the product has been "built".  As a result, the documentation is often incomplete or incorrect (programmers hate to document).

So it is tempting to prescribe that software first be documented -- that is, that the user's manual and the documentation of the implementation be written first, and the actual code then created from the documentation, perhaps even by a different group than the one that did the design!  A number of such procedures have been proposed over the years, but I don't know whether they have been widely and successfully adopted.  The philosophy appeals to me, but I can also see real difficulties in carrying it out completely.  Still, moving in this direction seems right to me.

Two other crimes that plague software projects, in common with non-software projects, need to be mentioned.  One is the common problem of the product's specifications/requirements changing during the design.  This plagues even the Department of Defense, though perhaps that environment cannot freeze product specs for that long in a changing world.  But it still seems that a more rigid approach is needed.  The software world, however, makes this crime particularly serious, because software is the only field of technology I am aware of where elegance really matters.  Yes, an elegant software architecture should allow easier adjustment to changes in requirements, but a truly elegant architecture can only be achieved when the purpose of the project is clear and constant.  Elegance matters in software precisely because of the inscrutability of the product itself: a logical, clean design permits understanding and makes changes and improvements modular and controlled.

The other crime is not having the basic tools and technology soundly completed and proven before starting to use them.  Attempting to overlap these phases in the name of economy and schedule is a death warrant for the project.  It really should be considered as "research/exploration" until the tools are finalized, proven and frozen.   You cannot build a (necessarily) incomplete product with incomplete tools.  But people still try...

I am a great believer in first creating a quick run-through of a model of the software product, using the actual planned tools.  The result should have one example of each aspect of the project to show that it all can be done.  Designing hundreds of other screens, forms and reports can be done later, once you are sure you can do one of each kind in every aspect of the project.  Then you know you are on firm ground and can plan the effort remaining in the project.

This is a fine example of the difference between egg-laying and ditch-digging.  In egg-laying, assigning more hens won't help speed things up.  But when you get to ditch-digging, adding more resources is a predictable and productive option.  So it is with software.  Do everything you can to get beyond the egg-laying phase of a project so you can feel better about predicting the remaining work and can have alternatives to use to control the outcome.

All of this requires a kind of discipline that has been long accepted in, say, mechanical engineering, but which goes against the grain in software.  It does take some of the fun out of it, but it pays in the long run with more predictable projects and more controlled costs.