Interesting terms about software delivery

It is enlightening to find out that several things I struggled with at different times in my professional life have been experienced by many before me. Some smart people even coined terms for these phenomena. Although the terms are somewhat negative, the problems won’t go away simply by not being mentioned, so both technical staff and project management professionals should be wary of them. Some came out of software project management, and some grew out of behavioural economics. Here are the terms, mostly excerpted from Wikipedia:

Death march project – a project that the participants feel is destined to fail, or that requires a stretch of unsustainable overwork. The general feel of the project reflects that of an actual death march because project members are forced by their superiors to continue the project against the members’ better judgement. Death marches of the destined-to-fail type usually are the result of unrealistic or overly optimistic expectations in scheduling, feature scope, or both, and often include a lack of appropriate documentation or relevant training and outside expertise that would be needed to accomplish the task successfully. Often the death march will involve desperate attempts to right the course of the project by asking team members to work especially grueling hours, or by attempting to “throw (enough) bodies at the problem”, often causing burnout.

Software Peter Principle – in software engineering, the software Peter principle describes a dying project that has become too complex to be understood even by its own developers. It is well known in the industry as a silent killer of projects, but by the time the symptoms arise it is often too late to do anything about it. Good managers can avoid this disaster by establishing clear coding practices in which unnecessarily complicated code and design are avoided. The term is derived from the Peter principle – a theory about incompetence in hierarchical organizations. There are three main causes:

  • Loss of conceptual integrity: The conceptual integrity of software is a measure of how well it conforms to a single, simple set of design principles. When done properly, it provides the most functionality using the simplest idioms. It makes software easier to use by making it simple to create and learn. Conceptual integrity is achieved when the software design proceeds from a small number of agreeing individuals. For software to maintain conceptual integrity, the design must be controlled by a single, small group of people who understand the code in depth. In projects without a strong architecture team, the task of design is often combined with the task of implementation and is implicitly delegated among the individual software developers. Under these circumstances, developers are less likely to sacrifice personal interest in favour of the interests of the product. The complexity of the product grows as a result of developers adding new designs and altering earlier ones to reflect changes in fashion and individual taste.
  • Programmer incompetence: the best developers understand that code communicates with people as much as with the machine, and know which details need to be communicated to others; incompetent developers do not.
  • Programmer inexperience: programmers sometimes make implementation choices that work but have unintended negative consequences. Over time, many such implementation choices degrade the software’s design, making it increasingly difficult to understand.

Brooks’ law – an observation about software project management according to which “adding human resources to a late software project makes it later”. It was coined by Fred Brooks, according to whom there is an incremental person who, when added to a project, makes it take more, not less, time. This is similar to the general law of diminishing returns in economics. Brooks admits the law is an outrageous oversimplification, but it captures the general rule. Brooks points to the main factors:

  • It takes some time for the people added to a project to become productive. Brooks calls this the “ramp up” time. Software projects are complex engineering endeavours, and new workers on the project must first become educated about the work that has preceded them; this education requires diverting resources already working on the project, temporarily diminishing their productivity while the new workers are not yet contributing meaningfully. New workers may even make negative contributions, for example, if they introduce bugs that move the project further from completion.
  • Communication overhead increases as the number of people increases. Due to combinatorial explosion, the number of distinct communication channels grows rapidly with the number of people. Everyone working on the same task needs to keep in sync, so as more people are added they spend more time trying to find out what everyone else is doing.
  • Adding more people to a highly divisible task, such as cleaning rooms in a hotel, decreases the overall task duration (up to the point where additional workers get in each other’s way). However, other tasks, including many specialties in software projects, are less divisible; Brooks points out this limited divisibility with another example: while it takes one woman nine months to make one baby, nine women can’t make a baby in one month.
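The communication-overhead point can be made concrete: a fully connected team of n people has n(n−1)/2 pairwise channels, so channels grow quadratically while headcount grows only linearly. A minimal sketch (the function name is mine, not Brooks’):

```python
def communication_channels(n: int) -> int:
    """Pairwise communication channels in a fully connected team of n people."""
    return n * (n - 1) // 2

# Headcount grows linearly, channels quadratically:
for team_size in (3, 5, 10, 20):
    print(f"{team_size:>2} people -> {communication_channels(team_size)} channels")
```

Doubling a 10-person team to 20 takes the channel count from 45 to 190 – roughly quadrupling it – which is the quadratic growth behind Brooks’ observation.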

Escalation of commitment – a human behaviour pattern in which an individual or group facing increasingly negative outcomes from a decision, action, or investment, nevertheless continues the behaviour instead of altering course. The actor maintains behaviours that are irrational, but align with previous decisions and actions.

Cowboy coding – a development process in which autonomous developers, in control of the schedule, algorithms, frameworks and coding style, work with minimal process or discipline. It usually occurs when there is little participation by business users, or is fanned by management that controls only non-development aspects of the project, such as the broad targets, timelines, scope and visuals (the “what”, but not the “how”).

Conway’s law – organizations design systems which mirror their own communication structure. The law is based on the reasoning that in order for a software module to function, multiple authors must communicate frequently with each other. Therefore, the software interface structure of a system will reflect the social boundaries of the organization(s) that produced it, across which communication is more difficult.

Murphy’s law – Anything that can go wrong will go wrong.

KISS principle – “Keep it simple, stupid”, as a design principle, it states that most systems work best if they are kept simple rather than made complicated; therefore, simplicity should be a key goal in design, and unnecessary complexity should be avoided.

Boondoggle – a project that is considered a waste of both time and money, yet is often continued due to extraneous policy or political motivations.

Optimism bias – a cognitive bias that causes someone to believe that they themselves are less likely to experience a negative event.

Planning fallacy – a phenomenon in which predictions about how much time will be needed to complete a future task display an optimism bias and underestimate the time needed. This phenomenon sometimes occurs regardless of the individual’s knowledge that past tasks of a similar nature have taken longer to complete than generally planned. The bias only affects predictions about one’s own tasks. When outside observers predict task completion times, they show a pessimistic bias, overestimating the time needed.

Separation of concerns – a design principle for separating a computer program into distinct sections such that each section addresses a separate concern. A concern is a set of information that affects the code of a computer program. A concern can be as general as “the details of the hardware for an application”, or as specific as “the name of which class to instantiate”. A program that embodies SoC well is called a modular program. Modularity, and hence separation of concerns, is achieved by encapsulating information inside a section of code that has a well-defined interface. Encapsulation is a means of information hiding. Layered designs in information systems are another embodiment of separation of concerns (e.g., presentation layer, business logic layer, data access layer, persistence layer).
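The layered design mentioned above can be sketched in a few lines. This is an illustrative toy, not a prescribed architecture; all class and method names here are hypothetical:

```python
class DataAccessLayer:
    """Data access concern: knows only how to fetch and store records."""

    def __init__(self):
        self._db = {1: {"name": "Alice"}}  # stand-in for a real database

    def get_user(self, user_id):
        return self._db.get(user_id)


class BusinessLogicLayer:
    """Business logic concern: rules only, no storage or formatting details."""

    def __init__(self, dal):
        self._dal = dal

    def greeting_for(self, user_id):
        user = self._dal.get_user(user_id)
        if user is None:
            raise KeyError(user_id)
        return f"Hello, {user['name']}!"


class PresentationLayer:
    """Presentation concern: formatting only, delegates decisions downward."""

    def __init__(self, logic):
        self._logic = logic

    def render(self, user_id):
        return f"<p>{self._logic.greeting_for(user_id)}</p>"


# Each layer talks only to the one below through a well-defined interface:
app = PresentationLayer(BusinessLogicLayer(DataAccessLayer()))
print(app.render(1))  # <p>Hello, Alice!</p>
```

Because each concern is encapsulated behind an interface, the data access layer could be swapped for a real database without touching the presentation code.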

DRY principle – Don’t Repeat Yourself. Every piece of knowledge must have a single, unambiguous, authoritative representation within a system. The principle was formulated by Andy Hunt and Dave Thomas in their book The Pragmatic Programmer. They apply it quite broadly to include “database schemas, test plans, the build system, even documentation”. When the DRY principle is applied successfully, a modification of any single element of a system does not require a change in other logically unrelated elements. Additionally, elements that are logically related all change predictably and uniformly, and are thus kept in sync. This principle is aimed at reducing repetition of software patterns, replacing it with abstractions or using data normalization to avoid redundancy.
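A minimal illustration of the principle, using a hypothetical VAT rate shared by pricing and invoicing code: the rate has one authoritative representation, so changing it is a one-line edit that keeps both call sites in sync.

```python
# Single, authoritative representation of one piece of knowledge (hypothetical rate):
VAT_RATE = 0.20


def price_with_vat(net: float) -> float:
    """Gross price computed from the one shared rate."""
    return round(net * (1 + VAT_RATE), 2)


def invoice_line(net: float) -> str:
    # Reuses VAT_RATE instead of hard-coding 0.20 again, so pricing and
    # invoicing cannot drift out of sync when the rate changes.
    return f"net={net:.2f} vat={net * VAT_RATE:.2f} gross={price_with_vat(net):.2f}"


print(invoice_line(100.0))  # net=100.00 vat=20.00 gross=120.00
```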

YAGNI principle – You Aren’t Gonna Need It. Always implement things when you actually need them, never when you just foresee that you need them.

Golden Hammer – a.k.a. the law of the instrument, a cognitive bias that involves an over-reliance on a familiar tool. It is a form of narrow-minded instrumentalism.

That’s it for now. There are some more software development philosophies on Wikipedia.
