There is no actual speed limit in software development, at least not a theoretical one. In fact, we are forever pushing the envelope with new thinking and techniques, with progressive teams typically driving right at the edge between control and chaos, frequently flirting with the latter. Agile methods, lean thinking, open source and, most recently, DevOps practices have ushered in significant gains in productivity, raising the practical limits of development speed enormously over the last 15 years.
There has always been an insatiable desire to go faster and faster in software development. Speed can be a significant competitive differentiator, the difference between becoming the market leader and becoming an "also-ran". To feed this appetite for speed, we've had to develop new approaches and abstractions, often burying vast amounts of complexity deeper and deeper, with many hidden risks.
Sometimes Speed Kills
However, sometimes speed kills. And, when there is a “fatality”, there is a natural tendency to introduce speed limits in the form of process, which in the case of some open source related snafu might be something like: Forming a governance body that will manually review and approve all open source software before it can be used.
If you are not a developer, that's a somewhat intuitive response. Obviously, the logic goes, we are going too fast. Let's slow things down a bit. Let's review things very carefully. Then we won't be as likely to drive off a cliff.
However, imposing anything that limits the pace at which development operates runs directly counter to the competitive differentiation speed creates. This kind of response immediately puts you at a competitive disadvantage and can also add significant incremental costs stemming from development inefficiencies (e.g., waiting for approval) and manual, human intervention. Not only do things slow down, they cost more too.
If that's not bad enough, such controls also do not scale well and are generally ineffective. Developers, striving for the speed needed to deliver on time and the efficiency to work within budget, readily thwart such perceived obstacles. And they are really good at routing around "process" too.
We Have No Choice… or Do We?
But, they cry: it's for the children, for public safety, to CYA. We have no choice… Except that often there is one, and the value of these alternative solutions, in contrast with the debilitating impacts of an intuitive but nonetheless incorrect approach, is extraordinary.
Development teams must be able to move fast and do so safely. That requires tooling and approaches that do not impose arbitrary speed limits, that instead maximize automation and provide clear signal while eliminating spurious noise.
Correct solutions act like guardrails that go largely unnoticed and unappreciated until, suddenly, they are very highly appreciated. Instead of a fatality, you simply scrub off a little speed before getting back on the throttle. With the right choice, there are no speed limits, no traffic lights, no toll booths; just open road.
Development at Top Speed
Every team developing applications today has a software supply chain, whether they use that moniker or not. And those supply chains are extraordinarily complex. Nearly all software is a vast assembly of open source parts glued together with a much smaller body of custom code, which itself might also be open source. When it comes to optimizing and controlling that supply chain end to end, seek solutions that were built with development speed top of mind. Speed has been, and continues to be, a primary design criterion behind both Nexus and CLM. They are a team's license to speed.
Mike Hansen