Do not replace programming complexity with infrastructure/logistics complexity
I find it very useful to view computing infrastructure as a mere commodity. It wasn’t that long ago that computing infrastructure was considered anything but a commodity. A mere 40–50 years ago, only big, wealthy corporations, governments, and militaries could afford sophisticated computing machinery.
Fast forward to today, and computing infrastructure has undergone intense commoditization. Cloud computing is one example, the smartphones we all carry around are another, IoT products are yet another, and so on.
Old habits die hard
Since the ubiquitous commoditization of computing infrastructure is a relatively new phenomenon, the idea that having access to sophisticated computing infrastructure spells privilege still persists in many people’s minds. That sentiment is most likely due to inertia; societal change is almost always slow on the uptake.
It is helpful, I think, to wean ourselves off the antiquated view that computing infrastructure gives us a competitive advantage. At this point, having sophisticated computing infrastructure at our disposal is no more of a competitive advantage to our business than having electricity at our disposal. Electrical wiring, once considered the privilege of advanced, developed nations, has now penetrated all corners of the earth. The same thing is happening with computing infrastructure: by now, in the year 2022, it is ubiquitous and pervasive.
How do we offload processing to the underlying infrastructure?
If we hold computing infrastructure in high regard and claim that it gives us a leg up over our competition, we cloud our judgment. We then tend to gravitate toward leveraging that infrastructure, which often results in pursuing a false economy.
What may at first seem like an easy, ‘no brainer’ solution (simply push as much of our processing logic as possible out to the computing infrastructure) is more likely than not going to come back to bite us.
Let’s look at one example: we are working on a repository that contains the source code of our product. As we make changes to that source code, we rely on a piece of the underlying computing infrastructure that specializes in version control.
Each time we commit some changes to our source code, the version control tool (most likely git) records that commit as a fact. The tool allows us to easily navigate various versions, to hop into a time machine, travel to the past, and examine prior events to our heart’s content.
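As a concrete illustration, that “time machine” can be sketched with a few git commands. Everything below runs against a throwaway repository; the file name and commit messages are invented for the demo:

```shell
set -e
repo=$(mktemp -d)            # throwaway repository, just for the demo
cd "$repo"
git init -q -b main          # -b requires git 2.28+
git config user.email "dev@example.com"
git config user.name "Dev"

# Record three commits as "facts" in the history.
for i in 1 2 3; do
  echo "revision $i" > notes.txt
  git add notes.txt
  git commit -q -m "revision $i"
done

git log --oneline            # every recorded commit, newest first

# Hop into the time machine: visit the very first commit.
first=$(git rev-list --max-parents=0 HEAD)
git checkout -q "$first"
cat notes.txt                # the file exactly as it was back then
```

After the checkout, `notes.txt` reads “revision 1” again; `git checkout main` (or `git switch -`) returns us to the present.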
Nothing wrong with that, of course. As a matter of fact, it is a very powerful, almost indispensable capability. Brought to us by the commodity we call “version control tool”.
Where does the trouble begin? Well, we may become enamored of that commodity, and we may then start looking into leveraging it further. For instance, we could figure out how to make changes to our source code without having to worry about clobbering the changes someone else may be making to the same source code repo.
A sophisticated piece of computing infrastructure, called version control, enables us to do that. How does it work? Simple: before making any changes to the source code, we create a branch. Creating a branch effectively gives us our own working copy of the source code (in git, a branch is technically just a lightweight pointer to a commit, but conceptually it behaves like a copy). That working copy is fully isolated from the rest of the source code repo.
Now we can safely proceed to mess around and make any changes we deem interesting, knowing full well that it will be impossible to step on anyone’s toes while we experiment with the source code. Why is it not possible to clobber other people’s work? Because we are making changes in full isolation.
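A minimal sketch of that isolation, again using a throwaway repository with invented file and branch names:

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q -b main          # -b requires git 2.28+
git config user.email "dev@example.com"
git config user.name "Dev"
echo "version 1" > app.txt
git add app.txt
git commit -q -m "initial commit"

# Branch off and change the file in isolation.
git switch -q -c experiment
echo "wild experimental change" > app.txt
git commit -q -am "try something risky"

# Back on main, nothing has been clobbered.
git switch -q main
cat app.txt                  # still reads "version 1"
```

The `experiment` branch carries the risky change; `main` is untouched until someone merges.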
All fine and dandy. Where is the problem?
Well, the safety net that branching seems to offer us is deceptive. We may think we’re now safe from stepping on anyone else’s toes, but, as Thomas Hobbes famously noted, “hell is truth seen too late.”
Why is it advantageous to demote computing infrastructure to a mere commodity?
From the above we’ve seen how seductive it can be to fall back on the seemingly comforting features of the computing infrastructure. But it is often a myopic decision; in doing so, we don’t stop to think things through.
So, what is the “truth seen too late” in the above case?
We often overlook the complexities that result from working in isolation. Yes, at first it seems safe and sound to lock ourselves into a cave and make changes to our heart’s content without being concerned about our teammates/coworkers. But sooner or later, those changes made in the safety of our local, isolated branch will have to be brought back into the fold.
One thing we also seem to overlook when relying on the comfort of branching is that each time we create a branch, we are creating a new version of our product. How many versions are there? I sometimes see teams that maintain dozens, even hundreds of versions of their product! That level of complexity is mind-boggling. How do we know which version is the true version? What is the meaning of maintaining more than one version? Do we even know what we, as a team, are doing?
Those are all tough questions to answer. Ask teams who are juggling many branches why they do it, and you will see how quickly they assume a defensive stance.
In addition to that complication (i.e., maintaining more than one version of the product), a potentially much bigger complication arises when we attempt to merge one of those isolated branches back into our source code repository. What is often bound to happen is what is known as a merge conflict.
Those merge conflicts can be small, but often they are sizeable, and they can stop the entire team dead in its tracks. Sometimes it takes many hours, even days, to resolve them.
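The failure mode is easy to reproduce. In this sketch (throwaway repo, invented file and branch names), two lines of work change the same line, and the merge stops with conflict markers left in the file:

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q -b main          # -b requires git 2.28+
git config user.email "dev@example.com"
git config user.name "Dev"
echo "greeting = hello" > config.txt
git add config.txt
git commit -q -m "initial"

# One developer changes the line on an isolated branch...
git switch -q -c feature
echo "greeting = bonjour" > config.txt
git commit -q -am "french greeting"

# ...while another changes the same line directly on main.
git switch -q main
echo "greeting = hola" > config.txt
git commit -q -am "spanish greeting"

# The merge cannot be resolved automatically; work stops until a human decides.
git merge feature || echo "merge conflict, as expected"
grep '<<<<<<<' config.txt    # conflict markers now sit in the file
```

The longer the branch lived in isolation, the more of these collisions accumulate before anyone finds out.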
So, we see that we start from a comfortable position of working freely in isolation, only to end up creating conflicts. How is that desirable? Shouldn’t such a way of working be outlawed? Why do we think that our team members deserve to be pushed into a conflict, simply because we feel more comfortable working in full isolation?
It is time, therefore, to abandon our fascination with the computing infrastructure, demote it to a mere commodity that must recede into the walls (the way electrical wiring and indoor plumbing did), and focus on the proper way of working.
Why is that advantageous? Well, as we’ve seen from just one simple example, focusing on the computing infrastructure, such as git, tempts us to lean on it full bore and use it as a safety net. In the process, we end up creating conflicts, which is not something professional engineers, or any other professional staff, should ever do to their coworkers.
Instead of relying on the commodities offered by the computing infrastructure, we should strive to organize our workflow in such a fashion that it enables flow and does not create speed bumps and conflicts. We can do it, because we are engineers; we know how to design and implement systems. Let our programming logic dictate how changes are handled, instead of letting some stupid tool (git) make those decisions for us and paint us into a corner.