Global Column | Time Tracking Is Ruining Developers and Software Development

As you learned in high school history class, the Industrial Revolution gave birth to machine-centric manufacturing processes that greatly increased production, lowered manufacturing costs, and raised the standard of living for everyone. It also changed the social structure, greatly expanding the middle class and giving rise to the idea that technology can improve our lives.

One of the changes the Industrial Revolution brought about was the way we work. Economies of scale led to ever-larger factories. As industrialization progressed, we realized that productivity could be easily measured. We could watch the worker tightening the cap on the toothpaste tube and count the number of caps successfully attached. Efficiency experts like Frank Gilbreth, famous for getting 'more for less,' found ways to make factories run more efficiently.


Some argue that the day the ENIAC computer was delivered to the U.S. military marked the end of the Industrial Revolution and the beginning of the digital age. Since that day, the software industry has struggled to manage software development effectively.

The trouble stems from 'fighting the last war.' It is a well-known problem in the military: when preparing for the next war, generals spend too much time studying the last one and end up relying on outdated technologies and strategies that may have worked in the past but no longer do. Consider the development of highly accurate rifles versus the persistence of massed assault tactics. The same thing has happened in the software development industry.

This mindset shows up in several ways. We've all seen the images: factory whistles blowing, workers punching the clock at the factory gate, and managers walking the floor to make sure the caps are going onto the shaving cream cans efficiently. This made some sense. Workers were evaluated by their production, and production was easily measurable. Hours worked on the assembly line over a standard shift were a good indicator of success. More time on the assembly line meant more production.

Software made by butts in seats

Unfortunately, the software development business translated this into time spent with our butts in chairs. The most recent manifestation of this phenomenon is companies forcing employees to return to the office. Like workers on a factory floor, software developers sit at their desks, tapping away at their keyboards, producing... what, exactly? Code? Features?

The success of a screwdriver factory is measured by the number of screwdrivers it can produce for a given amount of input. It seemed reasonable to measure what software developers produce in the same way. We all know how disastrous counting lines of code written can be. But what exactly do software developers produce? We struggle to measure individual developer productivity precisely because we can't really answer this question.

Managers seemed to think that something measurable was happening in those cubicles, and that the more time developers spent there, the more of whatever was being produced they would produce.

The terrible pandemic had at least one good effect: it was a wake-up call for these old-fashioned managers. Sometimes the best thing a software developer can do is stare at code for three hours and type for 15 minutes. Or a developer might rip out 1,300 lines of spaghetti code and replace them with an elegant 200-line solution. Or a developer spending a week building something the 'right way' may be time well spent, because it saves maintenance time down the road.

Perhaps the worst legacy of industrialization is the notion of time tracking. Managers feel a strong urge to measure something, and 'time spent on task' is a shiny lure that is hard to resist. So development teams are often required to track how long it takes to fix a bug or complete an individual task. Those times are then compared across developers, supposedly providing a way to measure productivity and judge success. Is a shorter average bug-fix time a good thing? Or a bad thing?

The worst metric

Time tracking for software developers is, to put it mildly, terrible. No two bugs are alike: an inexperienced developer may fix an easy bug quickly, while an experienced developer, who typically gets the harder problems, may take longer. Or a junior developer may be assigned a bug that is beyond them. Worse, time tracking encourages developers to game the system. Developers who worry about how long a task will take will avoid tasks that might run longer than 'expected,' along with all sorts of other supposedly 'unproductive' activities.

Shouldn't we just acknowledge that there is no way to determine how long a particular unit of software work will take, or should take? Having to account for every moment of the day only creates perverse incentives to economize on time. It can also make smart, capable developers feel insecure, worrying that they are 'taking too long' on a difficult problem that 'just needs a one-line change.'

We know how long it takes to put a label on a jar of mayonnaise. But no one can accurately predict how long it will take to fix a minor bug or implement a new feature. Forcing people to track time on tasks of inherently uncertain duration has many drawbacks and no benefits.

Digital products are not physical products. Software is a fundamentally different kind of product from everything that came before it. It is simply wrong to think we can build, manage, and run software with the same techniques we learned during the Industrial Revolution. Fortunately, more and more organizations are no longer preparing for the 'last war' and are realizing that what works on the assembly line of a car factory does not work on the software development floor.
editor@itworld.co.kr

Source: www.itworld.co.kr