How to measure developer productivity — and how not to
I cringe when development teams tell me that their organization makes them capture hours in Jira Software, Azure DevOps, or whatever agile management tool they are using. Capturing hours is understandable, and often required, for professional services organizations that bill clients on time-and-materials contracts. But for most organizations, using accounting metrics to quantify development productivity is an antipattern carried over from command-and-control management practices.
There is a long list of other production metrics that are also problematic. Calculating productivity by lines of code completed? That’s a good way to encourage development teams to bury the organization in technical debt from bloated codebases. Measuring the number of user stories or story points delivered? That’s just asking developers to break features into more stories or inflate their story point estimates.
Determining the number of releases completed is a step in a better direction, but that metric also requires clarification. Production releases that introduce defects, cause major incidents, or deliver poor end-user experiences must be tagged differently than successful deployments.
Clearly, measuring software development productivity is fraught with challenges. One of the experts I spoke with about these challenges, Sagar Bhujbal, VP of technology at Macmillan Learning, warned that choosing the wrong metrics can hurt team morale:
Developer productivity should not be measured by the number of errors, delayed delivery, or incidents. It causes unneeded angst with development teams that are always under pressure to deliver more capabilities faster and better. Instead, provide psychological safety and continuously align the operating framework and improve engineering practices.
In this article, we’ll examine how productivity measures can be applied to software developers and development teams to improve (rather than harm) their productivity and morale, and to improve business outcomes.
Don’t use development productivity metrics to evaluate performance
I’m a strong proponent of capturing development and devops KPIs and metrics to drive improvements in process, quality, and, yes, productivity. The problem arises when organizations equate productivity measures with team or individual performance objectives. Equating productivity with performance without proper context often leads to undesirable behaviors, such as:
- Developing more capabilities that drive little business impact.
- Releasing more software without factoring risk, security, quality, and operational impacts.
- Innovating without a realized path to production for successful experiments.
Therefore, whenever I am speaking to human resources or discussing the overall performance of a software development group, I strive to cover a broader set of measures beyond productivity.
Use development productivity measures to help the business
So where can productivity measures be applied to software development teams to improve business outcomes? Specifically, productivity improvements should help businesses grow revenue, improve end-user experience, increase quality, lower costs, enable innovation, deliver strategic capabilities, improve collaboration, drive efficiencies, simplify access to information, or reduce risks.
Software development (and IT in general) is often a contributor to these business outcomes, and outcome-based KPIs should set the context for any software development (and IT) metrics. The operational side of these KPIs should be metrics on how effectively the team has been delivering against stated values and standards.
There is a growing body of knowledge on these metrics. Here are some examples:
- Software development metrics that matter today include agile metrics like cycle time and team velocity and production metrics like mean time to repair.
- Some clever productivity metrics focus on behaviors such as code reviews completed, documentation written, and conversations had with others that capture productivity, quality, and collaboration.
- Comparing value-add activities such as software design, testing, and integrations versus non-value-add activities like configuring environments, squabbling over implementation issues, or writing new code to figure out how existing code works.
When considering some of these productivity measures, it’s worth evaluating the deltas against current baselines rather than the raw numbers. When teams improve velocity and cycle times, or reduce the defect escape rate and mean time to repair, the team’s productivity is likely improving.
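The delta check described above can be sketched in a few lines of Python. This is a minimal illustration, not a prescribed implementation; the `SprintMetrics` structure and the sample numbers are hypothetical stand-ins for whatever your agile or devops tooling exports.

```python
from dataclasses import dataclass

@dataclass
class SprintMetrics:
    cycle_time_days: float     # avg time from work started to done
    defect_escape_rate: float  # production defects / all defects found
    mttr_hours: float          # mean time to repair production issues

def productivity_deltas(previous: SprintMetrics, current: SprintMetrics) -> dict:
    """Compare two periods; for these metrics, a negative delta is an improvement."""
    return {
        "cycle_time_days": current.cycle_time_days - previous.cycle_time_days,
        "defect_escape_rate": current.defect_escape_rate - previous.defect_escape_rate,
        "mttr_hours": current.mttr_hours - previous.mttr_hours,
    }

# Hypothetical example: a team that shortened cycle time, cut escaped
# defects, and repaired production issues faster than last sprint.
last_sprint = SprintMetrics(cycle_time_days=6.5, defect_escape_rate=0.12, mttr_hours=10.0)
this_sprint = SprintMetrics(cycle_time_days=5.0, defect_escape_rate=0.08, mttr_hours=7.5)

deltas = productivity_deltas(last_sprint, this_sprint)
improving = all(delta <= 0 for delta in deltas.values())
```

The point of the sketch is the comparison, not the absolute values: trending the deltas period over period is what signals whether productivity is moving in the right direction.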
Use development productivity measures to improve productivity
One of the goals of measuring productivity should be optimizing investments that deliver productivity improvements. Investments that help software development teams focus more time and energy on solving prioritized business problems fall into this category. Some examples:
- Automating deployments with CI/CD instead of executing them manually.
- Standardizing environments with infrastructure-as-code (IaC) and containers to minimize manual configurations and reduce defects tied to system differences.
- Automating regression tests so that developers fix bugs during the development process and spend less time investigating and fixing defects found in production.
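As one small, hedged example of the last point, an automated regression suite can be as simple as a standard-library `unittest` module wired into CI so it runs on every commit. The `apply_discount` function and its expected values below are hypothetical; they stand in for whatever business logic your team needs to protect from regressions.

```python
import unittest

def apply_discount(price: float, percent: float) -> float:
    """Hypothetical business function under regression test."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class DiscountRegressionTests(unittest.TestCase):
    """Run on every commit (e.g., via `python -m unittest` in the CI pipeline)
    so pricing bugs are caught during development, not in production."""

    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.0, 20), 80.0)

    def test_no_discount(self):
        self.assertEqual(apply_discount(59.99, 0), 59.99)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(10.0, 150)
```

A suite like this shifts defect discovery left: a failing test blocks the pipeline minutes after the change, instead of surfacing as a production incident days later.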
Macmillan Learning’s Bhujbal notes the importance of team behaviors and tools that impact productivity. “Intangible measures like less friction and seamless collaboration with business departments, and optimizing tools will ultimately help achieve the desired and timely outcomes.”
Productivity metrics should focus on business outcomes
Martin Davis, a seasoned CIO with Southern Company Services, also stresses focusing on business value and outcomes:
The more IT can use business metrics to measure its productivity, the more easily senior management can understand the value. I would always try to measure our development productivity in terms of added functions or business problems solved and ideally try to get dollar figures for the added value. By doing this, you can turn the conversation from IT as a cost to IT as a value provider.
Ramki Ramaswamy, VP of IT, technology, and integrations at JetBlue, agrees with Davis that development productivity should be a factor of value delivered, and he adds risk removal as a second factor. “Development productivity is usually reported in hours or defects or dollars,” he said, “but it should ideally be measured in value (return, opportunity, operational savings) per dollar spent versus risk (security, opportunity) of doing nothing.”
The advice of Ramaswamy and Davis is consistent with what I feel is the most important agile KPI. Ultimately, teams should first measure business outcomes and then qualify them with metrics that illustrate targeted behaviors. Presenting KPIs that combine outcomes and delivery qualifications helps answer the who, what, why, and how of delivering high-quality software. These KPIs are far more meaningful than a collection of low-level metrics.
Once these KPIs are defined, technology and process improvements gain a lot more context and meaning for both software developers and the business. For example:
- A development team asked to improve customer satisfaction metrics as a business outcome may elect to focus on improving the mean time to repair issues and the defect escape rates.
- As development teams demonstrate improvements to these metrics, they should express where they invest in productivity improvements such as automation, developing documentation, or reducing technical debt.
- As development teams deliver improvements, they should measure customer satisfaction and report on their selected metrics.
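The pairing in this example can be expressed as a simple combined report: the business outcome (customer satisfaction) leads, and the team’s supporting metrics qualify it. This is a minimal sketch with hypothetical function names and sample numbers, not a standard reporting format.

```python
def defect_escape_rate(found_in_production: int, found_total: int) -> float:
    """Share of all defects that escaped to production (lower is better)."""
    if found_total == 0:
        return 0.0
    return found_in_production / found_total

def kpi_report(csat: float, escaped: int, total_defects: int, mttr_hours: float) -> str:
    """Pair the business outcome (CSAT) with the metrics the team chose to improve."""
    rate = defect_escape_rate(escaped, total_defects)
    return (f"CSAT {csat:.1f}/5 | "
            f"defect escape rate {rate:.0%} | "
            f"MTTR {mttr_hours:.1f}h")

# Hypothetical quarter: 3 of 25 defects escaped to production.
summary = kpi_report(csat=4.2, escaped=3, total_defects=25, mttr_hours=6.5)
print(summary)
```

Leading with the outcome keeps the conversation anchored on business value, while the qualifying metrics show how the team earned it.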
KPIs that combine business outcomes and developer productivity metrics help answer the question, “Is the team delivering on prioritized business outcomes while improving their productivity?” Given that teams can’t improve everything at once, organizations should look to champion teams that make prudent decisions that deliver outcomes while improving productivity.
Copyright © 2021 IDG Communications, Inc.