This is the latest issue of Engineering Enablement, a weekly newsletter covering the data behind world-class engineering organizations. To get articles like this in your inbox every week, subscribe.
This week, Laura, DX’s CTO, is diving into the newly released 2024 State of DevOps report from DORA. For those unfamiliar, DORA is a long-running research program focused on helping engineering teams improve software delivery. Each year, they release a report analyzing the capabilities that drive software delivery and organizational performance. This year’s report covers the impact of AI tools, interesting trends in throughput and quality, and how platform engineering and transformational leadership can influence performance.
Here’s Laura.
I look forward to the DORA report every year. When I first get it, I usually scroll through to the charts and graphs to see if there’s anything really surprising. Then I read it cover to cover. It's long and thorough, so if you're just interested in the highlights, here's my list:
AI helps individual productivity but hurts software delivery performance
Measures of software delivery throughput and quality are continuing to move more independently
Overall software delivery performance seems a bit weaker when compared to last year
Invest in systems and processes that help developers execute independently (documentation, self-serve platforms, etc.)
Developer platforms might slow down delivery overall, but they do boost individual and team performance
AI introduces risk, but not because of garbage code
In 2024’s report, for the second year in a row, DORA research shows that using AI tooling actually worsens software delivery performance. This is the area of research that I was most curious about, because 2023’s report also shared some findings that went against the grain of what was being reported elsewhere in the industry.
But the reason isn’t necessarily what you might expect: “AI code is garbage, of course it breaks.” While it is true that many respondents (39.2%) do not trust code generated by AI, distrust isn’t what explains the correlation between AI tooling and worsened software delivery performance. Instead, batch size seems to increase when AI is used in the coding process, and bigger changesets are riskier, something DORA’s research has long supported. It’s simply easier to write more code with AI.
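If you want to check whether this batch-size effect shows up on your own teams, one rough proxy is lines changed per commit (or per pull request). Here’s a minimal Python sketch that shells out to git; the revision range and the commit-level granularity are my own assumptions, not something the report prescribes.

```python
# Rough batch-size proxy: mean lines changed per commit, via git --shortstat.
# Run inside a git repository; merge commits count but contribute no lines.
import re
import subprocess

def average_batch_size(rev_range: str = "HEAD~100..HEAD") -> float:
    """Mean lines changed (insertions + deletions) per commit in the range."""
    log = subprocess.run(
        ["git", "log", "--shortstat", "--pretty=format:@commit", rev_range],
        capture_output=True, text=True, check=True,
    ).stdout
    commits, lines_changed = 0, 0
    for line in log.splitlines():
        if line == "@commit":
            commits += 1
        else:
            # Shortstat lines look like: "3 files changed, 42 insertions(+), 7 deletions(-)"
            for count, _kind in re.findall(r"(\d+) (insertion|deletion)", line):
                lines_changed += int(count)
    return lines_changed / commits if commits else 0.0

print(f"avg lines changed per commit: {average_batch_size():.1f}")
```

Trending a number like this before and after rolling out AI tooling is one lightweight way to see whether your changesets are quietly growing.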
Tradeoffs and predictions
What I found most interesting about the AI buzz is that it’s contributing to operational process stability. Adopting AI is a clear priority, even on teams who are used to operating in a world where everything is urgent and priorities shift constantly. So the “drop everything to work on AI” mandate has at least given some companies more operational stability.
But AI is a story of tradeoffs. One perhaps counterintuitive finding is that adoption of AI tooling actually results in less time spent on valuable, meaningful work. However, the amount of toilsome work (meetings, admin overhead, busywork) remains mostly unaffected. This makes sense, though: the most common use case for AI is assisting with coding tasks. I’ve never met a developer who wanted to spend less time coding. We get time back because our meaningful work gets completed faster—not because we can get the robots to do the unsavory parts of our jobs.
Documentation is likely the biggest growth opportunity for AI when it comes to potential impact. DORA’s research has long shown the correlation between documentation and performance, and adding AI to this problem space can accelerate positive results. It’s not totally clear if using AI helps us generate better documentation, or if AI just makes it easier to work with bad documentation. But DORA estimates that if AI adoption increases by just 25%, we should expect a 7.5% increase in documentation quality—how reliable, up-to-date, findable, and accurate it is—the highest of all the factors in their prediction.
Updated performance clusters for software delivery performance
On the software delivery side of things, something interesting is happening with Change Failure Rate, one of the four key metrics to measure software delivery performance (the others being deployment frequency, lead time to change, and time to recover from a failed deployment).
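To make those definitions concrete, here’s a minimal Python sketch of the four metrics computed over a window of deployment records. The Deployment shape, the medians, and the 30-day window are my own illustrative assumptions; DORA collects these measures through survey responses, not telemetry like this.

```python
# Illustrative-only sketch of the four key metrics over deployment records.
from dataclasses import dataclass
from datetime import datetime
from statistics import median

@dataclass
class Deployment:
    deployed_at: datetime
    commit_time: datetime                # when the deployed change was committed
    failed: bool = False
    restored_at: datetime | None = None  # when service came back, if it failed

def four_key_metrics(deploys: list[Deployment], window_days: int = 30) -> dict:
    failures = [d for d in deploys if d.failed]
    restored = [d for d in failures if d.restored_at]
    return {
        # Throughput
        "deployment_frequency_per_day": len(deploys) / window_days,
        "lead_time_for_changes": median(d.deployed_at - d.commit_time for d in deploys),
        # Stability
        "change_failure_rate": len(failures) / len(deploys),
        "failed_deployment_recovery_time": (
            median(d.restored_at - d.deployed_at for d in restored)
            if restored else None
        ),
    }
```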
For a few years, there has been some evidence to show that quality and throughput are moving more independently. And this year, the medium performance cluster actually has a lower change failure rate than the high performance cluster, which is unusual. In previous years, all four key metrics tended to move together.
This year, the DORA team had to make an important choice when ranking the clusters: does a group that deploys more frequently but fails more often outrank a group that fails less often but deploys less frequently? Both of those groups report recovering from failures in less than one day, another interesting feature of 2024’s clusters. In the end, deploying more frequently, albeit with more failures, was designated high performance, while the slow and steady approach fell into the medium performance category.
The change in distribution across performance clusters from 2023 is also something I’m curious about. The elite cluster remained mostly the same, while the high performance cluster shrank significantly, from 31% of respondents in 2023 to just 22% in 2024. Meanwhile, the low cluster represents 25% of respondents this year, up from 17% last year. And it’s not a case of raising standards: the low cluster is actually performing worse in both deployment frequency and change lead time compared to last year, but has improved in change failure rate and failed deployment recovery time.
Overall, delivery seems a bit worse for 2024. The last 18 months have been fairly tumultuous in many companies, so I’m not surprised to see the impact of our macroeconomic situation show up this way.
Platform engineering: productivity booster, centralized controls, or both?
Aside from AI and software delivery, this year’s report went deeper into topics around platform engineering, developer experience, and transformational leadership.
Platform engineering, and specifically the adoption of internal platforms, is correlated with a boost in both individual productivity and team performance. But platforms can slow down throughput and cause additional instability. Still, organizations that use platforms are shown to deliver software to users faster overall, and have higher operational performance.
There are probably a few reasons for this, one of which was mentioned on the DORA community thread by James Brookbank: “We very rarely see companies doing platform engineering primarily for developer productivity reasons.” So can we increase security and governance while also improving developer productivity? Generally, yes. Even with tradeoffs, orgs that adopt a platform engineering model are still better off.
To perform well, know your users
Critical to the success of an internal platform is user-centricity, or seeking to understand what your users are going to do with the software you build. In this case, internal developers are the users. It’s important to think not just about what tasks they are trying to complete, but what their goals are. Sometimes this is called a “platform as product” mindset. Developer independence is another key factor. Can developers get their work done without having to wait for an enabling team?
User-centricity was again a main feature of the discussion on developer experience. This year’s report also featured interviews as a qualitative research method, used to enrich and triangulate the quantitative data collected in the DORA survey (yes, all this data is self-reported, even the software delivery metrics). In the interviews, some respondents share more about why user-centricity impacts their work, and how they derive value from what they do all day.
Finally, transformational leadership is called out as a key factor for high performance. In short, leaders should have a clear vision and support their team members. Depending on your own personal experience, it may or may not be surprising just how much these basic traits impact performance: less burnout, higher job satisfaction, and better team, product, and organizational performance.
How should you use this data?
The whole point of DORA’s research is to help you get better at getting better. The data presented here is not meant to be a maturity model, or an unchanging target to measure against once and then chase forever.
DORA performance clusters are not static: the definitions of elite, high, medium, and low performers change each year based on respondent data. DORA looks at the data to define the clusters; it does not define performance thresholds up front and then count how many respondents fall into each category. This is why the definitions change each year – and another reminder that using these DORA benchmarks should be a continuous process, not an assessment that is done once.
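To make that distinction concrete, here’s a toy Python sketch of letting the data define the clusters. The k-means choice and the fake lognormal survey data are purely my own assumptions for illustration; DORA’s report describes its actual clustering methodology.

```python
# Toy illustration: clusters come out of the data, not from fixed thresholds.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Fake respondents: deploys/day, lead time (hrs), change failure rate, recovery (hrs)
respondents = rng.lognormal(mean=0.0, sigma=1.0, size=(500, 4))

# Standardize each metric so no single scale dominates the distance measure.
z = (respondents - respondents.mean(axis=0)) / respondents.std(axis=0)

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(z)

# The cluster centers (and so the elite/high/medium/low labels) are a
# property of this year's respondents; rerun on new data and they move.
for cluster in range(4):
    print(cluster, np.round(respondents[labels == cluster].mean(axis=0), 2))
```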
One thing I always keep in mind when reading a benchmarking report like DORA’s is that high performance is a horizon to chase, a never-ending story, not something that is ever finished. While some folks may find the lack of a finish line demotivating, I enjoy the journey of getting better. And at its core, DORA is an organization focused on giving you more data and information so you can get better at getting better.
The four key DORA metrics can be useful for keeping track of progress. I wrote a guide on how to think about DORA metrics and other developer productivity frameworks here.
Who’s hiring right now
Here’s a roundup of Developer Productivity job openings. Find more open roles here.
Capital One is hiring multiple Product Manager roles - DevEx | US
Picnic is hiring a Technical Product Owner - Platform, and DevEx Coach | Amsterdam
Shopware is hiring a Technical Delivery Manager - Cloud Infra | Berlin
SiriusXM is hiring a Staff Software Engineer - Platform Observability | US
Adobe is hiring a Sr Engineering Manager - DevEx | San Jose
That’s it for this week. Thanks for reading.
-Abi