Developing Trustworthy Software Tools
The key components of trust formation when developers are adopting tools.
This is the latest issue of my newsletter. Each week I cover the latest research and perspectives on developer productivity.
This week I read Make Your Tools Sparkle with Trust: The PICSE Framework for Trust in Software Tools, a paper by Brittany Johnson and Microsoft researchers Christian Bird, Denae Ford, Nicole Forsgren, and Thomas Zimmermann. For teams looking to increase adoption of their internal tools, this paper describes the factors that influence whether developers trust tools and ultimately adopt them.
My summary of the paper
Engineers are often enthusiastic about new tools that are designed to help them with their daily tasks. However, leaders still struggle to connect these tools with the engineers they are meant to support, and as a result, many internal tools go unnoticed or unused.
One reason for this centers around trust: prior research has found that developers only adopt tools they trust. Therefore, it is possible that tools which are going unused may not be perceived as trustworthy.
The focus of this study was to identify the factors that influence engineers’ trust in software tools. To achieve this, the researchers conducted interviews with engineers both within and outside of Microsoft to gain insights into the day-to-day tasks of engineers, their perspectives on trust while collaborating with fellow engineers to develop and maintain software, and how they perceive trust in the context of utilizing tools.
For the purpose of this study, the researchers defined a software tool as any technology that supports software engineering. This includes both AI-assisted tools and traditional tools, which are discussed later in this paper.
PICSE: a framework for trust in software tools
The researchers organized the factors that engineers consider when building trust into five categories. These categories make up the PICSE framework for trust in software tools:
Personal: internal, external, and social factors that impact trust
Interaction: various aspects of engineers’ engagement with the tool that impact trust
Control: factors that impact trust as it pertains to engineers’ power and control over the tool and its usage
System: properties of the tool that impact trust
Expectations: whether the tool meets the expectations engineers have built impacts trust
The rest of the paper describes each category in more detail.
(P) Personal factors represent the intrinsic, extrinsic, and social aspects of tool adoption and use that impact trust. This includes community, source reputation, and clear advantages.
The community of users (or lack thereof) behind a tool can impact trust. Some developers prefer to use tools that their peers use. In general, having a community of users publicly available provides current and potential users with a way to easily ascertain use cases, success stories, failures, and other relevant information regarding the tool.
The reputation of the individual, organization, or platform that introduced an engineer to the tool can impact trust. The most prominent aspect of this factor among engineers in the study was the individual from whom they learned about the tool.
The ability for engineers to see the clear advantages (benefits) that come with using a tool has an impact on trust. Engineers often get this information by reviewing testimonials or stories from other users.
(I) Interaction factors pertain to considerations engineers make regarding the kind of support and outcomes they expect from their interactions with the tool. This category includes factors such as contribution validation support, feedback loops, and educational value.
The contribution validation support factor refers to mechanisms within a tool that confirm aspects of a contribution such as its correctness, fit, or quality. Engineers have increased trust in tools that support quick and easy validation of the tool’s contributions or recommendations.
The feedback loops factor refers to how well the tool takes users’ preferences and needs into consideration. Engineers have increased trust for tools that show a level of care towards the user.
Engineers find value in tools that add educational value. For example, if the tool makes or recommends contributions that the engineer would not have thought of themselves, or that improve on their solution, this builds trust.
(C) Control factors are considerations tool users make regarding their ability to make the tool experience what they want and need it to be. The factors in this category include ownership, autonomy, and workflow integration.
Engineers may have increased trust when they have some ownership over the tool being used.
The extent to which engineers have autonomy over the integration of contributions impacts trust. For example, engineers may feel comfortable with using a tool that automatically contributes code to a code base if they can review the code and see what’s happening.
Workflow integration refers to how well a tool fits into the user’s existing workflow.
(S) System factors are considerations tool users may make regarding properties that a tool does, or does not, possess when determining trust. This includes ease of installation and use, polished presentation, safe and secure practices, correctness, consistency, and performance.
It helps if a tool is easy to install and use. This includes having easily accessible and useful documentation for getting started with a tool.
Engineers value a polished presentation. When tools look like the developers paid attention to detail, this leaves a good first impression. If they see visual inconsistencies or things that are broken, they may think the tool is poorly made and therefore less trustworthy.
Most engineers feel strongly about whether a tool follows safe and secure practices. Engineers appreciate clear privacy policies.
Correctness refers to the accuracy of the contributions made by the tool, while consistency refers to the reliability of the tool’s functionality and outcomes over time.
Trust is higher for tools that are performant.
(E) Expectation factors represent tool users’ considerations regarding expectations they have built from their own experiences and would like tools to consider. The factors in this category include transparent data practices, goal matching, and meeting expectations.
The transparent data practices factor refers to when there is visibility into where data is coming from within a tool and how the data users contribute will be used.
Goal matching conveys the importance of making contributions that map to the goals of the engineer using the tool at the time they are using it.
The meeting expectations factor is straightforward: to build trust, tools should be explicit and upfront about what they can and cannot do.
Applying the PICSE framework in practice
This framework provides insight into the considerations that engineers make when determining if, and to what extent, they trust and adopt a tool. Some of these factors are things that tool owners can influence to drive adoption. For example, a tool owner may focus on building or highlighting the community around a tool. They may decide to improve ease of installation by reducing the number of steps involved in setting up the tool, or they may focus on the tool’s polish by improving aesthetics, flow, or usability.
This framework can apply to any kind of tool. However, the findings suggest that factors such as safe and secure practices and contribution validation support become more important with AI-assisted tools than they are with traditional tools.
While this paper is especially relevant for AI-based tools, it provides a new lens for understanding adoption challenges with any kind of tool. Tool owners may consider using the PICSE framework to evaluate and diagnose areas where they could make improvements in order to build trust and drive adoption.
That’s it for this week! If you’re finding this newsletter valuable, share it with a friend:
Thanks for reading Engineering Enablement! Subscribe for free to receive new posts and support my work.