Friction in software design and interaction
Jan 4 2025
I’ve been reflecting on the topic of friction in software design and interfaces for a while now. I believe it started when I read this passage from the paper “Envisioning Information Access Systems: What Makes for Good Tools and a Healthy Web?” by Chirag Shah and Emily Bender:
Most systems are designed with the idea that they are supporting users who prefer to put in as little effort as possible when accessing information. This is certainly a desired characteristic of an IA system. However, we argue that it is not always advisable to minimize this effort, especially if it takes away the ability for a user to learn, for instance due to lack of transparency. We believe, and as others have supported, that certain amount of ‘friction’ in an IA process is a good thing to have.
The paper’s focus is on search and information access, but one could extrapolate its argument to any user-facing technology, and I’ve been especially concerned by its implications for software and user-interface design. The paper they cite by Jeremy Pickens is a good read as well. However, I don’t think either paper truly captures the scale of the problem, which precedes AI.
Friction is impressed upon developers and software designers as something to avoid, something to design away first. At least, that’s how I understood it (even though I hadn’t yet mapped the word friction onto this concept) when I taught myself programming and app development. Formally trained UI/UX designers, or those who’ve worked in the area for years, clearly recognize its utility and value. Still, I’d contend that the overwhelming default mode of dealing with friction in software design is to eliminate it: don’t make users read, think, or wait; build easy-to-use interfaces that delight them.
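To make the idea concrete, here is a minimal sketch of what deliberate friction can look like in code: requiring the user to re-type a resource’s name before a destructive action proceeds, the pattern GitHub uses for repository deletion. The function and names are hypothetical, not from any of the papers discussed.

```python
def confirm_destructive_action(resource_name: str, typed_input: str) -> bool:
    """Return True only when the user re-typed the exact resource name.

    Forcing an exact re-type slows the user down on purpose, turning a
    reflexive click into a moment of deliberate attention.
    """
    return typed_input.strip() == resource_name


# One click is no longer enough; a hasty or mismatched answer is rejected.
print(confirm_destructive_action("prod-db", "prod-db"))  # True
print(confirm_destructive_action("prod-db", "prodb"))    # False
```

The point isn’t the three lines of logic; it’s that the slower path is a design choice, made against the grain of the frictionless default.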
The Verge’s May 2024 interview with Google CEO Sundar Pichai exemplifies this thinking. Pichai acknowledges at several points the trade-offs inherent in providing users with instant LLM-generated answers, but that hasn’t stopped Google from stuffing AI answers and summaries everywhere. Choosing not to include AI summaries, or even introducing a little friction into the process, would promote a healthy web ecosystem and bring other social and individual benefits in the long term (as Shah and Bender argue in their paper). But someone else would simply do it and threaten Google’s dominant position in tech. The Big in Big Tech isn’t just a modifier; it’s fundamental to what these companies are and how they operate.
Perhaps the default way of conceptualizing friction in software design served users and society well until the early 2010s. But the smartphone, ubiquitous internet, social media, and now AI have changed how I think about it. Sometimes you have to frustrate individual users for the well-being of the community. Unfortunately, the frictionless path is heavily incentivized, since there’s a lot of potential money (and power) at the end of it.