Why UIs Get Worse Over Time (and how LLMs give us a way out)
GUIs get worse over time. Companies may hire better designers or PMs to avoid this, but their GUIs will inevitably become harder to use anyway. Competition pushes companies down a path that all but guarantees this outcome.
In this post, we'll see exactly why this happens. We'll also briefly look at how LLMs will help software companies and their users escape this unfortunate dynamic. The way out isn't to replace GUIs with chat interfaces. That won't happen. Instead, we'll see how LLMs give us a complement to point-and-click navigation that will help us create better GUIs long-term.
Why GUIs (Inevitably) Go Bad
Remember Twitter's old UI? It looked like this:
Compare that with their current UI:
Which UI is better? It depends! But, all other things being equal, a GUI gets worse with every additional affordance we're presented with that we won't use. If we just want to see Jack's tweets, the current UI is worse because it's cluttered with things we don't care about.
This is the design insight that drives folks to implement "distraction free" GUIs in their products. I'm using one now as I write this in Hubspot. Twitter Blue's "Reader Mode" is another example. Zen Mode in VSCode is one more.
It's also the insight underlying progressive disclosure, the idea that advanced functionality should be initially hidden from users. The "More" button on the left-justified Twitter navigation menu above is a good example of this. We often won't use whatever is buried in that menu, so better to hide the options away than to distract us with them.
If companies understand this design principle, why can't they just give us GUIs that contain only the affordances we need? The answer, in a word, is "capitalism," but let's unpack this, starting with a recent complaint about Postman:
"Lol did they take VC money?" — Vance Lucas (@vlucas), July 18, 2023
VCs are blamed here, but they merely accelerate the competitive pressures that push companies to make their GUIs worse. Even without VCs, over time, all software becomes a commodity. At some point, we'll be as excited about ChatGPT-like software as we are about toasters.
Companies respond to this in two ways. They:
- Look for moats outside of the software they've built: economies of scale, network effects, brand, stickiness/high switching costs, etc.1
- Keep innovating / building more features
These two responses together are poison for good GUIs. In theory, a company could build new things without cramming them all into a single GUI, but if it segments its innovations into separate GUIs, it loses the crucial advantages of economies of scale, network effects, and stickiness. Stuffing everything into a single GUI, by contrast, lets the company sell existing users on new features and amortize customer acquisition costs across multiple revenue-supporting products. Of course, this feature stuffing ruins any hope of a GUI containing only the features we will use.
Progressive disclosure can help mitigate the problem of crowded GUIs, but it's very hard to get right. In the best case, it lets users access a GUI that contains only the affordances they need for a particular task, but this access is only possible if users have internalized the company's particular way of categorizing features. They have to think about their work using the same terms the company uses on the navigation items that lead them to the desired screen. Even with the best designers, there will often be a mismatch between the terms users have for describing their work and the terms used in the product.
I was just burned by this while using Hubspot the other day. I needed to edit my calendar availability for our customer interviews. (Grab a time here if you want to chat 😉). These interviews aren't sales-like at all. They're just conversations, which is why I selected "Conversations" in the top-level nav. Turns out the functionality lived under "Sales." Again, this is hard to get right. No wonder we're all so desperate for ChatGPT to replace GUIs.
But let's suppose the design team is superhuman. Suppose they nail the terms that 80%2 of folks use to describe their work and build navigational elements that let those users accomplish their tasks with minimal confusion and clicking. Obviously, the 20% of users who think differently are worse off than they would be if the product were simpler and had a simpler GUI. But as the product grows, even the users in the 80% will still need to do a fair amount of reading and clicking to reach their destination, and that's still not as nice as a GUI with only the affordances they need to accomplish their tasks.
LLMs: A New Hope
Although LLM-powered chat interfaces won't replace GUIs for the bulk of the work we do, LLMs do offer a new way of interacting with GUIs that side-steps the problems of over-stuffed GUIs and the inevitably imperfect progressive disclosures we build. With LLMs, users can describe their tasks in their own language, and we can route them to the correct place in the application.
Here's what this could look like running on top of Amplitude:
The example focuses on a data-querying use case, but we can do this for any task in any piece of software, whether we're editing calendar availability in Hubspot or trying to find the hidden "deactivate X/Twitter account" button.
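To make the routing idea concrete, here's a minimal sketch of how an app might map a user's free-text task description onto a navigation destination. Everything here is hypothetical: the route names are invented, and `fake_llm` is a keyword-matching stand-in for a real chat-completion API call, used only so the sketch runs end to end.

```python
# Hypothetical map from route keys to navigation paths in some product.
# These names are illustrative, not any real product's structure.
ROUTES = {
    "edit-availability": "Sales > Meetings > Edit availability",
    "view-conversations": "Conversations > Inbox",
    "export-contacts": "Contacts > Export",
}


def build_prompt(task: str) -> str:
    """Ask the model to map a free-text task onto exactly one route key."""
    options = "\n".join(f"- {key}: {path}" for key, path in ROUTES.items())
    return (
        "Pick the single best route key for the user's task.\n"
        f"Routes:\n{options}\n"
        f"Task: {task}\n"
        "Answer with the key only."
    )


def route(task: str, llm) -> str:
    """Return the navigation path for a task, using `llm` to interpret it.

    `llm` is any callable (prompt -> text); in production it would wrap a
    real chat-completion API call.
    """
    key = llm(build_prompt(task)).strip()
    # Unknown keys fall back to something safe, like a search page.
    return ROUTES.get(key, "fallback: show search results")


# Stand-in for a real model so the sketch is runnable without an API key.
def fake_llm(prompt: str) -> str:
    if "calendar" in prompt:
        return "edit-availability"
    return "view-conversations"


print(route("change my calendar availability for interviews", fake_llm))
```

The point of the sketch is the shape of the interaction, not the stub: the user never has to learn that "availability" lives under "Sales"; they describe the task in their own words and the model does the categorization the navigation menu otherwise forces onto them.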
These experiences are the future of how we make GUIs usable in spite of the competitive pressures software companies face. They will replace the how-to docs, explanatory videos, and canned walkthroughs we bolt onto our apps to bolster usability. Build these experiences yourself or sign up to be an ATLAS design partner, but don't be naive: the law of bad GUIs says that as you stave off competition by adding more to your product while holding on to the essential advantages of economies of scale, network effects, and stickiness, your GUI will eventually degrade.
1. 7 Powers is an excellent look at all of the ways in which companies try to stay competitive over time.
2. How often does the 80/20 rule actually apply to this aspect of product design? I suspect it's less often than we'd care to admit.