MARK LINGONQVIST
Senior Human Factors Specialist | Autonomous Systems & Semiotics
Alum of the University of Oxford and the University of Aberdeen; ex-AKQA, ex-NHS
Based in Sweden & the UK
Responses by referral only.
Why I Stopped Designing for People and Started Designing for Machines
At some point, design - as a profession - started lying to me.
It didn’t happen all at once. The deceit came quietly - dressed in clean typography, polite language, and rounded corners. Every dialog box promised to “help,” every button said it was here to “guide.” But the more I looked at these interfaces, the more I saw the same pattern: design had stopped being honest. It had learned to smile while it manipulated.
Consumer design doesn’t trust people anymore. It assumes we’re idiots. It hides the truth behind gestures, microcopy, and endless apologies. It persuades instead of explaining. The button that says Continue isn’t asking; it’s commanding. The system already knows where it wants you to end up - it just needs to make you feel okay about it.
There’s a kind of corporate civility in that, a white-collar way of avoiding confrontation. Nothing’s ever your fault, nothing’s ever direct, and everything’s wrapped in politeness. You don’t design actions anymore; you design “flows.” And if the flow breaks, you fix it by adding more politeness - a tooltip, a loading animation, a smiley in the empty state. It’s not communication; it’s customer service cosplay.
I grew up around blue-collar people — people who worked with things that could break, burn, or cut. When a problem showed up, they dealt with it head-on. You couldn’t bullshit a hydraulic line or sweet-talk a seized bearing. Confrontation was part of the work. There was no UX language for it, no "guidelines for empathy." You said what was wrong, you fixed it, and you got back to work.
That kind of honesty doesn’t exist in consumer design anymore. Everything’s designed to be “frictionless,” which is a polite way of saying “you’ll never notice what we’re doing to you.” We talk about trust, but what we really mean is compliance. And over time, the software starts to feel like management - always friendly, always “checking in,” always reminding you that it’s for your own good.
So I left.
I stopped designing for people in the consumer sense — the scrolling, swiping, micro-dosing dopamine sense. I started designing for machines, for operators, for environments where ambiguity gets people hurt. It’s not glamorous. You don’t get to A/B test a safety-critical system. You can’t run a dark pattern in a cockpit. But in those worlds - HMI, control systems, high-performance GUIs - honesty still matters.
You can’t hide intent behind animation when someone’s controlling 40 tonnes of autonomous machinery. You can’t “delight” someone out of a system fault. You can’t fake precision. In these systems, the design either works or it fails. There’s no gray area, no “we’re rolling out improvements gradually.”
And the thing is - it feels better. Cleaner. Designing for machines makes you responsible again. It reminds you that clarity isn’t decoration; it’s ethics. You stop designing to please and start designing to prevent catastrophe.
I used to think empathy was the highest goal of design. Now I think it’s truth. Empathy can be manufactured. Truth can’t.
Working with machines brought me back to that - the part of design that still feels like craft, that still demands you know what the hell you’re doing. It’s not about aesthetics or user journeys; it’s about signals, latency, feedback, consequence. There’s something deeply human about that kind of honesty.
An order is still an order. But at least in HMI, it means what it says.
This page uses no cookies or tracking.
©2023