I used to have a wonderful toaster. It was just the best thing since sliced bread.
It was simple. It always worked. It was efficient and fast, and got the job done.
Every day, for many years, it made my life better.
Then one day, the toaster updated itself. I went to make my breakfast and found that the buttons had moved and it looked a bit different. Before I could use it, I needed to accept some new privacy agreements and figure out how to adjust the settings.
But after a few weeks I grew to love the new interface.
It was my same toaster, just looked a bit fresher, and had a few extra bells and whistles I really didn't need.
At the core, it was still making my life better, every day.
Fast forward a year.
One morning, another update landed, and this one was a major one. I had to stand there, raw bread in my hand, clicking things and feeling like an idiot for 5 minutes while my toaster updated, before I could use it.
When the update finished, my toaster now looked more like a microwave with lots of new buttons and settings and options. Everything was different, even the toast slots had moved!
Another 5 minutes to work it out. The bread in my hand is already getting stale.
Finally, I find the "toast bread" option, and proceed with my morning routine. I'm not happy, but... it still worked.
I could still toast my bread.
I found it a bit baffling why all these changes were being made, and how I was supposed to be benefitting from them, but hey... I was still getting my toast.
So I adapted, and continued my daily morning routine, with just a tiny bit more friction around getting my toast made. All these new features seemed to be slowing my toaster down and complicating the basic "toast my bread" process, but it was still usable.
And on the whole, it still made my life better, every day.
Until this morning, October 4th, 2021, when something new happened.
My toaster had apparently updated again, and now it was talking to me. I walked into my kitchen, and suddenly pop-ups started appearing. I didn't ask for that. I wasn't even near my toaster. I wasn't even interested in making breakfast yet.
But for some reason, my toaster suddenly demanded my attention.
I felt like a chihuahua had moved into my kitchen overnight.
It was at that moment that I realized the friction my toaster was generating in my life had exceeded the value it was providing.
It was no longer making my life better.
I reflected on this as I calmly sipped my coffee, walked into my garden and headed for the shed...
Where I keep my sledgehammer.
Eh. I'd been meaning to try keto anyway.
Our World is Full of Toasters
Obviously, this is not a tale about toasters.
I use a lot of tools in my work. Dozens or even hundreds of software tools, libraries, applications. Not to mention the computers, monitors, keyboards, tablets and mice that are all part of making that work.
I've been all too aware lately of a disturbing trend towards increasing noisiness in the tech tools I'm using.
Tools are meant to make life easier, to make you more productive, and to reduce friction in your life. All of that breaks down when those tools introduce interference and distraction into your work environment.
Here are the most common types of interference I'm seeing...
Software updates
Oh. My. God.
I'm afraid to reboot my PC simply because of the number of updates I'll have to wade through.
These are often much too frequent, and they require interaction with each application separately.
The notifications that "there is an update available" are often launched as an in-your-face modal dialog on startup for each application. And these aren't for critical security issues or bug fixes; they're for new features that I don't have time to explore right now.
For non-critical updates, it somehow seems more logical to put these at application shutdown, as a "would you like to update silently after closing this app?"
Less friction please...
Overly complex security
Yes, I want my data safe & secure... but not from me.
I work from my home computer 99% of the time, which is locked safely inside my house.
Does Google really need me to re-authenticate myself every 48 hours, on every account that I use? Do I really need to confirm my phone number & email monthly, when it hasn't changed in the last 10 years?
Yes, it's still me, Google. And if someone were to break into my house and try to mess with my Google account, you'd assume it was me and let them right in.
So what are we accomplishing here?
When I am researching and browsing the web, do I really need to go through a cookie consent and configuration process on every website I visit?
There must be a better way.
Unhelpful features & complexity
Just like humans, software companies don't always deal with success well. They usually take that success and hire more developers, who are bored and looking for things to do.
They add widgets and features and add-ons to the product that... don't really make it any better. Often, those features just fall under the category of "bloat." Applications become larger, slower, and more complex.
Ultimately, basic functions are impacted, and companies destroy the very thing that made their product popular to begin with.
Why? If you have the resources and want to increase your market share, great: go build some wonderful, awesome apps.
But don't touch my @!%#$ toaster.
Practical Examples
This problem is broad; it's like a megatrend... I'm seeing it almost everywhere. But here are some of the worst offenders, which will serve to illustrate my point.
Evernote
This tale was inspired by Evernote, which greeted me this morning with a large desktop pop-up. For absolutely no reason.
For the past year I've watched Evernote change and bloat continually, warping it away from the clean, fast, simple tool that I once loved.
That love has faded.
It is no longer the nimble athlete that it once was. It has gained 100kg and just can't compete anymore on the sporting field. Especially in the pole-vaulting category, which it used to dominate.
For the past year, I've struggled with the periodic UI changes, including new bugs that would occasionally make it inaccessible on my Android phone. I call it the white screen of death.
Now I find the phone version useless. Even if it does let me open the app, the slowness (on my fast, new, expensive Android phone) makes it too painful. It's about 30 seconds from when I have the thought "I need to record this" to the point when I can actually write my note.
I could have tattooed it on my arm faster.
On my desktop, I sort of tolerate Evernote now, using it as little as possible since I have other apps that do the job more quickly or better.
It's important to share that I'm on the paid plan for Evernote. These aren't advertising pop-ups, or encouragements to upgrade. I see those as fair play. If you give someone a product for free, you have the right to nagware them a bit.
But I'm paying.
And I am tired of paying to be annoyed.
Apple Phones
I ditched Apple iPhones a few years ago because of the typical battery issues, and the increasing sluggishness of the device over time.
It was far too difficult to get files on to and off of my phone. I found the entire iTunes setup horrific, and difficult to use. Far worse, it consumed gigabytes of space on my system drive, and it cost me hours trying to free space so my PC could function.
In the end, there was just too much friction, and it increased until I threw in the towel and switched to Android.
Android Phones
I immediately wished I'd made that switch sooner.
Suddenly things were working again, like they were supposed to. My phone was fast, and configurable the way I wanted. I could copy files to and from it with ease.
Brilliant, and frictionless.
But lately, the friction is increasing. My Android phone now wants an OS update every couple of months, and when that happens, it greets me with a full-screen "update now" message every time I use it.
The problem is, I get that message every time I pick up my phone, and the reason I picked it up is that I need to use it now.
I can't wait for an update.
This, this moment right now, would be the worst possible time for an update.
So I dismiss the screen, and do what's needed. Once I'm done, I could actually afford the downtime of an update. But that screen is gone. I glance at the notifications area, and the settings menu... nothing obvious. Where is the option to do the OS update right now? It's buried somewhere... I'd need to do some Googling to find it.
More friction.
A better way would be a less obtrusive message, and one which allows me to quickly approve and schedule a time for the update, e.g. tonight at midnight. That could work.
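The "schedule it for tonight at midnight" flow needs only a little date arithmetic. Here's a minimal sketch in JavaScript; the function name and the midnight default are my own illustrative assumptions, not any vendor's actual update API:

```javascript
// Hypothetical sketch: defer a pending update to a user-chosen quiet
// hour instead of demanding attention right now.
function msUntilNextOccurrence(hour, now = new Date()) {
  const target = new Date(now);
  target.setHours(hour, 0, 0, 0);
  if (target <= now) {
    // That hour has already passed today, so schedule it for tomorrow.
    target.setDate(target.getDate() + 1);
  }
  return target.getTime() - now.getTime();
}

// Approving "update tonight at midnight" then becomes one call:
// setTimeout(runUpdate, msUntilNextOccurrence(0));
```

The point isn't the code, it's the interaction design: one tap to approve, zero interruptions at the moment the user actually needs the device.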
Microsoft Windows
I'm a big fan of Microsoft.
For my money, Microsoft makes some of the best development tools and office-productivity tools in the world, and that includes the Windows operating system.
For me, it offers the right balance of ease-of-use, and customizability, for all of the different types of work I do.
But... Windows Update. Need I say more?
For me, Windows Update is like a big nose wart on the Mona Lisa. It's just too intrusive on every level, and I've invested a fair bit of time learning how to disable the Windows Update pop-up alerts, and most importantly, its "automatic reboot" feature, which has destroyed unsaved work.
Sure, a power outage could have done the same thing... but that's why I have a UPS.
You don't expect your OS to be a threat to your work, and when it violates that trust, it's hard to rebuild.
I hope Windows 11 does better.
There's a Better Way
Any software developers or development companies reading this, I hope you'll take this to heart, and shift your perspective.
Success can't be measured by the number of features you've packed into a product, or by how many updates you release in a month.
It's measured by how much you tangibly improve the lives of your users, and how much friction you take away.
5 Rules to Code By
Here's how I see it...
#1 - Every feature is bloat to someone who doesn't need it
That's such an important perspective, that it bears repeating.
Every feature is bloat to someone who doesn't need it.
Whenever you're adding a feature, ask yourself "does everyone really need this, or should it be an optional add-on module?"
Quite often, it should be a separate product entirely.
The Rule: Never shove new features on users who didn't request them. Give users the option to reject changes they don't need. Consider this at the individual level. And by all means, track who is using which features in your product. You'll learn a lot about your users.
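That "track who is using which features" suggestion can start as something as simple as a counter keyed by feature name. A minimal sketch; the names and the in-memory storage are my own illustration, and a real product would persist and anonymize this data:

```javascript
// Tiny feature-usage tracker: record each use, then ask which
// features were never touched (candidates for an optional module).
function createFeatureTracker() {
  const counts = new Map();
  return {
    record(feature) {
      counts.set(feature, (counts.get(feature) ?? 0) + 1);
    },
    unused(allFeatures) {
      return allFeatures.filter((f) => !counts.has(f));
    },
  };
}
```

Even this crude picture tells you which features are bloat to most of your users, and which ones are the product.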
#2 - Updates are good, sometimes
Sometimes a software update is warranted. Major bug fixes and security risks should absolutely go to the front of the queue, and in those hopefully super-rare cases (is your QA team doing its job?) it's justified to alert the user the moment the application is launched.
But think of it as blocking the on-ramp to the motorway, during rush hour. You'd better have a very good reason for making everyone late to work. Like, you're saving lives.
The Rule: Make smooth traffic flow your top priority. The update process itself is a major impediment to that. Be very sparing with updates, and strategically position them for lowest impact. Suggesting updates at application launch is the highest impact, so reserve that for critical situations only.
#3 - UI changes are expensive
And they are more so to your users, than to you.
Even brilliantly-crafted, well-designed UI changes will force your users into a new learning curve, while they adapt and change their habits.
How would you feel if the turn signal on your car kept moving to a new location on the dashboard? You'd probably not be that appreciative, and neither would your rear bumper.
Most companies are wising up to the reality of these impacts and to change resistance, and they offer a "preview" of a new UI long before it's forced on users.
This is a good compromise. If you think you can make things more efficient, great, but give your users time to adapt, and the ability to choose when they have the time to tackle that learning curve.
Right now, at the moment they launch your app, they're looking to complete a specific task, so suddenly changing the game on them will not be appreciated.
The Rule: Minimize UI changes. Give users the ability to preview them when they're ready and have time. Consider offering "keep the old UI" as a long-term option.
#4 - Security is the enemy of usability
We're getting better at this, gradually, so I have hope.
Technology is improving, and I imagine that it won't be long before my smartwatch will know it's being worn by me, and when I'm wearing it near my keyboard, it will make for a pretty reliable form of authentication.
Someday, logins might even go away, with retinal scanning or facial recognition, or fingerprint detection on my mouse and keyboard.
It will all get better. I hope.
But right now, it feels like the friction is increasing.
More frequent logins, with more complex password requirements. Two-factor authentication, which means I can't do anything without my phone.
Let's collectively step back and rethink what we're trying to accomplish here.
The Rule: Be kind to your users. Don't demand overly complex security unless what you're protecting warrants it. Bank accounts, medical histories, private diaries, sure. Social media accounts, OK, yes, makes sense. But not everyone is storing the nuclear launch codes to a user's personal life. Does a recipe database really warrant two-factor authentication?
#5 - Frictionlessness, safety & usability are everyone's responsibility
The designers of a road and its traffic signals share just as much responsibility for the safety and comfort of its users as the car manufacturers do. But road designers usually get less attention, because it's harder to sue a government.
On the web, we're seeing the same dynamic: so-called "cookie consent" screens are popping up everywhere. Someone decided that cookies were dangerous and that users must be presented with the option to reject them.
This has led to a trend of overly aggressive compliance notices that require extra clicks and add all kinds of UX friction to websites worldwide.
I get that there are no standards yet... but wouldn't this make more sense to handle in the browser itself?
Should I, as a user, be able to configure my privacy settings in my web browser...
- YES / NO - I'm OK with being tracked by advertising agencies
- YES / NO - I'm OK with being tracked by analytics agencies
- YES / NO - I'm OK with a website remembering my preferences
- YES / NO - I'm OK with a website remembering my past visits & history
And choose to adjust those settings for any website I wish?
Shouldn't the browsers then deliver this to each website as part of the request header, and make it queryable to scripts through the Navigator object?
Or possibly even distinguish and "classify" cookies as security cookies, advertising cookies, analytics cookies, user-preferences cookies, and then the browser itself could determine which ones get saved, and which don't.
Any of these approaches would allow for a far smoother user experience, cut development costs, and improve legal compliance, all in one move.
Seems like the way to go.
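In fact, one real-world signal in exactly this spirit is the emerging Global Privacy Control proposal: supporting browsers send a "Sec-GPC: 1" request header and expose navigator.globalPrivacyControl to scripts. Here's a framework-neutral sketch of a server honoring that header; the shape of the headers object is an assumption, and a real site would also consult its applicable privacy laws:

```javascript
// Check whether the browser delivered an opt-out-of-tracking signal
// (Global Privacy Control) with the request, so the site can simply
// skip loading advertising/analytics scripts for that user.
function userOptedOutOfTracking(headers) {
  // Header names are case-insensitive; check common spellings.
  const value = headers["sec-gpc"] ?? headers["Sec-GPC"];
  return value === "1";
}

// Example usage inside a request handler:
// if (userOptedOutOfTracking(req.headers)) { /* omit trackers */ }
```

One preference, set once in the browser, honored everywhere: that's what frictionless compliance could look like.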
The Rule: If you're anywhere in the chain of delivery of a product or service, understand how you contribute to the overall UX, particularly when a new problem arises such as cookie-compliance laws, or GDPR, or COPPA. Yes, a hamburger company is responsible for the health & safety of its customers, but so are the grocery store and the refrigerated truck that transported the ingredients. Understand your part, the problems your end-users face, and how you can best contribute to the overall solutions.
In general, if a problem is being experienced market-wide, the best solution is usually found higher in the chain. Website owners probably shouldn't be trying to fix global cookie-compliance problems, when browser manufacturers can do it much more efficiently.
I hope that gave you some good things to ponder.
Good luck, and code well.