Why Your Bits Might Be Flipped (and What to Do about It)
While working on my latest book, “Are Your Bits Flipped?,” I spent lots of time thinking about the widespread phenomenon of technology misconceptions. Hardware and software have seen incredible advances in recent years, and we’ve all become more accustomed to products that would have seemed magical just decades ago. At the same time, basic misunderstandings about technology are rampant, with results that range from comical to tragic. I’d like to offer a few reflections on why that may be and what to do about it.
At a dinner party with some casual acquaintances, a guy asked me what I do for a living. I replied that I write books about technology. I mentioned a few of the topics and suggested that he might find them useful.
“I hope your books aren’t the kind that try to explain how things work,” he said, wrinkling his nose. “I don’t have time for that, and I’m not interested. Just tell me what to click — that’s all I care about.”
I took a deep breath and paused to think about how to reply diplomatically. I said that my books usually do have lots of explicit, step-by-step instructions but that they also provide background explanations so that people who want to know what’s going on behind the scenes can learn more. That clearly wasn’t good enough for him, and he became defensive when I gently suggested that a little bit of behind-the-scenes information could never hurt.
I quickly changed the subject. But I felt sad, because this fellow’s attitude made him prone to mistakes and ensured that he’d forever be relying on friends, technicians, and authors like me to figure things out for him and solve his problems.
Nobody is going to force you to understand the inner workings of your Mac or iPhone, an app or cloud service, or any other technology you might encounter. But however much you may try to distance yourself from the mysterious contents of the black box, your brain will automatically and involuntarily create a mental model that you’ll rely on to predict future behavior and make decisions. The further that model is from reality, the greater the chance you’ll experience confusion and frustration.
For example, suppose you use Gmail, and you notice that the Web sites you visit often have ads for exactly the products you’ve been discussing with a family member by email. Maybe that strikes you as mysterious or even suspicious, and you unconsciously form the opinion that someone at Google must be reading your email and actively passing along your interests to advertisers. That is, shall we say, a highly inaccurate description of what actually occurs. But if you’ve never looked into the details behind the technology, you might become excessively paranoid (or, at the other extreme, excessively trusting), and that can affect the way you use email, browse the Web, respond to advertising, and choose which products to buy.
More often than not, the initial flawed assumption is pretty tiny. Metaphorically speaking, it’s nothing more than a “flipped bit” — a one that should be a zero or vice versa — but it can lead the brain down the garden path to an elaborate misconception. For as long as I’ve been working in technology (over two decades), my mission has been to help people notice and eliminate these errors and thereby become smarter about how they use their tools.
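The metaphor is literal at the machine level: in binary, toggling a single digit changes the value entirely. As a small illustrative sketch (the function name here is my own, not anything from the book), a bit can be flipped with an XOR operation:

```python
def flip_bit(value: int, position: int) -> int:
    """Toggle one binary digit of an integer using XOR."""
    return value ^ (1 << position)

original = 0b0100               # decimal 4
corrupted = flip_bit(original, 0)  # lowest bit toggles: 0b0101, decimal 5
print(original, corrupted)      # 4 5
```

One flipped bit turns 4 into 5 — a tiny change in storage, a different number in meaning, much like a tiny flawed assumption reshaping a whole mental model.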
And this is what I have to say to anyone whose brain has led them astray: it’s not your fault.
You know the old story about the guy whose computer’s broken “cup holder” turned out to be the CD-ROM tray. Or the people who see the message “Press any key,” get stuck, and call tech support because they can’t find the Any key. There are countless tales like these, and when we hear them, we all laugh and roll our eyes, baffled that anyone could be that stupid.
But stupidity isn’t the right diagnosis. Rather, the people who create products have a set of assumptions, but the people who buy and use those products don’t necessarily share the same assumptions; they have different mental models. If stupidity is indeed at work, one could just as easily say it was stupid of designers and engineers not to put themselves in the position of their customers and imagine what things would look like from their perspective.
Some of us may have more experience than others, or mental models that are more likely to match those of product designers, but anyone can experience a misconception about technology. If you have a bit flipped, it’s not your fault. But if you have the opportunity to correct a flipped bit, you should take it!
In my opinion, the people who develop technology and the people who use it should be willing to meet each other halfway. You shouldn’t have to be a technology expert to use a computer or smartphone, but on the other hand, following a policy of willful ignorance does no one any good. By all means, complain about foolish design choices, but also, read the manual. Urge developers to make their products “just work,” but also, take responsibility for understanding how they’re intended to be used.
There’s no shame in not knowing something, or in being honestly mistaken. But being willing to learn more — to eliminate incorrect assumptions and improve your mental models — is a virtue.
Whenever I write a TidBITS article or a Take Control book, I start with the assumption that I understand the subject matter pretty well and end with the assumption that I’ve explained it clearly. Both assumptions are often wrong! Like everyone, I sometimes get my bits flipped. Tech reviewers and commenters point out factual errors that slipped past me, while editors and readers tell me when my explanations baffled them, despite my best efforts. Although I may grumble or gripe, I do my best to accept these corrections with equanimity. They’re learning experiences. They mean I’ve just gotten a tiny bit smarter, and when I next revise the article or book, it’ll be that much better.
Whether as technologists or as consumers, we can all benefit from examining our assumptions from time to time. The more willing we are to acknowledge our flipped bits and improve our understanding, the better our relationship with technology (and with other people) will be.
In that spirit, I invite you to take a look at “Are Your Bits Flipped?” Let me be candid: if you’re a dyed-in-the-wool “just tell me what to click” person, this book is not for you. It’s not a how-to book, like most Take Control titles. Instead, it’s a series of essays about common tech misconceptions, inspired by my FlippedBITS series of TidBITS articles. For each myth or misunderstanding, it explains what’s really going on in a friendly, down-to-earth way. Not only will it help to set your brain on the straight and narrow, it will also help you understand what may have led some of those bits to flip in the first place and learn what you can do to avoid future misunderstandings. I hope you find it both entertaining and educational, and I’ll eagerly await your corrections.
Thanks, Joe. I agree that this is not stupidity on most people's part but rather lack of interest or, worse, intellectual laziness. Sorry to digress, but think of the 40% of Americans who still believe in Adam and Eve, or who hold other scientific misconceptions. And let's not even talk about religion in general, which in many ways is the everyday equivalent of "Just tell me what to click!".
Misconceptions are extremely common in all realms of thought, and in fact my book includes an appendix that lists books about misconceptions in areas such as economics, psychology, and history.
As someone who has experienced many misconceptions myself, I try to be generous with people who (for whatever reason) haven't seen past theirs yet. We all have approximations of the truth, and I think the important thing is being willing to ask questions and revise one's beliefs as new information presents itself.
Religion in general in many ways is the everyday equivalent of "Just tell me what to click!"
Nice one. Stored in my quotes collection, thanks!
As far as I can tell there are two kinds of people: those who understand concepts and those who rely on mnemonics ("ezelsbruggetjes" or "donkey bridges" in Dutch).
In my experience it's a hard-wired thing. Explain something to a concept person and you rarely see them back on the issue. Explain something to a mnemonics person and you will see them back with every iteration of the same problem.
Mnemonics people will always be writing down full sequences of manipulations that will lead to a desired result in a specific situation. They will have notebooks full of them without ever being able to figure out the unifying concepts behind them.
Long ago I used to believe I could explain concepts to anybody in order to help resolve problems. No more. I now try to determine what group the person I try to assist belongs to, and proceed accordingly. Of course there may be some overlap between the two groups, but the divide still presents itself quite prominently.
As computer (and latterly mobile phone) technology has become more reliable, two things have followed: first, a broader uptake of these devices, and second, delivery that is more like a service.
When I turn on a power point at home, I want the electricity, and I really don't care how it works. About as close as I get to thinking about how it works is purchasing green power.
When I withdraw cash from an ATM, I'm not concerned with how it works.
When I run up a VM on AWS, I'm not worried about how it works.
Although I'm a tech head of 33 years, and have been working in IT since 1991, I have complete empathy for the guy at the dinner party. Technology these days is marketed, sold and delivered as a service.
As I say, it's not necessary to be a tech expert to use a computer or mobile phone, and I'm not talking about knowing that level of detail. At the same time, it's unreasonable to expect all technology to be completely self-explanatory. Indeed, the guy at the dinner party did not expect that; he just wanted someone else to do all the learning for him :-).
I agree that you don't expect *all* tech to be self-explanatory. But technology that underpins behaviour sold as a service needs to be simple enough (from a UX perspective) that you don't need to care about, or worry about, how it works under the hood. As certain things move more toward a utility or service model, we as technical folks need to be prepared for this, IMHO.
As for people who are too lazy to learn, that's a different discussion :)
I think the jargon-y use of 'service' obfuscates the issue. MS Word 4.0 shipped with one thick printed manual and two thinner ones, and it was *merely* a word processing program. Today's laptops and smartphones come with tiny illustrated brochures, and most apps come with no documentation whatsoever. This catering to someone's notion of end-user experience makes designers' jobs even more constrained and reduces everyone's experience to the lowest common denominator. Imagine having to cook on a one-button stovetop with one shape of pot and one shape of pan. That's what we're ending up with in the supposed 'service' model of tech consumables. We need to get back to expecting more of ourselves.