Mirror, Mirror in Your Brain, Can You Help the Computer Explain?
For many years, I have tried to understand why otherwise smart people, including professionals who synthesize massive amounts of information in their day-to-day work, cannot seem to master a desktop user interface. In contrast, many of these folks seem to have no trouble with an iPad. The iPad is the first device I’ve seen that you can hand to nearly anyone and have that person master basic functions right away.
An older neighbor has an ancient dial-up mail appliance that has started to sputter. Her stepson attempted to move her to a Windows laptop, but she would have none of it. That’s not a critique of Windows: I wouldn’t have tried to get her on a Mac, either. But she became interested in the iPad, so her stepson took her to an Apple Store for a demo. I then sold her an original iPad I no longer needed — her stepson had already set up broadband and Wi-Fi in her house — and walked her through using it.
Though she made several written notes, and claimed she would never remember what did what, I was confident that the iOS experience would work for her. Sure enough, later in the day, she sent me an email from the iPad. But it wasn’t simplicity that allowed her to use an unfamiliar device so quickly. What’s the key? I think I finally have it, and would love your opinion, dear readers.
Daniel Goleman’s book, “Emotional Intelligence,” introduced me to the concept of mirror neurons — a specific kind of neuron that fires both when you perform an action and when you observe others performing the same action. It’s speculated, and some evidence has accrued, that these neurons enable us to build a model in our heads of how other people act — a kind of internal simulation. This would explain how we can hold conversations (and arguments) in our heads with people we know, and anticipate their responses to our points. (There is some skepticism about mirror neurons, but I’ll run with the most convenient current explanation.)
This led me to examine my own use of a computer. When I work with a program that has a graphical interface, I anticipate and model the program’s expected behavior as I interact with it. I can imagine what the program’s response will be to nearly any action: essentially, a simulation of the program’s user interface runs inside my head in parallel with the real one. As I learn the interface better, much like getting to know a person better, my internal model adjusts itself to match the real interface more closely.
You can also liken this to how programmers think. I consider myself on the low end of professional programmers, despite thousands of hours — but not many thousands of hours — engaged in the task over the last 20 years. But the mark of a programmer is being able to run a program in her head. Such people have a built-in C or Java compiler, or a Perl or PHP interpreter, in there. They may not be able to run loops over a million lines of unique input, of course, but they know how the loop will function and what manipulations will be performed on the inputs. After all, you can’t write code effectively by punching it into an editor and wondering what will happen.
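To make that concrete, here is a minimal sketch of the kind of trivial loop a programmer can run in her head before the machine ever does; the language (Python here) and the values are purely illustrative, not drawn from any real program.

    # A trivial loop a programmer "runs" mentally: before executing it, she
    # already knows what manipulation is applied to each input and what
    # the output will be.
    values = [1, 2, 3]         # illustrative inputs
    results = []
    for v in values:
        results.append(v * 3)  # the manipulation performed on each input
    print(results)             # anticipated, then confirmed: [3, 6, 9]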
The folks I know who can’t master the traditional graphical interface seem to lack this internal simulation of what the computer will do in response to input — they’re not using the skills on which they rely when interacting with people to anticipate future behavior. For them, it’s like playing a game of Whac-A-Mole where they don’t know where something will pop up next, but they’d better hit it. Or, it feels like a sort of perverse Skinner Box experiment. Without an internal model, every response from the interface is a surprise; the user can never anticipate and thus interact fluidly with the interface.
If you’ll accept this view so far, it would also explain why some interfaces drive us bonkers. I’m not naming any names. But it’s not just a different or more difficult way of doing things. Most modern operating systems, desktop or mobile, let you carry out the same kinds of tasks with roughly the same number of steps and the same ease. But if you move among interfaces, and don’t use an unfamiliar one much, it’s as though the normal person you spend the day with, interacting via keyboard, has been replaced by some weirdo.
Think of it as if you came to work one day, and you had this conversation with a person who was sitting in the chair of a co-worker with whom you’d shared an office for years.
“Hi, Bill!”
“Who are you?”
“I’m John, your new officemate.”
“What happened to Justin?”
“He’s fine, but he’s out for a bit. I’ve been trained in all the same tasks, and I have all his manila folders right here, and I’ve been brought up to speed.”
“Well, okay. I’ll miss Justin a lot. Let’s get to work on the Wilson file.”
“OK, if you’ll just tell me which of these folders it’s in.”
“That one over there.”
“Which one? This one?”
“No, no, that one!”
“No need to get huffy. Now, if you’ll hand me a red pen, I can start to mark up the paperwork.”
“Justin always used a blue pen. And, anyway, I don’t understand those notations you’re using.”
“I can teach them to you. Almost everyone else in the office uses this kind of mark-up. Won’t take you more than a few days to get used to them, and maybe a few weeks to memorize them. Anyway, it’s time for lunch.”
“Sushi?”
“Never touch the stuff. But I know you’ll love a hoagie. In fact, I insist.”
Over time, one of two outcomes is likely. Either you’ll get used to John and his foibles (as you think of them), figure out how to work as efficiently with him as you did before, and grow to love hoagies. Or you’ll find John so maddening and inscrutable that you threaten to quit unless he’s transferred to a new position.
Where does the iPad fit into this theory? The iPad is literally more of a blank slate than any desktop interface, yet it requires you to build less of a model, because it already conforms to so many physical, real-world conventions.
Consider Apple’s use of gestures. They aren’t exactly intuitive, because although you don’t need to be taught to move your fingers, the specific actions aren’t the sort of thing we do every day. But they do mimic our expectations of the physical world, relying on existing experience as the base on which comfort with the interface is built.
The iPad’s insistence on full-screen apps shouldn’t be dismissed, either. There’s no management of items, but instead just a canvas on which activity occurs. We at TidBITS have talked before about how the iPad becomes the app you’re using. But that’s almost literally true in the mind. If you can play a game by using gestures, and don’t have to manage a keyboard, a file system, or a desktop on which you might accidentally click, that’s not just less to learn, it’s less to simulate.
This theory has just started to percolate through my fevered brain, and I wonder how you work. When you interact with a graphical interface, is it your friend or foe? Can you anticipate your Mac’s or iOS device’s every move? I’m betting that’s true of those who read TidBITS, but if you get a chance to ask someone who has more trouble with traditional interfaces, see if you can determine whether an inability to anticipate future behavior lies at the heart of the problem.
"The user already has a mental model that describes the task your software is enabling. This model arises from a combination of real-world experiences, experience with other software, and with computers in general. For example, users have real-world experience writing and mailing letters and most users have used email applications to write and send email. Based on this, a user has a conceptual model of this task that includes certain expectations, such as the ability to create a new letter, select a recipient, and send the letter. An email application that ignores the user’s mental model and does not meet at least some of the user’s expectations would be difficult and even unpleasant to use. This is because such an application imposes an unfamiliar conceptual model on its users instead of building on the knowledge and experiences those users already have."
http://developer.apple.com/library/mac/#documentation/UserExperience/Conceptual/AppleHIGuidelines/HIPrinciples/HIPrinciples.html
I suppose one way of summing up the iPad's intuitive power is to say that it's especially good at communicating which model it happens to be going for in any given app. The iPhone's tight design has lent iOS the directness you and everyone else have noticed, which the iPad carries well, with next to no clutter even on its much larger screen. It becomes the app. And the app developer's job is to do the obvious, well.
What I see in your descriptions, Glenn & Jaspar, is that Apple really has done its homework on the iOS GUI. Many of the hackers who are incredibly good with computing have Asperger's-style personalities whose mirror neurons are out of whack when it comes to understanding social interactions, but right at home understanding the machine. They can often begin to understand people by learning behaviour through Applied Behaviour Analysis (ABA).
Perhaps people who don't get along well with computers could learn better with the same principles of ABA? It would be just GREAT if iOS turned out to be a set of stepping stones. ABA lives at CARD: www.centerforautism.com/technology.php -> Skills -> www.skillsforautism.com/
Much of my own model is kinesthetic: moving stuff around to get things done, combined with visuals. I'm shockingly bad with gigantic texts because they don't make visuals for me, though I can at least spell, most of the time.
'Think Diff'rent RULES...'
I'm definitely wordy rather than kinesthetic, yet command lines, and therefore programming, almost completely baffle me, no matter how many times I've tried.
If it's not graphical, it's just not a computer. So goes my mind's prejudice, and yet I'll merrily read all day.
Holy cats. I never saw that. That's great.
My wife and I were discussing the impact of Steve Jobs' passing and the emotional connection people have with Apple's devices. She is an expert on the "mirror neuron" issue and has been researching in that direction for years now. By way of demonstrating to me how Steve is 'in the machine', and how Apple's particular approach to making things easier has created a large and loyal following, she picked up my iPod Touch and, having never used it or even tried to 'start it' before, got right to the main screen without any trouble. Then she handed it back to me, saying that this is what Steve put into the machine. People are responding to their computers in ways they never expected to.
I agree you are on to something here, and I endorse the idea of an acquired familiarity that is tailored to be a close representation of a real-world activity.
I don't know if mirror neurons apply to this domain, but I do think you're on to something. I've long felt that when someone needs a handwritten recipe for every task they want to accomplish on their computer, the real problem is that it never occurred to them that there could be a systematic model underlying the computer's behavior; they assume that everything it does is arbitrary and that mastery of one task will not carry over to other tasks. A naive user can still get flummoxed by the iOS interface, but there's definitely less there to flummox them.
If someone needs a handwritten recipe, they probably need that recipe broken down into sub-steps that make sense to them, not to the instructor. Often they can do it themselves, or at least indicate where the breaks need to be. That's how I learn from 'Take Control' ebooks, or, for example, from the URL Manager Pro manual last night.
Once the sub-steps have become fluent, creativity and productivity may leap out!
This would explain why I try to swipe my PC's screen all of the time when I move between my iPad and PC!
I get annoyed when I have to pick up a mouse when I'm not using an iPad. Funnily enough, this has led me to remember more keyboard commands. Go figure.
(Cryptic due to the 1K limit.)
Haven't read the mirror stuff, probably won't, but I definitely agree that mental models are critical. Jakob Nielsen has an excellent column on the subject:
http://www.useit.com/alertbox/mental-models.html
I've spent over 40 years programming, and I disagree with "the mark of a programmer is being able to run a program in her head." No, even programmers have mental models of how the computer *acts*, and only occasionally need to consider how it *operates*. I've seldom done the latter except when programming in assembler, and even then only as part of the task. Writing a SQL query (now a very common part of programming) is two or three levels abstracted.
All users need a model of how the machine will react to their actions. That's what's critical to the model: not "running a simulation" but understanding the mapping from action to reaction. Many people have no model of even one level below, yet use computers very effectively with a strong action-reaction model.
Simplifying those models helps.
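To put that action-to-reaction mapping in concrete terms, here is a minimal sketch (in Python, with an invented table and invented rows): the query describes the result the programmer wants, and her working model is of that mapping, not of the indexes, query planner, or machine instructions several levels below.

    import sqlite3

    # An in-memory database; the table and data are invented for illustration.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (customer TEXT, total REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)",
                     [("Alice", 30.0), ("Bob", 12.5), ("Alice", 7.5)])

    # The query states the desired reaction to this action; how the engine
    # scans, sorts, and aggregates is two or three levels of abstraction below.
    rows = conn.execute(
        "SELECT customer, SUM(total) FROM orders "
        "GROUP BY customer ORDER BY customer"
    ).fetchall()
    print(rows)  # predictable reaction: [('Alice', 37.5), ('Bob', 12.5)]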
I'm a huge fan of Jakob's work.
I also realized how powerful the iOS interface was at drawing in the individual when I handed my 88-year-old mother her new iPad.
Her experience using a PC (given to her so she might write some stories about her past) had left her barely able to remember what she had been taught about how to use it, so she was dubious about having to learn this tablet thing.
After only a week she insisted I set up some e-mail for her (as my father had relied on e-mail exclusively before he died) and then shocked me by asking to be signed up for Facebook.
After a couple of sessions (mostly to navigate the Facebook interface) she was flying on her own.
It's been a year now and she relies on it for all kinds of information and follows the daily postings of all her grandchildren religiously.
Although she has no idea why I occasionally connect it to my laptop and update things, she is very familiar with how it 'ought' to work.
Not sure I follow the mirror neuron connection here (I'm currently reading Ramachandran's Tell-Tale Brain, which expounds on the theory).
But it inspired my first thought on reading your analogy: perhaps the fact that you showed your neighbor, using your fingers directly on the device itself rather than at the remove of a mouse elsewhere (wireless or not), gave her a model she could imitate more easily. If you point to and manipulate (swipe, etc.) the mail app, you are showing her precisely where and how to interact with the visual object on screen. The human pointing finger is a very powerful tool indeed.
The physical action is direct and easy to imitate (to mirror) and remember.
Very interesting article, Glenn.
This is most interesting. I think you've identified a very powerful social force operating in concert with the iOS interface.
Almost everyone I know who uses Windows learned it from someone else. The same cannot be said for my many Mac-using friends. And I know of no one using an iPhone or iPad who felt they had to take a course for it.
But I see people in the Apple Store getting a demo, and it seems just so easy once the person giving it hands the device to the customer.
The mirror-neuron linkage here is that people's neurons are responding to social stimuli as well as visual-tactile stimuli. These responses may have more meaning on a larger social scale than just between you and the touchscreen.
I do think you are on to something here. But I would add one crucial element that I have noticed time and again when helping folks with computers. Many people approach the computer with the assumption that it is basically quirky and unpredictable. When frustrated, they might say "why the hell would it do that," but this is a rhetorical question that really means "this thing just doesn't make sense." The folks who end up with the internal model you are talking about START with the assumption that the answer is available and that it will make sense. Everything else is just added layers of experience.
In my experience, the basic assumption that, at its core, the computer is not really "knowable" is crucial. When I see this in someone, I know they will never be completely independent and comfortable using any computer system.
And, alas, Windows has been the biggest single reinforcement of that sense of helplessness for many millions of people.
I've found (in my parents' generation) that those who never touched a PC master iOS more quickly than those who have complex mazes of rote routines stored away from their computing past.
This disconnect with the computer, and a belief in black magic or the technology's own foul personality rather than any kind of sense, seems to be much more common if you didn't grow up with GUIs in the first place. The same applies to many advances throughout history.
I've used Macs since 1984, and so have an expectation that I will not need a manual. If there is a manual, I can become irritated and never use all of that application's functions. I hate Word, but stay with it because I am familiar with its foibles. I'm trying to switch to Pages, but learning new foibles is frustrating, even though I am motivated to learn them. The app or application has to work immediately and intuitively, and some new ones are not so intuitive. Back when the IBM XT came out, I bragged about my baby Mac, and an XT user said: "DOS is no problem. I just list the commands on a card and spend 10 minutes reviewing them every morning." That was as unacceptable to me then as it is now.
My daughter was able to operate my iPhone when she was 3 years old, with virtually no actual training; she just watched me use it. But the metaphor of moving her hand on a mouse to manipulate a cursor that's a foot away from her hand is very hard for her to master, even with assistance and training. I think touch, all by itself, eliminates the feeling of hardware and intimidating machinery, so all you need to figure out is the software itself. That's a much smaller and less daunting task.
I have a friend who has adult ADD (attention deficit disorder), so her brain's wiring is atypical. She uses XP a little (email, minimal browsing) because she memorized the actions; she doesn't want to get a 13-inch screen because "it's too big"; she refuses to try OS X; and she doesn't want to be taught: "just explain it and let me try it myself."
She has an iPhone 4 (it was a gift) but gets confused if she goes beyond email and a little browsing. I bought her a copy of Take Control of the iPhone, but she doesn't want to find an answer in it.
She is very bright and a successful teacher (of adults) and administrator. She used to prepare project proposals by following templates. I love my new 27" iMac and wish I could afford a bigger and better one. Sometimes we don't understand each other...