Have We Entered a Post-Literate Technological Age?
Not long ago, Google produced a video that’s making the rounds on the Internet. In it, a Google employee asks people in Times Square in New York City a series of questions, such as “What is a browser?”, “What browser do you use?”, and “Have you heard of Google Chrome?” (Chrome is Google’s new Web browser; it’s available for Windows and in pre-release test versions for the Mac.)
Among the geek set, the video has gotten a lot of play because most of the people in the video – who appear to be functional adults and who use the Internet regularly – come off as highly clueless. According to the video, only 8 percent of people queried that day knew what a browser is.
The video is clearly not a scientific study, and suffers from horrible methodology. It’s likely, for instance, that simply asking “What is a Web browser?” would have produced better results, and the middle of Times Square is undoubtedly not where most people are thinking about the names of programs on their computers. But let’s leave aside such criticisms for the moment.
What’s Your Browser? — Instead, let’s take the results at face value and consider their implications. What does it say about the technological world in which we live that 92 percent of the people asked could not name the program they use to access the Web? If other statistics are to be believed, browsing the Web is the primary use of computers today, which means these people couldn’t name the program they use more than any other.
Worse, some of the answers in the video reveal that respondents don’t even know what a program is. A number of them identified their browser as “a search engine” or “Google.” When asked which browser he used, one guy said “the big E,” undoubtedly meaning Microsoft Internet Explorer, which has a stylized lowercase letter E as its icon.
When the best someone can come up with is a vague recollection of a program’s icon, it says to me that we’ve entered a “post-literate” technological society, one in which people have lost not just the ability to read and write about a topic, but also the ability to speak about it, all while retaining the ability to use it.
As someone who earns a living crafting text to help people learn how to use technology, I found myself profoundly troubled by Google’s video. After all, if someone doesn’t know what browser they use, or even that a browser is a program on their computer, how could I possibly expect them to be interested in buying my company’s “Take Control of Safari 4” book (written, with infinite care, by the estimable Sharon Zardetto)? How could they even learn of its existence, if they had no idea that Safari is a Web browser or that they were using Safari?
(One concern that I don’t explore further in this article is the implications of a post-literate technological society for marketing technology itself – will even technology marketing be forced to rely solely on pretty pictures and emotional appeals? In fact, are we already there? Apple’s “I’m a Mac” ads help customers identify with the actor playing the Mac but give little solid information, and Apple conceals many technical specifications about the iPhone.)
But perhaps I’m barking up the wrong tree, and Google’s video in fact shows that we’ve taken great technological strides. TidBITS editor Glenn Fleishman, when we were discussing the video, suggested that it’s a good thing that the Web browser has become so ubiquitous that people need not know what it’s called to use it effectively.
(Linguistically, this same devolution has happened with the Web itself. Although it’s TidBITS house style to capitalize “Web” – a proper noun that’s a shortening of “World Wide Web” – it’s commonplace to see even professionally edited publications lowercase the word, thus de-emphasizing the fact that it’s a unique thing. I think they’re wrong: “Web” should always be capitalized, as should “Internet.”)
From a usability stance, I think I agree with Glenn – it’s a good thing that using the Web has become so easy that a myriad of people can do so without even knowing the name of the tool they use to access it. Most people just use the browser that comes bundled with their computer, and despite the issues with Microsoft Internet Explorer over the years, Firefox has garnered only a bit over 20 percent of the browser market since 2004 – largely from the small subset of people who know what a browser is.
On a platform like the iPhone, it’s even easier to see this trend toward obscuring the identity of the browser. Although Safari is the iPhone’s Web browser, and its icon is clearly named, applications like Twitterrific can display Web content internally, and others, like Mail, can open a Web link in Safari without ever informing you that Safari is displaying your page. It would be difficult to quibble with someone who didn’t realize that their iPhone browser was Safari, when in fact, much of the time they would be viewing the Web via some other app that piggybacks on top of OS X’s WebKit core.
Tied up in all of this is the fact that if what’s bundled with your computer or phone just works, you don’t need to learn much more. Dissatisfaction is the mother of exploration – only if Safari or Internet Explorer isn’t meeting your needs do you have much impetus to learn about and switch to Firefox. So the better technology works, the less we’ll learn about how it works. I can’t say that’s entirely a bad thing.
When the Thing Breaks — But I remain troubled by this post-literate inability to talk about everyday activities and the tools used to perform them, using the proper nouns that are not only generally agreed-upon by those in the know, but with which the graphical representations of those tools are clearly labeled. What happens when something goes wrong, and such a person can’t connect to the Internet at all? Can you imagine the tech support call?
"Hi, this is tech support. How may I help you?"
"I can't get on the Google."
"OK, what browser are you using?"
"I told you - Google."
"Let's step back for a second. What program are you running on your computer to access the Web?"
"I don't know - I just Google when I want to find something."
"Perhaps we should go a bit further back. What icon do you click on when you want to use Google?"
"The picture? It's blue and kind of round, I think."
"OK, that's probably Internet Explorer. Can you load any Web sites other than Google?"
"If I can't get on Google, how can I load any other Web sites?!"
I could draw this out further, but it’s not far-fetched (TidBITS staffer Doug McLean confirmed that my contrived dialog was painfully reminiscent of tech support calls he took in a previous job). In essence, the caller and the support rep don’t share a common language. They may both be speaking English, but that’s as far as it goes, and as soon as domain-specific words like “browser” come into play, communication breaks down. A good support rep would undoubtedly adjust his questions upon realizing that there’s a terminology barrier, and like Captain Kirk meeting an alien, would attempt to build up some shared terminology based on visual appearance before attempting to solve the problem.
Generational Problem Solving — If I asked you to tell me something about the caller in my fabricated script above, you might fall back on stereotypes and describe the caller as being elderly, or at least as someone who didn’t grow up with technology and therefore has come to it, perhaps grudgingly, later in life. But what if I told you it could be a college student?
My neighbor Peter Rothbart teaches music at Ithaca College, and he’s been noticing a disturbing trend among his students. Although they’re capable of using the digital music software necessary for his courses, he says that many of them have trouble with the most basic of computer tasks, like saving files in a particular location on the hard disk. Worse, if something does go wrong, he finds, they have absolutely no idea how to solve the problem.
These aren’t the sort of kids who were befuddled by high school – they’re students at a well-respected institution of higher education. (It’s the alma mater of Disney CEO Robert Iger, for instance.) No, they’re not computer science majors, but they’re not being asked to program, just to use off-the-shelf music software and perform commonplace tasks. And those commonplace tasks are apparently not only something they’ve never had to do, but also something they lack the skills to figure out on their own.
Could this inability to solve a problem with a device with which they are otherwise familiar be a result of losing some ability to talk about it? I wouldn’t go so far as to say it’s impossible to troubleshoot without terminology, but it’s less radical to suggest that troubleshooting will become more difficult without being able to communicate effectively with people who are experts in the field.
Not all that long ago, when adults had trouble getting something working on a computer, they would sarcastically say that they needed a teenager to explain it to them. That was largely true of those of us who were teenagers in the 1980s and 1990s, but if Peter Rothbart’s experience is at all representative, today you’d be better off finding a 30- or 40-year-old geek to help.
Don’t get me wrong – I’m not saying that all young people are incapable of solving technical problems or going beyond the basics. My friend Dave Burbank, whose full-time job is as a fireman in the City of Ithaca, is also a serious geek known for taking hundreds of photos on his kids’ class trips, posting constant updates via Twitter, and updating a photo Web site for the trip before turning in each night. His 15-year-old son Istvan is currently a 3D animator at Moving Box Studios in Ithaca and is perfectly capable of maintaining a technical discussion on the evolution of backup media and other such geeky topics.
In other words, there will always be geeks, and in my mind, that’s a darn good thing. The technological sophistication of those people of my generation (I’m 41 now) who were interested in technology created the meme that young people were fluid with technology. But what we all missed was that being fluid with technology doesn’t mean you understand how it works or can fix it when it breaks. Being able to dash off text messages on a mobile phone demonstrates fluidity; being able to troubleshoot a dead Internet connection down to a corrupted preference file or flaky cable demonstrates understanding.
So what will most members of society do when something on their computers or smartphones fails to work? Let’s not pretend that problems won’t happen – technology may have become more reliable over time, but the rate at which things go wrong even for undemanding users is still shamefully high.
Just recently, my father called because his iPod wouldn’t show up in iTunes. After some back and forth, I suggested that he reset the iPod, and when he went to use it, he realized it was indeed entirely frozen. A hard reset brought it back to life and resolved his problem, but had he been on his own, it’s possible that he – or at least someone less experienced than he is – would have concluded it was broken and bought another one.
This isn’t a new concern. In 1909, E.M. Forster wrote a piece of early science fiction, “The Machine Stops,” in which he imagined a future in which face-to-face contact was considered bizarre, humanity lived underground, and the “Machine” met everyone’s needs. Of course, one day… the machine stopped. More recently and amusingly, consider the Pixar movie “WALL-E.”
Cars and Computers — The obvious analogy in today’s world, and one that several people have suggested in response to our discussions, is the car. At one time, knowledge of keeping a car running was a kind of patriarchal rite of passage. Failure to monitor oil levels, radiator fluids, and other factors could lead to a dead horseless carriage.
Few people know how cars work these days, and even those of us who do have a basic understanding of them can’t really work on a modern car. If the car stutters when accelerating, or sometimes won’t start, most of us simply take it in to the repair shop and get it fixed. Problem solved with the application of money, and of course, since cars work relatively well these days, much less monitoring is needed. When was the last time you checked your car’s fluids?
Like so many automotive analogies, this one sounds good, but suffers under scrutiny. In part, repairing cars has become a specialty not so much because intelligent people couldn’t understand what’s wrong or figure out how to troubleshoot it, but because the training and equipment necessary to diagnose problems and effect repairs have themselves become highly specialized. Gone are the days when you could fix a car with a few screwdrivers and a set of wrenches. The shops all download data from the car computer for diagnosis.
But the more serious problem with the analogy is that cars are single-purpose machines – they do one thing, and they do it moderately well. Thus, the type of problems they can suffer, while troubling, frustrating, and sometimes seemingly inexplicable, are still relatively limited in scope, more like a household appliance. How often do you have to check the inner workings of your washing machine or refrigerator?
In contrast, computers are general purpose machines that can perform a vast number of wildly different tasks, such as browsing the Web, reading email, writing a book, developing a company budget, tracking a database of customers, composing music, editing video, and so on.
We have up-and-coming geeks like Istvan Burbank, but even bright young men like Istvan have their limits. While I’d happily ask him to fix a Mac that’s not booting, I’m not sure he’d have any idea how to help if I showed him a PDF where the text on some pages appeared darker and bitmapped when viewed in certain PDF readers (even Adobe hasn’t been able to fix that problem reliably for me). There’s a limit to how much any one of us can learn, but there’s no limit to what a computer can do.
In a way, this is an odd situation for those of us who grew up with the personal computer. Before Apple, before the IBM PC, we had mainframes and minicomputers that we interacted with via dumb terminals. You couldn’t do all that much, and you were sharing resources with many other people, but you also didn’t have to worry about things going wrong as much, because when they did, the computer operators would fix them.
They were the gatekeepers, the wizards who controlled access and could say who was allowed to do what. Personal computers were supposed to democratize computing so anyone and everyone could do their own work. While that’s come to pass in some ways, it seems to me that we’ve returned to the days when you need a wizard to solve problems or do anything beyond the norm. It’s a somewhat uncomfortable situation, since those of us who grew up with personal computers are finding that we’re the new wizards.
Technological Illiteracy — So how did we get here? I’d argue that Apple – and we Macintosh users – are perhaps more to blame for this state of affairs than any other group. After all, no one has championed usability like Apple, with the Mac’s vaunted ease-of-use. For years, many Mac users scoffed at manuals. “Why would anyone need a manual when the program is so easy to use?” they’d ask. It was a fair point, for the users of the time, who were highly interested in the technology, well versed in how it actually worked under the hood, and amenable to poking and prodding when things didn’t go right.
But then we got our wish, and ever more companies started writing software that was easy enough for most people to use without reading a manual, at least at some level. That was the death of documentation, a phrase I coined more than 10 years ago (see “The Death of Documentation,” 1998-05-04). Of course, it was really the death of the manual; technical books have remained popular, in part because of the lack of manuals (how else could David Pogue have made a mint on his Missing Manual series?).
Even still, back when I started writing technical books in the early-to-mid 1990s, the average computer book would sell about 12,000 copies. Today, despite a vastly larger audience (though with much more competition), 5,000 copies is considered acceptable.
I’d argue there was a more insidious effect from the loss of manuals – it caused an entire class of users to become technologically functional while remaining technologically illiterate. When I asked my mother-in-law, Linda Byard, what browser she used, she became somewhat flustered and guessed at Outlook. This is a woman who uses the Web fluidly and for all sorts of tasks far more sophisticated than simply browsing static Web pages. And yet, the fact that she used Internet Explorer to do so escaped her.
As the conversation proceeded (and keep in mind that my father-in-law, Cory Byard, helped design personal computers for NCR back in the 1980s and now consults on massive database projects for Teradata – Tonya didn’t grow up in a technologically backward household), it came out that Linda had stopped reading about how to use technology when manuals gave way to inferior online help.
She didn’t stop learning how to use various programs, but without any sort of formalized instruction or written reference, she lost the terminology necessary to talk about the technology she was using. Of course, she had Cory around to fix anything that went wrong, and she said that the same was true of all her peers too – there was always someone technologically adept in the family to deal with troubles.
Although it’s harder to pin this loss of technological literacy on the lack of manuals when looking at schoolkids, the problem isn’t necessarily being addressed there either. When my son Tristan was in second and third grade in the public schools in Ithaca, NY, the closest he came to being taught computer skills was typing (not a terrible idea, but tricky for kids whose hands aren’t large enough to touch-type properly) and PowerPoint.
Although some level of presentation skill is certainly worthwhile, why would you have second graders focus on something that’s guaranteed to be different (if not entirely obsolete) by the time they’re in college?
I’d argue that some of the basics of technology – the concept of a program as a set of instructions and the essentials of networking – would be both more compelling for kids and more useful for understanding the way the world works later in life.
When TidBITS contributing editor Matt Neuburg tried to teach a group of his friends’ kids REALbasic one summer, he found himself frustrated at almost every turn – they lacked the conceptual underpinning that they could make the computer do something. And more important, they didn’t care, since they were accustomed to technology just working. It wasn’t until he got them to draw a stick figure and, by changing the location of its parts repeatedly, make it walk across the screen, that one of them said, “Hey, this must be how my video games are made.”
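Matt’s stick-figure exercise boils down to a single idea: animation is nothing more than drawing something, moving it slightly, and drawing it again. Here’s a minimal text-based sketch of that idea in Python – a toy of my own devising, not Matt’s actual REALbasic lesson:

```python
def walk_frames(width, figure="o"):
    """Build the text 'frames' of a figure stepping across a line.

    Each frame shifts the figure one column to the right; flipped
    through quickly, the frames make the figure appear to walk.
    """
    return [" " * x + figure for x in range(width)]

# Print the frames in sequence to "animate" the walk.
for frame in walk_frames(5):
    print(frame)
```

The payoff is the same "aha" Matt's student had: a video game is just this loop, drawn fast enough that the eye sees motion instead of redrawing.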
And networking? No, you don’t need to know how it works to use the Internet, but isn’t it wondrous that an email message sent to a friend on the other side of the globe in Australia is broken up into many small pieces, shuttled from computer to computer at nearly the speed of light, and reassembled at its destination, no more than seconds later? Wouldn’t it be fun to act out a packet-switched network with an entire class of second graders and the pieces of a floor puzzle? Or at least more fun than PowerPoint?
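The floor-puzzle exercise can be sketched in a few lines of Python as well. This is a drastically simplified illustration of my own – real networks add addressing, checksums, and retransmission, none of which appears here – but it shows the essential trick: number the pieces, let them arrive in any order, and reassemble by number.

```python
import random

def send(message, packet_size=4):
    """Chop a message into numbered packets, then shuffle them to
    simulate packets taking different routes and arriving out of order."""
    packets = [(seq, message[i:i + packet_size])
               for seq, i in enumerate(range(0, len(message), packet_size))]
    random.shuffle(packets)
    return packets

def receive(packets):
    """Reassemble the original message by sorting on sequence number."""
    return "".join(chunk for _, chunk in sorted(packets))

greeting = "Hello from the other side of the globe!"
assert receive(send(greeting)) == greeting
```

With a class of second graders, each child is a router holding one numbered puzzle piece, and the sort at the end is everyone lining up in order.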
Luckily, this lack in the public education system isn’t uniform. Glenn Fleishman’s son Ben is about to enter a public elementary school in Seattle, where the beginning curriculum teaches kids about opening, saving, and printing files; later, it moves to task-based – not program-oriented – computer projects. That’s much better.
But I digress.
Illiteracy Stifling Innovation? — My more serious concern with our society’s odd fluency with a technology that we cannot easily communicate about is that it might slowly stifle innovation. Already we’re in a situation where browser innovation is almost the sole province of Apple and Microsoft, with contributions from Mozilla, Google, and maybe Opera.
Iterative changes from the incumbents can be worked in, since everyone will be forced to accept them, but does it become harder to convince most people to try a ground-breaking new technology because it’s different, because it’s talked about using strange new terminology, and perhaps because no paradigm-shifting new technology can by definition be so easy to use that it doesn’t require some level of training? I fear that might be the case.
In the dawn of the computer age, the stakes weren’t as high and the market wasn’t as large, so I’d suggest that companies were more likely to take risks on innovative technologies that might appeal to only a small subset of the population. Today, with everyone using technology, I suspect that business plans and funding proposals all assume a large potential audience, which in turn causes the ideas to be vetted more on their business chances than their technological innovation.
Put another way, there have always been technological haves and have-nots, but since there was no chance of selling technology to the have-nots, technology of the past was less limited by the literacy of the audience. Since the technologically illiterate are now not just buying technology but are its primary market, that has to be affecting, in a real way, the kind of ideas that get funding and are being developed.
Plus, think back to the point about dissatisfaction being the mother of exploration. We geeks may be willing to belly up to the new technology feeding trough since we’re never satisfied. But once technology reaches a certain plateau of working well enough, if this lack of technological literacy is indeed a more general concern, spreading technological successes into the population as a whole may become all the more difficult.
I’m fully aware that my musings here are largely hypothetical and based on anecdotal evidence. But I think there’s a new technology on the horizon that could serve as a test of my theory that anything sufficiently innovative will face an uphill battle due to the technological illiteracy of the user base: Google Wave.
For those who didn’t see Google’s announcement of Google Wave (we didn’t cover it in TidBITS at the time because it was a technology announcement, not a service that people could use), it’s a personal communication and collaboration tool that’s designed to merge the strengths of email, instant messaging, wikis, and social networking services. (You can read more about it at Wikipedia.)
On the plus side, Google Wave has the power of Google behind it, and Google could potentially merge it into Gmail, thus introducing it to 146 million users nearly instantaneously. But Google Wave will undoubtedly be quite different from Gmail, and will require a learning curve. Will that hamper its adoption, since email and instant messaging and other services work well enough that people aren’t sufficiently dissatisfied to learn about and try Google Wave? Only time will tell.
Being completely jaded, I've got a more generalized theory on why this is the case:
People are lazy, and make no effort.
Growing up (say, late grade school/high school), my friends and I would actually spend time going out of our way to learn things about technology. These weren't geeks either - straight up average folks. They'd read the manual for their digital camera or phone. They'd get a book at Barnes & Noble about a piece of software or set of technologies.
Nowadays, I look at the 14-24 set, and I'm sorry, but they're just lazy as heck. Where and how it started I couldn't even begin to tell you. Doing some IT support I've seen plenty of times where people would just *zone out* while explaining something to them - in plain, second grade-level English, mind you.
*sigh* I fear for the future.
One bit that I didn't slide in was a conversation with Glenn where he pointed out that people gravitate to the easiest possible technology when they're learning - that may explain why text-messaging is so popular among the younger set, since there's so little to learn and almost no interface. Contrast that with email, where there's more to learn, more to write, more interface, and more actions to perform. The people who rely on email are those for whom it's second nature, having been easier than snail mail.
The shortest distance between two points is a straight line (a "law" which landscape designers know all too well... a meandering path is a wonderful existential experience, but it's a pain in the ... for anyone who has to "get there fast"). Consequently one of the basic rules of just about anything these days is "faster, easier, little effort." Too bad, because a challenge, albeit a minor one, can be a true "aha" experience.
Fortunately, every generation (every decade) since history began, people have been complaining about the younger generation and worrying about the future.
Strangely, that makes me more sanguine rather than less.
Although, as I note in the article, I think there was a point at which this wasn't true - when teenagers were seen as the ultimate technology mavens.
It's the kind of lazy that is insidious. I hear it all the time from family members who want the computer to do what they want it to do - except for when they don't.
They don't care to learn ANYTHING since learning requires THINKING and thinking "hurts".
If the teacher learned to speak proper English rather than Geekese, it might not hurt so much.
It's tricky, though, since "geekese" _is_ proper English in a domain-specific topic. Any new field has to create its own terminology because it's new, and Shakespeare's English lacks the vocabulary for new technologies.
My customer may not know what browser she's using - but my mechanic's customer (that would be me) doesn't know what kind of serpentine belt or fuel injection my car uses.
I do bemoan general scientific illiteracy, and even moreso, critical thinking skills. But with the one exception of people not knowing how to find their files, I don't see ignorance of the ins and outs of computers as much of a problem. I think that computers should be even more transparent and those of us who help others use them more practiced in bridging the tech-lay language barrier.
I think there's a difference though - when you're talking about things that are wrong with a car, you're talking about things in the real world that we all share - grinding noises, brakes that don't stop the car as quickly as expected (and we know they're called "brakes"), the wheel pulling to the right...
With computers, the things that go wrong are far less concrete - how do you describe the lack of connectivity caused by getting a self-assigned IP address instead of a proper one from your DHCP server? All we have to go on are error messages from programs, and those require shared terminology and accurate communication that's often not present.
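For what it's worth, the self-assigned-address case is one of the few such failures a program can actually recognize: a machine that gets no answer from a DHCP server falls back to an address in the 169.254.0.0/16 link-local block. A small Python sketch (the function name is mine) shows the check:

```python
import ipaddress

# Addresses in this block are "self-assigned" (APIPA/link-local):
# the machine picked one itself after getting no DHCP response.
LINK_LOCAL = ipaddress.ip_network("169.254.0.0/16")

def is_self_assigned(address):
    """True if this looks like a link-local fallback address, meaning
    the machine most likely never heard back from a DHCP server."""
    return ipaddress.ip_address(address) in LINK_LOCAL

is_self_assigned("169.254.12.7")   # self-assigned: DHCP probably failed
is_self_assigned("192.168.1.42")   # a typical DHCP-assigned address
```

But even with a check like this, explaining the result to the caller still requires the shared terminology that's so often missing.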
Funny thing is that one of your "concerns" (e.g. a person that identifies the Internet and his browser with Google) could be -quite often- the solution to the other one (people who don't understand -not even at a high level- the technology they use so they're not able to troubleshoot it).
Example: if your father googled the terms "iPod doesn't show up in iTunes", after a little searching he could have found out what his problem was without calling you.
And now the circle is closing again: you need to know how to search and how to conceptualize a Web search to make it effective. So, again, it's a problem of cultural code: people who identify the Internet and their browser with Google are not able to fully understand the "search engine" mechanism and therefore they're not able to use the technology properly.
Indeed - terminology is extremely important for searching, and I wouldn't be surprised if there were firms working on technologies that would interview the user to figure out how to perform the search.
Certainly, even I have trouble with some searches when the necessary terms are all generic or heavily used words. (I was looking for an HTML generation tool called Gutenberg the other day; it's impossible to learn about because all hits on Gutenberg point at Project Gutenberg. The program turned out to be an internal Apple tool, but I found that only by asking someone who knew.)
Hmmmm, interesting article! The search problem is an old one - just ask librarians :)
I've observed that people are generally really poor at analysing what it is they are looking for, much less actually putting that into terminology with all the variations and nuances that may entail.
I remember talking with my dad about looking for information about how to make or build a wind powered turbine. He just popped "wind turbine" in the browser and was frustrated by the number of hits that were irrelevant to him.
It took ages to tease more information out of him and identify the context - teaching people (and our kids) to identify "who, what, when, where, why" as part of their search needs would be useful. I think we're still some way off programs intuiting this for us!
They are not able to use the technology properly because they don't care to take the time to learn.
"It should just be easy" is what I hear from some of my family. My frustrated response is usually, "You should use your brain. It needs the exercise."
I am more concerned about the general lack of literacy in our country. A problem of much greater concern than simply technological illiteracy.
People cannot read intelligently nor can they formulate intelligible writings. They lack logic and problem solving skills - what would be generally called critical thinking skills.
The blame for this falls squarely on the shoulders of a failed educational system that is more concerned with social integration than with the 3 Rs. And the reason for this can be found in all of the government's educational support programs and funding.
Don't point fingers at any particular person, president, or political party. The problem lies with a general public of mostly ignorant voters.
I think you hit the nail on the head. There is a tribe of people for whom "logic and problem solving skills" are not necessary since someone else can do the thinking for them.
It's a brave new world.
It's a new world, but bravery has nothing to do with it. Most of the past three decades has been dominated by an anti-intellectual culture.
Possessing intelligence or exhibiting critical thinking skills is viewed by many as elitism. The good life is now viewed as not having to struggle in any way, including what goes on between the ears.
_Every_ generation of people has been largely ignorant of how the world around them works. There are newspaper articles from 1917 bemoaning how the younger generation is dumb and lazy.
In fact, like Glenn, I would argue that the ignorance of how computers work is a sign of how thoroughly they've been democratized. When they were the province of a small, essentially hobbyist, elite, those people understood their functions. Now that they are general, the general population doesn't really understand them. Shockingly, people use them for what they need and little else.
My observation was not that one generation is more ignorant than another. Nor was it limited to technological knowledge. My point is that people in general are ignorant regardless of generation, but that this ignorance has proliferated because of the failure of our educational system to instill critical thinking skills.
However, I personally do believe today's youth are "dumber" than the previous generation which is "dumber" than the generation preceding it, etc., and ending finally with my generation, which of course is omniscient and brilliant beyond compare!
Strange how the same educational system managed to produce one such brilliant generation but fail miserably both before and after.
If ignorance is a sign of democratization of technology, and ignorance is bliss, then the democratization of technology must be bliss! ;-)
I seriously doubt that ignorance is the result of a democratization of technology. It is rather a failure of public education.
At least in the US our public educational system dates back to the 1800s when public education did not even extend to high school and was intended to enhance social integration. That goal has not changed with respect to the government's support programs and funding of public education.
One can also make a good argument that ignorance is helped along by the failure of television to meet the great goals once attributed to it. Personally, I see the proliferation of reality TV and incredibly dumb comedies and cartoon shows as furthering this spread of ignorance.
"If ignorance is a sign of democratization of technology, and ignorance is bliss, then the democratization of technology must be bliss! ;-)"
Good snark, and a useful way not to deal with the point.
Well, only ONE of the so-called 'three Rs' starts with an actual R, so there's one problem right there. But the idea that teaching reading and writing and maths is 'critical thinking' is just silly. Teaching critical thinking is a task in and of itself. Besides, the way schools are now moving is very 19th-century 'three Rs,' since all funding now is based on students' performance on standardized tests.
+1 on that. Logic and critical thinking is a subject that should be taught at every level. A basic understanding of how to break down statements and assertions logically is more important than rote learning. And teaching for standardized tests goes entirely the other way, leaving students with little benefit from their years in school.
Great thoughts and article.
I'm dismayed by the taunts I get when I admit to reading the manuals for the devices and technologies I use. Most people think it strange to read about technology, and therefore, I'd argue, to understand it properly. More troubling, I see this in my workplace: busy people getting only a cursory grasp of a subject and then making decisions about it.
Maybe this is the key: people are suffering from information overload. Too busy keeping up to spend time pondering and understanding.
I'm not saying the end is nigh, but it is cause for concern.
Be afwaid, be very afwaid.
I don't think it's possible to have "information overload." Intelligent, critical thinking people should be skilled enough to filter what they need for any given situation, to be selective, and to be discriminating.
People who scurry around trying to know everything usually end up knowing very little.
I so agree with this - I don't 'believe' in information overload. Go & sit in a forest & actually pay attention. There is more information to be grasped, gleaned & gathered there than in any area of human artifice.
Information overload is a catch-all excuse for sloppiness & ignorance - ignorance as in IGNORE-ance.
I do, however, think future shock & neophobia deserve some attention.
And, by all means, let's interrogate the education system & determine its limitations.
"Intelligent, critical thinking people", hhmm. I wonder what the ratio of "Intelligent, critical thinking people" to "People who scurry around" is? Back to the Google video...
I have bought several "Missing Manuals" for areas of computing I know little about but would like to use in a small way. I'm just a 79-year-old home user, and I find the trouble with these manuals is that they are so big I can't read the whole thing, and if I try, I get loaded down with way too much information beyond what I will ever use. The cause of this, of course, is the proliferation of "features" in the programs. Now if I need to try to use a new "feature," I use the manual as a reference book and learn just what I need to do what I want to do with the program. I don't need to fill my already bursting brain with extraneous information.
It's not my primary job, but I've done plenty of tech support over the past 30 years. I don't see how these Times Square responses can be considered in any way surprising or anything new. In fact, I would say that the situation has improved quite a bit over the past 15 years. For example, I virtually never speak to anyone anymore who doesn't know how to pull down a menu, but that used to be frequent. Knowing what program you are using, however, is still often a problem. If only I had a nickel for every time I've said "What does it say next to the Apple in the top left?" Users often don't realize they've switched applications when they click on different windows, or have any idea what an application is.
I agree with Adam that common terminology is critical in communication, but users often make up their own terms, like the gentleman who uses the "big E" browser. Again, nothing new, and I even encountered this back in command line days. I don't think this will ever change.
It's interesting that some terms - like menu - have become commonplace, whereas other aspects of using technology, such as knowing you're running a program, haven't. I'm not distressed by newcomers to computers not knowing the domain-specific terms initially, but there is some expectation that they'd learn them over time. In the past, I found that was more common. People wanted to be able to speak the lingua franca. Now it seems as though people don't care if they're using the right terms or not, or at least they don't care until they need help and have trouble getting it.
I'd say that most people didn't know what a differential was even back when they could theoretically fix their cars on their own. Most people just don't care; to them, cars are a tool. The same applies to people's use of technology. As long as they can use their computer to do research and communicate and write letters, why *would* they care how it works?
These people are not lazy or stupid. Some of the smartest, most determined people I know are incapable of setting up a wifi router, updating their operating system, or installing an application. They think they're using "Microsoft" to write letters, and they think their browser is called "Internet". As far as I can tell, this isn't a new phenomenon. This has been the case ever since people wanted to use computers in order to achieve actual real-world tasks.
The issue here isn't that computers are too easy to use and thus don't require a manual. The issue is that computers are still too *hard* to use.
Oh, yay, thanks for this. This whole thread is in terrible danger of simply being a "people [these days] are stupid and/or lazy because they don't know what I know" rant. There is no field of human endeavor in which the schooled are not contemptuous of the unschooled and those who -- and oh, boy, is this ever galling -- don't even care to acknowledge the value of the schooling.
As you so rightly point out, computers are tools. They are, as has been noted, Job Number 2 for most people, who instead of learning the arcana of the computer have actual goals that the computer (usually) makes easier. If it's not easy for grandma to surf the Web, is it grandma's fault, or are the tools still too complex?
PS Any statement involving "now" or "these days" is virtually guaranteed to be a result of the Recency Illusion (q.v.)
I've recently become aware of a related phenomenon when I noticed that it is very rare to find anything on the Web that is challenging on a conceptual level. On the whole, if you can see it, you 'get' it. If you come across content that forces you to think it is usually poor design rather than the content itself that you have to think about. Indeed, so rare is it for anything other than poor design to force a user to think that if users come across something that is genuinely challenging they will often just ASSUME it is poorly designed. The irony is that conceptually challenging content is usually packaged up in PDFs - the book lives on as the 'natural' home for difficult ideas.
User expectations of simplicity have been important to me recently in the ongoing process of designing a prototype carbon visualisation tool: http://carbonquilt.org. We want to include powerful but conceptually challenging techniques, but worry that these will alienate users.
With the Take Control ebooks, we sometimes get this too. Some of the tech topics we cover, like how to back up several computers on a network or how to set up a multiple-base-station Wi-Fi network, are just plain complicated. Sometimes people will tell me that if they have to read 150 pages to complete these tasks, then they aren't going to do it.
I understand where they are coming from - the amount of time I had for remotely optional computer tasks plummeted when my son Tristan was born. Now that he's 10, I can spend time on some more complicated tasks (and of course the fact that I *edit* tech books for my job helps enormously). But I think that different people have different areas of expertise. Someone keeping up with the latest on brain surgery or planning a large wedding won't have the brain space to learn about DNS and NATs, and that's entirely understandable.
This may seem at odds with my earlier post about info overload; however, I do agree that people can only allocate what time they have available according to their own priorities - e.g., brain surgeons.
As a primary (grade) school teacher who put kids in front of a TRS-80 in 1979, I was paying attention to the world & drawing the kids' attention to it. Since then, hardware & software have come, gone, proliferated & versionised. If I'd had to learn the intricacies of even the number of word processors I've used & taught with, that is all I would have done. I wouldn't have been teaching; I would have been endlessly skilling up, reading manuals & doing workshops.
So what to do? I acquire sufficient familiarity with the artifact/process & look for technical support. Often that will come from kids in the class who grab a piece of software, run with it & become the reference centre. If it requires more skilled technical input, I know what to ask, because I have played with the tool.
Nothing has changed.
The first portable timepiece was built in the 1600s, the pocket watch followed in the 1700s, and by the 1800s watches had become sufficiently affordable to be common. Do you think the average watch owner of any of those periods could have explained how the mechanisms inside their watches kept time?
The earliest instance of something commonly used that was not understood technologically or scientifically by its average user was probably fire.
IT is still too complicated.
I would argue for layers of literacy and indeed that is what we find.
The technologically literate are often functionally illiterate, and what right do they have to deride the functionally literate for their technological illiteracy?
The reality (like the car analogy) is that you have different levels and types of knowledge about tools.
I'm not sure who would be technologically literate, but functionally (with respect to technology) illiterate. My point is that it's possible to be functional while still being illiterate, not the other way around.
Good article. I'm tempted to stratify the levels of illiteracy Adam's talking about here more specifically. There are greater and lesser grasps of computer awareness.
I'd also very much like to supplement this article with a large sampling from Apple's support call database, aggregated over time and tracked by illiteracy level. They must keep track of what sorts of help they've been asked to provide over the years, and by now will have likely drawn some conclusions about how those help requests have changed, and why.
In regard to the comment "How did we get here?": we've always been here. Back in the late '80s I was dealing with friends of friends calling with problems. Him: "I can't open this thing I did in Illustrator." Me: "Where did you save the file?" Him: "In Illustrator." Because that's where he opened it from. Also, "There's a white area in the middle of the screen..." (this in the b&w checkerboard Desktop days), and while I was thinking "What's wrong with his monitor?" he continued talking and I realized he meant a *window.*
The majority of people plow ahead and do whatever seems to work, damn the consequences and time wasted. That's why we still see 10 tabs used in a Word doc instead of a tab setting. And since this has existed since printed manuals were common, I'm not sure we can blame the miserable online help approach, either.
I think it just seems worse now because there are so many more people using computers. The same proportion are clueless, but the sheer numbers are way larger.
I'd argue that because there was a steeper learning curve in the past, there were fewer people who used computers but weren't willing to climb the learning curve. Thus, the proportion of clueless people is greater along with the raw number (though it's the raw number that's really the limiting force).
The learning curve can be steep for people, & they also have to scale the unlearning curve, which is even steeper. Only the pure technonovice comes to ICT without an unlearning curve associated with earlier or different forms of whatever it is that presents the current difficulty.
This is all likely to be associated with comfort zones, priorities & ego of course.
Of course, if Word opened a new doc without any tabs, more people would either use the space bar, or learn to make tabs.
What does the described lack of technological literacy mean for the future? Absolutely nothing. Expecting the masses to grasp and fully utilize complex computer technology is a futile conceit.
This article right here meshes quite well with the article "Shop Class as Soulcraft," which has since been expanded to a book. The idea that lacking the ability to understand and fix things yourself makes you dependent upon others, and weakens you is a fascinating one. Especially in this day and age where people spend years getting degrees in things that aren't even serious academic disciplines, much less of any practical use.
Ironically, I just finished opening up my (now out-of-warranty) MBP to see what was wrong with the latch. Turns out it was just stuck. I had no problem following the very simple instructions on iFixit.
Many people wouldn't even have thought to do such a thing. They would have taken it in for repairs. I do that only when it's under warranty. I've done basic auto repair many times on older vehicles. I make it a rule to never depend, on a day-to-day basis, on a machine you don't understand. To do so is begging for trouble.
I think that an entire class of users has become illiterate pure and simple. Anyone who provides help daily in a place like Apple's forums cannot help but be struck by how minimal the reading skills of many users are, often requiring repetition of the same instructions twice or more before they even see all the words addressed to them. Manuals are indeed pointless for them, only video gets through.
But they had better be very short videos. Not much attention span these days!
There's a common refrain amongst media-savvy types that newcomers to the Internet are dumb or clueless or have no attention span. Well, I don't understand the ins and outs of the internal combustion engine, but this doesn't stop me using a car and getting enjoyment from travel. Why should we expect people to grasp the fundamentals of technology or computing so they can use the Web? Saving a file to a particular folder on your computer is a great example of that. Why do we insist that people need to understand their computer's file structure -- or even the concept of 'files', 'folders' and 'saving'? If ordinary people are struggling, perhaps we need to re-examine the geek-designed metaphors we've created rather than question people's competence.
I guess you didn't have time to make this shorter. (But I see that we're limited to 1000 characters!)
> only if Safari or Internet Explorer isn't meeting your needs do you have much impetus to learn about and switch to Firefox. So the better technology works, the less we'll learn about how it works. I can't say that's entirely a bad thing.

That'd be nice, but of course it is people using the worst browser (Internet Explorer 6) who know _the_least_, so no go on that idea.
> troubleshooting will become more difficult without being able to communicate effectively with people who are experts

No, that's the problem, Adam... technocrats make it too complex (because they _enjoy_ complexity). It should be the job of the experts to learn how to communicate with the people, to make it simple to talk about our tools, so we can.
There have been a lot of good ideas about how to teach technology on this comment thread. TidBITS obviously has an audience with well-informed views. Adam, why not put together a "Take Control of Teaching Computing Skills," or a Wiki site where this expertise can be turned into something useful for schools?
I think the problem is not a new one. It's been with us for millennia.
The stupid, and the ignorant (or to be more charitable, the less informed).
But it's not a problem that's getting worse; it's getting better. It's just that more of the stupid and ignorant are getting behind computers than before.
When such people weren't using computers, the population seemed more tech literate than it actually was.
So I would say that this is actually a sign of society gradually getting smarter, but not as fast as one would like it to.
I suspect I am like most visitors here in one respect. I serve as tech support for family and friends. Some are professionals, intelligent and capable of solving complex legal and business problems. Yet I am gobsmacked at how little they know about the computers they use every day.
They meander and stumble for years, teaching themselves bizarre habits and workarounds for problems that they have themselves introduced. They fail to learn the simplest of commands and controls for everyday programs. Their usage habits are a Gordian knot that I cannot unwind. So I remind myself that we are sitting in front of machines that are more powerful than we could have dreamed and get on with it.
To me, computers are a fascinating hobby. For others, they're just big, fancy blenders. The first thing I do is get them to just stop clicking already, and work through the problem step by step. I hope they learn, but they are usually content with a quick fix. I do see this as a cultural metaphor, a sad one.
This article is making me rethink how my school teaches computers and technology. I've been helping set up everything the school uses for computers and classwork. The level of computer literacy among the teaching staff is all over the map, and I don't want my children ending up ignorant of how to find a file on a computer or how to do basic troubleshooting. Right now, it's easier for me just to fix things for them, but that may be a huge disservice that I need to correct. I imagine it's something all geeks deal with on a daily basis: struggling through helping a neophyte and taking 2 or 3 or more times as long as it would take to fix it yourself. Of course, it also ensures our place in the future; if few are willing to learn, it cements our place in the running of everything.
Why should people know all that stuff? The original idea behind the Macintosh was to make a computer that you could plug in, turn on, and use. You shouldn't have to be an engineer to use a computer (at least not a Macintosh).
Well, just how much should one know about one's tools to accomplish a task using them? The extremes are clearly bad: "Why didn't someone tell me that this hammer could smash my fingers? I don't want to know how it works - I just want to put in a nail." :-)
Ah, but what kind of hammer? A claw hammer? A framing hammer? An upholstery hammer? A cross-peen hammer? A ball-peen hammer? A twist hammer? A dog-head hammer?
Should it have a finish head or a waffle-head?
I can also attest from personal experience that this post-literacy is not new. Back in the day, at a Seattle-based ISP where Adam Engst was our hero, the complaint I heard was "I can't get on the Netscape." That's not far off from the contrived dialog in this article.
Also, there are important reasons for some grammatical uses. For example, an internet is any network using TCP/IP. Whereas the Internet is the network of networks using TCP/IP.
I've watched people unable to remember how to use domains even for things like Amazon. For example, typing in "amazon" into a search engine instead of simply typing in "amazon.com" into the address bar of their browser.
It's not post-literate if we've never been literate in the first place. We continue to be a technologically pre-literate culture, for the most part. If it's not the people who drive cars but don't know how to change their oil, it's the people who use the Internet and don't know what a browser is.
I got started with computers in 1978 (I was 31) and have earned my living with them ever since. I was fascinated by this article because I went thru the stage with my older friends who "just couldn't get it," on toward the "computer generation," and now I see their actual computer illiteracy. What's scary to me about all this is the thinness of the gossamer veil. The Internet is -extremely- fragile. One good swift serious attack and we go from millions of merely illiterate in this country to insufficient knowledge to even -survive.- No 'net? Can't communicate to even organize, find food, build community. When we see how non-thinking people are so easily led by corporate interests these days, I don't find it at all far-fetched to have such large and serious fears about our future... seemingly hanging by a few photons and fibre optic cables.
I'm involved with a purebred dog breed - both the breed club and an email list. I cannot tell you how many people have no idea from whom they bought their dog. Now, these dogs cost about $2500 or so - one would think they might remember. Sort of like not knowing if one's computer is a Mac or a PC! Most people just aren't interested in the "details" of just about anything anymore.
Bravo! I've been complaining for years that Apple's 'dumbing down' and gradual conversion of personal computers from a tool to a closed appliance was socially (or intellectually?) irresponsible, and didn't bode well for our society's future, as the first round of programming-oriented personal computers did. This article takes a while to get to its point but essentially feels like the first time I've seen the question addressed in a broad forum.
I don't think Glenn Fleishman's below comment about people always complaining about the next generation applies here. I like today's music, I think newer hip-hop is far superior to '80s & early '90s hip-hop, and even listen to some Emo. I like today's fashions better than previous decades. I love all kinds of new movies and see real advances happening every day. But Apple's increasing pandering to 'simplicity' is dumbing down the users. People will rise or fall to meet expectations. And Apple designs with the expectation that users are stupid.
Oh, I don't know; I'm not talking about taste, and I'm not making a general statement.
If you read articles and books from any point in the past (including Periclean Athens), you find thinkers of the day saying that the young people are less educated, less sophisticated, and less interested in deep thought.
That has nothing to do with taste.
Okay, Adam, here comes a broadside: you mention "After all, if someone doesn't know what browser they use, or even that a browser is a program on their computer, how could I possibly expect them to be interested in buying my company's "Take Control of Safari 4" book... by... Sharon Zardetto?" Similarly, Adam, if someone knows little about proper typography, how would he ever grasp that he should lunge to read/heed Robin Williams' "The Mac is Not a Typewriter." You see, Adam, there are "dashes" other than hyphens; for example, every instance (in your article) that you need to deploy an 'M' dash, you randomly vacillate between using a double hyphen <--> and a space-hyphen-space < - >. Typographically, there is no such thing as a double-hyphen, and there is never a space entered on either side of any one of the three types of dashes (hyphen, N-dash and M-dash). I'm not trying at all to be persnickety or perfectionist, but Adam, you're exhibiting the very phenomenon you're writing about.
You're not being persnickety, but you're certainly confusing consistency and convention with dogma.
A long dash is not a part of speech. It's a convention for indicating a certain kind of pause. We have long used a space followed by a hyphen followed by a space to indicate a long-dash style pause. This originated when TidBITS was a text-only publication, and em dashes weren't an option.
The double-dashes are used after bold subheads as a way to separate the subhead from the following text. Again, it's a convention.
I was trained as a typesetter back in the early 1980s, and as a graphic designer thereafter. I've designed piles of books, and once worked with some of the country's leading typographers (Howard Grala, Greer Allen, Charles Altschul, and Roland Hoover).
This doesn't make me correct, but it does mean I have some basis on which to form this opinion. At some point, we may move to using HTML long dashes.
Your points are well taken, Glenn, altho' I don't recall implying that an M-dash is a part of speech. But I didn't, as you say, confuse consistency and convention with dogma. Au contraire, I was merely pointing out that, just as Adam queried "how could I possibly expect them to be interested in buying... Sharon Zardetto's book..." (because they don't know what they don't know), Adam perhaps hasn't read Robin Williams' book for that very same reason. In the event, my comment was intended as tongue-in-cheek, as in //:=)
If an em dash isn't a part of speech, then it's merely convention that we use a long dash (the width of an M) to represent a certain kind of pause or quasi-parenthetical remark. If it's just a convention, then other conventions, equally self-consistent, should suffice, too, no?
We all know quite a bit about typography at TidBITS, but we've also tried to make choices to ensure that what we produce is readable to the largest audience, even in plain text via email. We might need to tweak our CMS to take advantage of...better typography!
I don't get it: all those special characters initially showed up just fine, but when I refreshed the page they all transmogrified into hex code. Gremlins. Verdammte gremlins.
Drat, you encountered my secret: I use AJAX to transfer comments, and the local display sometimes looks better than the "refreshed" one because of a problem in conversion. I'll add this to the bug list!
Nah. Leave it this way. Makes the annoyances more interesting. Failing that, you might try Bon Ami.
Glenn has pretty much answered your comment, but more to the point, we still deliver some 17,000 copies of TidBITS in plain text to people who have not chosen to switch to our HTML version. Because there's no way to display an em dash (or many other characters) to them reliably, we avoid them entirely.
In the long term, it would be nice to come up with some automatic conversions between special characters and plain-text replacements, but it's not necessarily a simple task, and so it's lower on our list than many other things.
But I assure you, we're fully aware of typographical conventions and use them whenever it's feasible.
One recommendation to all who are concerned.
Read the short book "Technopoly" by Neil Postman.
How come there's no Share link on this article?
Talk about technological literacy!
We used to have a widget that would let people share articles through various means, but we found it was used essentially not at all, despite millions of page views.
We've thought about adding an "email this article" link, but there's a fundamental problem with that: most of these options (at the Wall Street Journal and everywhere else) spoof the email address of the person sending the link. We can do that, but we're not sure we want to.
We created a short URL as our main interface--none of that ugly long query string stuff--and figured that went a long way toward letting people copy and paste a link that they wanted to pass on.
Oddly, you're the first person to ask for such an option in a long, long time. I'm not sure if that means because other people got frustrated, or because no one wants it! We're open to suggestions.
Great article, great thread! And I'd like to point out that EVERY contributor here is a substantially competent writer. This must be the "elite" I keep hearing about!
Two thoughts... First, all humans have particular talents or capacities. In my experience very few are truly tech-minded. They simply don't think that way and it's hard for them to grasp concepts (like operating system, applications, documents) that I find intuitive. At the same time, would it even be possible to design a computer that completely shielded the user from these concepts? How would that work? (Maybe search engines are one way.)
Secondly, due to the point above, there has always been a "class" of techies (mechanics, architects, programmers, electricians...) and everyone else who benefits from their abilities. Businesses have made this possible and profitable. And fortunately some people still like being geeks! Will they eventually become, as in a sci-fi story, the most powerful class in society?
It is not uncommon for me to find that a client has his home page set to Google (a ridiculous thing to do now that all major browsers have built-in Google fields), and furthermore, the toolbar where you can type in a new URL has been suppressed! Such a fellow might fairly ask, "How can I get anywhere else if I can't get to Google?" Some well-meaning relative has shot him in the foot in an attempt to help, forcing him to use the internet the same inefficient way Homer Simpson uses the phone system: no matter who he wants to call, he dials 411.
MaxR, your observation is absolutely on-the-money; I encounter this phenomenon regularly. On the phone I'll give a client a URL to go to; when the interlude becomes unbearable, I'll ask "What's the delay?" and get met with the reply "Google's giving me several hundred (or thousand) results, and I'm trying to choose the right one." Sigh. Furthermore, as you pointed out, his browser's address field has been eradicated, so that his only way to roam Cyberspace is by typing everything into his Google search field. Your Homer Simpson/dial 411 analogy sums up the scenario precisely.
Remember the old days when there were guilds? People were experts in one trade, and this was often handed down to the son. Someone suggested that a feeling of happiness and contentment is created by totally submerging yourself in something you enjoy doing, giving it all you've got, it being a creative expression of your self. Nowadays we have so many different opportunities that people often do not submerge themselves in one trade. Often one needs to learn a million skills. Combined with the amount of input we receive every day from several sources, we have become a generation who switches attention from one thing to the next rapidly without going deeper into the subject. The same goes for computer usage: it just needs to work so we can do what we want and move on to the next thing - results orientated - while in the old days one would spend many hours on building a chair or a house with several ornaments. Now we marvel at such craftsmanship.
We could do with a governing body that monitors computer lessons in schools, such that we are sure people will learn the fundamentals (like what a browser is). I also think it is important, when we learn computer skills, that we start with these basics. Otherwise there is a good chance that people will not be interested in learning the basics, as they do not see the necessity ("I am using it, it is working already").
The major problem is that most individuals now have no understanding of cause and effect, either in nature or in our failed societal system, which is leading us toward non-viability as a species. We have never been able to solve even one of our basic societal problems; they only become ever more endemic and intractable. It leads one to wonder if this is the best a thinking species can expect: to be born into a myriad of societal problems, live a long, dreary life with them unsolved, and die leaving them to our children.
Interestingly and depressingly, the percentage of school leavers who are "functional illiterates," as defined by those who control the school system, is approximately equal to the percentage who do not understand or believe in the problem of climate change. This, in addition to our failed societal system, is also leading Homo sapiens toward non-viability as a species, which is the prelude to extinction.
And all the while, the simple, single cause of and cure for our societal problems lies right before us.
This phenomenon is yet another manifestation of the engineered destruction of the educational system. The ensuing cultural shift has equated "smart" with "liberal" as a negative attribution and seeks instead to elevate ignorance and conservatism as the preferred norm for political, religious, and cultural discourse. "Dumb and Dumber" was supposed to be a comedy movie, not a cultural how-to manual. Well, I suppose those with a clue should have an opportunity to exploit this phenomenon (in prototypical Darwinian survival-of-the-fittest terms), unless the dead weight of the ignorant and apathetic, and the rise of right-wing hate-based activism, destroy society in the meantime. Interestingly, only 6 percent of scientists currently self-identify as Republican... and some 60 percent of Republicans believe the world is under 10,000 years old. Go figure.
I do think you underestimate the damage done to the educational system by left-wing philosophies that proudly replace objective standards with relativism, individual responsibility with a culture of victimhood and entitlement, and actual learning and discipline with an obsession with self-esteem.
Just a polite nudge to all that we have no interest in turning this discussion into a political free-for-all. There are other places for that.
If you are not conversant in any methodology you will never master it and you will never achieve excellence.
If you lack understanding in psychology and sociology you will never understand other people, or yourself.
Words mean something or they mean nothing -- use them mindfully.
I have a problem with the statement, "Today, with everyone using technology, I suspect that business plans and funding proposals all assume a large potential audience, which in turn causes the ideas to be vetted more on their business chances than their technological innovation."
That has always been the case! I've dealt with scores of business plans and, while they are supposed to also be used to guide the growing company, they are fundamentally an explanation of how the company will make money. This is usually done to induce the reader to invest in the enterprise, so that hopefully everyone involved will eventually make money.
Sorry if it offends you, but in business, "technological innovation" is a means to success, not the goal in and of itself.
One more small point: I don't understand your not using em dashes, or your reference to HTML. You must know that em dashes were in use long before HTML came on the scene, and have been standard in MS Word, for example, for a long time.
When looking back, before the Internet, and before computers were commonplace, business plans simply couldn't assume as large an audience as they do now. That's merely my point there.
As far as em dashes go, I can assure you that we know all about them - some of us have set type by hand in the past.
We don't use them because they don't travel reliably in plain text email, which the majority of our email subscribers still rely on.
Before the personal computer, business people relied on specialists. If they needed something typed, there was a secretary. If they needed graphics or typesetting work, they hired experts. If they needed data manipulated, they worked with IT analysts and programmers.
Today, many people do most of these tasks themselves, because hardware and software are widespread and inexpensive. But it's time-consuming to achieve the mastery that all the specialists used to have, as well as asking a lot: experts in graphic design or programming are seldom experts in both. So people have a little knowledge about a lot more than they used to (before WYSIWYG and desktop publishing, who was concerned about fonts?), but not a lot of in-depth knowledge.
Some people learn how to solve problems and enjoy figuring things out, because they've built confidence that they can. But I think most people don't learn problem-solving, and when something goes wrong they feel incapable, so they don't try to learn more.
I'm reminded of something John Dvorak wrote in Unix Review in the early '90s: "People don't read documentation. This is part of a national trend toward stupidity, because people don't read anything!"
As for the car analogy: there are millions of jokes about technologically illiterate dummies taking their cars to mechanics, not knowing a spark plug from a radiator cap.
Aside to grammar cop Curt: you meant "implied", not "inferred". ;-)
I'd like to comment on the car analogy, especially where you say that the analogy suffers when it comes to car repairs being highly specialized.
I think that's a part of the analogy which in fact is very much a case in point, not a difference. Both Mac and Windows try very hard to shut the user out of the system and to make repair work, if something goes wrong, practically impossible. I am a network administrator myself, and when Windows problems turn up, the usual way to troubleshoot them is, in sequence: restart program, restart Windows, reinstall. Administration as such is almost impossible, just like repairing modern cars with a standard set of tools is.
It's the same with Macs, though at least they don't fail nearly as frequently as Windows computers.
So in the end it is not just that computer users are becoming less literate, but the system vendors support that trend by making real computer literacy impossible.
Steven Wright said: "Anywhere is walking distance if you have the time."
I once worked for someone who, when offered (by yours truly) a how-and-why-things-work solution, would claim that he just didn't have the time to learn and instead needed a simple bulleted list of where-to-click-and-what-to-do steps. I always believed that if he would make the time, I'd be able to "teach him to fish" instead of having to "give him a fish" each and every time. Always, that is, until my wife and I had children of our own, and a house, and numerous competing responsibilities. Even now I long for those days when I could learn to use PageMaker by reading the manual cover-to-cover. Now, it's a blessing to get just five minutes to read a paragraph or two, and it's hard to slow down my thoughts to take in even that much.
Too much to do and no time to do it (or even feel like you can).
Back to work! :-)
Brilliantly done, Adam. (So what’s new about that?) You’ve identified the symptoms perfectly, though as others have observed, they may be part of a general dumbing-down.
To support your argument, I remember a time at least 20 years ago when I tried desperately by phone to help a neighbor use her Mac, and terminology was the “Thou Shalt Not Pass” roadblock. This time, though, the problem was not inability (I think) — it was her absolute, adamant, unyielding REFUSAL to “use that computer language. I don’t call them icons, I call them little pictures.” No problem; I could work with that. But it was the snowflake on the tip of the iceberg, because she genuinely refused to use the right name or word for anything, even if she knew what it was. She did not LIKE that lingo, and she was not about to use it, period. It was the only time I’ve ever had to admit defeat, and I was completely unable to help her. I fear I was moderately (but not undeservedly) ungallant in telling her so.
Wow — you’re 47 years my junior? Good lord, you must have been young when we first met! (One of my happy life events.)
Thanks for the kind words, Richard! It's interesting coming back to this article 7 years later and seeing that nothing has really changed. I remain troubled by the increasing lack of a vocabulary for interfaces - it's one thing when someone won't use the right name for some interface control; it's another when there is no agreed-upon name at all.
LOL! It zoomed right past me that the article was done in 2009, because it’s so spot on today! Every word seemed completely current. But at least my oversight means you’re not 47 years younger, but only 38. I feel better now. }:-)
And you’re still a jewel in the Mac community.