
Let’s Stop with the Siri Baiting

Immediately upon the release of the iPhone 4S with Siri, Apple’s speech-driven virtual assistant, people started asking Siri all sorts of questions and posting Siri’s often-hilarious responses around the Web (there are plenty of such sites, with names and URLs that would get our email issue marked as spam).

Now, however, toying with Siri has taken a darker turn, with people reading all sorts of things into Siri’s responses. Most recently, a kerfuffle erupted over Siri’s inability to find an abortion clinic in New York City, while a similar request for Washington, D.C. resulted in directions to anti-abortion centers. (Apple quickly responded to the New York Times, attributing the problem to “kinks in the product” and the fact that Siri is still in beta.)

In a move reminiscent of how Greenpeace harangued Apple for the PR value (see “Greenpeace Hitching Itself to Apple’s Star?,” 2 February 2010), MoveOn.org even sent out email encouraging people to sign a petition asking Apple to modify how Siri works, claiming that Siri “won’t tell you where you can get an abortion or even emergency contraception — instead she’ll promote anti-abortion pregnancy ‘crisis’ centers.” MoveOn went on to say, “When a user asked her why she is anti-abortion, she replied, ‘I just am.’” Oh, please.

Siri is neither a comedienne nor an Apple spokesdroid. Apple has cleverly programmed Siri with a wide variety of chatty responses to give the impression of personality and make people more comfortable speaking to what is essentially a chatterbot. The technique is of course not entirely successful; just like the original algorithmic psychoanalyst ELIZA (created by Joseph Weizenbaum in 1966 at MIT), Siri can’t hope to understand and respond to your every question or comment, and must therefore waffle to avoid disappointing you with flat, robotic answers.
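To make the point concrete, here’s a minimal sketch in Python of how an ELIZA-style chatterbot layer works: match the user’s words against a handful of patterns and return a canned reply, falling back to a noncommittal waffle when nothing matches. The patterns and replies below are invented for illustration; they are not Apple’s actual rules.

```python
import random
import re

# Invented patterns and canned replies, purely for illustration.
CANNED = [
    (re.compile(r"why are you (.+)\?", re.IGNORECASE),
     ["I am what I am.", "Why, indeed?", "I can't answer that."]),
    (re.compile(r"\bhello\b|\bhi\b", re.IGNORECASE),
     ["Hello there."]),
]
WAFFLES = ["I don't know.", "I'm not sure I understand."]

def reply(utterance: str) -> str:
    # Return the first canned reply whose pattern matches; otherwise waffle.
    for pattern, answers in CANNED:
        if pattern.search(utterance):
            return random.choice(answers)
    return random.choice(WAFFLES)

# The same retort comes back no matter what you accuse the bot of.
print(reply("Why are you anti-abortion?"))
print(reply("Why are you against kittens?"))
```

The point of the technique is to keep the conversation feeling natural, not to express an opinion; the reply is chosen by pattern, not by position.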

Siri is most helpful when what you say contains keywords that enable Siri to hand your request off to one of the supported apps or services. Apple provides a list of these in the Siri FAQ. But even then, Siri is limited by the capabilities and information encapsulated in those apps and services. (AI programmer and teacher Jeff Wofford has an interesting blog post speculating on how Siri works, though it’s worth remembering that he wrote it before the iPhone 4S came out.)
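Purely as a sketch, and with trigger words and service names invented for illustration (nobody outside Apple knows the real rules), that keyword routing might look something like this in Python, with anything unrecognized falling through to the canned-reply layer sketched above.

```python
# Hypothetical trigger words mapped to hypothetical services, for illustration only.
SERVICES = {
    "weather": "Weather service",
    "remind": "Reminders app",
    "directions": "Maps app",
    "meeting": "Calendar app",
}

def route(utterance: str) -> str:
    # Hand off to the first service whose trigger word appears in the request.
    words = utterance.lower()
    for trigger, service in SERVICES.items():
        if trigger in words:
            return f"Hand off to {service}"
    return "No supported service matched; fall back to a canned reply"

print(route("What's the weather in Ithaca?"))   # routed to the Weather service
print(route("Why are you against kittens?"))    # falls through to the chatterbot
```

Once the request is handed off, the quality of the answer depends entirely on what the receiving service knows, which is where the next problem comes in.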

So it shouldn’t be a surprise that Siri will fail to provide the desired responses to certain questions. Presumably, whatever Yellow Pages-like database Siri uses currently lacks a category for abortion-related services, an omission that Apple can and should address. In contrast, the YellowPages.com Web site has an “Abortion Services” category. But very few of the organizations appearing in that category use “abortion” in their names. When I asked Siri for directions to the few that did, Siri had no trouble finding them in the Maps app.

In short, Siri is only as good as the underlying databases that Apple baked in. (Luckily, because Siri’s processing happens on Apple’s servers, not on the iPhone, Apple can continue to improve and extend Siri’s capabilities.) When I scanned the list of organizations that YellowPages.com returned for Manhattan under “Abortion Services,” I didn’t see any mention of Planned Parenthood. So I did another search in YellowPages.com for Planned Parenthood around Manhattan, and of the 98 hits, found that they were variously categorized under “STD Testing Centers,” “Family Planning Information Centers,” and “Birth Control Information & Services.” In other words, metadata matters, and if you don’t have good metadata, you don’t get good results.
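Here’s a toy Python illustration of that metadata problem, using made-up directory entries rather than real YellowPages.com data: a search keyed on category labels simply never sees a listing that was filed under a different heading, no matter how relevant it is.

```python
# Made-up directory entries for illustration; not real YellowPages.com data.
DIRECTORY = [
    {"name": "Planned Parenthood - Manhattan",
     "categories": ["Family Planning Information Centers", "STD Testing Centers"]},
    {"name": "Example Abortion Services Clinic",
     "categories": ["Abortion Services"]},
]

def find_by_category(category: str) -> list[str]:
    # Return only the listings whose metadata includes the queried category.
    return [entry["name"] for entry in DIRECTORY
            if category in entry["categories"]]

# Planned Parenthood is invisible to this search despite offering the service,
# because its metadata never mentions the queried category.
print(find_by_category("Abortion Services"))
```

The fix isn’t a change to the search logic at all; it’s better categorization in the underlying data.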

This is actually a serious issue in one respect, since it shows just how important technology has become in shaping our impressions of the world around us. And that in turn points to how essential it is that we continue to scrutinize how well search-related technologies work and remain aware of those technologies’ inescapable limitations. Just as you shouldn’t believe everything you read on the Internet, you shouldn’t believe everything Siri tells you.

Oh, and MoveOn’s snarky report that Siri self-identifies as being against abortion? That’s one of those chatty responses that Siri throws in to seem more human. I asked Siri, “Why are you anti-abortion?” and got back Popeye’s standard retort, “I am what I am.” Of course, I got the same answer when I asked, “Why are you against kittens?” and “Why are you a cannibal?” Similar questions generated a few equally fluffy responses, including:

  • “Why, indeed?”
  • “I can’t answer that.”
  • “I don’t know.”

So can we stop pretending that Siri is anything more than ELIZA’s chatterbot daughter? Siri can be useful, and is a whole lot of fun to demo, but it’s unreasonable to read anything more — certainly not Apple corporate policy — into Siri’s successes, failures, and little asides. Heck, we can’t even get Apple PR to say what Apple policy is most of the time. At least Siri always responds to our questions.
