
Why We Should Care about the Consumer Reports MacBook Pro Rating

Like many people, I scratched my head when Consumer Reports issued a “not recommended” evaluation of the 2016 MacBook Pro on 22 December 2016. It wasn’t that the new laptops are above reproach; rather, it was the huge inconsistency in battery life that CR saw across multiple tests of three different units. As a technology journalist, I knew what I’d do in that circumstance: given such wildly erratic results (battery life ranging from 3.75 hours to 16 hours on the same model), assume the fault was mine and isolate my testing variables one at a time until I found the problem, whether it was mine or Apple’s. I’ve done this many times, including finding significant bugs in the first 802.11n AirPort routers that Apple later fixed.

The apparent reason for CR’s inconsistent test results was revealed on 11 January 2017. After reviewing the test methodology that CR provided, Apple discovered a Safari bug triggered by a setting that becomes available only after enabling the browser’s hidden developer options. Apple has already released a developer beta of macOS Sierra that fixes the bug, and CR says it will retest the laptops with the fix in place. Update: CR performed its tests with the fix in place and now recommends the MacBook Pro.

I found myself aggravated by this situation, though not out of a desire to defend Apple or to denigrate Consumer Reports, despite CR’s checkered history of leading the charge on the non-existent Antennagate issue with the iPhone 4 (see “Apple Responds to iPhone 4 Antenna Issue,” 16 July 2010).

Rather, what worries me is the way CR failed to serve its readers, not how it interacted with Apple. (CR is a subscription publication that also makes some material available to the general public at no cost.) By neither revealing its test methods more fully in its original report nor admitting that it should have done more to rule out its setup as the cause of the results, CR has, I worry, reduced the credibility of all technology reporting and reviewing.

Consumer Reports is more prominent in technology reviewing now than at any previous time because so few remaining publications have the staff and time to perform rigorous testing. I wrote recently for Fast Company about the difficulty of finding safe and reliable USB-C products because of the lack of extensive independent testing. As a result, more people are likely to rely on CR’s recommendations, which could lead to poor buying decisions for two reasons:

  • Users who would have benefited from purchasing newer gear may have put it off unnecessarily because of CR’s report.

  • Because CR found in retesting that the problems were due entirely to this bug, it sent a message to all consumers — not just those who read Consumer Reports — that negative results may just be due to testing errors.

To enable the setting in question, you must open Safari > Preferences > Advanced and select the Show Develop Menu in Menu Bar checkbox. Then, in the Develop menu, you have to choose Disable Caches. CR uses a script to pull a set of 10 Web pages repeatedly over a local network, disabling caching to simulate a user pulling down fresh pages from many sites. Without caching, the test offers a consistency unrelated to network or remote server performance; with caching enabled, Safari would barely use the network or other system resources at all. The bug apparently revolves around icon caching and, in some cases, devastates battery life.
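To make the structure of such a test concrete, here is a minimal sketch in Python of a page-cycling harness of the kind CR describes. This is not CR’s actual script, which drives Safari itself with caching disabled; the server address and page names are hypothetical. The point is only the shape of the loop: a fixed set of 10 pages served over a local network, fetched fresh over and over until the battery runs down.

```python
#!/usr/bin/env python3
"""Sketch of a page-cycling battery test harness (illustrative only).

Not Consumer Reports' real script, which automates Safari; this just
shows the loop structure CR describes. BASE_URL and the page names
are hypothetical stand-ins for a local test Web server.
"""
import time
import urllib.request

# Hypothetical local server hosting the fixed set of 10 test pages.
BASE_URL = "http://192.168.1.10/testpages"
PAGES = [f"{BASE_URL}/page{n:02d}.html" for n in range(1, 11)]

def run_cycle() -> float:
    """Fetch each test page once; return elapsed seconds."""
    start = time.monotonic()
    for url in PAGES:
        # CR's real test disables Safari's caches so every request
        # hits the network; urllib has no cache at all, so each
        # fetch here is likewise a fresh transfer.
        with urllib.request.urlopen(url) as response:
            response.read()
    return time.monotonic() - start

def main() -> None:
    cycle = 0
    while True:  # Run until the laptop's battery gives out.
        cycle += 1
        elapsed = run_cycle()
        print(f"cycle {cycle}: {elapsed:.2f}s")
        time.sleep(1)  # Brief pause between cycles.

if __name__ == "__main__":
    main()
```

Because every iteration does the same fixed amount of network and rendering work, the elapsed battery life is comparable from run to run and machine to machine, which is exactly the consistency CR’s no-caching choice is designed to buy.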

That said, almost no one would ever engage this setting, since caches almost always improve performance. Thus, even though some new MacBook Pro owners are complaining about poor battery life relative to previous laptops and to advertised performance, no regular user will encounter this particular bug. (Apple disabled the display of estimated remaining battery time in macOS following the MacBook Pro release, reportedly because of the estimate’s poor predictive accuracy; see “macOS 10.12.2 Sierra Focuses on New MacBook Pros,” 13 December 2016.)

Consumer Reports did most of the things that are upheld as standards at responsible technology publications: it presented Apple with the testing results and gave Apple an opportunity to respond, but didn’t change its conclusion when Apple couldn’t explain the discrepancy. After publishing the results, CR provided additional detailed information to Apple, which analyzed it and determined where the bug in Safari lay.

Since CR retested and found that battery life is comparable to previous models and competing laptops, it’s reasonable to ask what cost came with the first report.

It may have cost Apple sales. No reporter should worry about whether their honest and well-researched test results might affect a company’s sales, but they should always be concerned with rigor and fairness. (A reporter can have qualms, especially when writing about small firms, but those need to be balanced with the readers’ best interests.)

More concerning is the effect that CR’s report could have on the trust consumers place in Consumer Reports in particular and in technology publications in general. Without a trusted technology press to verify corporate marketing claims, consumers would be left with only inherently biased sources of information. It’s extremely rare for a tech company to actively deceive consumers about the quality of its goods (witness the recent debacle Samsung caused by not immediately owning up to the Galaxy Note 7’s problems), but everyone tries to paint their products in the best possible light. And it’s not hard for that paint job to verge into whitewashing away unpleasant realities.

So yes, I question Consumer Reports’ decision to publish its recommendation without trying harder than it says it did to figure out why its tests produced such wildly inconsistent results. In fact, CR did test briefly with Google Chrome, a process that wasn’t fully documented in the original article. CR obviously couldn’t base its recommendation of the MacBook Pro on the Chrome test, but it could have used those results to help isolate the Safari problem in its test suite.

The moral of the story is that, when the results of one test are so unusual, the correct thing to do is dig more, not introduce confusion into the world. The reason isn’t to save a company’s feelings or sales, but to keep your bond of trust with your readers.

Note: This article appeared a few hours before CR issued its updated recommendation after retesting, which found consistently high battery life.

