Last week we asked you which of the 20 top features you use from Apple’s 2021 operating systems: iOS 15, iPadOS 15, and macOS 12 Monterey. The results are in, and Adam Engst takes a spin through the features, evaluating the vote tallies and commenting on each feature. With an eye toward the future of macOS 13 Ventura and iOS 16, Julio Ojeda-Zapata previews the Continuity Camera webcam feature, which lets you use your iPhone as a Mac webcam. Finally, Josh Centers examines the OWC miniStack STX, an unusual expansion accessory aimed at giving Mac mini owners extra Thunderbolt ports and external storage. Notable Mac app releases this week include Cardhop 2.1, MarsEdit 4.6, Evernote 10.42, SpamSieve 2.9.50, Nitro PDF Pro and PDF Pro Essentials 13.3, and Tinderbox 9.3.
OWC has specialized in creating innovative hardware accessories tailor-made for the Apple community for decades. The company recently sent our colleague Glenn Fleishman one of its new products, the OWC miniStack STX. But Glenn didn’t have any use for it and OWC didn’t want it back, so I wound up with it. It’s as puzzling as it is useful.
The OWC miniStack STX is two solutions in a single box. It’s a three-port Thunderbolt 4 hub that also offers storage expansion for your Mac via both internal SATA and NVMe connections. It’s designed with the Mac mini in mind, built to the same shape and dimensions. The idea is that you put your Mac mini on top of the miniStack STX and connect it with an included Thunderbolt 4 cable to expand the Mac mini’s storage and connectivity.
I don’t have a Mac mini, but I do have a 27-inch iMac that sits too low for my neck’s comfort. I previously propped the iMac up with a book, but the miniStack STX raises the iMac just as well, looks a little more professional, and solves another problem: my ever-growing Photos library. I’ve been struggling to keep enough free space on my iMac’s 1 TB of internal storage as my Photos library has ballooned to over 600 GB. I have several external hard drives kicking around, but they take up too much room on my already-crowded desk. I decided to put the miniStack STX to work.
The miniStack STX costs $279 with no storage, but you can pre-configure it with various SATA and NVMe storage configurations—up to 26 TB altogether. Or you can buy the miniStack STX on its own and add whatever drives you have lying around. The model Glenn sent me came with a 2 TB 7200 RPM hard drive, a configuration that lists for $349.
I was worried that the hard drive would be slow and loud, but I’ve been surprised by how quick and quiet it is. I moved my Photos library over in under an hour, and the only noise it makes is the quiet hum of its cooling fan.
Of course, you don’t have to use a hard drive. You could instead install a SATA SSD or an NVMe M.2 SSD module. When I catch a good sale, I plan on upgrading the miniStack STX’s storage to all flash. The SATA slot accommodates both 2.5-inch and 3.5-inch drives. OWC also includes its SoftRAID utility so you can combine the SATA and NVMe storage into a RAID 1 configuration for redundancy.
I don’t have many Thunderbolt accessories, but I used one port to connect to my LG monitor via USB-C, which worked perfectly (for details on my monitor, see “LG 27UK850-W: An Acceptable 27-inch Display for the Mac,” 18 December 2020). The miniStack STX supports up to two external displays on Intel-based Macs but only one when connected to M1 Macs.
One port must be connected to the Mac, which leaves three open ports on the miniStack STX. Those ports deliver a full 60 watts of power, so you could use one to fast charge an iPhone, or charge an iPad or MacBook Pro.
In theory, Thunderbolt 4’s bandwidth opens all sorts of possibilities for hubs and extra ports. In reality, it would be a mess of cables and dongles, which would defeat the purpose of the miniStack STX’s neat packaging. OWC missed an opportunity here. If the front of the miniStack STX sported a few USB-A ports and a headphone jack, it would be an easier sell. You can never have enough ports, and they’re hard to reach on Apple desktop Macs.
What problems does the miniStack STX solve?
- Fits neatly under a Mac mini or raises an iMac to a higher level
- Expands a desktop Mac’s total storage without increasing its overall desktop footprint
- Adds three additional Thunderbolt 4 ports to any Mac
While OWC explicitly designed the miniStack STX for the Mac mini, it fits well enough underneath my iMac, and it could also serve as a charging and Time Machine dock for a Mac laptop with an external display. (Note that the miniStack STX doesn’t support Thunderbolt 1 or 2, even with an adapter.)
Useful as it can be, the miniStack STX is a bit of a puzzle. It’s built specifically for a particular user: a Mac mini owner who needs extra storage and more Thunderbolt 4 ports. It’s a high-quality product, and I appreciate having it on my desk, but I wonder if OWC could have tweaked its industrial design and capabilities to serve a broader range of users. For instance, I could imagine optional plastic exoskeletons that would make it match the shape of iMac feet or support a Mac laptop, and a few additional ports on the front would be similarly welcome.
As someone with relatives on multiple continents and a day job I will do entirely from home for the foreseeable future, I engage in a lot of video conferencing. It’s often an unsatisfying experience.
This is partly due to the historically subpar quality of computer webcams like the FaceTime HD cameras built into Macs (see “The 2020 MacBook Air’s FaceTime HD Camera Is Still Lousy,” 8 April 2020, and “Why Are Webcams So Lousy?,” 25 January 2021). Apple has finally bumped the webcams in the latest Mac models from the laughably old 720p resolution to 1080p, but they still yield disappointing image quality. The same goes for Apple’s Studio Display external monitor, which has its own built-in camera.
It’s unclear if Apple plans to improve webcam specs in future models, but there’s an alternative that works now—repurposing the iPhone’s optically superb cameras as a Mac webcam. This is not a new idea. Elgato offers EpocCam software to pull this off. Similarly, Reincubate provides the ingenious Camo virtual-camera system (see “Turn Your iPhone into a Powerful Webcam with Camo,” 24 July 2020), but the product’s complexity and cost—beyond a basic free tier—will scare some Mac users away.
Now Apple has its own take, called Continuity Camera (see “Ten “It’s About Time” Features from WWDC 2022,” 6 June 2022). It’s one of the new features in the still-in-beta iOS 16 and macOS 13 Ventura and is a potentially compelling video conferencing enhancement for several reasons: it’s free, easy to set up and use, and provides advanced capabilities without being overwhelming. Familiar iPhone and iPad camera features such as Center Stage, Studio Light, and Portrait Mode are built into Continuity Camera, making it more than just a better camera.
I’ve been testing Continuity Camera for the past few weeks. Much like the new Stage Manager for the Mac (see “First Impressions: Stage Manager on the iPad and Mac,” 18 July 2022), it is a feature I suspect I’ll continue to use long after I have stopped experimenting with it in my tech writer role. Unsurprisingly, Continuity Camera has some rough edges given its pre-release status—one feature called Desk View is far from fully baked—but it looks promising.
Physically Mounting Your iPhone
In order to use Continuity Camera, you first need to figure out how to mount your iPhone so it is in a proper position for video chatting, ideally just above the upper edge of your laptop lid or your standalone display. A Reincubate post surveys a myriad of options.
Apple will soon provide its own solution in the form of mounts it has commissioned from Belkin. The accessories clip to the top edge of a Mac notebook’s lid or a Studio Display and hold a recent-model iPhone using MagSafe, with the handset’s rear camera array aimed at the user.
The mounts are not yet available to the public, but some lucky tech writers have been sent pre-release versions. I’m told I’m also on this list, and while I wait for a delivery, I found something helpful in my tech gear stash: PopSockets’ PopGrip for MagSafe. It’s an oval slab that clamps magnetically to the back of an iPhone and incorporates that classic telescoping two-finger circular grip.
In the Continuity Camera context, the grip grabs—imperfectly but serviceably—the top edge of a MacBook lid or a Studio Display, placing the iPhone in a near-perfect position for video chats. It does not work at all with my beloved 23.7-inch LG UltraFine Display (see “Apple Debuts LG’s All-New 23.7-inch UltraFine Display,” 20 May 2019), though, so I’ll have to figure out something else there.
The good news is you can order your own PopGrip if you’re eager to experiment with Continuity Camera right away.
Setting Up Continuity Camera
You need to do a few things before you can use Continuity Camera:
- Check that you have an iPhone XR or later.
- Install the public betas of Ventura and iOS 16. (iOS 16 runs on the iPhone 8 and later, but you’ll need an iPhone 11 or later to take advantage of advanced features I will get to in a bit.) As always, don’t install betas on any device you need to be reliable.
- Ensure that the Mac and iPhone are using the same Apple ID, and that two-factor authentication is turned on.
- Wi-Fi on your Mac needs to be on. Neither Bluetooth on your Mac nor Bluetooth and Wi-Fi on the iPhone seem to have any bearing on whether Continuity Camera works properly. However, Apple suggests you also have those on, and you probably already do.
- Activate Continuity Camera on your iPhone by going to Settings > General > AirPlay & Handoff and enabling Continuity Camera Webcam. It’s supposed to be on by default.
Using Continuity Camera: The Basics
The rest is super simple. Launch a videoconferencing app on your Mac, and search for the option within the app to choose a camera. The iPhone should be one of the choices there. Pick it, and the iPhone should shift seamlessly into webcam mode. This happens wirelessly—there’s no need to connect the iPhone to the Mac using a Lightning cable as with Camo.
In FaceTime, for instance, click the Video menu, and you’ll see a list of cameras. My iPhone sometimes failed to appear, and FaceTime sometimes locked up when I tried switching to Continuity Camera, but a Mac or iPhone restart tended to resolve the issues, so I’m chalking this up to beta flakiness.
If Automatic Camera Selection is enabled, Mac-iPhone pairing should occur automatically.
In some cases, you’ll be able to designate the iPhone’s microphone as the audio input device separately—FaceTime is one app that offers this flexibility.
Apple has promised broad compatibility with third-party video conferencing services that function in either Mac or Web app form, and it appears to have delivered. Apps I briefly tested with good results include Zoom (app and Web), Google Meet (Web), Skype (app and Web), Microsoft Teams (app and Web), Facebook Messenger (app and Web), and WhatsApp (app and Web). Continuity Camera also worked with WebEx, but the video quality was mediocre in its app and abysmal through its Web interface.
Continuity Camera’s usefulness extends beyond video chats. At least some other video apps with recording functions, such as Photo Booth and QuickTime, also support its video. In contrast, iMovie would not recognize the iPhone as a video-input option—it seems likely that will change in the future.
Video Quality Is Very Good
Just a quick glance at the iPhone webcam’s video quality all but convinced me that it will end up being my default Mac video chat camera once Continuity Camera comes out of beta. It’s great.
I’m using an iPhone 13 mini, which in Continuity Camera defaults to its 12-megapixel Ultra Wide camera with an f/2.4 aperture and a 120° field of view. Your results will vary if you are using an older iPhone model with less-capable camera hardware, but you should find your video quality to be at least on par with—and likely better than—what your Mac or third-party webcam provides.
Here are image comparisons with the FaceTime HD cameras in the 2020 M1 MacBook Air, the recently released M2 MacBook Air, and the Studio Display, which was released earlier this year. Continuity Camera imagery is superior across the board. The M2 MacBook Air improves on the M1 MacBook Air, but not dramatically so. The Studio Display’s imagery is muddled and a bit dark. (Note: my loaner display had the latest software update, which was intended to fix earlier image quality issues, and has done so to a degree, but it clearly has not worked miracles.)
Using Continuity Camera: Advanced Features
Apple, unlike Reincubate, has aimed for simplicity over a robust feature set. That said, Continuity Camera builds in a few bells and whistles that give it flexibility and sophistication. They are accessed by going to Control Center > Video Effects when Continuity Camera is in use. Activate or deactivate these settings in any combination you like.
Center Stage
Center Stage, a form of “auto framing,” keeps you in the picture during a video conference even when you move around (see “Center Stage Keeps You in the Video Chat Frame,” 23 September 2021). This iPad feature is also present on the Studio Display, but it isn’t an iPhone capability yet—at least not when the handset is used on its own.
Center Stage magically appears when the iPhone is in Continuity Camera mode, however, provided it is an iPhone 11 model or later. It’s the best auto-framing implementation I’ve seen, but video quality degrades a bit when Center Stage is enabled.
Portrait Mode
On iPhones as far back as the iPhone 7 Plus from 2016, users have been able to take photos in Portrait mode, which keeps the foreground subject in focus while blurring the background. Portrait mode arrived for video with the iPhone 13 last year in the form of Cinematic mode, which allows for shooting using a shallow depth of field.
Now it’s present in Continuity Camera as an option to toggle background blurriness on and off. In this sense it is playing catch-up—video conferencing services such as Zoom and Google Meet have executed this feature well in recent years.
Studio Light
The new Studio Light feature uses computational photography to illuminate the user’s face while dimming the background. It works beautifully if subtly in Continuity Camera (assuming you have an iPhone 12 or later) and is somewhat like using a ring light, as Apple noted during its WWDC demo of Continuity Camera.
Unfortunately, Continuity Camera offers no other options for adjusting or enhancing image quality, unlike Reincubate’s Camo, which perhaps offers too many such options. There were moments in Continuity Camera when I looked discolored, and I could have used some help tweaking my appearance. Maybe such settings are on the way.
Desk View
It’s not uncommon during videoconferences for participants to deploy two cameras: one that points at their faces and another that aims downward to show off what’s on their physical work surfaces for demo or discussion purposes.
With Desk View (iPhone 11 or later), Apple is attempting to pull this off with just one camera—specifically, that Ultra Wide camera, which has the most expansive view and is therefore well suited for this task. While your face is in one window, a second window displays an eagle’s-eye view of your desk. It’s a cool trick… but tricky to get right.
It’s hard to modify the camera’s downward angle, so your notebook lid or external-display hinge can serve that role—tilt it towards you until the stuff on the desk is framed within the Desk View window. (Your face should stay framed within the other window if you have Portrait mode on.) Unfortunately, the quality of the desk imagery isn’t great.
Desk View works best for showing objects that are flat (think documents, photos, coins, and so on). Objects with any height—like the drinking glass and HomePod in the screenshot below—appear comically distorted.
Desk View, by the way, is not just a menu option in Video Effects, but a Mac app in its own right, though one that’s buried deep within the macOS System folder. Presumably, this architecture is necessary to provide compatibility with older apps that wouldn’t otherwise be able to access the Desk View video stream.
Contemplating Continuity Camera
When Apple first announced Continuity Camera at WWDC, I was a bit perplexed. Why jury-rig such a system when its Studio Display and recent-model Macs have improved webcams? But using Continuity Camera for a few weeks made me a believer.
It provides notably better image quality—it’s not even close—and offers significantly more features. Neither the Studio Display nor the MacBook Air’s FaceTime HD cameras offer Studio Light or Desk View, and the MacBook Air cameras don’t do Center Stage, either.
Apple might have developed Continuity Camera partly as a stopgap while it labors to bring its Mac cameras up to par with the iPhone and iPad. If so, that’s fine with me. I am looking to buy a Mac mini that I’d pair with my LG UltraFine—neither of which has its own camera—so repurposing my iPhone as a webcam via Continuity Camera would be a much better option than any of the janky third-party webcams I’ve tried in recent years (see “Why Are Webcams So Lousy?,” 25 January 2021).
Even if you are fine with the webcam built into your Mac, consider giving Continuity Camera a try once you’ve upgraded to iOS 16 and Ventura. It’s nice to have alternatives, and you might find yourself favoring iPhone webcam-ing on your Mac, as I have.
In the meantime, take a look at Reincubate’s Camo. It’s free in its basic form and loaded with features for those willing to pay up. I was especially excited about its many image-enhancement features—I wish Continuity Camera had even a fraction of those.
Thanks to the 1319 people who voted in last week’s survey attempting to determine which iOS 15, iPadOS 15, and macOS 12 Monterey features are most and least used (see “Survey: Which iOS 15, iPadOS 15, and macOS 12 Monterey Features Do You Actually Use?,” 28 July 2022). The results were fascinating!
Before we get started, let’s acknowledge some caveats. Most importantly, the survey only measures the usage of those who chose to complete it, largely TidBITS readers and those who may have heard about it in our second-degree circles. It would be a mistake to extrapolate the results to the Apple-using population as a whole or even all TidBITS readers.
However, as you’ll see, the respondents may be outliers only in degree, not direction. The most notable finding is that most of them don’t use most of the features listed. Only four features—Live Text, Shortcuts, Hide My Email, and Memories in Photos—received more votes saying they were Occasionally or Frequently used than Never used. (A fifth feature, App Privacy Report, was close, with only 57 more people on the Never side.) It seems probable that those responding to the survey were more likely than the average Apple user to use these features, suggesting that a broader survey would have shown even lower usage levels.
Also, the survey was intentionally simple, recording only the fact that people used a feature frequently, occasionally, or never. It didn’t attempt to discern why that might be, so any conclusions are merely speculative. For instance:
- One could conclude that Apple is putting too much focus on adding features and too little on fixing bugs, improving performance, and polishing existing features. Those in this camp often feel that one precludes the other, which may or may not be true.
- A possible follow-on conclusion from the previous one might be that Apple is creating features that aren’t worthwhile. I think that would be a mistake—my impression from the discussion so far is that while many people may use only one or two of these features, they do appreciate those they use.
- Another conclusion is that both Apple and the tech media need to do a better job of introducing new features to users. A common refrain among the comments was that many people—including many regular TidBITS Talk participants, who are probably even more involved than average TidBITS readers—didn’t even know about many of the features.
Finally, before we get to each feature, my son Tristan ran some statistical analysis on the survey results and found only three notable correlations—that is, if you use Feature A, you’re more likely to use Feature B. They were:
- Hide My Email and iCloud Private Relay were the most correlated (0.48), which makes sense because both are iCloud+ features and both protect your privacy.
- Shared with You and Memories in Photos were also correlated (0.44), and while the story behind that is more tenuous, I can imagine that people who find Shared with You’s collection of shared links helpful also like the way Photos creates collections of photos with Memories.
- Live Text and Visual Lookup were also fairly correlated (0.36), a combination that again seems reasonable given that both expose metadata about images.
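The survey’s actual dataset and Tristan’s exact method aren’t published here, but the kind of pairwise analysis described above can be sketched in a few lines of Python. The responses below are made-up 0/1 “uses this feature” indicators, purely for illustration:

```python
# A hedged sketch of pairwise feature-usage correlation, assuming
# hypothetical survey data. Real responses had three levels
# (Frequently/Occasionally/Never); here we simplify to 1 = uses it.
import numpy as np

# Each row is one respondent; each column is one feature.
responses = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 1, 0, 1],
    [1, 1, 0, 0],
])
features = ["Hide My Email", "iCloud Private Relay",
            "Shared with You", "Memories in Photos"]

# np.corrcoef with rowvar=False treats columns as variables and
# returns the matrix of Pearson correlations between features.
corr = np.corrcoef(responses, rowvar=False)

# Report each distinct feature pair and its correlation.
for i in range(len(features)):
    for j in range(i + 1, len(features)):
        print(f"{features[i]} / {features[j]}: {corr[i, j]:+.2f}")
```

A positive value means respondents who use one feature tend to use the other; with real data, a threshold (say, 0.3) would separate the handful of meaningful pairs from noise.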
On to some commentary on the individual features, in ascending order of their popularity with the TidBITS readers who participated!
SharePlay: 1% Frequently / 6% Occasionally / 93% Never
I’m willing to chalk at least some of the bottom-dragging performance of SharePlay up to the likely age of many of the survey respondents. TidBITS readers trend older, and if you’re anything like me, the entire concept of watching a movie or listening to music simultaneously with others via FaceTime is largely incomprehensible. It might also have been an activity that was more common in the darkest days of the pandemic lockdowns. When I mentioned this to Tristan, however, he attributed it more to people not using FaceTime much. I can’t argue with that—we seldom use FaceTime.
FaceTime Links: 1% Frequently / 10% Occasionally / 89% Never
Tristan may be onto something here since FaceTime links, which let you schedule FaceTime calls instead of surprising people with an out-of-the-blue call, were only slightly more used than SharePlay. And given the utility of being able to schedule calls and share links to them in services like Zoom and Google Meet, there’s no question that this was a welcome addition to FaceTime. However, Apple might have been too late to the party.
Voice search in Safari: 2% Frequently / 13% Occasionally / 85% Never
People, people, people! Voice search in Safari on the iPhone and iPad is a great feature, and if you’re ever doing searches in Safari (and really, who doesn’t?), you should try it. Activate it by tapping the microphone button in the Safari search field and then speaking your search terms. It’s smart enough to detect when you stop and submit the search. I realize I’m becoming more focused on voice input thanks to controlling our home using Siri on our HomePods, but it’s a big win over the onscreen keyboard.
Tags in Notes: 5% Frequently / 14% Occasionally / 81% Never
I’m sure I’m revealing my biases here as well, but as someone who has never found Notes to work the way I think, I wasn’t surprised that Apple’s addition of tagging as another organizational option in Notes wasn’t commonly used.
Full-Screen Menu Bar: 9% Frequently / 16% Occasionally / 75% Never
I’m betting that most people who voted Never on this feature did so because they didn’t realize it existed. If you open System Preferences > Dock & Menu Bar > Dock & Menu Bar and uncheck “Automatically hide and show the menu bar in full screen,” the menu bar will always appear for apps in full-screen mode. For those who regularly work in full-screen apps, this could be a huge win—I hate having to mouse up to the top of the screen to see the menu bar. The only downside of keeping the menu bar visible is that it clutters full-screen video playback and videoconferencing, where you’re not interacting with the app and want all available screen real estate devoted to the video.
AirPlay to Mac: 5% Frequently / 21% Occasionally / 74% Never
The overall lack of popularity of being able to send audio and video to a Mac using AirPlay didn’t surprise me. Not that it’s not welcome—it was a long-standing omission—but because AirPlay is the sort of feature that people use occasionally, as needed, not every day. The Occasionally percentage supports that belief. Couple the relatively infrequent need for AirPlay with the newness of the feature, and I suspect that many of those who would use it either didn’t know it was possible or forgot about it. Make sure to enable AirPlay Receiver in System Preferences > Sharing before trying it. And note the AirPlay to Mac system requirements.
Legacy Contacts: 9% Frequently / 20% Occasionally / 72% Never
Here’s a feature I’d like to see more people using, and it’s something we should cover in more detail. In short, once you designate someone as a Legacy Contact, they’ll be able to access the data in your Apple account after your death. Believe me, that will make life a lot easier for those left behind.
For the Legacy Contacts question, we asked that you answer Frequently if you have designated a Legacy Contact and if you are a Legacy Contact for someone else, and Occasionally if only one of those is true. So the breakdown here isn’t bad, with just under a third of respondents using it to some extent.
Focus: 13% Frequently / 15% Occasionally / 72% Never
In this question, we explicitly tried to differentiate the new Focus feature in iOS 15, iPadOS 15, and macOS 12 Monterey from the long-standing Do Not Disturb, Do Not Disturb While Driving, and Sleep features that Apple rolled into Focus. We certainly hope those older features—particularly the basic Do Not Disturb—get a lot of use, but Focus feels, well, unfocused. It tries to do so much that even many highly technical friends find it confusing (see “Apple’s New Focus Feature May Be Overkill,” 20 January 2022). Apple seems to be doubling down on Focus in iOS 16, iPadOS 16, and macOS 13 Ventura, so we’ll see if the changes make it easier or harder to predict how the iPhone will or will not notify you at any given moment.
Safari Tab Groups: 13% Frequently / 16% Occasionally / 72% Never
When John Gruber asked this question on Twitter, his respondents weren’t enthusiastic either, with 15% responding that they use Safari Tab Groups “A lot,” 19% saying “Sometimes,” and 66% reporting “Never.” Our numbers skew slightly toward even less use, as you can see. Tab groups work for some people but not for many others, as became obvious in the comments when we were negative about the upcoming addition of Shared Tab Groups in “Seven Head-Scratching Features from WWDC 2022” (13 June 2022).
Universal Control: 10% Frequently / 19% Occasionally / 71% Never
Despite Universal Control’s position in the bottom half of the responses, I think this is actually a pretty good showing. Given that the feature was delayed until macOS 12.3 and iPadOS 15.4, no one had a chance to use it until a few months ago (see “Using Universal Control in macOS 12.3 Monterey and iPadOS 15.4,” 24 March 2022). Even after it shipped, using it requires both having multiple devices and wanting to use them simultaneously—a relatively uncommon combination outside of the cognoscenti.
System-wide translation: 8% Frequently / 23% Occasionally / 69% Never
Again, needing to translate text on the Mac from one language to another can’t be something that most people want to do regularly, so the fact that just under a third of respondents say they use it shows solid adoption. Putting system-wide translation just a Control-click away is a big win, and the main trick for many people will be remembering that it’s available when it would be useful.
Visual Lookup: 4% Frequently / 27% Occasionally / 69% Never
Here’s another “when you need it” feature. Visual Lookup (below right) uses machine learning to identify objects in photos like art, flowers, landmarks, pets, and plants (see “Get More Info from Photos in iOS 15 and iPadOS 15,” 8 October 2021). For me, it has mostly been a novelty since I generally know what I’m looking at, but that’s in part because when I want to identify a plant, I use Seek from iNaturalist (below left) instead.
Quick Note: 7% Frequently / 25% Occasionally / 68% Never
It’s hard to complain about Quick Note, which lets you create notes quickly with an Apple Pencil gesture on the iPad or with a hot corner or keyboard shortcut on the Mac. But of course, it’s useful only if you use Notes heavily enough to want to create notes more quickly than ever before.
iCloud Private Relay: 22% Frequently / 13% Occasionally / 65% Never
iCloud Private Relay, which routes all your Safari traffic through two encrypted Web proxy servers to increase your privacy, may be available only to iCloud+ users, but it’s just a switch in System Preferences > Apple ID > iCloud. As such, it’s an easy decision to turn it on for those already paying for extra iCloud storage, and I suspect that accounts for its Frequently votes exceeding the Occasionally votes. iCloud Private Relay is still nominally in beta, so if you have any problems after enabling it, there’s no harm in turning it off (see “Solving Connectivity Problems Caused by Interlocking Apple Privacy Settings,” 20 June 2022).
Shared with You: 8% Frequently / 28% Occasionally / 64% Never
I’ll admit to some surprise that Shared with You ranked as highly as it did. It collects links others send you in Messages and displays them in the Apple apps Music, News, Photos, Podcasts, Safari, and TV. Apparently, others find this more helpful than I do since it was one of those features I knew about but have never had occasion to use.
App Privacy Report: 7% Frequently / 40% Occasionally / 52% Never
I’m heartened that nearly half of the respondents have checked their App Privacy Report, which you find at Settings > Privacy > App Privacy Report on the iPhone and iPad. It reports on which apps are using data and sensor access, app network activity, website network activity, and most contacted domains. Even if you don’t understand all the details, seeing which apps can do what and where your data is going can either be reassuring or help you stop using an app that’s abusing your privacy.
Hide My Email: 18% Frequently / 32% Occasionally / 50% Never
A full half of respondents use Hide My Email, which feels like another privacy win for Apple. Many people wince when asked to provide their email address for yet another account, knowing that they’re signing up for bulk email. Hide My Email won’t prevent you from receiving marketing email, but it does ensure that companies don’t have your real email address, and it enables you to block messages from that particular sender anytime you want (see “Cut Down on Junk Mail with iCloud+’s Hide My Email,” 21 October 2021). Manage your Hide My Email addresses in System Preferences > Apple ID > iCloud > Hide My Email.
Memories in Photos: 13% Frequently / 37% Occasionally / 50% Never
With this question, we chose to encourage respondents to consider the Memories feature of Photos in general because there was no coherent way to tease out just the refinements Apple made in iOS 15, iPadOS 15, and macOS 12 Monterey. As such, it’s not surprising that it ranked highly—it has been around for quite some time. I’ll admit that I’m not a fan of Memories because its notifications often appear at times when I don’t wish to be distracted from what I’m doing. Honestly, though, I’m not the sort who goes back to look at old photos often anyway—perhaps that comes at another stage of life. My ideal way of being exposed to photographic memories from the distant past is through screensaver photos on the Apple TV, which has never worked as well as I’d like.
Shortcuts: 17% Frequently / 35% Occasionally / 48% Never
Although Shortcuts was new only on the Mac in this operating system cycle, we encouraged respondents to consider all platforms when voting. After all, Shortcuts are theoretically cross-platform, and if anything, it seems like the Mac would be an easier platform on which to create and edit shortcuts than iOS or iPadOS. That undoubtedly helped the strong performance of Shortcuts in this survey. Personally, I’ve been quite frustrated by Shortcuts due to the lack of support in the apps I’d like to automate. I tend to resort to Keyboard Maestro for most of my automation needs on the Mac. But if I didn’t have it, I’d probably be working harder to figure out the quirks and workarounds to bend Shortcuts to my will.
Live Text: 18% Frequently / 42% Occasionally / 40% Never
The standout feature in the survey was clearly Live Text, the only one whose Never percentage fell below its Occasionally percentage. People love Live Text, and a number of folks in TidBITS Talk commented that they found it particularly useful in conjunction with system-wide translation: take a photo, select the text, and ask for a translation. (Although, if you’re doing that, you might also try Google Translate, which can instantly translate text that appears in the viewfinder.) Regardless, there’s no question that Live Text is tremendously useful for letting you access text in an image as text instead of just pixels (see “Digitize Any Text with Live Text in iOS 15 and iPadOS 15,” 4 October 2021). If you haven’t used it yet, give it a try.
Phew! Such a lot of data! Nonetheless, the survey results were eye-opening, and we’ll have to think more carefully in the future about how to tease out which features people are most interested in learning. Apple won’t stop innovating, and while there’s no requirement that you use everything the company invents, you should be aware of what’s possible in case it can make your life better in the future.