HEVC and HEIF Will Make Video and Photos More Efficient
If you haven’t already experienced abbreviation overload, Apple has added two more to your plate: HEVC (High Efficiency Video Coding) and HEIF (High Efficiency Image File Format — yes, it’s short one F). These two new formats will be used by iOS 11 and macOS 10.13 High Sierra when Apple releases them later this year.
While you may not have heard of HEVC or HEIF before, both are attempts to solve a set of problems related to video and still images. As people take photos and shoot video at increasingly higher resolutions and better quality, storage and bandwidth start to become limitations. Even in this day of ever-cheaper and ever-faster everything, consuming less storage space and requiring less bandwidth when syncing or streaming still has many benefits.
The Current Landscape — Those of us who have been around the block a few times have seen plenty of image formats come and go. Once, I thought the Kodak-backed FlashPix format might make inroads because of how it created a hierarchy of multiple sizes of images in a single file for faster retrieval. But it was not to be.
Since the 1990s, only the PNG (Portable Network Graphics) format, designed to avoid certain patents then extant, has joined the pantheon of widely supported, well-established image formats, alongside JPEG and GIF. (The long-established TIFF isn’t used on the Web, but it remains important in publishing workflows.)
The main split between image format types is whether they’re lossless or lossy. Lossless formats retain pixel-for-pixel details and tones exactly, at the expense of a larger file size. Compression in lossless formats reduces the storage required for redundant information without discarding detail. Lossy formats rely on algorithms to approximate detail and tones across regions of an image, allowing for typically much smaller file sizes.
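To make the lossless/lossy distinction concrete, here’s a toy sketch in Python, with no relation to any real codec: run-length encoding reproduces a scanline exactly, while a crude quantizer approximates its tones to save space.

```python
# Toy illustration of lossless vs. lossy compression (not any real codec).

def rle_encode(pixels):
    """Lossless run-length encoding: collapse runs of identical values."""
    runs = []
    for p in pixels:
        if runs and runs[-1][0] == p:
            runs[-1][1] += 1
        else:
            runs.append([p, 1])
    return runs

def rle_decode(runs):
    return [p for p, n in runs for _ in range(n)]

def quantize(pixels, step=16):
    """Lossy: round tones to the nearest multiple of `step`."""
    return [round(p / step) * step for p in pixels]

# A mostly flat scanline: 90 sky pixels, a little detail, some shadow.
row = [240] * 90 + [203, 204, 206] + [0] * 7
assert rle_decode(rle_encode(row)) == row  # lossless: exact roundtrip
print(len(row), "pixels stored as", len(rle_encode(row)), "runs")
print(quantize(row) == row)  # False: lossy output only approximates detail
```

The redundant sky compresses to almost nothing without losing a pixel, while the quantizer shrinks the range of values at the cost of fine tonal detail, which is the basic trade both JPEG and HEVC-style coding make.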
None of the popular formats fits all needs. JPEG is lossy, so it’s great for photos but less useful for screenshots, and it doesn’t support an alpha channel for transparency, as GIF and PNG do. GIF is lossless but relies on limited color palettes, making it great for certain kinds of artwork but poor at reproducing photos. (Reduced color palettes mattered more when color displays had shallow color depth and bandwidth was more constrained.) GIF also supports animation, unlike JPEG and the most common implementations of PNG. PNG, which is lossless, works particularly well for screenshots and was designed around the patent encumbrances that once restricted GIF usage. PNG and JPEG are both used on about 74 percent of Web sites, but GIFs still appear on 36 percent of sites. PNG didn’t succeed in killing GIF, but it made huge inroads.
HEIF tries to combine all the best aspects of PNG, JPEG, and GIF, while dramatically improving compression and adding new features like the capability to store bursts of photographs.
Video formats have suffered from a more fraught path, because many were caught up in newer patents that hampered widespread adoption. Image formats had to deal with patents, too, but either they were near the end of their lives when the Web was young, as with LZW compression, or the makers of various tools to create and display images — from Photoshop to Internet Explorer — had already licensed all the necessary bits and pieces.
These video patents complicated things for Web sites. Questions arose as to whether sites would have to pay royalties for every view, free software groups debated the use of encumbered standards, and Flash took off as a cross-platform delivery package for video, because Adobe took care of all the back-end licensing and display issues.
Ultimately, the MP4 family of standards and its H.264 codec (encoder/decoder) won the day, with the vast majority of video available on the Web now using that format. (The group that pooled patents for H.264 said it wouldn’t collect royalties for free Internet-delivered video. Hardware-accelerated encoding and decoding followed. That took the wind out of competitors’ sails.)
What will be the real-world benefits of these new formats? Let’s start with HEVC and move on to HEIF.
HEVC: Encompassing the Future While Shrinking the Past — Video streaming gobbles up over 70 percent of evening Internet traffic, and network-management firm Sandvine estimates it’s on track to hit 80 percent within three years. Thus, streaming video companies, ISPs, and viewers with monthly caps or overage fees have a huge incentive to get more from less, and HEVC is the solution. Apart from Google’s 4K-capable VP9 codec, HEVC is the only reasonable path for most streaming services to affordably and practically feed out 4K Ultra-High Definition (UHD) video to mobile devices. (VP9 is built into Android starting with version 5.0 Lollipop.)
HEVC is another name for the H.265 standard, and it’s being promoted as taking 50 percent less data to produce streams or downloads of the same quality as H.264 when the resolution is 1080p or less. Netflix, which accounts for about half of the aforementioned primetime data usage in the United States, found that it did indeed achieve 50 percent savings. (Google’s VP9 hit the same 50 percent mark.) Netflix suggests 5 Mbps for HD video now, which would consume about 340 GB for 150 hours of content.
When it comes to 4K UHD video, HEVC’s advantage shrinks somewhat, to about 40 percent less data, but that’s still tremendously more efficient. Netflix streams 4K only to televisions, released starting in 2014, that include an earlier version of the HEVC hardware decoder. The streaming company currently advises a constant rate of 25 Mbps or higher to stream its 4K content, which works out to about 1.7 TB for 150 hours of viewing; it would be over 40 Mbps at H.264 compression rates.
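Those storage figures are straightforward arithmetic on Netflix’s suggested bit rates, as a few lines of Python show:

```python
def hours_to_bytes(mbps, hours):
    """Convert a constant bit rate and viewing time into bytes transferred."""
    return mbps * 1_000_000 / 8 * hours * 3600  # 8 bits per byte

hd_gb = hours_to_bytes(5, 150) / 1e9     # Netflix's suggested HD rate
uhd_tb = hours_to_bytes(25, 150) / 1e12  # Netflix's advised 4K rate
print(round(hd_gb), "GB")      # 338 GB, the article's ~340 GB
print(round(uhd_tb, 2), "TB")  # 1.69 TB, the article's ~1.7 TB
```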
How can HEVC achieve such a notable improvement in compression without sacrificing quality? As with the jump from MPEG-2 to H.264, it involves hardware acceleration on the encoding side. With chips that can perform specialized calculations, algorithms that perform more intensive analysis of video to find places to compress become viable. HEVC can require up to 10 times more computation than H.264 to encode at the same bit rate!
But this is asymmetrical. In plain English, HEVC works because it’s relatively cheap to buy super-powerful computers with specialized chips to encode the video in production, but even tiny mobile devices can decode those highly compressed streams or downloads quickly and easily. Producers crunch the files; viewers reap the bandwidth benefit.
Both H.264 and HEVC break down every frame in a video into a series of rectangles (mostly squares) based on the image’s tonal values, with the goal of grouping similar tones for more compression. A frame that has a large area of blue sky and small figures walking across a desert could obviously be compressed better if the blue sky and desert regions were broken out from the people walking across it. HEVC can encode larger areas at once, which results in higher compression for less-differentiated detail.
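As a loose illustration of that idea, here’s a toy quadtree partitioner in Python that splits a grayscale frame into blocks until each block’s tones are nearly uniform. Real HEVC coding-tree units differ considerably in the details; the threshold and sizes here are invented for the sketch.

```python
# Toy quadtree partitioner: recursively split a grayscale block until its
# tones are nearly uniform, loosely analogous to how HEVC groups similar
# regions. Flat areas (sky, sand) end up as a few large blocks; busy areas
# split into many small ones.

def partition(img, x, y, size, threshold=8, min_size=2):
    block = [img[y + dy][x + dx] for dy in range(size) for dx in range(size)]
    if max(block) - min(block) <= threshold or size <= min_size:
        return [(x, y, size)]  # encode this whole region as one unit
    half = size // 2
    blocks = []
    for ox, oy in ((0, 0), (half, 0), (0, half), (half, half)):
        blocks += partition(img, x + ox, y + oy, half, threshold, min_size)
    return blocks

# 8x8 frame: flat "sky" everywhere except a small detailed patch.
img = [[200] * 8 for _ in range(8)]
img[6][6], img[6][7], img[7][6] = 40, 60, 90
print(len(partition(img, 0, 0, 8)))  # prints 7: mostly large flat blocks
```

Larger uniform regions collapse into single blocks, which is the intuition behind HEVC encoding bigger areas at once for less-differentiated detail.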
HEVC is also much better at “predicting” how elements in a frame will change from frame to frame and in which direction those elements will move. The full explanation is eye-glazing, but the summary is that increased compression efficiency both within a frame and between frames lets HEVC gain that extra 40 to 50 percent reduction.
Smaller files and fewer bits-per-second required for streaming are great when you’re Netflix, but why should you care as an individual user or even a company using video from iPhones as part of your workflow? Because every bit saved is a bit you don’t transmit and a bit you don’t store.
For starters, if a video occupies only half the space for the same quality, your iPhone’s precious storage goes twice as far before you have to sync or offload video.
At the average user level, if you’re an iCloud storage subscriber above the free 5 GB tier, crossing 200 GB of media means leaping from $2.99 a month for 200 GB to $9.99 a month for 2 TB; halve your storage and you save that difference. For even relatively modest video production houses storing massive amounts of video, the same logic applies to local SSD or RAID storage and to remote cloud storage, and could result in savings of tens of thousands of dollars per year.
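Put rough numbers on that tier jump; the tier list below is an assumption based on Apple’s published US pricing at the time of writing, so treat it as illustrative:

```python
# Assumed iCloud tiers: (capacity in GB, US dollars per month).
TIERS = [(5, 0.0), (50, 0.99), (200, 2.99), (2000, 9.99)]

def monthly_cost(gb_needed):
    """Price of the smallest tier that fits the library."""
    return next(price for size, price in TIERS if gb_needed <= size)

# A 250 GB library forces the 2 TB tier; halving it fits in 200 GB.
before = monthly_cost(250)
after = monthly_cost(250 / 2)
print(f"${(before - after) * 12:.2f} saved per year")  # $84.00 saved per year
```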
Similarly, if you send or receive video via cellular data, you might be able to drop to a cheaper data plan without being throttled or charged for overages. And for commercial users, being able to transfer less data to cloud storage or stream at lower bit rates could reduce costs significantly. Amazon S3 and Google Cloud may offer cheap storage and transfer, but it still adds up. Half of anything is half as much!
As viewers, we should get better and more consistent quality television and movie streaming on our iOS devices and Macs, as well as on the fourth-generation Apple TV, which is slated to receive HEVC decoding in tvOS 11 (see “What’s Coming in tvOS 11,” 15 June 2017). People with lower broadband throughput rates will potentially use the same amount of data and see a much crisper picture. Those with higher bandwidth connections will consume half as much data for the same results.
The central question about HEVC is how easy or hard it will be to capture, edit, and play back on various devices. Apple hasn’t named compatible devices, but in a developer presentation, it provided a clear rundown of the hardware and software support (go to 22:00 in the video to listen and see).
In short, all Macs and iOS devices that run the upcoming releases will be able to decode HEVC at least in software. But for hardware decoding, you’ll need an iOS device with an A9 or later and a Mac with an Intel Skylake or Kaby Lake processor (6th and 7th generation Intel Core). On the iOS side, that means an iPhone SE, iPhone 6s or later, any iPad Pro, and the fifth-generation iPad. The 2016 MacBook Pro models have Skylake processors, and the 2017 iMac and MacBook Pros sport Kaby Lake chips.
Depending on the size of files and other parameters, HEVC software decoding might be erratic or consume much more battery power than H.264. Smart Web sites may check a device’s vintage, and iOS apps can use new developer queries about supported video formats, and then feed out H.264 if HEVC might suffer from software-decoding hiccups. Of course, that means older hardware that’s technically capable of HEVC might not get the full bandwidth advantage, but owners will probably appreciate H.264’s battery savings, reduced fan noise, and smoother video.
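That fallback logic might look something like the sketch below. The capability flag and file names are invented for illustration; this is not an Apple or Netflix API, just the shape of the decision.

```python
# Hypothetical sketch of the fallback the article describes: serve HEVC only
# when the client reports hardware decoding, otherwise fall back to H.264.
# The capability flag and catalog entries are invented for illustration.

CATALOG = {
    "hevc": "movie-1080p.hevc.mp4",  # roughly half the bits of the H.264 file
    "h264": "movie-1080p.h264.mp4",  # universally decodable fallback
}

def pick_stream(client):
    """client: dict of capability flags, e.g. reported by an app at startup."""
    if client.get("hevc_hardware_decode"):
        return CATALOG["hevc"]
    # Software HEVC decoding may work, but it drains batteries; prefer H.264.
    return CATALOG["h264"]

print(pick_stream({"hevc_hardware_decode": True}))   # movie-1080p.hevc.mp4
print(pick_stream({"hevc_hardware_decode": False}))  # movie-1080p.h264.mp4
```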
If you want to edit and encode with HEVC, you’ll have the same issues as decoding, and it probably won’t be practical without a newer Mac.
(A technical aside for those who care about deep color: both H.264 and HEVC allow for 10-bit color, which provides richer differentiation of tones than 8-bit color (a billion different shades instead of just over 16 million). 4K and 5K iMacs and the Mac Pro support 10-bit color, as do the 2016 and later MacBook Pros, and external monitors on some other 2015 and later Macs. 2016 MacBook Pros with Skylake chips include only 8-bit HEVC hardware encoding; Kaby Lake models handle 10-bit.)
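Those shade counts are just powers of two: each extra bit per channel quadruples the combinations across the three color channels.

```python
# Shades per channel and per pixel at 8-bit vs. 10-bit color depth.
for bits in (8, 10):
    per_channel = 2 ** bits
    per_pixel = per_channel ** 3  # three channels: red, green, blue
    print(f"{bits}-bit: {per_channel} per channel, {per_pixel:,} per pixel")
# 8-bit: 256 per channel, 16,777,216 per pixel
# 10-bit: 1024 per channel, 1,073,741,824 per pixel
```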
Because not every device will display HEVC video, exporting will produce compatible formats for social media and other sharing, as you can do with Photos, iMovie, and other apps today.
With video out of the way, HEIF will seem vastly simpler by comparison.
A Container for Images, Rather Than a Simple File Format — Even though it’s billed as an image format, HEIF is in fact a container that rethinks what an image format needs to do in today’s complex world. An HEIF file will be able to hold text, audio, video, still images, and sequences of frames for bursts and animations, and software will be able to extract and present the relevant information depending on what we’re trying to do.
HEIF is built on an ISO standard — hurray! — developed into a full spec by the Moving Picture Experts Group (MPEG). Apple has based its implementation on a second, more fully realized version that the company said at WWDC will soon be released. Also, Apple says HEIF is pronounced “heef,” rhyming with “beef.” (An Apple developer presentation offers a good amount of detail, if you want the not-too-gory bits.)
Once again, compression is perhaps HEIF’s most significant benefit. To keep things simple, HEIF can use HEVC compression, partly because HEVC compression is more efficient than JPEG’s, but also because Apple can use HEIF to store bursts of images and animations (think Live Photos), both of which benefit from HEVC’s inter-frame compression.
Less obvious benefits include better support for alpha channels, which are used for transparency and masking of images, and for deeper color, something Apple has been pushing into its hardware for a few years. HEIF can also break an image into rectangular regions, so editing and display software can quickly retrieve just the necessary adjacent pieces without loading the entire file. And it can store both an original image and images derived from the original, much as apps like Lightroom store a base image and then record a series of transformations.
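A toy model shows why tiling helps: to display a small crop, software needs to read only the tiles the crop overlaps, not the whole image. The tile size here is invented for illustration, not a value from the HEIF specification.

```python
# Toy model of HEIF-style tiling: given a crop region, compute which
# fixed-size tiles must be read, rather than decoding the entire image.

TILE = 512  # pixels per tile edge (illustrative only)

def tiles_for_region(x, y, w, h):
    """Return (col, row) indices of every tile overlapping the region."""
    return [(c, r)
            for r in range(y // TILE, (y + h - 1) // TILE + 1)
            for c in range(x // TILE, (x + w - 1) // TILE + 1)]

# A 600x400 crop near the origin touches only 4 tiles of a large image.
print(tiles_for_region(300, 300, 600, 400))  # [(0, 0), (1, 0), (0, 1), (1, 1)]
```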
Although HEIF can be used to store bracketed images — photos of the same scene taken in quick succession with different exposures — to let software produce high dynamic range (HDR) output, Apple instead generates HDR images directly in the image signal processor in iOS devices. Third-party software could opt to bypass Apple’s hardware and use HEIF for this purpose. ProCamera, for instance, has its own HDR mode.
But with two-camera iPhones, currently including just the iPhone 7 Plus, Apple will store the depth map that it derives for its Portrait-mode photographs in the HEIF file. The depth map identifies a series of planes at a range of distances from the foreground. This lets Apple separate out figures in the front and aesthetically blur the background to achieve the “bokeh” effect (see “Behind the iPhone 7 Plus’s Portrait Mode,” 24 September 2016). But it can also be used for a host of interesting effects by developers, who will be able to access the depth map in iOS 11 both while the camera is in operation and from stored HEIF images. Apple showed examples like a foreground figure in full color while the background was in black and white. It will also make it easy to composite foreground elements against artificial backdrops.
I won’t reiterate the advantages for storage, since they apply just as much to photos as to video. As someone with dozens of gigabytes of video and hundreds of gigabytes of photos, I’ll likely find more savings from HEIF than HEVC video.
Just like HEVC, HEIF relies on newer hardware for hardware decoding: iOS devices need an A9 or later processor, and for Macs, the same Skylake and Kaby Lake models noted above. All other iOS devices and Macs that can run iOS 11 and High Sierra rely on software decoding.
Because HEIF is a container format, it gives individual implementors like Apple a lot of flexibility about what ends up inside. I hope that doesn’t create compatibility issues when moving HEIF files to other platforms that eventually support it as a native file type. At the moment, HEIF files can be read only by Apple beta software. Conceivably, we’ll see Adobe Photoshop and other software gain support.
Web browsers won’t support HEIF initially, and it’s not inherently suited for the Web because any given HEIF file could include all sorts of excess data. I expect that Apple and others will define kinds of HEIF that will be appropriate for Web usage, such as a substitute for animated GIFs and for better compression than JPEG provides. Web servers can already supply different kinds of image and video types based on browser versions, so HEIF would just extend that capability. But Apple hasn’t said anything along those lines yet.
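If that happens, the mechanics would likely resemble today’s content negotiation, in which a browser’s HTTP Accept header advertises the image types it understands. Here’s a minimal sketch with invented file names; image/heif is the registered MIME type, though browser support for it remains hypothetical.

```python
# Sketch of server-side negotiation: pick HEIF when the client's HTTP
# Accept header advertises it, otherwise fall back to JPEG. File names
# are invented; the Accept-header mechanism itself is standard HTTP.

VARIANTS = {"image/heif": "photo.heif", "image/jpeg": "photo.jpg"}

def negotiate(accept_header):
    # Strip quality parameters like ";q=0.8" and normalize whitespace.
    accepted = [part.split(";")[0].strip() for part in accept_header.split(",")]
    for mime in ("image/heif", "image/jpeg"):  # prefer the smaller format
        if mime in accepted or "*/*" in accepted:
            return VARIANTS[mime]
    return VARIANTS["image/jpeg"]  # safe default for unknown clients

print(negotiate("image/heif,image/jpeg,*/*;q=0.8"))  # photo.heif
print(negotiate("image/jpeg"))                        # photo.jpg
```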
Apple has created developer tools that let apps assess what format an image must be in for sharing or display, and then export and serve it up as needed. iOS, macOS, Apple’s apps, and independent apps will perform a lot of conversions or offer export options, while retaining and passing HEIF along for intra-ecosystem use.
Are HEVC and HEIF Like USB-C for Media? — People still have issues with the USB-C connector used for USB 3.1 and Thunderbolt 3 because it requires adapters, raises compatibility issues among identical connectors, and generates anxiety about what will work with what. (I’m a big fan of USB-C — as evidenced by my owning a 12-inch MacBook and a 2017 iMac — but I understand the complaints.)
HEVC and HEIF shouldn’t suffer from the kind of confusion that plagues USB-C, however, because Apple has built its support around the notion that only devices within Apple’s ecosystem will support the formats natively. Moving outside Apple’s ecosystem will typically — at least initially — require transcoding and export, and those conversions will almost certainly happen without you even realizing it. Apple is encouraging developers to keep this approach in mind, too.
The one sore spot you might hit is if you don’t upgrade all your devices to iOS 11 and High Sierra at the same time, or if you own older hardware that can’t be upgraded. For instance, if you use the same iCloud account with iCloud Photo Library across both new and old devices, I’m not clear on how pre-HEIF/HEVC platforms will deal with those images. Apple hasn’t provided guidance about that yet.
Another question that Apple has yet to answer is how your existing JPEG photos and H.264 video will be treated when you update to High Sierra. Will Photos automatically convert your entire library? What about files outside of Photos?
Nevertheless, the advantages for HEVC and HEIF are clear, and the transition shouldn’t be rocky if you move forward all at once. But hey, keep good backups, just in case.
I have iOS 11 on my iPhone and iPad, but my Mac is still on Sierra. Photos.app in Sierra can already display HEIF and HEVC, but it can't export or edit them.
Hmm. With public beta 1 in iOS 11, when I take a picture, it doesn’t seem to be stored in HEIF? And it's not syncing at the moment to iCloud Photo Library — it shows an Uploading status, but it’s stalled.
One mistake, I believe: Netflix already requires/uses HEVC for its 4K streams. That's in large part why its 4K streams are limited to certain devices.
You’re right — I didn’t see that the many TVs supported had early H.265/HEVC decoders (first gen standard). I’ll update the article!
"Netflix suggest 5 Mbps for HD video now, which would consume about 340 MB watching 150 hours of content."
150 hours is 540,000 seconds. The ⅛ megabit-to-megabyte conversion tells me that 5 Mbps video would be closer to 340 GB for 150 hours.
A typo, fixed!
Thanks! For what it's worth, it's the same deal with Amazon Prime Video. The advantage of the 4K streams on both platforms is highly underrated: you're not just getting higher resolution, you're getting a considerably more efficient codec *and* a higher bandwidth stream.
To that end, it kind of drives me nuts that devices like the iPad Pro, which have native resolutions higher than 1080p and seem to have hardware decoding for HEVC, don't support 4K streaming from either service. Any idea why? Is it a DRM issue? A (false) sense that it's not worth it since the device resolutions don't reach full 4K?
My iPad Pro (new 12.9) also doesn’t seem to be using HEIF Format either with the first public beta installed.
I think this has a large potential impact outside the Apple world too. My “serious” photography is done with a Sony A7rII which creates huge 42 MB RAW files. I shoot RAW + JPEG and mostly like the JPEG right out of the camera. The potential to produce higher quality “lossy” images with no increase in size compared to JPEGs will further reduce the need for RAW files. If Apple can encode these images on the iPhone then Sony, Canon and Nikon should be able to do it on their fancy DSLRs. Supporting HEIF for editing might take more time. I wonder what happens to the Adobe DNG format that Apple supports now and that a lot of camera apps use?
RAW still has the advantage of unadjusted sensor information, though.
Though wouldn't an advantage of in-camera HEIF support be that it could store a RAW and HEVC version in the same file? Like the current RAW+JPEG, but without the mess of two files for each image. Editing software would obviously have to be updated to allow for choosing which version to use.
I wonder how serious your "serious" photography is if you prefer JPEGs right out of the camera. RAW images include a great deal more image data than even the best JPEG can manage. That data can be important when editing images in a professional image processing program like Photoshop, Lightroom and Capture One. In particular RAW includes a much wider range of highlight detail, which enables one to eliminate apparent burn-in in high contrast images. There is also more shadow detail in RAW, though the value of that data can depend on the quality of the camera sensor.
As for the size of your RAW files, won't your camera allow you to select a smaller RAW file size? My old Canon D40 with a mere 12 megapixel sensor can do that. You will lose image detail but not image quality. There's also the question of the computer power necessary to process a 42 megapixel image. Clearly you will want one of the new iMac Pros when they come out.
Then there's the question of why one would use a high-end digital camera only to throw away much of the image detail the camera captures. But that's your business I guess. Of course most people are perfectly happy to let the JPEG processor in their cameras do most of the image editing for them. All they'll ever do with their images is store them anyway, or share them in a low quality format. And there's nothing wrong with any of that. But serious photographers usually require more of their images, which is why RAW was developed in the first place.
I'm not sure that's true. 'Serious' photographers mostly didn't process and develop their own colour film. (Some did, but a very small number.) That's what adjusting every photo from RAW is equivalent to. The reality is that in-camera RAW processors are often excellent, and so it can be only a few photos that require starting from RAW.
Personally, I keep all my RAWs (just like I still have all my negatives), but I pretty much always use the JPEGs, as my camera produces beautiful JPEGs.
A good camera with good software can make a good default JPEG. I prefer editing RAW most of the time, but I also like the ease of a JPEG when I just need to grab it and send it somewhere. (My camera even has clunky Wi-FI software/app combos that let me upload directly to Flickr and Facebook as a JPEG.)
Thanks for all this Glenn!
Please keep track of the questions you posit at the end of the article, i.e., how Apple will treat existing images outside Apple products and so on. This is extremely important for me and will be critical to whether or not I upgrade.
I don't want Apple messing with photos I've taken in the past either on devices they built or others.
Looks like I'll not upgrade either iDevices or Mac until these answers become available, and monitor photo apps I use, like Pixelmator, for compatibility with these new formats.
That makes two of us. I'm gun-shy about changing encodings, since five years ago iPhoto lost a year's worth of scanned slides during an update.
Thanks, Glenn, good article. There does seem to be some limited pain with unsupported hardware for now.
I have been puzzling over the idea that HEVC can encode 1080p 50% more efficiently, but it can only manage 40% for 4K. Can't 4K be seen as 4 1080p screens in a grid? In which case, why the discrepancy? Seems like if anything 4K should be more efficient, as there is more information to analyze and potentially discard.
Some of the HEVC efficiencies are interframe and others intraframe. 4K frames already have okay intraframe compression, and while HEVC is better there, it's not that much better; in other ways, it's much better, hence the difference.
So if you have a 1,000 by 1,000 pixel region that's all white, H.264 might require, let's pretend, 1,000 bytes to represent it, and H.265 might require 600 bytes.
Glenn, I think you underrate the confusion and complications these new formats will create. I suspect the primary reasons for their development are that as computers move from mechanical hard drives with abundant storage space to solid state drives with much more limited or expensive storage, and from on-site backups to backups in the cloud, where bandwidth and space are also at a premium, image size has become an issue once again.
But, like USB-C, the power and flexibility of these new formats won't make things easier for most people. Maybe if you live only in the Apple ecosystem, it won't be too bad. But once again platform and application compatibility will be big issues. If you think uploading thousands of images to iCloud is problematic now, how much more difficult will it be if Apple decides to automatically convert your files? And, if the past is prologue, Apple may not even give you a choice in the matter. It's not as if iCloud Photo Library is easy to manage now. Even if conversion is optional, most people don't know jack about image formats. So they will blunder into this all unknowing.
You mentioned just a few of the issues. How many people do you know who upgrade all their Macs, say in a family situation, at the same time? How many Macs will even support High Sierra? So will one end up with images on their new iMac, say, that cannot even be opened on their old MacBook Pro running Yosemite or, God forfend, Mavericks?
That's not to say that progress can, or should, be held back, but just because you are using Apple's latest and greatest doesn't mean more than a select few of your readers are. Most of us make do with much less. Not that Apple cares overmuch about such things. For them it's usually full speed ahead and the devil take the hindmost.
Now I could be wrong. Apple may make an effort to ease the transition, as they did from PPC to Intel. Then again, they may make it hard, as they did from iPhoto to Photos. Or from Aperture to oblivion. We don't know and Apple's not telling. Of course it's early days yet, relatively speaking, so it's too soon to panic, but it's not too soon to be apprehensive.
I'm testing iOS 11 right now, and HEIF/HEVC are totally behind the scenes. Pictures I take sync to my Sierra Mac just fine, and I can drag those images out and work on them just like any other. The conversion seems to be automatic for the user.
Are you saying that photos you take with iOS 11 in HEIF format sync to Sierra as JPEGs?
I disagree with your premise! While multi-terabyte HDDs are now cheap and SSDs remain expensive, we're at the top of the inevitable downward curve in which HDDs will largely become used by specialized markets where cheapness and redundancy are favored above speed and stability.
The same amount of cloud storage becomes cheaper every year, and some services (including Apple) offer terabytes for relative pennies. Apple's new 2 TB tier for $10 a month is pretty incredible.
What’s driving this efficiency is more the growth of quality: better cameras and higher-resolution video. They’re outpacing storage and HEVC is a good solution for using extra computational power now available to have an asymmetric benefit.
Based on everything Apple has said across multiple WWDC videos and online documentation, they’ve created a good inside/outside approach. They’re assuming nothing can read HEIF/HEVC unless it says it can. The new OS releases are geared around lowest-common denominator.
In practice, I have concerns as you do because of Apple’s past performance.
Sorry, I think your premise is full of holes. What's fast about syncing your data to the cloud as compared to backing it up locally to a HDD? Or even better, to a local SSD drive over Thunderbolt 3? And what's more stable about the cloud? Cloud servers are largely HDD based, just like local storage.
And even if your computer uses an SSD, like your iPhone and iPad, and, ideally, the cloud servers are SSD based as well (however improbable that might be), there's still the bandwidth issue moving data from your computer and other devices to the cloud and back again. That cannot yet compete for speed or reliability with a local HDD or SSD storage solution.
I do not contest that the new file formats will be a more efficient way of storing photos and videos as regards file size. That is obvious. Where the discussion goes off the rails, it seems to me, is when it adds cloud storage to the equation. Sure, the new formats will save space in the cloud. And the syncing of the data will be marginally faster, as you will have to push smaller files (if you don't count conversion overhead). But compared to local storage? Not a chance. Fans of SSDs and the cloud have been dismissing local HDD storage for years. And yet, HDDs remain the most efficient, reliable and cost effective way to store data, whatever the file format. The move from relatively spacious internal hard drives to relatively small internal solid state drives in computers has only made local external storage more important, not less.
And there's one issue you did not mention. Image quality. Conventional JPEG files lose quality every time they are saved. How does HEIF affect image quality? Is it a lossless format, like PNG, TIFF and RAW? Or lossy like JPEG? Since you didn't bring it up, I surmise this is another issue Apple has not yet addressed. Which is to say it's just one more caveat.
It's still the case that if you are concerned about the security of your data you will back up all your files, music, video, photo, etc., locally—and/or with a cloud backup service other than iCloud. Apple's cloud services are still too confusing for most people to use. At the same time they are too bare bones. There are still many operations you cannot perform in iCloud, most importantly relating to organizing your files. iCloud is far from best of breed in cloud services. These new file formats may be more efficient in some ways, but they won't make iCloud any easier to use.
As problematic as it may be, iTunes remains the best place to manage your Apple ecosystem data. And the only way you can use iTunes with your iPhone, iPod and iPad is through a wired connection to your computer. Compared to iTunes, iCloud is immature at best and incompetent at worst.
I'm curious, Glenn, if you store your hundreds of gigabytes of photos in the cloud. Somehow I doubt it. Which is to say, you still rely on spacious hard drives to store most of it. No doubt HEIF will save you space eventually. But converting that much data to a new format is problematic. HEIF will be beneficial going forward given your new hardware. But for your archived data, the value of the time it would take to convert your files would far outweigh the value of any space you might save.
Apropos of the continuing value of HDDs, Apple has now made 1TB Fusion drives the default configuration for most iMacs. From that it's simple to conclude there will be room for both SSDs and HDDs in the future. Of course these new file formats will save space, even on a Fusion drive. Yet, though I have a 1TB fusion drive in my 2013 27" iMac, I back it up using a 3TB drive for Time Machine on one partition and a clone on the other. And my data is stored on yet more external HDDs. My safety deposit box backups are on an even larger external hard drive.
Perhaps the difference in our perspectives has to do with the people we work with. You routinely interact with computer professionals who (mostly) understand the higher mysteries. I work primarily with people for whom computers are a complete mystery, that is, with average users. I have to bring things down to their level, trying not to be too condescending, to the extent I cover them at all. Many of these folks do not even have an iCloud account, if they can even remember their Apple ID. So while I mostly understand your discussion of these new file formats, I won't be passing that information on to my clients. Even when they buy a new computer, they want it to work just like their old computer. Their eyes glaze over if I try to explain the bells and whistles of the new Mac OS on their new computer. When I upgrade the OS on an old computer, I get the same glazed look if I try to tell them what's new.
In sum, we see new technological developments differently because we mix with different kinds of people. That doesn't make either of us wrong; we just have different audiences.
Not sure I fully get your point. Even within Apple's closed ecosystem, saving 40% to 50% of the data reduces both transfer time and storage space. For most storage purposes, hard drives are fast enough to keep up with Internet bandwidth. And server farms intended for fast performance are already all-SSD. Linode, which hosts this site, Take Control, and my personal sites, switched to 100% SSDs a while ago. SSDs are more reliable, operate across a wider temperature range, and have other advantages, so data centers are converting as rapidly as the economics make sense. There are still a lot of hard drives out there, but SSDs and other distributed approaches are often on the intake side when performance matters, and the data is then more slowly spooled to slower storage.
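To put that 40 to 50 percent figure in perspective, here's a back-of-the-envelope sketch in Python. The library size and uplink speed are made-up illustrative numbers, not measurements; only the savings percentage comes from the discussion above.

```python
# Illustration of the 40-50% savings claim for upload time.
# The library size and uplink speed below are hypothetical assumptions.

def transfer_seconds(size_gb: float, mbps: float) -> float:
    """Time to move size_gb gigabytes over a link of mbps megabits/sec."""
    return size_gb * 8 * 1000 / mbps  # GB -> gigabits -> megabits / rate

library_gb = 200.0   # hypothetical photo library size
uplink_mbps = 10.0   # hypothetical home upload speed
savings = 0.45       # midpoint of the claimed 40-50% HEIF/HEVC reduction

before = transfer_seconds(library_gb, uplink_mbps)
after = transfer_seconds(library_gb * (1 - savings), uplink_mbps)

print(f"JPEG/H.264 upload: {before / 3600:.1f} h")
print(f"HEIF/HEVC upload:  {after / 3600:.1f} h")
```

The same arithmetic applies to iCloud sync, restores from cloud backup, and anything else bottlenecked by the connection rather than the disk.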
I have full local and multiple cloud backups—hundreds of gigabytes of images at iCloud Photo Library and with CrashPlan.
Two questions. OK, four questions: First, is your local storage on SSDs or HDDs? Second, how much do you pay for hundreds of gigabytes of storage with iCloud and CrashPlan? You could probably put your entire photo collection on one 8TB hard drive with room to spare, for far less than the cost of one year of remote storage for the same data. Third, will your files be converted to HEIF by iCloud, since your hardware and software are compatible? And fourth, how will that affect your other backups?
We still don't know if HEIF is a lossy or lossless format. That distinction is far from trivial.
OK, so some server farms have converted to SSDs. How about Apple? Google? Amazon? Microsoft? IBM? You know, the big guys who set industry standards. And how about CrashPlan and Dropbox?
In any case, you made my case for me when you pointed out that SSDs are often used on the intake side, where performance matters most, and the data is then spooled to slower storage, i.e., HDDs. It is, and will likely remain, a mixed-media environment. The issue is not an either/or proposition, fan enthusiasm notwithstanding.
Case in point: I think Apple's Fusion Drives are an under-appreciated technology. Microsoft's Surface Studio now has its own version of the Fusion Drive. So this is one Apple development that is catching on.
In the meantime, thank you for explaining what is known about Apple's new image formats. I don't mean to sound ungrateful just because I quibble with some of your points. And clearly you are not responsible for the questions Apple has not yet answered. Like the new file system that has already come to iOS and will soon arrive in macOS with High Sierra, there are important advantages to the new systems. But it's been my experience that the law of unintended consequences can bigfoot changes like these, and those consequences are not always salubrious. You know, like when Apple tried to change its networking protocols in Yosemite and delivered an epic fail.
I suspect the reason HDDs retain a major cost advantage over SSDs has nothing to do with supply and demand. I'm no expert, but my guess is that the answer lies in the cost of production. Given the prevalence of flash storage in hundreds of millions of mass-market devices, the savings that can be wrung from volume manufacturing have long since been realized. The volume of HDD sales is undoubtedly dwarfed by that of SSDs, yet the price per gigabyte of HDDs continues to drop. The chances that SSDs will ever catch up seem slim.
I've had enough issues with file formats over the years to be wary of new ones. Of course Apple is not the only one to trip over their own feet in this regard, but all that means is that the issues are systemic to data processing. And, in my opinion, not to be taken lightly.
It's not how promising a new idea is that matters. It's how well it's implemented that makes all the difference. In other words, the proof is in the pudding.
Since we're talking photos and video — people's most valued personal files — I think you're underestimating how much Apple cares about making this painless for users. The way they will do this is by temporarily doing without some of the file-size benefits of the new formats. Where absolutely necessary, you can bet both old and new formats will be stored or sent down. Most of the time though either the iDevice itself or the Apple servers are going to be able to detect the capability of a consuming app or device and down-convert the content for it. This means you won't necessarily save any space if you, say, email all your photos — Mail isn't going to allow mailing a HEIF, at least without a fight — but I place the chances at about zero that people are going to start seeing some unreadable file format popping up on their Windows machine, old macOS, or Android. It would just be suicidal for Apple to give the impression of hiving off into an "incompatible" silo, even if it is superior tech.
Thankfully, Photos does not convert your existing JPEG/H.264 files to HEIF/HEVC.
And on it goes, as we march along the road of inexorable upgrades and try to keep up with the technocrats and tools that lead the way and mold so much of modernity — leaving us little say in the matter of what’s in our best interests and who’s really running the show. Welcome to the OS world.
I'd like to point out that only Kaby Lake CPUs can decode HEVC in hardware at 4K resolutions (Skylake CPUs can decode HEVC only up to 2K/1080p resolutions). See Intel's slides on the difference between Skylake and Kaby Lake CPUs. It's for this reason that I decided to get a Kaby Lake MacBook Pro and not a Skylake one, because in a few years' time, when 4K video is ubiquitous, having hardware decoding of 4K HEVC will be something you'll probably appreciate (for battery life). Apple has stated that the MacBook Pro's AMD GPU will *not* be used for hardware decoding (for battery life), but Apple doesn't mention the Intel CPU limitations in its presentation, and most articles on the Web don't either. If you're considering a MacBook Pro and 4K HEVC is important to you, I strongly advise getting a Kaby Lake CPU (2017 MacBook Pro) and not a Skylake CPU (2016 MacBook Pro).
I don't understand the technicalities (most of the comments are beyond me), but I do take photographs, and my serious stuff is mostly in RAW, with file sizes of about 35MB. When I got a new iMac last year, I opted for 500GB of solid state (or is it Fusion?) storage, expecting that to be plenty to accommodate my library of 10,000 or so pictures. Wrong. The drive is almost full, which means that I have had to prune my collection (no bad thing) and henceforth be more discerning about how many images to keep from each shoot. For social use, I also have a compact camera, and last week I took a video whose file size is 800MB, which will definitely live only on my external HD (8TB) or on a 1GB USB stick.
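For what it's worth, the numbers in that comment add up. At roughly 35MB per RAW file, a 10,000-image library alone accounts for most of a 500GB drive, before the OS, applications, and everything else. A quick sanity check:

```python
# Rough arithmetic using the figures from the comment above.
raw_mb = 35           # approximate size of one RAW file, in MB
image_count = 10_000  # approximate library size

library_gb = raw_mb * image_count / 1000  # photos alone, in GB
drive_gb = 500

print(f"Photos: ~{library_gb:.0f} GB, leaving ~{drive_gb - library_gb:.0f} GB "
      f"for everything else")
```

So the drive filling up isn't a surprise, and it's also why HEIF's savings matter less here: RAW files wouldn't be converted anyway.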
I agree with an earlier comment. A new file format to reduce image size is a ploy to attract more users/photographs onto iCloud by giving the impression that trusting your stuff to Apple's ecosystem makes more sense than keeping it all to yourself.