The iPhone 1.1.3 software update, which offers location finding within its Maps application (see "iPhone and iPod touch Become Self-Aware," 2008-01-15), is a harbinger of what we'll see on more and more cell phones - but also a way to measure how "open" a cell platform, phone, and carrier have become. (For more on "open access" and open cell phones, see "Google's View of Our Cell Phone Future Is an Android, Not a GPhone," 2007-11-12.)
It's sometimes hard to come up with concrete examples of how an open network - one that allows any phone, running any software, to access any remote service - will differ from existing service offerings. So let me deconstruct the iPhone and its Maps location feature from the 1.1.3 software release.
How Maps Finds You -- This location feature uses the iPhone's hardware, software, network connection, and two remote services to calculate latitude and longitude with some knowledge of the margin of error. The iPhone doesn't include a GPS receiver, as some cell phones do, but all cell phones have to determine the nearest cell base stations from which they can receive signals, along with each station's signal strength and unique identifier. A cell phone has to know these details at all times to perform seamless handoffs without dropping voice calls or data connections.
The iPhone also has a Wi-Fi transceiver, which means that the phone can snapshot the set of signals, unique identifiers, and other device information for all Wi-Fi networks within its range. (If you use iStumbler on a Mac, you can see pretty much what an iPhone can pick up and measure without connecting to a network.)
In the current implementation, with the Maps program launched, you tap a small icon in the lower left of the screen, and the current cell tower and Wi-Fi information is transmitted. (As I understand it, if you're connected via the EDGE network, both cell and Wi-Fi scans are sent via EDGE; if connected via Wi-Fi, the same info is sent over Wi-Fi. You don't need to be connected to a Wi-Fi network to have Wi-Fi information packaged as part of your location finding.)
Apple's servers, in conjunction with Google for cell tower triangulation and Skyhook Wireless for Wi-Fi locating, produce coordinates that are returned to the Maps application. Maps then displays the appropriate location centered within a circle that indicates the expected degree of accuracy in distance - a larger circle means a less accurate result, on average. (You can play with Wi-Fi-based location finding under Mac OS X or Windows, too, with Skyhook's Loki browser plug-in for Firefox.)
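To make that round trip concrete, here's a minimal Python sketch of the kind of lookup such a service might perform: take a scan of cell-tower and Wi-Fi identifiers with signal strengths, and compute a weighted centroid from a reference database of known transmitter positions. The database entries, identifiers, and weighting scheme are all invented for illustration; the real Google and Skyhook systems are far more sophisticated.

```python
from dataclasses import dataclass

# Hypothetical reference database mapping base-station and access-point
# identifiers to known coordinates. Real services (Google's cell-tower
# database, Skyhook's Wi-Fi database) play this role; these entries are
# made up for the sketch.
KNOWN_POSITIONS = {
    "cell:310-410-1234": (47.6097, -122.3331),
    "wifi:00:11:22:33:44:55": (47.6101, -122.3320),
    "wifi:66:77:88:99:aa:bb": (47.6090, -122.3340),
}

@dataclass
class Observation:
    identifier: str    # tower ID or Wi-Fi access point's unique ID
    signal_dbm: float  # received signal strength

def estimate_position(observations):
    """Weighted centroid: stronger signals pull the estimate harder."""
    total_weight = 0.0
    lat_sum = lon_sum = 0.0
    for obs in observations:
        if obs.identifier not in KNOWN_POSITIONS:
            continue  # unknown transmitter contributes nothing
        lat, lon = KNOWN_POSITIONS[obs.identifier]
        # Turn dBm (e.g. -50 is strong, -90 weak) into a positive weight.
        weight = max(1.0, 100.0 + obs.signal_dbm)
        lat_sum += lat * weight
        lon_sum += lon * weight
        total_weight += weight
    if total_weight == 0:
        return None  # nothing recognized: no fix possible
    return (lat_sum / total_weight, lon_sum / total_weight)

scan = [
    Observation("cell:310-410-1234", -75.0),
    Observation("wifi:00:11:22:33:44:55", -50.0),
    Observation("wifi:66:77:88:99:aa:bb", -80.0),
]
print(estimate_position(scan))
```

The size of the accuracy circle Maps draws corresponds to how tightly clustered, and how well weighted, observations like these turn out to be.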
Now think about what's involved in this dance: hardware (the cell and Wi-Fi radios), operating system and firmware (drivers that talk to the radios and frameworks that communicate with drivers), application software (the Maps application), and a network service (the GPS-like results produced from the Maps application's query).
How could this be more open? Let's look at the pieces.
An Open Location in an Open Cell Phone -- Starting with the hardware, an open cell phone could simply offer a slot or even something more primitive that would allow me to insert or solder on a GPS chip or module. This would let me add a true GPS to a phone that lacks it.
For a phone that includes a GPS radio or that has cell and/or Wi-Fi radios, an open platform would allow me to write software to communicate with it. In a truly open platform, I could even create custom driver software that would allow me to use any radios I chose, or that could offer lower-level access to functions in the phone's built-in radios that the supplied or stock drivers in the operating system didn't expose. (Imagine a GPS that could produce a timestamp to 10 digits of precision, but an included driver that supported only 7 digits.)
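The parenthetical driver scenario can be sketched in a few lines of Python. Everything here is hypothetical: a raw hardware reading, a stock driver that rounds away precision the hardware actually provides, and a custom driver that passes the full reading through.

```python
# Hypothetical raw timestamp from a GPS chip, in seconds. The hardware
# reports more precision than the bundled driver chooses to pass along.
RAW_GPS_TIMESTAMP = 1199145600.123456789

def stock_driver_time():
    # The stock driver rounds the fractional seconds away,
    # discarding precision the hardware delivered.
    return round(RAW_GPS_TIMESTAMP, 4)

def custom_driver_time():
    # A custom driver, with low-level access to the radio,
    # can expose everything the hardware produces.
    return RAW_GPS_TIMESTAMP
```

On a closed platform, you're stuck with whatever `stock_driver_time` gives you; on an open one, nothing stops you from writing `custom_driver_time` yourself.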
Moving up a layer, I should be able to write custom application software on an open platform that would talk to the drivers (custom or stock) and allow me to query any of the hardware on the device. If the phone had a compass and an accelerometer - a device that measures acceleration, or changes in motion - but no GPS on board, I could potentially track a path without other specialized hardware.
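That compass-plus-accelerometer idea is classic dead reckoning, and the core arithmetic is simple. Here's a minimal Python sketch that integrates a sequence of (heading, distance) segments into a path; in a real application the distances would be derived from processed accelerometer data, and any serious implementation would have to fight sensor drift.

```python
import math

def track_path(segments, start=(0.0, 0.0)):
    """Integrate (heading, distance) segments into a 2-D path.

    Heading is compass degrees (0 = north, 90 = east); each distance,
    in meters, would come from step counts or accelerometer data.
    """
    x, y = start
    path = [(x, y)]
    for heading_deg, distance_m in segments:
        theta = math.radians(heading_deg)
        x += distance_m * math.sin(theta)  # east component
        y += distance_m * math.cos(theta)  # north component
        path.append((round(x, 3), round(y, 3)))
    return path

# Walk 10 m north, then 10 m east.
print(track_path([(0, 10.0), (90, 10.0)]))
```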
On the open services side, the software I write could talk to any Internet resource that had location information. Potentially, a market could develop for providing me with information based on my location. I wouldn't have to build the pieces from scratch, but could buy directions from Navteq and Wi-Fi coordinates from Skyhook Wireless. I'd also have the option of going on the cheap, turning to a collaborative system or to competitors of the firms just mentioned.
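Here's a rough Python sketch of what that market could look like from the software side: a common interface, with the client falling back from a paid provider to a free collaborative one. The provider classes, their canned answers, and the fallback policy are all hypothetical.

```python
# Sketch of an open services layer. Names and behavior are invented;
# each provider would really make a network query to its service.
class LocationProvider:
    def locate(self, scan):
        raise NotImplementedError

class CommercialProvider(LocationProvider):
    """Stands in for a paid service, like Skyhook's Wi-Fi database."""
    def __init__(self, subscribed):
        self.subscribed = subscribed
    def locate(self, scan):
        if not self.subscribed:
            return None  # no paid account: decline the query
        return (47.6097, -122.3331)  # canned answer for the sketch

class CollaborativeProvider(LocationProvider):
    """Stands in for a free, community-built database."""
    def locate(self, scan):
        return (47.61, -122.33)  # coarser canned answer

def locate_with_fallback(scan, providers):
    # Try providers in preference order; take the first fix offered.
    for provider in providers:
        fix = provider.locate(scan)
        if fix is not None:
            return fix
    return None

providers = [CommercialProvider(subscribed=False), CollaborativeProvider()]
print(locate_with_fallback(["00:11:22:33:44:55"], providers))
```

The point of openness is that nothing in the phone would dictate which entries go in that provider list.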
Location Just Scratches the Surface -- Now you might not think that location finding is very exciting, and many people don't. But this is just a concrete metaphor - an oxymoron that makes sense here. The bottom-to-top approach works for any kind of hardware (or no hardware at all) coupled with drivers, programs that talk to drivers, and servers that programs talk to.
In a more general example, I could see Kodak introducing a private-label cell phone for which they created the optics and camera innards, leaving the rest to a partner. This hypothetical KodakPhone could have the highest quality portable camera electronics, and pair with Kodak's own photo services, so that as you take pictures, they're post-processed with your preferred settings for cleaning up images, and uploaded to their photo service.
A more creative company might then put together their own, less-corporate phone that coupled a great camera with some interesting internal processing capabilities and an option to upload to any of dozens of online services.
And that's the beauty of it. If we actually get open access, open platforms, and open services, then both of these possibilities - and many others - could come to pass. And the reign of "you can have it in black, silver, or red, but you can't do anything else you want with it" could come to an end.