Here's how I think Apple's smart glasses and rOS software will work, and why we should be excited

Apple is building a new operating system to power an augmented reality headset that may be ready for consumers by 2020, Bloomberg reported on Wednesday.
What does that mean? Here’s what I think.
Apple’s headset is expected to employ augmented reality, which means it will take elements from the digital world, such as apps, and place them on top of the real world.
It might work something like Google Glass did: the user could see messages, directions and other prompts while still looking through the lenses at the street and everyday life carrying on in front of them.
Much of Apple’s power comes from its developers and the tools it provides them. Apple might already have a head start (pun intended) on a whole array of apps for the headset if it launches as reported sometime in 2020.
That’s because Apple has already built a framework, ARKit, to let developers create augmented reality apps for the iPhone.
Ikea, for example, has a genuinely fun app that lets you place virtual furniture around your home as a dry run before buying it right from the app. Houzz offers a similar experience. Other ARKit apps include games that put all sorts of creatures and scenarios in your living room.
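For a sense of what developers already have to work with, here is a minimal ARKit sketch using a standard SceneKit setup. The view controller name and the floating cube are my own illustration, not Apple sample code:

```swift
import UIKit
import ARKit
import SceneKit

// A minimal sketch: an ARKit view controller that anchors a virtual cube
// in the room, the same kind of "digital object placed on top of the
// real world" described above. CubeViewController is an illustrative name.
class CubeViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)

        // A 10 cm cube, half a meter in front of the camera's start point.
        let cube = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                            length: 0.1, chamferRadius: 0))
        cube.position = SCNVector3(0, 0, -0.5)
        sceneView.scene.rootNode.addChildNode(cube)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking maps the phone's motion onto the virtual scene,
        // so the cube stays fixed in place as you move around it.
        sceneView.session.run(ARWorldTrackingConfiguration())
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```

ARKit handles the camera feed and motion tracking behind the scenes, and it's easy to imagine the same building blocks carrying over from a phone screen to a pair of lenses.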
Today you see these experiences by peering through your smartphone's screen. Now cut out the iPhone entirely and put them in a pair of glasses: instead of holding up a phone, you'd simply look around and see whatever Apple's new operating system, reportedly called rOS, enables.
A patent filed by a company named Metaio, which Apple acquired in 2015, shows how a user might touch digital objects around them — menus, messages and more — to command the augmented reality experience.
So how would this work?
I imagine you might initiate an AR version of Apple Maps by bringing up an augmented reality menu of applications, or through a voice command to the Siri assistant. You could then tap buttons, presented in front of you “on top of” the real world, to choose where you want to go and how you plan to get there (walking, driving or maybe public transit). Siri could control all of this, though Metaio developed a technology called “Thermal Touch” that might let you select specific items with hand gestures, as if the items were floating in your vision.
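Purely as a thought experiment, here is what that interaction layer might look like to a developer. To be clear, everything below is invented for illustration; Apple has published nothing about rOS, and these type and method names are hypothetical:

```swift
// Hypothetical sketch only: none of these types or callbacks exist.
// It imagines an rOS Maps menu driven by Siri-style voice commands
// and "Thermal Touch"-style taps on buttons floating in your vision.
struct FloatingMapsMenu {
    let destinations = ["Home", "Work", "Airport"]

    // A voice command ("Siri, give me directions") opens the menu.
    func onVoiceCommand(_ phrase: String) {
        if phrase.lowercased().contains("directions") {
            show()
        }
    }

    // An in-air tap on a floating button picks the destination.
    func onAirTap(buttonIndex: Int) {
        print("Routing to \(destinations[buttonIndex]) on foot")
    }

    func show() {
        print("Overlaying destination buttons on the street ahead")
    }
}

// Simulate the flow the article describes.
let menu = FloatingMapsMenu()
menu.onVoiceCommand("Siri, give me directions")
menu.onAirTap(buttonIndex: 2)
```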
Why should Apple enter the space? After all, Google Glass drew plenty of backlash, both for its awkward design and from onlookers wary of the attached camera, which could be turned on at will. Snap's similar camera glasses, Spectacles, introduced last year, have also fared poorly: the company just had to write off almost $40 million in unsold inventory.
But I think Apple has learned from the mistakes of its predecessors, and might not even have to include the ability to take photos or videos — just leave those tasks to a smartphone.
Instead, Apple’s smart glasses could be used as guides: providing directions as you walk around a city, or surfacing more information about the restaurant you’re looking at (Yelp reviews, for example). Or perhaps they could be used in the enterprise, as Google Glass has been, to help engineers assemble parts more accurately.
What if you could pull up notes on Apple’s glasses during a business meeting? Or receive and send emails and messages, likely by voice, without breaking stride? Or what about health? As Apple continues to expand its focus in the area, what if doctors could use rOS-powered glasses to better diagnose patients by comparing a symptom to an entire database of ailments in just seconds? What if I could look down at a rash on my leg and know it was poison ivy and not a reaction to something I ate?
Apple could generate revenue from its glasses in at least two ways: first, from sales of the glasses themselves, and second, from the apps it could sell, developed specifically for the rOS software platform. I imagine wireless carriers would be happy to jump on board, too, since the glasses would need some sort of data connection to deliver all of this real-time information.
Does this signal Apple’s intentions to move entirely away from the iPhone? I don’t think so. The iPhone will likely remain — at least for now — the place where we go to watch movies, play games, and interact with software that isn’t placed on top of the world we live in.
Instead, I foresee Apple’s glasses with rOS as a tool: a whole new kind of tech product that helps us understand more about the world around us and how to interact with things we might not be trained to use. (How do I replace my car’s battery? How do I install a new power outlet in my house? Which gate do I go to for my flight?)
There’s room for this sort of product, and the race is on to get it out there. Microsoft is already pursuing something similar with HoloLens, and patents filed by Facebook and other major tech companies suggest these sorts of products are just over the horizon.
If Apple launches these glasses by 2020, though, it may be the first to deliver on what until now have been either failed attempts or the dreams of science fiction fans.
Source: Tech CNBC