Tim Cook wants to make “spatial computing” a household term with the debut of Apple’s new Vision Pro headset

With Apple’s highly anticipated Vision Pro headset hitting store shelves Friday, you’ll likely start seeing more people wearing the futuristic goggles that are supposed to usher in the age of “spatial computing.”

It’s an esoteric concept that Apple executives and their marketing gurus are trying to push into the mainstream, while avoiding more widely used terms such as “augmented reality” and “virtual reality” to describe the transformative powers of a product billed as potentially as monumental as the iPhone, released in 2007.

“We can’t wait for people to experience the magic,” Apple CEO Tim Cook said Thursday while discussing the Vision Pro with analysts.

The Vision Pro will also be among Apple’s most expensive products, with a price tag of $3,500 — steep enough that most analysts predict the company will sell no more than 1 million of the devices in its first year. But Apple sold only about 4 million iPhones in that device’s first year on the market and now sells more than 200 million a year, so there is a history of what initially looks like a niche product becoming something woven into how people live and work.

If that happens with the Vision Pro, references to spatial computing could become as ingrained in modern parlance as mobile and personal computing: two previous technological revolutions in whose creation Apple played a vital role.

So, what is spatial computing? It is a way to describe the intersection between the physical world around us and a virtual world fabricated by technology, while allowing humans and machines to harmoniously manipulate objects and spaces. Accomplishing these tasks often incorporates elements of augmented reality, or AR, and artificial intelligence, or AI — two subsets of technology that are helping make spatial computing happen, said Cathy Hackl, a longtime industry consultant who now runs a startup working on apps for Vision Pro.

“This is a crucial moment,” Hackl said. “Spatial computing will enable devices to understand the world in ways they have never been able to before. The interaction between humans and computers will change and eventually every interface, be it a car or a watch, will become a spatial processing device.”

As a testament to the excitement surrounding the Vision Pro, more than 600 newly developed apps will be immediately available for the headset, according to Apple. The range of apps includes a wide selection of television networks, video streaming services (although Netflix and Google’s YouTube are absent from the list), video games and various educational options. On the business side, video conferencing service Zoom and other companies that provide online meeting tools have also created apps for the Vision Pro.

But the Vision Pro could also reveal a more troubling side of the technology if its take on spatial computing proves so compelling that people begin to see the world differently when they are not wearing the headset and come to believe that life is more interesting when viewed through its lenses. That scenario could worsen the screen addiction that has become endemic since the debut of the iPhone and deepen the isolation that digital addiction tends to cultivate.

Apple is far from the only major tech company working on spatial computing products. In recent years, Google has been working on a three-dimensional video conferencing service called “Project Starline” that uses “photorealistic” images and a “magic window” so that two people sitting in different cities feel as if they are in the same room together. But Starline hasn’t been widely released yet. Facebook’s parent company, Meta Platforms, has been selling its Quest headsets for years, devices that could be seen as a platform for spatial computing, although the company has not positioned them that way so far.

Vision Pro, by contrast, is backed by a company with the marketing prowess and customer loyalty that tend to spark trends.

While it could be heralded as a breakthrough if Apple realizes its vision with the Vision Pro, the concept of spatial computing has been around for at least 20 years. In a 132-page research paper published in 2003 at the Massachusetts Institute of Technology, Simon Greenwold argued that an automatically flushing toilet was a primitive form of spatial computing. Greenwold supported his reasoning by pointing out that the toilet “senses the user’s movement to activate the flush” and that “the system’s engagement space is a truly human space.”

The Vision Pro, of course, is far more sophisticated than a toilet. One of its most striking features is a set of high-resolution displays that can play three-dimensional video recordings of events and people, making recorded moments feel as if they are being relived. Apple has already laid the groundwork for selling the Vision Pro by including the ability to record what it calls “spatial video” on the premium iPhone 15 models released in September.

Apple’s headset also responds to the user’s hand gestures and eye movements, in an attempt to make the device feel like a natural extension of the body. While wearing the headset, users can use just their hands to lift and arrange a series of virtual computer screens, similar to a scene featuring Tom Cruise in the 2002 film “Minority Report.”

Spatial computing “is a technology that is starting to adapt to the user instead of requiring the user to adapt to the technology,” Hackl said. “It should all be very natural.”

It remains to be seen how natural it will feel to sit at dinner across from someone wearing a headset instead of intermittently glancing at their smartphone.
