I’m a child of the ’80s. The first computer I touched was an IBM PC. The first I owned was an Atari 800XL. The first I truly fell in love with was the Amiga 1000. I’ve programmed and worked with practically every major platform since. New technology rarely surprises me. Yet every year, the first week of June brings a special sense of anticipation as I wait to see what Apple will roll out at WWDC.
Even so, Apple really hasn’t come out with something truly new of any scale in nearly a decade. Apple exec Phil Schiller became an internet meme years ago with his keynote statement, “Can’t innovate anymore, my ass!” The innovation he was talking about was the almost universally mocked “trashcan” Mac Pro. Yes, Apple has had some of the most transformative products of the last 25 years, but it’s hard to remember anything terribly innovative recently. And I say this as a huge fan of Apple and their products.
But the days leading up to the WWDC keynote this year felt a bit different. Something was coming — and even with the product category spoiled months ago, it’s always hard to decipher how Apple will enter a new market.
I explored virtual reality (VR) years ago and built a gaming PC purely to explore an HTC Vive VR system and the potential to develop software for the platform. It was a wonderful ride — and I loved inviting non-tech folks over to strap on the bulky system, with its fish-eye lenses and dangling wires and little box sensors mounted all over the room, so they could walk to the edge of an immersive cliff or swim with whales. Yet, as groundbreaking as it was, it was an imperfect system — and due to its limitations and corresponding frustrations, it found itself in a box after a couple of months.
So in the run-up to WWDC, I must admit I found the building excitement a bit unwarranted. The rumored hybrid device that supplemented VR with AR — augmented reality — was hard to envision.
It was with a not-insignificant amount of indifference that I watched last week’s WWDC keynote, awaiting the well-predicted Apple AR/VR juggernaut. To be clear, I had no doubt that Apple would release a highly polished, high-end product. This is Apple’s schtick — take a relatively new but flawed product category and build the device that finally matches the expectations and interests of the market. Not always innovative by some people’s definition, but usually innovative in perfecting a category.
But that was the problem: I just couldn’t imagine how it could be that exciting of a product category — at least not to me. I put it in the same boat as the Apple Watch — a cool device, but I’m honestly just not that excited about watches and the functionality they provide. I recently gave one a try again, and while it’s a neat device and well done, I barely use any of its functionality beyond keeping time and the occasional run tracking.
The first hour and fifteen minutes of the keynote was predictably uneventful. Apple each year rolls out wonderful new development frameworks and advances for developers, plus a slew of OS features and refinements to keep users happy. This is old hat, and Apple has perfected the yearly release cycle.
Until … One More Thing.
It’s been six years since we’ve had a One More Thing, and that was the iPhone X — a great phone, but certainly nothing groundbreaking (although maybe still better than the One More Thing in 2007: Safari for Windows).
Within minutes of the Vision Pro’s reveal, my traditional pragmatism (leaning precipitously toward cynicism) was gashed. This was something quite different. I couldn’t quite explain why yet, but it felt momentous.
My wife, Julie, is an avid iPhone and iPad user, but what I would lovingly call a “normal” — in contrast to a tech-obsessed geek like myself. As Apple CEO Tim Cook excitedly began to introduce One More Thing, she joined me in my office to hear what Apple had to offer. As the keynote panned to a woman sitting on a couch manipulating an alternate world with her fingertips in impressive demonstration after demonstration, Julie’s first comment was, “We won’t be able to get away with just one of these, will we?”
Well, maybe we can once you hear the price!
Julie is used to my tech obsession and me acquiring significantly more gear than I can ever possibly use. My kids are the general winners here, as my new hardware compulsions quickly become hand-me-downs to the rest of the house. It was clear that this device interested her — she wanted one for herself. As I worried about how it would interface with my Mac, whether the keyboard integration would be sufficient, and other technical minutiae, she was reveling in the potential of this new computing experience. She’s had her own MacBook for years, but in the last decade she has transitioned almost completely to the iPhone as her primary computing device, with occasional use of an iPad. So the possibility that this new “spatial computing” platform could be another step on the same trajectory intrigued her.
In contrast to Julie’s enthusiastic reaction, a number of responses I’ve seen on the internet to the unveiling of the Vision Pro have been that this device is dead on arrival — too expensive, too nerdy, too ugly. I have to wonder if some folks have missed something. Does this sound familiar? “There’s no chance that the iPhone is going to get any significant market share. No chance.” Steve Ballmer (then Microsoft’s CEO) was convinced at the release of the iPhone that the best Apple could do would be 2% to 3% market share. It was way too expensive, he said. Then there was this Engadget writer: “Apparently none of you guys realize how bad of an idea a touch-screen is on a phone. I foresee some pretty obvious and pretty major problems here. I’ll be keeping my Samsung A707, thanks.” The UX and interaction model were too complex — too different. No one needed a paradigm shift.
No one should be writing the Vision Pro’s epitaph — no one knows how the market will respond to a new category. (For the same reason, no one should be proclaiming it as the next iPhone.) But many seem to be seeing the Apple Vision Pro and visionOS with blinders on (see what I did there?). Are we seeing this new product for what it is: an entirely new way to interact with computing? A complete paradigm shift in how we interact with technology?
The iPhone changed the world. A powerful computing device that fits in your pocket! There is a whole generation, many of whom rarely or never touch a traditional computer. That’s a paradigm shift in the computing experience. The iPad didn’t quite make the same impact, but it got many thinking about different potential computing experiences. It feels like many Vision Pro detractors are near-sighted (ahem) and can’t see the potential beyond the use cases that have already been explored by existing technologies.
But it’s clear that the Vision Pro is like nothing that has come before it. I’ve used enough VR to know that the experience shown and described by the press who had first access is nothing like any experience I have ever had. For example, check out what John Gruber at Daring Fireball had to say based on his 30-minute experience with the Vision Pro.
This is not just a generational improvement; this is a new way of seeing and implementing the technology. Combining VR with AR, so that the user can interact with both their actual environment and a virtual one, in a sophisticated hardware package that is light years from the current state of the art is game-changing. It opens the door not only to mastering past use cases that were not always successful (from gaming to insane content consumption) but also to creating new paradigms of working with technology. Apple is calling this “spatial computing,” because it’s absolutely a new archetype for how we can do the things we have always done with laptops and desktops — and, for many use cases, with our iPads and iPhones as well.
Amazingly, you could potentially do all the things you’ve done before without owning those other devices.
Tim Cook said this on ABC News “Good Morning America”: “The real thing, of course, that it does is enable you to see, hear, and interact with digital content right in your physical spaces, as if it’s there. That’s spatial computing, and it is a big idea.” I don’t think Tim is wrong.
So I don’t question the success of the Vision Pro. My only question is the scope of its success.
Is this another device that finds some very specific use cases and contexts that are perfectly tailored for it, like an iPad? I could easily see the Vision Pro becoming established in education, the medical field, or many other very specific use cases.
Or is this another iPhone, creating a significant paradigm shift in computing by replacing the traditional computer for a decent segment of the population? Like my wife, who uses her iPhone for things I would rather walk across hot coals than do on anything but my computer and its big display.
Or is it something even greater? Something that nearly replaces multiple computing devices for nearly all users? During the WWDC demo, I kept looking at the empty desk that one of Apple’s demonstrators stood in front of. I think the only thing on that desk was a plant: no computer, no wires, no monitors. What if this model of computing is so compelling that no one feels the need to have anything else? Does this become the sole computing device on my desk, all I need for a new world of productivity and immersion?
(Well, I hope folks still have their iPhones, because I don’t want to see people walking down the street in goggles — but I’ll get to that momentarily.)
The breadth of potential use cases is staggering. Kyle Richter, MartianCraft’s CEO, came up with a list shortly after the WWDC keynote that had my head spinning:
- Housing contractors could use an AR headset to see through walls in a building project to visualize pipes, electrical systems, studs, etc.
- Doctors could use AR to overlay X-rays of broken bones on patients to better visualize their injuries.
- Oil field workers could use AR to pull up information about various machines on an oil site and use VR to visualize steering a drill head deep underground.
- Interior designers could walk around with clients in AR to show them their design plans, including wall colors and furniture.
- Real estate agents could give a virtual tour of an apartment building to people moving to the area from far away.
- Kids playing paintball outside using AR could then run into the house and continue playing without damaging anything.
- A professional chef or home cook could look at steaks on a grill and see their exact temperature and doneness or glance over to a pot and see exactly how long until it begins to boil. For the restaurant chef, customers’ dietary restrictions could show up while they prepare plates.
- Airport ground staff could use AR to see real-time stats on aircraft, schedule details, and runway status and to visualize baggage connections.
- Farmers could use AR to assess soil quality and crop health or for navigational help with drones monitoring large fields.
- Bartenders could use AR to see customers’ names and monitor their bills, last drink orders, favorites, and other preferences. They could also access recipes and measuring guides while mixing drinks.
- Firefighters could use AR to see through smoke or visualize the layout of a building before entering.
- Retailers could provide virtual fitting rooms where customers could see how clothes, glasses, or jewelry would look on them by looking at themselves in a mirror.
- Mechanics in training could work on a car engine and visualize all of its parts and information about each item.
- Boat captains could use AR in the wheelhouse to visualize ocean depth, channels, and other ships and access navigational aids to help see through fog.
All of this might seem a bit far-fetched at a $3,500 price point. Still, that’s what Ballmer said about the iPhone. And people said the same thing about the original IBM Personal Computer (base model priced at $5,200 in today’s dollars) and the original Mac, which came out in 1984 for $2,495 — or $7,300 in today’s dollars. Each changed the way computing would occur in the future. As with those earlier devices, the Vision Pro’s price will come down — new models will be released that are even more powerful with ever-decreasing component costs. But there are other potential obstacles.
As Julie and I watched the WWDC keynote, my teenage son walked by the office. He paused and stared at the screen to see what had us so agog, then tossed out, “Wow, we need one of those.” As he returned to wherever a teenager goes on summer break, I heard a trailing, “Can we rewatch Ready Player One tonight?”
The Ernest Cline book (and subsequent Spielberg movie) about a future virtual world is a tribute to my favorite time period — the 1980s — and I’m old enough that it triggers all the nostalgic feels. But it also foreshadows an uneasy dystopian future of isolation and escapism through a virtual world that masks the horrors of a ravaged physical world and further divides humanity.
I think the biggest problem potentially facing the Apple Vision Pro as a mainstream device is not technology or even the lack of a yet-to-be-created “killer app.” It is the perception that the device will isolate people and further separate us from physical contact. Apple has built a computing device — a hunk of metal and glass and plastic — that you slap on your face, physically cutting off your eyes from the world. Yet, paradoxically, they are attempting to market it as bringing “connection.”
Tim Cook knows this is a concern — he knows how this could be perceived. And it’s clear that Apple is trying to guard against it from the start. As Cook notes, “It’s a major point that was a design point of ours from the start. This is not about isolation — this is about connection. This is about having people there that feel like they’re there with you.” It’s telling that we saw no pictures of anyone wearing the device except for the perfectly crafted shots in the demos. Anyone who follows tech and VR will remember the dystopian photo of hundreds of journalists and technologists at a 2016 Samsung keynote wearing headsets while a smiling Mark Zuckerberg walked by.
Some have compared the Apple Vision Pro to Google Glass. Apple is clearly positioning this product (at least today) as one you wear in specific environs. Apple doesn’t see you walking down the street with these, or generally interacting with the world in public spaces with them, even if you could. They were careful to show places where the goggles would be more likely to be normalized. On a passenger riding the train to work? Maybe. On an airplane? Certainly. In my home or office? Absolutely.
I would argue that the concern about isolation, which many have already begun to raise, seems somewhat misplaced. Many of the computing experiences we engage in daily are already isolating. When I sit at my desk in front of my monitor, I’m isolated. I’m focused on my screens — and honestly, when I’m working, I don’t really want to be interrupted anyway. For years, technology workers have railed against cube farms and companies that deprive their workers of private offices. When I watch a movie at a theater, that’s an isolating experience as well.
I hate to say it, but if we were really concerned about isolation, we would get rid of our phones. These insidious devices come along with us to dinner, on vacation, and everywhere else, and they can easily be pulled out to capture our attention while we’re in the middle of a conversation. I really can’t imagine going to dinner with a friend, taking out an Apple Vision Pro, and throwing it on. I’m not suggesting that no one will do it — I mean, we’ve all seen the picture of the guy carrying his 27-inch iMac into a Starbucks. But it’s not going to be the norm.
What amazes me is that Apple has found ways to help facilitate connection even while wearing a device that covers half the user’s face. Remember, the Vision Pro combines VR with AR — augmented reality. The user can still see and interact with their actual surroundings. If you’re wearing a Vision Pro and a person walks up to you, you see a faint impression of them in your view. With the twist of a dial, you can change the level of immersion and the level of connectivity you have to the room and those around you to see them more clearly. And with the front-facing screen, the person with you can see your facial expressions and make a facsimile of eye contact.
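For the developers in the audience, that same spectrum from seeing your actual room to being fully somewhere else shows up directly in the visionOS tooling Apple previewed this week. Based on my first pass through the SwiftUI sessions (so treat the details as tentative rather than gospel), a minimal sketch might look like the following, where the app and space names are just placeholders:

```swift
import SwiftUI

@main
struct ImmersionSketchApp: App {
    // Start in .mixed so virtual content shares the room with passthrough video.
    @State private var style: any ImmersionStyle = .mixed

    var body: some Scene {
        // A conventional 2D window floating in the user's space.
        WindowGroup {
            Text("Hello, spatial computing")
        }

        // "DemoSpace" is a placeholder identifier for this sketch.
        ImmersiveSpace(id: "DemoSpace") {
            // 3D content (a RealityView, for example) would be built here.
        }
        // The app declares which immersion styles it supports. With .progressive,
        // the Digital Crown lets the wearer dial how much of the real room is
        // replaced; .full takes over the surroundings entirely.
        .immersionStyle(selection: $style, in: .mixed, .progressive, .full)
    }
}
```

Even from that tiny sketch, Apple’s framing is clear: immersion is something the user dials in, not something an app gets to force on them.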
Meanwhile, the potential for connecting with someone who isn’t in the same physical space is huge. I think using the Vision Pro’s SharePlay mode to watch “Ready Player One” with my brother who lives 2000 miles away might be really cool and quite social. So perhaps the marketing line that the device is all about connection isn’t so paradoxical after all.
In short, the problem of isolation isn’t necessarily a hardware issue. It’s a software problem that can be solved if we value it enough.
It’s an exciting time to be a technologist. I can’t wait to have my first experience with an Apple Vision Pro. Apple has a marketing phrase that is often repeated in their developer sessions by Apple employees: “I can’t wait to see what you will develop next.” I’m struck by how true that statement is today. I really can’t wait.
Just as MartianCraft has been at the forefront for every new Apple platform reveal, we will be a thought and implementation leader in this new paradigm of computing. If you, too, can’t wait, and you want your company or app idea to be represented on the groundbreaking Apple Vision Pro, get in touch.