Embracing the Flawed: A Journey from the Sony eReader to the Apple Vision Pro
I bought my first e-reader back in September of 2006. The PRS-500, Sony’s first e-reader available in America, cost about $300 and featured digital ink (one of the first devices to do so), but it had no ability to connect to the internet on its own to get books; one had to tether it to a computer. The resolution was okay, not great. I don’t remember exactly how many books it could hold — probably not a lot by today’s standards.
At the time I was aware of discussions about e-readers but hadn’t seen any devices that I felt hit the mark. I knew even before purchasing the PRS-500 that it would have some significant drawbacks. I mean, it was only compatible with PCs! As it turned out, I had a Mac at home and a PC at work, and I remember on more than one occasion forgetting my eReader at home and being frustrated that I’d have to wait at least another full day before I could download a book I wanted. The online store looked clunky (and it was), and buying books was cumbersome to say the least.
And yet, even knowing all this, I bought it anyway. BUT WHY? It was pretty simple really. Even though I thought this e-reader would have flaws, I believed it would be sufficiently close to a right solution that it made sense to start reading some books from a screen sooner rather than later. More importantly, I hoped that by having some “skin in the game,” I would recognize the so-called right device when it emerged. The Sony eReader turned out to be perfectly suited to help me achieve these goals — it was good enough that I kept using it over the course of the next year, and in the process I came to have a clear understanding of how these devices worked and what they were good for. When Amazon released the Kindle just over a year later, I knew in an instant that I would like it: platform agnostic, no tethering required, a more readable screen, etc.
So how does this relate to the Apple Vision Pro? Is it just an extraordinarily expensive Sony eReader for spatial computing that you should consider buying in order to recognize the “right” device later on down the road? I don’t think so. I think the eReader equivalent was actually the Oculus Rift from several years ago, a forerunner to the Meta Quest 2 (soon to be the Meta Quest 3). The Rift wasn’t (and the Quest isn’t) the ideal headset, despite doing some pretty amazing things. And the Quest’s price point is, and will remain, dramatically lower than the Vision Pro’s.
If you have told yourself that you might get a Vision Pro next spring, but don’t currently have a Quest, I would encourage you to go get one. You’ll start to appreciate what does and doesn’t work well currently within the world of spatial computing. And you’ll hit the ground running with the Vision Pro. Start thinking now about how you could use the Quest at work, at home, and to connect with others. Have fun with it. Explore. That way, when the Vision Pro is ready for launch, you’ll be ready for the Vision Pro.
It’s the platform, stupid!
To state the obvious, it’s hard to forecast the future. (Not that I won’t be trying to do just that with a few predictions of my own later this week.) Predicting the future is hard in any forum, but especially with technology, because so much changes at once and because technologies are so interdependent. And when a so-called “breakthrough” happens, oh my, it sends ripple effects everywhere.
When a computer giant like Apple announces a new hardware platform, it’s about as close as we can get to seeing the future, or at least a portion of it. The reason is simple — the hardware is likely to be around for quite some time, because releasing it requires a lot of money, effort, and insight. As such, when Apple announced the Vision Pro last week, we got a glimpse into the next several years.
I read this morning that there have already been 10,000 reviews of the Vision Pro, and I’m sure that doesn’t include social media posts. Some forecast that the device will be a complete flop, while others predict that it will change the world. But most of the commentary is based on what we know about the software of today. I’ve done the same thing myself, suggesting last week that the current Apple ecosystem would be instrumental in achieving wide-scale adoption.
We have all heard the critique “a solution in search of a problem.” Usually that’s damning commentary. But I think it misses the mark when it comes to computer hardware. Maybe that’s exactly what a computing platform is supposed to be — a partial and potential solution in search of developers to solve known and not-yet-known problems. What problem exactly was the Mac solving back in 1984? Desktop publishing? Okay, maybe so. But it ended up solving a whole lot more than that because of developers who leveraged the platform later on to do some pretty amazing things. Purchasers of the original iPhone didn’t even have an App Store, but it didn’t take long until all kinds of solutions were developed to address problems which we previously hadn’t been able to solve, or that we didn’t think were even solvable!
At this point, Apple is keenly aware of how impactful the development community can be, and Apple is providing unprecedented support to software engineers for the Vision Pro. Sometimes the problems which will be solved can be envisioned, but often not.
Predicting the future is challenging. Go back and watch the “One more thing” portion of the Apple Watch keynote from 2014. Look for the now familiar Heart icon — for the Apple Health app — in the featured picture of the watch face. (Hint: You’ll be looking a long time — it isn’t there.) Back then, we couldn’t see how impactful the Apple Watch would be as a tool to promote health. The Vision Pro keynote last week had a lot of incredibly compelling content. But don’t we also have to wonder what was completely left out?
The Tough Questions.
It’s been just over ten days since Tim Cook uttered the magic words “One more thing…” at Apple’s WWDC23 event, just prior to announcing the Vision Pro.
Apple was no more specific than “early next year” for a release date for the Vision Pro. While I will make predictions about what may transpire once it’s out, I want to tee up questions to ask about this first version of the Vision Pro. Not the presumed second version of the Vision Pro, not the “cheaper” version that has been rumored the last few days to be coming out at the end of 2024, but the one which will be available in the spring.
Five potentially tough questions you’ll want to ask if you get a Vision Pro:
Question 1: What will it be like for people to see a digital representation of my eyes when I’m wearing my Apple Vision Pro?
EyeSight has been described as anything from “creepy” to “brilliant” — will it make one feel self-conscious, or, as I indicated last week, might it make people less inhibited about putting on the Vision Pro?
Question 2: Does the eye tracking work as advertised?
Others have tried (and mostly failed) to use our gaze as a proxy for the fine-tuned control of a mouse cursor for selections. If Apple has perfected eye tracking then that is a huge plus.
Question 3: Does using my fingers as a way to click links and icons work reliably?
Others, including the Meta Quest, have offered hand tracking, but it hasn’t worked well for all users. It can take a while to get the hang of, and controllers have historically been more reliable.
Question 4: How good is the battery life?
Two hours is what has been advertised, but time will tell. With the Meta Quest, you typically don’t want experiences inside the headset to run much longer than an hour; many users get fatigued after an hour or an hour and a half. With the Vision Pro, two hours may not be a problem if one gets similarly fatigued at the 90-minute mark. That said, if the market for the Vision Pro really is a replacement for the mobility and convenience of a laptop, then two hours won’t be enough for many users.
Question 5: How well can I see my phone when I’m wearing the Vision Pro?
Many “pass through” visuals in other headsets have missed the mark. What happens of course is that people just take off the headset and look at their phone. And often, once the headset is off, it stays off. If the Vision Pro actually allows us to look at our commonly used digital devices (tablets, phones, watches), then there is one fewer reason to take off the headset. And presumably the longer the headset is on, the more productive one will be with it.
Predictions about the impact of the Apple Vision Pro.
Any (or all!) of these could be wrong, but let’s start with ones which are fairly likely in my view and end with a couple that, while perhaps improbable, still have a very real chance of coming true:
Prediction #1: Long lines! People will sleep out overnight all across the country and hang out in surprisingly long lines in order to buy the Vision Pro on the first day of its release.
Prediction #2: Apple will put forth a favorable plan to upgrade one’s Vision Pro to subsequent editions in order to make it easier to stomach the $3499 price tag.
Prediction #3: By 2027, those with corrective eyeglasses will no longer need to get lenses for the Vision Pro, because there will be automatic refraction capability built in, allowing a single Vision Pro to adjust to any user’s eyes nearly instantly.
Prediction #4: After years of seeing declining sales across its line of headsets, Meta will sell to Apple the vast majority of its intellectual property and patents before the year 2030.
Prediction #5: Within three years, studies will have commenced to assess whether the Vision Pro’s widespread use has fundamentally changed our brain chemistry because of the radically different ways in which we are visually interacting with information, prompting concern and protests that the Vision Pro should not be used by children under 12 years of age.
– Written by Rob Merrilees