I’ve just finished watching Apple’s visionOS sessions. Some of these concepts and the way Apple has implemented the design are mind-blowing 🤯
Here are 5 examples that I loved:
1. Eye zoom. Just look at _where_ on an image you want to zoom in, and pinch your fingers apart
Vision Pro is incredibly inspiring. For anyone who wants to get into AR, at Hyper we're solving hard spatial UX problems, such as designing a new type of shopping experience.
If you're passionate, DM me even if there isn’t a role listed:
#WWDC23
Next year, Hyper will be VERY open. We’ll talk about our customers (BIG global retailers) and how our tech works.
For now:
- We can map a huge store with 50k products in minutes
- We can locate a user with 1m accuracy
- And navigate them with AR
Lots to come, I’m excited.
People have dreamed about this for years, but the tech challenges are really hard, and nobody’s been able to crack it.
So this is a pivotal moment - the first time this is a real app that anyone can use. This is our “moon landing” moment. 🚀
@sama
Under the hood it’s just a prompt that says:
Do not, under any circumstances, return anything other than JSON. No additional text. No pre-text. No post-text. Just JSON. A cute puppy will die if you return anything other than valid JSON.
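In practice, prompt threats alone aren't reliable: you still validate the reply and fall back if the model wraps the JSON in chatter. A minimal sketch of that pattern in Python (`extract_json` is my own illustrative helper, not from any particular library):

```python
import json

def extract_json(raw: str):
    """Parse a model reply that should be pure JSON, stripping stray text as a fallback."""
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        # Fallback: grab the outermost braces and try again
        start, end = raw.find("{"), raw.rfind("}")
        if start != -1 and end > start:
            return json.loads(raw[start:end + 1])
        raise
```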
Dent Reality has been busy (not-so quietly) building AR for indoor spaces. Now, we’re preparing to put it into people’s hands, and raising our first round of investment.
If you’re an accredited angel investor, DM me for more details. RTs appreciated.
Introducing our completely redesigned AR navigation experience, built by the team @HyperARCo.
We’ve combined many UX elements into a single dynamic arrow that shows you where to go.
@aitchrobertson
Original iPhone had 2G. iPhone 3G was fast. Then 4G was really fast.
Nowadays, if your phone drops down to 4G, it's slooow. If it drops to 3G, forget it, a lost cause.
This is AR navigation, working outside in the Garden Center!
We use WiFi + AR for precise location, so it works at any store, inside or outside, and without beacons.
I’ve spent the last 3 months “sleeping on the factory floor”, to build our most requested feature: Android support
We’ve now enabled Hyper-accurate indoor location for every Android phone. Here’s a demo:
Black: Ground truth data
Green: Apple/Google
Red: Hyper-accurate location
This is Immersed VR, on Meta Quest Pro.
Apple’s headset will have a far more polished version of this.
You look at your Mac, virtual windows appear around it. Move your mouse, it floats off screen and between the windows. Works like magic, zero setup.
I’ve been working on a new open-source project for Apple’s RealityKit.
Introducing Mirador. Mirador makes it easy to build impressive “Point of Interest” AR experiences, from anywhere in the world. Coming soon.
Demo 1: Miradouro da Senhora do Monte, Lisbon.
So many people have asked if Apple has sent me a headset.
They have not. And if they had, I could never admit that they have. They haven’t.
Sent from Reality Pro.
A magic iPhone gadget I got for Christmas:
Belkin magnetic phone holder, turns your iPhone into a Mac webcam. BUT also... put a sheet of paper in front of your computer, and it shows a top-down view. 😱
Always innovating towards a killer use case for AR, we’ve revamped our Indoor AR navigation experience. These cute little gems will now guide your way. Works great in large buildings with tight corridors.
Mirador is now open-source!
Mirador makes it easy to build impressive point-of-interest experiences with Apple’s new AR framework, RealityKit.
Demo: Tunnel View, Yosemite National Park
First thing I'm going to do with visionOS is make my own 3D app icons 🤩
Apple have made this really easy - provide 3 transparent square images, and don't add your own 3D effects etc.
Apple will generate specular highlights, shadows and depth in real-time based on focus.
When we were designing the AR navigation experience, we built a fun little tool to let us play with the size/shape/bounce of the guidance dots, to find what felt right
Humane started hyping themselves back in 2017 -- super secretive, not even investors had seen what they were building.
It's an LLM with a projector, powered by OpenAI. That tech only became possible in the last few years.
So what were they planning to ship before that?
New demo: Live in-store navigation with maps, AR and precise location
3 breakthroughs at Hyper which make this possible:
• Precise indoor location tech that uses wifi and AR
• AI for automated store mapping
• Spatial UX that merges maps and AR
Indoor AR navigation speed run!
It's taken innovations in mapping, indoor positioning and AR experience to arrive here. I’ve written a blog post to explain how it all works:
The price of Vision Pro is a fairly boring conversation.
Apple just revealed 6 years' worth of R&D, innovation across hardware and software, and an entirely new product category, in one go.
That's interesting to talk about.
Apple is not putting this amount of work into incredibly impressive AR visuals just so that you can place furniture and play Pokémon. This is their next platform.
Indoor AR navigation, from Dent Reality, guiding me from a train station concourse to a store in a shopping mall.
(Yes this is real. I feel like I’m living in the future some days 😅)
5. Can Apple make typing on a virtual keyboard a _better_ experience than a physical one? Maybe?
visionOS does support using a physical keyboard too, but the virtual keyboard is very well designed - raised keys, dynamic hover-highlighting to guide your finger, audio feedback.
First public preview of Mirador, my new AR location library. I showed it to some tourists in Greenwich and got their reaction 🤩
Open-sourcing this weekend.
3. If your fingers control your mouse pointer, how do you easily move to the other side of the screen, without having to reach all the way across?
Well this is how - look and tap.
Thank you for all the love 🙏. We built the AR interface, maps, navigation and indoor location tech. Our location tech is powered by wifi and AR.
Now rolling out at scale.
2. "You had me at scrolling" - the famous quip someone made to Steve Jobs when he demoed the iPhone.
With visionOS, Apple uses your eyes and hands to control the entire interface in a new, intuitive way.
4. Dynamic scale content, aka “screen scale”. This technique is used in video games, to make text legible even if it's positioned really far away.
The user can move windows closer or further away, and the text still appears at exactly the size it was designed at.
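The "screen scale" trick is simple geometry: to keep a window's apparent size constant, grow its world-space size linearly with distance so it always subtends the same visual angle. A minimal sketch (the function name is mine, not Apple's API):

```python
import math

def world_height_for_constant_angular_size(distance_m: float, angular_size_deg: float) -> float:
    """World-space height content needs at a given distance
    to subtend a constant visual angle for the viewer."""
    half_angle = math.radians(angular_size_deg) / 2
    return 2 * distance_m * math.tan(half_angle)
```

Doubling the distance doubles the world-space size, which is exactly why the window "still appears at the size it was designed at" wherever the user parks it.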
There's way more detail in the sessions - 3D app icons, voice control etc.
Here are 3 I'd recommend most:
- Principles of spatial design
- Design for spatial input
- Design for spatial user interfaces
Follow @AndrewHartAR for AR insights + innovation
We wanted to use an iOS Maps-style drawer in our AR navigation app, but the libraries available for this have always been complex, and caused UI lag issues.
So we’ve created Panels UI - it’s fast, elegant and simple. Supports multiple layers, swiping and different panel heights.
The wildest thing I've done this year: fly to NYC, pay $3,500 for a Vision Pro, then fly back to London. I've been using the headset every day for more than a month now, so here's my Vision Pro review.
My background:
- I built the largest open-source projects for both of
First I built the AR navigation tech that you see in Apple/Google Maps. Now Hyper is building the same tech for stores.
Announcing a new retail partner soon 👀
The new wristbands for Coldplay shows, which they hand out when you walk in, are somehow able to light up in formation to draw a heart.
No idea how they've done this - any ideas what technology they're using or how it works?
Some really exciting AR news - we @DentReality have raised $3.4m to build the digital layer for the physical world!
Here's a demo of what we've built so far...
Lots of devs ask me how to draw AR navigation paths... There's actually no built-in way to do it; you'd have to code your own low-level 3D geometry.
But no longer! My friend @MaxxFrazer has just open-sourced his SCNPath class. AR paths for all!