It has taken Mojo Vision seven years to arrive where it is today: a feature-complete smart contact lens in an internal prototype that is now ready for real-world testing and is the company’s first real candidate for a releasable product.
Apparently, inventing the future isn’t easy.
The vision (groaner) is huge: a smart contact lens that could eventually replace smartphones, smartwatches, augmented reality glasses, even virtual reality headsets. That’s a big ask, and it’s taken some serious science, engineering, and iteration to get to even a first potential product.
Currently, the Mojo smart contact lens boasts:
- a 14,000-pixels-per-inch MicroLED display (the world’s smallest at just 0.5 millimeters in diameter, and the densest with a pixel pitch of just 1.8 microns, the company says)
- a 5GHz ultra-low-latency radio to stream AR content
- an ARM processor
- continuous eye tracking via custom-configured accelerometers, gyroscopes, and magnetometers
- medical-grade in-lens batteries
- an eye-controlled user interface
The contact lens connects with an external controller that you might wear like a necklace, where all the heavy compute happens. It uses a proprietary 5GHz radio because Bluetooth is too slow and its latency is too high. And the optic includes a “teeny, tiny, reverse Cassegrain telescope like the Hubble” built into the lens. The result is eye tracking that’s a 10X improvement over anything else available today, and the ability to build an eye-controlled user interface.
“Basically, it’s all about giving you super powers,” Mojo VP Steven Sinclair told me recently on the TechFirst podcast. “We’re building the world’s first, true augmented reality smart contact lens. So, something that you could put on your eye, see content when you want to see it, have it disappear when you’re not using it, so you look like yourself and you’re engaged in the real world.”
It has taken a while to get to this point.
Mojo Vision started in 2015 and has raised eight rounds of funding totaling $204 million. Investors include Amazon’s venture fund, Stanford’s StartX fund, Motorola, LG, Khosla Ventures, and a who’s who of venture capitalists.
But the finish line might be closer than you think. Sinclair won’t commit to a launch date, and there is FDA regulation and testing that will influence that as well as the company’s internal testing and iteration. But it’s not “way out in the future,” he says.
“We’re finally there with something we call a ‘feature complete’ lens,” Sinclair told me. “That means, basically, that all of the technical elements are pulled together into a single system that we can wear, and try, and test with.”
The seven years is simply due to the challenges involved. The microLED display needed to be invented basically from the ground up, as did the control system and the optics that focus the light from the display onto the back of your retina. Building a lens that doesn’t twist mattered too: if the lens moves around your eye or rotates as you blink, everything you see might end up 97 degrees off vertical, and turning your head to the side to read it wouldn’t help.
As things stand, however, the company says it has solved those issues.
“It literally is content that’s floating in space around you,” Sinclair says. “We can lock it. So, it could be locked in the world, so you can see some content to the right, to the left, to the bottom, up and down, wherever you want to look. And it’s up to you to decide when you bring that content up, and when you don’t want to see it anymore, it all just disappears and you’re just looking at the world.”
Athletes might like it for seeing routes and performance data. Technicians might like it for seeing schemata overlaying aircraft engines or building superstructures. Anyone who uses a smartwatch or smartphone today might like it for quicker, more accessible data delivery or augmented reality experiences than they can get from existing devices. And those with limited vision might use it to highlight dangers and routes, Mojo says.
It does require a relay accessory, as mentioned above.
The relay has a processor, memory, a GPU to run the display, and a radio to communicate with the smart contact lenses. (Plus Bluetooth, presumably, to communicate with a phone.) The relay is what’s pulling eye-tracking information off the smart lens and determining where to stream data in your visual field.
“Most of the applications that we’re using are run on an accessory that you also are wearing on your body somewhere near the head,” Sinclair says. “We call it a relay accessory. It could be built into a hat or a helmet, it could be built into a pair of safety goggles, it could be built into a neck band. And so, that’s probably the form factor we’re going to start with. It has to be relatively close to the eyes because the transmit power of the lenses is not particularly high.”
As for charging your smart contact lenses?
They charge at night, just like a smartwatch, in the cleaning/charging case.
The lenses themselves are hard lenses, and they are scleral lenses. That means they do not sit on the cornea, the clear front part of your eye that you see through, but on the sclera, the whites of your eyes. That makes them more comfortable, Mojo says, because the cornea has plenty of nerve endings — which is why pokes in the eye are so painful — but the whites of your eyes do not.
“We measure your eye when you come into your optometrist and we get the shape of that eyeball,” says Sinclair. “It’s not perfectly round. It’s got lots of ridges and bumps and such, and we cut the inside of the lens to match the shape of your eye so it rests very comfortably on. So, it is a little bit bigger than what people think of when they think of either corneal hard lenses or daily disposable soft lenses. But it’s quite comfortable on the eye because it’s been cut like a puzzle piece to match on your eye.”
Interestingly, vision is only part of the plan for this smart contact lens.
Other things Mojo is working on include analyzing the tear fluid on your eyes, measuring intraocular pressure, and checking for early onset of glaucoma. The eyes are a great platform for health sensors, Sinclair says, and that is on the menu along with enhanced vision and data delivery via AR.
Subscribe to TechFirst; get a transcript.