‘Autofocals’ Use Eye-Tracking to Do the Job of Progressive Lenses

A prototype being developed at Stanford University uses eye-tracking technology to do the job of progressive lenses.

“More than a billion people have presbyopia and we’ve created a pair of autofocal lenses that might one day correct their vision far more effectively than traditional glasses,” said Stanford electrical engineer Gordon Wetzstein. For now, the prototype looks like virtual reality goggles but the team hopes to streamline later versions.

Wetzstein’s prototype glasses – dubbed autofocals – are intended to solve the main problem with today’s progressive lenses: They require the wearer to align their head to focus properly.

The constant head-turning and gaze-shifting that progressive lenses demand can also make it difficult to navigate the world. “People wearing progressive lenses have a higher risk of falling and injuring themselves,” said graduate student Robert Konrad, a co-author on a paper describing the autofocal glasses published June 28 in the journal Science Advances.

The Stanford prototype works much like the lens of the eye, with fluid-filled lenses that bulge and thin as the field of vision changes. It also includes eye-tracking sensors that triangulate where a person is looking and determine the precise distance to the object of interest. The team did not invent these lenses or eye-trackers, but they did develop the software system that harnesses this eye-tracking data to keep the fluid-filled lenses in constant and perfect focus.
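The core idea can be illustrated with a little geometry: when both eyes fixate on a nearby object, their gaze directions converge, and the convergence angle reveals the distance, which in turn sets the optical power the lens must supply. The sketch below is illustrative only and is not taken from the paper; the interpupillary distance, the symmetric-fixation geometry, and all function names are assumptions for the example.

```python
import numpy as np

# Hypothetical sketch of a gaze-to-focus calculation. Only the geometry is
# standard; the constants and interfaces are assumptions, not the paper's.

IPD = 0.063  # interpupillary distance in meters (typical adult value; assumed)

def vergence_distance(left_gaze, right_gaze):
    """Estimate fixation distance (meters) from the angle between the two
    eyes' unit gaze vectors, assuming symmetric convergence on the midline."""
    cos_theta = np.clip(np.dot(left_gaze, right_gaze), -1.0, 1.0)
    theta = np.arccos(cos_theta)        # total vergence angle in radians
    if theta < 1e-6:
        return float("inf")             # gazes parallel: fixating far away
    return (IPD / 2.0) / np.tan(theta / 2.0)

def lens_power(distance_m):
    """Optical power in diopters needed to focus at that distance: P = 1/d."""
    return 0.0 if np.isinf(distance_m) else 1.0 / distance_m

# Example: eyes converging on a point 0.5 m straight ahead.
d = 0.5
left = np.array([IPD / 2, 0.0, d]);  left /= np.linalg.norm(left)
right = np.array([-IPD / 2, 0.0, d]); right /= np.linalg.norm(right)
est = vergence_distance(left, right)
print(est, lens_power(est))  # ~0.5 m, ~2.0 diopters
```

In a real system this loop would run continuously, feeding each new power estimate to the fluid-filled lens so focus tracks the wearer's gaze.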

Nitish Padmanaban, a graduate student and first author on the paper, said other teams had previously tried to apply autofocus lenses to presbyopia. But without guidance from the eye-tracking hardware and system software, those earlier efforts were no better than wearing traditional progressive lenses.

To validate its approach, the Stanford team tested the prototype on 56 people with presbyopia. Test subjects reported that they could read and perform other tasks better and faster with the autofocal lenses. Wearers also tended to prefer the autofocal glasses over progressive lenses – bulk and weight aside.

If the approach sounds a bit like virtual reality, that isn’t far off. Wetzstein’s lab is at the forefront of vision systems for virtual and augmented reality. It was in the course of such work that the researchers became aware of the new autofocus lenses and eye-trackers and had the insight to combine these elements to create a potentially transformative product.

The next step will be to downsize the technology. Wetzstein thinks it may take a few years to develop autofocal glasses that are lightweight, energy-efficient and stylish. But he is convinced that autofocals are the future of vision correction.

“This technology could affect billions of people’s lives in a meaningful way that most techno-gadgets never will,” he said.

Gordon Wetzstein is also director of the Stanford Computational Imaging Lab and a member of Stanford Bio-X and the Wu Tsai Neurosciences Institute.

The research was funded in part by Intel Corp., NVIDIA, an Okawa Research Grant, a Sloan Fellowship and the National Science Foundation.

INVISION Staff

Since launching in 2014, INVISION has won 23 international journalism awards for its publication and website. Contact INVISION's editors at editor@invisionmag.com.
