Dr. Burge
By Rebecca Salowe
Scheie Vision Annual Report 2018

How do we see?  How do we focus our eyes?  How does our visual system estimate the distance between two objects?  We perform these tasks more than 100,000 times each day without much thought.  It is the goal of Johannes Burge, PhD, an Assistant Professor of Psychology at the University of Pennsylvania (UPenn), to understand the computational solutions to these tasks. One of over 60 vision scientists at UPenn, Dr. Burge specializes in sensation and perception.  His research specifically aims to understand how our visual system completes these processes in the natural world.  However, it is difficult to model the “natural world” in a laboratory setting, a challenge that Dr. Burge has spent years developing methods to overcome.

How to Model the World Around Us

Most laboratory research probes vision with artificial stimuli, such as bars or blobs of a particular shape.  As a result, the majority of knowledge about the visual system to date comes from research with artificial stimuli.  Dr. Burge hopes to change this trend.  While bars and blobs are well-defined mathematically, making subsequent analyses easier to control, they are not a realistic portrayal of our world.  Our visual world is complex and diverse, full of color and detail.

So why is it rare to use real images – called “natural” stimuli – in vision experiments?  “Natural stimuli are more difficult for experiments and analyses, because they are complicated and resist precise mathematical description,” Dr. Burge explained.  “We cannot control all the variables that may impact the performance of the observer.”  

Despite these challenges, Dr. Burge’s lab continues to investigate methods to better define natural stimuli.  Unfortunately, the solution is not as simple as hiring statisticians to create mathematical models of natural images.  Due to the sheer complexity of images we see in the natural world, this method has not seen much success.

Thus, Dr. Burge pursued another possible solution, which he likens to “playing the detective.”  “Let’s take the task of estimating the distance of an object from you in a scene,” he said.  “There are certain features in the images formed on the back of your eye that provide information about how far away that object is. These features are clues about the distance.”  Examples of “clues” include the size of the object’s image or the position of the shadows it casts. 

“We then develop statistical tools that automatically identify the useful features that are relevant to estimating distance,” he continued.  “We determine how to combine those features in the best way possible to estimate the distance.  Finally, we conduct an experiment to test if humans in fact use those features.” 
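
To make that three-step “detective” approach concrete, the toy Python sketch below runs through the same logic on synthetic data: score a set of candidate image features by how well they predict distance, combine the best ones into a single estimate, and check the estimate against the true distances.  The data, the feature labels, and the simple correlation-and-regression machinery are stand-ins chosen purely for illustration; they are not Dr. Burge’s actual stimuli or statistical tools.

import numpy as np

# Toy illustration (not the lab's actual method): given image patches
# labeled with the true distance of an object, score candidate features
# by how well they predict distance, then combine the best ones.

rng = np.random.default_rng(0)
n_patches, n_features = 500, 20

# Hypothetical stand-ins: rows are image patches, columns are candidate features.
true_distance = rng.uniform(1.0, 10.0, n_patches)
features = rng.normal(size=(n_patches, n_features))
# Make a few features genuinely informative about distance.
features[:, 0] += 1.0 / true_distance          # e.g., retinal size shrinks with distance
features[:, 1] += 0.3 * true_distance          # e.g., position of a cast shadow
features[:, 2] += 0.1 * true_distance ** 0.5   # a weaker cue

# Step 1: identify useful features (here, by absolute correlation with distance).
scores = np.abs([np.corrcoef(features[:, j], true_distance)[0, 1]
                 for j in range(n_features)])
best = np.argsort(scores)[::-1][:3]

# Step 2: combine the selected features (here, with simple least-squares weights).
X = np.column_stack([features[:, best], np.ones(n_patches)])
weights, *_ = np.linalg.lstsq(X, true_distance, rcond=None)
estimate = X @ weights

print("selected features:", best)
print("RMS estimation error:", np.sqrt(np.mean((estimate - true_distance) ** 2)))

The third step, of course, has no software shortcut: it requires a behavioral experiment to test whether human observers actually rely on the features the analysis singles out.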

Dr. Burge has repeatedly found that humans use the available image clues to perform visual tasks nearly as well as the theoretical upper limit. On one hand, these results are expected. There has been great evolutionary pressure to effectively perform basic visual tasks, such as estimating the distance to an object, because doing so will aid the capture of food or avoidance of predators. On the other hand, it is remarkable that a task-based analysis of natural images can tightly predict visual performance.

Auto-Focus… and Smartphone Cameras

One real-life application of Dr. Burge’s research revolves around something unexpected: smartphone photos.  Or, more specifically, understanding how the auto-focus system in the human eye works – and how that understanding can improve auto-focus in cameras.

“The auto-focus system in our eyes works a lot better than the auto-focus system in your camera,” said Dr. Burge.  “That’s because our visual systems use the available information—the clues in the images—in a much more principled way than the engineers at Apple.”

Imagine you are taking a photo with your iPhone.  You glance at the screen and initially see a blurry image of the scene.  The clarity of the image oscillates for a moment before becoming clear and sharp, and then you take the photo.

What is actually happening is that the camera is using a “guess and check” procedure to find the best lens position.  In a fraction of a second, it moves the lens in one direction and checks whether the contrast improves – then continues to make adjustments until the image becomes sharp.  Sometimes the lens moves past the point of best focus, so it reverses direction (i.e. the image on your phone may be blurry, then clear, then blurry again, before settling).
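
A rough sketch of that guess-and-check loop, in Python, is below.  The camera hardware is simulated: contrast_at() stands in for capturing a frame and scoring its sharpness at a given lens position, and the loop simply nudges the lens, keeps any move that raises contrast, and reverses with smaller steps when it overshoots.  It illustrates the general idea, not any particular phone’s implementation.

# A minimal sketch of contrast-detection ("guess and check") autofocus.
# contrast_at() is a simulated stand-in for capturing a frame and
# measuring its sharpness at a given lens position.

def contrast_at(pos, best_focus=3.2):
    # Simulated sharpness: peaks when the lens sits at the best-focus position.
    return 1.0 / (1.0 + (pos - best_focus) ** 2)

def autofocus(start_pos=0.0, step=1.0, min_step=0.01):
    pos = start_pos
    best = contrast_at(pos)
    direction = +1
    while step > min_step:
        trial = pos + direction * step          # guess: nudge the lens
        contrast = contrast_at(trial)           # check: did sharpness improve?
        if contrast > best:
            pos, best = trial, contrast         # keep moving the same way
        else:
            direction *= -1                     # overshot best focus:
            step /= 2                           #   reverse and take smaller steps
    return pos

print(round(autofocus(), 2))  # converges near the simulated best-focus position (3.2)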

Though this process is fast, usually happening in less than a second, it is significantly slower than the speed of auto-focus in the human eye.  “The fact that you can see the auto-focusing happen in the camera, and be frustrated by it, means that it is happening much slower than the auto-focusing in your eye,” said Dr. Burge.  “The human visual system does it differently, and does it better.”

This is an incredible feat, when one considers the diversity of images that our eyes see.  In an instant, we can turn our heads and see objects at multiple distances with a staggering variety of colors, shapes, sizes, and textures.  How do our eyes auto-focus without us even noticing?  And can that same process be applied to smartphone cameras in the future?  Dr. Burge and colleagues believe so.

“There is one statistical property that is remarkably constant in images, in spite of all the variability,” said Dr. Burge.  “And that is the amount of contrast at each level of detail in an image. It is what is known as a 1/f (‘one-over-f’) amplitude spectrum.”  In this spectrum, the amount of contrast at each level of detail (i.e. spatial frequency) falls off in inverse proportion to the frequency: doubling the frequency roughly halves the contrast.
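
The sketch below shows what that property means in practice.  It builds a synthetic image that has a 1/f amplitude spectrum by construction (a stand-in for a real photograph), then measures the spectrum the way one might for any image: take a Fourier transform, average the amplitude over rings of equal spatial frequency, and fit the slope on log-log axes.  For a 1/f spectrum the slope comes out near -1.

import numpy as np

# Build a synthetic image with a 1/f amplitude spectrum, then measure it.
n = 256
fy = np.fft.fftfreq(n)[:, None]
fx = np.fft.fftfreq(n)[None, :]
freq = np.sqrt(fx**2 + fy**2)
freq[0, 0] = 1.0  # avoid division by zero at the DC term

rng = np.random.default_rng(0)
phases = np.exp(2j * np.pi * rng.random((n, n)))
image = np.real(np.fft.ifft2(phases / freq))   # amplitude ~ 1/f by construction

# Measure the amplitude spectrum and average it over rings of equal frequency.
amp = np.abs(np.fft.fft2(image))
bins = np.logspace(np.log10(2 / n), np.log10(0.5), 12)
centers, mean_amp = [], []
for lo, hi in zip(bins[:-1], bins[1:]):
    mask = (freq >= lo) & (freq < hi)
    centers.append(np.sqrt(lo * hi))
    mean_amp.append(amp[mask].mean())

# Slope of log(amplitude) vs. log(frequency); close to -1 for a 1/f spectrum.
slope = np.polyfit(np.log(centers), np.log(mean_amp), 1)[0]
print(f"log-log slope = {slope:.2f}  (about -1 for a 1/f amplitude spectrum)")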

However, an out-of-focus image changes the normal shape of this spectrum.  A small amount of focus error removes fine detail (i.e. high spatial frequencies) from the image, like the pinstripes on a shirt. A medium amount of focus error removes contrast at intermediate levels of detail as well. The larger the focus error, the coarser the detail that is lost.

The significance of this pattern is that different focus errors are associated with differently shaped spectra.  “If you can recover the shape of the spectrum, you can use that shape to estimate focus error,” said Dr. Burge.  The estimate of that focus error can then be used to shift the lens by a specific amount – in a single step, without any need for the guess-and-check process.  These findings were published in the Proceedings of the National Academy of Sciences and in Information Display, and the method has been patented.
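
The toy sketch below illustrates the logic of matching spectrum shapes; it is emphatically not the published or patented algorithm.  It assumes a simple model in which defocus multiplies the 1/f spectrum by a low-pass filter whose bandwidth shrinks as the focus error grows.  A noisy “observed” spectrum is then compared against a family of candidate spectra, and the best-matching candidate gives the estimated (unsigned) focus error in one shot.

import numpy as np

# Toy model: defocus acts roughly like a low-pass filter, so each focus
# error leaves a differently shaped amplitude spectrum.
freqs = np.linspace(0.01, 0.5, 100)          # spatial frequency (cycles/pixel)

def defocused_spectrum(focus_error):
    # Assumed model: 1/f spectrum attenuated by a Gaussian low-pass whose
    # bandwidth shrinks as the magnitude of the focus error grows.
    bandwidth = 0.5 / (1.0 + 4.0 * abs(focus_error))
    return (1.0 / freqs) * np.exp(-(freqs / bandwidth) ** 2)

# "Observed" spectrum at an unknown focus error, with measurement noise.
rng = np.random.default_rng(1)
true_error = 0.8
observed = defocused_spectrum(true_error) * rng.lognormal(0.0, 0.05, freqs.size)

# Estimate: pick the candidate focus error whose spectrum matches best
# (compared in the log domain so all frequencies contribute comparably).
candidates = np.linspace(0.0, 2.0, 201)
errors = [np.mean((np.log(observed) - np.log(defocused_spectrum(c))) ** 2)
          for c in candidates]
estimate = candidates[np.argmin(errors)]
print(f"true focus error {true_error:.2f}, estimated {estimate:.2f}")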

Though more studies are needed, this auto-focus process has exciting implications for improving smartphone cameras.  Dr. Burge and colleagues are also in discussions with a medical device company that is building a digital magnifying glass for individuals with low vision.  Patients have given feedback that the auto-focus of this device does not work well, so Dr. Burge is working with the company to see if his method can be incorporated into the device.

Blur, Binocular Depth Perception, and Public Health Consequences

Dr. Burge’s research converges on another interesting area: monovision prescriptions.  In the past decade, more and more patients over age 40 have opted for monovision glasses or contact lenses instead of bifocals.  Unlike bifocals, which pack two distinct optical powers into a single lens, a monovision correction gives each eye a lens with a different power.  Typically, the dominant eye receives a prescription for far vision and the non-dominant eye for near vision.

“Some people love it because they don’t need to look up or down to see far and near, respectively,” said Dr. Burge.  In addition, monovision corrections reduce a person’s dependence on reading glasses for near-vision tasks, such as reading a book.  But these corrections are not without their drawbacks.  “With a monovision correction, at least one eye’s image will always be blurry, no matter what,” Dr. Burge added.  “But some people aren’t bothered by the superimposed blurry image.”  In fact, there is some evidence to suggest that the visual system begins to suppress the blurrier of the two images.

Though monovision corrections are being prescribed more and more widely, their impact on binocular vision remains largely unknown.

In preliminary work, Dr. Burge has shown that monovision corrections can have a dramatic impact on binocular depth perception.  He cites a perceptual illusion called the Pulfrich Effect as support.  In this illusion, a person views a pendulum swinging back and forth in the frontal plane.  Then, he or she takes a sunglass lens and places it over one eye.  What happens to the percept of the pendulum?

“Instead of the pendulum looking like it is swinging back and forth in the frontal plane, it now looks like it is swinging in an elliptical path, closer when it swings in one direction, farther when it swings in the other,” said Dr. Burge.  “It’s an incredibly dramatic effect.”

Why does this happen?  It’s a relatively simple explanation: the eye with the sunglass lens receives less light, slowing down the eye’s response (i.e. the transmission of that signal to the brain). If one eye is sending signals more slowly than the other, a neural disparity is created.  
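
A back-of-the-envelope calculation shows why a delay of just a few milliseconds is so visible.  If one eye’s signal lags by dt, that eye effectively sees a moving target where it was dt seconds earlier, displaced by speed × dt; the brain reads the mismatch as binocular disparity and therefore as depth.  The numbers below are illustrative assumptions (a typical interocular separation, a one-meter viewing distance, a plausible delay), not measurements, and the geometry uses small-angle approximations.

# Illustrative Pulfrich arithmetic: an interocular delay turns motion
# into an effective binocular disparity, which is perceived as depth.
I_sep = 0.065   # interocular separation (m), a typical value
D     = 1.0     # viewing distance to the pendulum (m)
v     = 1.0     # lateral speed of the pendulum bob (m/s)
dt    = 0.005   # assumed interocular delay caused by the darker eye (s)

lateral_offset = v * dt                    # where the delayed eye sees the bob (m)
disparity      = lateral_offset / D        # equivalent angular disparity (radians)
depth_shift    = disparity * D**2 / I_sep  # perceived depth offset (m)

print(f"apparent depth shift = {depth_shift * 100:.1f} cm")  # about 7.7 cm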

Dr. Burge and colleagues thus reasoned that if one eye were blurred more than the other (as with a monovision correction), the same effect would occur.  After all, blur reduces contrast, and reduced contrast is known to slow signal transmission.  They were surprised to find the exact opposite.

“We found the reverse Pulfrich Effect,” he explained.  “The perceived elliptical path is in the opposite direction of the elliptical path that we were expecting.”  

They soon discovered an explanation.  Instead of slowing the signal from the eye, as the sunglass lens does, the blurring lens actually speeds transmission relative to the other eye.  Blur removes fine detail, which takes more time to process than coarse detail, so the blurred eye’s signal reaches the brain sooner.  A similar effect can occur if the pupil of one eye is dilated and the other is not, as often happens for patients visiting the optometric clinic.

This neural disparity is not just an obscure effect seen in a lab; it may have serious implications for optometric practice and public health.  

“Imagine you are pulling up to an intersection in your car,” explained Dr. Burge.  “And imagine you have a monovision correction that is blurring the right eye for far objects.  Now imagine that a bicyclist is coming from the left in the cross traffic. You’re going to perceive this bicyclist as being much farther away than he actually is.  So maybe you’re a bit lax on how you hit the brakes. Collision.”

More research is needed to understand if individuals with monovision lenses adapt out of this effect.  However, previous research has shown that over time, these Pulfrich effects become more pronounced rather than eliminated.  

Driving is just one example of the public health stakes of monovision prescriptions.  One possible safeguard is ensuring that in the United States, where we drive on the right side of the road, patients receive the lens focused at far distances in the right eye (rather than in whichever eye happens to be dominant), so that the configuration in the intersection scenario above – a right eye blurred for far objects – is avoided.  Drivers in England, who drive on the left, would instead receive the blurred lens in the right eye.

As a whole, Dr. Burge’s research into our visual systems has the potential to influence several different realms of life, from day-to-day entertainment to sight-saving technologies.
