
# Notes by Dr. Optoglass: Airy Disk and Pixel Density of the Human Eye

Topics Covered:

• The Airy Disk
• Diffraction
• Pixel size of the Eye
• Pixel density or resolution of the Eye

Everybody thinks they know what art should be. But very few of them have the sense that is necessary to experience painting – that is, the sense of sight, which sees colors and forms as living reality in the picture. – Otto Dix

What shape are pixels? On displays they are rectangular. On camera sensors they are probably amoebic! We know our photoreceptor cells are shaped like rods and cones. Printer droplets are circular. Which is it?

No matter what, light that passes through a circular aperture (usually a lens) takes on the characteristics of a circle. Hence a lot of optical testing relies on the mathematics that govern this model.

The Airy disk is named after the astronomer George Biddell Airy. This is what it looks like:

The 3D model tells us how the intensity of light is distributed. The width of the main circle is the smallest possible dot that can be produced by any given lens at a certain aperture. The rings are diffraction patterns.

Diffraction is light bending around corners:

They look similar, don’t they? So why is diffraction important?

• You can’t escape from diffraction.
• Every lens needs a slit (aperture), so no lens can escape diffraction.
• Even a perfect lens (which doesn’t exist) is affected by diffraction.
• Any camera system that relies on a lens (even a perfect one) is affected by diffraction.
• Every camera system has a sensor that is of a certain resolution.
• The lens used must resolve at least as well as the sensor; the lens shouldn't be the weak link in the chain.
• A system that has a lens that is better than the sensor and all other optical elements in the chain is said to be diffraction limited.
• It is diffraction limited because the lens (and all other optical elements, like filters, etc) is so much better than the sensor in resolution that at this point only diffraction can spoil the image.

Therefore, a camera system in which resolution is not limited by imperfections in the lens but only by diffraction is said to be diffraction limited.

If you have read what Professor Sampler has to say about this, you’ll remember that it’s only when things get smaller that quantum mechanics comes into play. The effect of diffraction is almost invisible if the slit (aperture) is large. That’s why we don’t observe its effects in the real world. But when the aperture gets really small (try it by almost squeezing your eyes shut – everything blurs) the effects of diffraction are not negligible.

Big question: Is our eye diffraction limited?

To answer this question we’ll need a formula to calculate the resolution of the human eye. Just like for everything else in science and engineering, we have not one, but two formulas to choose from:

• The Rayleigh criterion: θ = 1.22 λ/D, where θ is the angular resolution in radians, λ is the wavelength of light in meters, and D is the diameter of the lens' aperture in meters
• The Dawes' limit: R = 11.6/D, where R is the angular resolution in arc seconds and D is the diameter of the lens' aperture in centimeters

The size of the human pupil can vary from 3mm to 9mm.
A typical human eye will respond to wavelengths from about 390 to 750 nm, with maximum sensitivity at around 555 nm.

What does that give us?

• According to Rayleigh about 0.2 arc min to 1 arc min
• According to Dawes about 0.2 arc min to 0.6 arc min
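Both estimates are easy to reproduce. Here is a minimal sketch (the function names are mine; 555 nm and the 3–9 mm pupil range are taken from the figures above):

```python
import math

def rayleigh_arcmin(aperture_m, wavelength_m=555e-9):
    """Rayleigh criterion: theta = 1.22 * lambda / D, converted to arc minutes."""
    theta_rad = 1.22 * wavelength_m / aperture_m
    return math.degrees(theta_rad) * 60

def dawes_arcmin(aperture_cm):
    """Dawes' limit: R = 11.6 / D arc seconds, converted to arc minutes."""
    return (11.6 / aperture_cm) / 60

for pupil_mm in (3, 9):
    print(f"{pupil_mm} mm pupil: "
          f"Rayleigh {rayleigh_arcmin(pupil_mm / 1000):.2f} arc min, "
          f"Dawes {dawes_arcmin(pupil_mm / 10):.2f} arc min")
```

A 3 mm pupil gives roughly 0.8 arc min (Rayleigh) and 0.6 arc min (Dawes); a 9 mm pupil gives roughly 0.2 arc min on both, which is where the two formulas converge.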

What both agree on fundamentally is that the eye cannot resolve beyond 0.2 arc minutes due to diffraction. As we have seen in studies, the eye does not resolve beyond 0.4 arc minutes anyway, and it would be a rare individual who can do better than 0.4 arc minutes.

To answer the question: No, the eye is not a diffraction limited system. Our sensor (retina) is better than our lens.

Pixel Density

So how good is our ‘sensor’? We have already seen that most of the cones are found in the fovea, with the foveola having the highest cone density.

What is the size of one disk that 0.2 arc minutes would subtend on the foveola, assuming the diameter of the eye is 22.22 mm? It is about 1.3 µm (microns). 0.4 arc minutes would subtend about 2.6 µm (microns).
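The conversion is just angle → radians → linear size at the eye's diameter. A small sketch (the function name is mine; 22.22 mm is the assumed eye diameter from the text):

```python
import math

EYE_DIAMETER_MM = 22.22  # assumed axial length of the eye

def retinal_size_um(arc_minutes):
    """Linear size on the retina subtended by a given angle."""
    theta_rad = math.radians(arc_minutes / 60)
    return theta_rad * EYE_DIAMETER_MM * 1000  # mm -> microns

print(retinal_size_um(0.2))  # ~1.3 microns
print(retinal_size_um(0.4))  # ~2.6 microns
```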

What is the area of the fovea? It's about 3.14 mm². The area of our Airy disk is π × (1.3 × 10⁻³ mm)² ≈ 5.3 × 10⁻⁶ mm². How many disks can fit in our fovea? About 600,000.

If we try the same calculation with 2.6 µm (microns), we get the total cones in the fovea to be about 150,000. Which one is right?
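Both counts come from dividing the fovea's area by one disk's area. A quick sketch (treating the quoted micron figure as the disk radius, as the calculation above does):

```python
import math

FOVEA_AREA_MM2 = 3.14  # area of the fovea, from the text

def disks_in_fovea(radius_um):
    """How many disks of the given radius tile the fovea's area."""
    disk_area_mm2 = math.pi * (radius_um * 1e-3) ** 2
    return FOVEA_AREA_MM2 / disk_area_mm2

print(round(disks_in_fovea(1.3)))  # ~590,000 -> "about 600,000"
print(round(disks_in_fovea(2.6)))  # ~148,000 -> "about 150,000"
```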

The fovea subtends about 2 degrees of human vision, and if each disk subtends 0.2 arc minutes, how many disks can fit into 2°? About 1 million. That should tell us that about 2 µm (microns) is a good average. Remember, in The Human Eye Part II, we learnt that a cone can vary in size between 0.5 and 4.0 µm. What's the average? Yes, 2 µm.

This is also confirmed by tests of cone density in the fovea, which at its maximum is about 350,000. This gives us a 'pixel' value of about 1.7 to 2 µm.

If the diameter of the fovea is 1mm, then the ‘pixels’ per mm = 500. You could translate that into 500 lines per mm or 250 lp/mm or about 12,700 ppi.

Modern sensors have a pixel size (pitch) of 4 µm. This gives us a theoretical maximum of about 250 lines per mm (125 lp/mm), or 52.5 megapixels for a 35mm sensor (8750 × 6000).
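The sensor figures follow directly from the pitch. A sketch under one assumption: the 8750-pixel width implies a frame 35 mm wide and 24 mm tall, which is what the division below uses:

```python
pitch_um = 4                   # pixel pitch of a modern sensor
width_mm, height_mm = 35, 24   # frame size implied by the 8750 x 6000 figure

pixels_w = width_mm * 1000 // pitch_um   # pixels across the width
pixels_h = height_mm * 1000 // pitch_um  # pixels across the height
megapixels = pixels_w * pixels_h / 1e6

print(pixels_w, pixels_h, megapixels)  # 8750 6000 52.5
```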

Therefore, here are the results:

• Density at its absolute sharpest (0.5 µm) = about 50,800 ppi or 2,000 lpm or 1,000 lp/mm
• Density at 1 µm = 25,400 ppi or 1,000 lpm or 500 lp/mm
• Density at 2 µm = 12,700 ppi or 500 lpm or 250 lp/mm
• Density at 4 µm = 6,350 ppi or 250 lpm or 125 lp/mm
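All four rows reduce to one conversion: lines per mm = 1000 / pixel size in µm, a line pair needs two pixels, and ppi multiplies lines per mm by 25.4 mm per inch. A minimal sketch (the function name is mine):

```python
def density(pixel_um):
    """Convert a pixel size in microns to ppi, lines/mm, and lp/mm."""
    lpm = 1000 / pixel_um       # lines (pixels) per mm
    return {
        "ppi": lpm * 25.4,      # pixels per inch
        "lpm": lpm,
        "lp_per_mm": lpm / 2,   # a line pair needs two pixels
    }

for size_um in (0.5, 1, 2, 4):
    print(size_um, density(size_um))
```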

Modern sensors are getting there!

Takeaways:

• A point of light passing through a lens is described in terms of an Airy disk pattern.
• A camera system in which resolution is not limited by imperfections in the lens but only by diffraction is said to be diffraction limited.
• The pixel size of the eye can be said to be about 2 µm (microns).
• The pixel density (or resolution in these terms) of the eye can be said to be about 12,700 ppi or 500 lpm or 250 lp/mm.

Links for further study: