Evolution Of Both Simple And Compound Eyes Biology Essay



The function of the eye has been constant throughout its evolution. From the simplest photoreceptor types all the way to the human eye with its many adaptations, these organs have been sensitive to light and their evolution shaped by it. The eye is an example of co-adaptive evolution, whereby many small changes in the many different elements that compose the eye have accumulated over millions of years to produce these complex organs; this has happened independently many times, somewhere in the region of 40 to 65 times in different groups of vertebrates (Gehring, 2004). Tracing back through the fossil record, the beginnings of the evolution of vision appear to lie in the Cambrian period, approximately 530 million years ago (Ridley, 2004). Only 20 million years before, in the Precambrian, most life was neither large nor mobile and therefore had little need for well-developed vision. The relatively sudden appearance of a large number of life forms, all with some form of light perception, is referred to as the Cambrian explosion, and it is from this point that modern simple and compound eyes have developed. This explosion gave the world all modern phyla, although, unlike simple arthropod eyes, simple vertebrate eyes cannot be traced back this far. Though the early chordates from which vertebrates evolved did exist in the early Cambrian, they are believed not to have possessed photoreceptors of any type. The first examples of chordates with eyes do not appear until almost 30 million years later (Ridley, 2004).


Light is the medium through which we see, and on the whole it travels in straight lines, allowing eyes to use it to locate objects in the world and gain an accurate impression of their form. Light also interacts with the materials it is reflected from, which changes the mix of wavelengths reaching the eye and therefore allows eyes to perceive the colour of the world. Light was originally thought to be composed of particles travelling in straight rays. However, experiments in the early 1800s by Thomas Young using his double-slit method showed that light has the properties of a substance moving in waves. This discovery was refined over the next hundred years or so, and in 1888 Heinrich Hertz confirmed that light was part of the electromagnetic spectrum. Still there were problems with wave theory, a major one being the photoelectric effect, in which light produces electrons when it strikes metal, since this would require some form of discrete particle or packet of energy as opposed to a continuous wave. In 1905 Albert Einstein proposed a unification of both the particle and wave theories of light, the quantum theory of light. However, on the whole the nature of light at the quantum level is not relevant to a discussion of the evolution of the various types of eyes (Land & Nilsson, 2002). The quality of vision is highly dependent upon the amount of light available to the eye. One would think that with the large number of photons available during daylight, somewhere in the range of 10^20 photons per second per square metre (Land & Nilsson, 2002), there should be no problem with delivering enough photons to the photoreceptors to resolve an image. However, once the incredibly small size of a photon is taken into account, we can begin to see some of the difficulties faced by the evolution of eyes.
The absolute threshold for human vision is approximately 10^10 photons per second per square metre, and this gives us the range of usable human vision (Land & Nilsson, 2002). However, the simple eyes humans have developed do not in practice perceive this entire range of light intensity but only a smaller section of it; our visual system must therefore be able to shift the centre of its intensity-detection range, allowing us to perceive the world around us at a number of differing light intensities, from full daylight to very dim twilight.

There are three main properties of light that have influenced the evolution of eyes: contrast, wavelength and polarisation. Of the three, only the first two are relevant to human eyes, but the detection of polarisation is found in many other species of the animal kingdom. Contrast allows us to determine the shape of objects by using the changes in light intensity from different parts of the same object. As eyes must be able to identify objects consistently in varying light conditions, the absolute intensity level of light in the scene must be removed when the image is processed by the brain or nervous system. The range of wavelengths perceivable by members of the animal kingdom does not seem to vary by very much, although some animals can see wavelengths below or above what humans can perceive (Figure 1). Everything in the world around us reflects the different wavelengths of light radiation to varying degrees, and it is the combination of these colourless wavelengths in different intensities that forms the colour of an object after visual processing. These colours are purely subjective to each individual: though you may process the combination of wavelengths from a blue object differently to someone else, as long as your processing is consistent across all the varying intensities of blue, how would you know, or even be able to demonstrate, that your processing is different? The answer, of course, is that you could not.

Figure 1: The visible spectrum of light for the simple human eye lies between 400 and 700 nm, although for some other animals the range extends down to 320 nm, into the ultraviolet, and certain butterflies can perceive light slightly into the infrared, up to 740 nm. (Pajari, 2008)

Colour vision requires the use of at least two visual pigments capable of detecting differing wavelengths of light in order to allow us to remove the light intensity component from a scene and focus on the wavelengths of light that are reaching the eye. An alternative solution would be to have colour filters that alter the wavelengths of light that arrive at one visual pigment, however this alternative is only actually observed in some birds and lizards that also possess more than one visual pigment, therefore the colour filters must serve some other purpose (Land & Nilsson, 2002). The pigment molecule is composed of one of four chromophores, molecules very similar to vitamin A, held in an opsin protein. These chromophores undergo a conformational change when they detect and capture a photon, which causes a hyperpolarisation of the cell and a subsequent signal, and the wavelength of light detectable is determined by a combination of the type of chromophore and opsin (Smith, 2000).

Polarisation, as mentioned above, is not detectable by humans but is an important property of light that is utilised by other species. In both invertebrates and vertebrates the visual pigment molecule is present in the photoreceptor membrane of the visual cells, and the electric field of a photon lies across a specific plane; a photon will therefore only excite a pigment molecule if the excitable element of the chromophore lies in the same plane. In vertebrates the axes of pigment molecules are randomly oriented, but that is not the case in the eyes of invertebrates. Each of the subunits in a compound eye has its own lens and several long visual cells arranged in a star pattern. The light-sensitive parts of the visual cells are the microvilli, an array of tube-like membranes where the pigment molecule is located. Invertebrate eyes have polarisation sensitivity because the pigment molecules are aligned parallel to the axes of the microvilli tubes; each visual cell is therefore maximally sensitive to light polarised parallel to its microvilli. While polarisation gives an extra visual channel in addition to the intensity and wavelength of light, it can also be confusing. The polarisation of the light reflected by a flower depends on the position of the sun and on the inclination of the flower. For these reasons the polarisation sensitivity of invertebrates such as bees, which use it for navigation, is restricted to the dorsal, upwards-looking portion of their eyes (Smith, 2000).

Optical Problems and Solutions

The elements required for an eye to successfully resolve an image are the same as those required by a camera, hence the popular use of the term camera-type eye to describe the human eye and similar eyes such as that of the octopus. The lens or mirror in the eye fulfils the same purpose as in a camera, differing only in the substrates used for its construction. Figure 2 shows the various types of simple and compound eye and the arrangements of their lenses and mirrors in relation to their photoreceptors. The mirror-based, as opposed to lens-based, system of eye seen in the scallop in section G of Figure 2 is not very common. These biological mirrors are constructed from layers of biological materials with differing refractive indices, which creates multilayer interference and reflection. In the simple eyes of cats, and in some compound eyes of crustaceans and insects, a reflective tapetum is used to increase the amount of light the photoreceptors receive by doubling the path of light through the retina (Warrant, 2004). In apposition compound eyes each of the lenses forms a very small image (section E, Figure 2), whereas in superposition compound eyes the multiple lenses work together to form a single image (sections F and H, Figure 2). The compound eye is composed of units called ommatidia, and each ommatidium contains a cluster of photoreceptors called the rhabdom, bordered by support cells and pigment cells.

Figure 2: Even though the diversity of eyes in the animal kingdom is very large the laws of optics have limited solutions for collecting and focusing light to only eight types of eyes. Simple single chambered eyes and compound eyes form images using shadows (A and B), refraction (C to F), or reflection (G and H). Light rays are shown in blue and photoreceptors are shaded. The simple pit eye (A) (chambered nautilus) led to the lensed eyes in fish and cephalopods (C) (octopus) and terrestrial animals (D) (red-tailed hawk). Scallop eyes (G) (bay scallop) are simple but use concave mirror optics to produce an image. The simplest compound eye (B) (sea fan) found in bivalve molluscs led to the apposition compound eye (E) (dragonfly) found in bees, crabs, and fruit flies; the refracting superposition compound eye (F) (Antarctic krill) of moths and krill; and the reflecting superposition eye (H) (lobster) found in decapod shrimps and lobsters. (Fernald, 2006)

All eyes in their many different forms are capable of measuring light intensity. An important advancement in the evolution of eyes came about with the development of spatial vision: the ability to discriminate between the intensity of light coming from different directions, therefore allowing the eye to visualise an image. Spatial vision can be simply put into practice with only two light receptors, each detecting light from a different direction. This system can then be scaled up to improve the resolution of the image being visualised; this of course assumes that the animal is able to discriminate between the information coming from each light receptive cell. This newly evolving eye must also fulfil some other criteria in order to successfully resolve its image: it must possess a method of providing the directionality of the light coming onto the photoreceptors. This can be achieved by placing a layer of dark pigment in front of the photoreceptors and it is how this is implemented in different species that gave rise to the evolution of simple and compound eyes. Simple or single chambered eyes have a single pigment shield for multiple photoreceptors whereas compound eyes have multiple pigment shields for multiple photoreceptors; they are essentially a collection of very simple single chambered eyes. The pigment shield method for determining directionality of light does have its limits as to how many photoreceptors it can provide directionality for (Land & Nilsson, 2002). Eventually more advanced methods will be needed for focusing light onto the photoreceptive elements and it is this development that has made simple eyes superior to compound eyes.

How effective an eye is at conveying visual information depends upon two features: sensitivity and resolution. Sensitivity refers to the ability of an eye to receive enough photons at its photoreceptors, and resolution refers to the granularity with which an image can be split, which depends upon the number and size of the photoreceptors. Problems with resolving an image usually arise from limitations in one or both of these features. For example, an eye may not be capable of bringing rays of light to a single point of focus, a defect referred to as aberration, or the light itself may be scattered by some imperfection in the eye. The quality with which light is focused upon the photoreceptors will determine their size and number, as there is no evolutionary advantage in having a much higher resolution than sensitivity, or vice versa. The qualities of light discussed here act to determine the nature of the eye: the wave-like properties of light create a fundamental limit to photoreceptor size and image quality. As a result, a photoreceptive element cannot be smaller than the wavelength of light, as light would be unable to interact with it. The quantal nature of light, the fact that it travels in discrete and undividable packets called photons, affects the certainty with which light can be detected.

As the resolution of the eye is determined by the fineness of the photoreceptors, it is possible to use their size to determine the finest grating of light and dark bars that can be resolved by the eye, since the distance between the centres of two adjacent light and dark bars will be twice the width of a photoreceptor. This is useful as it allows us to measure the point at which an eye can no longer resolve an image due to constraints of resolution. In simple eyes the lens always has a nodal point, through which light rays pass without having their path altered by the lens. Knowing this, we can determine the size of an object's image on the retina. Let us say that the height of the object is O and the distance from the object to the nodal point is U; these two measurements give us an angle at the nodal point of α, so α = O/U radians. The distance from the nodal point to the photoreceptors in the eye is f (the focal length) and I is the size of the image of the object on the photoreceptor layer (retina). We can therefore construct the equation O/U = α = I/f, and using this we can determine the accuracy with which an eye can resolve an image by substituting for I the distance between the centres of two adjacent photoreceptors, s, giving the minimum angular width δ of a bar on a grating of light and dark bars that can be resolved: δ = s/f (Land & Nilsson, 2002). We can now determine that the smallest resolvable grating would have a period of 2s, which can be expressed as the angle 2s/f. The reciprocal of this (we must use the reciprocal of the period because frequency increases as resolution improves, while the period decreases) is f/(2s), and this gives us the spatial frequency of the eye: νs = f/(2s) = 1/(2δ) (Land & Nilsson, 2002). The spatial frequency can therefore be improved by reducing the separation between photoreceptors, s, or by increasing the focal length, f.
Increasing the focal length requires increasing the size of the eye itself and you therefore reach a natural limit determined by the size of the organism, and reducing the separation between photoreceptors below 2µm is not possible due to diffraction, which is the next optical problem to be discussed.
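These resolution relationships can be sketched numerically. A minimal Python illustration follows; the values of s and f used are assumed, roughly human-like figures for illustration only, not numbers from the text:

```python
def spatial_resolution(s, f):
    """Finest resolvable detail for a simple eye.

    s: centre-to-centre photoreceptor separation (micrometres)
    f: focal length, nodal point to retina (micrometres)
    Returns (delta, nu_s): the minimum resolvable bar width in
    radians (delta = s/f) and the spatial frequency in cycles per
    radian (nu_s = f/(2s) = 1/(2*delta)).
    """
    delta = s / f
    nu_s = 1.0 / (2.0 * delta)
    return delta, nu_s

# Assumed illustrative values: cone separation ~2.5 um, focal
# length ~17,000 um (these are not quoted in the essay).
delta, nu_s = spatial_resolution(2.5, 17000.0)
```

Halving s, or doubling f, doubles νs, which is exactly the trade-off described above: finer receptor packing or a longer eye both sharpen vision, each with its own physical limit.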

The world can be viewed as being constructed from a vast number of gratings covering the total range of spatial frequencies of the eye. However, as the spatial frequency becomes finer and finer eventually a point is reached where there is no contrast in the image for the eye to detect and this is known as the optical cut-off frequency and is a major factor in determining the finest detail that can be resolved. This limit is determined by the diffraction of light when it hits a lens, as the wavefront of the light hits the lens the central portion of the wavefront takes longer to pass through the lens than the portions of the wavefront at the edges of the lens due to the fact that it is passing through more of the lens. This causes the wavefront to become a curve that is centred on the point of focus of the lens. When this altered wavefront reaches the point of focus parts of the wavefront that are in phase with one another will reinforce each other and those parts that are out of phase will cancel each other out. Therefore the image of a point source of light on the retina is not a point but a diffraction pattern caused by the interference of all the light that enters the aperture of the eye and this pattern is called the Airy disc. A good way of measuring the Airy disc is to take its width at half of its maximum intensity, i.e. the angular half-width. This is given in radians by λ/A, where λ is the wavelength of light and A is the diameter of the aperture. For a human eye in daylight A is approximately 2.5mm, which gives us an Airy disc half-width of 0.0002 radians (Land & Nilsson, 2002). The solution for diffraction would seem to be having larger eyes as this would increase the size of the aperture therefore reducing the size of the Airy disc and improving resolution. 
In apposition compound eyes each rhabdom must be in line with the lens that focuses its individual image, and the image seen by the brain is composed from the average intensity of a small section of the visual field. Since these individual lenses are so very small, diffraction becomes a major issue. For example, each lens in the bee's compound eye has a diameter A of only 25 μm, giving an Airy disc half-width of 0.02 radians, which demonstrates the significantly lower resolving power of the compound eye versus the simple eye. This problem can be mitigated by having a local region of the apposition compound eye with larger lenses to improve resolution. The resolution of superposition compound eyes can reach the limit of what is possible under diffraction, because the image seen is constructed from many small focused elements smaller than the half-width of the Airy diffraction disc.
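The λ/A half-width figures quoted for the human and bee eyes can be checked with a short sketch. The apertures are from the text; the 500 nm wavelength is an assumed mid-spectrum value:

```python
def airy_half_width(wavelength, aperture):
    """Angular half-width of the Airy disc in radians, lambda / A.
    Both arguments in metres."""
    return wavelength / aperture

# Human daylight pupil ~2.5 mm versus a single bee facet lens of
# 25 um, both with 500 nm light (the wavelength is an assumption).
human = airy_half_width(500e-9, 2.5e-3)  # ~0.0002 rad, as in the text
bee = airy_half_width(500e-9, 25e-6)     # ~0.02 rad, as in the text
```

The hundred-fold difference in aperture translates directly into a hundred-fold coarser diffraction pattern for the bee's individual facet, which is why lens size is so punishing for apposition compound eyes.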

The diffraction of light limits the finest detail that an eye can resolve, but there are other optical defects that affect its resolving power. Spherical aberration is caused by the fact that a spherical surface will not focus all the light rays that strike it to the same point. Light that strikes the edges of the spherical surface will be focused in front of the focal point of light that strikes near its centre, blurring the image being resolved. The solutions to this problem are either creating a hyperbolic, non-spherical optical surface, or altering the refractive index of the optical element from low at the edges to high in the centre. The lenses in fish eyes alter their refractive index in this way, making the lens optically non-homogeneous (Shand, Døving, & Collin, 1999). The human eye utilises both of these correction methods, having a hyperbolic cornea and a lens that is optically non-homogeneous. Chromatic aberration occurs because the different wavelengths of light are refracted to differing degrees by a lens; specifically, red light is refracted less than blue light. Solutions to this problem are not as well developed as those to spherical aberration: certain fish have lenses with multiple focal lengths, in an attempt to provide the photoreceptors with an image that is in focus for at least some of the light that reaches them. The human solution is cruder, in that we have narrowed the wavelengths of light used for high-acuity vision to reduce the blue end of the spectrum. Aberrations are very dependent upon the size of the eye, because blurring increases as the focal length of the eye increases. However, increasing the focal length will also increase the resolution of the eye, since δ = s/f: as the focal length rises, δ falls.
We can therefore see how important it is for the eye to strike a compromise between the defects caused by aberration and the benefits of reducing δ. In eyes with very short focal lengths of about 100 µm, such as the apposition compound eye of the dragonfly (section E of Figure 2), aberration defects are unnoticeable; the focal length of the eye must be much greater for the effects of aberration to become apparent. Aberration defects are not found in superposition compound eyes for the same reasons (Land & Nilsson, 2002).

The photoreceptors in the eye are crucial as the means of transmitting an optical image to the brain or nervous system. Having discussed diffraction, one can now see why photoreceptors must be neither larger nor smaller than the finest grating the optics of the eye can resolve onto them. If they are too small they will not be able to measure the intensity of a line accurately, and if they are too large they will detect the average intensity of several lines. The obvious solution would be to reduce the size of the photoreceptors, but they have a natural limit: they cannot be smaller than the wavelength of light itself, because if they were they would not be able to retain the light they receive by total internal reflection. The resulting signal crossover with adjacent photoreceptors would reduce resolution, so a photoreceptor cannot be smaller than about 1 µm across. The length of the photoreceptor is also important, as it determines the amount of light absorbed by the pigment molecules within it. In the vertebrate eye a photoreceptor 77 µm in length will absorb 90% of the available photons passing through it, so we can see how the length of the photoreceptors plays an important role in the sensitivity of the eye to light (Land & Nilsson, 2002).
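The 90%-at-77-µm figure implies an absorption coefficient if one assumes simple exponential (Beer-Lambert) attenuation along the photoreceptor, an assumption not stated in the text but standard for light absorption in a pigmented medium:

```python
import math

def fraction_absorbed(length_um, k):
    """Fraction of incident photons absorbed along a photoreceptor
    of the given length (micrometres), assuming exponential
    (Beer-Lambert) absorption with coefficient k per micrometre."""
    return 1.0 - math.exp(-k * length_um)

# Coefficient chosen (an assumption) so that a 77 um vertebrate
# photoreceptor absorbs 90% of photons, matching the text's figure.
k = -math.log(0.10) / 77.0  # about 0.03 per micrometre
halved = fraction_absorbed(38.5, k)  # a receptor half as long
```

Note that halving the length does not halve the absorption: a 38.5 µm receptor under this assumption still captures about 68% of photons, because each successive micrometre of receptor sees fewer remaining photons to absorb.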

The sensitivity of an eye is controlled by altering the size of its aperture A and the angle over which each photoreceptor receives photons, given by d/f, where d is the diameter of the photoreceptor and f is the focal length of the eye. Increasing A will increase sensitivity until you reach the limit of the aperture, after which the only way of increasing sensitivity further is to increase the size of the eye, thereby increasing the focal length, photoreceptor diameter and aperture simultaneously. Increasing the diameter of the receptors will unfortunately reduce resolution, but a trade-off has been reached in simple vertebrate eyes, where the inputs from multiple adjacent photoreceptors are pooled in low light, effectively increasing the size of the photoreceptor. The pupil in vertebrate simple eyes is very useful for altering the aperture of the eye depending on light conditions, increasing or reducing sensitivity; reducing the size of the aperture with the pupil also reduces aberration defects but increases diffraction defects (Warrant, 2004). Methods of adapting to light and dark in apposition compound eyes are varied: a variable pupil that reduces the number of rhabdoms receiving light; a longitudinal pupil containing large numbers of pigment molecules that move towards or away from the rhabdoms to decrease or increase light sensitivity; altering the shape and size of the rhabdoms; and altering the focal length of the lens to focus more or less light onto the rhabdom. Superposition eyes are more sensitive to light by a factor of almost a hundred, because each rhabdom has multiple lenses focusing light onto it; this effective increase in the size of the aperture increases light sensitivity correspondingly. The very high light sensitivity of the superposition eye means that it must have some form of adaptation to protect it from very bright light such as daylight.
Pigment cells move from in between the individual lenses towards the rhabdoms. This has the effect of preventing light from multiple lenses from reaching single rhabdoms and therefore reducing light sensitivity (Land & Nilsson, 2002).
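The factor-of-a-hundred sensitivity gain follows from the standard optical-sensitivity scaling, in which light capture is proportional to (A·d/f)², i.e. aperture area times the solid angle each receptor views. The sketch below uses this scaling with constant factors and absorption omitted, and an assumed ten-fold effective-aperture advantage for the superposition eye:

```python
def relative_sensitivity(A, d, f):
    """Relative light-gathering capacity of an eye, proportional
    to (A * d / f) ** 2: aperture area times the solid angle viewed
    by each photoreceptor. Constants and absorption are omitted."""
    return (A * d / f) ** 2

# A superposition eye pooling light through many facets behaves
# like an apposition eye with a ~10x larger effective aperture
# (an assumed figure), yielding the ~100x sensitivity gain the
# text describes.
apposition = relative_sensitivity(1.0, 1.0, 10.0)
superposition = relative_sensitivity(10.0, 1.0, 10.0)
```

Because sensitivity goes with the square of the effective aperture, even modest pigment migration between the facet lenses, which cuts the number of facets feeding each rhabdom, produces a large drop in light capture, which is exactly the daylight-protection mechanism described above.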


The solutions to the optical problems presented to an eye that resolves images in daylight vary across the differing types of simple and compound eyes. However varied the solutions, the limits posed by the wavelength of light and its quantal nature are always the same, and they confer a certain commonality in how these problems must be solved. Although the nature of light has presented many challenges for the evolution of eyes, it has made understanding the methods the eye uses to solve them relatively simple. The physical characteristics of light are well understood and characterised, meaning we can be confident that our understanding of how the eye works is correct.