Why do many humans have bad eyesight, such as near-sightedness, which hampers performance in a wide variety of tasks? Shouldn't there be evolutionary pressure towards better eyesight?
Remi.b and Potterbond007 have put forward excellent answers. I would like to add something.
One reason that myopia (near-sightedness) develops is sustained close work, such as reading, which requires focusing on nearby objects. There was probably far less close work in the past, so myopia may have been rare (and, of course, those who had it may have been at a disadvantage when catching prey).
Another common condition is hypermetropia (far-sightedness), which generally develops in old age, past the reproductive prime, so it could not have had much evolutionary impact.
With nearly 40,000 years of natural selection on eyesight prior to the invention of eyeglasses, there needs to be a better explanation than the effects of 'modern' technology. There are two likely reasons for the persistence of "poor" eyesight in humans. First, humans are social animals and live in groups. Within groups there is frequently a division of labor, and this division of labor de-emphasized the importance of vision: individuals with "poor" eyesight could easily maintain high fitness (reproductive success) in this social structure. Second, humans have long-term parental care (both within and among families, again a function of social living). Most human offspring maintain close familial connections up to and including the age of reproduction. The strength of selection on eyesight diminishes past reproductive age, allowing individuals with poor eyesight to have relatively high fitness.
Maintenance of Polymorphism and Mutation Load
There are many possible reasons why some deleterious alleles are maintained in a population. One of them is the mutation-selection-drift balance. In short: because mutations always occur, there is a continual input of deleterious mutations into a population's genomes, resulting in some fitness decay (called the mutation load). While mutation creates this polymorphism, selection and genetic drift decrease it. At equilibrium there is a balance between mutation, selection and drift, called the mutation-selection-drift balance. You may also want to consider population-structure concepts such as migration between patches. Haldane first explored this mutation load theoretically and estimated that the fitness of a typical individual in a population is about 80% lower than that of a hypothetical individual free of deleterious mutations, which is huge! Such a mechanism can explain various diseases found in a population.
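As a rough numerical sketch of these ideas (the parameter values below are illustrative assumptions, not figures from the text), the classic deterministic equilibrium frequencies and a Haldane-style load can be computed directly:

```python
import math

# Mutation-selection balance sketch. For a deleterious allele with
# per-generation mutation rate mu and selection coefficient s, the
# classic deterministic equilibrium frequencies are:
#   q_hat ~ mu / s        (deleterious effect expressed in heterozygotes)
#   q_hat ~ sqrt(mu / s)  (fully recessive effect)
mu, s = 1e-5, 0.01  # illustrative values
q_dominant = mu / s
q_recessive = math.sqrt(mu / s)

# Haldane-style mutation load: with a genome-wide deleterious mutation
# rate U per generation, mean fitness relative to a mutation-free
# genotype is roughly exp(-U). U ~ 1.6 reproduces the ~80% figure.
U = 1.6
relative_fitness = math.exp(-U)

print(round(q_dominant, 6))      # equilibrium frequency, ~0.001
print(round(q_recessive, 4))     # higher when the effect is recessive
print(round(relative_fitness, 2))  # ~0.2, i.e. ~80% lower fitness
```

The point of the sketch is only that a small, steady trickle of mutations can keep deleterious alleles (and a substantial load) in the population indefinitely, even under constant selection against them.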
Age-specific intensity of selection
Also, for diseases appearing late in life (often the case for eyesight-related diseases), it is important to realize that genes expressed late in life undergo weaker selection, resulting in a higher equilibrium frequency of deleterious alleles under the mutation-selection-drift balance. See this post for more info.
Impact of Medicine and social helping on the intensity of selection
As pointed out by @potterbond007, modern medicine can diminish the deleterious effect of some mutations. For example, someone born with clubfoot can be operated on and is then likely to survive, reproduce and therefore propagate their deleterious alleles. This would not have been the case in the past. Medicine therefore diminishes the selection pressure on deleterious alleles. However, modern medicine has probably had very little effect on our genomes, simply because it is so recent: 100 years ago we could not operate on clubfoot, and 500 years ago (almost) nobody had glasses to correct eyesight deficiencies. Going further back, social interactions such as helping weak individuals in a tribe may also have decreased the selection pressure against deleterious alleles (more info in @theBIOguy's answer). Of course, if selection is diminished, the equilibrium frequency under the mutation-selection-drift balance increases, deleterious alleles become more common, and the mutation load is greater.
Further investigations will be needed in Epidemiology
Assuming that eyesight diseases are particularly common and deleterious, we should take a closer look at the epidemiology of the diseases of interest. If eyesight-related diseases really are more common, and more important in terms of their effects on fitness, than diseases of other traits, then you may want to look into physiological and developmental explanations of why eyes are so likely to be affected by genetic and/or environmental factors. I personally have no knowledge of the epidemiology or developmental mechanisms of any eyesight disease! The impact of the modern environment should also be considered, such as the effect of TV and computer screens on our eyes (more info in @biogirl's answer).
Mainly because we haven't been forced to remove the bad genes that cause these defects from our gene pool. In ancient times, a gene causing poor eyesight would have made it difficult for a person to spot prey or escape from a predator. However, we now have glasses, which give the bad-eyesight gene as much chance to propagate as any other gene. It also doesn't really matter, since poor eyesight is rarely a factor for survival in present times. The same goes for poor hearing, which we correct with hearing aids. Please read this article in the Guardian and this article in the Orlando Sentinel. You could also read this discussion on Reddit for a few fun arguments.
Why Do So Many Humans Need Glasses?
When I was very young, I was given an assignment in school to write a report on the Peregrine Falcon. One interesting fact about this bird happens to be that it's quite fast: When the bird spots prey (sometimes from over a mile away), it can enter into a high-altitude dive, reaching speeds in excess of 200 mph, and snatch its prey out of midair (if you're interested in watching a video of such a hunt, you can check one out here). The Peregrine would be much less capable of achieving these tasks—both the location and capture of prey—if its vision were not particularly acute: Failures of eyesight can result in not spotting the prey in the first place, or failing to capture it if distances and movements aren't properly tracked. For this reason, I suspect (though am not positive) that you'll find very few Peregrines that have bad vision—their survival depends very heavily on seeing well. These birds would probably not be in need of corrective lenses, like the glasses and contacts that humans regularly rely upon in modern environments. This raises a rather interesting question: Why do so many humans wear glasses?
What I'm referring to in this case is not the general degradation of vision with age. As organisms age, all their biological systems should be expected to break down and fail with increasing regularity, and eyes are no exception. Crucially, all these systems should be expected to break down, more-or-less, at the same time. This is because there's little point in a body investing loads of metabolic resources into maintaining a completely healthy heart that will last for 100 years if the liver is going to shut down at 60. The whole body will die if the liver does, healthy heart (or eyes) included, so it would be adaptive to allocate those development resources differently. The mystery posed by frequently-poor human eyesight is appreciably different, as poor vision can develop early in life, often before puberty. When you observe apparent maladaptive development early in life like that, it requires another type of explanation.
So what might explain why human visual acuity appears so lackluster early in life (to the tune of over 20 percent of teenagers using corrective lenses)? There are a number of possible explanations we might entertain. The first of these is that visual acuity hasn't been terribly important to human populations for some time, meaning that having poor eyesight did not have an appreciable impact on people's ability to survive and reproduce. This strikes me as a rather implausible hypothesis on the face of it not only because vision seems rather important for navigating the world, but also because it ought to predict that having poor vision should be something of a species universal. While 20 percent of young people using corrective lenses is a lot, eyes (and the associated brain regions dedicated to vision) are costly organs to grow and maintain. If they truly weren't that important to have around, then we might expect that everyone needs glasses to see better, not just pockets of the population. Humans don't seem to resemble the troglobites that have lost their vision after living in caves away from sunlight for many generations.
Another possibility is that visual acuity has been important—it's adaptive to have good vision—but people's eyes fail to develop properly sometimes because of development insults, like infectious organisms. While this isn't implausible in principle—infectious agents have been known to disrupt development and result in blindness, deafness, and even death on the extreme end—the sheer numbers of people who need corrective lenses seem a bit high to be caused by some kind of infection. Further, the numbers of younger children and adults who need glasses appear to have been rising over time, which might seem strange as medical knowledge and technologies have been steadily improving. If the need for glasses is caused by some kind of infectious agent, we would need to have been unaware of its existence and not accidentally treated it with antibiotics or other such medications. Further, we might expect glasses to be associated with other signs of developmental stress, like bodily asymmetries, low IQ, or other such outcomes. If your immune system didn't fight off the bugs that harmed your eyes, it might not be good enough to fight off other development-disrupting infections. However, there seems to be a positive correlation between myopia and intelligence, which would be strange under a disease hypothesis.
A third possible explanation is that visual acuity is indeed important for humans, but our technologies have been relaxing the selection pressures that were keeping it sharp. In other words, since humans invented glasses and granted those who cannot see as well a crutch to overcome this issue, any reproductive disadvantage associated with poor vision was effectively removed. It's an interesting hypothesis that should predict people's eyesight in a population begins to get worse following the invention and/or proliferation of corrective lenses. So, if glasses were invented in Italy around 1300, that should have led to the Italian population's eyesight growing worse, followed by the eyesight of other cultures to which glasses spread but not beforehand. I don't know much about the history of vision across time in different cultures, but something tells me that pattern wouldn't show up if it could be assessed. In no small part, that intuition is driven by the relatively brief window of historical time between when glasses were invented, and subsequently refined, produced in sufficient numbers, distributed globally, and today. A window of only about 700 years for all of that to happen and reduce selection pressures for vision isn't a lot of time. Further, there seems to be evidence that myopia can develop rather rapidly in a population, sometimes within as little as a generation.
That's much too fast for a relaxation of selection pressures to be responsible for the change.
This brings us to the final hypothesis I wanted to cover today: an evolutionary mismatch hypothesis. In the event that modern environments differ in some key ways from the typical environments humans have faced ancestrally, it is possible that people will develop along an atypical path. In this case, the body is (metaphorically) expecting certain inputs during its development, and if they aren't received things can go poorly. As a for instance, it has been suggested that people develop allergies, in part, as a result of improved hygiene: Our immune systems are expecting a certain level of pathogen threat which, when not present, can result in our immune system attacking inappropriate targets, like pollen.
There does seem to be some promising evidence on this front for understanding human vision issues. A paper by Rose et al (2008) reports on myopia in two samples of similarly-aged Chinese children: 628 children living in Singapore and 124 living in Sydney. Of those living in Singapore, 29 percent appeared to display myopia, relative to only 3 percent of those living in Sydney. These dramatic differences in rates of myopia are all the stranger when you consider the rates of myopia in their parents were quite comparable. For the Sydney/Singapore samples, respectively, 32/29 percent of the children had no parent with myopia, 43/43 percent had one parent with myopia, and 25/28 percent had two parents with myopia. If myopia was simply the result of inherited genetic mutations, its frequencies between countries shouldn't be as different as they are, disqualifying hypotheses one and three from above.
When examining what behavioral correlates of myopia existed between countries, several were statistically—but not practically—significant, including number of books read and hours spent on computers or watching TV. The only appreciable behavioral difference between the two samples was the number of hours the children tended to spend outdoors. In Sydney, the children spent an average of about 14 hours a week outside, compared to a mere 3 hours in Singapore. It might be the case, then, that the human eye requires exposure to certain kinds of stimulation provided by outdoor activities to develop properly, and some novel aspects of modern culture (like spending lots of time indoors in a school when children are young) reduce such exposure (which might also explain the aforementioned IQ correlation: smarter children may be sent to school earlier). If that were true, we should expect that providing children with more time outdoors when they are young is preventative against myopia, which it actually seems to be.
It should always strike people as strange when key adaptive mechanisms appear to develop along an atypical path early in life that ultimately makes them worse at performing their function. An understanding of what types of biological explanations can account for these early maladaptive outcomes goes a long way in helping you understand where to begin your searches and what patterns of data to look out for.
Rose, K., Morgan, I., Smith, W., Burlutsky, G., Mitchell, P., & Saw, S. (2008). Myopia, lifestyle, and schooling in students of Chinese ethnicity in Singapore and Sydney. Archives of Ophthalmology, 126, 527-530.
Fear of Snakes Drove Pre-Human Evolution
An evolutionary arms race between early snakes and mammals triggered the development of improved vision and large brains in primates, a radical new theory suggests.
The idea, proposed by Lynne Isbell, an anthropologist at the University of California, Davis, suggests that snakes and primates share a long and intimate history, one that forced both groups to evolve new strategies as each attempted to gain the upper hand.
To avoid becoming snake food, early mammals had to develop ways to detect and avoid the reptiles before they could strike. Some animals evolved better snake sniffers, while others developed immunities to serpent venom when it evolved. Early primates developed a better eye for color, detail and movement and the ability to see in three dimensions—traits that are important for detecting threats at close range.
Humans are descended from those same primates.
Scientists had previously thought that these traits evolved together as primates used their hands and eyes to grab insects, or pick fruit or to swing through trees, but recent discoveries from neuroscience are casting doubt on these theories.
"Primates went a particular route," Isbell told LiveScience. "They focused on improving their vision to keep away from [snakes]. Other mammals couldn't do that. Primates had the pre-adaptations to go that way."
Harry Greene, an evolutionary biologist and snake expert at Cornell University in New York, says Isbell's new idea is very exciting.
"It strikes me as a very special piece of scholarship and I think it's going to provoke a lot of thought," Greene said.
Isbell's work is detailed in the July issue of the Journal of Human Evolution.
A new weapon
Fossil and DNA evidence suggests that snakes were already around when the first mammals evolved some 100 million years ago. The reptiles were thus among the first serious predators mammals faced. Today, the only other threats faced by primates are raptors, such as eagles and hawks, and large carnivores, such as bears, large cats and wolves, but these animals evolved long after snakes.
Furthermore, these other predators can be safely detected from a distance. For snakes, the opposite is true.
"If you see them close to you, you still have time to avoid them," Isbell said. "Primate vision is particularly good at close range."
Early snakes killed their prey using surprise attacks and by suffocating them to death—the method of boa constrictors. But the improved vision of primates, combined with other snake-coping strategies developed by other animals, forced snakes to evolve a new weapon: venom. This important milestone in snake evolution occurred about 60 million years ago.
"The [snakes] had to do something to get better at finding their prey, so that's where venom comes in," Isbell said. "The snakes upped the ante and then the primates had to respond by developing even better vision."
Once primates developed specialized vision and enlarged brains, these traits became useful for other purposes, such as social interactions in groups.
Seeing in 3D
Isbell's new theory could explain how a number of primate-defining traits evolved.
For example, primates are among the few animals whose eyes face forward (most animals have eyes located on the sides of their heads). This so-called "orbital convergence" improves depth perception and allows monkeys and apes, including humans, to see in three dimensions. Primates also have better color vision than most animals and are also unique in relying heavily on vision when reaching and grasping for objects.
One of the most popular ideas for explaining how these traits evolved is called the "visual predation hypothesis." It proposes that our early ancestors were small, insect eating mammals and that the need to stalk and grab insects at close range was the driving force behind the evolution of improved vision.
Another popular idea, called the "leaping hypothesis," argues that orbital convergence is not only important for 3D vision, but also for breaking through camouflage. Thus, it would have been useful not only for capturing insects and finding small fruits, but also for aiming at small, hard-to-see branches during mid-leaps through trees.
But there are problems with both hypotheses, Isbell says.
First, there is no solid evidence that early primates were committed insectivores. It's possible that like many primates today, they were generalists, eating a variety of plant foods, such as leaves, fruit and nectar, as well as insects.
More importantly, recent neuroscience studies do not support the idea that vision evolved alongside the ability to reach and grasp. Rather, the data suggest that the reaching-and-grasping abilities of primates actually evolved before they learned to leap and before they developed stereoscopic, or 3D, vision.
Agents of evolutionary change
Isbell thinks proto-primates—the early mammals that eventually evolved into primates—were in a better position than other mammals to evolve specialized vision and enlarged brains because of the foods they ate.
"They were eating foods high in sugar, and glucose is required for metabolizing energy," Isbell said. "Vision is a part of the brain, and messing with the brain takes a lot of energy so you're going to need a diet that allows you to do that."
Modern primates are among the most frugivorous, or "fruit-loving," of all mammals, and this trend might have started with the proto-primates. "Today there are primates that focus on leaves and things like that, but the earliest primates may have had a generalized diet that included fruits, nectar, flowers and insects," she said.
Thus, early primates not only had a good incentive for developing better vision, they might have already been eating the high-energy foods needed to do so.
Testing the theory
Isbell says her theory can be tested. For example, scientists could look at whether primates can visually detect snakes more quickly or more reliably than other mammals. Scientists could also examine whether there are differences in the snake-detecting abilities of primates from around the world.
"You could see whether there is any difference between Malagasy lemurs, South American primates and the African and Asian primates," Isbell said.
Anthropologists have tended to stress things like hunting to explain the special adaptations of primates, and particularly humans, said Greene, the Cornell snake expert, but scientists are starting to warm to the idea that predators likely played a large role in human evolution as well.
"Getting away from things is a big deal, too," Greene said in a telephone interview.
If snake and primate history are as intimately connected as Isbell suggests, then it might account for other things as well, Greene added.
"Snakes and people have had a long history; it goes back to long before we were people, in fact," he said. "That might sort of explain why we have such extreme attitudes towards snakes, varying from deification to 'ophidiphobia,' or fear of snakes."
How And Why Did The Human Eye Evolve?
Why do you think the human eye evolved? originally appeared on Quora: the place to gain and share knowledge, empowering people to learn from others and better understand the world.
Answer by C Stuart Hardwick, Award-Winning Scifi Author, on Quora:
Why do you think the human eye evolved?
Because humans didn’t need vision nearly as powerful as most other mammals.
What? You think our eyes are the peak of perfection? Not by a longshot.
Sure, we have three-color vision, which we evolved through a mutation and kept because it helps with hunting and gathering, but insects and birds have four-color vision that's better balanced and extended into the ultraviolet.
And we don’t have the tapetum lucidum that gives many predators such exceptional night vision—because it degrades daytime acuity and our ancestors hunted during the day.
And we don’t have the excellent long range vision of most raptors—because we didn’t need to be able to see rabbits eight miles away.
Nor can we see motion at the resolution of dogs (which is why dogs ignored analog TV, because their eyesight was good enough to see the flicker where we see constancy).
And then there’s our blind spot… an area about the size of a dollar coin held at arm's length is blind in each eye—and your brain just guesses the missing scenery, partly by making the eyes jiggle frequently to catch missing detail—which is part of why it’s so very hard for humans to do detailed close-up work.
Yeah, our eye evolved all right. Because we aren’t dogs, or eagles, or roaches, we’re toolmakers. We can see in anything from microwaves to infrared, in sounds and electron fields, we just have to build the right machinery. That’s our niche. The sweet spot accorded us by evolution.
In editing this, I just ran across a common deficit of human vision that you might notice. I added a line and caused part of the screen to bump up into the blind spots of both eyes, making it invisible. Since scanning the screen for such a small target has a low success rate (it's slow), I found it by deleting a few characters, causing my brain to lock onto the movement. This is why on most computers you can opt to make the cursor larger, but that has the drawback of continually drawing attention to it, even when not desired. Everything we do is a trade-off, because ours is not the optimized design of an architect, but the “sufficiency” provided by not having gone extinct.
The average number of colours we can distinguish is around a million
"You'd be hard-pressed to put a number on it," says Kimberly Jameson, an associate project scientist at the University of California, Irvine. "What might be possible with one person is only a fraction of the colours that another person sees."
Some people can see in ultraviolet, but only after eye surgery.
Jameson knows what she's talking about, given her work with "tetrachromats", people who possess apparent superhuman vision. These rare individuals, mostly women, have a genetic mutation granting them an extra, fourth cone cell. As a rough approximation based on the number of these extra cones, tetrachromats might see 100 million colours. (People who are colour-blind, or dichromats, have only two cones and see perhaps 10,000 colours.)
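These figures follow from a common back-of-the-envelope rule: assume roughly 100 distinguishable levels per cone channel and raise that to the number of independent channels. A quick sketch (the 100-levels-per-channel figure is the usual rough assumption, not a measured constant):

```python
# Rough combinatorial estimate of distinguishable colours:
# ~100 distinguishable levels per cone channel, raised to the
# number of independent cone types.
LEVELS_PER_CHANNEL = 100  # back-of-the-envelope assumption

def approx_colours(n_cone_types):
    """Crude estimate of distinguishable colours for n cone types."""
    return LEVELS_PER_CHANNEL ** n_cone_types

print(approx_colours(2))  # dichromat:    10000
print(approx_colours(3))  # trichromat:   1000000
print(approx_colours(4))  # tetrachromat: 100000000
```

This is why adding a single extra cone type multiplies, rather than adds to, the estimated colour gamut: each new channel provides an independent axis of discrimination.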
What's the smallest number of photons we need to see?
To yield colour vision, cone cells typically need a lot more light to work with than their cousins, the rods. That's why in low-light situations, colour diminishes as the monochromatic rods take over visual duties.
In ideal lab conditions and in places on the retina where rod cells are largely absent, cone cells can be activated when struck by only a handful of photons. Rod cells, though, do even better at picking up whatever ambient light is available. As experiments first conducted in the 1940s show, a single quantum of light can be enough to trigger our awareness. "People can respond to a single photon," says Brian Wandell, professor of psychology and electrical engineering at Stanford. "There is no point in being any more sensitive."
In 1941, Columbia University researchers led subjects into a darkened room and gave their eyes some time to adjust. Rod cells take several minutes to achieve full sensitivity – which is why we have trouble seeing when the lights first go out.
The researchers then flashed a blue-green light in front of the subjects’ faces. At a rate better than chance, participants could detect the flash when as few as 54 photons reached their eyes.
After compensating for the loss of photons through absorption by other components in the eye, researchers found that as few as five photons activating five separate rods triggered an awareness of light by the participants.
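The logic of that classic experiment can be sketched with Poisson statistics: photon absorption is probabilistic, so the chance of "seeing" a flash is the chance that at least a threshold number of photons is actually absorbed by rods. The 10% absorption fraction and threshold of 5 below are assumptions chosen to match the numbers in the text, not measured constants:

```python
import math

def poisson_sf(mean, k):
    """P(X >= k) for X ~ Poisson(mean)."""
    return 1.0 - sum(math.exp(-mean) * mean**i / math.factorial(i)
                     for i in range(k))

CORNEAL_PHOTONS = 54      # flash intensity detected better than chance
ABSORBED_FRACTION = 0.10  # assumed: losses in the eye leave ~10%

mean_absorbed = CORNEAL_PHOTONS * ABSORBED_FRACTION  # ~5.4 photons

THRESHOLD = 5  # assumed: detection needs >= 5 rods each catching a photon

p_seen = poisson_sf(mean_absorbed, THRESHOLD)
print(round(p_seen, 2))  # ~0.63: comfortably better than chance
```

The shape of this "frequency-of-seeing" curve as flash intensity varies is, in fact, how the original researchers inferred that only a handful of absorbed photons are required.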
What is the smallest and farthest we can see?
Now here’s a fact that may surprise you: There is no intrinsic limit to the smallest or farthest thing we can see. So long as an object of whatever size, distance or brevity transfers a photon to a retinal cell, we can spy it.
Visual acuity drops off over greater distances.
"All the eye cares about for vision is the amount of light that lands on the eye," says Landy. "It's just the total number of photons. So you can make [a light source] ridiculously tiny and ridiculously brief, but if it's really strong in photons, you can still see it."
Psychology textbooks, for instance, routinely state that on a clear, dark night, a candle flame can be spotted from as far away as 48 kilometres. In practice, of course, our eyes are routinely inundated by photons, so stray quanta of light from great distances get lost in the wash. "When you increase the background intensity, the amount of extra light you need to see something increases," says Landy.
Snake eyes: New insights into visual adaptations
Snakes have adapted their vision to hunt their prey day or night. For example, snakes that need good eyesight to hunt during the day have eye lenses that act as sunglasses, filtering out ultraviolet light and sharpening their vision, while nocturnal snakes have lenses that allow ultraviolet light through, helping them to see in the dark.
New insights into the relationship between ultraviolet (UV) filters and hunting methods in snakes are among the findings of the first major study of visual pigment genes and lenses in snakes -- published in the advanced online edition of Molecular Biology and Evolution.
The new research was an international collaboration between snake biologists and vision experts led by David Gower, and included fellow Natural History Museum researchers Bruno Simões and Filipa Sampaio. Much of the research, including most of the DNA analyses, was carried out in the Museum's laboratories.
Scientists have long known that snakes have highly variable sets of rods and cones -- the specialised cells in the retina that an animal uses to detect light. But until now, most modern studies of vision in vertebrates (animals with a backbone) have concentrated on mammals, birds and fish.
To see in different colors, animals use visual pigments in their rods and cones that are sensitive to different wavelengths of light. The researchers examined the genes involved in producing the pigments in a broad genomic survey of 69 different species of snakes. They found that as the genes vary from species to species, so do the exact molecular structures of the pigments and the wavelengths of light they absorb.
The new research discovered that most snakes possess three visual pigments and are likely dichromatic in daylight -- seeing two primary colours rather than the three that most humans see.
However, it also discovered that snake visual pigment genes have undergone a great amount of adaptation, including many changes to the wavelengths of light that the pigments are sensitive to, in order to suit the diversity of lifestyles that snakes have evolved.
Most snakes examined in the new study are sensitive to UV light, which likely allows them to see well in low light conditions. For light to reach the retina and be absorbed by the pigments, it first travels through the lens of the eye. Snakes with UV-sensitive visual pigments therefore have lenses that let UV light through.
In contrast, the research showed that those snakes that rely on their eyesight to hunt in the daytime, such as the gliding golden tree snake Chrysopelea ornata and the Montpellier snake Malpolon monspessulanus, have lenses that block UV light. As well as perhaps helping to protect their eyes from damage, this likely helps sharpen their sight -- in the same way that skiers' yellow goggles cut out some blue light and improve contrast.
Moreover, these snakes with UV-filtering lenses have tuned the pigments in their retina so that they are no longer sensitive to the short UV light, but absorb longer wavelengths.
All nocturnal species examined (such as North America's glossy snake Arizona elegans) were found to have lenses that do not filter UV. Some snake species active in daylight also lack a UV-filtering lens, perhaps because they are less reliant on very sharp vision or live in places without very bright light.
By analysing how the pigments have evolved in snakes, the new study concluded also that the most recent ancestor of all living snakes had UV sensitive vision. "The precise nature of the ancestral snake is contentious, but the evidence from vision is consistent with the idea that it was adapted to living in low light conditions on land," said corresponding author Gower.
Genetic Basis of Eye Color
The main gene that controls eye color is closely linked to the genes that influence skin color. It is believed that ancient human ancestors all had dark brown or nearly black eyes and very dark hair (hair color is controlled by genes linked to those for eye color and skin color). Even though brown is still largely dominant over other eye colors, several different eye colors are now readily seen in the global human population. So where did all of these eye colors come from?
While evidence is still being collected, most scientists agree that selection for lighter eye colors is linked to the relaxation of selection for darker skin tones. As human ancestors migrated to various places around the world, the pressure to maintain dark skin weakened. For populations that settled in what are now the Western European nations, dark skin and dark eyes were no longer necessary for survival: these much higher latitudes have pronounced seasons and far less direct sunlight than equatorial Africa. With the selection pressure relaxed, new mutations affecting pigmentation were more likely to persist and spread.
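The idea that relaxed selection lets variants persist by drift can be illustrated with a minimal Wright-Fisher simulation, a standard population-genetics sketch. The population size, generation count, starting frequency, and fitness cost below are arbitrary illustrative values, not estimates for any real pigmentation allele:

```python
import random

def wright_fisher(pop_size, generations, start_freq, fitness_cost, seed=1):
    """Track an allele's frequency in a haploid Wright-Fisher population.

    fitness_cost is the allele's selective disadvantage; 0.0 means
    neutral (relaxed selection), so only genetic drift acts on it.
    """
    rng = random.Random(seed)
    freq = start_freq
    for _ in range(generations):
        # Selection: the allele's effective weight shrinks by its cost.
        weighted = freq * (1.0 - fitness_cost)
        p = weighted / (weighted + (1.0 - freq))
        # Drift: binomial resampling to form the next generation.
        count = sum(1 for _ in range(pop_size) if rng.random() < p)
        freq = count / pop_size
        if freq in (0.0, 1.0):  # allele lost or fixed
            break
    return freq

# Under a 10% fitness cost the allele is usually purged quickly;
# with the cost relaxed to zero, drift alone can keep it around.
selected = wright_fisher(500, 200, 0.05, fitness_cost=0.10)
relaxed = wright_fisher(500, 200, 0.05, fitness_cost=0.0)
```

Averaged over many runs, the costly allele ends up rarer than the neutral one, which is the qualitative point made above.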
Genetically, eye color is somewhat complex. The color of human eyes is not dictated by a single gene like many other traits; it is instead a polygenic trait, meaning several genes on different chromosomes carry information about an individual's eye color. When expressed, these genes blend together to produce various shades. Relaxed selection for dark eye color also allowed more mutations to take hold, creating even more alleles in the gene pool that can combine to produce different eye colors.
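One way to picture a polygenic trait is as an additive dose of pigment summed over several loci. The sketch below uses hypothetical locus names, effect sizes, and shade cutoffs purely for illustration (real eye color involves a major OCA2/HERC2 effect plus modifiers, not these numbers):

```python
import random

# Hypothetical loci and their illustrative pigment contributions.
LOCI = {"locusA": 4, "locusB": 2, "locusC": 1, "locusD": 1}

def pigment_score(genotype):
    """Sum pigment contributions; each locus holds 0-2 'dark' alleles."""
    return sum(LOCI[locus] * copies for locus, copies in genotype.items())

def shade(score):
    """Map a pigment dose onto a coarse color bin (illustrative cutoffs)."""
    if score >= 10:
        return "brown"
    if score >= 5:
        return "hazel/green"
    return "blue/grey"

def random_child(rng):
    """Child of two heterozygous parents: 0-2 dark alleles per locus."""
    return {locus: rng.randint(0, 1) + rng.randint(0, 1) for locus in LOCI}

rng = random.Random(0)
child = random_child(rng)
child_shade = shade(pigment_score(child))
```

Because the loci blend additively, two brown-eyed parents can produce a lighter-eyed child whenever the inherited dose happens to fall below a cutoff, which matches the "blending" described above.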
Individuals who can trace their ancestry to Western European countries generally have lighter skin and lighter eye color than those from other parts of the world. Some of these individuals also carry segments of DNA very similar to those of the long-extinct Neanderthal lineage. Neanderthals are thought to have had lighter hair and eye colors than their Homo sapiens cousins.
We've seen that Pax6 from vertebrates and eyeless from flies are remarkably similar in sequence and function, but what about our other visionaries, the squid and the flatworm? Despite the major differences in their eyes, they all have genes similar to Pax6. Here are corresponding sections of the Pax6-like eye-building genes for our visionaries. Similarities to the mouse gene are highlighted in green:
But why are these genes so similar when the animals from which they come, and the eyes that they develop, are so different? As discussed earlier, there are two basic evolutionary explanations for similarities: homology and analogy. Are these genes homologous (i.e., were they passed down from the common ancestor of all these different organisms) or analogous (i.e., did they all evolve independently through convergent evolution)?
Based on the observations that all of these gene versions are remarkably similar in sequence, have related functions, and are incredibly widespread (animals all across the tree of life have them), scientists have concluded that they must be homologous and must have been inherited from the common ancestor of all these animals. It is just too unlikely that all these different animal lineages happened to independently evolve remarkably similar genes that do remarkably similar jobs. The most parsimonious explanation is that the gene evolved just once long ago and was then passed down to all these different modern animal lineages.