SCIENCE MATTERS: EVOLUTION OF EYESPOTS

Lepidopteran with eyespots in Costa Rica

By David Dannecker
Senior Editor

Readers of Prospect Journal’s Week of Photo Journals last week may have noticed a striking example of defensive eyespots in one of the photos in the Central America article. These visual patterns are a particular type of mimicry found in many species of butterflies and caterpillars, as well as a diverse array of other animals. They can be a handy disguise, allowing an animal to appear bigger or more dangerous than it really is, or letting a predator blend in with its prey. Sometimes the level of detail achieved in eyespots is astounding. Particularly convincing eyespots, like the pair in the cover photo, can have extra levels of detail – notice how each eyespot contains a small white pattern that mimics light reflecting off a cornea, making it look even more like a genuine eye. How and why might such a trait evolve? And why might these spots resemble eyes in the first place?

In order for a trait to be selected, it must be hereditary (able to be passed on to offspring), and it must confer some competitive benefit to the individual. In general, that benefit could be the ability to find a mate more easily, produce greater numbers of offspring, or locate food or other resources. In this particular case, eyespots often promote survival by repelling predators, which we will discuss in more detail later. For a trait like this to develop in the first place, there must have been some variable patterns of pigmentation to begin with. Random pattern variations are fairly common throughout nature, but why do these patterns look like eyes? Spots, speckles, rosettes, and other circular shapes are common throughout the animal kingdom – think of cheetahs, leopards, many kinds of fish, salamanders, seals, hyenas, and deer. Spotted patterns crop up all over the place, and while they don’t typically resemble eyes all that closely, it is possible to see how they might evolve gradually from generic spot to eyespot.

Zebra longwing caterpillar

Let’s consider a few different species of caterpillar to illustrate different degrees of eye mimicry in naturally occurring spots. Pictured here is the caterpillar of the zebra longwing butterfly (Heliconius charithonia). As you can see from its numerous spines, it has opted for an entirely different type of defensive strategy, but underneath that prickly shield is a smattering of spots that don’t look much like eyes at all. However, spots similar to these must have been the original foundation – the starting point – of many of the detailed eyespots that eventually evolved.

Elephant hawk-moth caterpillar

Pictured above is the caterpillar of the elephant hawk-moth (Deilephila elpenor). This is a good example of a caterpillar with spots much closer to true eyespots in size and position, but still rather unconvincing. The spots are certainly there, but they don’t look very much like eyes, and they are probably less likely to scare off a predator. However, suppose genetic variation gave some individuals in the population spots closer to the right shape, size, color, or position to resemble eyes, making the whole animal look more like a snake. Those spots would then start to be effective at repelling predators.

Lepidopteran with eyespots in Costa Rica

Like this guy, the caterpillar of the spicebush swallowtail butterfly (Papilio troilus). This species’ eyespots are very lifelike, and definitely give the impression that the caterpillar is both larger and potentially more dangerous than it actually is.

The selective pressure acting on the eyespotted animal here depends less on how the individual uses the trait and more on how the predator interacts with the potential prey. Typically, eyespots are exclusively visual traits. Although some kinds of butterflies and moths do flash them to momentarily surprise predators, it is unlikely that they understand, except by instinct, how a predator will react to them. Nevertheless, the act of displaying eyespots has been shown to startle predators and give the would-be prey an opportunity to escape.

Eyespots are theorized to repel predators through risk assessment on the part of the predator. Consider a scenario where you are walking along a trail and see something coiled up on the side of the path. It might be a snake, or it might be a pile of rope; you can’t see it well enough to tell which. If there is even a small possibility that it is a snake, it would generally be a bad idea to go and pick it up. Sure, it might be a harmless rope, but it also *might* be a deadly snake, and you have a lot more to lose than to gain if that’s the case. A similar phenomenon is at work with eyespots. Suppose you are a predator, such as a mantis, looking for a meal. When you see an eyespotted caterpillar, there is a chance that it is a harmless, tasty caterpillar, but also a chance that it is a well-defended snake that might do you harm. Really, the only way to find out would be to attack it – but if there are other prey options available, why on earth would you take that risk?
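The predator’s dilemma described above can be sketched as a simple expected-value comparison. This is only a toy model – the probabilities and payoff values below are illustrative assumptions, not measured quantities:

```python
# Toy expected-value model of a predator deciding whether to attack
# ambiguous prey. All numbers here are illustrative assumptions.

def expected_payoff(p_dangerous, gain_if_prey, loss_if_dangerous):
    """Expected payoff of attacking something that might be dangerous."""
    return (1 - p_dangerous) * gain_if_prey - p_dangerous * loss_if_dangerous

# A mantis weighing an eyespotted caterpillar that *might* be a snake:
# even a small chance of the worst case dominates the decision,
# because the potential loss far outweighs the potential meal.
attack_eyespotted = expected_payoff(p_dangerous=0.1, gain_if_prey=1.0,
                                    loss_if_dangerous=100.0)
attack_plain = expected_payoff(p_dangerous=0.0, gain_if_prey=1.0,
                               loss_if_dangerous=100.0)

print(attack_eyespotted)  # negative expected value: avoid
print(attack_plain)       # positive expected value: attack
```

The asymmetry between a small gain and a catastrophic loss is why even an imperfect eyespot can pay off: it only needs to raise the predator’s perceived probability of danger a little.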

With eyespotted butterflies, the disparity is even greater. If it is a butterfly, it would make a good meal, but if the eyespots are instead the face of a larger animal, you’d be walking or flying right into a trap. If the eyespots are flashed suddenly, the benefit of surprise is conferred as well, forcing the predator to pause and process the new stimulus and allowing the butterfly an extra moment to escape. A predator would be smart not to take the risk at all, especially if there are other, more clearly vulnerable prey options out there. The risk-avoidance behavior in predators discussed here is one of the selective forces that would allow spots that passively resembled eyes to become more common in the population. Individuals without eyespots would lack that defense and hence become more accessible prey for predators, while individuals with eyespots would benefit from the risk-avoidance behavior. Through further random variation and selective predation, the most effective eyespots would become more common over time. Since individuals without any protective eyespots are gradually weeded out of the population, some species have evolved to have eyespots as a universal characteristic.
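The selection dynamic in the paragraph above can be illustrated with a minimal simulation. The survival rates are made-up parameters chosen only to show the direction of the effect, not empirical values:

```python
# Minimal model of eyespots spreading through a population under
# selective predation. Survival rates are illustrative assumptions.

def next_generation(freq_eyespot, surv_eyespot=0.8, surv_plain=0.5):
    """One round of selection: eyespotted individuals survive predation
    more often, so their share of the breeding population grows."""
    survivors_eyespot = freq_eyespot * surv_eyespot
    survivors_plain = (1 - freq_eyespot) * surv_plain
    return survivors_eyespot / (survivors_eyespot + survivors_plain)

freq = 0.01  # eyespots start as a rare random variant
for generation in range(30):
    freq = next_generation(freq)

print(freq)  # approaches 1.0: eyespots become nearly universal
```

Even a modest survival advantage compounds generation after generation, which is how a once-rare pattern can end up as a universal characteristic of a species.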

There are species of flies that utilize a similar strategy of disguise. Many species of hoverfly in the family Syrphidae passively resemble bees for the same reason – why should a bird try to eat a fly that might be a stinging bee? It’s just not really worth the risk.


Conversely, some species of spiders mimic ants for the opposite reason: they want to look benign and blend in until the last possible moment, when they reveal they are actually dangerous. More of a “wolf-in-sheep’s-clothing” example. Amazingly, the organism pictured below is a spider, not an ant. Another spider disguises itself abdomen-first as an ant, complete with abdominal eyespots, which here are intended to make it look harmless rather than dangerous. This species of crab spider’s resemblance to its weaver ant prey is pretty incredible.


Among moths and butterflies, there are a few other examples of defensive patterns besides eyes. Any pattern could theoretically work as a defense as long as a predator perceives it as threatening. The main reason eye patterns evolve so often is that circles are a comparatively common shape in nature. More complicated shapes are less likely to arise through random variation, but they still can happen – as with this recently discovered moth, whose wing patterns strongly resemble a spider, complete with a hairy body and eight spindly legs.

Cover photo by David Dannecker, Prospect Senior Editor

Authors of additional images linked here, in order of appearance: First caterpillar by DeadEyeArrow; Second caterpillar by Richerman; Third caterpillar by Michael Hodge; Bee-mimicking fly by Bruce Marlin; First spider by Sean Hoyland; Link to photo gallery by Alexander Wild.

New to Prospect Journal’s Science Matters blog? Check out our introductory post to see what we’re about!

THE HUMAN BRAIN IS POORLY ADAPTED TO THE TECHNOLOGICAL AGE

Poster for Internet Addicts Anonymous

By Alexandra Reich
Staff Writer

A supernormal stimulus is a stimulus that elicits an unusually heightened response from an animal, and such stimuli can be readily observed in nature. Nikolaas Tinbergen first demonstrated their existence over the course of several experiments. He found that herring gulls, when presented with an artificial egg intentionally sized larger than anything a herring gull could possibly produce, would care for the fake egg instead of their own. When presented with two eggs in nature, herring gulls are likely motivated to care for the larger one in order to hatch a larger chick, which would have a better chance of survival. The gulls have not evolved to perceive an upper limit of preferred egg size, because eggs too large for them to care for are never produced in nature; they are unable to recognize the disadvantage of tending an egg that is artificially large. Tinbergen found supernormal stimuli in other animals as well, including certain kinds of fish and butterflies. Humans, like other animals, can be enraptured by disadvantageous supernormal stimuli.

In industrialized nations, one human equivalent of the supernormal stimulus is found in technological advances, which are extremely recent on an evolutionary timescale. Television, social media sites, and the wide availability of new information on the Internet tend to draw in users for hours every day. According to one study, American adults, on average, spend over five hours total on digital media per day.

Movies and television shows hook the brain through the human capacity for emotion. Essentially, movies and television shows are abstract or lifelike pixelated images moving across a screen, yet people are so allured by the artificial characters’ personalities and struggles that they react to the show with real emotions. A term coined by Jeffrey Zacks, the “Mirror Rule,” describes the human tendency to imitate the facial expression, and to some extent the emotion, of the person one is interacting with. Zacks argues that this rule extends to the characters in movies, invoking emotional responses from audience members. This concept may be applicable to human evolutionary behavior. Because television was introduced only recently relative to how long humans have existed, human brains remain wired primarily for face-to-face social interaction. Television shows or movies simulating human interaction that is more interesting or more desirable than what people normally encounter could serve as a supernormal stimulus, enticing the human brain and drawing the individual away from their less interesting non-televised life.

In moderation, the consumption of imagined situations or artificial worlds is not necessarily negative. However, prolonged exposure to the supernormal stimuli of virtual Internet worlds can result in addiction. Not only is Internet addiction a legitimate condition, it is estimated to affect six percent of the population worldwide. Similar to the herring gull that cannot resist the allure of the artificially large egg, a person with Internet addiction has an “impulse control problem” in which they prefer the ease of interacting via the Internet over face-to-face social interaction.

Online video games are a major source of Internet addiction. Theoretically, it makes sense that a video game, particularly a violent one, could act as a supernormal stimulus. Players have the heightened experience of dominance from ‘killing’ artificial enemies without actually risking their own lives, social standing, or a potential prison sentence in the process. The only thing at risk is the virtual progress of their electronic avatar. In addition to the dominance factor, Chatfield argues in his TED talk that successful games trigger dopamine release in the brain, exploiting humans’ evolved appreciation of rewards for effort and problem-solving.

Some countries have taken action against Internet addiction. In one extreme example, South Korea runs a summer camp intended to alleviate children’s dependence on the Internet. Jump Up Internet Rescue School is a tuition-free program offering participants directed physical exercise and other offline hobbies in order to show children that they can have fun outside the confines of the online world. Interventions like these have the potential to work well because they reduce the sometimes drastic gap between real-life stimuli and Internet stimuli by having children participate in a variety of activities. Internet addiction has also been treated in the U.S. with similar approaches.

Internet addiction is the result of the human brain’s lack of evolutionary adaptation to the supernormal stimuli presented by technology. Just as animals are unable to resist these enhanced stimuli, humans follow suit, using technology to the point of damaged physical health and social isolation. While advancements in technology have proven beneficial to society in a variety of fields, they also present real drawbacks.

Image by Michael Mandiberg