Did Neanderthals Start Fires?



It is one of the most iconic Christmas songs of all time.

Written by Bob Wells and Mel Torme in the summer of 1945, “The Christmas Song” (subtitled “Chestnuts Roasting on an Open Fire”) was crafted in less than an hour. As the story goes, Wells and Torme were trying to stay cool during the blistering summer heat by thinking cool thoughts and then jotting them down on paper. And, in the process, “The Christmas Song” was born.

Many of the song’s lyrics evoke images of winter, particularly around Christmastime. But none has come to exemplify the quiet peace of a Christmas evening more than the song’s first line, “Chestnuts roasting on an open fire . . . ”

Gathering around the fire to stay warm, to cook food, and to share in a community has been an integral part of the human experience throughout history—including human prehistory. Most certainly our ability to master fire played a role in our survival as a species and in our ability as human beings to occupy and thrive in some of the world’s coldest, harshest climates.

But fire use is not limited only to modern humans. There is strong evidence that Neanderthals made use of fire. But, did these creatures have control over fire in the same way we do? In other words, did Neanderthals master fire? Or, did they merely make opportunistic use of natural fires? These questions are hotly debated by anthropologists today and they contribute to a broader discussion about the cognitive capacity of Neanderthals. Part of that discussion includes whether these creatures were cognitively inferior to us or whether they were our intellectual equals.

In an attempt to answer these questions, a team of researchers from the Netherlands and France characterized the microwear patterns on bifacial tools (tools with opposite faces worked to form an edge) made from flint recovered from Neanderthal sites. They concluded that the wear patterns suggest these hominins repeatedly struck the flint with pyrite, a process that generates sparks that can be used to start fires.1 To put it another way, the researchers concluded that Neanderthals had mastery over fire because they knew how to start fires.


Figure 1: Biface tools for cutting or scraping. Image credit: Shutterstock

However, a closer examination of the evidence along with results of other studies, including recent insight into the cause of Neanderthal extinction, raises significant doubts about this conclusion.

What Do the Microwear Patterns on Flint Say?

The investigators focused on the microwear patterns of flint bifaces recovered from Neanderthal sites as a marker for fire mastery because of the well-known practice among hunter-gatherers and pastoralists of striking flint with pyrite (an iron disulfide mineral) to generate sparks to start fires. Presumably, the first modern humans also used this technique to start fires.


Figure 2: Starting a fire with pyrite and flint. Image credit: Shutterstock

The research team reasoned that if Neanderthals started fires, they would use a similar tactic. Careful examination of the microwear patterns on the bifaces led the research team to conclude that these tools were repeatedly struck by hard materials, with the strikes all occurring in the same direction along the bifaces’ long axis.

The researchers then tried to experimentally recreate the microwear pattern in a laboratory setting. To do so, they struck biface replicas with a number of different materials, including pyrite, and concluded that the patterns produced by the pyrite strikes most closely matched the patterns on the bifaces recovered from Neanderthal sites. On this basis, the researchers claim to have found evidence that Neanderthals deliberately started fires.

Did Neanderthals Master Fire?

While this conclusion is possible, at best this study provides circumstantial, not direct, evidence for Neanderthal mastery of fire. In fact, other evidence counts against this conclusion. For example, bifaces with the same type of microwear patterns have been found at other Neanderthal sites, locales that show no evidence of fire use. These bifaces would have had a range of uses, including butchery of the remains of dead animals. So, it is possible that these tools were never used to start fires—even at sites with evidence for fire usage.

Another challenge to the conclusion comes from the failure to detect any pyrite on the bifaces recovered from the Neanderthal sites. Flint recovered from modern human sites shows visible evidence of pyrite. And yet the research team failed to detect even trace amounts of pyrite on the Neanderthal bifaces during the course of their microanalysis.

This observation raises further doubt about whether the flint from the Neanderthal sites was used as a fire starter tool. Rather, it points to the possibility that Neanderthals struck the bifaces with materials other than pyrite for reasons not yet understood.

The conclusion that Neanderthals mastered fire also does not square with results from other studies. For example, a careful assessment of archaeological sites in southern France occupied by Neanderthals from about 100,000 to 40,000 years ago indicates that Neanderthals could not create fire. Instead, these hominins made opportunistic use of natural fire when it was available to them.2

These French sites do show clear evidence of Neanderthal fire use, but when researchers correlated the archaeological layers displaying evidence for fire use with the paleoclimate data, they found an unexpected pattern. Neanderthals used fire during warm climate conditions and failed to use fire during cold periods—the opposite of what would be predicted if Neanderthals had mastered fire.

Lightning strikes that would generate natural fires are much more likely to occur during warm periods. Instead of creating fire, Neanderthals most likely harnessed natural fire and tended it for as long as they could before it went out.

Another study also raises questions about the ability of Neanderthals to start fires.3 This research indicates that cold climates triggered Neanderthal extinctions. By studying the chemical composition of stalagmites in two Romanian caves, an international research team concluded that there were two prolonged and extremely cold periods between 44,000 and 40,000 years ago. (The chemical composition of stalagmites varies with temperature.)

The researchers also noted that during these cold periods, the archaeological record for Neanderthals disappears. They interpret this disappearance to reflect a dramatic reduction in Neanderthal population numbers. Researchers speculate that when this population downturn took place during the first cold period, modern humans made their way into Europe. Being better suited for survival in the cold climate, modern human numbers increased. When the cold climate abated, Neanderthals were unable to recover their numbers because of the growing populations of modern humans in Europe. Presumably, after the second cold period, Neanderthal numbers dropped to the point that they couldn’t recover, and hence, they became extinct.

But why would modern humans be more capable than Neanderthals of surviving under extremely cold conditions? It seems as if it should be the other way around. Neanderthals had a hyper-polar body design that made them ideally suited to withstand cold conditions. Neanderthal bodies were stout and compact, with barrel-shaped torsos and shorter limbs that helped them retain body heat. Their noses were long and their sinus cavities extensive, which helped them warm the cold air they breathed before it reached their lungs. But, despite this advantage, Neanderthals died out and modern humans thrived.

Some anthropologists believe that the survival discrepancy could be due to dietary differences. Some data indicates that modern humans had a more varied diet than Neanderthals. Neanderthals, presumably, primarily consumed large herbivores—animals that disappeared when the climatic conditions turned cold, thereby threatening Neanderthal survival. On the other hand, modern humans were able to adjust to the cold conditions by shifting their diets.

But could there be a different explanation? Could it be that with their mastery of fire, modern humans were able to survive cold conditions? And did Neanderthals die out because they could not start fires?

Taken in its entirety, the data seems to indicate that Neanderthals lacked mastery of fire but could use it opportunistically. And, in a broader context, the data indicates that Neanderthals were cognitively inferior to humans.

What Difference Does It Make?

One of the most important ideas taught in Scripture is that human beings uniquely bear God’s image. As such, every human being has immeasurable worth and value. And because we bear God’s image, we can enter into a relationship with our Maker.

However, if Neanderthals possessed advanced cognitive ability just like that of modern humans, then it becomes difficult to maintain the view that modern humans are unique and exceptional. If human beings aren’t exceptional, then it becomes a challenge to defend the idea that human beings are made in God’s image.

Yet, claims that Neanderthals are cognitive equals to modern humans fail to withstand scientific scrutiny, time and time again. Now it’s time to light a fire in my fireplace and enjoy a few contemplative moments thinking about the real meaning of Christmas.



  1. A. C. Sorensen, E. Claud, and M. Soressi, “Neanderthal Fire-Making Technology Inferred from Microwear Analysis,” Scientific Reports 8 (July 19, 2018): 10065, doi:10.1038/s41598-018-28342-9.
  2. Dennis M. Sandgathe et al., “Timing of the Appearance of Habitual Fire Use,” Proceedings of the National Academy of Sciences, USA 108 (July 19, 2011), E298, doi:10.1073/pnas.1106759108; Paul Goldberg et al., “New Evidence on Neandertal Use of Fire: Examples from Roc de Marsal and Pech de l’Azé IV,” Quaternary International 247 (2012): 325–40, doi:10.1016/j.quaint.2010.11.015; Dennis M. Sandgathe et al., “On the Role of Fire in Neandertal Adaptations in Western Europe: Evidence from Pech de l’Azé IV and Roc de Marsal, France,” PaleoAnthropology (2011): 216–42, doi:10.4207/PA.2011.ART54.
  3. Michael Staubwasser et al., “Impact of Climate Change on the Transition of Neanderthals to Modern Humans in Europe,” Proceedings of the National Academy of Sciences, USA 115 (September 11, 2018): 9116–21, doi:10.1073/pnas.1808647115.

Vocal Signals Smile on the Case for Human Exceptionalism



Before Thanksgiving each year, those of us who work at Reasons to Believe (RTB) headquarters take part in an annual custom. We put our work on pause and use that time to call donors, thanking them for supporting RTB’s mission. (It’s a tradition we have all come to love, by the way.)

Before we start making our calls, our ministry advancement team leads a staff meeting to organize our efforts. And each year at these meetings, they remind us to smile when we talk to donors. I always found this to be an odd piece of advice, but they insist that when we talk to people, our smiles come across over the phone.

Well, it turns out that the helpful advice of our ministry advancement team has scientific merit, based on a recent study from a team of neuroscientists and psychologists from France and the UK.1 This research highlights the importance of vocal signaling for communicating emotions between people. And from my perspective, the work also supports the notion of human exceptionalism and the biblical concept of the image of God.

We Can Hear Smiles

The research team was motivated to perform this study in order to learn the role vocal signaling plays in social cognition. They chose to focus on auditory “smiles,” because, as these researchers point out, smiles are among the most powerful facial expressions and one of the earliest to develop in children. As I am sure we all know, smiles express positive feelings and are contagious.

When we smile, our zygomaticus major muscle contracts bilaterally and causes our lips to stretch. This stretching alters the sounds of our voices. So, the question becomes: Can we hear other people when they smile?


Figure 1: Zygomaticus major. Image credit: Wikipedia

To determine if people can “hear” smiles, the researchers recorded actors who spoke a range of French phonemes, with and without smiling. Then, they modeled the changes in the spectral patterns that occurred in the actors’ voices when they smiled while they spoke.

The researchers used this model to manipulate recordings of spoken sentences so that they would sound like they were spoken by someone who was smiling (while keeping other features such as pitch, content, speed, gender, etc., unchanged). Then, they asked volunteers to rate the “smiley-ness” of voices before and after manipulation of the recordings. They found that the volunteers could distinguish the transformed phonemes from those that weren’t altered.

Next, they asked the volunteers to mimic the sounds of the “smiley” phonemes. The researchers noted that for the volunteers to do so, they had to smile.

Following these preliminary experiments, the researchers asked volunteers to describe their emotions when listening to transformed phonemes compared to those that weren’t transformed. They found that when volunteers heard the altered phonemes, they expressed a heightened sense of joy and irony.

Lastly, the researchers used electromyography to monitor the volunteers’ facial muscles so that they could detect smiling and frowning as the volunteers listened to a set of 60 sentences—some manipulated (to sound as if they were spoken by someone who was smiling) and some unaltered. They found that when the volunteers judged speech to be “smiley,” they were more likely to smile and less likely to frown.

In other words, people can detect auditory smiles and respond by mimicking them with smiles of their own.

Auditory Signaling and Human Exceptionalism

This research demonstrates that both the visual and auditory cues we receive from other people help us to understand their emotional state and to be influenced by it. Our ability to see and hear smiles helps us develop empathy toward others. Undoubtedly, this trait plays an important role in our ability to link our minds together and to form complex social structures—two characteristics that some anthropologists believe contribute to human exceptionalism.

The notion that human beings differ in degree, not kind, from other creatures has been a mainstay concept in anthropology and primatology for over 150 years. And it has been the primary reason why so many people have abandoned the belief that human beings bear God’s image.

Yet, this stalwart view in anthropology is losing its mooring, with the concept of human exceptionalism taking its place. A growing minority of anthropologists and primatologists now believe that human beings really are exceptional. They contend that human beings do, indeed, differ in kind, not merely degree, from other creatures—including Neanderthals. Ironically, the scientists who argue for this updated perspective have developed evidence for human exceptionalism in their attempts to understand how the human mind evolved. And, yet, these new insights can be used to marshal support for the biblical conception of humanity.

Anthropologists identify at least four interrelated qualities that make us exceptional: (1) symbolism, (2) open-ended generative capacity, (3) theory of mind, and (4) our capacity to form complex social networks.

Human beings effortlessly represent the world with discrete symbols and use those symbols to denote abstract concepts. Our ability to represent the world symbolically, and to combine and recombine those symbols in countless ways to create alternate possibilities, has interesting consequences. Human capacity for symbolism manifests in the form of language, art, music, and body ornamentation. And humans alone desire to communicate the scenarios we construct in our minds to other people.

But there is more to our interactions with other human beings than a desire to communicate. We want to link our minds together and we can do so because we possess a theory of mind. In other words, we recognize that other people have minds just like ours, allowing us to understand what others are thinking and feeling. We also possess the brain capacity to organize people we meet and know into hierarchical categories, allowing us to form and engage in complex social networks.

Thus, I would contend that our ability to hear people’s smiles plays a role in theory of mind and our sophisticated social capacities. It contributes to human exceptionalism.

In effect, these four qualities could be viewed as scientific descriptors of the image of God. In other words, evidence for human exceptionalism is evidence that human beings bear God’s image.

So, even though many people in the scientific community promote a view of humanity that denigrates the image of God, scientific evidence and everyday experience continually support the notion that we are unique and exceptional as human beings. It makes me grin from ear to ear to know that scientific investigations into our cognitive and behavioral capacities continue to affirm human exceptionalism and, with it, the image of God.

Indeed, we are the crown of creation. And that makes me thankful!



  1. Pablo Arias, Pascal Belin, and Jean-Julien Aucouturier, “Auditory Smiles Trigger Unconscious Facial Imitation,” Current Biology 28 (July 23, 2018): R782–R783, doi:10.1016/j.cub.2018.05.084.
Reprinted with permission by the author
Original article at:

When Did Modern Human Brains—and the Image of God—Appear?



When I was a kid, I enjoyed reading Ripley’s Believe It or Not! I couldn’t get enough of the bizarre facts described in the pages of this comic.

I was especially drawn to the panels depicting people who had oddly shaped heads. I found it fascinating to learn about people whose skulls were purposely forced into unnatural shapes by a practice known as intentional cranial deformation.

For the most part, this practice is a thing of the past. It is rarely performed today (though there are still a few people groups who carry out this procedure). But for much of human history, cultures all over the world have artificially deformed people’s crania (often for reasons yet to be fully understood). They accomplished this feat by binding the heads of infants, which distorts the normal growth of the skull. Through this practice, the shape of the human head can be readily altered to be abnormally flat, elongated, rounded, or conical.


Figure 1: Deformed ancient Peruvian skull. Image credit: Shutterstock.

It is remarkable that the human skull is so malleable. Believe it or not!


Figure 2: Parts of the human skull. Image credit: Shutterstock.

For physical anthropologists, the normal shape of the modern human skull is just as bizarre as the conical-shaped skulls found among the remains of the Nazca culture of Peru. Compared to other hominins (such as Neanderthals and Homo erectus), modern humans have oddly shaped skulls. The skulls of these hominins were elongated along the anterior-posterior axis, but the skull shape of modern humans is globular, with bulging and enlarged parietal and cerebellar areas. The modern human skull also has another distinctive feature: the face is retracted and relatively small.


Figure 3: Comparison of modern human and Neanderthal skulls. Image credit: Wikipedia.

Anthropologists believe that the difference in skull shape (and hence, brain shape) has profound significance and helps explain the advanced cognitive abilities of modern humans. The parietal lobe of the brain is responsible for:

  • Perception of stimuli
  • Sensorimotor transformation (which plays a role in planning)
  • Visuospatial integration (which provides hand-eye coordination needed for throwing spears and making art)
  • Imagery
  • Self-awareness
  • Working and long-term memory

Human beings seem to uniquely possess these capabilities. They make us exceptional compared to other hominins. Thus, for paleoanthropologists, two key questions are: when and how did the globular human skull appear?

Recently, a team of researchers from the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, addressed these questions. And their answers add evidence for human exceptionalism while unwittingly providing support for the RTB human origins model.1

The Appearance of the Modern Human Brain

To characterize the mode and tempo of the origin of the unusual morphology (shape) of the modern human skull, the German researchers generated and analyzed CT scans of 20 fossil specimens representing three windows of time: (1) 300,000 to 200,000 years ago; (2) 130,000 to 100,000 years ago; and (3) 35,000 to 10,000 years ago. They also included 89 cranially diverse skulls from present-day modern humans, 8 Neanderthal skulls, and 8 Homo erectus skulls in their analysis.

The first group consisted of three specimens: (1) Jebel Irhoud 1 (dating to 315,000 years in age); (2) Jebel Irhoud 2 (also dating to 315,000 years in age); and (3) Omo Kibish (dating to 195,000 years in age). The specimens that comprise this group are variously referred to as near anatomically modern humans or archaic Homo sapiens.

The second group consisted of four specimens: (1) LH 18 (dating to 120,000 years in age); (2) Skhul (dating to 115,000 years in age); and (3) Qafzeh 6 and (4) Qafzeh 9 (both dating to about 115,000 years in age). This group consists of specimens typically considered to be anatomically modern humans. The third group consisted of thirteen specimens that are all considered to be anatomically and behaviorally modern humans.

Researchers discovered that the group one specimens had facial features like that of modern humans. They also had brain sizes that were similar to Neanderthals and modern humans. But their endocranial shape was unlike that of modern humans and appeared to be intermediate between H. erectus and Neanderthals.

On the other hand, the specimens from group two displayed endocranial shapes that clustered with the group three specimens and the present-day samples. In short, modern human skull morphology (and brain shape) appeared between 130,000 and 100,000 years ago.

Confluence of Evidence Locates Humanity’s Origin

This result aligns with several recent archaeological finds that place the origin of symbolism in the same window of time represented by the group two specimens. (See the Resources section for articles detailing some of these finds.) Symbolism—the capacity to represent the world and abstract ideas with symbols—appears to be an ability that is unique to modern humans and is most likely a manifestation of the modern human brain shape, specifically an enlarged parietal lobe.

Likewise, this result coheres with the most recent dates for mitochondrial Eve and Y-chromosomal Adam, around 120,000 to 150,000 years ago. (Again, see the Resources section for articles detailing some of these finds.) In other words, the confluence of evidence (anatomical, behavioral, and genetic) pinpoints the origin of modern humans (us) between 150,000 and 100,000 years ago, with the appearance of modern human anatomy coinciding with the appearance of modern human behavior.

What Does This Finding Mean for the RTB Human Origins Model?

To be clear, the researchers carrying out this work interpret their results within the confines of the evolutionary framework. Therefore, they conclude that the globular skulls—characteristic of modern humans—evolved recently, only after the modern human facial structure had already appeared in archaic Homo sapiens around 300,000 years ago. They also conclude that the globular skull of modern humans had fully emerged by the time humans began to migrate around the world (around 40,000 to 50,000 years ago).

Yet, the fossil evidence doesn’t show the gradual emergence of skull globularity. Instead, modern human specimens form a distinct cluster isolated from the distinct clusters formed by H. erectus, Neanderthals, and archaic H. sapiens. There are no intermediate globular specimens between archaic and modern humans, as would be expected if this trait evolved. Alternatively, the distinct clusters are exactly as expected if modern humans were created.

It appears that the globularity of our skull distinguishes modern humans from H. erectus, Neanderthals, and archaic Homo sapiens (near anatomically modern humans). This globularity of the modern human skull has implications for when modern human behavior and advanced cognitive abilities emerged.

For this reason, I see this work as offering support for the RTB human origins creation model (and, consequently, the biblical account of human origins and the biblical conception of human nature). RTB’s model (1) views human beings as cognitively superior and distinct from other hominins, and (2) posits that human beings uniquely possess a quality called the image of God that I believe manifests as human exceptionalism.

This work supports both predictions by highlighting the uniqueness and exceptional qualities of modern humans compared to H. erectus, Neanderthals, and archaic H. sapiens, calling specific attention to our unusual skull and brain morphology. As noted, anthropologists believe that this unusual brain morphology supports our advanced cognitive capabilities—abilities that I believe reflect the image of God. Because archaic H. sapiens, Neanderthals, and H. erectus did not possess this brain morphology, it makes it unlikely that these creatures had the sophisticated cognitive capacity displayed by modern humans.

In light of RTB’s model, it is gratifying to learn that the origin of anatomically modern humans coincides with the origin of modern human behavior.

Believe it or not, our oddly shaped head is part of the scientific case that can be made for the image of God.



  1. Simon Neubauer, Jean-Jacques Hublin, and Philipp Gunz, “The Evolution of Modern Human Brain Shape,” Science Advances 4 (January 24, 2018): eaao5961, doi:10.1126/sciadv.aao5961.
Reprinted with permission by the author
Original article at:

Further Review Overturns Neanderthal Art Claim



As I write this blog post, the 2018–19 NFL season is just underway.

During the course of any NFL season, several key games are decided by a controversial call made by the officials. Nobody wants the officials to determine the outcome of a game, so the NFL has instituted a way for coaches to challenge calls on the field. When a call is challenged, part of the officiating crew looks at a computer tablet on the sidelines—reviewing the game footage from a number of different angles in an attempt to get the call right. After two minutes of reviewing the replays, the senior official makes his way to the middle of the field and announces, “Upon further review, the call on the field . . .”

Recently, a team of anthropologists from Spain and the UK created quite a bit of controversy based on a “call” they made from working in the field. Using a new U-Th dating method, these researchers age-dated the artwork in caves from Iberia. Based on the age of a few of their samples, they concluded that Neanderthals produced cave paintings.1 But new work by three independent research teams challenges the “call” from the field—overturning the conclusion that Neanderthals made art and displayed symbolism like modern humans.

U-Th Dating Method

The new dating method under review measures the age of calcite deposits beneath cave paintings and those formed over the artwork after the paintings were created. As water flows down cave walls, it deposits calcite. When calcite forms, it contains trace amounts of uranium (U-238), which decays (via the intermediate isotope U-234) into Th-230. Normally, detection of such low quantities of these isotopes would require extremely large samples. Researchers discovered that by using accelerator mass spectrometry, they could get by with 10-milligram samples. By dating the calcite samples with this technique, they produced minimum and maximum ages for the cave paintings.2
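The age equation behind this kind of closed-system dating can be sketched in a few lines. The sketch below is an illustration, not the authors' published calculation: it assumes no Th-230 was present when the calcite formed and that the uranium isotopes were in secular equilibrium, the very assumptions the later critiques call into question.

```python
import math

# Half-life of Th-230 in years (a literature value, used here for illustration)
TH230_HALF_LIFE = 75_584.0
LAMBDA_230 = math.log(2) / TH230_HALF_LIFE  # decay constant, 1/years

def uth_age(th230_u234_activity_ratio: float) -> float:
    """Return a closed-system age in years from a measured Th-230/U-234
    activity ratio, assuming zero initial Th-230 and uranium isotopes
    in secular equilibrium (simplifying assumptions)."""
    return -math.log(1.0 - th230_u234_activity_ratio) / LAMBDA_230

# As the ratio grows toward 1, the inferred age grows without bound;
# a ratio near 0.45 corresponds to roughly 65,000 years.
age = uth_age(0.45)
```

The direction of the bias matters: if water later leaches uranium out of the deposit, the measured ratio rises and the computed age overshoots, which is precisely the open-system worry raised about these calcite films.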

Call from the Field: Neanderthals Are Artists

The team applied their dating method to the art found in three cave sites in Iberia (the Iberian Peninsula): (1) La Pasiega, which houses paintings of animals, linear signs, claviform signs, and dots; (2) Ardales, which contains about 1,000 paintings of animals, along with dots, discs, lines, geometric shapes, and hand stencils; and (3) Maltravieso, which displays a set of hand stencils and geometric designs. The research team took a total of 53 samples from 25 carbonate formations associated with the cave art in these three sites. While most of the samples dated to 40,000 years old or less (which would indicate that modern humans were the artists), three measurements produced minimum ages of around 65,000 years: (1) a red scalariform from La Pasiega, (2) red areas from Ardales, and (3) a hand stencil from Maltravieso. On the basis of these three measurements, the team concluded that the art must have been made by Neanderthals because modern humans had not yet made their way into Iberia at that time. In other words, Neanderthals made art, just like modern humans did.


Figure: Maltravieso Cave Entrance, Spain. Image credit: Shutterstock

Shortly after the findings were published, I wrote a piece expressing skepticism about this claim for two reasons.

First, I questioned the reliability of the method. Once the calcite deposit forms, the U-Th method will only yield reliable results if none of the U or Th moves in or out of the deposit. Based on the work of researchers from France and the US, it does not appear that the calcite films are closed systems. The calcite deposits on the cave wall formed because of hydrological activity in the cave. Once a calcite film forms, water will continue to flow over its surface, leaching out U (because U is much more water soluble than Th). By removing U, water flowing over the calcite will make it seem as if the deposit, and hence the underlying artwork, is much older than it actually is.3

Second, I expressed concern that the 65,000-year-old dates measured for a few samples are outliers. Of the 53 samples measured, only three gave ages of around 65,000 years. The remaining samples dated much younger, typically around 40,000 years in age. So why should we give so much credence to three measurements, particularly if we know that the calcite deposits are open systems?

Upon Further Review: Neanderthals Are Not Artists

Within a few months, three separate research groups published papers challenging the reliability of the U-Th method for dating cave art and, along with it, the claim that Neanderthals produced cave art.4 It is not feasible to detail all their concerns in this article, but I will highlight six of the most significant complaints. In several instances, the research teams independently raised the same concerns.

  1. The U-Th method is unreliable because the calcite deposits are an open system. The concern that I raised was reiterated by two of the research teams for the same reason I expressed. The U-Th dating technique can only yield reliable results if no U or Th moves in or out of the system once the calcite film forms. The continued water flow over the calcite deposits will preferentially leach U from the deposit, making the deposit appear to be older than it is.
  2. The U-Th method is unreliable because it fails to account for nonradiogenic Th. This isotope would have been present in the source water producing the calcite deposits. As a result, Th would already be present in calcite at the time of formation. This nonradiogenic Th would make the samples appear to be older than they actually are.
  3. The 65,000-year-old dates for the three measurements from La Pasiega, Ardales, and Maltravieso are likely outliers. Just as I pointed out before, two of the research groups expressed concern that only 3 of the 53 measurements came in at 65,000 years in age. This discrepancy suggests that these dates are outliers, most likely reflecting the fact that the calcite deposits are an open system that formed with Th already present. Yet, the researchers from Spain and the UK who reported these results emphasized the few older dates while downplaying the younger dates.
  4. Multiple measurements on the same piece of art yielded discordant ages. For example, the researchers made five age-date measurements of the hand stencil at Maltravieso. These dates (66.7 kya [thousand years ago], 55.2 kya, 35.3 kya, 23.1 kya, and 14.7 kya) were all over the place. And yet, the researchers selected the oldest date for the age of the hand stencil, without justification.
  5. Some of the red “markings” on cave walls that were dated may not be art. Red markings are commonplace on cave walls and can be produced by microorganisms that secrete organic materials or iron oxide deposits. It is possible that some of the markings that were dated were not art at all.
  6. The method used by the researchers to sample the calcite deposits may have been flawed. One team expressed concern that the sampling technique may have unwittingly produced dates for the cave surface on which the paintings were made rather than the pigments used to make the art itself. If the researchers inadvertently dated the cave surface, it could easily be older than the art.
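The bias described in points 2 and 3 runs in only one direction, which is why unusually old outliers draw suspicion. As a rough sketch (assuming, for simplicity, secular equilibrium between 234U and 238U and zero initial 230Th in the "true" case; the numbers are illustrative, not from the study):

```python
import math

# 230Th half-life in years (a commonly cited value; an assumption for this sketch)
HALF_LIFE_TH230 = 75_584
LAMBDA_230 = math.log(2) / HALF_LIFE_TH230

def apparent_age(th_u_activity_ratio):
    """Simplified closed-system U-Th age: assumes 234U/238U secular
    equilibrium and no initial 230Th. Valid only for ratios < 1."""
    return -math.log(1.0 - th_u_activity_ratio) / LAMBDA_230

radiogenic = 0.30  # (230Th/234U) activity from in-situ decay alone (illustrative)
detrital = 0.05    # nonradiogenic 230Th carried in by the source water (illustrative)

print(f"true age:     {apparent_age(radiogenic):,.0f} years")
print(f"apparent age: {apparent_age(radiogenic + detrital):,.0f} years")
```

Because unaccounted-for 230Th (or leached U) can only raise the measured ratio, the computed age can only be biased old, never young. A modest amount of contamination inflates the apparent age by thousands of years.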

In light of these many shortcomings, it is questionable whether the U-Th method is a reliable way to date cave art. After review, the call from the field is overturned. There is no conclusive evidence that Neanderthals made art.

Why Does This Matter?

Artistic expression reflects a capacity for symbolism. And many people view symbolism as a quality unique to human beings that contributes to our advanced cognitive abilities and exemplifies our exceptional nature. In fact, as a Christian, I see symbolism as a manifestation of the image of God. If Neanderthals possessed symbolic capabilities, such a quality would undermine human exceptionalism (and with it the biblical view of human nature), rendering human beings nothing more than another hominin. At this juncture, every claim for Neanderthal symbolism has failed to withstand scientific scrutiny.

Now, it is time for me to go back to the game.

Who dey! Who dey! Who dey think gonna beat dem Bengals!



  1. D. L. Hoffmann et al., “U-Th Dating of Carbonate Crusts Reveals Neandertal Origin of Iberian Cave Art,” Science 359 (February 23, 2018): 912–15, doi:10.1126/science.aap7778.
  2. A. W. G. Pike et al., “U-Series Dating of Paleolithic Art in 11 Caves in Spain,” Science 336 (June 15, 2012): 1409–13, doi:10.1126/science.1219957.
  3. Georges Sauvet et al., “Uranium-Thorium Dating Method and Palaeolithic Rock Art,” Quaternary International 432 (2017): 86–92, doi:10.1016/j.quaint.2015.03.053.
  4. Ludovic Slimak et al., “Comment on ‘U-Th Dating of Carbonate Crusts Reveals Neandertal Origin of Iberian Cave Art,’” Science 361 (September 21, 2018): eaau1371, doi:10.1126/science.aau1371; Maxime Aubert, Adam Brumm, and Jillian Huntley, “Early Dates for ‘Neanderthal Cave Art’ May Be Wrong,” Journal of Human Evolution (2018), doi:10.1016/j.jhevol.2018.08.004; David G. Pearce and Adelphine Bonneau, “Trouble on the Dating Scene,” Nature Ecology and Evolution 2 (June 2018): 925–26, doi:10.1038/s41559-018-0540-4.
Reprinted with permission by the author
Original article at:

Can Evolution Explain the Origin of Language?



Oh honey hush, yes you talk too much
Oh honey hush, yes you talk too much
Listenin’ to your conversation is just about to separate us

—Albert Collins

He was called the “Master of the Telecaster.” He was also known as the “Iceman,” because his guitar playing was so hot, he was cold. Albert Collins (1932–93) was an electric blues guitarist and singer whose distinct style of play influenced the likes of Stevie Ray Vaughan and Robert Cray.


Image: Albert Collins in 1990. Image Credit: Masahiro Sumori [GFDL (https://www.gnu.org/copyleft/fdl.html), CC-BY-SA-3.0 (https://creativecommons.org/licenses/by-sa/3.0/) or CC BY-SA 2.5 (https://creativecommons.org/licenses/by-sa/2.5)], from Wikimedia Commons.

Collins was known for his sense of humor and it often came through in his music. In one of Collins’s signature songs, “Honey Hush,” the bluesman complains about his girlfriend who never stops talking: “You start talkin’ in the morning; you’re talkin’ all day long.” Collins finds his girlfriend’s nonstop chatter so annoying that he contemplates ending their relationship.

While Collins may have found his girlfriend’s unending conversation irritating, the capacity for conversation is a defining feature of human beings (modern humans). As human beings, we can’t help ourselves—we “talk too much.”

What does our capacity for language tell us about human nature and our origins?

Language and Human Exceptionalism

Human language flows out of our capacity for symbolism. Humans have the innate ability to represent the world (and abstract ideas) using symbols. And we can embed symbols within symbols to construct alternative possibilities and then link our scenario-building minds together through language, music, art, etc.

As a Christian, I view our symbolism as a facet of the image of God. While animals can communicate, as far as we know only human beings possess abstract language. And despite widespread claims about Neanderthal symbolism, the scientific case for symbolic expression among these hominins keeps coming up short. To put it another way, human beings appear to be uniquely exceptional in ways that align with the biblical concept of the image of God, with our capacity for language serving as a significant contributor to the case for human exceptionalism.

Recent insights into the mode and tempo of language’s emergence strengthen the scientific case for the biblical view of human nature. As I have written in previous articles (see Resources) and in Who Was Adam?, language appears to have emerged suddenly—and it coincides with the appearance of anatomically modern humans. Additionally, when language first appeared, it was syntactically as complex as contemporary language. That is, there was no evolution of language—proceeding from a proto-language through simple language and then to complex language. Language emerged all at once as a complete package.

From my vantage point, the sudden appearance of language that uniquely coincides with the first appearance of humans is a signature for a creation event. It is precisely what I would expect if human beings were created in God’s image, as Scripture describes.

Darwin’s Problem

This insight into the origin of language also poses significant problems for the evolutionary paradigm. As linguist Noam Chomsky and anthropologist Ian Tattersall admit, “The relatively sudden origin of language poses difficulties that may be called ‘Darwin’s problem.’”1

Anthropologist Chris Knight’s insights compound “Darwin’s problem.” He concludes that “language exists, but for reasons which no currently accepted theoretical paradigm can explain.”2 Knight arrives at this conclusion by surveying the work of three scientists (Noam Chomsky, Amotz Zahavi, and Dan Sperber) who study language’s origin using three distinct approaches. All three converge on the same conclusion; namely, evolutionary processes should not produce language or any form of symbolic communication.

Chris Knight writes:

Language evolved in no other species than humans, suggesting a deep-going obstacle to its evolution. One possibility is that language simply cannot evolve in a Darwinian world—that is, in a world based ultimately on competition and conflict. The underlying problem may be that the communicative use of language presupposes anomalously high levels of mutual cooperation and trust—levels beyond anything which current Darwinian theory can explain . . . suggesting a deep-going obstacle to its evolution.3

To support this view, Knight synthesizes the insights of Chomsky, Zahavi, and Sperber, each of whom concludes, for a distinct reason, that language cannot evolve from animal communication.

Three Reasons Why Language Is Unique to Humans

Chomsky views animal minds as capable of only bounded ranges of expression. Human language, on the other hand, makes use of a finite set of symbols to communicate an infinite array of thoughts and ideas. For Chomsky, there are no intermediate steps between bounded and infinite expression. The capacity to express an unlimited array of thoughts and ideas must have appeared all at once, and it must be supported by brain and vocalization structures. These structures would either have to be in place already when language appeared (having been selected by the evolutionary process for entirely different purposes) or to have arisen simultaneously with the capacity to conceive of infinite thoughts and ideas. To put it another way, language could not have emerged from animal communication through a stepwise evolutionary process. It had to appear all at once, fully intact at the time of its genesis, and no one knows of any mechanism that can effect that type of transformation.

Zahavi’s work centers on understanding the evolutionary origin of signaling in the animal world. Central to his approach, Zahavi divides natural selection into two components: utilitarian selection (which describes selection for traits that improve the efficiency of some biological process—enhancing the organism’s fitness) and signal selection (which involves the selection of traits that are wasteful). Though counterintuitive, signal selection contributes to the fitness of the organism because it communicates the organism’s fitness to other animals (either members of the same or different species). The example Zahavi uses to illustrate signal selection is the unusual behavior of gazelles. These creatures stot (jump up and down, stomp the ground, loudly snort) when they detect a predator, calling attention to themselves. This behavior is counterintuitive. Shouldn’t these creatures use their energy to run away, getting the biggest jump they can on the pursuing predator? As it turns out, the “wasteful and costly” behavior communicates to the predator the fitness of the gazelle. In the face of danger, the gazelle is willing to take on risk, because it is so fit. The gazelle’s behavior dissuades the predator from attacking. Observations in the wild confirm Zahavi’s ideas. Predators most often go after gazelles that don’t stot or that display limited stotting behavior.

Animal signaling is effective and reliable only when actual costly handicaps are communicated. The signaling can only be effective when a limited and bounded range of signals is presented. This constraint is the only way to communicate the handicap. In contrast, language is open-ended and infinite. Given the constraints on animal signaling, it cannot evolve into language. Natural selection prevents animal communication from evolving into language because, in principle, when the infinite can be communicated, in practice, nothing is communicated at all.

Based in part on fieldwork he conducted in Ethiopia with the Dorze people, Dan Sperber concluded that people use language primarily to communicate alternative possibilities and realities—falsehoods—rather than information that is true about the world. To be sure, people use language to convey brute facts about the world. But most often language is used to communicate institutional facts—agreed-upon truths—that don’t necessarily reflect the world as it actually is. According to Sperber, symbolic communication is characterized by extravagant imagery and metaphor. Human beings often build metaphor upon metaphor—and falsehood upon falsehood—when we communicate. For Sperber, this type of communication can’t evolve from animal signaling. What evolutionary advantage arises by transforming communication about reality (animal signaling) to communication about alternative realities (language)?

Synthesizing the insights of Chomsky, Zahavi, and Sperber, Knight concludes that language is impossible in a Darwinian world. He states, “The Darwinian challenge remains real. Language is impossible not simply by definition, but—more interestingly—because it presupposes unrealistic levels of trust. . . . To guard against the very possibility of being deceived, the safest strategy is to insist on signals that just cannot be lies. This rules out not only language, but symbolic communication of any kind.”4

Signal for Creation

And yet, human beings possess language (along with other forms of symbolism, such as art and music). Our capacity for abstract language is one of the defining features of human beings.

For Christians like me, our language abilities reflect the image of God. And what appears as a profound challenge and mystery for the evolutionary paradigm finds ready explanation in the biblical account of humanity’s origin.

Is it time for our capacity for conversation to separate us from the evolutionary explanation for humanity’s origin?



  1. Johan J. Bolhuis et al., “How Could Language Have Evolved?” PLoS Biology 12 (August 2014): e1001934, doi:10.1371/journal.pbio.1001934.
  2. Chris Knight, “Puzzles and Mysteries in the Origins of Language,” Language and Communication 50 (September 2016): 12–21, doi:10.1016/j.langcom.2016.09.002.
  3. Knight, “Puzzles and Mysteries,” 12–21.
  4. Knight, “Puzzles and Mysteries,” 12–21.
Reprinted with permission by the author
Original article at:

Neuroscientists Transfer “Memories” from One Snail to Another: A Christian Perspective on Engrams



Scientists from UCLA recently conducted some rather bizarre experiments. For me, it’s these types of things that make it so much fun to be a scientist.

Biologists transferred memories from one sea slug to another by extracting RNA from the nervous system of a trained sea slug and then injecting the extract into an untrained sea slug.1 After the injection, the untrained sea slugs responded to environmental stimuli just like the trained ones, based on false memories created by the transfer of biomolecules.

Why would researchers do such a thing? Even though it might seem like their motives were nefarious, they weren’t inspired to carry out these studies by Dr. Frankenstein or Dr. Moreau. Instead, they had really good reasons for performing these experiments: they wanted to gain insight into the physical basis of memory.

How are memories encoded? How are they stored in the brain? And how are memories retrieved? These are some of the fundamental scientific questions that interest researchers who work in cognitive neuroscience. It turns out that sea slugs belonging to the group Aplysia (commonly referred to as sea hares) make ideal organisms to study in order to address these questions. The fact that we can gain insight into how memories are stored with sea slugs is mind-blowing and indicates to me (as a Christian and a biochemist) that biological systems have been designed for discovery.

Sea Hares

Sea hares have become the workhorses of cognitive neuroscience. This creature has a nervous system that’s complex enough to allow neuroscientists to study reflexes and learned behaviors, but simple enough that they can draw meaningful conclusions from their experiments. (By way of comparison, members of Aplysia have about 20,000 neurons in their nervous systems compared to humans who have 85 billion neurons in our brains alone.)

Toward this end, neuroscientists took advantage of a useful reflexive behavior displayed by sea hares, called gill and siphon withdrawal. When these creatures are disturbed, they rapidly withdraw their delicate gill and siphon.

The nervous system of these creatures can also undergo sensitization, a learned state: after repeated exposure to a stimulus, the nervous system mounts an enhanced, broadened response to stimuli that are related—say, stimuli that connote danger.

What Causes Memories?

Sensitization is a learned response that is possible because memories have been encoded and stored in the sea hares’ nervous system. But how is this memory stored?

Many neuroscientists think that the physical instantiation of memories (called engrams) resides in the synaptic connections between nerve cells (neurons). Other neuroscientists hold a differing view: instead of being mediated by cell-cell interactions, engrams form within the interior of neurons, through biochemical events that take place in the cell nucleus. In fact, some studies have implicated RNA molecules in memory formation and storage.2 The UCLA researchers sought to determine if RNA plays a role in memory formation.

Memory Transfer from One Sea Hare to Another

To test this hypothesis, the researchers sensitized sea hares to painful stimuli. They accomplished this feat by inserting an electrode in the tail regions of several sea hares and delivering a shock. The shock caused the sea hares to withdraw their gill and siphon. After 20 minutes, they repeated the shock protocol and continued to do so at 20-minute intervals five more times. Twenty-four hours later, they repeated the shock protocol. By this point, the sea hare test subjects were sensitized to threatening stimuli. When touched, the trained sea hares would withdraw their gill and siphon for nearly 1 minute. Untrained sea hares (which weren’t subjected to the shock protocol) would withdraw their gill and siphon for only about 1 second when touched.

Next, the researchers sacrificed the sensitized sea hares and isolated RNA from their nervous system. Then they injected the RNA extracts into the hemocoel of untrained sea hares. When touched, the sea hares withdrew their gill and siphon for about 45 seconds.

To confirm that this response was not due to the injection procedure, they repeated it by injecting RNA extracted from the nervous system of an untrained sea hare into untrained sea hares. When touched, the gill and siphon withdrawal reflex lasted only about 1 second.


Figure: Sea Hare Stimulus Protocol. Image credit: Alexis Bédécarrats, Shanping Chen, Kaycey Pearce, Diancai Cai, and David L. Glanzman, eNeuro 14 May 2018, 5 (3) ENEURO.0038-18.2018; doi:10.1523/ENEURO.0038-18.2018.

The researchers then applied the RNA extracts from both trained and untrained sea hares to sensory neurons grown in the lab. The RNA extracts from the trained sea hares caused the sensory neurons to display heightened activity. Conversely, the RNA extracts from the untrained sea hares had no effect on the activity of the cultured sensory neurons.

Finally, the researchers added compounds called methylase inhibitors to the RNA extracts before injecting them into untrained sea hares. These inhibitors blocked the memory transfer. This result indicates that epigenetic modifications of DNA mediated by RNA molecules play a role in forming engrams.

Based on these results, it appears that RNA mediates the formation and storage of memories. And, though the research team does not know which class of RNAs plays a role in the formation of engrams, they suspect that microRNAs may be the biochemical actors.

Biomedical Implications

Now that the UCLA researchers have identified RNA and epigenetic modifications of DNA as central to the formation of engrams, they believe that it might one day be possible to develop biomedical procedures that could treat memory loss that occurs with old age or with conditions such as Alzheimer’s disease and other dementias. Toward this end, it is particularly encouraging that the researchers could transfer memories from one sea hare to another. This insight might even lead to therapies that would erase horrific memories.

Of course, this raises questions about human nature—specifically, the relationship between the brain and mind. For many people, the fact that there is a physical basis for memories suggests that our mind is indistinguishable from the activities taking place within our brains. To put it differently, many people would reject the idea that our mind is a nonphysical substance, based on the discovery of engrams.

Engrams, Brain, and Mind

However, I would contend that if we adopt the appropriate mind-body model, it is possible to preserve the concept of the mind as a nonphysical entity distinct from the brain even if engrams are a reality. A model I find helpful is based on a computer hardware/software analogy. Accordingly, the brain is the hardware that manifests the mind’s activity. Meanwhile, the mind is analogous to the software programming. According to this model, hardware structures—brain regions—support the expression of the mind, the software.

A computer system needs both the hardware and software to function properly. Without the hardware, the software is just a set of instructions. For those instructions to take effect, the software must be loaded into the hardware. It is interesting that data accessed by software is stored in the computer’s hardware. So, why wouldn’t the same be true for the human brain?

We need to be careful not to take this analogy too far. However, from my perspective, it illustrates how it is possible for memories to be engrams while preserving the mind as a nonphysical, distinct entity.

Designed for Discovery

The significance of this discovery extends beyond the mind-brain problem. It’s provocative that the biology of a creature such as the sea hare could provide such important insight into human biology.

This is possible only because of the universal nature of biological systems. All life on Earth shares the same biochemistry. All life is made up of the same type of cells. Animals possess similar anatomical and physiological systems.

Most biologists today view these shared features as evidence for an evolutionary history of life. Yet, as a creationist and an intelligent design proponent, I interpret the universal nature of the cell’s chemistry and shared features of biological systems as manifestations of archetypical designs that emanate from the Creator’s mind. To put it another way, I regard the shared features of biological systems as evidence for common design, not common descent.

This view invites an obvious rebuttal: Why would God create using the same template? Why not create each biochemical system from scratch to be ideally suited for its function? There may be several reasons why a Creator would design living systems around a common set of templates. In my estimation, one of the most significant reasons is discoverability. The shared features of biochemical and biological systems make it possible to apply what we learn by studying one organism to all others. Without life’s shared features, the discipline of biology wouldn’t exist.

This discoverability makes it easier to appreciate God’s glory and grandeur, as evinced by the elegance, sophistication, and ingenuity in biochemical and biological systems. Discoverability of biochemical systems also reflects God’s providence and care for humanity. If not for the shared features, it would be nearly impossible for us to learn enough about the living realm for our benefit. Where would biomedical science be without the ability to learn fundamental aspects of our biology by studying model organisms such as yeast, fruit flies, mice—and sea hares?

The shared features in the living realm are a manifestation of the Creator’s care and love for humanity. And there is nothing bizarre about that.



  1. Alexis Bédécarrats et al., “RNA from Trained Aplysia Can Induce an Epigenetic Engram for Long-Term Sensitization in Untrained Aplysia,” eNeuro 5 (May/June 2018): e0038-18.2018, 1–11, doi:10.1523/ENEURO.0038-18.2018.
  2. For example, see Germain U. Busto et al., “microRNAs That Promote or Inhibit Memory Formation in Drosophila melanogaster,” Genetics 200 (June 1, 2015): 569–80, doi:10.1534/genetics.114.169623.
Reprinted with permission by the author
Original article at:

Differences in Human and Neanderthal Brains Explain Human Exceptionalism



When I was a little kid, my mom went through an Agatha Christie phase. She was a huge fan of the murder mystery writer and she read all of Christie’s books.

Agatha Christie was caught up in a real-life mystery of her own when she disappeared for 11 days in December 1926 under highly suspicious circumstances. Her car was found near her home, close to the edge of a cliff. But she was nowhere to be found. It looked as if she had disappeared without a trace, without any explanation. Eleven days after her disappearance, she turned up in a hotel room registered under an alias.

Christie never offered an explanation for her disappearance. To this day, it remains an enduring mystery. Some think it was a callous publicity stunt. Some say she suffered a nervous breakdown. Others think she suffered from amnesia. Some people suggest more sinister reasons. Perhaps, she was suicidal. Or maybe she was trying to frame her husband and his mistress for her murder.

Perhaps we will never know.

Like Christie’s fictional detectives Hercule Poirot and Miss Marple, paleoanthropologists are every bit as eager to solve a mysterious disappearance of their own. They want to know why Neanderthals vanished from the face of the earth. And what role did human beings (Homo sapiens) play in the Neanderthal disappearance, if any? Did we kill off these creatures? Did we outcompete them or did Neanderthals just die off on their own?

Anthropologists have proposed various scenarios to account for the Neanderthals’ disappearance. Some paleoanthropologists think that differences in the cognitive capabilities of modern humans and Neanderthals help explain the creatures’ extinction. According to this model, superior reasoning abilities allowed humans to thrive while Neanderthals faced inevitable extinction. As a consequence, we replaced Neanderthals in the Middle East, Europe, and Asia when we first migrated to these parts of the world.

Computational Neuroanatomy

Innovative work by researchers from Japan offers support for this scenario.1 Using a technique called computational neuroanatomy, researchers reconstructed the brain shape of Neanderthals and modern humans from the fossil record. In their study, the researchers used four Neanderthal specimens:

  • Amud 1 (50,000 to 70,000 years in age)
  • La Chapelle-aux-Saints 1 (47,000 to 56,000 years in age)
  • La Ferrassie 1 (43,000 to 45,000 years in age)
  • Forbes’ Quarry 1 (no age dates)

They also worked with four Homo sapiens specimens:

  • Qafzeh 9 (90,000 to 120,000 years in age)
  • Skhūl 5 (100,000 to 135,000 years in age)
  • Mladeč 1 (35,000 years in age)
  • Cro-Magnon 1 (32,000 years in age)

Researchers used computed tomography scans to construct virtual endocasts (casts of the cranial cavity) of the fossil specimens. After generating the endocasts, the team determined the 3D brain structure of the fossil specimens by deforming the 3D structure of the average human brain so that it fit into the fossil crania and conformed to the endocasts.

This technique appears to be valid, based on control studies carried out on chimpanzee and bonobo brains. Using computational neuroanatomy, researchers can deform a chimpanzee brain to accurately yield the bonobo brain, and vice versa.

Brain Differences, Cognitive Differences

The Japanese team learned that the chief difference between human and Neanderthal brains is the size and shape of the cerebellum. The cerebellar hemisphere is projected more toward the interior in the human brain than in the Neanderthal brain and the volume of the human cerebellum is larger. Researchers also noticed that the right side of the Neanderthal cerebellum is significantly smaller than the left side—a phenomenon called volumetric laterality. This discrepancy doesn’t exist in the human brain. Finally, the Japanese researchers observed that the parietal regions in the human brain were larger than those regions in Neanderthals’ brains.

Image credit: Shutterstock


Because of these brain differences, the researchers argue that humans were socially and cognitively more sophisticated than Neanderthals. Neuroscientists have discovered that the cerebellum supports both motor function and higher cognition, contributing to language, working memory, thought, and social abilities. Hence, the researchers argue that the reduced size of the right cerebellar hemisphere in Neanderthals limited its connection to the prefrontal regions, a connection critical for language processing. Neuroscientists have also discovered that the parietal lobe plays a role in visuo-spatial imagery, episodic memory, self-related mental representations, coordination between self and external spaces, and sense of agency.

On the basis of this study, it seems that humans either outcompeted Neanderthals for limited resources—driving them to extinction—or simply were better suited to survive than Neanderthals because of superior mental capabilities. Or perhaps their demise occurred for more sinister reasons. Maybe we used our sophisticated reasoning skills to kill off these creatures.

Did Neanderthals Make Art, Music, Jewelry, etc.?

Recently, a flurry of reports has appeared in the scientific literature claiming that Neanderthals possessed the capacity for language and the ability to make art, music, and jewelry. Other studies claim that Neanderthals ritualistically buried their dead, mastered fire, and used plants medicinally. All of these claims rest on highly speculative interpretations of the archaeological record. In fact, other studies present evidence that refutes every one of these claims (see Resources).

Comparisons of human and Neanderthal brain morphology and size become increasingly important in the midst of this controversy. This recent study—along with previous work—indicates that Neanderthals did not have the brain architecture and, hence, cognitive capacity to communicate symbolically through language, art, music, and body ornamentation. Nor did they have the brain capacity to engage in complex social interactions. In short, Neanderthal brain anatomy does not support any interpretation of the archaeological record that attributes advanced cognitive abilities to these creatures.

While this study provides important clues about the disappearance of Neanderthals, we still don’t know why they went extinct. Nor do we know any of the mysterious details surrounding their demise as a species.

Perhaps we will never know.

But we do know that in terms of our cognitive and social capacities, human beings stand apart from Neanderthals and all other creatures. Human brain biology and behavior render us exceptional, one-of-a-kind, in ways consistent with the image of God.



  1. Takanori Kochiyama et al., “Reconstructing the Neanderthal Brain Using Computational Anatomy,” Scientific Reports 8 (April 26, 2018): 6296, doi:10.1038/s41598-018-24331-0.
Reprinted with permission by the author
Original article at:

Sophisticated Cave Art Evinces the Image of God



It’s a new trend in art. Museums and galleries all over the world are exploring the use of sounds, smells, and lighting to enhance the viewer’s experience as they interact with pieces of art. The Tate Museum in London is one institution pioneering this innovative approach to experiencing artwork. For example, on display recently at Tate’s Sensorium was Irish-born artist Francis Bacon’s Figure in a Landscape, a piece that depicts a gray human figure on a bench. Visitors to the Sensorium put on headphones while they view this painting, and they hear sounds of a busy city. Added to the visual and auditory experiences are the bitter burnt smell of chocolate and the sweet aroma of oranges that engulf the viewer. This multisensory experience is meant to depict a lonely, brooding figure lost in the never-ending activities of a city, with the contrasting aromas simultaneously communicating the harshness and warmth of life in an urban setting.

It goes without saying that designing multisensory experiences like the ones on display at the Sensorium requires expertise in sound, taste, and lighting. This expertise makes recent discoveries about ancient cave and rock art found throughout the world all the more remarkable. As it turns out, the cave and rock art found in Europe, Asia, and Africa functions as a multisensory display.1 The sophistication of this early art highlights the ingenuity of the first artists—modern humans, who were people just like us.

Cave Art

Though many people have the perception that cave and rock art is crude and simplistic, it is, in fact, remarkably sophisticated. For example, the Chauvet-Pont-d’Arc Cave in southern France houses cave art that dates (using carbon-14 measurements) to two periods: 28,000 to 31,000 years ago and 33,500 to 37,000 years ago. The cave houses realistic depictions of hundreds of animals, including herbivores such as horses, cattle, and mammoths. The art also depicts rhinos and carnivores such as cave lions, panthers, bears, and hyenas. The site also contains hand stencils and geometric shapes, such as lines and dots.

The human artists at Chauvet Cave painted the animal figures on areas of the cave walls that they had polished to make them smooth and lighter in color. They also made incisions and etchings around the outlines of the painted figures to lend the art a three-dimensional quality and give the figures a sense of movement.

Multisensory Cave Art

One of the most intriguing aspects of cave art is its location in caves. Oftentimes, the animal figures are depicted deep within the cave’s interior, at unusual locations for the placement of cave paintings.

Recently, archaeologists have offered an explanation for the location of the cave art. It appears that the artists made use of the caves’ acoustical properties to create a multisensory experience. To say it another way, the art is placed in areas of the caves where the sounds reinforce the paintings. For example, hoofed animals are often painted in areas of the caves where echoes and reverberations make percussive sounds like those of thundering hooves when these animals run. Carnivores, by contrast, are often depicted in areas of the caves that are unusually quiet.

San Rock Art

Recently, researchers have discovered that the rock art produced by the San (indigenous hunter-gatherer people from Southern Africa), the oldest of which dates to about 70,000 years ago, also provides viewers a multisensory experience.2 Archaeologists believe that the art depicted on the rocks reflects the existence of a spirit world beneath the surface. These rock paintings are often created in areas where echoes can be heard, presumably reflecting the activities of the spirit world.

Who Made the Cave and Rock Art?

Clearly, the first human artists were sophisticated. But, when did this sophisticated behavior emerge? The discovery of art in Europe and Asia indicates that the first humans who made their way out of Africa as they migrated around the world carried with them the capacity for art. To put it another way, the capacity for art did not emerge in humans after they reached Europe, but instead was an intrinsic part of human nature before we began to make our way around the world.

The discovery of symbolic artifacts as old as 80,000 years in caves in South Africa (artistic expression is a manifestation of the capacity to represent the world with symbols) and the dating of the oldest San rock art at 70,000 years add support to this view.

Linguist Shigeru Miyagawa points out that genetic evidence indicates that the San separated from the rest of humanity around 125,000 years ago. While the San remained in Africa, the group of humans who separated from the San and made their way into Asia and Europe came from a separate branch of humanity. And yet, the art produced by the San displays the same multisensory character as the art found in Europe and Asia. To say it another way, the rock art of the San and the cave art in Europe and Asia display unifying characteristics. These unifying features indicate that the art shares the same point of origin. Given that the data seem to indicate that humanity originated about 150,000 years ago, it appears that the origin of art coincides closely with the time that modern humans appear in the fossil record.3

Cave and Rock Art Evince the Biblical View of Human Nature

The sophistication of the earliest art highlights the exceptional nature of the first artists—modern humans, people just like you and me. The capacity to produce art reflects the capacity for symbolism—a quality that appears to be unique to human beings, a quality contributing to our advanced cognitive abilities, and a quality that contributes to our exceptional nature. As a Christian, I view symbolism (and artistic expression) as one of the facets of God’s image. And, as such, I would assert that the latest insights on cave art provide scientific credibility for the biblical view of human nature.



  1. Shigeru Miyagawa, Cora Lesure, and Vitor A. Nóbrega, “Cross-Modality Information Transfer: A Hypothesis about the Relationship among Prehistoric Cave Paintings, Symbolic Thinking, and the Emergence of Language,” Frontiers in Psychology 9 (February 20, 2018): 115, doi:10.3389/fpsyg.2018.00115.
  2. Francis Thackeray, “Eland, Hunters and Concepts of ‘Sympathetic Control’ Expressed in Southern African Rock Art,” Cambridge Archaeological Journal 15 (2005): 27–35, doi:10.1017/S0959774305000028.
  3. Miyagawa et al., “Cross-Modality Information Transfer,” 115.
Reprinted with permission by the author
Original article at:

Did Neanderthals Produce Cave Paintings?



One time when our kids were little, my wife and I discovered that someone had drawn a picture on one of the walls in our house. Though all of our children professed innocence, it was easy to figure out who the culprit was, because the little artist also wrote the first letter of her name on the wall next to her “masterpiece.”

If only archaeologists had it as easy as my wife and me when it comes to determining who made the ancient artwork on the cave walls in Europe. Most anthropologists think that modern humans produced the art. But a growing minority of scientists think that Neanderthals were the artists, not modern humans. If only anthropologists had some initials to go by.

In the absence of a “smoking gun,” archaeologists believe they now have an approach that will help them determine the artists’ identity. Instead of searching for initials, researchers are trying to indirectly determine who the artists were by dating the cave art. They hope this approach will work because modern humans did not make their way into Europe until around 40,000 years ago. And Neanderthals disappeared around that same time. So, knowing the age of the art would help narrow down the artists’ identity.

Recently, a team from the UK and Spain applied this new dating method to art found in the caves of Iberia (the southwestern corner of Europe). Based on the age of the art, they think that the paintings on the cave walls were produced by Neanderthals, not modern humans.1

Artistic expression reflects a capacity for symbolism. And many people view symbolism as a quality unique to human beings, contributing to our advanced cognitive abilities and reflecting our exceptional nature. In fact, as a Christian, I see symbolism as a manifestation of the image of God. Yet, if Neanderthals possessed symbolic capabilities, such a quality would undermine human exceptionalism (and with it the biblical view of human nature), rendering human beings nothing more than another hominin.

Limitations of Dating Cave Art

Dating cave art is challenging, to say the least. Typically, archaeologists will either: (1) date the remains associated with the cave art and try to establish a correlation, or (2) attempt to directly date the cave paintings using carbon-14 measurements of the pigments and charcoal used to make the art. Both approaches have limitations.

In 2012, researchers from the UK and Spain employed a new technique to date the art found on the walls of 11 caves located in northwest Spain.2 This dating method measures the age of the calcite deposits beneath the cave paintings and those that formed over the artwork once the paintings had been created. As water flows down cave walls, it deposits calcite. When calcite forms, it contains trace amounts of U-238, which decays (through a series of intermediates) into Th-230. Normally, detection of such low quantities of these isotopes would require extremely large samples. The researchers discovered that by using accelerator mass spectrometry they could get by with 10-milligram samples.
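
The age calculation behind this method can be sketched with the standard closed-system ingrowth equation: with no initial thorium, the Th-230/U activity ratio grows toward secular equilibrium at a rate set by the Th-230 half-life. The snippet below is a simplified illustration (it assumes no initial Th-230 and uranium isotopes already at equilibrium, ignoring the full two-isotope U-234/U-238 correction used in practice):

```python
import math

TH230_HALF_LIFE = 75_690  # years; half-life of Th-230

def th_u_activity_ratio(age_years: float) -> float:
    """Th-230/U activity ratio grown in after `age_years`, assuming a
    closed system with no initial Th-230 (simplified model)."""
    lam = math.log(2) / TH230_HALF_LIFE
    return 1.0 - math.exp(-lam * age_years)

def age_from_ratio(ratio: float) -> float:
    """Invert the ingrowth equation to recover an age from a measured ratio."""
    lam = math.log(2) / TH230_HALF_LIFE
    return -math.log(1.0 - ratio) / lam

# A 65,000-year-old closed-system deposit yields a ratio near 0.449,
# and inverting that ratio recovers the age:
r = th_u_activity_ratio(65_000)
print(round(r, 3), round(age_from_ratio(r)))

# Open-system uranium loss inflates the measured ratio and hence the
# apparent age -- the critics' central concern:
print(round(age_from_ratio(r * 1.1)))  # older than the true 65,000 years
```

This also makes the open-system objection concrete: because age is computed from the Th/U ratio, any process that leaches uranium out of the calcite pushes the ratio, and the inferred age, upward.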

By dating the calcite samples, they produced minimum and maximum ages for the cave paintings. While most of the 50 samples they took dated to around 25,000 years in age (or more recent than that), three were significantly older. They found a claviform-like symbol that dated to 31,000 years in age. They also found hand stencils that were 37,000 years old and, finally, a red disk that dated to 41,000 years in age.

Most anthropologists believe modern humans made their way into Europe around 40,000 years ago, prompting the researchers to suggest that Neanderthals may have created some of the cave art: “because the 40.8 ky date for the disk is a minimum age, it cannot be ruled out that the earliest paintings were symbolic expressions of Neanderthals, which were present at Cantabrian Spain until at least 42 ka.”3

Dating the Art from Three Cave Sites in Iberia

Recently, this research team applied the same U-Th dating method to the art found in three cave sites in Iberia: (1) La Pasiega, which houses paintings of animals, linear signs, claviform signs, and dots; (2) Ardales, which contains about 1,000 paintings of animals, along with dots, discs, lines, geometric shapes, and hand stencils; and (3) Maltravieso, which displays a set of hand stencils and geometric designs.

The research team took a total of 53 samples from 25 carbonate formations associated with the cave art at these three sites. While most of the samples dated to 40,000 years old or less, three measurements produced minimum ages of around 65,000 years: (1) a red scalariform from La Pasiega, (2) red areas from Ardales, and (3) a hand stencil from Maltravieso. On the basis of these three measurements, the team concluded that the art must have been made by Neanderthals, because modern humans had not made their way into Iberia at that time. In other words, Neanderthals made art, just like modern humans did.

Are These Results Valid?

At first glance, it seems that the research team has a compelling case for Neanderthal art. Yet careful examination of the U-Th method and the results raises some concerns.

First, it is not clear whether the U-Th method yields reliable results. Recently, a team from France and the US questioned the application of the U-Th method to date cave art.4 Like all radiometric dating methods, the U-Th method only works if the system to be age-dated is closed. In other words, once the calcite deposit forms, the method will only yield reliable results if no U or Th moves in or out of the deposit. Unfortunately, it does not appear that the calcite films are closed systems. The films form as a result of hydrological activity in the cave. Once a calcite film forms, water continues to flow over its surface, leaching out U (because U is much more water-soluble than Th). This process makes the calcite film, and hence the underlying artwork, seem much older than it actually is.

In the face of this criticism, the team from the UK and Spain assert the reliability of their method because, for a few of the calcite deposits, they sampled the outermost surface, the middle of the deposit, and the innermost region. Measurements of these three samples gave ages that matched the expected chronology, with the innermost layer measuring older than the outermost surface. But, as the researchers from France and the US (who challenge the validity of the U-Th method to date cave art) point out, this sampling protocol doesn’t ensure that the calcite is a closed system.

Additionally, critics from France and the US identified several examples of cave art dated by both carbon-14 methods and U-Th methods, noting that the carbon-14 method consistently gives much younger ages than the U-Th method. This difference is readily explained if the calcite is an open system.

Second, it seems more plausible that the 65,000-year-old dates are outliers. It is important to note that of the 53 samples measured, only three gave dates of around 65,000 years. The remaining samples gave much younger dates, typically around 40,000 years. Given the concerns about the calcite being an open system, shouldn’t the 65,000-year-old samples be viewed as mere outliers?

Compounding this concern is the fact that samples taken from the same piece of art give discordant dates, with one of the samples dating to 65,000 years in age and the other two samples dating to be much younger. The team from the UK and Spain argue that the artwork was produced in a patchwork manner. But this explanation does not account for the observation that the artwork appears to be a unified piece.

What Does Neanderthal Biology Say?

The archaeological record is not the only evidence we have available to us to assess Neanderthals’ capacity for symbolism (and advanced cognitive abilities). Scientists can also glean insight from Neanderthal biology.

As I discuss in Who Was Adam?, comparisons of the genomes of Neanderthals and modern humans reveal important differences in a number of genes related to neural development, suggesting that there are cognitive differences between the two species. Additionally, the fossil remains of Neanderthals indicate that their brain development took a different trajectory than ours after birth. As a result, it doesn’t appear that Neanderthals experienced much of an adolescence (the period when significant brain development takes place in modern humans). Finally, the brain structure of Neanderthals indicates that these creatures lacked the advanced cognitive capacity and hand-eye coordination needed to make art.

On the basis of concerns about the validity of the U-Th method when applied to dating calcite films and Neanderthal brain biology, I remain unconvinced that Neanderthals made cave art, let alone had the capacity to do so. So, to me, it appears as if modern humans are, indeed, the “guilty party.” The entire body of evidence still indicates that they are the ones who painted the walls of caves throughout the world. Though, I doubt either my wife or I will have these early artists scrub down the cave walls as punishment. The cave art is much too precious.



  1. D. L. Hoffmann et al., “U-Th Dating of Carbonate Crusts Reveals Neanderthal Origin of Iberian Cave Art,” Science 359 (February 23, 2018): 912–15, doi:10.1126/science.aap7778.
  2. A. W. G. Pike et al., “U-Series Dating of Paleolithic Art in 11 Caves in Spain,” Science 336 (June 15, 2012): 1409–13, doi:10.1126/science.1219957.
  3. A. W. G. Pike et al., “U-Series Dating of Paleolithic Art.”
  4. Georges Sauvet et al., “Uranium-Thorium Dating Method and Paleolithic Rock Art,” Quaternary International 432 (2017): 86–92, doi:10.1016/j.quaint.2015.03.053.
Reprinted with permission by the author
Original article at:

Rabbit Burrowing Churns Claims about Neanderthal Burials



As a kid, watching cartoons was one of the highlights of my afternoons. As soon as I arrived home from school, I would plop down in front of the TV. Among my favorites were the short features produced by Warner Brothers. What a wonderful cast of characters: Daffy Duck, Sylvester and Tweety, Yosemite Sam, the Tasmanian Devil, the Road Runner and Wile E. Coyote. As much as I loved to watch their shenanigans, none of them compared to the indomitable Bugs Bunny. That “wascally wabbit” (to quote Elmer Fudd) always seemed to create an upheaval everywhere he went.

Recently, a research team from France has come to realize that Bugs Bunny isn’t the only rabbit to make a mess of things. These investigators learned that burrowing rabbits have created an upheaval—literally—at Neanderthal archaeological sites, casting doubt on claims that the hominins displayed advanced sophisticated cognitive abilities.1

Researchers from France unearthed this problem while studying the Regourdou Neanderthal site in Dordogne. Neanderthal bones and stone artifacts, along with animal remains, were recovered from this cave site in 1954. Unfortunately, the removal of the remains by archaeologists was done in a nonscientific manner—by today’s standards.

Based on the arrangement of the Neanderthal remains, lithic artifacts, and cave bear bones at the site, anthropologists initially concluded that one of the Neanderthals found at Regourdou was deliberately buried, indicating that these hominids must have engaged in complex funerary practices. Many anthropologists consider complex funeral activities to reflect one of the most sophisticated examples of symbolic behavior. If so, then Neanderthals must have possessed similar cognitive abilities to modern humans, undermining the scientific case for human exceptionalism and, along with it, casting aspersions on the biblical view of humanity.

Questions about Neanderthal Burials

Yet, more recent analysis of the Regourdou site has raised questions about Neanderthal burial practices. One piece of evidence cited by anthropologists for the funerary burial at this French cave site was the recovery of bear remains associated with a nearly complete Neanderthal specimen. Some anthropologists argued that Neanderthals used the cave bear bones to construct a funerary structure.

But anthropologists have started to question this interpretation. Evidence mounts that this cave site functioned primarily as a den for cave bears, with the accumulation of cave bear bones largely stemming from attritional mortality—not the deliberate activity of Neanderthals.

Rabbits at Regourdou

Anthropologists have also recovered a large quantity of rabbit remains at the Regourdou site. At first, these rabbit bones were taken as evidence that the hominids had the cognitive capacity to hunt and trap small game—something only modern humans were thought to be able to do.

One species found at the Regourdou cave site is the European rabbit (Oryctolagus cuniculus). These rabbits dig interconnected burrows (called warrens) to avoid predation and harsh climatic conditions. Depending on the sediment, warren architecture can be deep and complex.

Because the researchers discovered over 10,000 rabbit bones at the Regourdou site, they became concerned that the burrowing behavior of these creatures may have compromised the integrity of the site. To address this issue, they used radiocarbon dating to age-date the rabbit remains. They discovered that the rabbit bones were significantly younger than the sediments harboring them. They also noted that the skeletal parts, breakage pattern of the bones, and surface modification of the rabbit remains indicate that these creatures died within the warrens due to natural causes, negating the claim that Neanderthals hunted small game. This set of observations indicates that the rabbits burrowed and lived in warrens in the Regourdou site, well after the cave deposits formed.
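
The age mismatch that exposed the problem follows directly from how radiocarbon ages are computed: the less C-14 remaining in a sample, the older it is. A minimal sketch of the conventional calculation (the fraction-of-modern values below are illustrative, not taken from the Regourdou study):

```python
import math

LIBBY_MEAN_LIFE = 8033  # years; conventional radiocarbon mean life

def radiocarbon_age(fraction_modern: float) -> float:
    """Conventional radiocarbon age (years BP) from the fraction of
    modern C-14 remaining in a sample."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# A sample retaining ~1% of its C-14 dates to roughly 37,000 yr BP,
# while one retaining ~55% dates to only a few thousand yr BP. Bones
# much younger than their surrounding sediments are the signature of
# a later intrusion -- here, burrowing rabbits.
old_sediment_age = radiocarbon_age(0.01)
intrusive_bone_age = radiocarbon_age(0.55)
print(round(old_sediment_age), round(intrusive_bone_age))
```

Comparing the two ages side by side is exactly the check the researchers performed: rabbit bones that date far younger than the deposits containing them cannot have been in place when those deposits formed.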

Perhaps of greatest concern associated with this finding is the uncertainty it creates about the integrity of sedimentary layers, because the rabbit burrows cross and perturb several layers, resulting in the mixing of bones and artifacts from one layer to the next. This bioturbation appears to have transported artifacts and bones from the upper layers to the lower layers.

Upheaval of the cave layers caused by the rabbits means that grave goods associated with Neanderthal skeletons may not have been intentionally placed with the body at the time of death. Instead, they may just have happened to wind up next to the hominin remains due to burrowing activity.

Such tumult may not be limited to the Regourdou cave site. These creatures live throughout France and the Iberian Peninsula, raising questions about the influence that the rabbits may have had on the integrity of other archaeological cave sites in France and Spain. For example, it is not hard to envision scenarios in which rabbit burrowing caused mixing at other cave sites, resulting in the accidental association of Neanderthal remains with artifacts initially deposited in upper cave layers made by modern humans who occupied the cave sites after Neanderthals. If so, this association could mistakenly lead anthropologists to conclude that Neanderthals had advanced cognitive abilities, when in fact they did not. While Bugs Bunny’s antics may amuse us, it is no laughing matter to consider the possible impact rabbits may have had on scientific findings.

Only Human Beings Are Exceptional

Even though some anthropologists assert that Neanderthals possessed advanced cognitive abilities like those of modern humans, ongoing scientific scrutiny of the archaeological evidence consistently fails to substantiate those claims. This failure is clearly the case with the Regourdou burial. No doubt, Neanderthals were fascinating creatures. But there is no compelling scientific reason to think that their behavioral capacity threatens human exceptionalism and the notion that human beings were created to bear God’s image.



  1. Maxime Pelletier et al., “Rabbits in the Grave! Consequences of Bioturbation on the Neandertal ‘Burial’ at Regourdou (Montignac-sur-Vézère, Dordogne),” Journal of Human Evolution 110 (September 2017): 1–17, doi:10.1016/j.jhevol.2017.04.001.
Reprinted with permission by the author
Original article at: