New Genetic Evidence Affirms Human Uniqueness

By Fazale Rana – March 4, 2020

It’s a remarkable discovery—and a bit gruesome, too.

It is worth learning a bit about some of its unseemly details because this find may have far-reaching implications that shed light on our origins as a species.

In 2018, a group of locals discovered the remains of a two-year-old male puppy in the frozen mud (permafrost) in the eastern part of Siberia. The remains date to 18,000 years in age. Remarkably, the skeleton, teeth, head, fur, lashes, and whiskers of the specimen are still intact.

Of Dogs and People

The Russian scientists studying this find (affectionately dubbed Dogor) are excited by the discovery. They think Dogor can shed light on the domestication of wolves into dogs. Biologists believe that this transition occurred around 15,000 years ago. Is Dogor a wolf? A dog? Or a transitional form? To answer these questions, the researchers have isolated DNA from one of Dogor’s ribs, which they think will provide them with genetic clues about Dogor’s identity—and clues concerning the domestication process.

Biologists study the domestication of animals because this process played a role in helping to establish human civilization. But biologists are also interested in animal domestication for another reason: they think it will tell us something about our identity as human beings.

In fact, in a separate study, a team of researchers from the University of Milan in Italy used insights about the genetic changes associated with the domestication of dogs, cats, sheep, and cattle to identify genetic features that make human beings (modern humans) stand apart from Neanderthals and Denisovans.1 They conclude that modern humans share some of the same genetic characteristics as domesticated animals, accounting for our unique and distinct facial features (compared to other hominins). They also conclude that our high level of cooperativeness and lack of aggression can be explained by these same genetic factors.

This work in comparative genomics demonstrates that significant anatomical and behavioral differences existed between modern humans and other hominins, supporting the concept of human exceptionalism. Though the University of Milan researchers carried out their work from an evolutionary perspective, I believe their insights can be recast as scientific evidence for the biblical conception of human nature; namely, creatures uniquely made in God’s image.

Biological Changes that Led to Animal Domestication

Biologists believe that during the domestication process, many of the same biological changes took place in dogs, cats, sheep, and cattle. For example, they think that domestication produced mild deficits in neural crest cells. In other words, domesticated animals produce fewer, less active neural crest cells. These stem cells play a role in neural development; thus, neural crest cell defects tend to make animals friendlier and less aggressive. This deficit also impacts physical features, yielding smaller skulls and teeth, floppy ears, and shorter, curlier tails.

Life scientists studying the domestication process have identified several genes of interest. One of these is BAZ1B. This gene plays a role in the maintenance of neural crest cells and controls their migration during embryological development. Presumably, changes in the expression of BAZ1B played a role in the domestication process.

Neural Crest Deficits and Williams Syndrome

As it turns out, there are two genetic disorders in modern humans that involve neural crest cells: Williams-Beuren syndrome (also called Williams syndrome) and Williams-Beuren region duplication syndrome. These genetic disorders involve the deletion or duplication, respectively, of a region of chromosome 7 (7q11.23). This chromosomal region harbors 28 genes. Craniofacial defects and altered cognitive and behavioral traits characterize these disorders. Specifically, people with these syndromes have cognitive limitations, smaller skulls, and elf-like faces, and they display excessive friendliness.

Among the 28 genes impacted by the two disorders is the human version of BAZ1B. This gene codes for a type of protein called a transcription factor. (Transcription factors play a role in regulating gene expression.)

The Role of BAZ1B in Neural Crest Cell Biology

To gain insight into the role BAZ1B plays in neural crest cell biology, the European research team developed induced pluripotent stem cell lines from (1) four patients with Williams syndrome, (2) three patients with Williams-Beuren region duplication syndrome, and (3) four people without either disorder. Then, they coaxed these cells in the laboratory to develop into neural crest cells.

Using a technique called RNA interference, they down-regulated BAZ1B in all three types of neural crest cells. By doing this, the researchers learned that changes in the expression of this gene altered the migration rates of the neural crest cells. Specifically, they discovered that neural crest cells developed from patients with Williams-Beuren region duplication syndrome migrated more slowly than control cells (generated from test subjects without either syndrome) and neural crest cells derived from patients with Williams syndrome migrated more rapidly than control cells.

The discovery that the BAZ1B gene influences neural crest cell migration is significant because these cells have to migrate to precise locations in the developing embryo to give rise to distinct cell types and tissues, including those that form craniofacial features.

Because BAZ1B encodes a transcription factor, altering its expression also alters the expression of the genes under its control. The team discovered that 448 genes were impacted by down-regulating BAZ1B. They learned that many of these impacted genes play a role in craniofacial development. By querying databases of genes that correlate with genetic disorders, researchers also learned that, when defective, some of the impacted genes are known to cause disorders that involve altered facial development and intellectual disabilities.
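
The database-query step described above amounts to intersecting one gene set with another. Here is a toy sketch of that logic; the gene names, disorder descriptions, and catalog are invented for illustration and do not come from the study:

```python
# Toy illustration: cross-reference a set of BAZ1B-affected genes against a
# (hypothetical) catalog mapping genes to known genetic disorders.
impacted_by_baz1b = {"GENE_W", "GENE_X", "GENE_Y", "GENE_Z"}

disorder_catalog = {  # gene -> associated disorder (all names invented)
    "GENE_X": "syndrome with altered facial development",
    "GENE_Z": "disorder involving intellectual disability",
    "GENE_Q": "unrelated metabolic disorder",
}

# Keep only catalog entries whose gene is also affected by BAZ1B.
hits = {gene: disorder for gene, disorder in disorder_catalog.items()
        if gene in impacted_by_baz1b}

print(hits)  # GENE_X and GENE_Z survive the intersection
```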

Lastly, the researchers determined that the BAZ1B protein (again, a transcription factor) targets genes that influence the development of dendrites and axons (structures in neurons that play a role in signal transmission between nerve cells).

BAZ1B Gene Expression in Modern and Archaic Humans

With these findings in place, the researchers wondered if differences in BAZ1B gene expression could account for anatomical and cognitive differences between modern humans and archaic humans—hominins such as Neanderthals and Denisovans. To carry out this query, the researchers compared the genomes of modern humans to Neanderthals and Denisovans, paying close attention to DNA sequence differences in genes under the influence of BAZ1B.

This comparison uncovered differences in the regulatory regions of genes targeted by the BAZ1B transcription factor, including genes that control neural crest cell activities and craniofacial anatomy. In other words, the researchers discovered significant differences in gene regulation between modern humans and Neanderthals and Denisovans. These regulatory differences strongly suggest that anatomical and cognitive differences existed between the groups as well.

Did Humans Domesticate Themselves?

The researchers interpret their findings as evidence for the self-domestication hypothesis—the idea that we domesticated ourselves after the evolutionary lineage that led to modern humans split from the Neanderthal/Denisovan line (around 600,000 years ago). In other words, just as modern humans domesticated dogs, cats, cattle, and sheep, we domesticated ourselves, leading to changes in our anatomical features that parallel changes (such as friendlier faces) in the features of animals we domesticated. Along with these anatomical changes, our self-domestication led to the high levels of cooperativeness characteristic of modern humans.

On one hand, this is an interesting account that does seem to have some experimental support. But on the other, it is hard to escape the feeling that the idea of self-domestication as the explanation for the origin of modern humans is little more than an evolutionary just-so story.

It is worth noting that some evolutionary biologists find this account unconvincing. One is William Tecumseh Fitch III—an evolutionary biologist at the University of Vienna. He is skeptical of the precise parallels between animal domestication and human self-domestication. He states, “These are processes with both similarities and differences. I also don’t think that mutations in one or a few genes will ever make a good model for the many, many genes involved in domestication.”2

Adding to this skepticism is the fact that nobody has anything beyond a speculative explanation for why humans would domesticate themselves in the first place.

Genetic Differences Support the Idea of Human Exceptionalism

Regardless of the mechanism that produced the genetic differences between modern and archaic humans, this work can be enlisted in support of human uniqueness and exceptionalism.

Though the claim of human exceptionalism is controversial, a minority of scientists operating within the scientific mainstream embrace the idea that modern humans stand apart from all other extant and extinct creatures, including Neanderthals and Denisovans. These anthropologists argue that the following suite of capacities uniquely possessed by modern humans accounts for our exceptional nature:

  • symbolism
  • open-ended generative capacity
  • theory of mind
  • capacity to form complex social systems

As human beings, we effortlessly represent the world with discrete symbols. We denote abstract concepts with symbols. And our ability to represent the world symbolically has interesting consequences when coupled with our abilities to combine and recombine those symbols in countless ways to create alternate possibilities. Our capacity for symbolism manifests in the form of language, art, music, and even body ornamentation. And we desire to communicate to other human beings the scenarios we construct in our minds.

But there is more to our interactions with other human beings than a desire to communicate. We want to link our minds together. And we can do this because we possess a theory of mind. In other words, we recognize that other people have minds just like ours, allowing us to understand what others are thinking and feeling. We also have the brain capacity to organize people we meet and know into hierarchical categories, allowing us to form and engage in complex social networks. Forming these relationships requires friendliness and cooperativeness.

In effect, these qualities could be viewed as scientific descriptors of the image of God, if one adopts a resemblance view for the image of God.

This study demonstrates that, at a genetic level, modern humans appear to be uniquely designed to be friendlier, more cooperative, and less aggressive than other hominins—in part accounting for our capacity to form complex hierarchical social structures.

To put it differently, the unique capability of modern humans to form complex social hierarchies no longer needs to be inferred from the fossil and archaeological records. It has been robustly established by comparative genomics in combination with laboratory studies.

A Creation Model Perspective on Human Origins

This study not only supports human exceptionalism but also affirms RTB’s human origins model.

RTB’s biblical creation model identifies hominins such as Neanderthals and Denisovans as animals created by God. These extraordinary creatures possessed enough intelligence to assemble crude tools and even adopt some level of “culture.” However, the RTB model maintains that these hominins were not spiritual creatures. They were not made in God’s image. RTB’s model reserves this status exclusively for Adam and Eve and their descendants (modern humans).

Our model predicts many biological similarities will be found between the hominins and modern humans, but so too will significant differences. The greatest distinction will be observed in cognitive capacity, behavioral patterns, technological development, and culture—especially artistic and religious expression.

The results of this study fulfill these two predictions. Or, to put it another way, the RTB model’s interpretation of the hominins and their relationship to modern humans aligns with “mainstream” science.

But what about the similarities between the genetic fingerprint of modern humans and the genetic changes responsible for animal domestication that involve BAZ1B and genes under its influence?

Instead of viewing these features as traits that emerged through parallel and independent evolutionary histories, the RTB human origins model regards the shared traits as reflecting shared designs. In this case, through the process of domestication, modern humans stumbled upon the means (breeding through artificial selection) to effect genetic changes in wild animals that resemble some of the designed features of our genome that contribute to our unique and exceptional capacity for cooperation and friendliness.

It is true: studying the domestication process does, indeed, tell us something exceptionally important about who we are.


  1. Matteo Zanella et al., “Dosage Analysis of the 7q11.23 Williams Region Identifies BAZ1B as a Major Human Gene Patterning the Modern Human Face and Underlying Self-Domestication,” Science Advances 5, no. 12 (December 4, 2019): eaaw7908, doi:10.1126/sciadv.aaw7908.
  2. Michael Price, “Early Humans Domesticated Themselves, New Genetic Evidence Suggests,” Science (December 4, 2019), doi:10.1126/science.aba4534.

Reprinted with permission by the author

Original article at:

Ancient DNA Indicates Modern Humans Are One-of-a-Kind

By Fazale Rana – February 19, 2020

The wonderful thing about tiggers
Is tiggers are wonderful things!
Their tops are made out of rubber
Their bottoms are made out of springs!
They’re bouncy, trouncy, flouncy, pouncy
Fun, fun, fun, fun, fun!
But the most wonderful thing about tiggers is
I’m the only one!1

With eight grandchildren and counting (number nine will be born toward the end of February), I have become reacquainted with children’s stories. Some of the stories my grandchildren want to hear are new, but many of them are classics. It is fun to see my grandchildren experiencing the same stories and characters I enjoyed as a little kid.

Perhaps my favorite children’s book of all time is A. A. Milne’s (1882–1956) Winnie-the-Pooh. And of all the characters that populated Pooh Corner, my favorite character is the ineffable Tigger—the self-declared one-of-a-kind.

A. A. Milne. Credit: Wikipedia

For many people (such as me), human beings are like Tigger. We are one-of-a-kind among creation. As a Christian, I take the view that we are unique and exceptional because we alone have been created in God’s image.

For many others, the Christian perspective on human nature is unpopular and offensive. Who are we to claim some type of special status? They insist that humans aren’t truly unique and exceptional. We are not fundamentally different from other creatures. If anything, we differ only in degree, not kind. Naturalists and others assert that there is no evidence that human beings bear God’s image. In fact, some would go so far as to claim that creatures such as Neanderthals were quite a bit like us. They maintain that these hominins were “exceptional,” just like us. Accordingly, if we are one-of-a-kind it is because, like Tigger, we have arrogantly declared ourselves to be so, when in reality we are no different from any of the other characters who make their home at Pooh Corner.

Despite this pervasive and popular challenge to human exceptionalism (and the image-of-God concept), there is mounting evidence that human beings stand apart from all extant creatures (such as the great apes) and extinct creatures (such as Neanderthals). This growing evidence can be marshaled to make a scientific case that as human beings we, indeed, are image bearers.

As a case in point, many archeological studies affirm human uniqueness and exceptionalism. (See the Resources section for a sampling of some of this work.) These studies indicate that human beings alone possess a suite of characteristics that distinguish us from all other hominins. I regard these qualities as scientific descriptors of the image of God:

  • Capacity for symbolism
  • Ability for open-ended manipulation of symbols
  • Theory of mind
  • Capacity to form complex, hierarchical social structures

Other studies have identified key differences between the brains of modern humans and Neanderthals. (For a sample of this evidence see the Resources section.) One key difference relates to skull shape. Neanderthals (and other hominins) possessed an elongated skull. In contradistinction, our skull shape is globular. The globularity allows for the expansion of the parietal lobe. This is significant because an expanded parietal lobe explains a number of unique human characteristics:

  • Perception of stimuli
  • Sensorimotor transformation (which plays a role in planning)
  • Visuospatial integration (which provides hand-eye coordination)
  • Imagery
  • Self-awareness
  • Working and long-term memory

Again, I connect these scientific qualities to the image of God.

Now, two recent studies add to the case for human exceptionalism. They involve genetic comparisons of modern humans with both Neanderthals and Denisovans. Through the recovery and sequencing of ancient DNA, we have high-quality genomes for these hominins that we can analyze and compare to the genomes of modern humans.

While the DNA sequences of protein-coding genes in modern human genomes and the genomes of these two extinct hominins are quite similar, both studies demonstrate that gene expression is dramatically different. That difference accounts for anatomical differences between humans and these two hominins and suggests that significant cognitive differences exist as well.

Differences in Gene Regulation

To characterize gene expression patterns in Neanderthals and Denisovans and compare them to modern humans, researchers from Vanderbilt University (VU) used statistical methods to develop a mathematical model that would predict gene expression profiles from the DNA sequences of genomes.2 They built their model using DNA sequences and gene expression data (measured from RNA produced by transcription) for a set of human genomes. To ensure that their model could be used to assess gene expression for Neanderthals and Denisovans, the researchers paid close attention to the gene expression pattern for genes in the human genome that were introduced when modern humans and Neanderthals presumably interbred and compared their expression to human genes that were not of Neanderthal origin.
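
The core idea of predicting expression from sequence can be pictured with a minimal sketch. This is a toy illustration, not the VU team’s actual model (their approach is far more sophisticated); the data here are synthetic, and a simple least-squares fit stands in for their statistical machinery:

```python
# Sketch: learn a mapping from DNA variants to an expression level in modern
# humans, then apply that mapping to an archaic genotype (synthetic data).
import numpy as np

rng = np.random.default_rng(0)

# Training data: alternate-allele counts (0/1/2) at 10 variant sites and a
# measured expression level for one gene across 200 modern-human samples.
n_samples, n_variants = 200, 10
X = rng.integers(0, 3, size=(n_samples, n_variants)).astype(float)
true_weights = rng.normal(0.0, 1.0, n_variants)
y = X @ true_weights + rng.normal(0.0, 0.5, n_samples)  # noisy expression

# Fit a least-squares linear model mapping variants -> expression.
weights, *_ = np.linalg.lstsq(X, y, rcond=None)

# Apply the trained model to an archaic genotype at the same variant sites.
archaic_genotype = rng.integers(0, 3, size=n_variants).astype(float)
predicted_expression = archaic_genotype @ weights
print(f"Predicted archaic expression level: {predicted_expression:.2f}")
```

With enough samples relative to variants, the fitted weights closely recover the underlying effects, which is what lets a model trained on modern humans be applied to archaic sequences.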

The Process of Gene Expression. Credit: Shutterstock

With their model in hand, the researchers analyzed the expression profile for nearly 17,000 genes from the Altai Neanderthal. Their model predicts that 766 genes in the Neanderthal genome had a different expression profile than the corresponding genes in modern humans. As it turns out, the differentially expressed genes in the Neanderthal genomes failed to be incorporated into the human genome after interbreeding took place, suggesting to the researchers that these genes are responsible for key anatomical and physiological differences between modern humans and Neanderthals.

The VU investigators determined that these 766 differentially expressed genes play roles in reproduction, the formation of skeletal structures, and the functioning of the cardiovascular and immune systems.

Then, the researchers expanded their analysis to include two other Neanderthal genomes (from the Vindija and Croatian specimens) and the Denisovan genome. The researchers learned that the gene expression profiles of the three Neanderthal genomes were more similar to one another than they were to either the gene expression patterns of modern human and Denisovan genomes.

This study clearly demonstrates that significant differences existed in the regulation of gene expression in modern humans, Neanderthals, and Denisovans and that these differences account for biological distinctives between the three hominin species.

Differences in DNA Methylation

In another study, researchers from Israel compared gene expression profiles in modern human genomes with those from Neanderthals and Denisovans using a different technique. This method assesses DNA methylation.3 (Methylation of DNA downregulates gene expression, turning genes off.)

Methylation of DNA influences the degradation process for this biomolecule. Because of this influence, researchers can determine the DNA methylation pattern in ancient DNA by characterizing the damage to the DNA fragments isolated from fossil remains.

Using this technique, the researchers measured the methylation pattern for the genomes of two Neanderthals (Altai and Vindija) and a Denisovan and compared these patterns with genomes recovered from the remains of three modern humans, dating to 45,000, 8,000, and 7,000 years in age, respectively. They discovered 588 genes in modern human genomes with a unique DNA methylation pattern, indicating that these genes are expressed differently in modern humans than in Neanderthals and Denisovans. Among the 588 genes, researchers discovered some that influence the structure of the pelvis, facial morphology, and the larynx.
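
The comparison step can be sketched as follows. This is a toy version of the logic (the real analysis works position-by-position across whole genomes with statistical tests); the gene names and methylation values below are invented:

```python
# Toy sketch: flag genes whose methylation levels in modern-human samples
# are clearly separated from the levels seen in archaic samples.
modern = {   # gene -> methylation fraction in each modern-human genome
    "GENE_A": [0.10, 0.12, 0.11],
    "GENE_B": [0.55, 0.52, 0.58],
    "GENE_C": [0.80, 0.78, 0.81],
}
archaic = {  # gene -> methylation fraction in each archaic genome
    "GENE_A": [0.70, 0.72, 0.68],   # methylated in archaics, not moderns
    "GENE_B": [0.54, 0.57, 0.53],   # similar in both groups
    "GENE_C": [0.20, 0.22, 0.25],   # unmethylated in archaics
}

def differentially_methylated(modern, archaic, min_gap=0.2):
    """Return genes whose modern and archaic methylation ranges are
    separated by at least min_gap (a crude stand-in for a real test)."""
    hits = []
    for gene in modern:
        lo_m, hi_m = min(modern[gene]), max(modern[gene])
        lo_a, hi_a = min(archaic[gene]), max(archaic[gene])
        if lo_a - hi_m >= min_gap or lo_m - hi_a >= min_gap:
            hits.append(gene)
    return hits

print(differentially_methylated(modern, archaic))  # ['GENE_A', 'GENE_C']
```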

The researchers think that differences in gene expression may explain the anatomical differences between modern humans and Neanderthals. They also think that this result indicates that Neanderthals lacked the capacity for speech.

What Is the Relationship between Modern Humans and Neanderthals?

These two genetic studies add to the extensive body of evidence from the fossil record, which indicates that Neanderthals are biologically distinct from modern humans. For a variety of reasons, some Christian apologists and Intelligent Design proponents classify Neanderthals and modern humans into a single group, arguing that the two are equivalent. But these two studies comparing gene regulation profiles make it difficult to maintain that perspective.

Modern Humans, Neanderthals, and the RTB Human Origins Model

RTB’s human origins model regards Neanderthals (and other hominins) as creatures made by God, without any evolutionary connection to modern humans. These extraordinary creatures walked erect and possessed some level of intelligence, which allowed them to cobble together tools and even adopt some level of “culture.” However, our model maintains that the hominins were not spiritual beings made in God’s image. RTB’s model reserves this status exclusively for modern humans.

Based on our view, we predict that biological similarities will exist among the hominins and modern humans to varying degrees. In this regard, we consider the biological similarities to reflect shared designs, not a shared evolutionary ancestry. We also expect biological differences because, according to our model, the hominins would belong to different biological groups from modern humans.

We also predict that significant cognitive differences would exist between modern humans and the other hominins. These differences would be reflected in brain anatomy and behavior (inferred from the archeological record). According to our model, these differences reflect the absence of God’s image in the hominins.

The results of these two studies affirm both sets of predictions that flow from the RTB human origins model. The differences in gene regulation between modern humans and Neanderthals are precisely what our model predicts. These differences seem to account for the anatomical differences between Neanderthals and modern humans observed in fossil remains.

The difference in the regulation of genes affecting the larynx is also significant for our model and the idea of human exceptionalism. One of the controversies surrounding Neanderthals relates to their capacity for speech and language. Yet, it is difficult to ascertain from fossil remains if Neanderthals had the anatomical structures needed for the vocalization range required for speech. The differences in the expression profiles for genes that control the development and structure of the larynx in modern humans and Neanderthals suggest that Neanderthals lacked the capacity for speech. This result dovetails nicely with the differences in modern human and Neanderthal brain structure, which suggest that Neanderthals also lacked the neural capacity for language and speech. And, of course, it is significant that there is no conclusive evidence for Neanderthal symbolism in the archeological record.

With these two innovative genetic studies, the scientific support for human exceptionalism continues to mount. And the wonderful thing about this insight is that it supports the notion that as human beings we are the only ones who bear God’s image and can form a relationship with our Creator.


Resources

Behavioral Differences between Humans and Neanderthals

Biological Differences between Humans and Neanderthals

  1. Richard M. Sherman and Robert B. Sherman, composers, “The Wonderful Thing about Tiggers” (song), released December 1968.
  2. Laura L. Colbran et al., “Inferred Divergent Gene Regulation in Archaic Hominins Reveals Potential Phenotypic Differences,” Nature Ecology & Evolution 3 (November 2019): 1598–1606, doi:10.1038/s41559-019-0996-x.
  3. David Gokhman et al., “Reconstructing the DNA Methylation Maps of the Neandertal and the Denisovan,” Science 344, no. 6183 (May 2, 2014): 523–27, doi:10.1126/science.1250368; David Gokhman et al., “Extensive Regulatory Changes in Genes Affecting Vocal and Facial Anatomy Separate Modern from Archaic Humans,” bioRxiv, preprint (October 2017), doi:10.1101/106955.

Reprinted with permission by the author

Original article at:

Cave Art Tells the Story of Human Exceptionalism

By Fazale Rana – February 5, 2020

Comic books intrigue me. They are a powerful storytelling vehicle. The combination of snapshot-style imagery, along with narration and dialogue, allows the writer and artist to depict action and emotion in a way that isn’t possible using the written word alone. Comic books make it easy to depict imaginary worlds. And unlike film, comics engage the reader in a deeper, more personal way. The snapshot format requires the reader to make use of their imagination to fill in the missing details. In this sense, the reader becomes an active participant in the storytelling process.


Figure 1: Speech Bubbles on a Comic Strip Background


In America, comics burst onto the scene in the 1930s, but the oldest comics (at least in Europe) trace their genesis to Rodolphe Töpffer (1799-1846). Considered by many to be “the father of comics,” Töpffer was a Swiss teacher, artist, and author who became well-known for his illustrated books—works that bore similarity to modern-day comics.


Figure 2: Rodolphe Töpffer, Self-Portrait, 1840

Despite his renown, Töpffer wasn’t the first comic book writer and artist. That claim to fame belongs to long-forgotten artists from prehistory. In fact, recent work by Australian and Indonesian researchers indicates that the use of comics as a storytelling device dates to earlier than 44,000 years ago.


These investigators discovered and characterized cave art from a site on the Indonesian island of Sulawesi that depicts a pig and buffalo hunt. Researchers interpret this mural to be the oldest known recorded story1—a comic book story on a cave wall.

This find, and others like it, provide important insight into our origins as human beings. From my perspective as a Christian apologist, this discovery is important for another reason. I see it as affirming the biblical teaching about humanity: God made human beings in his image.

The Find

Leading up to this discovery, archeologists had already identified and dated art on cave walls in Sulawesi and Borneo. This art, which includes hand stencils and depictions of animals, dates to older than 40,000 years in age and is highly reminiscent of the cave art of comparable age found in Europe.


Figure 3: Hand Stencils from a Cave in Southern Sulawesi


In December 2017, an archeologist from Indonesia discovered the hunting mural in a cave (now called Leang Bulu’ Sipong 4) in the southern part of Sulawesi. The panel presents the viewer with an ensemble of pigs and small buffalo (called anoas), endemic to Sulawesi. Most intriguing about the artwork is the depiction of smaller human-like figures with animal features such as tails and snouts. In some instances, the figures appear to be holding spears and ropes. Scholars refer to these human-animal depictions as therianthropes.


Figure 4: Illustration of a Pig Deer Found in a Cave in Southern Sulawesi

Dating the Find

Dating cave art can be notoriously difficult. One approach is to directly date the charcoal pigments used to make the art using radiocarbon methods. Unfortunately, the dates measured by this technique can be suspect because the charcoal used to make the art can be substantially older than the artwork itself.

Recently, archeologists have developed a new approach to date cave art. This method measures the levels of uranium and thorium in calcite deposits that form on top of the artwork. Calcite is continuously deposited on cave walls due to hydrological activity in the cave. As water runs down the cave walls, calcium carbonate precipitates onto the cave wall surface. Trace amounts of radioactive uranium are included in the calcium carbonate precipitates. This uranium decays into thorium, hence the ratio of uranium to thorium provides a measure of the calcite deposit’s age and, in turn, yields a minimum age for the artwork.
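
In simplified form, that relationship can be written down directly. Assuming no initial thorium in the deposit, the thorium-to-uranium activity ratio R grows with time as R = 1 − e^(−λt), so t = −ln(1 − R)/λ. This sketch ignores the detrital-thorium and uranium-isotope corrections that real U-series dating must apply, and the ratio value used below is hypothetical:

```python
# Simplified uranium-thorium age calculation: age from the 230Th/234U
# activity ratio, assuming no initial thorium (corrections omitted).
import math

TH230_HALF_LIFE_YR = 75_584                      # half-life of 230Th (years)
LAMBDA_230 = math.log(2) / TH230_HALF_LIFE_YR    # decay constant (1/years)

def uth_age_years(activity_ratio):
    """Minimum age (years) implied by a measured 230Th/234U activity ratio."""
    if not 0.0 <= activity_ratio < 1.0:
        raise ValueError("activity ratio must be in [0, 1)")
    return -math.log(1.0 - activity_ratio) / LAMBDA_230

# A hypothetical ratio of ~0.33 corresponds to roughly 44,000 years.
print(f"{uth_age_years(0.33):,.0f} years")
```

Note that a ratio of exactly 0.5 returns one half-life (about 75,600 years), a useful sanity check on the formula.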

To be clear, this dating method has been the subject of much controversy. Some archeologists argue that the technique is unreliable because the calcite deposits are an open system. Once a calcite deposit forms, water continues to flow over its surface, dissolving part of the deposit and, along with it, trace amounts of uranium and thorium. Because uranium is more soluble than thorium, this leaching leaves behind an artificially high level of thorium relative to uranium. So, when the uranium-thorium ratio is measured, the cave art may appear older than it actually is.

To ensure that the method worked as intended, the researchers dated only calcite deposits that weren’t porous (porosity is a sign that a deposit has been partially re-dissolved), and they made multiple measurements from the surface of each deposit toward its interior. If this sequence of measurements produced a chronologically consistent set of ages, the researchers felt comfortable with the integrity of the calcite samples. Using this method, the researchers determined that the cave painting of the pig and buffalo hunt dates to older than 43,900 years.
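
The consistency check amounts to a simple monotonicity test: because the calcite layer nearest the artwork formed first, ages sampled from the outer surface toward the interior should never decrease. The age values below are invented for illustration:

```python
# Sanity check on a calcite deposit: ages measured from the outer surface
# toward the artwork should increase (or at least never decrease).
def chronologically_consistent(ages_surface_to_interior):
    """True if ages never decrease going from surface to interior."""
    return all(outer <= inner
               for outer, inner in zip(ages_surface_to_interior,
                                       ages_surface_to_interior[1:]))

good_profile = [12_000, 21_500, 35_000, 43_900]  # monotonically older
bad_profile = [12_000, 40_000, 28_000, 43_900]   # reversal: re-dissolved?

print(chronologically_consistent(good_profile))  # True
print(chronologically_consistent(bad_profile))   # False
```

A reversal in the sequence would flag a sample as compromised, so its innermost (oldest) age would not be trusted as a minimum age for the art.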

Corroborating evidence gives the archeologists added confidence in this result. For example, independently dated archeological finds at the Sulawesi cave site indicate that modern humans were in the caves between 40,000 and 50,000 years ago, in agreement with the measured age of the cave art.

The research team also noted that the animals and the therianthropes in the mural appear to have been created at the same time. This insight is important because therianthropes don’t appear in the cave paintings found in Europe until around 10,000 years ago, so the therianthropes could conceivably have been added to the painting millennia after the animals were painted onto the cave wall. However, the researchers don’t think this is the case, for at least three reasons. First, the same artistic style was used to depict the animals and therianthropes. Second, the technique and pigment used to create the figures are the same. And third, the degree of weathering is the same throughout the panel. None of these features would be expected if the therianthropes were a late addition to the mural.

Interpreting the Find

The researchers find the presence of therianthropes in 44,000+ year-old cave art significant. It indicates that humans in Sulawesi not only possessed the capacity for symbolism, but, more importantly, had the ability to conceive of things that did not exist in the material world. That is to say, they had a sense of the supernatural.

Some archeologists believe that the cave art reflects shamanic beliefs and visions. If so, the therianthropes in the painting may represent spirit animal helpers who ensured the success of the hunt. The size of the therianthropes supports this interpretation. The animal-human hybrids are depicted as much smaller than the pigs and buffalo, even though, on the island of Sulawesi, both the pig and buffalo species in question were actually much smaller than modern humans. This inverted scale suggests the figures do not depict ordinary human hunters.

Because this artwork depicts a hunt involving therianthropes, the researchers see rich narrative content in the display. It seems to tell a story that likely reflected the mythology of the Sulawesi people. You could say it’s a comic book on a cave wall.

Relationship between Cave Art in Europe and Asia

Cave art in Europe has been well-known and carefully investigated by archeologists and anthropologists for nearly a century. Now archeologists have access to a growing archeological record in Asia.

Art found at these sites is of the same quality and character as the European cave art. However, it is older. This discovery means that modern humans most likely had the capacity to make art even before beginning their migrations around the world from out of Africa (around 60,000 years ago).

As noted, the discovery of 44,000+ year-old therianthropes in Sulawesi is intriguing because these types of figures don’t appear in cave paintings in Europe until around 10,000 years ago. However, archeologists have discovered the lion-man statue in a cave site in Germany. This artifact, which depicts a lion-human hybrid, dates to around 40,000 years in age. In other words, therianthropes were part of the artwork of the first Europeans, which indicates that modern humans in Europe, too, had the capacity to envision imaginary worlds and held beliefs about a supernatural realm.

Capacity for Art and the Image of God

For many people, our ability to create and contemplate art serves as a defining feature of humanity—a quality that reflects our capacity for sophisticated cognitive processes. So, too, does our capacity for storytelling. As humans, we seem to be obsessed with both. Art and storytelling are manifestations of symbolism and open-ended generative capacity. Through art (as well as music and language), we express and communicate complex ideas and emotions. We accomplish this feat by representing the world—and even ideas—with symbols. And we can manipulate symbols, embedding them within one another to create alternate possibilities.

As a Christian, I believe that our capacity to make art and to tell stories is an outworking of the image of God. As such, the appearance of art (as well as other artifacts that reflect our capacity for symbolism) serves as a diagnostic for the image of God in the archeological record. That record provides the means to characterize the mode and tempo of the appearance of behaviors that reflect the image of God. If the biblical account of human origins is true, then I would expect artistic expression to be unique to modern humans and to appear at the same time that we make our first appearance as a species.

So, when did art (and symbolic capacity) first appear? Did art emerge suddenly? Did it appear gradually? Is artistic expression unique to human beings or did other hominins, such as Neanderthals, produce art too? Answers to these questions are vital to our case for human exceptionalism and, along with it, the image of God.

When Did the Capacity for Art First Appear?

Again, the simultaneous appearance of cave art in Europe and Asia indicates that the capacity for artistic expression (and, hence, symbolism) dates back to the time in prehistory before humans began to migrate around the world from out of Africa (around 60,000 years ago). This conclusion gains support from the recent discovery of a silcrete flake in a layer of the Blombos Cave dated to about 73,000 years ago. (The Blombos Cave is located around 150 miles east of Cape Town, South Africa.) A portion of an abstract drawing is etched into this flake.2

Linguist Shigeru Miyagawa believes that artistic expression emerged in Africa earlier than 125,000 years ago. Archeologists have discovered rock art produced by the San people that dates to 72,000 years ago. This art shares certain elements with European cave art. Because the San diverged from the modern human lineage around 125,000 years ago, the ancestral people groups that gave rise to both lines must have possessed the capacity for artistic expression before that time.3

It is also significant that the globular brain shape of modern humans first appears in the fossil record around 130,000 years ago. As I have written previously, globular brain shape allows expansion of the parietal lobe, which is responsible for many of our capacities:

  • Perception of stimuli
  • Sensorimotor transformation (which plays a role in planning)
  • Visuospatial integration (which provides hand-eye coordination needed for making art)
  • Imagery
  • Self-awareness
  • Working and long-term memory

In other words, the evidence indicates that our capacity for symbolism emerged at the time that our species first appears in the fossil record. Some archeologists claim that Neanderthals displayed the capacity for symbolism as well. If this claim proves true, then human beings don’t stand apart from other creatures. We aren’t special.

Did Neanderthals Have the Capacity to Create Art?

Claims of Neanderthal artistic expression abound in popular literature and appear in scientific journals. However, a number of studies question these claims. When taken as a whole, the evidence indicates that Neanderthals were cognitively inferior to modern humans.

So, when the evidence is considered as a whole, only human beings (modern humans) possess the capability for symbolism, open-ended generative capacity, and theory of mind—in my view, scientific descriptors of the image of God. The archeological record affirms the biblical view of human nature. It is also worth noting that the origin of our symbolic capacity seems to arise at the same time that modern humans appear in the fossil record, an observation I would expect given the biblical account of human origins.

Like the comics that intrigue me, this narrative resonates on a personal level. It seems as if the story told in the opening pages of the Old Testament is true.


Resources

Cave Art and the Image of God

The Modern Human Brain

Could Neanderthals Make Art?

  1. Maxime Aubert et al., “Earliest Hunting Scene in Prehistoric Art,” Nature 576 (December 11, 2019): 442–45, doi:10.1038/s41586-019-1806-y.
  2. Christopher S. Henshilwood et al., “An Abstract Drawing from the 73,000-Year-Old Levels at Blombos Cave, South Africa,” Nature 562 (September 12, 2018): 115–18, doi:10.1038/s41586-018-0514-3.
  3. Shigeru Miyagawa, Cora Lesure, and Vitor A. Nóbrega, “Cross-Modality Information Transfer: A Hypothesis about the Relationship among Prehistoric Cave Paintings, Symbolic Thinking, and the Emergence of Language,” Frontiers in Psychology 9 (February 20, 2018): 115, doi:10.3389/fpsyg.2018.00115.

Reprinted with permission by the author

Original article at:

Primate Thanatology and the Case for Human Exceptionalism


By Fazale Rana – September 18, 2019

I will deliver this people from the power of the grave;
I will redeem them from death.
Where, O death, are your plagues?
Where, O grave, is your destruction?

Hosea 13:14

It was the first time someone I knew died. I was in seventh grade. My classmate’s younger brother and two younger sisters perished in a fire that burned his family’s home to the ground. We lived in a small rural town in West Virginia at the time. Everyone knew each other and the impact of that tragedy reverberated throughout the community.

I was asked to be a pallbearer at the funeral. To this day, I remember watching my friend’s father, with a cast on one arm and another on one of his legs, hobble up to each of the little caskets to touch them one last time, sobbing uncontrollably, right before we lifted and carried the caskets to the waiting hearses.

Death is part of life and our reaction to death is part of what makes us human. But, are humans unique in this regard?

Funerary Practices

Human responses to death include funerary practices—ceremonies that play an integral role in the final disposition of the body of the deceased.

Anthropologists who study human cultures see funerals as providing important scientific insight into human nature. These scientists define funerals as cultural rituals designed to honor, remember, and celebrate the life of those who have died. Funerals provide an opportunity for people to express grief, mourn loss, offer sympathy, and support the bereaved. Funerals also often serve a religious purpose that, depending on the faith tradition, includes praying for the person who has died and helping his or her soul transition to the afterlife or to reincarnation.

Funerary Practices and Human Exceptionalism

For many anthropologists, human funerary practices are an expression of our capacities for:

  • symbolism
  • open-ended generative manipulation of symbols
  • theory of mind
  • complex, hierarchical social interactions

Though the idea of human exceptionalism is controversial within anthropology today, a growing minority of anthropologists argue that the combination of these qualities sets us apart from other creatures. They make us unique and exceptional.

As a Christian, I view this set of qualities as scientific descriptors of the image of God. That being the case, then, from my vantage point, human funerary practices (along with language, music, and art) are part of the body of evidence that we can marshal to make the case that human beings uniquely bear God’s image.

What about Neanderthals?

But are human beings really unique and exceptional?

Didn’t Neanderthals bury their dead? Didn’t these hominins engage in funerary practices just like modern humans do?

If the answer to these questions is yes, then for some people it undermines the case for human uniqueness and exceptionalism and, along with it, the scientific case for the image of God. If Neanderthal funerary practices flow out of the capacity for symbolism, open-ended generative capacity, etc., then it means that Neanderthals must have been like us. They must have been exceptional, too, and humans don’t stand apart from all other creatures on Earth, as the Scriptures teach.

Did Neanderthals Bury Their Dead?

But could these notions about Neanderthal exceptionalism be premature? Although it is widely believed that Neanderthals buried their dead in a ritualistic manner, and although this claim is attested in the scientific literature, a growing body of archeological evidence challenges this view.

Many anthropologists question whether Neanderthal burials were in fact ritualistic. (If they weren’t, then it most likely indicates that these hominins didn’t have a concept of the afterlife—a concept that requires symbolism and open-ended generative capacities.) Others go so far as to question whether Neanderthals buried their dead at all. (For an in-depth discussion of the scientific challenges to Neanderthal burials, see the Resources section below.)

Were Neanderthal Burials an Evolutionary Precursor to Human Funerary Practices?

It is not unreasonable to think that these hominins may well have disposed of corpses and displayed some type of response when members of their group died. Over the centuries, keen observers (including primatologists, most recently) have documented nonhuman primates inspecting, protecting, retrieving, carrying, and dragging the dead bodies of members of their groups.1 In light of these observations, it makes sense to think that Neanderthals may have done something similar.

While it doesn’t appear that Neanderthals responded to death in the same way we do, it is tempting (within the context of the evolutionary paradigm) to view Neanderthal behavior as an evolutionary stepping-stone to the funerary practices of modern humans.

But, is this transitional view the best explanation for Neanderthal burials—assuming that these hominins did, indeed, dispose of group members’ corpses? Research in thanatology (the study of dying and death) among nonhuman primates holds the potential to shed light on this question.

The Nonhuman Primate Response to Death

Behavioral evolution researchers André Gonçalves and Susana Carvalho recently reviewed studies in primate thanatology—categorizing and interpreting the ways these creatures respond to death. In the process, they sought to explain the role the death response plays among various primate groups.


Figure 1: Monkey Sitting over the Body of a Deceased Relative. Image credit: Shutterstock

When characterizing the death response of nonhuman primates, Gonçalves and Carvalho group the behaviors of these creatures into two categories: (1) responses to infant deaths and (2) responses to adult deaths.

In most primate taxa (classified groups), when an infant dies the mother will carry the dead baby for days, often grooming the corpse and swatting away flies, before eventually abandoning it. Depending on the taxon, young females may carry the infant’s remains for a few days after the mother abandons them. Most other members of the group ignore the corpse. At times, they will actively avoid both mother and corpse when the stench becomes overwhelming.


Figure 2: Baboon Mother with a Child. Image credit: Shutterstock

The death of an adult member of the group tends to elicit a much more pervasive response than does the death of an infant. The specific nature of the response depends upon the taxon and on other factors, such as: (1) the bond between individual members of the group and the deceased; (2) the social status of the deceased; and (3) the group structure of the particular taxon. Typically, the closer the bond between the deceased and a group member, the longer the duration of that member’s death response. The same is true if the deceased was a high-ranking member of the group.

Often the death response includes vocalizations that connote alarm and distress. Depending on the taxon, survivors may hit and pull at the corpse, as if trying to rouse it. Other times, it appears that survivors hit the corpse out of frustration. Sometimes group members will sniff at the corpse or peer at it. In some taxa, survivors will groom the corpse or stroke it gently, while swatting away flies. In other taxa, survivors will stand vigil over the corpse, guarding it from scavengers.

In some instances, survivors return to the corpse and visit it for days. After the corpse is disposed of, group members may continue to visit the site for quite some time. In other taxa, group members may avoid the death site. Both behaviors indicate that group members understand that an event of great importance to the group took place at the site where a member died.

Are Humans and Nonhuman Primates Different in Degree? Or Kind?

It is clear that nonhuman primates have an awareness of death and, for some primate taxa, it seems as if members of the group experience grief. Some anthropologists and primatologists see this behavior as humanlike. It’s easy to see why. We are moved by the anguish and confusion these creatures seem to experience when one of their group members dies.

For the most part, these scientists would agree that the human response to death is more complex and sophisticated. Yet, they see human behavior as differing only in degree rather than kind when compared to other primates. Accordingly, they interpret primate death awareness as an evolutionary antecedent to the sophisticated funerary practices of modern humans, with Neanderthal behavior part of the trajectory. And for this reason, they maintain that human beings really aren’t unique or exceptional.

The Trouble with Anthropomorphism

One problem with this conclusion (even within an evolutionary framework) is that it fails to account for the human tendency toward anthropomorphism. As part of our human nature, we possess theory of mind. We recognize that other human beings have minds like ours, and because of this capability we can infer what other people are thinking and feeling. But we can’t turn this feature on and off. As a result, we also apply theory of mind to animals and inanimate objects, attributing humanlike thoughts and motivations to them even though they don’t actually possess these qualities.

British ethologist Marian Stamp Dawkins argues in her book Why Animals Matter that scientists studying animal behavior fall victim to the tendency to anthropomorphize just as easily as the rest of us. Too often, researchers interpret experimental results from animal behavioral studies and from observations of animal behavior in captivity and the wild in terms of human behavior. When they do, these researchers ascribe human mental experiences—thoughts and feelings—to animals. Dawkins points out that when investigators operate this way, it leads to untestable hypotheses because we can never truly know what occurs in animal minds. Moreover, Dawkins argues that we tend to prefer anthropomorphic interpretations to other explanations. She states, “Anthropomorphism tends to make people go for the most human-like explanation and ignore the other less exciting ones.”2

A lack of awareness of our tendency toward anthropomorphism raises questions about the all-too-common view that the death response of nonhuman primates—and Neanderthals—is humanlike and an evolutionary antecedent to modern human funerary practices. This is especially true in light of the explanation offered by Gonçalves and Carvalho for the death response in primates.

The two investigators argue that a mother’s response to the death of her infant is actually maladaptive (from an evolutionary perspective). Carrying around a dead infant and caring for it is energetically costly and hinders the mother’s locomotion; both consequences render her vulnerable to predators. The pair explain this behavior by arguing that it falls on the continuum of caretaking behavior and can be seen as a trade-off. In other words, a nonhuman primate mother with a strong instinct to care for her offspring will ensure her infant’s survival, but if the infant dies, the instinct is so strong that she will continue to care for it after its death.

Gonçalves and Carvalho also point out that the death response toward adult members of the group plays a role in establishing new group dynamics. Depending on the primate taxon, the death of a member shifts the group’s hierarchical structure. This being the case, it seems reasonable to think that the death response helps group members adjust to the new structure as survivors take on new positions in the hierarchy.

Finally, as Dawkins argues, we can’t know what takes place in the minds of animals. Therefore, we can’t legitimately attribute human mental experiences to animals. So, while it may seem to us as if some nonhuman primates experience grief as part of the death response, how do we know that this is actually the case? Evidence for grief often consists of loss of appetite and increased vocalizations. However, though these changes occur in response to the death of a group member, there may be other explanations for these behaviors that have nothing to do with grief at all.

Death Response in Nonhuman Primates and Neanderthals

Study of primate thanatology also helps us to put Neanderthal burial practices (assuming that these hominins buried dead group members) into context. Often, when anthropologists interpret Neanderthal burials (from an evolutionary perspective), they are comparing these practices to human funerary practices. This comparison makes it seem like Neanderthal burials are part of an evolutionary trajectory toward modern human behavior and capabilities.

But what if the death response of nonhuman primates is factored into the comparison? When we add a second endpoint, we find that the Neanderthal response to death clusters more closely with the responses displayed by nonhuman primates than with those of modern humans. And as remarkable as the death response of nonhuman primates may be, it is categorically different from modern human funerary practices. To put it another way, modern human funerary practices reflect our capacity for symbolism, open-ended manipulation of symbols, theory of mind, etc. In contrast, the death response of nonhuman primates and hominins, such as Neanderthals, seems to serve utilitarian purposes. So, it isn’t the presence or absence of the death response that determines our exceptional nature. Instead, it is a death response shaped by our capacity for symbolism and open-ended generativity that highlights our exceptional nature.

Modern humans really do seem to stand apart compared to all other creatures in a way that aligns with the biblical claim that human beings uniquely possess and express the image of God.

RTB’s biblical creation model for human origins, described in Who Was Adam?, views hominins such as Neanderthals as creatures created by God’s divine fiat that possess intelligence and emotional capacity. These animals were able to employ crude tools and even adopt some level of “culture,” much like baboons, gorillas, and chimpanzees. But they were not spiritual beings made in God’s image. That position—and all of the intellectual, relational, and symbolic capabilities that come with it—remains reserved for modern humans alone.

Resources for Further Exploration

Did Neanderthals Bury Their Dead?

Nonhuman Primate Behavior

Problem-Solving in Animals and Human Exceptionalism

  1. André Gonçalves and Susana Carvalho, “Death among Primates: A Critical Review of Nonhuman Primate Interactions towards Their Dead and Dying,” Biological Reviews 94, no. 4 (April 4, 2019), doi:10.1111/brv.12512.
  2. Marian Stamp Dawkins, Why Animals Matter: Animal Consciousness, Animal Welfare, and Human Well-Being (New York: Oxford University Press, 2012), 30.

Reprinted with permission by the author

Original article at:

Does Transhumanism Refute Human Exceptionalism? A Response to Peter Clarke


I just finished binge-watching Altered Carbon. Based on the 2002 science fiction novel written by Richard K. Morgan, this Netflix original series is provocative, to say the least.

Altered Carbon takes place in the future, where humans can store their personalities as digital files in devices called stacks. These disc-like devices are implanted at the top of the spinal column. When people die, their stacks can be removed from their bodies (called sleeves) and stored indefinitely until they are re-sleeved—if and when another body becomes available.

In this world, people who possess extreme wealth can live indefinitely, without ever having to spend any time in storage. Referred to as Meths (after the biblical figure Methuselah, who lived 969 years), the wealthy have the financial resources to secure a continual supply of replacement bodies through cloning. Their wealth also affords them the means to back up their stacks once a day, storing the data in a remote location in case their stacks are destroyed. In effect, Meths use technology to attain a form of immortality.

Forthcoming Posthuman Reality?

The world of Altered Carbon is becoming a reality right before our eyes. Thanks to recent advances in biotechnology and bioengineering, the idea of using technology to help people live indefinitely no longer falls under the purview of science fiction. Emerging technologies such as CRISPR-Cas9 gene editing and brain-computer interfaces offer hope to people suffering from debilitating diseases and injuries. They can also be used for human enhancements—extending our physical, intellectual, and psychological capabilities beyond natural biological limits.

These futuristic possibilities fuel a movement known as transhumanism. After residing on the fringes of the academy and culture for several decades, the movement has now gone mainstream, in the ivory tower and on the street alike. Sociologist James Hughes describes the transhumanist vision this way in his book Citizen Cyborg:

“In the twenty-first century the convergence of artificial intelligence, nanotechnology and genetic engineering will allow human beings to achieve things previously imagined only in science fiction. Lifespans will extend well beyond a century. Our senses and cognition will be enhanced. We will gain control over our emotions and memory. We will merge with machines, and machines will become more like humans. These technologies will allow us to evolve into varieties of “posthumans” and usher us into a “transhuman” era and society. . . . Transhuman technologies, technologies that push the boundaries of humanism, can radically improve our quality of life, and . . . we have a fundamental right to use them to control our bodies and minds. But to ensure these benefits we need to democratically regulate these technologies and make them equally available in free societies.”1


Figure 1: The transhumanism symbol. Image credit: Wikimedia Commons

In short, transhumanists want us to take control of our own evolution, transforming human beings into posthumans and in the process creating a utopian future that carves out a path to immortality.

Depending on one’s philosophical or religious perspective, transhumanists’ vision and the prospects of a posthuman reality can bring excitement or concern or a little bit of both. Should we pursue the use of technology to enhance ourselves, transcending the constraints of our biology? What role should these emerging biotechnologies play in shaping our future? What are the boundaries for developing and using these technologies? Should there be any boundaries?2

All of these questions revolve around a central question: Who are we as human beings?

Are Humans Exceptional?

Prior to the rising influence of transhumanism, the answer to this question followed along one of two lines. For people who hold to a Judeo-Christian worldview, human beings are exceptional, standing apart from all other creatures on the planet. Accordingly, our exceptional nature results from the image of God. As image bearers, human beings have infinite worth and value.

On the other hand, those influenced by the evolutionary paradigm maintain that human beings are nothing more than animals—differing in degree, not kind, from other creatures. In fact, many who hold this view of humanity find the notion of human exceptionalism repugnant. In their view, to elevate the value of human beings above that of other creatures constitutes speciesism and reflects an unjustifiable arrogance.

And now transhumanism enters the fray. People on both sides of the controversy about human nature and identity argue that transhumanism brings an end, once and for all, to any notion of human exceptionalism.

One such person is Peter Clarke. In an article published on the Areo website entitled “Transhumanism and the Death of Human Exceptionalism,” Clarke says:

“As a philosophical movement, transhumanism advocates for improving humanity through genetic modifications and technological augmentations, based upon the position that there is nothing particularly sacred about the human condition. It acknowledges up front that our bodies and minds are riddled with flaws that not only can but should be fixed. Even more radically, as the name implies, transhumanism embraces the potential of one day moving beyond the human condition, transitioning our sentience into more advanced forms of life, including genetically modified humans, superhuman cyborgs, and immortal digital intelligences.”3

On the other side of the aisle is Wesley J. Smith of the Discovery Institute. In his article “Transhumanist Bill of Wrongs,” Smith writes:

“Transhumanism would shatter human exceptionalism. The moral philosophy of the West holds that each human being is possessed of natural rights that adhere solely and merely because we are human. But transhumanists yearn to remake humanity in their own image—including as cyborgs, group personalities residing in the Internet Cloud, or AI-controlled machines. That requires denigrating natural man as unexceptional to justify our substantial deconstruction and redesign.”4

In other words, transhumanism highlights the notion that our bodies, minds, and personalities are inherently flawed and we have a moral imperative, proponents say, to correct these flaws. But this view denigrates humanity, opponents say, and with it the notion of human exceptionalism. For Clarke, this nonexceptional perspective is something to be celebrated. For Smith, transhumanism is of utmost concern and must be opposed.

Evidence of Exceptionalism

While I am sympathetic to Smith’s concern, I take a different perspective. I find that transhumanism provides one of the most powerful pieces of evidence for human exceptionalism—and, along with it, the image of God.

In my forthcoming book (coauthored with Ken Samples), Humans 2.0, I write:

“Ironically, progress in human enhancement technology and the prospects of a posthuman future serve as one of the most powerful arguments for human exceptionalism and, consequently, the image of God. Human beings are the only species that exists—or that has ever existed—that can create technologies to enhance our capabilities beyond our biological limits. We alone work toward effecting our own immortality, take control of evolution, and look to usher in a posthuman world. These possibilities stem from our unique and exceptional capacity to investigate and develop an understanding of nature (including human biology) through science and then turn that insight into technology.”5

Our ability to carry out the scientific enterprise and develop technology stems from four qualities that a growing number of anthropologists and primatologists think are unique to humans:

  • symbolism
  • open-ended generative capacity
  • theory of mind
  • our capacity to form complex social networks

From my perspective as a Christian, these qualities stand as scientific descriptors of the image of God.

As human beings, we effortlessly represent the world with discrete symbols. We denote abstract concepts with symbols. And our ability to represent the world symbolically has interesting consequences when coupled with our abilities to combine and recombine those symbols in a nearly infinite number of ways to create alternate possibilities.

Human capacity for symbolism manifests in the form of language, art, music, and even body ornamentation. And we desire to communicate the scenarios we construct in our minds with other human beings.

For anthropologists and primatologists who think that human beings differ in kind—not degree—from other animals, these qualities demarcate us from the great apes and Neanderthals. The separation becomes most apparent when we consider the remarkable technological advances we have made during our tenure as a species. Psychologist Thomas Suddendorf puts it this way:

“We reflect on and argue about our present situation, our history, and our destiny. We envision wonderful harmonious worlds as easily as we do dreadful tyrannies. Our powers are used for good as they are for bad, and we incessantly debate which is which. Our minds have spawned civilizations and technologies that have changed the face of the Earth, while our closest living animal relatives sit unobtrusively in their remaining forests. There appears to be a tremendous gap between human and animal minds.”6

Moreover, no convincing evidence exists that leads us to think that Neanderthals shared the qualities that make us exceptional. Neanderthals—who first appear in the fossil record around 250,000 to 200,000 years ago and disappear around 40,000 years ago—existed on Earth longer than modern humans have. Yet our technology has progressed exponentially, while Neanderthal technology remained largely static.

According to paleoanthropologist Ian Tattersall and linguist Noam Chomsky (and their coauthors):

“Our species was born in a technologically archaic context, and significantly, the tempo of change only began picking up after the point at which symbolic objects appeared. Evidently, a new potential for symbolic thought was born with our anatomically distinctive species, but it was only expressed after a necessary cultural stimulus had exerted itself. This stimulus was most plausibly the appearance of language. . . . Then, within a remarkably short space of time, art was invented, cities were born, and people had reached the moon.”7

In other words, the evolution of human technology signifies that there is something special—exceptional—about us as human beings. In this sense, transhumanism highlights our exceptional nature precisely because the prospects for controlling our own evolution stem from our ability to advance technology.

To be clear, transhumanism poses an existential risk for humanity. Unquestionably, it has the potential to strip human beings of dignity and worth. But, ironically, transhumanism is possible only because we are exceptional as human beings.

Responsibility as the Crown of Creation

Ultimately, our exceptional nature demands that we thoughtfully deliberate on how to use emerging biotechnologies to promote human flourishing, while ensuring that no human being is exploited or marginalized by these technologies. It also means that we must preserve our identity as human beings at all costs.

It is one thing to enjoy contemplating a posthuman future by binge-watching a sci-fi TV series. But, it is another thing altogether to live it out. May we be guided by ethical wisdom to live well.


  1. James Hughes, Citizen Cyborg: Why Democratic Societies Must Respond to the Redesigned Humans of the Future (Cambridge, MA: Westview Press, 2004), xii.
  2. Ken Samples and I take on these questions and more in our book Humans 2.0, due to be published in July of 2019.
  3. Peter Clarke, “Transhumanism and the Death of Human Exceptionalism,” Areo (March 6, 2019),
  4. Wesley J. Smith,“Transhumanist Bill of Wrongs,” Discovery Institute (October 23, 2018),
  5. Fazale Rana with Kenneth Samples, Humans 2.0: Scientific, Philosophical, and Theological Perspectives on Transhumanism (Covina, CA: RTB Press, 2019) in press.
  6. Thomas Suddendorf, The Gap: The Science of What Separates Us from Other Animals (New York: Basic Books, 2013), 2.
  7. Johan J. Bolhuis et al., “How Could Language Have Evolved?” PLoS Biology 12, no. 8 (August 26, 2014): e1001934, doi:10.1371/journal.pbio.1001934.

Reprinted with permission by the author
Original article at:

Timing of Neanderthals’ Disappearance Makes Art Claims Unlikely


Alibi. In Latin, it literally means “somewhere else.”

Legal experts consider an alibi to be one of the most effective legal defenses available in a court of law because it has the potential to prove a defendant’s innocence. It goes without saying: if a defendant has an alibi, it means that he or she was somewhere else when the crime was committed.

As it turns out, paleoanthropologists have discovered that Neanderthals have an alibi, of sorts. Evidence indicates that they weren’t the ones to scratch up the floor of Gorham’s Cave.

Based on recent radiocarbon dates measured for samples from Bajondillo Cave (located on the southern part of the Iberian Peninsula—southwest corner of Europe), a research team from the Japan Agency for Marine-Earth Science and Technology and several Spanish institutions determined that modern humans made their way to the southernmost tip of Iberia around 43,000 years ago, displacing Neanderthals.1

Because Neanderthals disappeared from Iberia at that time, it becomes unlikely that they were responsible for hatch marks (dated to be 39,000 years in age) made on the floor of Gorham’s Cave in Gibraltar. These scratches have been interpreted by some paleoanthropologists as evidence that Neanderthals possessed symbolic capabilities.

But how could Neanderthals have made the hatch marks if they weren’t there? Ladies and gentlemen of the jury: the perfect alibi. Instead, it looks as if modern humans were the culprits who marked up the cave floor.


Figure 1: Gorham’s Cave. Image credit: Wikipedia

The Case for Neanderthal Exceptionalism

Two of the biggest questions in anthropology today relate to Neanderthals:

  • When did these creatures disappear from Europe?
  • Did they possess symbolic capacity like modern humans, thus putting their cognitive abilities on par with ours as a species?

For paleoanthropologists, these two questions have become inseparable. With regard to the second question, some paleoanthropologists are convinced that Neanderthals displayed symbolic capabilities.

It is important to note that the case for Neanderthal symbolism is largely based on correlations between the archaeological and fossil records. Specifically, some anthropologists have concluded that Neanderthals possessed symbolism because researchers have recovered artifacts (presumably reflecting symbolic capabilities) from the same layers that harbored Neanderthal fossils. Unfortunately, this approach is complicated by other studies showing that cave layers can become mixed, whether by the caves’ occupants (hominid or modern human) or by animals living in the caves. This mixing leads to the accidental association of fossil and archaeological remains. In other words, the mixing of layers raises questions about who actually manufactured these artifacts.

Because we know modern humans possess the capacity for symbolism, it is much more likely that modern humans, not Neanderthals, made the symbolic artifacts in these instances. Then, only through an upheaval of the cave layers did the artifacts mix with Neanderthal remains. (See the Resources section for articles that elaborate on this point.)

More often than not, archaeological remains are unearthed by themselves with no corresponding fossil specimens. This is the reason why understanding the timing of Neanderthals’ disappearance and modern humans’ arrival in different regions of Europe becomes so important (and why the two questions interrelate). Paleoanthropologists believe that if they can show that Neanderthals lived in a locale at the time symbolic artifacts were produced, then it becomes conceivable that these creatures made the symbolic items. This interpretation increases in plausibility if no modern humans were around at the time.

Some researchers have argued along these lines regarding the hatch marks found on the floor of Gorham’s Cave.2 The markings were made in the bedrock of the cave floor. The layers above the bedrock date to between 30,000 and 39,000 years in age. Some paleoanthropologists argue that Neanderthals must have made the markings. Why? Because, even though modern humans were already in Europe by that time, these paleoanthropologists think that modern humans had not yet made their way to the southern part of the Iberian Peninsula. These same researchers also think that Neanderthals survived in Iberia until about 32,000 years ago, even though their counterparts in other parts of Europe had already disappeared. So, on this basis, paleoanthropologists conclude that Neanderthals produced the hatch marks and, thus, displayed symbolic capabilities.


Figure 2: Hatch marks on the floor of Gorham’s Cave. Image credit: Wikipedia

When Did Neanderthals Disappear from Iberia?

But recent work challenges this conclusion. The Spanish and Japanese team took 17 new radiocarbon measurements from layers of the Bajondillo Cave (located in southern Iberia, near Gorham’s Cave) with the hopes of precisely documenting the change in technology from Mousterian (made by Neanderthals) to Aurignacian (made by modern humans). This transition corresponds to the replacement of Neanderthals by modern humans elsewhere in Europe.

The researchers combined the data from their samples with previous measurements made at the site to pinpoint this transition at around 43,000 years ago—not 32,000 years ago. In other words, modern humans occupied Iberia at the same time they occupied other places in Europe. This result also means that Neanderthals had disappeared from Iberia well before the hatch marks in Gorham’s Cave were made.

Were Neanderthals Exceptional Like Modern Humans?

Though claims of Neanderthal exceptionalism abound in the scientific literature and in popular science articles, the claims universally fail to withstand ongoing scientific scrutiny, as this latest discovery attests. Simply put, based on the archaeological record, there are no good reasons to think that Neanderthals displayed symbolism.

From my perspective, the case for Neanderthal symbolism seems to be driven more by ideology than actual scientific evidence.

It is also worth noting that comparative studies on Neanderthal and modern human brain structures also lead to the conclusion that modern humans displayed symbolism and Neanderthals did not. (See the Resources section for articles that describe this work in more detail.)

Why Does It Matter?

Questions about Neanderthal symbolic capacity and, hence, exceptionalism have bearing on how we understand human beings. Are human beings unique in our capacity for symbolism, or is this quality displayed by other hominins? If humans are not alone in our capacity for symbolism, then we aren’t exceptional. And if we aren’t exceptional, then it becomes untenable to embrace the biblical concept of human beings as God’s image bearers. (As a Christian, I see symbolism as a manifestation of the image of God.)

But, based on the latest scientific evidence, the verdict is in: modern humans are the only species to display the capacity for symbolism. In this way, scientific advance affirms that humans are exceptional in a way that aligns with the biblical concept of the image of God.

The Neanderthals’ alibi holds up. They weren’t there, but humans were. Case closed.


  1. Miguel Cortés-Sánchez et al., “An Early Aurignacian Arrival in Southwestern Europe,” Nature Ecology and Evolution 3 (January 21, 2019): 207–12, doi:10.1038/s41559-018-0753-6.
  2. Joaquín Rodríguez-Vidal et al., “A Rock Engraving Made by Neanderthals in Gibraltar,” Proceedings of the National Academy of Sciences USA 111, no. 37 (September 16, 2014): 13301–6, doi:10.1073/pnas.1411529111.

Reprinted with permission by the author
Original article at:

Does Animal Planning Undermine the Image of God?


A few years ago, we had an all-white English Bulldog named Archie. He would lumber toward even complete strangers, eager to befriend them and earn their affections. And people happily obliged this playful pup.

Archie wasn’t just an adorable dog. He was also well trained. We taught him to ring a bell hanging from a sliding glass door in our kitchen so he could let us know when he wanted to go out. He rarely would ring the bell. Instead, he would just sit by the door and wait . . . unless the neighbor’s cat was in the backyard. Then, Archie would repeatedly bang on the bell with great urgency. He had to get the cat at all costs. Clearly, he understood the bell’s purpose. He just chose to use it for his own devices.

Anyone who has owned a cat or dog knows that these animals do remarkable things. Animals truly are intelligent creatures.

But there are some people who go so far as to argue that animal intelligence is much more like human intelligence than we might initially believe. They base this claim, in part, on a handful of high-profile studies that indicate that some animals such as great apes and ravens can problem-solve and even plan for the future—behaviors that make them like us in some important ways.

Great Apes Plan for the Future

In 2006, two German anthropologists conducted a set of experiments on bonobos and orangutans in captivity that seemingly demonstrated that these creatures can plan for the future. Specifically, the test subjects selected, transported, and saved tools for use 1 hour and 14 hours later, respectively.1

To begin the study, the researchers trained both bonobos and orangutans to use a tool to get a reward from an apparatus. In the first experiment, the researchers blocked access to the apparatus. They laid out eight tools for the apes to select—two were suitable for the task and six were unsuitable. After selecting the tools, the apes were ushered into another room where they were kept for 1 hour. The apes were then allowed back into the room and granted access to the apparatus. To gain the reward, the apes had to select the correct tool and transport it to and from the waiting area. The anthropologists observed that the apes successfully obtained the reward in 70 percent of the trials by selecting and hanging on to the correct tool as they moved from room to room.

In the second experiment, the delay between tool selection and access to the apparatus was extended to 14 hours. This experiment focused on a single female individual. Instead of taking the test subject to the waiting room, the researchers took her to a sleeping room one floor above the waiting room before returning her to the room with the apparatus. She selected and held on to the tool for 14 hours while she moved from room to room in 11 of the 12 trials—each time successfully obtaining the reward.

On the basis of this study, the researchers concluded that great apes have the ability to plan for the future. They also argued that this ability emerged in the common ancestor of humans and great apes around 14 million years ago. So, even though we like to think of planning for the future as one of the “most formidable human cognitive achievements,”2 it doesn’t appear to be unique to human beings.

Ravens Plan for the Future

In 2017, two researchers from Lund University in Sweden demonstrated that ravens are capable of flexible planning just like the great apes.3 These cognitive scientists conducted a series of experiments with ravens, demonstrating that the large black birds can plan for future events and exert self-control for up to 17 hours prior to using a tool or bartering with humans for a reward. (Self-control is crucial for successfully planning for the future.)

The researchers taught ravens to use a tool to gain a reward from an apparatus. As part of the training phase, the test subjects also learned that other objects wouldn’t work on the apparatus.

In the first experiment, the ravens were exposed to the apparatus without access to tools. As such, they couldn’t gain the reward. Then the researchers removed the apparatus. One hour later, the ravens were taken to a different location and offered tools. Then, the researchers presented them with the apparatus 15 minutes later. On average, the raven test subjects selected and used tools to gain the reward in approximately 80 percent of the trials.

In the next experiment, the ravens were trained to barter by exchanging a token for a food reward. After the training, the ravens were taken to a different location and presented with a tray containing the token and three distractor objects by a researcher who had no history of bartering with the ravens. As with the results of the tool selection experiment, the ravens selected and used the token to successfully barter for food in approximately 80 percent of the trials.

When the scientists modified the experimental design to increase the time delay from 15 minutes to 17 hours between tool or token selection and access to the reward, the ravens successfully completed the task in nearly 90 percent of the trials.

Next, the researchers wanted to determine if the ravens could exercise self-control as part of their planning for the future. First, they presented the ravens with trays that contained a small food reward. Of course, all of the ravens took the reward. Next, the researchers offered the ravens trays that had the food reward and either tokens or tools and distractor items. By selecting the token or the tools, the ravens were ensured a larger food reward in the future. The researchers observed that the ravens selected the tool in 75 percent of the trials and the token in about 70 percent, instead of taking the small morsel of food. After selecting the tool or token, the ravens were given the opportunity to receive the reward about 15 minutes later.

The researchers concluded that, like the great apes, ravens can plan for the future. Moreover, these researchers argue that this insight opens up greater possibilities for animal cognition because, from an evolutionary perspective, ravens are regarded as avian dinosaurs. And mammals (including the great apes) are thought to have shared an evolutionary ancestor with dinosaurs 320 million years ago.

Are Humans Exceptional?

In light of these studies (and others like them), it becomes difficult to maintain that human beings are exceptional. Self-control and the ability to flexibly plan for future events are considered by many to be cornerstones of human cognition. Planning for the future requires mental representation of temporally distant events, the ability to set aside current sensory inputs in favor of unobservable future events, and an understanding of which current actions will achieve a future goal.

For many Christians, such as me, the loss of human exceptionalism is concerning because if this idea is untenable, so, too, is the biblical view of human nature. According to Scripture, human beings stand apart from all other creatures because we bear God’s image. And, because every human being possesses the image of God, every human being has intrinsic worth and value. But if, in essence, human beings are no different from animals, it is challenging to maintain that we are the crown of creation, as Scripture teaches.

Yet recent work by biologist Johan Lind of Stockholm University (Sweden) indicates that the results of these two studies, and others like them, may be misleading. In effect, when properly interpreted, these studies pose no threat to human exceptionalism. According to Lind, animals can produce behavior that resembles flexible planning through a simpler mechanism: associative learning.4 If so, this insight preserves the case for human exceptionalism and the image of God, because it means that only humans engage in genuine flexible planning for the future through higher-order cognitive processes.

Associative Learning and Planning for the Future

Lind points out that researchers working in artificial intelligence (AI) have long known that associative learning can produce complex behaviors in AI systems that give the appearance of having the capacity for planning. (Associative learning is the process that animals [and AI systems] use to establish an association between two stimuli or events, usually by the use of punishments or rewards.)


Figure 1: An illustration of associative learning in dogs. Image credit: Shutterstock

Lind wonders why researchers studying animal cognition ignore the work in AI. Applying the insights from the work on AI systems, Lind developed mathematical models based on associative learning that he used to simulate results of the studies on the great apes and ravens. He discovered that associative learning produced the same behaviors as observed by the two research teams for the great apes and ravens. In other words, planning-like behavior can actually emerge through associative learning. That is, the same processes that give AI systems the capacity to beat humans in chess can, through associative learning, account for the planning-like behavior of animals.
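The core idea behind such simulations can be illustrated with a toy example. The sketch below is hypothetical (it is not Lind's actual model): an agent updates an associative value for each object using a simple Rescorla-Wagner-style rule, then, long after training, still picks the rewarded tool over distractors. No representation of the future is involved, yet the behavior looks like planning.

```python
import random

# Toy associative-learning agent (illustrative only, not Lind's model).
# Each object carries an associative value updated by a Rescorla-Wagner-
# style rule: V <- V + alpha * (reward - V).

class AssociativeAgent:
    def __init__(self, alpha=0.3):
        self.alpha = alpha   # learning rate
        self.values = {}     # learned associative value per object

    def value(self, obj):
        return self.values.get(obj, 0.0)

    def learn(self, obj, reward):
        # Strengthen (or weaken) the object-reward association.
        v = self.value(obj)
        self.values[obj] = v + self.alpha * (reward - v)

    def choose(self, objects):
        # Pick the object with the strongest learned association.
        return max(objects, key=self.value)

random.seed(0)
agent = AssociativeAgent()

# Training phase: encountering the "tool" yields a reward; the
# distractor objects never do.
for _ in range(50):
    obj = random.choice(["tool", "stick", "stone"])
    agent.learn(obj, 1.0 if obj == "tool" else 0.0)

# Test phase: hours later, with no reward in sight, the agent still
# selects the tool -- planning-like behavior from association alone.
choice = agent.choose(["stone", "tool", "stick"])
print(choice)
```

The point of the sketch is that the agent never represents a future event; the tool simply carries the highest learned value, so it gets chosen even when the reward is absent and delayed.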

The results of Lind’s simulations mean that it is most likely that animals “plan” for the future in ways that are entirely different from humans. In effect, the planning-like behavior of animals is an outworking of associative learning. On the other hand, humans uniquely engage in bona fide flexible planning through advanced cognitive processes such as mental time travel, among others.

Humans Are Exceptional

Even though the idea of human exceptionalism is continually under assault, it remains intact, as the latest work by Johan Lind illustrates. When the entire body of evidence is carefully weighed, there really is only one reasonable conclusion: Human beings uniquely possess advanced cognitive abilities that make possible our capacity for symbolism, open-ended generative capacity, theory of mind, and complex social interactions—scientific descriptors of the image of God.


  1. Nicholas J. Mulcahy and Josep Call, “Apes Save Tools for Future Use,” Science 312 (May 19, 2006): 1038–40, doi:10.1126/science.1125456.
  2. Mulcahy and Call, “Apes Save Tools for Future Use.”
  3. Can Kabadayi and Mathias Osvath, “Ravens Parallel Great Apes in Flexible Planning for Tool-Use and Bartering,” Science 357 (July 14, 2017): 202–4, doi:10.1126/science.aam8138.
  4. Johan Lind, “What Can Associative Learning Do for Planning?” Royal Society Open Science 5 (November 28, 2018): 180778, doi:10.1098/rsos.180778.

Reprinted with permission by the author
Original article at:

Did Neanderthals Start Fires?



It is one of the most iconic Christmas songs of all time.

Written by Bob Wells and Mel Torme in the summer of 1945, “The Christmas Song” (subtitled “Chestnuts Roasting on an Open Fire”) was crafted in less than an hour. As the story goes, Wells and Torme were trying to stay cool during the blistering summer heat by thinking cool thoughts and then jotting them down on paper. And, in the process, “The Christmas Song” was born.

Many of the song’s lyrics evoke images of winter, particularly around Christmastime. But none has come to exemplify the quiet peace of a Christmas evening more than the song’s first line, “Chestnuts roasting on an open fire . . . ”

Gathering around the fire to stay warm, to cook food, and to share in a community has been an integral part of the human experience throughout history—including human prehistory. Most certainly our ability to master fire played a role in our survival as a species and in our ability as human beings to occupy and thrive in some of the world’s coldest, harshest climates.

But fire use is not limited only to modern humans. There is strong evidence that Neanderthals made use of fire. But, did these creatures have control over fire in the same way we do? In other words, did Neanderthals master fire? Or, did they merely make opportunistic use of natural fires? These questions are hotly debated by anthropologists today and they contribute to a broader discussion about the cognitive capacity of Neanderthals. Part of that discussion includes whether these creatures were cognitively inferior to us or whether they were our intellectual equals.

In an attempt to answer these questions, a team of researchers from the Netherlands and France characterized the microwear patterns on bifacial (having opposite sides that have been worked on to form an edge) tools made from flint recovered from Neanderthal sites, and concluded that the wear patterns suggest that these hominins used pyrite to repeatedly strike the flint. This process generates sparks that can be used to start fires.1 To put it another way, the researchers concluded that Neanderthals had mastery over fire because they knew how to start fires.


Figure 1: Biface tools for cutting or scraping. Image credit: Shutterstock

However, a closer examination of the evidence along with results of other studies, including recent insight into the cause of Neanderthal extinction, raises significant doubts about this conclusion.

What Do the Microwear Patterns on Flint Say?

The investigators focused on the microwear patterns of flint bifaces recovered from Neanderthal sites as a marker for fire mastery because of the well-known practice among hunter-gatherers and pastoralists of striking flint with pyrite (an iron disulfide mineral) to generate sparks to start fires. Presumably, the first modern humans also used this technique to start fires.


Figure 2: Starting a fire with pyrite and flint. Image credit: Shutterstock

The research team reasoned that if Neanderthals started fires, they would use a similar tactic. Careful examination of the microwear patterns on the bifaces led the research team to conclude that these tools were repeatedly struck by hard materials, with the strikes all occurring in the same direction along the bifaces’ long axis.

The researchers then tried to experimentally recreate the microwear pattern in a laboratory setting. To do so, they struck biface replicas with a number of different types of materials, including pyrites, and concluded that the patterns produced by the pyrite strikes most closely matched the patterns on the bifaces recovered from Neanderthal sites. On this basis, the researchers claim that they have found evidence that Neanderthals deliberately started fires.

Did Neanderthals Master Fire?

While this conclusion is possible, at best this study provides circumstantial, not direct, evidence for Neanderthal mastery of fire. In fact, other evidence counts against this conclusion. For example, bifaces with the same type of microwear patterns have been found at other Neanderthal sites, locales that show no evidence of fire use. These bifaces would have had a range of usages, including butchery of the remains of dead animals. So, it is possible that these tools were never used to start fires—even at sites with evidence for fire usage.

Another challenge to the conclusion comes from the failure to detect any pyrite on the bifaces recovered from the Neanderthal sites. Flint recovered from modern human sites shows visible evidence of pyrite. And yet the research team failed to detect even trace amounts of pyrite on the Neanderthal bifaces during the course of their microanalysis.

This observation raises further doubt about whether the flint from the Neanderthal sites was used as a fire starter tool. Rather, it points to the possibility that Neanderthals struck the bifaces with materials other than pyrite for reasons not yet understood.

The conclusion that Neanderthals mastered fire also does not square with results from other studies. For example, a careful assessment of archaeological sites in southern France occupied by Neanderthals from about 100,000 to 40,000 years ago indicates that Neanderthals could not create fire. Instead, these hominins made opportunistic use of natural fire when it was available to them.2

These French sites do show clear evidence of Neanderthal fire use, but when researchers correlated the archaeological layers displaying evidence for fire use with the paleoclimate data, they found an unexpected pattern. Neanderthals used fire during warm climate conditions and failed to use fire during cold periods—the opposite of what would be predicted if Neanderthals had mastered fire.

Lightning strikes that would generate natural fires are much more likely to occur during warm periods. Instead of creating fire, Neanderthals most likely harnessed natural fire and cultivated it as long as they could before it extinguished.

Another study also raises questions about the ability of Neanderthals to start fires.3 This research indicates that cold climates triggered Neanderthal extinctions. By studying the chemical composition of stalagmites in two Romanian caves, an international research team concluded that there were two prolonged and extremely cold periods between 44,000 and 40,000 years ago. (The chemical composition of stalagmites varies with temperature.)

The researchers also noted that during these cold periods, the archaeological record for Neanderthals disappears. They interpret this disappearance to reflect a dramatic reduction in Neanderthal population numbers. Researchers speculate that when this population downturn took place during the first cold period, modern humans made their way into Europe. Being better suited for survival in the cold climate, modern human numbers increased. When the cold climate abated, Neanderthals were unable to recover their numbers because of the growing populations of modern humans in Europe. Presumably, after the second cold period, Neanderthal numbers dropped to the point that they couldn’t recover, and hence, the species became extinct.

But why would modern humans be more capable than Neanderthals of surviving under extremely cold conditions? It seems as if it should be the other way around. Neanderthals had a hyper-polar body design that made them ideally suited to withstand cold conditions. Neanderthal bodies were stout and compact, with barrel-shaped torsos and shorter limbs, which helped them retain body heat. Their noses were long and their sinus cavities extensive, which helped them warm the cold air they breathed before it reached their lungs. But, despite this advantage, Neanderthals died out and modern humans thrived.

Some anthropologists believe that the survival discrepancy could be due to dietary differences. Some data indicates that modern humans had a more varied diet than Neanderthals. Presumably, Neanderthals primarily consumed large herbivores—animals that disappeared when the climatic conditions turned cold, thereby threatening Neanderthal survival. On the other hand, modern humans were able to adjust to the cold conditions by shifting their diets.

But could there be a different explanation? Could it be that with their mastery of fire, modern humans were able to survive cold conditions? And did Neanderthals die out because they could not start fires?

Taken in its entirety, the data seems to indicate that Neanderthals lacked mastery of fire but could use it opportunistically. And, in a broader context, the data indicates that Neanderthals were cognitively inferior to humans.

What Difference Does It Make?

One of the most important ideas taught in Scripture is that human beings uniquely bear God’s image. As such, every human being has immeasurable worth and value. And because we bear God’s image, we can enter into a relationship with our Maker.

However, if Neanderthals possessed advanced cognitive ability just like that of modern humans, then it becomes difficult to maintain the view that modern humans are unique and exceptional. If human beings aren’t exceptional, then it becomes a challenge to defend the idea that human beings are made in God’s image.

Yet, claims that Neanderthals were the cognitive equals of modern humans fail to withstand scientific scrutiny, time and time again. Now it’s time to light a fire in my fireplace and enjoy a few contemplative moments thinking about the real meaning of Christmas.



  1. A. C. Sorensen, E. Claud, and M. Soressi, “Neanderthal Fire-Making Technology Inferred from Microwear Analysis,” Scientific Reports 8 (July 19, 2018): 10065, doi:10.1038/s41598-018-28342-9.
  2. Dennis M. Sandgathe et al., “Timing of the Appearance of Habitual Fire Use,” Proceedings of the National Academy of Sciences, USA 108 (July 19, 2011), E298, doi:10.1073/pnas.1106759108; Paul Goldberg et al., “New Evidence on Neandertal Use of Fire: Examples from Roc de Marsal and Pech de l’Azé IV,” Quaternary International 247 (2012): 325–40, doi:10.1016/j.quaint.2010.11.015; Dennis M. Sandgathe et al., “On the Role of Fire in Neandertal Adaptations in Western Europe: Evidence from Pech de l’Azé IV and Roc de Marsal, France,” PaleoAnthropology (2011): 216–42, doi:10.4207/PA.2011.ART54.
  3. Michael Staubwasser et al., “Impact of Climate Change on the Transition of Neanderthals to Modern Humans in Europe,” Proceedings of the National Academy of Sciences, USA 115 (September 11, 2018): 9116–21, doi:10.1073/pnas.1808647115.

Vocal Signals Smile on the Case for Human Exceptionalism



Before Thanksgiving each year, those of us who work at Reasons to Believe (RTB) headquarters take part in an annual custom. We put our work on pause and use that time to call donors, thanking them for supporting RTB’s mission. (It’s a tradition we have all come to love, by the way.)

Before we start making our calls, our ministry advancement team leads a staff meeting to organize our efforts. And each year at these meetings, they remind us to smile when we talk to donors. I have always found this to be an odd piece of advice, but they insist that when we talk to people, our smiles come across over the phone.

Well, it turns out that the helpful advice of our ministry advancement team has scientific merit, based on a recent study from a team of neuroscientists and psychologists from France and the UK.1 This research highlights the importance of vocal signaling for communicating emotions between people. And from my perspective, the work also supports the notion of human exceptionalism and the biblical concept of the image of God.

We Can Hear Smiles

The research team performed this study to learn what role vocal signaling plays in social cognition. They chose to focus on auditory “smiles” because, as these researchers point out, smiles are among the most powerful facial expressions and one of the earliest to develop in children. As I am sure we all know, smiles express positive feelings and are contagious.

When we smile, our zygomaticus major muscle contracts bilaterally and causes our lips to stretch. This stretching alters the sounds of our voices. So, the question becomes: Can we hear other people when they smile?


Figure 1: Zygomaticus major. Image credit: Wikipedia

To determine if people can “hear” smiles, the researchers recorded actors who spoke a range of French phonemes, with and without smiling. Then, they modeled the changes in the spectral patterns that occurred in the actors’ voices when they smiled while they spoke.

The researchers used this model to manipulate recordings of spoken sentences so that they would sound like they were spoken by someone who was smiling (while keeping other features such as pitch, content, speed, gender, etc., unchanged). Then, they asked volunteers to rate the “smiley-ness” of voices before and after manipulation of the recordings. They found that the volunteers could distinguish the transformed phonemes from those that weren’t altered.

Next, they asked the volunteers to mimic the sounds of the “smiley” phonemes. The researchers noted that for the volunteers to do so, they had to smile.

Following these preliminary experiments, the researchers asked volunteers to describe their emotions when listening to transformed phonemes compared to those that weren’t transformed. They found that when volunteers heard the altered phonemes, they expressed a heightened sense of joy and irony.

Lastly, the researchers used electromyography to monitor the volunteers’ facial muscles so that they could detect smiling and frowning as the volunteers listened to a set of 60 sentences—some manipulated (to sound as if they were spoken by someone who was smiling) and some unaltered. They found that when the volunteers judged speech to be “smiley,” they were more likely to smile and less likely to frown.

In other words, people can detect auditory smiles and respond by mimicking them with smiles of their own.

Auditory Signaling and Human Exceptionalism

This research demonstrates that both the visual and auditory cues we receive from other people help us to understand their emotional state and to be influenced by it. Our ability to see and hear smiles helps us develop empathy toward others. Undoubtedly, this trait plays an important role in our ability to link our minds together and to form complex social structures—two characteristics that some anthropologists believe contribute to human exceptionalism.

The notion that human beings differ in degree, not kind, from other creatures has been a mainstay concept in anthropology and primatology for over 150 years. And it has been the primary reason why so many people have abandoned the belief that human beings bear God’s image.

Yet, this stalwart view in anthropology is losing its mooring, with the concept of human exceptionalism taking its place. A growing minority of anthropologists and primatologists now believe that human beings really are exceptional. They contend that human beings do, indeed, differ in kind, not merely degree, from other creatures—including Neanderthals. Ironically, the scientists who argue for this updated perspective have developed evidence for human exceptionalism in their attempts to understand how the human mind evolved. And, yet, these new insights can be used to marshal support for the biblical conception of humanity.

Anthropologists identify at least four interrelated qualities that make us exceptional: (1) symbolism, (2) open-ended generative capacity, (3) theory of mind, and (4) our capacity to form complex social networks.

Human beings effortlessly represent the world with discrete symbols and use those symbols to denote abstract concepts. Our ability to represent the world symbolically and to combine and recombine those symbols in countless ways to create alternate possibilities has interesting consequences. Human capacity for symbolism manifests in the form of language, art, music, and body ornamentation. And humans alone desire to communicate the scenarios we construct in our minds with other people.

But there is more to our interactions with other human beings than a desire to communicate. We want to link our minds together and we can do so because we possess a theory of mind. In other words, we recognize that other people have minds just like ours, allowing us to understand what others are thinking and feeling. We also possess the brain capacity to organize people we meet and know into hierarchical categories, allowing us to form and engage in complex social networks.

Thus, I would contend that our ability to hear people’s smiles plays a role in theory of mind and our sophisticated social capacities. It contributes to human exceptionalism.

In effect, these four qualities could be viewed as scientific descriptors of the image of God. In other words, evidence for human exceptionalism is evidence that human beings bear God’s image.

So, even though many people in the scientific community promote a view of humanity that denigrates the image of God, scientific evidence and everyday experience continually support the notion that we are unique and exceptional as human beings. It makes me grin from ear to ear to know that scientific investigations into our cognitive and behavioral capacities continue to affirm human exceptionalism and, with it, the image of God.

Indeed, we are the crown of creation. And that makes me thankful!



  1. Pablo Arias, Pascal Belin, and Jean-Julien Aucouturier, “Auditory Smiles Trigger Unconscious Facial Imitation,” Current Biology 28 (July 23, 2018): R782–R783, doi:10.1016/j.cub.2018.05.084.
Reprinted with permission by the author
Original article at:

When Did Modern Human Brains—and the Image of God—Appear?



When I was a kid, I enjoyed reading Ripley’s Believe It or Not! I couldn’t get enough of the bizarre facts described in the pages of this comic.

I was especially drawn to the panels depicting people who had oddly shaped heads. I found it fascinating to learn about people whose skulls were purposely forced into unnatural shapes by a practice known as intentional cranial deformation.

For the most part, this practice is a thing of the past. It is rarely performed today (though there are still a few people groups who carry out this procedure). But for much of human history, cultures all over the world have artificially deformed people’s crania (often for reasons yet to be fully understood). They accomplished this feat by binding the heads of infants, which distorts the normal growth of the skull. Through this practice, the shape of the human head can be readily altered to be abnormally flat, elongated, rounded, or conical.


Figure 1: Deformed ancient Peruvian skull. Image credit: Shutterstock.

It is remarkable that the human skull is so malleable. Believe it or not!


Figure 2: Parts of the human skull. Image credit: Shutterstock.

For physical anthropologists, the normal shape of the modern human skull is just as bizarre as the conical skulls found among the remains of the Nazca culture of Peru. Compared to other hominins (such as Neanderthals and Homo erectus), modern humans have oddly shaped skulls. The skulls of these other hominins were elongated along the anterior-posterior axis, whereas the skull of modern humans is globular, with bulging, enlarged parietal and cerebellar areas. The modern human skull also has another distinctive feature: the face is retracted and relatively small.


Figure 3: Comparison of modern human and Neanderthal skulls. Image credit: Wikipedia.

Anthropologists believe that the difference in skull shape (and hence, brain shape) has profound significance and helps explain the advanced cognitive abilities of modern humans. The parietal lobe of the brain is responsible for:

  • Perception of stimuli
  • Sensorimotor transformation (which plays a role in planning)
  • Visuospatial integration (which provides hand-eye coordination needed for throwing spears and making art)
  • Imagery
  • Self-awareness
  • Working and long-term memory

Human beings seem to uniquely possess these capabilities. They make us exceptional compared to other hominins. Thus, for paleoanthropologists, two key questions are: when and how did the globular human skull appear?

Recently, a team of researchers from the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, addressed these questions. And their answers add evidence for human exceptionalism while unwittingly providing support for the RTB human origins model.1

The Appearance of the Modern Human Brain

To characterize the mode and tempo of the origin of the unusual morphology (shape) of the modern human skull, the German researchers generated and analyzed CT scans of 20 fossil specimens representing three windows of time: (1) 300,000 to 200,000 years ago; (2) 130,000 to 100,000 years ago; and (3) 35,000 to 10,000 years ago. They also included 89 cranially diverse skulls from present-day modern humans, 8 Neanderthal skulls, and 8 Homo erectus skulls in their analysis.

The first group consisted of three specimens: (1) Jebel Irhoud 1 (dating to 315,000 years in age); (2) Jebel Irhoud 2 (also dating to 315,000 years in age); and (3) Omo Kibish (dating to 195,000 years in age). The specimens that comprise this group are variously referred to as near anatomically modern humans or archaic Homo sapiens.

The second group consisted of four specimens: (1) LH 18 (dating to 120,000 years in age); (2) Skhul (dating to 115,000 years in age); and (3) Qafzeh 6 and (4) Qafzeh 9 (both dating to about 115,000 years in age). This group consists of specimens typically considered to be anatomically modern humans. The third group consisted of thirteen specimens that are all considered to be anatomically and behaviorally modern humans.

Researchers discovered that the group one specimens had facial features like those of modern humans. They also had brain sizes similar to those of Neanderthals and modern humans. But their endocranial shape was unlike that of modern humans, appearing instead to be intermediate between H. erectus and Neanderthals.

On the other hand, the specimens from group two displayed endocranial shapes that clustered with the group three specimens and the present-day samples. In short, modern human skull morphology (and brain shape) appeared between 130,000 and 100,000 years ago.

Confluence of Evidence Locates Humanity’s Origin

This result aligns with several recent archaeological finds that place the origin of symbolism in the same window of time represented by the group two specimens. (See the Resources section for articles detailing some of these finds.) Symbolism—the capacity to represent the world and abstract ideas with symbols—appears to be an ability that is unique to modern humans and is most likely a manifestation of the modern human brain shape, specifically an enlarged parietal lobe.

Likewise, this result coheres with the most recent dates for mitochondrial Eve and Y-chromosomal Adam, around 120,000 to 150,000 years ago. (Again, see the Resources section for articles detailing some of these finds.) In other words, the confluence of evidence (anatomical, behavioral, and genetic) pinpoints the origin of modern humans (us) between 150,000 and 100,000 years ago, with the appearance of modern human anatomy coinciding with the appearance of modern human behavior.

What Does This Finding Mean for the RTB Human Origins Model?

To be clear, the researchers carrying out this work interpret their results within the confines of the evolutionary framework. Therefore, they conclude that the globular skulls—characteristic of modern humans—evolved recently, only after the modern human facial structure had already appeared in archaic Homo sapiens around 300,000 years ago. They also conclude that the globular skull of modern humans had fully emerged by the time humans began to migrate around the world (around 40,000 to 50,000 years ago).

Yet, the fossil evidence doesn’t show the gradual emergence of skull globularity. Instead, modern human specimens form a distinct cluster isolated from the distinct clusters formed by H. erectus, Neanderthals, and archaic H. sapiens. There are no intermediate globular specimens between archaic and modern humans, as would be expected if this trait evolved. Alternatively, the distinct clusters are exactly as expected if modern humans were created.

It appears that the globularity of our skull distinguishes modern humans from H. erectus, Neanderthals, and archaic Homo sapiens (near anatomically modern humans). This globularity of the modern human skull has implications for when modern human behavior and advanced cognitive abilities emerged.

For this reason, I see this work as offering support for the RTB human origins creation model (and, consequently, the biblical account of human origins and the biblical conception of human nature). RTB’s model (1) views human beings as cognitively superior and distinct from other hominins, and (2) posits that human beings uniquely possess a quality called the image of God that I believe manifests as human exceptionalism.

This work supports both predictions by highlighting the uniqueness and exceptional qualities of modern humans compared to H. erectus, Neanderthals, and archaic H. sapiens, calling specific attention to our unusual skull and brain morphology. As noted, anthropologists believe that this unusual brain morphology supports our advanced cognitive capabilities—abilities that I believe reflect the image of God. Because archaic H. sapiens, Neanderthals, and H. erectus did not possess this brain morphology, it is unlikely that these creatures had the sophisticated cognitive capacity displayed by modern humans.

In light of RTB’s model, it is gratifying to learn that the origin of anatomically modern humans coincides with the origin of modern human behavior.

Believe it or not, our oddly shaped head is part of the scientific case that can be made for the image of God.



  1. Simon Neubauer, Jean-Jacques Hublin, and Philipp Gunz, “The Evolution of Modern Human Brain Shape,” Science Advances 4 (January 24, 2018): eaao5961, doi:10.1126/sciadv.aao5961.
Reprinted with permission by the author
Original article at: