Neuroscientists Transfer “Memories” from One Snail to Another: A Christian Perspective on Engrams



Scientists from UCLA recently conducted some rather bizarre experiments. For me, it’s these types of things that make it so much fun to be a scientist.

Biologists transferred memories from one sea slug to another by extracting RNA from the nervous system of a trained sea slug and then injecting the extract into an untrained sea slug.1 After the injection, the untrained sea slugs responded to environmental stimuli just like the trained ones, based on false memories created by the transfer of biomolecules.

Why would researchers do such a thing? Even though it might seem like their motives were nefarious, they weren’t inspired to carry out these studies by Dr. Frankenstein or Dr. Moreau. Instead, they had really good reasons for performing these experiments: they wanted to gain insight into the physical basis of memory.

How are memories encoded? How are they stored in the brain? And how are memories retrieved? These are some of the fundamental scientific questions that interest researchers who work in cognitive neuroscience. It turns out that sea slugs belonging to the group Aplysia (commonly referred to as sea hares) make ideal organisms to study in order to address these questions. The fact that we can gain insight into how memories are stored by studying sea slugs is mind-blowing and indicates to me (as a Christian and a biochemist) that biological systems have been designed for discovery.

Sea Hares

Sea hares have become the workhorses of cognitive neuroscience. These creatures have nervous systems complex enough to allow neuroscientists to study reflexes and learned behaviors, yet simple enough that researchers can draw meaningful conclusions from their experiments. (By way of comparison, members of Aplysia have about 20,000 neurons in their nervous systems, while humans have roughly 85 billion neurons in our brains alone.)

To study learning and memory, neuroscientists take advantage of a useful reflexive behavior displayed by sea hares, called gill and siphon withdrawal. When these creatures are disturbed, they rapidly withdraw their delicate gill and siphon.

The nervous system of these creatures can also undergo sensitization, a form of learning produced by repeated exposure to a stimulus. Once sensitized, the nervous system mounts an enhanced, broadened response to related stimuli, such as those that signal danger.

What Causes Memories?

Sensitization is a learned response that is possible because memories have been encoded and stored in the sea hares’ nervous system. But how is this memory stored?

Many neuroscientists think that the physical instantiation of a memory (called an engram) resides in the synaptic connections between nerve cells (neurons). Other neuroscientists hold a differing view: rather than being mediated by cell-cell interactions, engrams may form within the interior of neurons, through biochemical events that take place in the cell nucleus. In fact, some studies have implicated RNA molecules in memory formation and storage.2 The UCLA researchers sought to determine if RNA plays a role in memory formation.

Memory Transfer from One Sea Hare to Another

To test this hypothesis, the researchers sensitized sea hares to painful stimuli. They accomplished this feat by inserting an electrode into the tail regions of several sea hares and delivering a shock. The shock caused the sea hares to withdraw their gill and siphon. After 20 minutes, they repeated the shock, and continued to do so at 20-minute intervals five more times. Twenty-four hours later, they repeated the shock protocol. By this point, the sea hare test subjects were sensitized to threatening stimuli. When touched, the trained sea hares would withdraw their gill and siphon for nearly 1 minute. Untrained sea hares (which weren’t subjected to the shock protocol) would withdraw their gill and siphon for only about 1 second when touched.

Next, the researchers sacrificed the sensitized sea hares and isolated RNA from their nervous systems. Then they injected the RNA extracts into the hemocoel of untrained sea hares. When touched, these recipient sea hares withdrew their gill and siphon for about 45 seconds.

To confirm that this response was not an artifact of the injection procedure, the researchers repeated the experiment, this time injecting RNA extracted from the nervous systems of untrained sea hares into other untrained sea hares. When touched, these animals’ gill and siphon withdrawal reflex lasted only about 1 second.


Figure: Sea Hare Stimulus Protocol. Image credit: Alexis Bédécarrats, Shanping Chen, Kaycey Pearce, Diancai Cai, and David L. Glanzman, eNeuro 14 May 2018, 5 (3) ENEURO.0038-18.2018; doi:10.1523/ENEURO.0038-18.2018.

The researchers then applied the RNA extracts from both trained and untrained sea hares to sensory neurons grown in the lab. The RNA extracts from the trained sea hares caused the sensory neurons to display heightened activity. Conversely, the RNA extracts from the untrained sea hares had no effect on the activity of the cultured sensory neurons.

Finally, the researchers added compounds called methylase inhibitors to the RNA extracts before injecting them into untrained sea hares. These inhibitors blocked the memory transfer. This result indicates that epigenetic modifications of DNA mediated by RNA molecules play a role in forming engrams.

Based on these results, it appears that RNA mediates the formation and storage of memories. And though the research team does not know which class of RNAs plays a role in the formation of engrams, they suspect that microRNAs may be the biochemical actors.

Biomedical Implications

Now that the UCLA researchers have identified RNA and epigenetic modifications of DNA as central to the formation of engrams, they believe that it might one day be possible to develop biomedical procedures to treat the memory loss that accompanies old age or diseases such as Alzheimer’s and other forms of dementia. Toward this end, it is particularly encouraging that the researchers could transfer memories from one sea hare to another. This insight might even lead to therapies that erase horrific memories.

Of course, this raises questions about human nature—specifically, the relationship between the brain and mind. For many people, the fact that there is a physical basis for memories suggests that our mind is indistinguishable from the activities taking place within our brains. To put it differently, many people would reject the idea that our mind is a nonphysical substance, based on the discovery of engrams.

Engrams, Brain, and Mind

However, I would contend that if we adopt the appropriate mind-body model, it is possible to preserve the concept of the mind as a nonphysical entity distinct from the brain even if engrams are a reality. A model I find helpful is based on a computer hardware/software analogy. Accordingly, the brain is the hardware that manifests the mind’s activity. Meanwhile, the mind is analogous to the software programming. According to this model, hardware structures—brain regions—support the expression of the mind, the software.

A computer system needs both the hardware and software to function properly. Without the hardware, the software is just a set of instructions. For those instructions to take effect, the software must be loaded into the hardware. It is interesting that data accessed by software is stored in the computer’s hardware. So, why wouldn’t the same be true for the human brain?

We need to be careful not to take this analogy too far. However, from my perspective, it illustrates how it is possible for memories to be engrams while preserving the mind as a nonphysical, distinct entity.

Designed for Discovery

The significance of this discovery extends beyond the mind-brain problem. It’s provocative that the biology of a creature such as the sea hare could provide such important insight into human biology.

This is possible only because of the universal nature of biological systems. All life on Earth shares the same biochemistry. All life is made up of the same type of cells. Animals possess similar anatomical and physiological systems.

Most biologists today view these shared features as evidence for an evolutionary history of life. Yet, as a creationist and an intelligent design proponent, I interpret the universal nature of the cell’s chemistry and shared features of biological systems as manifestations of archetypical designs that emanate from the Creator’s mind. To put it another way, I regard the shared features of biological systems as evidence for common design, not common descent.

This view invites a follow-up objection: Why would God create using the same template? Why not create each biochemical system from scratch to be ideally suited for its function? There may be several reasons why a Creator would design living systems around a common set of templates. In my estimation, one of the most significant reasons is discoverability. The shared features of biochemical and biological systems make it possible to apply what we learn by studying one organism to all others. Without life’s shared features, the discipline of biology wouldn’t exist.

This discoverability makes it easier to appreciate God’s glory and grandeur, as evinced by the elegance, sophistication, and ingenuity in biochemical and biological systems. Discoverability of biochemical systems also reflects God’s providence and care for humanity. If not for the shared features, it would be nearly impossible for us to learn enough about the living realm for our benefit. Where would biomedical science be without the ability to learn fundamental aspects of our biology by studying model organisms such as yeast, fruit flies, mice—and sea hares?

The shared features in the living realm are a manifestation of the Creator’s care and love for humanity. And there is nothing bizarre about that.



  1. Alexis Bédécarrats et al., “RNA from Trained Aplysia Can Induce an Epigenetic Engram for Long-Term Sensitization in Untrained Aplysia,” eNeuro 5 (May/June 2018): e0038-18.2018, 1–11, doi:10.1523/ENEURO.0038-18.2018.
  2. For example, see Germain U. Busto et al., “microRNAs That Promote or Inhibit Memory Formation in Drosophila melanogaster,” Genetics 200 (June 1, 2015): 569–80, doi:10.1534/genetics.114.169623.
Reprinted with permission by the author
Original article at:

Differences in Human and Neanderthal Brains Explain Human Exceptionalism



When I was a little kid, my mom went through an Agatha Christie phase. She was a huge fan of the murder mystery writer and she read all of Christie’s books.

Agatha Christie was caught up in a real-life mystery of her own when she disappeared for 11 days in December 1926 under highly suspicious circumstances. Her car was found near her home, close to the edge of a cliff, but she was nowhere to be found. It looked as if she had disappeared without a trace and without any explanation. Eleven days after her disappearance, she turned up in a hotel room registered under an alias.

Christie never offered an explanation for her disappearance. To this day, it remains an enduring mystery. Some think it was a callous publicity stunt. Some say she suffered a nervous breakdown. Others think she suffered from amnesia. Some people suggest more sinister reasons. Perhaps, she was suicidal. Or maybe she was trying to frame her husband and his mistress for her murder.

Perhaps we will never know.

Like Christie’s fictional detectives Hercule Poirot and Miss Marple, paleoanthropologists are every bit as eager to solve a mysterious disappearance of their own. They want to know why Neanderthals vanished from the face of the earth. And what role did human beings (Homo sapiens) play in the Neanderthal disappearance, if any? Did we kill off these creatures? Did we outcompete them or did Neanderthals just die off on their own?

Anthropologists have proposed various scenarios to account for the Neanderthals’ disappearance. Some paleoanthropologists think that differences in the cognitive capabilities of modern humans and Neanderthals help explain the creatures’ extinction. According to this model, superior reasoning abilities allowed humans to thrive while Neanderthals faced inevitable extinction. As a consequence, we replaced Neanderthals in the Middle East, Europe, and Asia when we first migrated to these parts of the world.

Computational Neuroanatomy

Innovative work by researchers from Japan offers support for this scenario.1 Using a technique called computational neuroanatomy, researchers reconstructed the brain shape of Neanderthals and modern humans from the fossil record. In their study, the researchers used four Neanderthal specimens:

  • Amud 1 (50,000 to 70,000 years in age)
  • La Chapelle-aux-Saints 1 (47,000 to 56,000 years in age)
  • La Ferrassie 1 (43,000 to 45,000 years in age)
  • Forbes’ Quarry 1 (no age dates)

They also worked with four Homo sapiens specimens:

  • Qafzeh 9 (90,000 to 120,000 years in age)
  • Skhūl 5 (100,000 to 135,000 years in age)
  • Mladeč 1 (35,000 years in age)
  • Cro-Magnon 1 (32,000 years in age)

Researchers used computed tomography scans to construct virtual endocasts (cranial cavity casts) of the fossil brains. After generating endocasts, the team determined the 3D brain structure of the fossil specimens by deforming the 3D structure of the average human brain so that it fit into the fossil crania and conformed to the endocasts.

This technique appears to be valid, based on control studies carried out on chimpanzee and bonobo brains. Using computational neuroanatomy, researchers can deform a chimpanzee brain to accurately yield the bonobo brain, and vice versa.

Brain Differences, Cognitive Differences

The Japanese team learned that the chief difference between human and Neanderthal brains is the size and shape of the cerebellum. The cerebellar hemisphere is projected more toward the interior in the human brain than in the Neanderthal brain and the volume of the human cerebellum is larger. Researchers also noticed that the right side of the Neanderthal cerebellum is significantly smaller than the left side—a phenomenon called volumetric laterality. This discrepancy doesn’t exist in the human brain. Finally, the Japanese researchers observed that the parietal regions in the human brain were larger than those regions in Neanderthals’ brains.

Image credit: Shutterstock


Because of these brain differences, the researchers argue that humans were socially and cognitively more sophisticated than Neanderthals. Neuroscientists have discovered that the cerebellum supports motor function and higher cognition, contributing to language function, working memory, thought, and social abilities. Hence, the researchers argue that the reduced size of the right cerebellar hemisphere in Neanderthals limited its connection to the prefrontal regions, a connection critical for language processing. Neuroscientists have also discovered that the parietal lobe plays a role in visuo-spatial imagery, episodic memory, self-related mental representations, coordination between self and external spaces, and the sense of agency.

On the basis of this study, it seems that humans either outcompeted Neanderthals for limited resources—driving them to extinction—or simply were better suited to survive than Neanderthals because of superior mental capabilities. Or perhaps their demise occurred for more sinister reasons. Maybe we used our sophisticated reasoning skills to kill off these creatures.

Did Neanderthals Make Art, Music, Jewelry, etc.?

Recently, a flurry of reports has appeared in the scientific literature claiming that Neanderthals possessed the capacity for language and the ability to make art, music, and jewelry. Other studies claim that Neanderthals ritualistically buried their dead, mastered fire, and used plants medicinally. All of these claims rest on highly speculative interpretations of the archaeological record. In fact, other studies present evidence that refutes every one of these claims (see Resources).

Comparisons of human and Neanderthal brain morphology and size become increasingly important in the midst of this controversy. This recent study, along with previous work, indicates that Neanderthals did not have the brain architecture and, hence, the cognitive capacity to communicate symbolically through language, art, music, and body ornamentation. Nor did they have the brain capacity to engage in complex social interactions. In short, Neanderthal brain anatomy does not support any interpretation of the archaeological record that attributes advanced cognitive abilities to these creatures.

While this study provides important clues about the disappearance of Neanderthals, we still don’t know why they went extinct. Nor do we know any of the mysterious details surrounding their demise as a species.

Perhaps we will never know.

But we do know that in terms of our cognitive and social capacities, human beings stand apart from Neanderthals and all other creatures. Human brain biology and behavior render us exceptional, one-of-a-kind, in ways consistent with the image of God.



  1. Takanori Kochiyama et al., “Reconstructing the Neanderthal Brain Using Computational Anatomy,” Scientific Reports 8 (April 26, 2018): 6296, doi:10.1038/s41598-018-24331-0.
Reprinted with permission by the author
Original article at:

Yeast Gene Editing Study Raises Questions about the Evolutionary Origin of Human Chromosome 2



As a biochemist and a skeptic of the evolutionary paradigm, I am often asked two interrelated questions:

  1. What do you think are the greatest scientific challenges to the evolutionary paradigm?
  2. How do you respond to all the compelling evidence for biological evolution?

When it comes to the second question, people almost always ask about the genetic similarity between humans and chimpanzees. Unexpectedly, new research on gene editing in brewer’s yeast helps answer these questions more definitively than ever.

For many people, the genetic comparisons between the two species convince them that human evolution is true. Presumably, the shared genetic features in the human and chimpanzee genomes reflect the species’ shared evolutionary ancestry.

One high-profile example of these similarities is the structural features human chromosome 2 shares with two chimpanzee chromosomes labeled chromosome 2A and chromosome 2B. When the two chimpanzee chromosomes are placed end to end, they look remarkably like human chromosome 2. Evolutionary biologists interpret this genetic similarity as evidence that human chromosome 2 arose when chromosome 2A and chromosome 2B underwent an end-to-end fusion. They claim that this fusion took place in the human evolutionary lineage at some point after it separated from the lineage that led to chimpanzees and bonobos. Therefore, the similarity in these chromosomes provides strong evidence that humans and chimpanzees share an evolutionary ancestry.


Figure 1: Human and Chimpanzee Chromosomes Compared

Image credit: Who Was Adam? (Covina, CA: RTB Press, 2015), p. 210.

Yet, new work by two separate teams of synthetic biologists, from the United States and China, raises questions about this evolutionary scenario. Working independently, both research teams devised similar gene editing techniques that they then used to fuse chromosomes in the yeast species Saccharomyces cerevisiae (brewer’s yeast). Their work demonstrates the central role intelligent agency must play in end-to-end chromosome fusion, thereby countering the evolutionary explanation while supporting a creation model interpretation of human chromosome 2.

The Structure of Human Chromosome 2

Chromosomes are large structures visible in the nucleus during the cell division process. These structures consist of DNA combined with proteins to form the chromosome’s highly condensed, hierarchical architecture.

Figure 2: Chromosome Structure

Image credit: Shutterstock

Each species has a characteristic number of chromosomes that differ in size and shape. For example, humans have 46 chromosomes (23 pairs); chimpanzees and other apes have 48 (24 pairs).

When exposed to certain dyes, chromosomes stain, producing a diagnostic pattern of bands along the length of the chromosome. The bands vary in number, location, thickness, and intensity, and the unique banding profile of each chromosome helps geneticists identify it under a microscope.

In the early 1980s, evolutionary biologists compared the chromosomes of humans, chimpanzees, gorillas, and orangutans for the first time.2 These studies revealed an exceptional degree of similarity between human and chimp chromosomes. When aligned, the human and corresponding chimpanzee chromosomes display near-identical banding patterns, band locations, band size, and band stain intensity. To evolutionary biologists, this resemblance reveals powerful evidence for human and chimpanzee shared ancestry.

The most noticeable difference between human and chimp chromosomes is the quantity: 46 for humans and 48 for chimpanzees. As I pointed out, evolutionary biologists account for this difference by suggesting that two chimp chromosomes (2A and 2B) fused. This fusion event would have reduced the number of chromosome pairs from 24 to 23, and the chromosome number from 48 to 46.

As noted, evidence for this fusion comes from the close similarity of the banding patterns of human chromosome 2 and chimp chromosomes 2A and 2B when the latter two are oriented end to end. The case for fusion also gains support from the presence of: (1) two centromeres in human chromosome 2, one functional, the other inactive; and (2) an internal telomere sequence within human chromosome 2.3 The locations of the two centromeres and the internal telomere sequence correspond to the expected locations if, indeed, human chromosome 2 arose by a fusion event.4

Evidence for Evolution or Creation?

Even though human chromosome 2 looks like it is a fusion product, it seems unlikely to me that its genesis resulted from undirected natural processes. Instead, I would argue that a Creator intervened to create human chromosome 2 because combining chromosomes 2A and 2B end to end to form it would have required a succession of highly improbable events.

I describe the challenges to the evolutionary explanation in some detail in a previous article:

  • End-to-end fusion of two chromosomes at the telomeres faces nearly insurmountable hurdles.
  • And, if somehow the fusion did occur, it would alter the number of chromosomes and lead to one of three possible scenarios: (1) nonviable offspring, (2) viable offspring that suffers from a diseased state, or (3) viable but infertile offspring. Each of these scenarios would prevent the fused chromosome from entering and becoming entrenched in the human gene pool.
  • Finally, if chromosome fusion took place and if the fused chromosome could be passed on to offspring, the event would have had to create such a large evolutionary advantage that it would rapidly sweep through the population, becoming fixed.

This succession of highly unlikely events makes more sense, from my vantage point, if we view the structure of human chromosome 2 as the handiwork of a Creator instead of the outworking of evolutionary processes. But why would these chromosomes appear to be so similar, if they were created? As I discuss elsewhere, I think the similarity between human and chimpanzee chromosomes reflects shared design, not shared evolutionary ancestry. (For more details, see my article “Chromosome 2: The Best Evidence for Evolution?”)

Yeast Chromosome Studies Offer Insight

Recent work by two independent teams of synthetic biologists from the US and China corroborates my critique of the evolutionary explanation for human chromosome 2. Working within the context of the evolutionary framework, both teams were interested in understanding the influence that chromosome number and organization have on an organism’s biology and how chromosome fusion shapes evolutionary history. To pursue this insight, both research groups carried out similar experiments using CRISPR/Cas9 gene editing to reduce the number of chromosomes in brewer’s yeast from 16 to 1 (for the Chinese team) and from 16 to 2 (for the team from the US) through a succession of fusion events.

Both teams reduced the number of chromosomes in stages by fusing pairs of chromosomes. The first round of fusions reduced the number from 16 to 8. In the next round, they fused pairs of the newly created chromosomes to reduce the number from 8 to 4, and so on.
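The staged reduction can be sketched as a simple halving sequence. The short Python sketch below is purely illustrative bookkeeping (the chromosome labels and the fuse_round helper are invented for this illustration); in the actual experiments, each fusion required precise CRISPR/Cas9 edits that removed telomeres and one centromere:

```python
# Toy model of staged pairwise chromosome fusion: 16 -> 8 -> 4 -> 2 -> 1.
# Illustrative only; the real work involved CRISPR/Cas9-directed deletions
# of telomeric regions and one centromere per fusion event.

def fuse_round(chromosomes):
    """Fuse adjacent pairs of chromosomes into single fused chromosomes."""
    fused = []
    for i in range(0, len(chromosomes), 2):
        pair = chromosomes[i:i + 2]
        fused.append("+".join(pair))  # one fused chromosome per pair
    return fused

chromosomes = [f"chr{i}" for i in range(1, 17)]  # brewer's yeast: 16 chromosomes
rounds = [chromosomes]
while len(rounds[-1]) > 1:
    rounds.append(fuse_round(rounds[-1]))

print([len(r) for r in rounds])  # [16, 8, 4, 2, 1]
```

Stopping the loop one step earlier (at 2 chromosomes) corresponds to the US team’s endpoint; running it to completion corresponds to the Chinese team’s single-chromosome strain.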

To their surprise, the yeast seemed to tolerate this radical genome editing quite well, although their growth rate slowed and the yeast failed to thrive under certain laboratory conditions. Gene expression was altered in the modified yeast genomes, but only for a few genes. Most of the roughly 5,800 genes in the brewer’s yeast genome were expressed at normal levels compared to the wild-type strain.

For synthetic biology, this work is a milestone. It currently stands as one of the most radical genome reconfigurations ever achieved. This discovery creates an exciting new research tool to address fundamental questions about chromosome biology. It also may have important applications in biotechnology.

The experiment also ranks as a milestone for the RTB human origins creation model because it helps address questions about the origin of human chromosome 2. Specifically, the work with brewer’s yeast provides empirical evidence that human chromosome 2 must have been shaped by an Intelligent Agent. This research also reinforces my concerns about the capacity of evolutionary mechanisms to generate human chromosome 2 via the fusion of chimpanzee chromosomes 2A and 2B.

Chromosome fusion demonstrates the critical role intelligent agency plays.

Both research teams had to carefully design the gene editing system they used so that it would precisely delete two distinct regions in the chromosomes. This process effected end-to-end chromosome fusions in a way that allowed the yeast cells to survive. Specifically, they had to delete regions of the chromosomes near the telomeres, including the highly repetitive telomere-associated sequences, while carefully avoiding the deletion of nearby DNA sequences that harbored genes. They also simultaneously deleted one of the centromeres of the fused chromosomes to ensure that the fused chromosome would properly replicate and segregate during cell division. Finally, they had to make sure that when the two chromosomes fused, the remaining centromere was positioned near the center of the resulting chromosome.

In addition to the high-precision gene editing, they had to carefully construct the sequence of donor DNA that accompanied the CRISPR/Cas9 gene editing package so that the chromosomes with the deleted telomeres could be directed to fuse end on end. Without the donor DNA, the fusion would have been haphazard.

In other words, to fuse the chromosomes so that the yeast survived, the research teams needed a detailed understanding of chromosome structure and biology and a strategy to use this knowledge to design precise gene editing protocols. Such planning would ensure that chromosome fusion occurred without the loss of key genetic information and without disrupting key processes such as DNA replication and chromosome segregation during cell division. The researchers’ painstaking effort is a far cry from the unguided, undirected, haphazard events that evolutionary biologists think caused the end-on-end chromosome fusion that created human chromosome 2. In fact, given the high-precision gene editing required to create fused chromosomes, it is hard to envision how evolutionary processes could ever produce a functional fused chromosome.

A discovery by both research teams further complicates the evolutionary explanation for the origin of human chromosome 2. Namely, the yeast cells could not replicate unless the centromere of one of the chromosomes was deleted at the time the chromosomes fused. The researchers learned that if this step was omitted, the fused chromosomes weren’t stable. Because centromeres serve as the point of attachment for the mitotic spindle, if a chromosome possesses two centromeres, mistakes occur in the chromosome segregation step during cell division.

It is interesting that human chromosome 2 has two centromeres, but one of them has been inactivated. (In the evolutionary scenario, this inactivation would have happened through a series of mutations in the centromeric DNA sequences that accrued over time.) However, if human chromosome 2 resulted from the fusion of two chimpanzee chromosomes, the initial fusion product would have possessed two functional centromeres. In the evolutionary scenario, it would have taken millennia for one of the centromeres to become inactivated. Yet the yeast studies indicate that centromere loss must take place simultaneously with end-to-end fusion, which, given the nature of evolutionary mechanisms, it cannot.

Chromosome fusion in yeast leads to a loss of fitness.

Perhaps one of the most remarkable outcomes of this work is that the yeast cells remained viable after undergoing so many successive chromosome fusions. In fact, experts in synthetic biology, such as Gianni Liti (who commented on this work for Nature), expressed surprise that the yeast survived this radical genome restructuring.5

Though both research teams claimed that the fusions had little effect on the fitness of the yeast, the data suggest otherwise. The yeast cells with the fused chromosomes grew more slowly than wild-type cells and struggled to grow under certain culture conditions. In fact, when the Chinese research team cultured the yeast carrying the single fused chromosome alongside the wild-type strain, the wild-type cells out-competed the cells with the fused chromosome.

Although researchers observed changes in gene expression for only a small number of genes, this result is a bit misleading. The genes with changed expression patterns normally reside near telomeres, where their activity is kept low because they are usually needed only under specific growth conditions. With the removal of telomeres in the fused chromosomes, these genes are no longer properly regulated and may be over-expressed. Conversely, as a consequence of chromosome fusion, some genes that normally reside at a distance from telomeres now find themselves close to one, leading to reduced activity.

This altered gene expression pattern helps explain the slower growth rate of the yeast strain with fused chromosomes and the cells’ difficulty growing under certain conditions. The finding also raises more questions about the evolutionary scenario for the origin of human chromosome 2. Based on the yeast studies, it seems reasonable to think that the end-to-end fusion of chromosomes 2A and 2B would have reduced the fitness of the offspring that first inherited the fused chromosome 2, making it less likely that the fusion would have taken hold in the human gene pool.

Chromosome fusion in yeast leads to a loss of fertility.

Normally, yeast cells reproduce asexually, but they can also reproduce sexually. When yeast cells mate, they fuse, and the resulting cell has two sets of chromosomes. In this state, the yeast cells can divide or form spores. In many respects, the sexual reproduction of yeast cells resembles sexual reproduction in humans, in which egg and sperm cells, each with one set of chromosomes, fuse to form a zygote with two sets of chromosomes.


Figure 3: Yeast Cell Reproduction

Image credit: Shutterstock

Both research groups discovered that genetically engineered yeast cells with fused chromosomes could mate and form spores, but spore viability was lower than for wild-type yeast.

They also discovered that after the first round of chromosome fusions, when the genetically engineered yeast possessed 8 chromosomes, mating normal yeast cells with those harboring fused chromosomes resulted in low fertility. When wild-type yeast cells were mated with strains that had undergone additional rounds of chromosome fusion, spore formation failed altogether.

The synthetic biologists find this result encouraging: if yeast with fused chromosomes are used for biotechnology applications, there is little chance that the genetically engineered yeast will mate with wild-type yeast. In other words, the loss of fertility serves as a built-in safeguard.

However, this loss of fertility does not bode well for evolutionary explanations for the origin of human chromosome 2. The yeast studies indicate that chromosome fusion leads to a loss of fertility because of the mismatch in chromosome number, which makes it difficult for chromosomes to align and properly segregate during cell division. So, why wouldn't the same loss of fertility have occurred when chromosomes 2A and 2B fused?


Figure 4: Cell Division

Image credit: Shutterstock

In short, the theoretical concerns I expressed about the evolutionary origin of human chromosome 2 find experimental support in the yeast studies. And the indisputable role intelligent agency plays in designing and executing the protocols to fuse yeast chromosomes provides empirical evidence that a Creator must have intervened in some capacity to design human chromosome 2.

Of course, there are a number of outstanding questions that remain for a creation model interpretation of human chromosome 2, including:

  • Why would a Creator seemingly fuse together two chromosomes to create human chromosome 2?
  • Why does this chromosome possess internal telomere sequences?
  • Why does human chromosome 2 harbor seemingly nonfunctional centromere sequences?

We predict that as we learn more about the biology of human chromosome 2, we will discover a compelling rationale for the structural features of this chromosome, in a way that befits a Creator.

But at this juncture, the fusion of yeast chromosomes in the lab makes it hard to think that unguided evolutionary processes could ever successfully fuse two chromosomes end to end, including the fusion that purportedly produced human chromosome 2. Creation appears to make more sense.



  1. Jingchuan Luo et al., “Karyotype Engineering by Chromosome Fusion Leads to Reproductive Isolation in Yeast,” Nature 560 (2018): 392–96, doi:10.1038/s41586-018-0374-x; Yangyang Shao et al., “Creating a Functional Single-Chromosome Yeast,” Nature 560 (2018): 331–35, doi:10.1038/s41586-018-0382-x.
  2. Jorge J. Yunis, J. R. Sawyer, and K. Dunham, “The Striking Resemblance of High-Resolution G-Banded Chromosomes of Man and Chimpanzee,” Science 208 (1980): 1145–48, doi:10.1126/science.7375922; Jorge J. Yunis and Om Prakash, “The Origin of Man: A Chromosomal Pictorial Legacy,” Science 215 (1982): 1525–30, doi:10.1126/science.7063861.
  3. The centromere is a region of the DNA molecule near the center of the chromosome that serves as the point of attachment for the mitotic spindle during the cell division process. Telomeres are DNA sequences located at the tip ends of chromosomes designed to stabilize the chromosome and prevent it from undergoing degradation.
  4. J. W. Ijdo et al., “Origin of Human Chromosome 2: An Ancestral Telomere-Telomere Fusion,” Proceedings of the National Academy of Sciences USA 88 (1991): 9051–55, doi:10.1073/pnas.88.20.9051; Rosamaria Avarello et al., “Evidence for an Ancestral Alphoid Domain on the Long Arm of Human Chromosome 2,” Human Genetics 89 (1992): 247–49, doi:10.1007/BF00217134.
  5. Gianni Liti, “Yeast Chromosome Numbers Minimized Using Genome Editing,” Nature 560 (August 1, 2018): 317–18, doi:10.1038/d41586-018-05309-4.
Reprinted with permission by the author