Do we understand the genome well enough to let Big Pharma jump into it?

All the technology available today can entice us into believing that we can control the genome and bend it to our bidding. But the genome is deceptive, and hasty applications can have cascading consequences for future generations.

The human cell, the host of the genome, is a complex thing. We need a far deeper understanding of the genome before we can afford to deploy it against real-world problems.

Very few of the questions we ask truly matter; for most of them, we already have answers. But the question “Why are we who we are?” is one we have yet to fully grasp, let alone answer. For centuries, the most intelligent among us have tried to understand how we inherit specific traits: physical characteristics such as height and weight, looks and complexion, intelligence, and the most important of them all, personality and behaviour.

The cliché “It’s all in the genes” surfaced after we had spent most of the last century analysing human characteristics and their genetic underpinnings. Aided by some of the finest technological advancements, we have well and truly moved from a phase where Gregor Mendel laboriously counted individual variants in pea plants, through Watson and Crick’s seminal discovery of the structure of DNA, to being able to map the entire human genome in a ridiculously short span of time. More than 75% of what we know about the genome has stemmed from research in the last three to four decades. But the more we learn, the more we seem to realise that we have only just begun a seemingly endless voyage. In behavioural genetics especially, scientists have begun to appreciate how the simplest of foundations has led to the extraordinary complexity and interconnectedness of the genome.

Mendel, through his experiments with the pea plant, formulated his Laws of Inheritance. These laws, however, do not apply to most traits. Although thousands of disorders can be traced to a single gene, they represent only a fraction of the overall picture. Many characteristics such as height, intelligence, behaviour (and other personality traits), and many mental illnesses follow a far more complex pattern of inheritance.

Galton, a half-cousin of Charles Darwin, was one of the first to identify that many characteristics and traits follow a normal distribution (so-called Galtonian inheritance). Unfortunately, his study of intelligence led to the widespread belief that eliminating ‘feeble-minded’ individuals would produce a ‘more intelligent’ population. While Galton himself did not champion this strongly in his lifetime, his students espoused the view and set the stage for the eugenics movement.

Sir Francis Galton, a half-cousin of Charles Darwin, was a British anthropologist who pioneered the concept of eugenics. He was also the first to study the effects of human selective mating.

Heavily influenced by On the Origin of Species, Galton conducted the earliest twin studies and sowed the seeds for today’s human fingerprint classification system.

The movement, which had a massive support base in the U.S., led to one of the worst judgments in history when the court ruled that Carrie Buck, an 18-year-old woman, be sterilised because she had a low mental age. Delivering the judgment, the judge harshly declared, “Three generations of imbeciles are enough.” The eugenics movement quickly spread to Europe, and to Germany in particular. The Nazi regime saw it as a perfect tool to signal the superiority of their race and went about eliminating every person deemed physically or mentally unfit. After World War II, the field of behavioural genetics, which had held so much promise, was cast aside completely because of the terrible consequences endured by so many in the early 20th century. After almost 30 years of little to no research in the field, behavioural genetics experienced a timely revival, made possible by twin studies.

Nearly everyone has been intrigued by twins. Are they identical? Are they not? How do you tell them apart? Do they behave similarly? Such questions have made the study of twins one of the most interesting topics in psychology. While rudimentary twin studies had been done earlier, little had come of the research until a team from the University of Minnesota, led by Thomas Bouchard, began what would become a pioneering work in the history of behavioural genetics. The Minnesota Study of Twins Reared Apart (MISTRA) was a longitudinal study of monozygotic (identical) and dizygotic (non-identical) twins who had been reared apart.

The behaviourist movement, which held that all human behaviour could be taught and that the human being is born a ‘blank slate’, had been the predominant view until then. MISTRA, however, pointed to a very high correlation between monozygotic (MZ) twins for many traits; in some cases, the correlation was close to 0.9. Considering that the twins had been raised in different environments, the high degree of similarity could be attributed to genetics (MZ twins share 100% of their genetic material). In contrast, the correlation was much lower for dizygotic (DZ) twins, who share only 50% of their genetic material. The following two decades witnessed a spurt in the number of twin studies, many of them meta-analyses that combine results across numerous studies, with sample sizes exceeding 20,000. Although twin studies and their research methods have drawn some criticism, they remain one of the most popular and useful techniques for estimating heritability (the extent to which variation in a behaviour is attributable to genetics).
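
The arithmetic behind such estimates is surprisingly simple: since MZ twins share all their genes and DZ twins only half, doubling the gap between the two correlations gives a rough heritability estimate (Falconer’s method). A minimal sketch in Python; the correlations below are illustrative, not MISTRA’s actual figures.

```python
# Falconer's method: partition trait variance using twin correlations.
# Illustrative numbers only -- not MISTRA's actual results.

def falconer(r_mz: float, r_dz: float) -> dict:
    """Estimate variance components from MZ and DZ twin correlations."""
    return {
        "h2": round(2 * (r_mz - r_dz), 2),  # heritability
        "c2": round(2 * r_dz - r_mz, 2),    # shared (family) environment
        "e2": round(1 - r_mz, 2),           # unique environment + error
    }

# Hypothetical trait: identical twins correlate 0.86, fraternal twins 0.45.
print(falconer(r_mz=0.86, r_dz=0.45))
# -> {'h2': 0.82, 'c2': 0.04, 'e2': 0.14}
```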

Siddhartha Mukherjee, in his highly accessible book “The Gene”, discusses the debilitating mental illness of schizophrenia and how many members of his family were diagnosed with it. The genetics of schizophrenia is complicated: Genome-Wide Association Studies (GWAS) have made very large sample sizes possible, yet each variant they implicate explains only a tiny fraction of the risk. Twin studies, on the other hand, suggest a high concordance (40-45%) for schizophrenia.
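
Concordance itself is a simple count. The probandwise figure usually reported in such studies asks: given one affected twin, how often is the co-twin affected too? A minimal sketch, with made-up pair counts:

```python
# Twin concordance -- the statistic schizophrenia twin studies report.
# Pair counts below are invented purely for illustration.

concordant = 28  # pairs in which both twins are affected
discordant = 40  # pairs in which exactly one twin is affected

# Probandwise concordance: each affected twin in a concordant pair
# counts as a separate proband, hence the factor of two.
probandwise = 2 * concordant / (2 * concordant + discordant)

# Pairwise concordance: the plain fraction of concordant pairs.
pairwise = concordant / (concordant + discordant)

print(f"probandwise = {probandwise:.2f}, pairwise = {pairwise:.2f}")
# -> probandwise = 0.58, pairwise = 0.41
```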

What does this mean, then? Does it mean that we do not understand the genome and its functioning well enough, or that we lack the methods (and technology) to analyse it as well as we should? Given the supercomputing prowess of machines like IBM’s Watson, it has to be said (with a tinge of sadness, admittedly) that our understanding of the genome lags way behind. While the Human Genome Project succeeded in mapping the entire genome quickly, there is still much work to be done in grasping how the deceptively simple genome works.

Initially, most scientists pursued the more specific candidate-gene approach: by comparing how often a particular form of a gene (an allele) occurs in subjects with a disorder versus those without it, researchers tried to establish a genetic association. This technique, although widely used, has drawn flak because its findings rarely survive the statistical test; a hypothesis holds up only when it is confirmed across large, independent samples. Most scientists believe the single-gene approach will not work for complex traits. A telling example is a 1993 study that linked a gene on the X chromosome to homosexuality; the controversial finding was never replicated in larger populations. Behavioural scientists believe that in this case, as with many other personality traits, it is the gene-environment interaction that determines whether specific traits are expressed.
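
In practice, a candidate-gene comparison boils down to a two-by-two table of allele counts and a significance test. A minimal sketch (assuming scipy is available; all counts are invented):

```python
# Candidate-gene case-control test: is the 'risk' allele more common
# among affected subjects? All counts are invented for illustration.
from scipy.stats import chi2_contingency

#            risk allele, other allele
cases    = [180, 220]  # 400 alleles from 200 affected subjects
controls = [140, 260]  # 400 alleles from 200 unaffected subjects

chi2, p_value, _, _ = chi2_contingency([cases, controls])

# Odds ratio: how much likelier the risk allele is among cases.
odds_ratio = (cases[0] * controls[1]) / (cases[1] * controls[0])

print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
# -> odds ratio = 1.52, p = 0.0049
```

A nominally significant p-value in one small sample is exactly the trap the candidate-gene era fell into: without replication in large, independent cohorts, such hits very often turn out to be false positives.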

Similarly, studies of depression and violent behaviour have confirmed that certain mutations may predispose individuals to behave in a particular manner depending on the environment. When the environment is congenial, there may be virtually no difference in behaviour between a person with such a mutation and one without. When the environment is stressful, however, the carrier of the mutation is far more likely to suffer from depression.
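
This diathesis-stress pattern can be pictured as two risk curves that start together and fan apart as stress rises. A toy model in the same spirit; the coefficients and the risk variant itself are entirely made up:

```python
# Toy gene-environment interaction (diathesis-stress). Coefficients are
# invented purely to show the shape of the interaction, not real effect sizes.

def depression_risk(stress: float, carrier: bool) -> float:
    """Illustrative risk score for a given stress level in [0, 1]."""
    baseline = 0.08                    # risk in a congenial environment
    slope = 0.60 if carrier else 0.25  # carriers respond far more to stress
    return min(1.0, baseline + slope * stress)

for stress in (0.0, 0.5, 1.0):
    print(f"stress={stress:.1f}  "
          f"non-carrier={depression_risk(stress, False):.2f}  "
          f"carrier={depression_risk(stress, True):.2f}")
# stress=0.0: both groups sit at 0.08 -- indistinguishable when life is good
# stress=1.0: 0.33 vs 0.68 -- the mutation matters only under stress
```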

In fact, the last few years of research have revealed something quite stunning: the 98% of the genome once dismissed as ‘junk DNA’ turns out to play a very important role in gene regulation, determining which genes are expressed and which are silenced. How this regulatory machinery works is now being investigated in fields such as epigenetics.

Where does all this leave us on the medical front? How useful are the advances we have made in genetics when it actually comes to treatment, in both prevention and cure? Personalised medicine, which has become the buzzword for all of Big Pharma, is still nowhere close to being a reality. Yes, some elements are in place. People with certain genetic disorders who risk transmitting them to the next generation have been counselled against having children. Infants diagnosed with phenylketonuria (PKU) can be put on a phenylalanine-restricted diet so that they develop normally (in PKU, the body cannot break down phenylalanine, and its build-up damages the developing brain).

The ‘one size fits all’ approach to medicine no longer works, and pharmaceutical companies are fully aware that personalised medicine is the way forward. In the age of data and more data, every major pharma company is looking to access as much patient data as possible to read, analyse, and make better decisions in the research, development, and marketing of new drugs. Companies such as PatientsLikeMe, which maintain massive databases of patient data (age, prescriptions, dosage, symptoms, side effects, etc.), count most of Big Pharma among their clients.

Science fiction has always embraced the idea of genetically engineering a human being. While there have been warnings that this could create a monster, as in stories like Frankenstein, there remains an overwhelming hope that we could one day repair the genome and heal the sick or, better still, design and create a human being from a single cell. Gene therapy, unfortunately, has turned out to be far more challenging than anyone envisioned. Numerous failed attempts and a couple of botched efforts over the years have meant that the successful gene therapy treatments of Ashanti DeSilva and Cynthia Cutshall in the early 1990s have been more or less forgotten. In recent years, however, the invention of CRISPR-Cas9, the fantastic genome-editing technology, has given the scientific community renewed hope of cracking the conundrum that is the genome. There is a great deal of concern regarding the use of CRISPR, though: clinical researchers have agreed that it should conform to ethical and legal guidelines and not be used to experiment on germline cells, as that may lead to serious ramifications for future generations.

With all the technology available, it is very easy for us to get ahead of ourselves and imagine that we can ‘control’ what the genome does. It might be far more prudent to spend more time and resources first understanding the genome and its functioning, given its exceptionally high degree of complexity. The humongous interest in the space, the vast amounts of research money, the finest facilities, and the rapid developments in machine learning, Big Data, and AI can all be put to best use only if we muster the necessary humility and willingness to learn. The ghost of eugenics looms over humankind, warning us of a dark phase when we swung into action under the grand assumption that we knew enough.
