Before Genomics

James Lowe, University of Edinburgh

Genomics developed through the prospect of increasing the capacity and capability to sequence ever larger stretches of DNA more quickly and cheaply. It therefore owes its existence to a number of lines of scientific research, as well as to developments in the organisation and funding of science.

The disciplinary underpinnings include genetics, biochemistry, radiobiology, computer science and informatics, and statistics.

The broader developments include the rise of so-called ‘big science’, the increasing use of computers in biology, networking through the internet, and growing international collaboration.

The most obvious antecedent is the discipline of genetics, which arose in the early 20th century. Encompassing a set of methods and tools for investigating the phenomenon of heredity in the living world, genetics was inspired by theoretical developments such as the rediscovery of the experiments on plant hybrids conducted by the Silesia-born friar Gregor Mendel, and by experimental findings such as those on ‘pure lines’ of beans by the Danish botanist Wilhelm Johannsen. From the very beginning, genetics was driven by practical concerns, including animal and plant breeding and eugenics.

Genetics was initially associated with a Mendelian interest in the transmission of discrete units linked to differences in traits of interest, such as the colour of peas. In 1909, these Mendelian Faktoren became genes in the hands of Johannsen.

A rival approach to Mendelism was posed by the biometricians, who pioneered statistical approaches and worked primarily on continuously varying traits (such as height), rather than the discontinuous or discrete traits favoured by the Mendelians. In the opening decades of the 20th century, the two camps clashed over the nature of heredity and over whether Mendelian inheritance could be reconciled with Darwinian evolution.

These conflicts – between the Mendelians and the biometricians, and between Mendelism and Darwinism – were resolved by a series of statistical and theoretical developments from 1918 to the 1930s. Ronald A. Fisher demonstrated that continuous variation could be explained by the action of multiple Mendelian genes. The field of population genetics that grew out of the work of Fisher, J. B. S. Haldane and Sewall Wright enabled evolution by natural selection to be accounted for by changes in the frequencies of genes and gene variants (alleles).

This theoretical achievement synthesised the two main – Mendelian and biometric – approaches to heredity, and reconciled the resulting genetics with Darwinian evolution. This came at the cost of excluding embryology and the study of organismal development from the synthesis. The models developed from the formerly rival approaches nevertheless persisted, as they proved useful to geneticists working in more applied domains. Scientists interested in the role of genes in health and disease found the Mendelian approach of searching for particular genes valuable, while researchers aiming to improve the efficacy of the selective breeding of livestock animals found inspiration in the models of gene action pioneered by the biometricians.

From the 1940s, biochemists joined the genetics enterprise by seeking to uncover the chemistry of gene action. At this point, the significance of DNA (deoxyribonucleic acid) had not been established, and there were multiple theories as to the material basis of heredity and genes. Chromosomes – bodies in the nucleus of cells that distinguished themselves by being amenable to staining with various chemicals – had been identified as significant since the beginning of the 20th century. In 1944, an experiment conducted by Oswald Avery, Colin MacLeod and Maclyn McCarty produced significant evidence that DNA, a key component of chromosomes, actually formed the basis of genes.

While the immediate significance of the Avery-MacLeod-McCarty experiment is contested, eventually it became accepted that DNA was the material basis of genes. Its structure was determined by Francis Crick and James D. Watson in 1953, building on multiple lines of evidence provided by (among others) Erwin Chargaff, Rosalind Franklin, Linus Pauling and Maurice Wilkins. This prompted increasingly intense work over the following decades to establish the molecular biology behind the transcription and translation of stretches of the DNA molecule to produce particular proteins, which perform a myriad of structural and functional roles in the organism. Intermediary molecules such as messenger RNA were theoretically proposed and experimentally examined. Crick, in a characteristically provocative way, pronounced as the ‘Central Dogma’ of molecular biology his theory that information flows from DNA to RNA to proteins, but never the other way round. DNA was therefore conceptualised as a carrier of information – an informational molecule. The relationship of the four bases that make up DNA – adenine, thymine, cytosine and guanine (commonly referred to by their initials: A, T, C and G) – to the order of amino acids that make up proteins was conceived as a code. Working out that the code was triplet (based on ‘codons’ of three bases) and redundant (amino acids were typically coded for by more than one triplet) taxed the theoretical and experimental energies of multiple researchers.

The motivation for determining the sequence of bases in given parts of DNA molecules was now clear: if one knew the sequence of bases, then, given the code and the ‘Central Dogma’, one could work out the order of amino acids in the protein produced by the molecular machinery of the cell from the RNA transcribed from that DNA. And, if one could identify the function of the protein, one could thereby identify the function of the gene. This inference was complicated quite quickly by the realisation that the switching on and off of genes – the control of gene expression – is regulated in complex ways, and by the existence of ‘introns’, parts of DNA that are not reflected in the composition of the resultant proteins.
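To make that inference concrete, here is a minimal sketch in Python – a present-day illustration rather than anything available to researchers at the time – that reads a DNA sequence in triplets and looks each codon up in a deliberately partial codon table. The table, the example sequence and the function name `translate` are assumptions chosen purely to show the triplet and redundant character of the code described above.

```python
# Illustrative sketch only: a toy translation of a DNA coding-strand sequence
# into amino acids. The codon table is deliberately partial (the real code has
# 64 codons); it is enough to show that the code is read in triplets and is
# redundant (several codons can specify the same amino acid).

CODON_TABLE = {
    "ATG": "Met",                                              # also the usual start codon
    "TTT": "Phe", "TTC": "Phe",                                # redundancy: two codons, one amino acid
    "GCT": "Ala", "GCC": "Ala", "GCA": "Ala", "GCG": "Ala",    # four codons for alanine
    "AAA": "Lys", "AAG": "Lys",
    "GGT": "Gly", "GGC": "Gly", "GGA": "Gly", "GGG": "Gly",
    "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",               # stop signals end translation
}

def translate(dna: str) -> list[str]:
    """Read the sequence three bases at a time and look up each codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        codon = dna[i:i + 3]
        amino_acid = CODON_TABLE.get(codon, "???")  # '???' marks codons missing from this partial table
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return protein

# ATG TTT GCA AAA GGT TAA: a start codon, four further amino acids, then a stop codon.
print(translate("ATGTTTGCAAAAGGTTAA"))  # ['Met', 'Phe', 'Ala', 'Lys', 'Gly']
```

The sketch deliberately ignores everything the paragraph above flags as a complication – gene regulation, introns and the rest – which is precisely why the simple sequence-to-protein inference proved harder in practice than it first appeared.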

Nevertheless, sequencing offered the promise of being able to understand the biological basis of normal and abnormal processes in living organisms, and perhaps even to intervene in them. This prospect was enhanced in 1973 by the advent of recombinant DNA technology, which enabled scientists to ‘cut and paste’ and then clone (copy) genes in bacteria. Given the right conditions, bacteria multiply rapidly, and so the ability to transfer a gene into them could enable the mass production of whatever protein the gene coded for. Together with regulatory and legal changes and cultural shifts in scientific research, this helped to give birth to the biotechnology industry. Into the 1980s, genetics was becoming increasingly amenable to the methods and approaches of molecular biology, and one of the techniques that gathered interest was DNA sequencing.

Sequencing itself has its origins in biochemistry. See this article on the history of sequencing. For information on the role of computation and informatics in the history of sequencing, see this article. This article includes a discussion of the changing organisation of research, and this article discusses the radiobiological underpinnings of genomics.

Further reading:

‘A Cultural History of Heredity’, by Staffan Müller-Wille and Hans-Jörg Rheinberger (The University of Chicago Press, 2012)

‘The Century of the Gene’, by Evelyn Fox Keller (Harvard University Press, 2002)

‘The Gene: From Genetics to Postgenomics’, by Hans-Jörg Rheinberger and Staffan Müller-Wille; translated by Adam Bostanci (The University of Chicago Press, 2017)


Published online: January 2018
