"In the early days, computers inspired widespread awe and the popular press dubbed them giant brains. In fact, the computer's power resembled that of a bulldozer; it did not harness subtlety, though subtlety went into its design. It did mainly bookkeeping and math, by rote procedures, and it did them far more quickly than they had ever been done before. But computers were relatively scarce, and they were large and very expensive. Typically, one big machine served an entire organization. Often it lay behind a plate glass window, people in white gowns attending it, and those who wished to use it did so through intermediaries. Users were like supplicants. The process could be annoying".
This opening to Tracy Kidder's The Soul of a New Machine caused me to recognize another great component of my belief system: thirty years of experience in all of the phases of information system design and development. Thirty years of watching my first words, scratched on the top of a yellow legal pad, proselytize themselves across a corporate seascape, transforming the way that things have been done in the past into the way they are to be done in the future. This can hardly be compared in significance to the work of God or evolution, but it does repeat in microcosm some of the steps of those larger and longer projects.
Machines, and people acting like machines, have replaced a good deal of human thought, judgment, and recognition. Very few know how this system works, so for everyone else, it is like a mystical oracle, producing an unpredictable judgment. Mechanical, determinate processes are producing clever, astonishing decisions. Until about the year I was born, the only implementation of such a universal "machine" was thought to be in the human brain. Directly over the ear, in a place that you can almost cover with your thumb, lies the most important machine component of all, the place where we remember and handle words. At the bottom of this word spot, we remember how words sound. An inch further up and toward the back, we remember how words look in print. A little further up and forward lies the 'speech center' from which, when we want to talk, we direct the tongue and lips in what to say. Thus we get our word-hearing, our word-seeing, and our word-speaking centers close together, so that when we speak we have close by and handy our memory of what we have heard in words, and of what we have read.
Those of us who build mechanical implementations of this general purpose machine do not mean that the components of the machine should resemble the components of this 'wetware', or that their connections should imitate the manner in which the regions of the brain are connected. That the brain stores words, pictures, and skills in some definite way, connected with input signals from the senses and output signals to the muscles, was almost all we needed. The signalling and the organization was all there was. This put the proper emphasis not on the internal workings of the brain, but upon the explicit instructions that a human worker could follow blindly. Within a few years of my birth, a cornucopian series of 'instructional notes' for the new machines was being worked out by people like me, people who act like machines as they write elaborately coded instructional notes.
In the system development process these initial words and algorithms expand and translate into subroutines which then become modules, which clump to become programs, which converge to become subsystems, interfaces, messages, and finally functionally integrated systems. Observed from afar, this system may inspire widespread awe and as time passes its creation may be described in miraculous terms. The initial scratch pads of its words are thrown away and new members of the corporate staff are introduced to the functions of the system through a descriptive animation including colorful surface images concealing the tangled and somewhat intimidating spider web of its underlying code.
In the beginning was the "Logos," says the gospel of St. John. And even the Logos is not a basic building block because it is made up of some combination of letters from its related alphabet. The letters of the alphabet are, in turn, composed of some binary code of black and white dots, which we now encode in a scheme called ASCII: seven bits per character, giving 128 possibilities, conventionally stored in an eight-bit byte. Each of the human survival machines that I encounter each day contains 100 trillion (million million) cells, most of which are less than a tenth of a millimeter across. Inside of each cell there is a black blob called a nucleus. Inside the nucleus are two complete sets of the human genome (except in egg cells and sperm cells, which have one copy each, and red blood cells which have none). One set of the genome came from the mother and one from the father. In principle, each set includes the same twenty-three chromosomes. In practice, there are often small and subtle differences between the paternal and maternal versions of each gene, differences that account for blue eyes or brown, for example. When we breed, we pass on one complete set, but only after swapping bits of the paternal and maternal chromosomes in a procedure known as recombination.
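The claim that letters bottom out in binary dots can be sketched in a few lines of Python; this is a toy illustration of the ASCII idea, not of how any particular system stores its text.

```python
# Each ASCII character is a 7-bit pattern of ones and zeros,
# rendered here as a string of '1's and '0's.
def ascii_bits(ch: str) -> str:
    """Return the 7-bit ASCII pattern for a single character."""
    return format(ord(ch), "07b")

word = "Logos"
print([ascii_bits(c) for c in word])
# 'L' is code 76 in ASCII, i.e. the pattern '1001100'
```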
There are times when I am swept away by the romantic surfaces of these amazingly complex human survival machines (moist robots) and by the things they create with their hands and minds, but on occasion I also view them with a classic deconstructionist's eye and speculate as to how long it would take to design and construct a pure machine equivalent. Some moist robots seem to be very easy to functionally decompose so that the pure machine replica could pass the Turing Test of supplying answers resembling in complexity those given by the original. Others like Einstein, Hawking, and Picasso would be more difficult and would take longer periods to analyze and program. With minor adjustments, IBM's Watson which was able to lay waste to his moist robot opponents on the Jeopardy show, would be able to pass the Turing Test for most of us walking encyclopedias who have had little or no experience in the rough and tumble world of real human survival. Classicists, in turn, would reply that romantics would be easiest to replace with pure machines because romantics are merely collectors of sensory impressions without regard to underlying form and could be replaced with the simplest of processors attached to sophisticated sensory devices for seeing, touching, hearing, smelling, and tasting the environment. To paraphrase an expression used in the military, the romantic salutes the shiny and repaints that which is not. We pay homage to the beautiful and gloss that which does not meet our romantic modeling expectations.
As I have explained ad nauseum, our reality started from a point of infinite density known as the Big Bang about 13.8 billion years ago, expanded extremely rapidly for a fraction of a second, and has then continued to expand much more slowly ever since. Elemental particles cooled and coalesced into atoms which survived through time by becoming as stable as possible by binding to other atoms through chance encounters. Through binding, atoms enter their lowest possible energy configuration and contain the highest information content able to be captured in their bond. Once formed, this process is statistically irreversable, unless a large amount of energy is applied from outside the system in a thermoinfocomplexity reaction. Much later in time these atoms continued to coalesce into stars, planets, inorganic structures, more complex and more highly replicative organic structures, and ultimately complex adaptive systems currently topped on this planet by homo sapien sapiens.
My systems development years led me to the examination of moist robots with the same eye for the devil in the details. I see the Genome as the human survival machine's description manual with twenty-three chapters called Chromosomes. Each chapter covers several thousand modules called Genes. Each gene module is made up of paragraphs called Exons, which are interrupted by popup sponsor advertisements called Introns. Each paragraph is composed of words, called Codons, and each word is written in letters called Bases. Whereas my business system's description manuals were written in English words of varying length using 26 letters, genomes are written entirely in three-letter words using only four letters: A, C, G, and T (which stand respectively for Adenine, Cytosine, Guanine, and Thymine). And instead of being written in ASCII on flat pages of paper, they are written on long chains of sugar and phosphate called DNA molecules to which the bases are attached as side rings. Each chromosome is one pair of very long DNA molecules.
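The manual-within-manual hierarchy just described can be rendered as a toy data structure. The names below are illustrative only, not a real bioinformatics schema.

```python
# A toy model of the genome as a description manual:
# chromosomes contain genes, genes contain exons,
# and exons break into three-letter codons.
from dataclasses import dataclass, field

@dataclass
class Gene:
    name: str
    exons: list                 # each exon: a string of bases
    introns: list = field(default_factory=list)  # the 'advertisements'

@dataclass
class Chromosome:
    number: int
    genes: list

def codons(exon: str) -> list:
    """Split an exon's text into three-letter words (codons)."""
    return [exon[i:i + 3] for i in range(0, len(exon), 3)]

g = Gene(name="demo", exons=["ATGGCC"])
chapter = Chromosome(number=1, genes=[g])
print(codons(g.exons[0]))   # ['ATG', 'GCC']
```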
Much of the tedium of an information systems career follows the installation or birth of a system. The post installation maintenance begins with the copying, translation, and distribution of the system's description manual so that its content is available to the lower level minions responsible for maintenance and enhancement. The genome is a very clever system's description manual because in the right conditions it can both photocopy and read itself. The photocopying is known as Replication, and the reading as Translation. Replication works because of an ingenious property of the four bases: A likes to pair with T, and G with C. So a single strand of DNA can copy itself by assembling a complementary strand with Ts opposite all the As, As opposite all the Ts, Cs opposite all the Gs, and Gs opposite all the Cs. In fact, the usual form of DNA is the famous double helix of the original strand and its complementary pair intertwined. To make a copy of the complementary strand therefore brings back the original text. So the sequence ACGT becomes TGCA in the copy, which transcribes back to ACGT in the copy of the copy. This enables DNA to replicate indefinitely, yet still contain the same information.
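The base-pairing rule behind replication is simple enough to sketch directly; this minimal version ignores strand direction and the actual chemistry.

```python
# Replication's pairing rule: A <-> T, G <-> C.
PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement(strand: str) -> str:
    """Assemble the complementary strand, base by base."""
    return "".join(PAIR[b] for b in strand)

original = "ACGT"
copy = complement(original)        # 'TGCA'
copy_of_copy = complement(copy)    # back to 'ACGT'
print(copy, copy_of_copy)
assert copy_of_copy == original    # information survives indefinite copying
```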
Translation is a little more complicated. First the text of a gene is transcribed into a copy by the same base-pairing process, but this time the copy is not made of DNA but of RNA, a very slightly different chemical. RNA, too, can carry a linear code, and it uses the same letters as DNA except that it uses U, for Uracil, in place of Thymine (T). This RNA copy, called the Messenger RNA, is then edited by the excision of all introns and the splicing together of all exons. The messenger is then befriended by a Ribosome, itself made partly of RNA. The ribosome moves along the messenger, translating each three-letter codon into one letter of a different alphabet, an alphabet of 20 different amino acids, each brought by a different molecule called Transfer RNA. Each amino acid is attached to the last to form a chain in the same order as the codons. When the whole message has been translated, the chain of amino acids folds itself up into a distinctive shape that depends on its sequence. It is now known as a Protein.
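Transcription and translation can be sketched in miniature. The codon table below covers only four of the sixty-four codons, and the T-to-U replacement is the usual coding-strand shorthand rather than the full base-pairing mechanism.

```python
# A miniature codon table: just enough entries for the demo sequence.
CODON_TO_AMINO = {
    "AUG": "Met", "UUU": "Phe", "GGC": "Gly", "UAA": "STOP",
}

def transcribe(dna: str) -> str:
    """DNA -> messenger RNA: same letters, with U in place of T."""
    return dna.replace("T", "U")

def translate(mrna: str) -> list:
    """Read the message three letters at a time until a STOP codon."""
    chain = []
    for i in range(0, len(mrna) - 2, 3):
        amino = CODON_TO_AMINO[mrna[i:i + 3]]
        if amino == "STOP":
            break
        chain.append(amino)      # each amino acid attaches to the last
    return chain

mrna = transcribe("ATGTTTGGCTAA")   # 'AUGUUUGGCUAA'
print(translate(mrna))              # ['Met', 'Phe', 'Gly']
```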
Almost everything in the human survival machine, from hair to hormones, is either made of proteins or made by them in the form of lower level nano-servants called Enzymes. Every protein is a translated gene. In particular, the survival machine's chemical reactions are catalysed by these protein/enzymes. Even the processing, photocopying, error-correction, and assembly of DNA and RNA molecules themselves -- the replication and translation -- are done with the help of these nano-servant proteins. Proteins are also responsible for switching genes on and off, by physically attaching themselves to Promoter and Enhancer sequences near the start of the gene's text. Different genes are switched on in different parts of the body.
When genes are replicated, mistakes are sometimes made. A letter (base) is occasionally left out or the wrong letter inserted. Whole sentences or paragraphs are sometimes duplicated, omitted or reversed. This is known as mutation. Many mutations are neither harmful nor beneficial, for instance when they change one codon to another that has the same amino acid 'meaning': there are sixty-four different codons and only twenty amino acids, so many DNA 'words' share the same meaning. Human beings accumulate about one hundred mutations per generation, which may not seem much given that there are more than a million codons in the human genome, but in the wrong place even a single one can be fatal.
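A tiny sketch of why such mutations can be neutral: the four codons below all carry the same amino acid 'meaning' (glycine), so a single-letter swap among them changes nothing.

```python
# All four GG_ codons read as glycine, so swapping the third
# letter is a 'silent' mutation: the protein comes out the same.
GLYCINE_CODONS = {"GGU", "GGC", "GGA", "GGG"}

before, after = "GGC", "GGU"    # a single-letter mutation
same_meaning = before in GLYCINE_CODONS and after in GLYCINE_CODONS
print(same_meaning)             # True: the mutation is neutral
```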
Our bodies are composed of anywhere from 10 to 100 trillion human cells, 1,000 trillion bacteria, and 60,000 trillion (60 quadrillion) symbiotic mitochondria. Our guest bacteria live within us cooperatively, slowing the rate of energy dissipation within our bodies and defending us against harmful invaders. We are a superorganism of multiple cooperative agents.
Within the flattened membrane of the endoplasmic reticulum, which winds through the cell, connected to the nucleus, the cell's proteins are packaged and transported to different parts of the cell. To accomplish this task, the organelle places specific "labels" on each of the molecules it creates. Other organelles, the lysosomes, crunch up waste products by trapping them in a membrane pocket called a vacuole and then expelling the waste. Still other organelles, the mitochondria--around 600 of them in each cell--employ ion pumps to manufacture adenosine triphosphate, ATP. Every day our bodies recycle a mass of ATP equal to our entire body weight.
Are all of these parts just like bricks stacked up into a living human body? The answer is yes, if we look at the smallest part: atoms. But the answer is no at every other level, from the molecular to the cellular and higher. The proteins that are the workhorses inside our cells are amazingly competent and discriminating little robots--nanobots, as we might call them. Like the operating system of a computer, a benevolent scheduler system doles out our machine cycles to whatever process has the highest priority, and although there may be a bidding mechanism of one sort or another that determines which processes get priority, this is an orderly queue, not a struggle for life. As Marx would have it, "From each according to his abilities, to each according to his needs." A dim appreciation of this fact underlies the common folk intuition that a computer could never care about anything. Not because it is made of the wrong materials--why should silicon be any less suitable a substrate for caring than carbon?--but because its internal economy has no built-in risks or opportunities, so it doesn't have to care. The general run of the cells that comprise our bodies are probably just willing slaves--rather like the selfless, sterile worker ants in a colony, doing stereotypic jobs and living out their lives in a relatively noncompetitive (Marxist) environment.
The neurons that do most of the transmission and switching and adjusting work in our brain are more versatile and competent robots--microbots, we might call them. They form coalitions that compete and cooperate in larger structures, communicating back and forth, suppressing each other, analyzing the flood of information from the senses, waking up dormant information structures "in memory" (which is not a separate place in the brain) and orchestrating the subtle cascades of signals that move our muscles when we act. These cells must compete vigorously in a marketplace. For what? What could a neuron want? The energy and raw materials to continue to thrive--just like its unicellular eukaryote ancestors and more distant cousins, the bacteria and archaea. Neurons are biological robots: they are certainly not conscious in any rich sense. Remember, they are eukaryotic cells, akin to yeast cells or fungi. If individual neurons are conscious, then so is athlete's foot! But neurons, like their mindless unicellular cousins, are highly competent agents in the life-or-death struggle, not in the environment between your toes, but in the demanding environment of the brain, where the victories go to the cells that can network more effectively and contribute to more influential trends at the virtual machine levels where large-scale human purposes and urges are discernible. It's like a biological Washington DC, "a world driven by insecurity, hypocrisy and cable hits, where relationships are transactional, blind-copying is rampant and acts of public service appear largely accidental."
Many of the subsystems in the nervous system are organized as opponent processes, engaged in a tug-of-war between two sub-subsystems, each trying to have its own way. (Our emotions, for instance, are well seen as rival storms displacing each other as best they can, thwarting each other or collaborating against yet another storm.) The opponent-process dynamics of emotions, and the roles they play in controlling our minds, are underpinned by an economy of neurochemistry that harnesses the competitive talents of individual neurons. The dream of every cell is to become two cells; neurons vie to stay active and be influential, but do not dream of multiplying. Intelligent control of an animal's behavior is still a computational process--in the same way that a transaction in the stock market is a computational process--but the neurons are selfish neurons striving to maximize their intake of the different currencies of reward to be found in the brain. What do neurons buy with their dopamine, serotonin, or oxytocin? They are purchasing greater influence in the networks in which they participate, and hence greater security. The fact that mules are sterile doesn't stop them from fending for themselves, and neurons can similarly be moved by self-protective instincts they inherited from their reproducing ancestors.
All these levels higher than the basic atomic building blocks exhibit a degree of agency. In other words, they are interpretable as intentional systems. At the molecular level (motor proteins, DNA, proofreading enzymes, gatekeepers at trillions of portals in the membranes of our cells, and the like), their competencies are very "robotic" but still impressive, like the armies of marching brooms in The Sorcerer's Apprentice or Maxwell's Demon, to take two fictional examples. At the cell level, the individual neurons are more exploratory in their behavior, poking around in search of better connections, changing their patterns of firing as a function of their recent experiences. They are like prisoners or slaves rather than mere machines (like the protein nanobots); you might think of them as nerve cells in jail cells, myopically engaged in mass projects of which they have no inkling, but ever eager to improve their lot by changing their policies. At higher levels, the myopia begins to dissipate, as groups of cells--tracts, columns, ganglia, "nuclei"--take on specialized roles that are sensitive to ever wider conditions, including conditions in the external world. The sense of agency here is even stronger, because the "jobs done" require considerable discernment and even decision-making.
These agents are like white-collar workers, analysts and executives with particular responsibilities, but also, like white-collar workers everywhere, they have a healthy dose of competitive zeal and a willingness to appropriate any power they encounter in the course of their activities, or even usurp control of any ill-defended activities engaged in by their neighbors or others in communication with them. When we reach agents at this level of competence, the sub-personal parts are intelligent bricks indeed, and we can begin to see, at least in sketchy outline, how we might fashion a whole comprehending person out of them. ("Some assembly required," as it says on the carton containing all the bicycle parts, but at least we don't have to cut and bend the metal, and make the nuts and bolts.)
This idea, that we can divide and conquer the daunting problem of imagining how a person could be composed of (nothing but) mindless molecules, can be looked at bottom-up, as we have just done, or top-down, starting with the whole person and asking what smallish collection of very small homunculi could conspire to do all the jobs that have to be done to keep a person going. Plato pioneered the top-down approach. His analysis of the soul into three agent-like parts, analogized to the Guardians, the Auxiliaries, and the Workers, or the rational, the spirited, and the appetitive, was not a very good start, for reasons well analyzed over the last two millennia. Freud's id, ego, and superego of the last century was something of an improvement, but the enterprise of breaking down the whole into subminds really began to take shape with the invention of the computer and the birth of the field of artificial intelligence, which at the outset had the explicit goal of analyzing the cognitive competences of a whole (adult, conscious, language-using) person into a vast network of sub-personal specialists, such as the goal-generator, the memory-searcher, the plan-evaluator, the perception-analyzer, the sentence-parser, and so on.
Sometime around five million years ago the fossil record suggests that Africa became drier. The forests retreated and a group of chimp-like apes--perhaps virtually the same as modern chimps--was forced onto the open savanna. There they became more dependent upon a diet of meat (the fruits of the forest having gone). The ape ancestors of man must already have had a complex social life--as do modern chimps--and hunting would have encouraged further cooperativeness, for hunting in general is more efficient in groups. Cooperativeness encourages communication in general. And this complex social living provided the selective pressure that encouraged consciousness to evolve: for consciousness began as self-consciousness, as our ancestors strove to "read the thoughts" of their companions, so as to reduce the chances of conflict.
The general tendency to communicate was abetted by a change in the anatomy of the larynx. In virtually all mammals except human beings, the larynx sits very high in the throat, and as well as serving as an organ of sound, it functions as a valve: it enables the mammal to breathe and drink at the same time. In humans, the larynx has descended, and now sits halfway down the throat. It is useless as a valve, and we choke if we attempt to breathe and drink simultaneously, because the drink is liable to go down the windpipe. But now, above the larynx, there is a space that serves as a sound chamber, and although the larynx has lost its value as a valve, it has gained enormously as a sound box. We can make a huge variety of sounds, and these, when further modified by our intricately muscled tongue and lips, give us the facility for speech. No one knows why this laryngeal descent occurred--what selective pressures triggered it off--but for whatever reason, it has proved a tremendous boon, which far outweighs the occasional death by choking. A desire to communicate, a growing consciousness that caused us increasingly to monitor and direct our own thoughts, and the ability to articulate a wide variety of sounds to express these thoughts clearly--these are the components of human language. Human language on the one hand has become the handmaiden and the director of conscious thought; on the other, it provides the means by which, as a species, we pool our thoughts, so that each of us can in principle partake of the thoughts of all other human beings who have ever lived (and many of our deepest thoughts, and our everyday inventions, have derived by word of mouth or by example from preliterate times).
So there we have it: a perfectly good ape, adapted for the forest, very like a modern chimp, forced into open country by a change of climate; an apparently gratuitous change in the anatomy of the throat; time and chance, putting pressure on that ape to change, to become more intelligent and more communicative. We initially diverged from apes, it seems, when we became hunters, and it is easy to appreciate that a hunter should benefit from at least rudimentary arithmetic. Lose sight of one from a group of five buffalo and the fifth might sneak up behind you. Fail to count your companions, and you may find yourself on your own. But natural selection could not alone have produced creatures like us and orthodox biologists must invoke two very plausible principles, both of which are known to operate in other contexts.
The first is the notion of combination. Vervet monkeys, for example, warn one another that predators are about. Furthermore, they make a different sound for “snake” than they do for “eagle.” There is a clear selective advantage in this, for different predators require different escape strategies. The earliest human beings lived exposed on the savanna, hunted by some animals and hunters of others. Clearly the simple trick that was already developed in monkeys would serve them, too: attaching different sounds to different animals. Selection would also favor an increase in versatility, to differentiate not only eagles and snakes but between zebras, lions, tortoises, bees, leopards, antelopes, and all the thousand and one other potential foodstuffs and hazards of the African plains. The descent of the larynx into the throat would clearly have assisted in this, by increasing the range of sounds possible. But one further trick would have increased the vocabulary even further, that of combining different sounds. Natural selection would have favored a creature that could combine, say, the snake sound with the eagle sound to connote “lion,” or eagle sound with snake sound to mean “elephant.” But once this process begins—“eagle-eagle-snake” meaning baboon; “snake-eagle-snake” meaning honey badger—it becomes potentially infinite, and the path to infinity is smoothed if the animal can make more than two basic sounds, as early humans may well have been able to do. Thus a simple trick—in which there would have been obvious selective advantage—could have given rise to a potentially infinite vocabulary. Vocabulary is not synonymous with language in the human sense. Syntax must be added to provide order. But a potentially infinite vocabulary is at least the raw material of human language.
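The arithmetic behind this combinatorial explosion is easy to sketch: with k basic calls and messages up to n calls long, the vocabulary is k + k^2 + ... + k^n.

```python
# Vocabulary size when k distinct calls can be strung together
# into messages from one to n calls long.
def vocabulary_size(k: int, n: int) -> int:
    return sum(k ** length for length in range(1, n + 1))

print(vocabulary_size(2, 1))   # 2: just 'eagle' and 'snake' alone
print(vocabulary_size(2, 3))   # 14: up to three-call combinations
print(vocabulary_size(5, 5))   # 3905: a few more calls, a few more slots
```

The point the paragraph makes falls out of the exponent: lengthening the message or adding one more basic sound multiplies the vocabulary, which is why the path to a potentially infinite vocabulary is so short.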
It also seems that manual dexterity and human intelligence have had a positive feedback upon one another. The ability to manipulate objects offers all kinds of possibilities, notably in tool using and tool making. There is clearly selective advantage in exploiting those possibilities. Thus manual dexterity imposes a selective pressure in favor of intelligence. Intelligence in turn suggests more possible tasks, which in turn increases the selective pressure that favors manual versatility. And so on. By the same token, the descended larynx made it possible to produce a greater range of sounds, and the increase in vocabulary that resulted from this would have given a selective advantage, which in turn would have favored individuals with an even more descended larynx and even greater range of sound.
Together, the positive-feedback loop and the combination of existing faculties can produce leaps in evolutionary progress that outstrip what otherwise seems possible. These two mechanisms are in turn driven by competition, for, as Darwin suggested, those creatures that achieve an edge over their fellows, by whatever means, are the ones most likely to produce the most offspring. In this scenario, then, Einstein and Shakespeare emerge, not as preconceived pinnacles of creation, but as side effects: the chance and inadvertent combinations of combination and feedback loop, acting on qualities that originally were shaped by the need to operate in collectives, to hunt, and to avoid being hunted.
From this village-sized population, the world was peopled. And, since people in societies around the world behave in much the same way, the principal elements must already have been present in the ancestral human population before its dispersal within Africa and to the world beyond. It would be of greatest interest to know everything about this population--its way of life, its social structure, the roles of men and women, its religion, the language that its members spoke. Not a trace of these first people has been found by archaeologists. Yet despite the total lack of direct evidence, a surprising amount can now be inferred about them. Geneticists can estimate how large the population was and, by identifying its closest descendants, can point to where in Africa the ancestral population may have lived. They can even say something about the language the ancestral people spoke. And by analyzing the behaviors common to societies around the world, particularly the hunter-gatherers who seem closest to the ancestral people, anthropologists can describe how the ancestral population probably lived and what its people were like.
Because all of our current gene machines are descended from this ancestral population, geneticists can infer some of its properties by analyzing the DNA of living people, then working backward. Two parts of the human genome are particularly useful for this purpose. One is the Y chromosome, the only chromosome possessed by men alone. The other is known as mitochondrial DNA. These are the only two parts of the genome that escape the shuffling of genetic material between generations. The shuffling, an evolutionary mechanism for generating diversity at each generation, means that almost all the parts of the human genome have a pedigree that is at present too complex to untangle.
Unlike most pairs of chromosomes, the X and Y do not exchange segments of DNA between generations (except at their very tips). This is to ensure that the Y's most important gene, the one that makes a person male, never gets shuffled into the X chromosome. The Y chromosome is therefore passed down essentially unchanged from father to son, generation after generation. Mitochondrial DNA escapes shuffling through a different process. Mitochondria, cellular components that generate chemical energy, are former bacteria that were enslaved long ago by animal cells. They live in the main body of the cell, outside the nucleus that holds the chromosomes. When the sperm fuses with the egg, all the sperm's mitochondria are destroyed, leaving the fertilized egg equipped with only the mother's mitochondria. Because of this arrangement, mitochondria are bequeathed unchanged from mother to child (and a man's mitochondria are not passed on to his children).
In addition to their exempt status, the Y chromosome and mitochondrial DNA each have a special and surprising property of uniqueness. All men in the world today carry the same Y chromosome, and both men and women carry the same mitochondria. All of today's Y chromosomes were inherited from the same single source: a Y chromosome carried by an individual male who belonged to, or lived slightly before, the ancestral human population. The same is true of mitochondrial DNA: everyone carries the same mitochondrial DNA because all are copies of the same original, the mitochondrial DNA belonging to a single woman. The metaphor is hard to avoid--this is Adam's Y chromosome, and Eve's mitochondrial DNA. The ancestral population, of course, included many Adams and Eves, indeed about 2,500 of each if the geneticists' calculations are to be believed. So how did it come about that just one man bequeathed his Y chromosome to the whole world and one woman her mitochondria?
It's a curious fact of genetics that one version of a gene, especially in small populations, can displace all the other existing versions of the same gene in just a few generations, through a purely random process called genetic drift. Consider how this might work among surnames, which are passed on from father to son just like Y chromosomes. Suppose a hundred families are living on an island, each with a different surname. In the first generation, many of these families will have only daughters or no children at all. So in just a single generation, all those families' surnames (and accompanying Y chromosomes) will go extinct. Assuming no male settlers arrive on the island, the same unavoidable winnowing will happen each generation until only one surname (and Y chromosome) is left. That is what has happened with the human Y chromosome. Every Y chromosome that exists today is a copy of the same original, carried by a single individual in the ancestral population. The Y chromosomes of all the other Adams have perished at some point along the way when their owners had no sons.
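The surname thought experiment can be run as a toy simulation. This is a fixed-size drift model, chosen for simplicity: each generation has the same number of sons, and each son takes his surname from a father drawn at random from the previous generation. The numbers are illustrative.

```python
# Genetic drift among surnames: purely random inheritance,
# yet one surname always ends up displacing all the others.
import random

def generations_until_one_surname(n_men: int = 100, seed: int = 42) -> int:
    random.seed(seed)                    # fixed seed: repeatable illustration
    surnames = list(range(n_men))        # start with n distinct surnames
    generations = 0
    while len(set(surnames)) > 1:
        # each son picks a father at random and inherits his surname
        surnames = [random.choice(surnames) for _ in range(n_men)]
        generations += 1
    return generations

gens = generations_until_one_surname()
print(gens)   # a finite count: drift has fixed a single surname
```

Nothing favors any surname; the winnowing is pure chance, which is the paragraph's point about the one surviving Y chromosome.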
But despite all being copies of the same original, Y chromosomes are not identical. Over the generations, mutations--the switch of one of the four kinds of DNA units for another--have built up on the Y. The mutations are harmless but serve the invaluable purpose, for geneticists, of assigning the owners of Y chromosomes to different male lineages. The reason is that once a man has acquired a novel mutation on his Y chromosome, all of his sons will carry that mutation, and no one else will. If one of the sons has a second mutation, all of his descendants will carry both mutations. Each new mutation thus creates a fork in the family tree--between those who carry it and those who don't--and stands at the head of all the lineages below it. By looking at the most informative of the mutations on the Y chromosome, geneticists can assign every man to one lineage or another. Since there is only one original Y chromosome, all these lineages or branches eventually coalesce to a single trunk, the Y chromosome of the original "Adam." Of particular help in defining the ancestral human population is the lineage of the men who left Africa. A few men inside Africa, and all men outside it, carry a Y chromosome mutation known as M168. This means that modern humans left Africa sometime shortly after the M168 mutation occurred. Based on the mutation-counting method, one recent estimate is that M168 occurred 44,000 years ago.
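The way a mutation marks exactly the descendants of its first carrier can be shown with a toy pedigree. All names and mutation labels below are invented; real markers such as M168 work the same way in principle.

```python
# Each man's Y chromosome carries his father's mutations plus, rarely, a
# new one, so every mutation defines the sub-lineage descended from its
# first carrier. This pedigree is invented for illustration.
lineages = {
    "Adam":        set(),
    "son_A":       {"M1"},        # new mutation M1 heads a lineage
    "son_B":       set(),
    "grandson_A1": {"M1", "M2"},  # M2 forks son_A's lineage again
    "grandson_A2": {"M1"},
}

def carriers(mutation):
    """All men whose Y chromosome carries the given mutation."""
    return sorted(m for m, muts in lineages.items() if mutation in muts)

print(carriers("M1"))  # the whole M1 lineage: son_A and his descendants
print(carriers("M2"))  # the sub-lineage below grandson_A1
```

Because inheritance only ever adds mutations, the set of carriers of any one mutation is always a complete branch of the tree, which is what lets geneticists assign every man to a lineage.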
The women's lineages, like the men's, have all turned out to be branches from a single root, the mitochondrial DNA possessed by a single woman who lived in or before the ancestral human population. The mitochondrial Eve appears to have lived considerably earlier than the Y chromosomal Adam--about 150,000 years ago--but that may reflect the difficulty of dating mitochondrial DNA, which gathers mutations more rapidly than does the Y chromosome. This mitochondrial genealogy of humankind has three main branches, known as L1, L2 and L3. L1 and L2 are confined to Africans who live south of the Sahara. The L3 branch gave rise to a lineage known as M, and it was the descendants of M who left Africa.
A single tribe of our fishing ancestors crossed from Africa to the Arabian peninsula via the Gate of Grief about 50,000 years ago, during the Pleistocene ice age, when sea levels were 200 feet lower, carrying those gene mutations to the rest of the world. Spells of favorable climate may also have drawn Neanderthals down from the north. The Neanderthals may have thwarted previous attempts by humans in Africa to cross into Arabia, just as they crushed the attempts of anatomically modern humans to penetrate the Middle East. By 50,000 years ago, however, the Neanderthals would have faced a different adversary. The ancestral people, with their new gift of language, would have enjoyed better organization and superior weaponry. Though physically weaker than the Neanderthals, the new model of humans may at last have gained an edge over their fierce archaic relatives.
Hugging the southern coast of the Arabian peninsula, this little band of dark-pigmented Africans and their descendants made their way to their first stopping point, India; we know this because it is there that we find the first diversifications of their mitochondrial and Y chromosome trees. The M and N lineages that came out of Africa are still frequent in today's Indian population. The M lineage is very common, and its mutations are older than those found farther east, supporting the idea that the Indian subcontinent was settled soon after the African exodus. On the Y chromosome side, several offshoots of the early male lineage are restricted to the Indian subcontinent.
In India there was a historic parting of the ways. Some people continued the coast-hugging, population budding process along the southern shores of Asia, eventually reaching the Australian land mass, China and Japan. Others pushed inland in a northwesterly direction, through the lands that are now Iran and Turkey, and began the long contest with the Neanderthals for the possession of Europe. Both paths tested the power of the new modern people to innovate, survive in hostile surroundings, and overcome daunting obstacles.
The group expanding along the coast pushed eastward around India and Indochina, eventually reaching the two lost continents of Sunda and Sahul. With sea level much lower 50,000 years ago than it is now, the Malay peninsula and the islands of Sumatra, Java, and Borneo formed a single land mass known as Sunda or Sundaland, which was a southern extension of the Asian land mass. Australia was then connected to New Guinea in the north and to Tasmania in the south, the three islands forming the lost continent known as Sahul, directly south of Sunda.
While some people crossed the straits from Sunda to Sahul, others continued eastward around the southern borders of Sunda. They would have followed the coastline northward, up the eastern coast of China until they reached Japan and the Kamchatka peninsula, leaving a trail of settlements in their wake. The penetration of the Eurasian land mass would have brought modern humans into conflict with the Neanderthals in the west and perhaps with Homo erectus in the east. Possibly this invasion was delayed for many generations until the innovative moderns had developed the necessary weapons and tactics to defeat the archaics, or perhaps until they had evolved the genetic adaptations for living in cold climates.
There is no doubt that human evolution continued after the dispersal of the ancestral population 50,000 years ago, and took different forms in different populations. Much of this evolution may have been convergent, as each population adapted with different alleles to the same challenges. But convergent evolution does not necessarily proceed in lockstep in each separate population. For many millennia people would probably all have had dark skin, just as do the relict populations of Australia, New Guinea, and the Andaman Islands. It seems likely that the first modern humans who reached Europe 45,000 years ago would also have retained black skin and other African features. The Neanderthals, on the other hand, may have lived in northern climates long enough for the melanocortin receptor gene, which controls skin color, to have reverted to its default state of producing pale skin. Though there exists no direct evidence as to skin color, and the point is only a curiosity, the Neanderthals may have had light skin and their conquerors black. Early Europeans, including the great artists of the Chauvet cave in France, may have retained the dark skin and other badges of their African origin for many thousands of years.
One striking genetic differentiation involves two genes involved in the construction of the human brain. Each of these genes has several alternative versions, or alleles, but in each case one specific allele has become much more widespread than the others in certain populations. For an allele to rise to high frequency very quickly is the signature of natural selection hard at work, so presumably each allele conferred some very strong selective advantage. One of the alleles is an alternative version of a gene known as microcephalin. The allele appeared around 37,000 years ago and is now carried by some 70% of many populations of Europe and East Asia. The allele is less common in sub-Saharan Africa, where it is typically carried by from 0 to 25% of the population.
Just some 6,000 years ago a new allele of another brain gene, known as ASPM, appeared in the Middle East or Europe and rapidly rose to prominence, being carried by about 50% of the people in those populations. The allele is less common in East Asia and occurs hardly at all in sub-Saharan Africans. What made these two alleles spread so quickly? It seems likely that each conferred some cognitive advantage, perhaps a slight one, yet enough for natural selection to work on. There is at present no evidence that the microcephalin or ASPM genes do anything other than determine brain size. Some genes do play more than one role, but no other functions have been detected for microcephalin and ASPM. Their role in the brain, however, is well established. They first came to light because they are disabled in people with microcephaly, a condition in which the brain is much smaller than usual, particularly in the cerebral hemispheres that are the site of the brain's higher cognitive functions. The brain has grown larger over evolutionary time through a succession of new and more powerful versions of genes like microcephalin and ASPM; the recent spread of these two alleles may be just a continuation of that process. So it could be that the spread of the microcephalin allele some 37,000 years ago expanded the cognitive power of Caucasian populations and underlay such striking cultural advances as the Aurignacian people's adeptness at painting caves, while other populations developed such capabilities later. Isolated on their separate continents, the far-flung branches of the human gene machines were to follow different trajectories as each adapted to the strange world that lay beyond the boundaries of their ancestral homeland.
The Last Glacial Maximum preceded the emergence not only of people who looked somewhat different from each other but, far more significantly, of people who behaved differently from all their predecessors. In the southern borders of the western half of Eurasia, around the eastern shores of the Mediterranean, a new kind of human society evolved, one in which hunters and gatherers at last developed the behaviors necessary for living in settled communities. One possibility is that some evolutionary adaptation had first to occur in human social behavior. That would explain why it took so many generations for people to settle down. The adaptation, probably mediated by a suite of genetic changes, would have produced new behaviors, perhaps ones that made people readier to live together in large groups, to coexist without constant fighting, and to accept the imposition of chieftains and hierarchy. This first change, toward lesser aggressiveness, would have created the novel environment of a settled society, which in turn prompted a sequence of further adaptations, including perhaps a different set of intellectual capacities rewarded by the institution of property.
The people of the Near East, having developed suites of domesticated plant and animal species, expanded their farming activities north and west into Europe, and perhaps the original inhabitants began to imitate their success by settling down and adopting the new technology. Or the new farming groups, if composed largely of men in search of new land, may simply have captured women from the indigenous groups. The farmers' Neolithic genes would have become more diluted, generation by generation, as they and their new culture pushed farther into Europe.
One widely studied Near East community is that of the Jews whose remarkable feature, from which all others follow, is that they have to an amazing extent married among themselves over the centuries. Jewish communities, in other words, have been largely endogamous, at least until recent times, which means the population's gene pool has had time to develop its own private history, and this genetic history has shed light on many historical events.
An important consequence of endogamy is that the gene pool is not diluted through intermarriage, and so the selective pressures that may act on a population are able, over time, to favor specific genetic variations. A striking possibility, plausible though not yet confirmed, is that one particular Jewish community, the Ashkenazim of northern and central Europe, lived for a long time under a harsh selective pressure that raised certain variant genes to high frequency. These variant genes are well known to physicians because of their serious side effects--when inherited from both parents they cause a variety of serious diseases. But the variant genes may also confer a special benefit in the form of increased intelligence. The selective pressure was the restriction of the Ashkenazim by their European host populations to a small number of occupations that happened to require higher than usual mental agility.
Judaism is a religion, open to others who convert, and it has long seemed that religion and culture, not necessarily genetics, were the common elements among the world's various Jewish communities. But many men from far-flung Jewish communities have the same set of variations on their Y chromosomes. The variations are not exclusive to Jews but are common throughout the Middle East. The founding fathers of Jewish communities around the world were drawn from the same ancestral Middle Eastern population of 4,000 years ago from which other peoples, such as Arabs, Turks, and Armenians, are also descended. These generic Middle Eastern Y chromosomes, part of the J branch of the worldwide Y chromosome family tree, are both a common link between men of different Jewish communities and proof that their communities must have remained genetically separate from their non-Middle Eastern host populations.
However, the mitochondrial DNA in each Jewish community does not closely resemble that of any other population, meaning that the geographic origin of the founding mothers of Jewish communities cannot be identified with certainty, though they presumably came from the host communities. The Jewish men, arriving perhaps as traders and presumably unmarried, took wives from the local population in each country and, it seems, then converted their wives to Judaism. Once the community was established and reached sufficient size, it became closed: no more wives were taken from the host population, and community members married among themselves.
For whatever reason, the Ashkenazim have an average IQ of 115, one standard deviation above that of northern Europeans. This is the highest average IQ of any ethnic group for which reliable data exist. Such an advantage may not make much difference at the average, where most people are situated, but it translates into a significant difference at the extremes. The proportion of northern Europeans with IQs greater than 140 is 4 per thousand but the figure for Ashkenazim is 23 per thousand, a sixfold difference. This may have something to do with the fact that Ashkenazim make up only 3% of the U.S. population but have won 27% of U.S. Nobel prizes. Ashkenazim account for more than half of the world's chess champions. Jews and half Jews, who make up about 0.2 percent of the world's population, have won a total of 155 Nobel prizes in all fields, 117 in physics, chemistry and medicine.
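The per-thousand figures can be checked against the standard model of IQ as a normal distribution with a standard deviation of 15. A quick sketch (means below are illustrative) reproduces the 4-per-thousand figure for a mean of 100; note that 23 per thousand above 140 corresponds to a group mean near 110, while a mean of 115 would imply roughly 48 per thousand.

```python
from statistics import NormalDist

# Proportion per thousand scoring above IQ 140 under a normal model
# with standard deviation 15, for three illustrative group means.
for mean in (100, 110, 115):
    tail = 1 - NormalDist(mu=mean, sigma=15).cdf(140)
    print(mean, round(tail * 1000, 1))
```

This is exactly the point the text makes about extremes: a modest shift in the mean multiplies the proportion in the far tail several times over.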
While people were shaping the genetics of domesticated plants and animals by altering various features of their environment, a curious thing was happening to people themselves. Their genetics too were changing as they adapted to settled societies. The ease with which the human genome responds to cultural change is illustrated by a physiological adaptation: the unusual ability to continue to digest milk in adulthood, otherwise known as lactose tolerance. Lactose is a special sugar that accounts for most of the calories in mother's milk. The gene for lactase, the enzyme that digests lactose, is switched on just before birth and, in most people, switched off after weaning. Because lactose does not occur naturally in most people's diet, it would be a waste of the body's resources to continue to make the lactase enzyme. But in people of mostly northern European extraction, and to some extent in African and Bedouin tribes that drink raw milk, the lactase gene remains switched on into early adulthood or throughout life. Among these milk drinkers, the ability to digest the lactose in cow's, sheep's, or goat's milk evidently conferred so great a benefit that the genetic mutation conferring the ability became widespread.
At the turn of the twenty-first century, we passed a detectable turning point in both neuroscience and computing power. For the first time in history, we collectively know enough about our own brains, and have developed such advanced computing technology, that we can now seriously undertake the construction of a verifiable, real-time, high-resolution model of significant parts of our intelligence. Within many of our lifetimes, genetic and molecular science will extend biology and correct its obvious flaws (such as our vulnerability to disease). By about the year 2020, the full effects of the genetic revolution will be felt across society. We are rapidly gaining the knowledge and the tools to drastically extend the usability of the “replication organism” each of us calls his body and brain. Nanomedicine experts estimate that eliminating 50% of medically preventable conditions would extend human life expectancy to 150 years. If we were able to prevent 99% of naturally occurring medical problems, we’d live to be more than 1,000 years old. We can see the beginnings of this awesome medical revolution today. The field of genetic biotechnology is fueled by a growing arsenal of tools. Drug discovery was once a matter of finding substrates (chemicals) that produced some beneficial result without excessive side effects, a research method similar to early humans’ seeking out rocks and other natural implements that could be used for helpful purposes. Today, we are discovering the precise biochemical pathways that underlie both disease and aging processes. We are able to design drugs to carry out precise missions at the molecular level. With recently developed gene technologies, we’re on the verge of being able to control how genes express themselves.
Gene expression is the process by which cellular components (specifically RNA and the ribosomes) produce proteins according to a precise genetic blueprint. While every human cell contains a complete DNA sample, and thus the full complement of the body’s genes, a specific cell, such as a skin cell or a pancreatic islet cell, gets its characteristics from only the fraction of genetic information relevant to that particular cell type. Gene expression is controlled by peptides (molecules made up of sequences of up to 100 amino acids) and short RNA strands. Researchers are now beginning to learn how these processes work. Many new therapies currently in development and testing are based on manipulating peptides either to turn off the expression of disease-causing genes or to turn on desirable genes that may otherwise not be expressed in a particular type of cell. A new technique called RNA interference is able to destroy the messenger RNA expressing a gene and thereby effectively turn that gene off. Accelerating progress in biotechnology will enable us to reprogram our genes and metabolic processes to propel the fields of genomics (influencing genes), proteomics (understanding and influencing the role of proteins), gene therapy (suppressing gene expression as well as adding new genetic information), rational drug design (formulating drugs that target precise changes in disease and aging processes), as well as the therapeutic cloning of rejuvenated cells, tissues, and organs.
The nanotechnology revolution promises the tools to rebuild the physical world, our bodies, and our brains, molecular fragment by molecular fragment and potentially atom by atom. We are shrinking the key features (working parts) of our technologies, in accordance with the law of accelerating returns, at an exponential rate--by a factor of more than four per linear dimension per decade, or about 100 per decade in three-dimensional volume. At this rate, the key feature sizes for most electronic and many mechanical technologies will be in the nanotechnology range--generally considered to be less than 100 nanometers (a nanometer is one billionth of a meter)--by the 2020s. Electronics has already dipped below this threshold, although not yet in three-dimensional structures and not yet in structures that are capable of assembling other similar structures, an essential step before nanotechnology can reach its promised potential. Meanwhile, rapid progress has been made recently in preparing the conceptual framework and design ideas for the coming age of nanotechnology.
Nanotechnology has expanded to include any technology in which a machine’s key features measure less than 100 nanometers. Just as contemporary electronics has already quietly slipped into this nano realm, the area of biological and medical applications has already entered the era of nanoparticles, in which nanoscale objects are being developed to create more-effective tests and treatments. In the area of testing and diagnosis, nanoparticles are being employed in experimental biological tests as tags and labels to greatly enhance sensitivity in detecting substances such as proteins. Magnetic nanotags can be used to bind with antibodies that can then be read using magnetic probes while still inside the body. Successful experiments have been conducted with gold nanoparticles that are bound to DNA segments and can rapidly test for specific DNA sequences in a sample. Small nanoscale beads called quantum dots can be programmed with specific codes combining multiple colors, similar to a color bar code, that can facilitate tracking of substances through the body.
In the future, nanoscale devices will run hundreds of tests simultaneously on tiny samples of a given substance. These devices will allow extensive tests to be conducted on nearly invisible samples of blood. In the area of treatment, a particularly exciting application of this technology is the harnessing of nanoparticles to deliver medication to specific sites in the body. Nanoparticles can guide drugs into cell walls and through the blood-brain barrier. Nanoscale packages can be designed to hold drugs, protect them through the gastrointestinal tract, ferry them to specific locations, and then release them in sophisticated ways that can be influenced and controlled, wirelessly, from outside the body. Nanotherapeutics has developed a biodegradable polymer only several nanometers thick that uses this approach. Meanwhile, scientists have demonstrated a nanopill with structures in the 25 to 45 nanometer range. The nanopill is small enough to pass through the cell wall and deliver medications directly to targeted structures inside the cell. MicroCHIPS of Bedford, Massachusetts, has developed a computerized device that is implanted under the skin and delivers precise mixtures of medicines from hundreds of nanoscale wells inside the device. Future versions of the device are expected to be able to measure blood levels of substances such as glucose. The system could be used as an artificial pancreas, releasing precise amounts of insulin based on the blood glucose response. The system would also be capable of simulating any other hormone-producing organ. Another innovative proposal is to guide nanoparticles (probably composed of gold) to a tumor site and then heat them with infrared beams to destroy the cancer cells. The revolution in nanotechnology will allow us to do a great deal more than simply treat disease. Ultimately, nanotech will enable us to redesign and rebuild not only our bodies and brains, but also the world with which we interact. 
The full realization of nanotechnology, however, will lag behind the biotechnology revolution by about one decade. By the mid to late 2020s, though, the effects of the nanotech revolution will be widespread and obvious.
The most important and radical application of circa-2030 nanobots will be to expand our minds through the merger of biological and nonbiological, or “machine,” intelligence. In the next 25 years, we will learn how to augment our 100 trillion very slow interneuronal connections with high-speed virtual connections via nanorobotics. This will allow us to greatly boost our pattern-recognition abilities, memories, and overall thinking capacity, as well as to directly interface with powerful forms of computer intelligence. The technology will also provide wireless communication from one brain to another. In other words, the age of telepathic communication is almost upon us.
Our brains today are relatively fixed in design. Although we do add patterns of interneuronal connections and neurotransmitter concentrations as a normal part of the learning process, the current overall capacity of the human brain is highly constrained. As humanity’s artificial-intelligence (AI) capabilities begin to upstage our human intelligence at the end of the 2030s, we will be able to move beyond the basic architecture of the brain’s neural regions. Brain implants based on massively distributed intelligent nanobots will greatly expand our memories and otherwise vastly improve all of our sensory, pattern-recognition, and cognitive abilities. Since the nanobots will be communicating with one another, they will be able to create any set of new neural connections, break existing connections (by suppressing neural firing), create new hybrid biological and computer networks, and add completely mechanical networks, as well as interface intimately with new computer programs and artificial intelligences.
The implementation of artificial intelligence in our biological systems will mark an evolutionary leap forward for humanity, but it also implies we will indeed become more “machine” than “human.” Billions of nanobots will travel through the bloodstream in our bodies and brains. In our bodies, they will destroy pathogens, correct DNA errors, eliminate toxins, and perform many other tasks to enhance our physical well-being. As a result, we will be able to live indefinitely without aging. In our brains, nanobots will interact with our biological neurons. This will provide full-immersion virtual reality incorporating all of the senses, as well as neurological correlates of our emotions, from within the nervous system. More importantly, this intimate connection between our biological thinking and the machine intelligence we are creating will profoundly expand human intelligence. Warfare will move toward nanobot-based weapons, as well as cyberweapons. Learning will first move online, but once our brains are fully online we will be able to download new knowledge and skills. The role of work will be to create knowledge of all kinds, from music and art to math and science. The role of play will also be to create knowledge. In the future, there won’t be a clear distinction between work and play.
Of the three technological revolutions (genetic, nano-mechanical, and robotic), the most profound is robotic or, as it is commonly called, the strong artificial intelligence revolution. This refers to the creation of computer thinking ability that exceeds the thinking ability of humans. We are very close to the day when fully biological humans (as we now know them) cease to be the dominant intelligence on the planet. By the end of this century, computational or mechanical intelligence will be trillions of trillions of times more powerful than unaided human brain power. Computer, or nonbiological, intelligence should still be considered human, since it is fully derived from human–machine civilization and will be based, at least in part, on a human-made version of a fully functional human brain. The merger of these two worlds of intelligence is not merely a merger of biological and mechanical thinking mediums, but also (and more importantly) a merger of method and organizational thinking that will expand our minds in virtually every imaginable way. Biological human thinking is limited to 10^16 calculations per second (cps) per human brain (based on neuromorphic modeling of brain regions) and about 10^26 cps for all human brains.
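The two cps totals are consistent as orders of magnitude; a quick check (the brain count of 10^10 is an assumed round figure for the human population):

```python
# Order-of-magnitude check on the cps (calculations per second) figures.
per_brain = 10**16  # cps per human brain, per the neuromorphic estimate
brains = 10**10     # rough round figure for the number of human brains
print(per_brain * brains == 10**26)  # True: matches the combined total
```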
These figures will not appreciably change, even with bioengineering adjustments to our genome. The processing capacity of nonbiological intelligence or strong AI, in contrast, is growing at an exponential rate (with the rate itself increasing) and will vastly exceed biological intelligence by the mid-2040s. Artificial intelligence will necessarily exceed human intelligence for several reasons. First, machines can share knowledge and communicate with one another far more efficiently than can humans. As humans, we do not have the means to exchange the vast patterns of interneuronal connections and neurotransmitter-concentration levels that comprise our learning, knowledge, and skills, other than through slow, language-based communication.
Second, humanity’s intellectual skills have developed in ways that have been evolutionarily encouraged in natural environments. Those skills, which are primarily based on our abilities to recognize and extract meaning from patterns, enable us to be highly proficient in certain tasks, such as distinguishing faces, identifying objects, and recognizing language sounds. Unfortunately, our brains are less well-suited for dealing with more-complex patterns, such as those that exist in financial, scientific, or product data. The application of computer-based techniques will allow us to fully master pattern-recognition paradigms.
Finally, as human knowledge migrates to the Web, machines will demonstrate increased proficiency in reading, understanding, and synthesizing all human–machine information. A key question is whether the “chicken” (strong AI) or the “egg” (nanotechnology) will come first. In other words, will strong AI lead to full nanotechnology (molecular-manufacturing assemblers that can turn information into physical products), or will full nanotechnology lead to strong AI? The logic of the first premise is that strong AI would be in a position to solve any remaining design problems required to implement full nanotechnology.
The second premise is based on the assumption that hardware requirements for strong AI will be met by nanotechnology-based computation. Likewise, the software requirements for engineering strong AI would be facilitated by nanobots. These microscopic machines will allow us to create highly detailed scans of human brains along with diagrams of how the human brain is able to do all the wonderful things that have long mystified us, such as create meaning, contextualize information, and experience emotion. Once we fully understand how the brain functions, we will be able to recreate the phenomenon of human thinking in machines. We will endow computers, already superior to us in the performance of mechanical tasks, with lifelike intelligence.
Progress in both areas (nano and robotic) will necessarily use our most-advanced tools, so advances in each field will simultaneously facilitate the other. However, the most important nanotechnological breakthroughs will emerge prior to strong AI, but only by a few years (around 2025 for nanotechnology and 2029 for strong AI). As revolutionary as nanotechnology will be, strong AI will have far more profound consequences. Nanotechnology is powerful but not necessarily intelligent. We can devise ways of at least trying to manage the enormous powers of nanotechnology, but superintelligence by its nature cannot be controlled. The nano/robotic revolution will also force us to reconsider the very definition of human. Not only will we be surrounded by machines that will display distinctly human characteristics, but also we will be less human from a literal standpoint.
Despite the wonderful future potential of medicine, real human longevity will only be attained when we move away from our biological bodies entirely. As we move toward a software-based existence, we will gain the means of “backing ourselves up” (storing the key patterns underlying our knowledge, skills, and personality in a digital setting) thereby enabling a virtual immortality. Thanks to nanotechnology, we will have bodies that we can not only modify, but also change into new forms at will. We will be able to quickly change our bodies in full-immersion virtual-reality environments incorporating all of the senses during the 2020s and in real reality in the 2040s.
Humans are metazoans, that is, we are made up of many cells. In most of our cells there is a nucleus that contains the "blueprint" for the entire individual. This blueprint is stored in DNA (deoxyribonucleic acid); DNA and its complement of helper proteins and organelles make up the molecular computer that contains the memory necessary to construct an individual organism.
Proteins are molecular machines that can perform incredibly complicated functions. They are the engines of life; DNA is a template that guides the manufacture of these engines.
DNA in eucaryotic cells is arranged in two interwoven strands--the "double helix"--and packed tightly into a complex structure called chromatin, which is arranged into chromosomes in each cell nucleus. With a few exceptions, such as red blood cells and specialized immune cells, the DNA in each cell of the human body is complete and identical. Researchers estimate that the human genome--the complete collection of genetic instructions--consists of between sixty thousand and a hundred thousand genes. Genes are heritable traits; a gene has often been defined as a segment of DNA that contains the code for a protein or proteins. This code can be transcribed to make a strand of RNA (ribonucleic acid); ribosomes then use the RNA to translate the original DNA instructions and synthesize proteins. (Some genes perform other functions, such as making the RNA constituents of ribosomes.)
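The transcription and translation steps described above can be sketched as a toy program. The codon table below is a tiny, hypothetical subset of the real genetic code, included only to show the mechanics: DNA is rewritten as RNA, and a ribosome-like loop reads the RNA three bases at a time.

```python
# A toy sketch of transcription and translation; the codon table is a
# small illustrative subset of the real genetic code, not the full thing.

CODON_TABLE = {
    "AUG": "Met", "UUU": "Phe", "GGC": "Gly", "UAA": "STOP",
}

def transcribe(dna: str) -> str:
    """Transcribe the DNA coding strand into messenger RNA (T -> U)."""
    return dna.upper().replace("T", "U")

def translate(rna: str) -> list[str]:
    """Read the RNA one codon (three bases) at a time, as a ribosome
    does, appending amino acids until a stop codon is reached."""
    protein = []
    for i in range(0, len(rna) - 2, 3):
        amino = CODON_TABLE.get(rna[i:i + 3], "?")
        if amino == "STOP":
            break
        protein.append(amino)
    return protein

rna = transcribe("ATGTTTGGCTAA")  # a 12-base toy "gene"
print(rna)                        # AUGUUUGGCUAA
print(translate(rna))             # ['Met', 'Phe', 'Gly']
```

The point of the sketch is only the division of labor the text describes: the DNA holds the template, the RNA carries a working copy, and the translation step manufactures the protein.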
Many scientists believe that RNA was the original coding molecule of life, and that DNA is a later elaboration.
While most cells in the body of an individual carry identical DNA, as the person grows and develops, that DNA is expressed in different ways within each cell. This is how identical embryonic cells become different tissues.
When DNA is transcribed to RNA, many lengths of nucleotides that do not code for proteins, called introns, are snipped out of the RNA segments. The segments that remain are spliced together; they code for proteins and are called exons. On a length of freshly transcribed RNA, these exons can be spliced together in different ways to make different proteins. Thus, a single gene can produce a number of products at different times.
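The combinatorics of alternative splicing can be illustrated with a short sketch. The exon sequences here are made up; the code simply shows how a handful of exons, joined in different order-preserving selections, yields many distinct products from one gene.

```python
# Toy illustration of alternative splicing: once introns are removed,
# the remaining exons can be joined in different order-preserving
# combinations, so a single gene can yield several distinct products.
from itertools import combinations

def splice_variants(exons: list[str]) -> set[str]:
    """Return every non-empty, order-preserving selection of exons."""
    variants = set()
    for r in range(1, len(exons) + 1):
        for combo in combinations(exons, r):
            variants.add("".join(combo))
    return variants

# Hypothetical exon sequences, just to show the combinatorics:
# three exons already give seven possible spliced products.
print(sorted(splice_variants(["AAA", "CCG", "UUU"])))
```

Real cells regulate which combinations are produced and when, which is how the same genome supports many different tissues; the sketch shows only the raw combinatorial possibility.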
Bacteria are tiny, single-celled organisms. Their DNA is not stored in a nucleus but is spread around within the cell. Their genome contains no introns, only exons, making them very sleek and compact little critters. Bacteria can behave like social organisms; different varieties both cooperate and compete with each other to find and use resources in their environment. In the wild, bacteria frequently come together to produce biofilm "cities"; you may be familiar with these cities from the slime on spoiled vegetables in your refrigerator. Biofilms can also exist in your intestines, your urinary tract, and on your teeth, where they sometimes cause problems, while specialized ecologies of bacteria protect your skin, your mouth, and other areas of your body. Bacteria are extremely important; though some cause disease, many others are necessary to our existence. Some biologists believe that bacteria lie at the root of our life-forms, and that eucaryotic cells--our own cells, for example--derive from ancient colonies of bacteria. In this sense, we may simply be spaceships for bacteria.
Bacteria swap small circular loops of DNA called plasmids. Plasmids supplement the bacterial genome. Plasmids make up a universal library that bacteria of many different types can use to live more efficiently.
Bacteria and nearly all other organisms can be attacked by viruses. Viruses are very small, generally encapsulated bits of DNA or RNA that cannot reproduce by themselves. Instead, they hijack a cell's reproductive machinery to make new viruses. In bacteria, the viruses are called bacteriophages ("eaters of bacteria") or just phages. Many phages carry genetic material between bacterial hosts, as do some viruses in animals and plants.
It is possible that viruses originally came from segments of DNA within cells that can move around, both inside and between chromosomes. Viruses are essentially roving segments of genetic material that have learned how to "put on space suits" and leave the cell.