Monday, December 24, 2007
The Gilboa tree dates from the middle of the Devonian period (416 to 359 million years ago), a time of explosive evolutionary action among land plants. During this period trees evolved from small, primitive forms that would have barely brushed your ankle into genuine trees up to 30 metres tall. And with the evolution of trees, they and all the other plants - hitherto confined to marshy environments - went on to conquer the surface of the planet.
The First Forests
These first forests changed the face of the Earth. Early land plants had already started leaking oxygen into the atmosphere, creating soils and providing food and shelter for animals, and the evolution of trees upped the pace of change. They weathered rocks, made soils deeper and richer, created complex habitats and changed the climate beyond recognition. By the end of the Devonian, an ecologically modern world had appeared. Discoveries such as the Gilboa tree are bringing stunning new insights into how the foresting of the world came about.
Plants in the Ordovician Period
Plants first colonised land in the Ordovician period, around 465 million years ago (see Chart). By the early Devonian they had developed many of the features of modern plants, including a protective waxy cuticle, vascular plumbing for transporting water and nutrients, and pores called stomata to draw in carbon dioxide. But there were many differences from modern plants too. Seeds had yet to evolve - early land plants instead reproduced by way of spores. Wood, large leaves and deep roots were unknown. Few plants were taller than a few centimetres.
The Rhynie ecosystem
The best place to catch a glimpse of this primitive terrestrial world is in the hills around the village of Rhynie in Aberdeenshire, UK. Here the finely crystallised quartz of the Rhynie chert preserves in extraordinary detail an entire ecosystem that was engulfed and petrified by silica-rich waters from a volcanic spring 410 million years ago. The fossil plants still stand upright, and even their cells remain visible. Tiny creatures such as insects, centipedes, mites, harvestmen and spider-like trigonotarbids are preserved in great detail. Some still cling to the stems on which they lived and died.
Rhynie would not have been recognisable as a modern environment: every plant was on a small scale, knee height and lower. By the late Devonian, however, plants had reached a scale that humans are used to. You could walk around a woodland and identify a tree canopy layer, a shrub layer and a herbaceous layer; the whole environment looked far more modern.
The Rhynie landscape was not entirely devoid of large living things, however. Dotted here and there were featureless columns standing up to 6 metres high and a metre wide at the base. When first described in 1857, from fossils found in Quebec, they were identified as conifer trunks and named Prototaxites. However, studies into the structure of their "wood" soon revealed that they were not trees. Various alternatives were proposed, including giant algae, lichen and fungi, but the identity of Prototaxites remained uncertain.
Last year a research team at the University of Chicago decided to settle matters. They measured the ratio of different carbon isotopes in Prototaxites fossils to reveal whether the organisms were making a living from photosynthesis or by eating rotting matter, as fungi do today. The results clearly showed that Prototaxites was a fungus.
And there was more. The isotopes also revealed that some Prototaxites were probably living off microbial crusts composed of bacteria, algae and lichens. These crusts still exist today but are confined to places where vascular plants can't grow, such as deserts. What Prototaxites shows is that there were large patches of the early Devonian landscape with no vascular plants. This means that plants didn't conquer the land as quickly and completely as scientists once assumed.
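The isotopic argument can be sketched in code. This is a toy illustration of the reasoning only - the delta-13C numbers and the 2-per-mil threshold below are invented, not the study's data: plants photosynthesising side by side share a similar carbon-isotope signature, while a decomposer inherits the scattered signatures of whatever it eats, so a wide isotopic spread among specimens from one site argues against photosynthesis.

```python
# Toy sketch of the isotope logic (all values are hypothetical).
def isotope_spread(delta13c_values):
    """Range of delta-13C values measured across specimens at one site."""
    return max(delta13c_values) - min(delta13c_values)

def likely_heterotroph(values, plant_spread=2.0):
    """Flag organisms whose within-site spread exceeds a typical
    photosynthesiser's (threshold chosen arbitrarily for illustration)."""
    return isotope_spread(values) > plant_spread

co_occurring_plants = [-26.5, -27.1, -26.0]     # hypothetical tight cluster
prototaxites_like = [-22.0, -27.5, -31.0]       # hypothetical wide spread
print(likely_heterotroph(co_occurring_plants))  # False
print(likely_heterotroph(prototaxites_like))    # True
```

The design choice mirrors the paragraph above: the classifier never looks at absolute values, only at how scattered they are within a single locality.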
The Rise of the Forests
With large patches of land still free of vascular plants, an upgrade of plant plumbing and support systems was clearly needed for the terrestrial conquest to continue. This was achieved through "secondary growth", the evolution of tougher and stronger tissues to carry water and nutrients up longer stems. Once these were in place, plants were able to grow much bigger. It was these advances that allowed the trees of the Gilboa forest to reach their full height of 8 metres or more. The foresting of the Earth had begun.
The appearance of trees changed the rules of the game, triggering an evolutionary scramble for height, for light to power photosynthesis, and for prime positions from which to disperse spores. Once it became possible to be a tree, the race was on for size and dominance. In modern ecosystems the dominant trees hog the light; the goal is to be at the top and collect the most of it.
The Gilboa Forest
So what was the Gilboa forest like? Its trees, known as cladoxylopsids, looked a bit like tree ferns, although they are not related. Their trunks were long and slender, with a bulbous base and shallow roots. They did not have leaves, instead sporting a goblet-shaped crown of branches and thread-like branchlets. These were probably green to carry out photosynthesis. In other words it must have looked like a strange, giant bottle brush.
Gilboa, 385 million years ago, was a warm, wet flood plain 10 degrees south of the equator. Since the cladoxylopsids lacked leaves, the forest would have been airier and brighter than any modern forest. There was a diverse understorey that included club mosses and ferns, but the only animal inhabitants were arthropods, including insects, centipedes and mites. It would also have been wet underfoot: as with today's ferns and mosses, cladoxylopsid spores could only be fertilised on wet surfaces, so the trees were confined to flood plains.
World Wide Representation
Cladoxylopsids achieved worldwide success: fossils of their crowns (which until the Gilboa tree were thought to be complete plants) have been found in Europe, China and the Americas. However, in many respects they were primitive, and lurking in their shadows was a group of plants that would soon put them in the shade.
Other Plant Developments
These were the archaeopterids, relatives of the conifers. At the time of the Gilboa forest, archaeopterids were no bigger than shrubs, but their lineage soon made some key evolutionary advances, including wood, deep roots and large leaves. By 370 million years ago, a fully fledged tree, Archaeopteris, had emerged from the archaeopterid ranks.
With the trunk of a conifer and fern-like leaves, Archaeopteris reached 30 metres - as tall as a mature oak - and dominated late Devonian forests all over the world. Its wooden trunk permitted it to grow much taller than the cladoxylopsids. "There is no stronger material around, per unit of mass, than wood," says a leading researcher into the chemical make-up of fossil plants. Wood also led to the formation of the first complex soils. Soil humus is, by and large, derived from lignin, the polymer that makes wood tough.
The Decline of CO2 Levels and the Rise of Deep Roots
Archaeopteris was also the first plant to evolve deep roots. Roots eat away at rocks, burrowing into and dissolving them with acids in pursuit of nutrients. Over an immense period of time the weathered material gets washed into the oceans, where it combines with dissolved CO2 to form sediments that are eventually subducted into the Earth's interior by tectonic activity. This process removed huge amounts of CO2 from the oceans and atmosphere, with profound consequences for the climate. Between the beginning and end of the Devonian, levels of the gas plummeted by up to 95 per cent. Greenhouse conditions vanished, to be replaced by an ice age that at its peak 300 million years ago saw glaciers approaching the tropics.
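The drawdown described above is conventionally summarised by the silicate-weathering (Urey) reaction. In its simplest textbook form, calcium silicate rock consumes atmospheric CO2 and locks it away as carbonate sediment:

```
CaSiO3 + CO2  ->  CaCO3 + SiO2
(silicate rock + carbon dioxide -> buried carbonate + silica)
```

Deep roots accelerate the left-hand side of this reaction by exposing fresh rock surfaces to acids, which is why the spread of Archaeopteris could pull CO2 out of the atmosphere on a global scale.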
Deep Roots and Large Leaves
But oddly it was the climatic upheaval brought about by roots that appears to have driven the next great innovation - large leaves. These first appeared 390 million years ago, but only became widespread with Archaeopteris 15 million years later. The stripping of CO2 from the Devonian atmosphere helped to remove an obstacle that had been inhibiting the evolution of large leaves.
Although large, flat leaves are very efficient at capturing sunlight for photosynthesis, they are difficult to keep cool. To prevent overheating, leaves need to release water vapour through their pore-like stomata - the plant equivalent of sweating. The problem is that the number of stomata is regulated by a genetic switch that responds to CO2 levels in the atmosphere: the more CO2, the lower the stomatal density. If that genetic switch was already in place in the Devonian, high CO2 levels would have prevented plants from evolving large leaves; had large leaves appeared then, they would have cooked.
Only with falling levels of CO2, and improvements in roots and vascular systems to supply cooling water, could plants evolve the high stomatal densities that make large leaves viable. Studies on fossil leaves have so far supported this idea: earlier leaves were smaller and had far fewer pores than later ones, and only after CO2 levels fell did large leaves become abundant.
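The constraint can be captured in a toy model. All numbers and function names here are arbitrary illustration values, not measurements from the fossil-leaf studies: stomatal density falls as CO2 rises, and a large flat leaf is only viable once its stomatal density is high enough to shed heat by transpiration.

```python
# Toy model of the CO2/stomata constraint (all constants are invented).
def stomatal_density(co2_ppm, base_density=300.0, base_co2=300.0):
    """Hypothetical inverse response of stomatal density to CO2 level."""
    return base_density * base_co2 / co2_ppm

def large_leaf_viable(co2_ppm, cooling_needed=150.0):
    """A big flat leaf 'cooks' unless stomatal density clears a
    (made-up) transpiration-cooling threshold."""
    return stomatal_density(co2_ppm) >= cooling_needed

print(large_leaf_viable(3000))  # False: high Devonian-style CO2, few stomata
print(large_leaf_viable(400))   # True: low CO2, dense stomata, leaf stays cool
```

The point of the sketch is only the direction of the effect: lowering CO2 raises stomatal density, which flips the viability test from False to True.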
The Rise of Seeds
Archaeopteris had one more innovation to offer. It evolved a method of reproduction that partially freed trees from the flood plains that had confined the cladoxylopsids. Male and female cladoxylopsid spores were the same size, and fertilisation could only occur on wet ground where nutrients were readily available to nourish the embryo. In contrast, female Archaeopteris spores were larger than male ones and stored a food supply for the embryo. From this beginning, seeds evolved.
Seed plants - the grasses, flowers, shrubs and trees that are everywhere today - are thought to have descended from Archaeopteris and its relatives. With the evolution of seeds, plants could spread to all sorts of places that had previously been out of bounds. Seeds freed plants from their reproductive dependence on water, allowing trees to occupy and colonise drier environments.
Where Do Trees Colonise?
One of the most difficult places for trees to colonise was the uplands. In 2003, scientists discovered the oldest-known fossils of mountain plants at Blanche Brook, a remote river in Newfoundland, where hundreds of giant trees lay in the river bed. The trees turned out to be 305 million years old, from the late Carboniferous. Known as cordaitaleans and standing up to 50 metres tall, they were seed plants related to conifers and looked a bit like monkey puzzle trees.
What about the Carboniferous Sporing vs Seeds?
Elsewhere in the Carboniferous, however, sporing plants still held sway. In the swampy lowland rainforests, Archaeopteris and the cladoxylopsids were overshadowed by giant club moss and horsetail trees that towered above the seed plants jostling for space below. In modern forests the opposite is true, with seed-bearing trees dominating the sporing ferns, mosses and horsetails of the understorey. It was as if the world had been turned upside down: spores prevailed over seeds for the last time.
Organic Carbon and Coal
These late Carboniferous lowlands are noted for the sheer abundance of their plant fossils. Much of Europe, Asia and North America lay near the equator and was blanketed by huge tracts of rainforest. Tectonic forces meant that the basins where the rainforests grew were slowly subsiding. This process, coupled with regular flooding, led to colossal amounts of organic carbon being buried. Much of the world's coal reserves formed in this 20-million-year period.
Then around 300 million years ago, a catastrophic earthquake caused one coal forest in what is now Illinois to slump below sea level, where it was rapidly buried. Low-oxygen conditions preserved a 1000-hectare expanse of this forest floor in near-pristine condition. The forest floor now forms the ceiling of the Riola and Vermilion Grove coal mines in Illinois.
The Ancient Rainforest
Scientists conducted the largest ever study of this ancient forest. The coal that had been mined out used to be the soil of the ancient rainforest, so as you walked around looking up you could see roots hanging down just above your head, and you could see giant fallen trees complete with their roots, trunk and crown.
Spectacular 40-metre club mosses and shorter tree ferns monopolised the Illinois forest. There is really nothing today that looks anything like the giant club mosses. They probably grew much closer together than trees in a modern rainforest do, because their small canopies cast little shadow. The forest would also have been greener than its modern counterpart, as the abundant club mosses had green scale-like leaf cushions all over their trunks and branches.
Swampworld and Tectonic Forces
Swampworld was not to last. As tectonic forces dragged the world's landmasses together to form the supercontinent Pangaea, the climate changed. Drier, harsher conditions set in, and by the end of the Carboniferous the coal swamps had disappeared from most of the world. The mighty sporing trees could no longer find the water they needed to reproduce, and seed plants gained the upper hand. The reign of modern plants had truly begun.
Saturday, December 22, 2007
Finding: The accelerated uplift of mountains and highlands stretching from Ethiopia to South Africa blocked much ocean moisture, converting lush tropical forests into an arid patchwork of woodlands and savannah grasslands that gradually favored human ancestors who came down from the trees and started walking on two feet -- an energy-efficient way to search larger areas for food in an arid environment.
Scientists long have focused on how climate and vegetation allowed human ancestors to evolve in Africa. Now, University of Utah geologists are calling renewed attention to the idea that ground movements formed mountains and valleys, creating environments that favored the emergence of humanity.
Tectonics, the movement of Earth's crust, may ultimately have been responsible for the evolution of humankind. This movement encompasses the ever-shifting of tectonic plates and the creation of mountains, valleys and ocean basins, including the 3,700-mile-long stretch of highlands and mountains known as "the Wall of Africa". That wall parallels the East African Rift valley, where many fossils of human ancestors have been found.
The influence of tectonics on human evolution has been discussed since at least 1983. But much of the previous discussion of how climate affected human evolution involves global climate changes, such as those caused by cyclic changes in Earth's orbit around the sun, rather than the local and regional climate changes caused by East Africa's rising landscape.
However, over the last 7 million years crustal movement, or tectonism, drastically changed the East African landscape. That landscape controlled climate on a local to regional scale, and the resulting climate change spurred human ancestors to evolve away from the ape line.
Hominins (the new scientific word for humans (Homo) and their ancestors, including Ardipithecus, Paranthropus and Australopithecus) split from apes on the evolutionary tree roughly 7 million to 4 million years ago. The earliest undisputed hominin was Ardipithecus ramidus 4.4 million years ago. The earliest Homo arose 2.5 million years ago, and our species, Homo sapiens, almost 200,000 years ago.
A Force from within the Earth
The geological or tectonic forces shaping Africa begin deep in the Earth, where a "superplume" of hot and molten rock has swelled upward for at least the past 45 million years. This superplume and its branching smaller plumes help push apart the African and Arabian tectonic plates of Earth's crust, forming the Red Sea, Gulf of Aden and the Great Rift Valley that stretches from Syria to southern Africa.
As part of this process, Africa is being split apart along the East African Rift, a valley bounded by elevated "shoulders" a few tens of miles wide and sitting atop "domes" a few hundreds of miles wide and caused by upward bulging of the plume.
The East African Rift runs about 3,700 miles from the Ethiopian Plateau south-southwest to South Africa's Karoo Plateau. It is up to 370 miles wide and includes mountains reaching a maximum elevation of about 19,340 feet at Mount Kilimanjaro.
The rift "is characterized by volcanic peaks, plateaus, valleys and large basins and freshwater lakes," including sites where many fossils of early humans and their ancestors have been found, says Nahid Gani (pronounced nah-heed go-knee), a research scientist. There was some uplift in East Africa as early as 40 million years ago, but "most of these topographic features developed between 7 million and 2 million years ago."
A Wall Rises and New Species Evolve
The Wall of Africa started to form around 30 million years ago, but recent studies show that most of the uplift occurred between 7 million and 2 million years ago - just about when hominins split off from African apes, developed bipedalism and evolved bigger brains.
Nature built this wall, and then humans could evolve, walk tall and think big.
Is there any characteristic feature of the Wall that drove human evolution?
The answer is the variable landscape and vegetation resulting from uplift of the Wall of Africa, which created a topographic barrier to moisture coming mostly from the Indian Ocean and so dried the climate. Contrary to those who cite global climate cycles, the climate changes in East Africa were local and resulted from the uplift of different parts of the wall at different times.
The change from forests to a patchwork of woodland and open savannah did not happen everywhere in East Africa at the same time, and the changes also happened in East Africa later than elsewhere in the world.
The Rise of the Wall
Studies of the roughly 300-mile-by-300-mile Ethiopian Plateau, the most prominent part of the Wall of Africa, indicated that the plateau reached its present average elevation of 8,200 feet 25 million years ago. A new analysis measured the rates at which the Blue Nile River cut down into the Ethiopian Plateau, creating a canyon that rivals North America's Grand Canyon.
The conclusion: There were periods of low-to-moderate incision and uplift between 29 million and 10 million years ago, and again between 10 million and 6 million years ago, but the most rapid uplift of the Ethiopian Plateau (by some 3,200 vertical feet) happened 6 million to 3 million years ago.
Other research has shown the Kenyan part of the wall rose mostly between 7 million and 2 million years ago, mountains in Tanganyika and Malawi were uplifted mainly between 5 million and 2 million years ago, and the wall's southernmost end gained most of its elevation during the past 5 million years.
The Time Frame of the Wall development and Human evolution
Clearly, the Wall of Africa grew to be a prominent elevated feature over the last 7 million years, playing a major role in East African aridification by wringing moisture out of monsoonal air moving across the region. That period coincides with the evolution of human ancestors in the area.
The earliest undisputed evidence of true bipedalism (as opposed to the knuckle-walking of apes) is 4.1 million years ago in Australopithecus anamensis, but some believe the trait existed as early as 6 million to 7 million years ago.
The shaping of varied landscapes by tectonic forces - lake basins, valleys, mountains, grasslands, woodlands - could also be responsible, at a later stage, for hominins developing a bigger brain as a way to cope with these extremely variable and changing landscapes in which they had to find food and survive predators.
For now the lack of more precise timeframes makes it difficult to link specific tectonic events to the development of upright walking, bigger brains and other key steps in human evolution.
Thursday, December 20, 2007
Single-Cell Organisms and the Problem of Complexity
Single-cell organisms were already in existence 500 million years ago, with several thousand genes providing different cellular functions. Further developments seemed dependent on producing even more genes.
It would appear that for a highly developed organism like a human, this form of evolution would have resulted in several million genes. But researchers were surprised to learn, following publication of the human genome, that a human only has around 25,000 genes – not many more than a fruit fly or a worm with approximately 15,000 to 20,000 genes.
It would appear that, over the last 500 million years, other ways to produce highly complex organisms have evolved. Evolution has simply found more efficient ways to use the genes already there. But what could have made this possible?
Is there an answer? Yes - it involves RNA
New results represent a piece of the puzzle and shed new light on to the purpose of an unusual structure in RNA polymerase II.
They build on earlier observations that gene expression is regulated not just by binding of the enzyme to the gene locus to which it is recruited, but also during the phase of active transcription from DNA into RNA. During this phase, parts of the newly synthesised RNA may be removed and the remaining sequences combined into a new RNA message. This ‘splicing’ of RNA occurs during gene transcription and, in extreme cases, can produce RNAs coding for several thousand different proteins from a single gene.
How it Works - The Development of CTD
But what development permitted this advance in gene usage? RNA polymerase II evolved a structure composed of repeats of a 7-amino-acid sequence. In humans this structure – termed the “carboxyterminal domain”, or CTD – is composed of 52 such repeats. It sits exactly at the position where RNA emerges from RNA polymerase II. In less complex organisms the CTD is much shorter: a worm has 36 repeats, and yeast as few as 26, while many single-cell organisms and bacteria never developed an obvious CTD structure at all.
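The repeat structure is easy to picture in code. The consensus CTD heptad of RNA polymerase II is Tyr-Ser-Pro-Thr-Ser-Pro-Ser (YSPTSPS); the sketch below naively counts exact copies of that consensus in a protein sequence, ignoring the degenerate (imperfect) repeats that real CTDs also contain.

```python
# Counting consensus heptad repeats in a CTD-like protein sequence.
# The consensus repeat is YSPTSPS; real CTDs include degenerate repeats
# that this exact-match scan would miss.
CONSENSUS = "YSPTSPS"

def count_heptad_repeats(sequence: str) -> int:
    """Count non-overlapping exact copies of the consensus heptad."""
    count = 0
    i = 0
    while i + len(CONSENSUS) <= len(sequence):
        if sequence[i:i + len(CONSENSUS)] == CONSENSUS:
            count += 1
            i += len(CONSENSUS)
        else:
            i += 1
    return count

# A hypothetical CTD fragment made of three perfect repeats:
fragment = "YSPTSPS" * 3
print(count_heptad_repeats(fragment))  # 3
```

A human CTD would score up toward 52 on such a scan, a yeast CTD in the twenties, consistent with the numbers above.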
Although the requirement of the CTD for the expression of cellular genes in higher organisms is undisputed, the molecular details of the gene-specific maturation of RNAs are still largely enigmatic. Research groups have now shown a differential requirement for phosphorylation of the amino acid serine at position 7 of the CTD in the processing and maturation of specific gene products.
These results provide the groundwork for the discovery of further pieces of the CTD puzzle and thus enlarge our knowledge of gene regulation. Given its fundamental importance, understanding the mechanism of gene regulation is essential if we are to understand cancer and other diseases at the molecular level and develop new therapies.
Tuesday, December 18, 2007
The idea that gene losses might contribute to adaptation has been kicked around, but not well studied.
To find gene losses, researchers used a software program called TransMap. The program compared the mouse and human genomes, searching for genes with changes significant enough to render them nonfunctional at some point during the 75 million years since the divergence of mouse and human.
Genes can be lost in many ways. This study focused on losses caused by mutations that disrupt the open reading frame (ORF-disrupting mutations). These are either point mutations, where events such as the insertion or substitution of a DNA base alter the instructions delivered by the DNA, or changes that occur when a large portion of a gene is deleted altogether or moves to a new place on the genome.
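As an illustration of what an ORF-disrupting point mutation looks like (this is a toy check, not the TransMap pipeline, and the sequences are made up), the sketch below reads a coding sequence codon by codon and flags any in-frame stop codon that appears before the end:

```python
# Toy check for an ORF-disrupting nonsense mutation: a stop codon
# appearing in-frame before the final codon truncates the protein.
STOP_CODONS = {"TAA", "TAG", "TGA"}

def has_premature_stop(cds: str) -> bool:
    """Return True if any in-frame codon before the last is a stop codon."""
    codons = [cds[i:i + 3] for i in range(0, len(cds) - 2, 3)]
    return any(codon in STOP_CODONS for codon in codons[:-1])

intact = "ATGGCTGCTAAAGGTTAA"      # ends with a normal stop codon (TAA)
broken = "ATGGCTTAAAAAGGTTAA"      # an in-frame TAA appears early
print(has_premature_stop(intact))  # False
print(has_premature_stop(broken))  # True
```

A real pipeline must also handle frameshifting insertions and deletions and whole-exon losses, which this sketch deliberately omits.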
Using the Dog Genome
The dog genome was used as an out-group to filter out false positives because the dog diverged from our ancient common ancestor earlier than the mouse. So if a gene is still living in both dog and mouse but not in human, it was probably living in the common ancestor and then lost in the human lineage.
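The outgroup logic can be written down directly. This is a minimal sketch of the inference rule, with an invented intact/dead table rather than real annotation calls: because dog diverged before the mouse-human split, a gene intact in dog and mouse but dead in human is most parsimoniously a loss on the human lineage.

```python
# Outgroup parsimony sketch: dog is the outgroup to mouse and human.
def human_lineage_loss(intact: dict) -> bool:
    """Infer a human-lineage gene loss from intact/dead calls per species.

    Intact in dog and mouse implies the gene was present in the common
    ancestor, so its absence in human is a loss on the human branch.
    """
    return intact["dog"] and intact["mouse"] and not intact["human"]

# Hypothetical per-species calls for two genes:
print(human_lineage_loss({"dog": True, "mouse": True, "human": False}))  # True
print(human_lineage_loss({"dog": False, "mouse": True, "human": False}))  # False
```

The second case is filtered out as a possible false positive: without the dog call, the gene might never have existed in the common ancestor at all.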
Using this process, they identified 26 losses of long-established genes, including 16 that were not previously known.
The gene loss candidates found in the study do not represent a complete list of gene losses of long-established genes in the human lineage, because the analysis was designed to produce more false negatives than false positives.
The study compares multiple genomes
Next they compared the identified genes in the complete genomes of the human, chimpanzee, rhesus monkey, mouse, rat, dog, and opossum to estimate the amount of time the gene was functional before it was lost. This refined the timing of the gene loss and also served as a benchmark for whether the gene in question was long-established, and therefore probably functional, or merely a loss of a redundant gene copy. Through this process, they found 6 genes that were lost only in the human.
The ACYL3 Protein - A Loss from Many to None
One previously unknown loss, the gene for acyltransferase-3 (ACYL3), was particularly important. This is an ancient protein that exists throughout the whole tree of life. Multiple copies of the ACYL3 gene are encoded in the fly and worm genomes. In the mammalian clade only one copy is left, and somewhere along primate evolution that last copy was lost.
Next it was found that this gene contains a nonsense mutation in both human and chimp, yet it still appears functional in rhesus. Further, the mutation is not present in the orangutan, so the gene is probably still functional in that species too. On the evolutionary tree leading to humans, gorilla sits on the branch between chimp and orangutan. Knowing whether the gene was still active in gorilla would narrow down the timing of the loss.
The gorilla DNA sequence showed the gene intact, without the mutation, so the loss likely occurred after the gorilla lineage split off but before humans and chimps diverged.
Other Functional Losses
Acyltransferase-3 was not the only lost gene that doesn't have any close functional homologues in the human genome. A highlight of the research was that they were able to find a list of these orphan losses. Some of them have been functional for more than 300 million years, and they were the last copies left in the human genome. While the copies of these genes remaining in the human genome appear to be nonfunctional, functional copies of all of them exist in the mouse genome.
These orphan genes may be interesting candidates for experimental biologists to explore. It will be interesting to find out what was the biological effect of these losses. Once their function is well characterized in species that still have active copies, we could maybe speculate about their effects on human evolution.
Wednesday, December 12, 2007
The fossils, dated to 1.8 million years old, show some modern aspects of lower limb morphology, such as long legs and an arched foot, but retain some primitive aspects of morphology in the shoulder and foot. The species had a small stature and brain size more similar to earlier species found in Africa.
The earliest known hominins to have lived outside Africa, in temperate zones of Eurasia, did not yet display the full set of derived skeletal features, the researchers conclude.
What this means
The new evidence shows how this species had the anatomical and behavioral capacity to be successful across a range of environments and expand out of Africa.
This research shows that the limb proportions and behavioral flexibility which allowed this species to expand out of Africa were there at least 1.8 million years ago.
Dmanisi is the site of a medieval village located about 53 miles southwest of Tbilisi, Georgia on a promontory at the confluence of the Mashavera and Phinezauri rivers.
Monday, December 10, 2007
Finding: The answers provide the first evolutionary history of the duplications in the human genome that are partly responsible for both disease and recent genetic innovations.
This work marks a significant step toward a better understanding of what genomic changes paved the way for modern humans, when these duplications occurred and what the associated costs are -- in terms of susceptibility to disease-causing genetic mutations.
Researchers have answered a similarly vexing genomic question: Which of the thousands of long stretches of repeated DNA in the human genome came first? And which are the duplicates?
Genomes have an ability to copy a long stretch of DNA from one chromosome and insert it into another region of the genome. Segmental duplications hold many evolutionary secrets and uncovering them is a difficult biological and computational challenge with implications for both medicine and our understanding of evolution.
In the past, the highly complex patterns of DNA duplication -- including duplications within duplications -- have prevented the construction of an evolutionary history of these long DNA duplications. To crack the duplication code and determine which of the DNA segments are originals (ancestral duplications) and which are copies (derivative duplications), the researchers looked to both algorithmic biology and comparative genomics.
Identifying the original duplications is a prerequisite to understanding what makes the human genome unstable. Researchers modified an algorithmic genome-assembly technique in order to deconstruct the sequence of repeated stretches of DNA and identify the original sequences. The belief is that there may be something special about the originals - some clue or insight into what causes this colonization of the human genome.
This is the first time that we have a global view of the evolutionary origin of some of the most complicated regions of the human genome. The researchers tracked down the ancestral origin of more than two thirds of these long DNA duplications.
First, the researchers suggest that specific regions of the human genome experienced elevated rates of duplication activity at different times in our recent genomic history. This contrasts with most models of genomic duplication, which assume that recent duplications accumulated continuously. Second, a large fraction of the recent duplication architecture centers around a rather small subset of "core duplicons" - short segments of DNA that come together to form segmental duplications. These cores are focal points of human gene/transcript innovations.
Not all of the duplications in the human genome are created equal. Some of them - the core duplicons - appear to be responsible for recent genetic innovations in the human genome. Researchers uncovered 14 such core duplicons.
In 4 of the 14 cases, there is compelling evidence that genes embedded within the cores are associated with novel human gene innovations. In two cases the core duplicon has been part of novel fusion genes whose functions appear to be radically different from their antecedents.
Results suggest that the high rate of disease caused by these duplications in the normal population may be offset by the emergence of newly minted human/great-ape specific genes embedded within the duplications. The next challenge will be determining the function of these novel genes.
Mathematical Algorithms and Biological Construction
Researchers applied their expertise in assembling genomes from millions of small fragments - a problem that is not unlike the "mosaic decomposition" problem the team faced in analyzing duplications.
Over the years, researchers have applied the 250-year-old algorithmic idea first proposed by the 18th-century mathematician Leonhard Euler - whose solution to the Königsberg bridge problem gave rise to Eulerian paths - to a variety of problems, and demonstrated that it works equally well for a set of seemingly unrelated biological problems, including DNA fragment assembly, reconstructing snake venoms, and now dissecting the mosaic structure of segmental duplications.
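The Eulerian idea carries over to DNA fragment assembly via de Bruijn graphs: each k-mer from the reads becomes an edge between its prefix and suffix, and the assembled sequence is an Eulerian path that uses every edge exactly once. The sketch below (Hierholzer's algorithm, assuming error-free reads and a unique path) illustrates the principle only; it is not the researchers' actual software, which must also cope with sequencing errors, repeats and uneven coverage.

```python
# Toy Eulerian-path assembly over a de Bruijn graph of k-mers.
from collections import defaultdict

def assemble(kmers):
    graph = defaultdict(list)   # (k-1)-mer prefix -> list of suffix nodes
    indeg = defaultdict(int)
    outdeg = defaultdict(int)
    for kmer in kmers:
        pre, suf = kmer[:-1], kmer[1:]
        graph[pre].append(suf)
        outdeg[pre] += 1
        indeg[suf] += 1
    # Path start: the node whose out-degree exceeds its in-degree
    # (fall back to any node if the path is a cycle).
    start = next((n for n in graph if outdeg[n] - indeg[n] == 1),
                 next(iter(graph)))
    # Hierholzer's algorithm: walk edges until stuck, backtrack to record.
    stack, path = [start], []
    while stack:
        node = stack[-1]
        if graph[node]:
            stack.append(graph[node].pop())
        else:
            path.append(stack.pop())
    path.reverse()
    # Spell the sequence: first node, then the last base of each next node.
    return path[0] + "".join(n[-1] for n in path[1:])

# 3-mers of the hypothetical sequence "ATGGCGT":
print(assemble(["ATG", "TGG", "GGC", "GCG", "CGT"]))  # ATGGCGT
```

Treating k-mers as edges rather than vertices is exactly what turns fragment assembly from a hard Hamiltonian-path problem into an efficiently solvable Eulerian-path problem.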