Monday, December 24, 2007

How trees changed the world

Finding: A nearly completely preserved tree dating to 385 million years ago was found in the Gilboa forest deposits. It came from the first forests on Earth.

The Gilboa tree dates from the middle of the Devonian period (416 to 359 million years ago), a time of explosive evolutionary action among land plants. During this period trees evolved from small, primitive forms that would have barely brushed your ankle into genuine trees up to 30 metres tall. And with the evolution of trees, they and all the other plants - hitherto confined to marshy environments - went on to conquer the surface of the planet.

The First Forests
These first forests changed the face of the Earth. Early land plants had already started leaking oxygen into the atmosphere, creating soils and providing food and shelter for animals, and the evolution of trees upped the pace of change. They weathered rocks, made soils deeper and richer, created complex habitats and changed the climate beyond recognition. By the end of the Devonian, an ecologically modern world had appeared. Discoveries such as the Gilboa tree are bringing stunning new insights into how the foresting of the world came about.

Plants in the Ordovician Period
Plants first colonised land in the Ordovician period, around 465 million years ago. By the early Devonian they had developed many of the features of modern plants, including a protective waxy cuticle, vascular plumbing for transporting water and nutrients, and pores called stomata to draw in carbon dioxide. But there were many differences from modern plants too. Seeds had yet to evolve - early land plants instead reproduced by way of spores. Wood, large leaves and deep roots were unknown. Few plants were taller than a few centimetres.

The Rhynie ecosystem
The best place to catch a glimpse of this primitive terrestrial forest is in the hills around the village of Rhynie in Aberdeenshire, UK. Here the finely crystallized quartz of the Rhynie chert preserves in extraordinary detail an entire ecosystem that was engulfed and petrified by silica-rich waters from a volcanic spring 410 million years ago. The fossil plants still stand upright, and even their cells remain visible. Tiny creatures such as insects, centipedes, mites, harvestmen and spider-like trigonotarbids are preserved in great detail. Some still cling to the stems on which they lived and died.

At the time, Rhynie would not have been recognisable as a modern environment. Every plant was on a small scale, knee height and lower. By the late Devonian, however, plants were on a scale that humans are used to: you could walk around a woodland and identify a tree canopy layer, a shrub layer and a herbaceous layer. The whole environment looked far more modern.

Prototaxites
The Rhynie landscape was not entirely devoid of large living things, however. Dotted here and there were featureless columns standing up to 6 metres high and a metre wide at the base. When first described in 1857, from fossils found in Quebec, they were identified as conifer trunks and named Prototaxites. However, studies into the structure of their "wood" soon revealed that they were not trees. Various alternatives were proposed, including giant algae, lichen and fungi, but the identity of Prototaxites remained uncertain.

Last year a research team at the University of Chicago decided to settle matters. They measured the ratio of different carbon isotopes in Prototaxites fossils to reveal whether it made its living from photosynthesis or by consuming rotting matter, as fungi do today. The results clearly showed that Prototaxites was a fungus.

And there was more. The isotopes also revealed that some Prototaxites were probably living off microbial crusts composed of bacteria, algae and lichens. These crusts still exist today but are confined to places where vascular plants can't grow, such as deserts. What Prototaxites shows is that there were large patches of the early Devonian landscape with no vascular plants. This means that plants didn't conquer the land as quickly and completely as scientists once assumed.

The Rise of the Forests
With large patches of land still free of vascular plants, an upgrade of their plumbing and support systems was clearly needed for the terrestrial conquest to continue. This was achieved through "secondary growth", the evolution of tougher and stronger tissues to carry water and nutrients up longer stems. Once these were in place, plants were able to grow much bigger. It was these advances that allowed the trees of the Gilboa forest to reach their full height of 8 metres or more. The foresting of the Earth had begun.

The appearance of trees changed the rules of the game, triggering an evolutionary scramble for height, for light to power photosynthesis, and for prime positions from which to disperse spores. Once it became possible to be a tree, the race was on for size and dominance. In modern ecosystems, the dominant trees hog the light of the environment: the goal is to be at the top and collect the most light.

The Gilboa Forest
So what was the Gilboa forest like? Its trees, known as cladoxylopsids, looked a bit like tree ferns, although they are not related. Their trunks were long and slender, with a bulbous base and shallow roots. They did not have leaves, instead sporting a goblet-shaped crown of branches and thread-like branchlets. These were probably green to carry out photosynthesis. In other words it must have looked like a strange, giant bottle brush.

Gilboa, 385 million years ago, was a warm, wet flood plain 10 degrees south of the equator. Since the cladoxylopsids lacked leaves, the forest would have been airier and brighter than any modern forest. There was a diverse understorey that included club mosses and ferns, but the only animal inhabitants were arthropods, including insects, centipedes and mites. It would also have been wet underfoot: as with today's ferns and mosses, cladoxylopsid spores could only be fertilised on wet surfaces, so the trees were confined to flood plains.

Worldwide Representation
Cladoxylopsids achieved worldwide success: fossils of their crowns (which until the Gilboa tree were thought to be complete plants) have been found in Europe, China and the Americas. However, in many respects they were primitive, and lurking in their shadows was a group of plants that would soon put them in the shade.

Other Plant Developments
These were the archaeopterids, relatives of the conifers. At the time of the Gilboa forest, archaeopterids were no bigger than shrubs, but their lineage soon made some key evolutionary advances, including wood, deep roots and large leaves. By 370 million years ago, a fully fledged tree, Archaeopteris, had emerged from the archaeopterid ranks.

With the trunk of a conifer and fern-like leaves, Archaeopteris reached 30 metres - as tall as a mature oak - and dominated late Devonian forests all over the world. Its wooden trunk permitted it to grow much taller than the cladoxylopsids: there is no stronger material around, per unit of mass, than wood, according to a leading researcher in the chemical make-up of fossil plants. Wood also led to the formation of the first complex soils. Soil humus is, by and large, lignin, the polymer that makes wood tough.

The Decline of CO2 Levels and the Rise of Deep Roots
Archaeopteris was also the first plant to evolve deep roots. Roots eat away at rocks, burrowing into and dissolving them with acids in pursuit of nutrients. Over an immense period of time the weathered material gets washed into the oceans, where it combines with dissolved CO2 to form sediments that are eventually subducted into the Earth's interior by tectonic activity. This process removed huge amounts of CO2 from the oceans and atmosphere, with profound consequences for the climate. Between the beginning and end of the Devonian, levels of the gas plummeted by up to 95 per cent. Greenhouse conditions vanished, to be replaced by an ice age that at its peak 300 million years ago saw glaciers approaching the tropics.

Deep Roots and Large Leaves
But oddly it was the climatic upheaval brought about by roots that appears to have driven the next great innovation - large leaves. These first appeared 390 million years ago, but only became widespread with Archaeopteris 15 million years later. The stripping of CO2 from the Devonian atmosphere helped to remove an obstacle that had been inhibiting the evolution of large leaves.

Although large, flat leaves are very efficient at capturing sunlight for photosynthesis, they are difficult to keep cool. To prevent overheating, leaves need to release water vapour through their pore-like stomata - the plant equivalent of sweating. The problem is that the number of stomata is regulated by a genetic switch that responds to CO2 levels in the atmosphere: the more CO2, the lower the stomatal density. If that genetic switch was already in place in the Devonian, high CO2 levels would have prevented plants from evolving large leaves. Any large leaves that did appear would have cooked.

Only with falling levels of CO2, and improvements in roots and vascular systems to supply cooling water, could plants evolve the high stomatal densities that make large leaves viable. Studies on fossil leaves have so far supported this idea: earlier leaves were smaller and had far fewer pores than later ones, and only after CO2 levels fell did large leaves become abundant.

The Rise of Seeds
Archaeopteris had one more innovation to offer. It evolved a method of reproduction that partially freed trees from the flood plains that had confined the cladoxylopsids. Male and female cladoxylopsid spores were the same size, and fertilisation could only occur on wet ground where nutrients were readily available to nourish the embryo. In contrast, female Archaeopteris spores were larger than male ones and stored a food supply for the embryo. From this beginning, seeds evolved.

Seed plants - the grasses, flowers, shrubs and trees that are everywhere today - are thought to have descended from Archaeopteris and its relatives. With the evolution of seeds, plants could now spread to all sorts of places that had previously been out of bounds. Seeding freed plants from a reproductive necessity on water. Seeds allow trees to occupy and colonise drier environments.

Where Do Trees Colonise?
One of the most difficult places for trees to colonise was the uplands. In 2003, scientists discovered the oldest-known fossils of mountain plants at Blanche Brook, a remote river in Newfoundland, where hundreds of giant trees lay in the river bed. The trees turned out to be 305 million years old, from the late Carboniferous. Known as cordaitaleans and standing up to 50 metres tall, they were seed plants related to conifers and looked a bit like monkey puzzle trees.

Spores versus Seeds in the Carboniferous
Elsewhere in the Carboniferous, however, sporing plants still held sway. In the swampy lowland rainforests, Archaeopteris and the cladoxylopsids were overshadowed by giant club moss and horsetail trees that towered above the seed plants jostling for space below. In modern forests the opposite is true, with seed-bearing trees dominating the sporing ferns, mosses and horsetails of the understorey. It was as if the world had been turned upside down: spores were prevailing over seeds for the last time.

Organic Carbon and Coal
These late Carboniferous lowlands are noted for the sheer abundance of their plant fossils. Much of Europe, Asia and North America lay near the equator and was blanketed by huge tracts of rainforest. Tectonic forces meant that the basins where the rainforests grew were slowly subsiding. This process, coupled with regular flooding, led to colossal amounts of organic carbon being buried. Much of the world's coal reserves formed in this 20-million-year period.

Then around 300 million years ago, a catastrophic earthquake caused one coal forest in what is now Illinois to slump below sea level, where it was rapidly buried. Low-oxygen conditions preserved a 1000-hectare expanse of this forest floor in near-pristine condition. The forest floor now forms the ceiling of the Riola and Vermilion Grove coal mines in Illinois.

The Ancient Rainforest
Scientists conducted the largest ever study of this ancient forest. The coal that had been mined out used to be the soil of the ancient rainforest, so as you walked around looking up you could see roots hanging down just above your head, and you could see giant fallen trees complete with their roots, trunk and crown.

Spectacular 40-metre club mosses and shorter tree ferns monopolised the Illinois forest. There is really nothing today that looks anything like the giant club mosses. They probably grew much closer together than trees in a modern rainforest do: without large canopies, they cast little shadow. The forest would also have been greener than its modern counterpart, as the abundant club mosses had green scale-like leaf cushions all over their trunks and branches.

Swampworld and Tectonic Forces
Swampworld was not to last. As tectonic forces dragged the world's landmasses together to form the supercontinent Pangaea, the climate changed. Drier, harsher conditions set in, and by the end of the Carboniferous the coal swamps had disappeared from most of the world. The mighty sporing trees could no longer find the water they needed to reproduce, and seed plants gained the upper hand. The reign of modern plants had truly begun.

Saturday, December 22, 2007

'Wall Of Africa' Allowed Humanity To Emerge


Finding: The accelerated uplift of mountains and highlands stretching from Ethiopia to South Africa blocked much ocean moisture, converting lush tropical forests into an arid patchwork of woodlands and savannah grasslands that gradually favored human ancestors who came down from the trees and started walking on two feet -- an energy-efficient way to search larger areas for food in an arid environment.

Scientists long have focused on how climate and vegetation allowed human ancestors to evolve in Africa. Now, University of Utah geologists are calling renewed attention to the idea that ground movements formed mountains and valleys, creating environments that favored the emergence of humanity.

Tectonics

Tectonics, the movement of Earth's crust, may have been ultimately responsible for the evolution of humankind. This movement encompasses Earth's ever-shifting tectonic plates and the creation of mountains, valleys and ocean basins. It also produced the 3,700-mile-long stretch of highlands and mountains known as "the Wall of Africa", which parallels the East African Rift valley, where many fossils of human ancestors have been found.

The influence of tectonics on human evolution has been discussed since at least 1983. But much of the previous discussion of how climate affected human evolution involves global climate changes, such as those caused by cyclic changes in Earth's orbit around the sun, rather than local and regional climate changes caused by East Africa's rising landscape.

However, over the last 7 million years crustal movement, or tectonism, drastically changed the East African landscape. That landscape controlled climate on a local to regional scale, and that climate change spurred human ancestors to evolve away from the ape line.

Hominins (the scientific term for humans (Homo) and their ancestors, including Ardipithecus, Paranthropus and Australopithecus) split from apes on the evolutionary tree roughly 7 million to 4 million years ago. The earliest undisputed hominin was Ardipithecus ramidus, 4.4 million years ago. The earliest Homo arose 2.5 million years ago, and our species, Homo sapiens, about 200,000 years ago.

A Force from within the Earth

The geological or tectonic forces shaping Africa begin deep in the Earth, where a "superplume" of hot and molten rock has swelled upward for at least the past 45 million years. This superplume and its branching smaller plumes help push apart the African and Arabian tectonic plates of Earth's crust, forming the Red Sea, Gulf of Aden and the Great Rift Valley that stretches from Syria to southern Africa.

As part of this process, Africa is being split apart along the East African Rift, a valley bounded by elevated "shoulders" a few tens of miles wide and sitting atop "domes" a few hundreds of miles wide and caused by upward bulging of the plume.

The East African Rift runs about 3,700 miles from the Ethiopian Plateau south-southwest to South Africa's Karoo Plateau. It is up to 370 miles wide and includes mountains reaching a maximum elevation of about 19,340 feet at Mount Kilimanjaro.

The rift "is characterized by volcanic peaks, plateaus, valleys and large basins and freshwater lakes," including sites where many fossils of early humans and their ancestors have been found, says Nahid Gani (pronounced nah-heed go-knee), a research scientist. There was some uplift in East Africa as early as 40 million years ago, but "most of these topographic features developed between 7 million and 2 million years ago."

A Wall Rises and New Species Evolve

The Wall of Africa started to form around 30 million years ago, but recent studies show most of the uplift occurred between 7 million and 2 million years ago, just about when hominins split off from African apes, developed bipedalism and evolved bigger brains.
Nature built this wall, and then humans could evolve, walk tall and think big.

Is there any characteristic feature of the Wall that drove human evolution?

The answer is the variable landscape and vegetation resulting from uplift of the Wall of Africa, which created a topographic barrier to moisture, mostly from the Indian Ocean, and dried the climate. Contrary to those who cite global climate cycles, the climate changes in East Africa were local and resulted from the uplift of different parts of the wall at different times.

The change from forests to a patchwork of woodland and open savannah did not happen everywhere in East Africa at the same time, and the changes also happened in East Africa later than elsewhere in the world.

The Rise of the Wall

Earlier studies of the roughly 300-mile-by-300-mile Ethiopian Plateau, the most prominent part of the Wall of Africa, indicated that the plateau reached its present average elevation of 8,200 feet 25 million years ago. A new analysis reconstructs the rates at which the Blue Nile River cut down into the plateau, creating a canyon that rivals North America's Grand Canyon.
The conclusion: There were periods of low-to-moderate incision and uplift between 29 million and 10 million years ago, and again between 10 million and 6 million years ago, but the most rapid uplift of the Ethiopian Plateau (by some 3,200 vertical feet) happened 6 million to 3 million years ago.
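Taken at face value, the figures quoted above allow a quick back-of-the-envelope calculation of the average uplift rate during the rapid phase (the numbers are simply those in the text, not independent measurements):

```python
# Back-of-envelope uplift rate implied by the quoted figures:
# 3,200 vertical feet of uplift between 6 and 3 million years ago.
FEET_TO_MM = 304.8            # 1 foot = 304.8 mm

uplift_ft = 3200
duration_yr = 6_000_000 - 3_000_000

rate_mm_per_yr = uplift_ft * FEET_TO_MM / duration_yr
print(f"Average uplift rate: {rate_mm_per_yr:.2f} mm/yr")  # ~0.33 mm/yr
```

A third of a millimetre per year sounds slow, but sustained over millions of years it is enough to raise a plateau by a kilometre.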

Other research has shown the Kenyan part of the wall rose mostly between 7 million and 2 million years ago, mountains in Tanganyika and Malawi were uplifted mainly between 5 million and 2 million years ago, and the wall's southernmost end gained most of its elevation during the past 5 million years.

The Time Frame of the Wall development and Human evolution

Clearly, the Wall of Africa grew to be a prominent elevated feature over the last 7 million years, thereby playing a prominent role in East African aridification by wringing moisture out of monsoonal air moving across the region. That period coincides with evolution of human ancestors in the area.

The earliest undisputed evidence of true bipedalism (as opposed to knuckle-dragging by apes) is 4.1 million years ago in Australopithecus anamensis, but some believe the trait existed as early as 6 million to 7 million years ago.

The shaping of varied landscapes by tectonic forces -- lake basins, valleys, mountains, grasslands, woodlands -- could also be responsible, at a later stage, for hominins developing a bigger brain as a way to cope with these extremely variable and changing landscapes in which they had to find food and survive predators.

For now the lack of more precise timeframes makes it difficult to link specific tectonic events to the development of upright walking, bigger brains and other key steps in human evolution.

Thursday, December 20, 2007

Evolution With A Restricted Number Of Genes

Finding: RNA polymerase II is highly conserved through evolution, with many of its structural characteristics being conserved between bacteria and humans. The development of higher forms of life would appear to have been influenced by RNA polymerase II. This enzyme transcribes the information coded by genes from DNA into messenger-RNA (mRNA), which in turn is the basis for the production of proteins.

Single-Cell Organisms and the Problem of Complexity
Single-cell organisms with several thousand genes providing different cellular functions were already in existence more than 500 million years ago. Further developments seemed dependent on producing even more genes.

It would appear that for a highly developed organism like a human, this form of evolution would have resulted in several million genes. But researchers were surprised to learn, following publication of the human genome, that a human has only around 25,000 genes – not many more than a fruit fly or a worm, which have approximately 15,000 to 20,000 genes.

It would appear that, over the last 500 million years, other ways to produce highly complex organisms have evolved. Evolution has simply found more efficient ways to use the genes already there. But what could have made this possible?

Is there an answer? Yes - it involves the RNA
New results represent a piece of the puzzle and shed new light on to the purpose of an unusual structure in RNA polymerase II.

They build on earlier observations that gene expression is regulated not only by binding of the enzyme to the gene locus to which it is recruited, but also during the phase of active transcription from DNA into RNA. During this phase, parts of the newly synthesised RNA may be removed and the remaining sequences combined into a new RNA message. This ‘splicing’ of RNA occurs during gene transcription and, in extreme cases, can produce RNAs coding for several thousand different proteins from a single gene.

How it Works - The Development of CTD
But what was the development that permitted this advance in gene usage? The RNA polymerase II has developed a structure composed of repeats of a 7 amino-acid sequence. In humans this structure – termed “carboxyterminal domain” or CTD – is composed of 52 such repeats. It is placed exactly at the position where RNA emerges from RNA polymerase II. In less complex organisms the CTD is much shorter: a worm has 36 repeats, and yeast as few as 26, but many single-cell organisms and bacteria have never developed an obvious CTD structure.
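The repeat structure is simple to illustrate. The consensus heptad of the CTD is Tyr-Ser-Pro-Thr-Ser-Pro-Ser (YSPTSPS), so counting repeats amounts to finding the longest run of back-to-back heptads. The sketch below uses a synthetic toy sequence; real CTDs contain many degenerate repeats, which an exact match like this undercounts:

```python
def count_ctd_repeats(sequence, heptad="YSPTSPS"):
    """Count tandem copies of the CTD consensus heptad in a protein sequence.

    Finds the longest run of back-to-back exact heptads. Real CTDs contain
    degenerate repeats, so this simplified exact match undercounts them.
    """
    n, best = len(heptad), 0
    for start in range(len(sequence)):
        run = 0
        while sequence[start + run * n : start + (run + 1) * n] == heptad:
            run += 1
        best = max(best, run)
    return best

# Synthetic example: three consensus heptads flanked by other residues.
toy = "MADE" + "YSPTSPS" * 3 + "KLMN"
print(count_ctd_repeats(toy))  # 3
```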

Although the requirement of the CTD for the expression of cellular genes in higher organisms is undisputed, the molecular details of the gene-specific maturation of RNAs are still largely enigmatic. Research groups have now shown a differential requirement for phosphorylation of the amino acid serine at position 7 of the CTD in the processing and maturation of specific gene products.

These results provide the groundwork for the discovery of further pieces of the CTD puzzle and thus enlarge our knowledge of gene regulation. Given its fundamental importance, understanding the mechanism of gene regulation is essential if we are to understand cancer and other diseases at the molecular level and develop new therapies.

Tuesday, December 18, 2007

Losses Of Long-established Genes Contribute To Human Evolution

Finding: While it is well understood that the evolution of new genes leads to adaptations that help species survive, gene loss may also afford a selective advantage. A group of scientists has investigated this less-studied idea, carrying out the first systematic computational analysis to identify long-established genes that have been lost across millions of years of evolution leading to the human species.

The idea that gene losses might contribute to adaptation has been kicked around, but not well studied.

To find gene losses, the researchers used a software program called TransMap. The program compared the mouse and human genomes, searching for genes with changes significant enough to render them nonfunctional at some point during the 75 million years since mouse and human diverged.

Genes can be lost in many ways. This study focused on losses caused by mutations that disrupt the open reading frame (ORF-disrupting mutations). These are either point mutations, where events such as the insertion or substitution of a DNA base alter the instructions delivered by the DNA, or changes that occur when a large portion of a gene is deleted altogether or moves to a new place on the genome.
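In outline, a scan for ORF-disrupting mutations can be sketched as below. This is a deliberately simplified, hypothetical check on an isolated coding sequence; TransMap itself works on whole-genome alignments:

```python
STOP_CODONS = {"TAA", "TAG", "TGA"}

def orf_disrupted(coding_seq):
    """Return a reason string if the open reading frame is disrupted, else None.

    Simplified toy check: an indel that breaks the reading frame shows up as
    a length that is not a multiple of 3; a nonsense point mutation shows up
    as a stop codon before the final (natural) stop.
    """
    if len(coding_seq) % 3 != 0:
        return "frameshift (length not a multiple of 3)"
    codons = [coding_seq[i:i + 3] for i in range(0, len(coding_seq), 3)]
    for i, codon in enumerate(codons[:-1]):   # skip the natural stop codon
        if codon in STOP_CODONS:
            return f"premature stop at codon {i + 1}"
    return None

print(orf_disrupted("ATGAAATGA"))      # intact: None
print(orf_disrupted("ATGTAAAAATGA"))   # premature stop at codon 2
print(orf_disrupted("ATGAAAA"))        # frameshift
```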

Using the Dog Genome
The dog genome was used as an out-group to filter out false positives, because the dog diverged from our common ancestor earlier than the mouse did. So if a gene is still intact in both dog and mouse but not in human, it was probably present in the common ancestor and then lost in the human lineage.
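The out-group logic reduces to a simple presence/absence filter, sketched here with a hypothetical helper; the real analysis works on alignments and applies many more filters:

```python
def lost_in_human_lineage(gene_status):
    """Decide whether a gene was probably lost on the human lineage.

    gene_status maps species name -> True if the gene is intact.
    Dog serves as the out-group: if the gene is intact in both dog and
    mouse but not in human, it was presumably present in the common
    ancestor and lost on the human lineage. A toy version of the
    filtering described above, not the actual TransMap pipeline.
    """
    return (gene_status["dog"] and gene_status["mouse"]
            and not gene_status["human"])

print(lost_in_human_lineage({"dog": True, "mouse": True, "human": False}))   # True
print(lost_in_human_lineage({"dog": False, "mouse": True, "human": False}))  # False
```

The second call returns False because a gene absent from the dog out-group may simply never have existed in the common ancestor, so its absence in human proves nothing.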

Using this process, they identified 26 losses of long-established genes, including 16 that were not previously known.

The gene loss candidates found in the study do not represent a complete list of gene losses of long-established genes in the human lineage, because the analysis was designed to produce more false negatives than false positives.

The study compares multiple genomes
Next they compared the identified genes in the complete genomes of the human, chimpanzee, rhesus monkey, mouse, rat, dog and opossum to estimate how long each gene was functional before it was lost. This refined the timing of the gene losses and also served as a benchmark for whether the gene in question was long-established, and therefore probably functional, or merely a redundant gene copy. Through this process, they found 6 genes that were lost only in the human lineage.

The ACYL3 Protein - A loss From many to none
One previously unknown loss, the gene for acyltransferase-3 (ACYL3), was particularly important. This is an ancient protein that exists throughout the whole tree of life. Multiple copies of the ACYL3 gene are encoded in the fly and worm genomes. In the mammalian clade only one copy is left, and somewhere along primate evolution even that copy was lost.

Next it was found that this gene contains a nonsense mutation in both human and chimp, yet appears to remain functional in rhesus. Further, the mutation is not present in the orangutan, so the gene is probably still functional in that species too. On the evolutionary tree leading to human, gorilla sits on the branch between chimp and orangutan, so knowing whether the gene was still active in gorilla would narrow down the timing of the loss.

The gorilla DNA sequence showed the gene intact, without the mutation, so the loss likely occurred between the speciation of gorilla and chimpanzee.
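The reasoning above can be captured as a small sketch: assuming a single loss event, the loss falls on the branch between the species closest to human that retains an intact copy and its inner neighbour. The species list and data below are illustrative, mirroring the ACYL3 pattern:

```python
# Species ordered from human outward along the primate tree (illustrative).
TREE_ORDER = ["human", "chimp", "gorilla", "orangutan", "rhesus"]

def loss_interval(intact):
    """Return (last_species_lacking, first_species_with) an intact gene.

    Assumes the gene is absent in human and was lost exactly once, so all
    species inside the loss point lack it and all species outside keep it.
    """
    for i, sp in enumerate(TREE_ORDER):
        if intact[sp]:
            return TREE_ORDER[i - 1], sp
    return TREE_ORDER[-1], None   # lost before the rhesus split

# ACYL3-like pattern: pseudogene in human and chimp, intact from gorilla out.
status = {"human": False, "chimp": False,
          "gorilla": True, "orangutan": True, "rhesus": True}
print(loss_interval(status))  # ('chimp', 'gorilla')
```

The result places the loss after the gorilla lineage split off but before human and chimp diverged, exactly the interval the gorilla sequence pinned down.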

Other Functional Losses
Acyltransferase-3 was not the only lost gene without any close functional homologue in the human genome. A highlight of the research was a list of these orphan losses. Some had been functional for more than 300 million years and were the last copies left in the human genome. While the copies of these genes remaining in the human genome appear to be nonfunctional, functional copies of all of them exist in the mouse genome.

These orphan genes may be interesting candidates for experimental biologists to explore. It will be interesting to find out what was the biological effect of these losses. Once their function is well characterized in species that still have active copies, we could maybe speculate about their effects on human evolution.

Wednesday, December 12, 2007

Human Ancestors More Primitive Than Once Thought

Finding: A team of researchers has determined through analysis of the earliest known hominid fossils outside of Africa, recently discovered in Dmanisi, Georgia, that the first human ancestors to inhabit Eurasia were more primitive than previously thought.

The fossils, dated to 1.8 million years old, show some modern aspects of lower limb morphology, such as long legs and an arched foot, but retain some primitive aspects of morphology in the shoulder and foot. The species had a small stature and brain size more similar to earlier species found in Africa.

The earliest known hominins to have lived outside Africa, in temperate zones of Eurasia, did not yet display the full set of derived skeletal features, the researchers conclude.

What this means
The new evidence shows how this species had the anatomical and behavioral capacity to be successful across a range of environments and expand out of Africa.

This research shows that the limb proportions and behavioral flexibility which allowed this species to expand out of Africa were there at least 1.8 million years ago.

Dmanisi is the site of a medieval village located about 53 miles southwest of Tbilisi, Georgia on a promontory at the confluence of the Mashavera and Phinezauri rivers.

Monday, December 10, 2007

New Insights Into The Evolution Of The Human Genome

Which came first, the chicken genome or the egg genome?

Finding: The answers provide the first evolutionary history of the duplications in the human genome that are partly responsible for both disease and recent genetic innovations.

This work marks a significant step toward a better understanding of what genomic changes paved the way for modern humans, when these duplications occurred and what the associated costs are -- in terms of susceptibility to disease-causing genetic mutations.

Researchers have answered a similar vexing genomic question: Which of the thousands of long stretches of repeated DNA in the human genome came first? And which are the duplicates?


Genomes have an ability to copy a long stretch of DNA from one chromosome and insert it into another region of the genome. Segmental duplications hold many evolutionary secrets and uncovering them is a difficult biological and computational challenge with implications for both medicine and our understanding of evolution.

Evolutionary History

In the past, the highly complex patterns of DNA duplication -- including duplications within duplications -- have prevented the construction of an evolutionary history of these long DNA duplications. To crack the duplication code and determine which of the DNA segments are originals (ancestral duplications) and which are copies (derivative duplications), the researchers looked to both algorithmic biology and comparative genomics.

Identifying the original duplications is a prerequisite to understanding what makes the human genome unstable. Researchers modified an algorithmic genome assembly technique in order to deconstruct the sequence of repeated stretches of DNA and identify the original sequences. The belief is that perhaps there may be something special about the originals, some clue or insight into what causes this colonization of the human genome.
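The genome-assembly idea being borrowed here is, in its classic form, the de Bruijn graph approach: reads are broken into overlapping k-mers and the underlying sequence is recovered as an Eulerian path through the graph. The sketch below is a textbook toy on a short synthetic sequence, not the researchers' actual duplication-analysis pipeline:

```python
from collections import defaultdict

def assemble(reads, k):
    """Assemble a sequence from reads via a de Bruijn graph Eulerian path.

    Nodes are (k-1)-mers, edges are k-mers; the assembled sequence is the
    concatenation of node labels along an Eulerian path.
    """
    graph = defaultdict(list)
    indeg, outdeg = defaultdict(int), defaultdict(int)
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            u, v = kmer[:-1], kmer[1:]
            graph[u].append(v)
            outdeg[u] += 1
            indeg[v] += 1
    # Path start: the node whose out-degree exceeds its in-degree by one.
    start = next(iter(graph))
    for node in set(list(indeg) + list(outdeg)):
        if outdeg[node] - indeg[node] == 1:
            start = node
            break
    # Hierholzer's algorithm for the Eulerian path.
    stack, path = [start], []
    while stack:
        v = stack[-1]
        if graph[v]:
            stack.append(graph[v].pop())
        else:
            path.append(stack.pop())
    path.reverse()
    # Glue node labels: each node past the first adds one new character.
    return path[0] + "".join(node[-1] for node in path[1:])

reads = ["ATG", "TGC", "GCA", "CAG", "AGT"]
print(assemble(reads, 3))  # ATGCAGT
```

The toy works because no (k-1)-mer repeats, so the Eulerian path is unique; repeated segments, such as the duplications discussed here, are precisely what makes real assembly and mosaic decomposition hard.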

This is the first time that we have a global view of the evolutionary origin of some of the most complicated regions of the human genome. The researchers tracked down the ancestral origin of more than two thirds of these long DNA duplications.

Special Findings:
First, researchers suggest that specific regions of the human genome experienced elevated rates of duplication activity at different times in our recent genomic history. This contrasts with most models of genomic duplication, which assume a continuous process of recent duplication. Second, a large fraction of the recent duplication architecture centers on a rather small subset of "core duplicons" -- short segments of DNA that come together to form segmental duplications. These cores are focal points of human gene/transcript innovations.

Not all of the duplications in the human genome are created equal. Some of them -- the core duplicons -- appear to be responsible for recent genetic innovations in the human genome. Researchers uncovered 14 such core duplicons.

In 4 of the 14 cases, there is compelling evidence that genes embedded within the cores are associated with novel human gene innovations. In two cases the core duplicon has been part of novel fusion genes whose functions appear to be radically different from their antecedents.

Results suggest that the high rate of disease caused by these duplications in the normal population may be offset by the emergence of newly minted human/great-ape specific genes embedded within the duplications. The next challenge will be determining the function of these novel genes.

Mathematical Algorithms and Biological Construction
Researchers applied their expertise in assembling genomes from millions of small fragments -- a problem that is not unlike the "mosaic decomposition" problem the team faced in analyzing duplications.

Over the years researchers have applied the 250-year-old algorithmic idea first proposed by the 18th-century mathematician Leonhard Euler (famous for, among much else, the number e and popularizing the symbol pi) to a variety of problems, and demonstrated that it works equally well for a set of seemingly unrelated biological problems including DNA fragment assembly, reconstructing snake venoms, and now dissecting the mosaic structure of segmental duplications.
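Euler's idea -- traverse every edge of a graph exactly once -- underlies de Bruijn graph assembly. Here is a minimal sketch (illustrative only; real assemblers must cope with sequencing errors and uneven coverage): break reads into k-mers, make each k-mer an edge between its two (k-1)-mer ends, then spell out an Eulerian path found with Hierholzer's algorithm.

```python
from collections import defaultdict

def assemble(reads, k):
    """Toy de Bruijn assembler in the Eulerian-path spirit.

    Assumes the reads' k-mers cover the target and that the resulting
    graph has an Eulerian path (true for clean, error-free input).
    """
    graph = defaultdict(list)          # (k-1)-mer -> list of successors
    out_deg = defaultdict(int)
    in_deg = defaultdict(int)
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            left, right = kmer[:-1], kmer[1:]
            graph[left].append(right)  # one edge per k-mer occurrence
            out_deg[left] += 1
            in_deg[right] += 1
    # An Eulerian path starts at the node with one extra outgoing edge
    # (fall back to an arbitrary node if the graph is a cycle).
    start = next((n for n in graph if out_deg[n] - in_deg[n] == 1),
                 next(iter(graph)))
    # Hierholzer's algorithm: walk unused edges, backtracking when stuck.
    stack, path = [start], []
    while stack:
        node = stack[-1]
        if graph[node]:
            stack.append(graph[node].pop())
        else:
            path.append(stack.pop())
    path.reverse()
    # Spell the sequence: first node, then the last base of each step.
    return path[0] + "".join(n[-1] for n in path[1:])
```

On repeat-free input the Eulerian path is unique; when a (k-1)-mer repeats, any Eulerian path is an equally consistent reconstruction of the same k-mer set -- the repeat ambiguity real assemblers must resolve.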

Monday, November 26, 2007

DNA is a vestige of formation of liquid crystal order

Finding: Scientists have discovered liquid crystals of ultrashort DNA molecules immersed in water, providing a new scenario for a key step in the emergence of life on Earth.

The research team found that surprisingly short segments of DNA, life's molecular carrier of genetic information, could assemble into several distinct liquid crystal phases that "self-orient" parallel to one another and stack into columns when placed in a water solution.

Life is widely believed to have emerged as segments of DNA- or RNA-like molecules in a prebiotic "soup" solution of ancient organic molecules.

The conventional view: random formation of DNA is not possible
If the formation of molecular chains as uniform as DNA by random chemistry is essentially impossible, then what are the effective ways for simple molecules to spontaneously self-select, "chain-up" and self-replicate?

What the study shows
In a mixture of tiny fragments of DNA, those molecules capable of forming liquid crystals selectively condense into droplets in which conditions are favorable for them to be chemically linked into longer molecules with enhanced liquid crystal-forming tendencies.

Even tiny fragments of double helix DNA can spontaneously self-assemble into columns that contain many molecules. From the collection of ancient molecules, short RNA pieces or some structurally related precursor emerged as the molecular fragments most capable of condensing into liquid crystal droplets, selectively developing into long molecules.

What are Liquid Crystals?
Liquid crystals are organic materials related to soap that exhibit both solid and liquid properties. They are commonly used for information displays in computers, flat-panel televisions, cell phones, calculators and watches.

What affects liquid crystals?
Most liquid crystal phase molecules are rod-shaped and have the ability to spontaneously form large domains of a common orientation, which makes them particularly sensitive to stimuli like changes in temperature or applied voltage.

RNA and DNA are chain-like polymers with side groups known as nucleotides, or bases, that selectively adhere only to specific bases on a second chain. Matching, or complementary base sequences enable the chains to pair up and form the widely recognized double helix structure. Genetic information is encoded in sequences of thousands to millions of bases along the chains, which can be microns to millimeters in length.
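The selective adhesion described here is easy to state in code. A minimal sketch of Watson-Crick pairing (illustrative; `can_form_duplex` is an invented helper): two strands can zip into a duplex when one is the reverse complement of the other.

```python
# Watson-Crick pairing rules: A pairs with T, G pairs with C.
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(strand):
    """The sequence of the strand that pairs with `strand`
    (complemented base by base, read in the opposite direction)."""
    return "".join(COMPLEMENT[base] for base in reversed(strand))

def can_form_duplex(a, b):
    """True if strands a and b are fully complementary, i.e. they can
    pair up along their whole length to form a double helix."""
    return len(a) == len(b) and b == reverse_complement(a)
```

For example, `can_form_duplex("GCCGTA", "TACGGC")` is true, while a strand of all A's cannot pair with it at all.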

Such DNA polynucleotides had previously been shown to organize into liquid crystal phases in which the chains spontaneously oriented parallel to each other. Researchers understand the liquid crystal organization to be a result of DNA's elongated molecular shape, making parallel alignment easier, much like spaghetti thrown in a box and shaken would be prone to line up in parallel.

How short is short?
A series of experiments was conducted to see how short the DNA segments could be and still show liquid crystal ordering. The team found that even a DNA segment as short as six bases, when paired with a complementary segment -- the resulting duplex measuring just two nanometers long and two nanometers in diameter -- could still assemble itself into the liquid crystal phases, in spite of having almost no elongation in shape.

What does this mean?
Structural analysis of the liquid crystal phases showed that they appeared because such short DNA duplex pairs were able to stick together "end-to-end," forming rod-shaped aggregates that could then behave like much longer segments of DNA. The sticking was a result of small, oily patches found on the ends of the short DNA segments that help them adhere to each other in a reversible way -- much like magnetic buttons -- as they expelled water in between them.

Columnar stacking is possible if the nano DNA can form duplexes
The experiments provided direct evidence for the columnar stacking of the nano DNA pieces in a fluid liquid crystal phase. The key observation with respect to early life is that this aggregation of nano DNA strands is possible only if they form duplexes. In a sample of chains in which the bases don't match and the chains can't form helical duplexes, the researchers did not observe liquid crystal ordering.

Complementary and noncomplementary DNA segments
Additional tests by the team involved mixed solutions of complementary and noncomplementary DNA segments. The results indicated that essentially all of the complementary DNA bits condensed out in the form of liquid crystal droplets, physically separating them from the noncomplementary DNA segments.

Significance for DNA molecules
The significance is that small molecules with the ability to pair up the right way can seek each other out and collect together into drops that are internally self-organized to facilitate the growth of larger pairable molecules.

DNA is a vestige of formation of liquid crystal order
The liquid crystal phase condensation selects the appropriate molecular components, and with the right chemistry would evolve larger molecules tuned to stabilize the liquid crystal phase. If this is correct, the linear polymer shape of DNA itself is a vestige of formation by liquid crystal order.

Saturday, November 24, 2007

Environmental Setting Of Human Migrations In The Circum-Pacific Region

Finding: The expansion of modern human populations into the circum-Pacific region occurred in at least four pulses, in part controlled by climate and sea level changes in the Late Pleistocene and Holocene epochs. Modern humans migrated into eastern Asia via a southern coastal route.

A new study adds insight into the migration of anatomically modern humans out of Africa and into Asia less than 100,000 years before present (BP).


Phase 1 45,000 to 40,000 BP Stable climate and sea level
The initial "out of Africa" migration was thwarted by dramatic changes in both sea level and climate and extreme drought in the coastal zone. A period of stable climate and sea level 45,000-40,000 years BP gave rise to the first major pulse of migration, when modern humans spread from India, throughout much of coastal southeast Asia, Australia, and Melanesia, extending northward to eastern Russia and Japan by 37,000 years BP.

33,000 to 16,000 BP Sea level change and cold climate
The northward push of modern humans along the eastern coast of Asia stalled north of 43° N latitude, probably due to the inability of the populations to adjust to cold waters and tundra/steppe vegetation.
The ensuing cold and dry Last Glacial period, ~33,000-16,000 years BP, once again brought dramatic changes in sea level and climate, which caused abandonment of many coastal sites.

Phase 2 16,000 to 8,000 BP Climate Warming
After 16,000 years BP, climates began to warm, but sea level was still 100 m below modern levels, creating conditions amenable for a second pulse of human migration into North America across an ice-free coastal plain now covered by the Bering Sea.
Phase 3 8,000 to 6,000 BP Climate stabilization
The stabilization of climate and sea level in the early Holocene (8,000-6,000 years BP) supported the expansion of coastal wetlands, lagoons, and coral reefs, which in turn gave rise to a third pulse of coastal settlement, filling in most of the circum-Pacific region.
A drop in sea level in the western Pacific in the mid-Holocene (~6,000-4,000 years BP) caused a reduction in productive coastal habitats, leading to a brief disruption in human subsistence along the then densely settled coast.
Phase 4 3,500 to 1,000 BP
This disruption may have helped initiate the last major pulse of human migration in the circum-Pacific region, that of the migration to Oceania, which began about 3,500 years BP and culminated in the settlement of Hawaii and Easter Island by 2000-1000 years BP.

Friday, November 23, 2007

Gene comparison between humans and other mammals

Finding: By comparing portions of the human genome with those of other mammals, researchers have discovered almost 300 previously unidentified human genes, and found extensions of several hundred genes already known.

Behind the discovery
The fundamental idea is that, as organisms evolve, sections of genetic code that do something useful for the organism change in different ways than sections that do not.

What is the human genome?
The complete sequence of the human genome was accomplished several years ago. That means that the order of the 3 billion or so chemical units, called bases, that make up the genetic code is known. What is not known is the exact location of all the short sections that code for proteins or perform regulatory or other functions.

Genes make proteins -- the basic chemical components needed for building cells. More than 20,000 protein-coding genes have been identified. This finding is important because it shows there still could be many more genes that have been missed using current biological methods. These existing methods are very effective at finding genes that are widely expressed but may miss those that are expressed only in certain tissues or at early stages of embryonic development.

Using evolution for gene discovery
This method uses evolution itself to identify genes: evolution has been running this experiment for millions of years. There are many similarities between the genes of two species, and the differences between them can be identified. The computer is the microscope used to observe the results.

Four different bases -- commonly referred to by the letters G, C, A and T -- make up DNA. Three bases in a row can code for an amino acid (the building blocks of proteins), and a string of these three-letter codes can be a gene, coding for a string of amino acids that a cell can make into a protein.
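That mapping can be sketched directly. The entries below are a handful of real codons from the standard genetic code table (the full table has 64 codons):

```python
# A small excerpt of the standard genetic code: each three-base codon
# maps to an amino acid, and three codons signal "stop translating".
CODON_TABLE = {
    "ATG": "Met", "TGG": "Trp", "GGC": "Gly", "AAA": "Lys",
    "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",
}

def translate(dna):
    """Translate a DNA coding sequence three bases at a time, stopping
    at a stop codon. Codons missing from this toy table raise KeyError."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino_acid = CODON_TABLE[dna[i:i + 3]]
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return "-".join(protein)
```

So the sequence `ATGTGGAAATAA` reads as Met-Trp-Lys followed by a stop codon.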

Conserved genes
Siepel and colleagues set out to find genes that have been "conserved" -- that are fundamental to all life and that have stayed the same, or nearly so, over millions of years of evolution.
The researchers started with "alignments" discovered by other workers -- stretches up to several thousand bases long that are mostly alike across two or more species.

Over millions of years, individual bases can be swapped -- C to G, T to A, for example -- by damage or miscopying. Changes that alter the structure of a protein can kill the organism or send it down a dead-end evolutionary path. But conserved genes contain only minor changes that leave the protein able to do its job. The computer looked for regions with those sorts of changes by creating a mathematical model of how the gene might have changed, then looking for matches to this model.
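A toy version of that scan -- much simpler than the probabilistic model the researchers actually used, and shown only to make the idea concrete -- scores each window of a two-species alignment by the fraction of identical bases and flags windows above a threshold as candidate conserved regions.

```python
def conserved_windows(seq_a, seq_b, window=6, threshold=0.8):
    """Return start positions of windows where the two aligned
    sequences agree at `threshold` or more of their bases -- a crude
    stand-in for a real model of how conserved sequence changes."""
    hits = []
    for i in range(len(seq_a) - window + 1):
        matches = sum(a == b for a, b in
                      zip(seq_a[i:i + window], seq_b[i:i + window]))
        if matches / window >= threshold:
            hits.append(i)
    return hits
```

Real methods replace the identity score with a statistical model of substitution rates, so that changes which preserve a protein's function count differently from changes which would break it.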

Wednesday, November 21, 2007

Migration of Early Humans From Africa Aided By Wet Weather

Finding: Migration out of Africa 200,000 to 150,000 years ago depended on wet climate in the presently hyper-arid Saharan-Arabian desert.

Conventional thinking
This migration was dependent on the occurrence of wetter climate in the region. There is good evidence that the southern and central Saharan-Arabian desert experienced increased monsoon precipitation during this period (200,000 to 150,000 years ago), but there is no unequivocal evidence for a corresponding rainfall increase in the northern part of the migration corridor, including the Sinai-Negev land bridge between Africa and Asia.

Passage through this "bottleneck" region would have been dependent on the development of suitable climate conditions.

Uranium series dating method - Speleothems
Scientists present a reconstruction of paleoclimate in the Negev Desert based on absolute uranium series dating of carbonate cave deposits (speleothems). Speleothems form only when rainwater enters the groundwater system and vegetation grows above a cave.

Today the climate in the Negev Desert is very arid and speleothems do not form, but their presence in a number of caves clearly indicates that conditions were wetter in the past. Scientists have dated 33 speleothem samples from five caves in the central and southern Negev Desert.
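The dating principle can be sketched in its simplest closed form. Assuming the speleothem crystallizes with no initial 230Th and with uranium in secular equilibrium (234U/238U activity ratio of 1), the measured (230Th/238U) activity ratio R grows toward 1 as R = 1 - exp(-lambda*t), which inverts to t = -ln(1 - R)/lambda. This is a sketch under those stated assumptions; real U-series dating also corrects for initial 234U/238U disequilibrium and detrital thorium.

```python
import math

HALF_LIFE_TH230 = 75_690.0                  # years (approximate)
LAMBDA_TH230 = math.log(2) / HALF_LIFE_TH230  # decay constant, per year

def u_series_age(th230_u238_activity_ratio):
    """Simplified U-Th age of a carbonate: assumes no initial 230Th and
    uranium in secular equilibrium, so the ingrowth of 230Th follows
    R = 1 - exp(-lambda * t) and the age is t = -ln(1 - R) / lambda."""
    r = th230_u238_activity_ratio
    return -math.log(1.0 - r) / LAMBDA_TH230
```

A ratio of 0.5 gives back one 230Th half-life (~75,690 years), and a ratio of about 0.72 corresponds to roughly 139,000 years -- in the range of the speleothem ages discussed here.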

Increased rainfall in the central and southern Negev Desert
The ages of these speleothems show that the last main period of increased rainfall occurred between 140,000 and 110,000 years ago. The climate during this time consisted of episodic wet events that enabled the deserts of the northeastern Sahara, Sinai, and the Negev to become more hospitable for the movement of early modern humans.

Wet periods in the North and South parts of the Saharan-Arabian desert
The simultaneous occurrence of wet periods in the northern and southern parts of the Saharan-Arabian desert may have led to the disappearance of the desert barrier between central Africa and the Levant.

The humid period in the Negev Desert between 140,000 and 110,000 years ago was preceded and followed by essentially unbroken arid conditions; thus creating a climatic "window" for early modern human migration to the Levant.

Thursday, November 15, 2007

Evolution Is Deterministic, Not Random -- Multi-species Study

Finding: Based on a study of different species of roundworms, biologists in an international team have concluded that developmental evolution is deterministic and orderly, and not the random process many previously believed.

If organs do not change, how does evolutionary development work in those organs?

Enter the study involving the female copulatory and egg-laying organ, the vulva, found in nearly 50 species of roundworms. The conventional wisdom is that because the vulva does not significantly change across species, one might predict that there would be little variation in vulva development. But that is not the case. Researchers found a lot of developmental variation. They concluded that this variation, since it did not affect the final adult vulva, could not have evolved in a random fashion.

The research team looked at more than 40 characteristics of vulva development, including cell death, cell division patterns, and related aspects of gonad development. They plotted the evolution of these traits on a new phylogenetic tree, which illustrates how species are related to one another and provides a map as to how evolutionary changes are occurring.

Unidirectional changes
Their results showed an even greater number of evolutionary changes in vulva development than they had expected. But they found that evolutionary changes among these species were unidirectional in nearly all instances.

The decline of cell divisions
For example, they concluded that the number of cell divisions needed in vulva development declined over time instead of randomly increasing and decreasing.
The decline in the number of rings
In addition, the number of rings used to form the vulva consistently declined during the evolutionary process.
These results demonstrate that, even where you might expect evolution to be random, it is not.

Tuesday, November 13, 2007

Tiny DNA Molecules Show Liquid Crystal Phases, Pointing Up New Scenario For First Life On Earth

Finding: Scientists have discovered some unexpected forms of liquid crystals of ultrashort DNA molecules immersed in water, providing a new scenario for a key step in the emergence of life on Earth.

Short segments of DNA, life's molecular carrier of genetic information, could assemble into several distinct liquid crystal phases that "self-orient" parallel to one another and stack into columns when placed in a water solution.

Life is widely believed to have emerged as segments of DNA- or RNA-like molecules in a prebiotic "soup" solution of ancient organic molecules.

Since the formation of molecular chains as uniform as DNA by random chemistry is essentially impossible, scientists have been seeking effective ways for simple molecules to spontaneously self-select, "chain-up" and self-replicate. The new study shows that in a mixture of tiny fragments of DNA, those molecules capable of forming liquid crystals selectively condense into droplets in which conditions are favorable for them to be chemically linked into longer molecules with enhanced liquid crystal-forming tendencies.

Tiny fragments of double helix DNA can spontaneously self-assemble into columns that contain many molecules. The vision is that, from the collection of ancient molecules, short RNA pieces or some structurally related precursor emerged as the molecular fragments most capable of condensing into liquid crystal droplets, selectively developing into long molecules.

Liquid crystals -- organic materials related to soap that exhibit both solid and liquid properties -- are commonly used for information displays in computers, flat-panel televisions, cell phones, calculators and watches. Most liquid crystal phase molecules are rod-shaped and have the ability to spontaneously form large domains of a common orientation, which makes them particularly sensitive to stimuli like changes in temperature or applied voltage.

RNA and DNA are chain-like polymers with side groups known as nucleotides, or bases, that selectively adhere only to specific bases on a second chain. Matching, or complementary base sequences enable the chains to pair up and form the widely recognized double helix structure. Genetic information is encoded in sequences of thousands to millions of bases along the chains, which can be microns to millimeters in length.

Such DNA polynucleotides had previously been shown to organize into liquid crystal phases in which the chains spontaneously oriented parallel to each other. Researchers understand the liquid crystal organization to be a result of DNA's elongated molecular shape, making parallel alignment easier, much like spaghetti thrown in a box and shaken would be prone to line up in parallel.

Teams at CU-Boulder and University of Milan began a series of experiments to see how short the DNA segments could be and still show liquid crystal ordering. The teams found that even a DNA segment as short as six bases, when paired with a complementary segment that together measured just two nanometers long and two nanometers in diameter, could still assemble itself into the liquid crystal phases, in spite of having almost no elongation in shape.

Structural analysis of the liquid crystal phases showed that they appeared because such short DNA duplex pairs were able to stick together "end-to-end," forming rod-shaped aggregates that could then behave like much longer segments of DNA. The sticking was a result of small, oily patches found on the ends of the short DNA segments that help them adhere to each other in a reversible way -- much like magnetic buttons -- as they expelled water in between them.

A key characterization technique employed was X-ray microbeam diffraction combined with in-situ optical microscopy, carried out with researchers from Argonne and Brookhaven National Laboratories. The experiments provided direct evidence for the columnar stacking of the nano DNA pieces in a fluid liquid crystal phase.

The key observation with respect to early life is that this aggregation of nano DNA strands is possible only if they form duplexes. In a sample of chains in which the bases don't match and the chains can't form helical duplexes, the researchers did not observe liquid crystal ordering.

Subsequent tests by the team involved mixed solutions of complementary and noncomplementary DNA segments. The results indicated that essentially all of the complementary DNA bits condensed out in the form of liquid crystal droplets, physically separating them from the noncomplementary DNA segments.

It means that small molecules with the ability to pair up the right way can seek each other out and collect together into drops that are internally self-organized to facilitate the growth of larger pairable molecules.

In essence, the liquid crystal phase condensation selects the appropriate molecular components, and with the right chemistry would evolve larger molecules tuned to stabilize the liquid crystal phase. If this is correct, the linear polymer shape of DNA itself is a vestige of formation by liquid crystal order.

Tuesday, November 6, 2007

Fantasy Heroes and Intelligent Design

Many intelligent design proponents question the slow and methodical evolutionary march to sophistication. They believe that sophistication came all at once, that it was designed that way... because the designer knew beforehand what the final design was to look like, so there was no need for intermediate steps.

OK. Prove it!

This reminds me of the time when I was young and believed in superheroes and superhuman feats. Comic book heroes like Superman and The Fantastic Four all had extraordinary gifts and abilities. I soon realized that such fantasy powers couldn't come true. So I traded those comic book fantasy powers for other fantasy powers, like hitting .500 in baseball, or throwing 6 touchdowns per game without an interception, or carrying a ball for an average of 8 yards per carry. It soon dawned on me why most hitters hit .260, why most quarterbacks throw only 2 touchdown passes and also throw picks, and why most runners average only 2.5 yards per carry. Reality bites.

The intelligent design argument is much like believing in the fantasy world of the comic books. It is a fine world, but reality bites. If you want to produce a scientific basis for Intelligent design, as a viable alternative to Evolution, I and most other scientists would be glad to listen.

So develop a scientific theory of Intelligent Design... prove it!

Sunday, November 4, 2007

Simple to complex, then complex to simple

There has been one rule that evolutionary biologists felt they could cling to: the amount of complexity in the living world has always been on the increase. Now even that is in doubt.

The Tree of Life
In recent years, genetic analysis has forced biologists to consider the possibility that organisms such as the sea squirt might have lost some of the complexity of their ancestors. Yet even now, few recognise the full implications of loss as a key player in evolution. The entire tree of life has been built on the assumption that evolution entails increasing complexity. So, for example, if two groups of animals were considered close because both had a particular prominent feature, and someone then discovered a third, intermediate line that lacked that feature but shared many other aspects of the two groups, traditional phylogeneticists would conclude that the feature had arisen independently in the two outlying groups, by a process known as convergent evolution.

They often did not even consider the alternative explanation: that the feature in question had evolved just once in an ancestor of all three groups, and had subsequently been lost in the intermediate one. Now a handful of molecular biologists are considering that possibility.

Instead of simply looking to see whether two species share certain genes, the new approach involves taking the "molecular fingerprint" of different types of cells. It identifies the unique combination of transcription factors - molecules that control which of a cell's genes are turned on and when - that specify the make-up of a cell, including the molecular signals it transmits and receives.

If two groups of organisms share the same type of cells, with the same molecular fingerprint, giving rise to similar features in both, then it is extremely unlikely that these features evolved twice. So any intermediate groups of organisms that lack that feature would most likely have lost it during the course of evolution. Only now, with the ability to explore at the molecular level how morphological features have been lost, gained and modified over time, is the true extent of evolutionary loss coming to light.
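The logic of that inference can be sketched with fingerprints reduced to bare sets of transcription-factor names -- a drastic simplification of the real data, with the factor names below used purely as placeholders:

```python
def same_fingerprint(cell_a, cell_b):
    """Toy comparison: two cell types match if they use the same
    combination of transcription factors."""
    return set(cell_a) == set(cell_b)

def infer_loss(groups_with_feature, fingerprints):
    """If the groups that have a feature share an identical cellular
    fingerprint, infer a single origin -- so a related group lacking
    the feature is best explained by loss, not convergent gain."""
    names = list(groups_with_feature)
    if all(same_fingerprint(fingerprints[names[0]], fingerprints[n])
           for n in names[1:]):
        return "single origin; absence elsewhere implies loss"
    return "fingerprints differ; convergent evolution remains plausible"

# Hypothetical data: groups A and B build the feature from cells with
# the same transcription-factor combination; group C does not.
fingerprints = {
    "groupA": {"Pax6", "Otx"},
    "groupB": {"Otx", "Pax6"},
    "groupC": {"Sox2"},
}
```

The point is only the shape of the argument: a shared fingerprint makes two independent inventions of the same cell type extremely unlikely.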

Finding: An ancestor may develop certain traits, only to be followed by generations of descendant variants that lose the trait, and later ones that redevelop it.

Friday, November 2, 2007

500 million year old Jellyfish

Finding: Using "fossil snapshots" found in rocks more than 500 million years old, three University of Kansas researchers have described the oldest definitive jellyfish ever found. This is significant because the fossil record is biased against soft-bodied life forms such as jellyfish, which leave little behind when they die. As a result, scientists are still working out the evolutionary development of many soft-bodied animals. But now they have a clear picture of what at least one soft-bodied form, the jellyfish, looked like.

Here is what happened: researchers describe four types of cnidarian fossils preserving traits that allow them to be related to modern orders and families of jellyfish. The specimens are about 200 million years older than the oldest previously discovered jellyfish fossils.

Rapid Species diversification
The jellyfish the group describes, found in Utah, offer insights into the puzzle of the rapid species diversification and development that occurred during the Cambrian radiation, a time when most animal groups appear in the fossil record, beginning roughly 540 million years ago. The fossil record reveals much less about the origin and early evolution of animals such as jellyfish than it does about animals with hard shells or bones.

The problem's resolution:
The fossil record is full of circular blobs, some of which are jellyfish. That's one of the reasons the fossils described here are so interesting: you can see a distinct bell shape, tentacles, muscle scars and possibly even the gonads. The jellyfish left behind a film in fine sediment that resembles a picture of the animal. Most jellyfish do not leave such a clear impression behind because they are often preserved in coarse sand.

The four different types of jellyfish discovered in the Cambrian, however, provide enough detail to assert that they can be related to the modern orders and families of jellyfish. The specimens show the same complexity. That means that either the complexity of modern jellyfish developed rapidly roughly 500 million years ago, or the group is even older and existed long before then.

The jellyfish described in the article are also unique because they push the known occurrence of definitive jellyfish back from 300 million to 505 million years, a huge jump, and show more detail than anything previously described that is younger.

Monday, October 29, 2007

Dawn Of Animal Vision Discovered

Finding: By peering deep into evolutionary history, scientists at the University of California, Santa Barbara have discovered the origins of photosensitivity in animals, dating it to roughly 600 million years ago.

The scientists studied light-receptive genes in the aquatic animal Hydra, a member of Cnidaria -- an ancient group of animals, existing for hundreds of millions of years, that includes corals, jellyfish, and sea anemones.

Not only are we the first to analyze these vision genes (opsins) in these early animals, but because we don't find them in earlier evolving animals like sponges, you can put a date on the evolution of light sensitivity in animals.

Now there is a time frame for the evolution of animal light sensitivity. We know its precursors existed roughly 600 million years ago.

There are only a handful of cases where scientists have documented the very specific mutational events that have given rise to new features during evolution.

Anti-evolutionists often argue that mutations, which are essential for evolution, can only eliminate traits and cannot produce new features. Such claims are simply wrong. Specific mutational changes in a particular duplicated gene (opsin) allowed the new gene to interact with different proteins in new ways. Today, these different interactions underlie the genetic machinery of vision, which is different in various animal groups.

Hydras are predators, and the authors speculate that they use light sensitivity in order to find prey. Hydra use opsin proteins all over their bodies, but they are concentrated in the mouth area, near the tip of the animal. Hydras have no eyes or light-receptive organs, but they have the genetic pathways to be able to sense light.

A map of Human Global Migration

Palaeontologists, archaeologists and geneticists are piecing together the migration picture.

As a coherent picture emerges, however, new mysteries arise. It is looking likely that our species appeared far earlier than previously suspected - and remained in Africa for tens of thousands of years before going global. It could be said that humans were all dressed up and going nowhere.


Why the delay? And when our ancestors finally flocked onto the world stage, their spread was remarkably rapid. What caused them to explode out of Africa when they did? What circumstances suddenly allowed those early humans to smash down their boundaries like no species before or since?

Friday, October 26, 2007

Map of Neanderthal Geography


Neanderthals have been at the centre of many of the most intense debates in palaeoanthropology ever since the discovery of their bones spawned the field 150 years ago. A popular caricature portrays them as beetle-browed brutes, but this is far from the truth. Neanderthals were sophisticated stone-tool makers and made razor-sharp knives out of flint. They made fires when and where they wanted, and seem to have made a living by hunting large mammals such as bison and deer. Neanderthals also buried their dead, which, fortunately for researchers, increases the odds of the bones being preserved.


Bones and artefacts leave a whole range of questions wide open, though. How exactly are Neanderthals related to us? Did our ancestors interbreed with them, and if so, do modern Eurasians still carry a little Neanderthal DNA?


Just how "human" were they? There's only one way to be sure: By sequencing their entire genome we can begin to learn more about their biology. What's more, if we can answer the genetic questions we might solve the biggest mystery of all: why did Neanderthals die out while modern humans went on to conquer the globe?

Monday, October 15, 2007

Duplication and Division of Labor - The Mechanics of Evolution

Finding: Evolution works by gene duplication and division of labor.

Researchers at the University of Wisconsin-Madison have provided an exquisitely detailed picture of natural selection as it occurs at the genetic level.

Two scientists document how, over many generations, a single yeast gene divides in two and parcels out its responsibilities, making the yeast a more efficient denizen of its environment. The work illustrates, at the most basic level, the driving force of evolution.

This is how new capabilities arise and new functions evolve. This is what goes on in butterflies and elephants and humans. It is evolution in action.

The work is important because it provides the most fundamental view of how organisms change to better adapt to their environments. It documents the workings of natural selection, the critical idea, first posited by Charles Darwin, whereby organisms accumulate random variations and changes that enhance survival are "selected" by being genetically transmitted to future generations.

The new study replayed a set of genetic changes that occurred in yeast 100 million or so years ago, when a critical gene was duplicated and the two copies then divided its nutrient-processing responsibilities, letting the organism better utilize the sugars it depends on for food.

One source of newness is gene duplication

When you have two copies of a gene, useful mutations can arise that allow one or both genes to explore new functions while preserving the old function. This phenomenon is going on all the time in every living thing. Many of us are walking around with duplicate genes we're not aware of. They come and go.

Two genes can be better than one because redundancy promotes a division of labor. Genes may do more than one thing, and duplication adds a new genetic resource that can share the workload or add new functions. For example, in humans the ability to see color requires different molecular receptors to discriminate between red and green, but both arose from the same vision gene.

The difficulty in seeing the steps of evolution, Carroll says, is that in nature genetic change typically occurs at a snail's pace, with very small increments of change among the chemical base pairs that make up genes accumulating over thousands to millions of years.

To measure such small change requires a model organism like simple brewer's yeast that produces a lot of offspring in a relatively short period of time. Yeast, Carroll argues, are perfect because their reproductive qualities enable study of genetic change at the deepest level and greatest resolution because researchers can produce and quickly count a large number of organisms. The same work in fruit flies, one of biology's most powerful models, would require "a football stadium full of flies" and years of additional work, Carroll explains.

The process of becoming better occurs in very small steps. When compounded over time, these very small changes make one group of organisms successful and they out-compete others.

The new study involved swapping out different regions of the yeast genome to assess their effects on the performance of the twin genes, as well as engineering in the gene from another species of yeast that had retained only a single copy.

Retracing the Steps of Evolution
The work shows in great detail how the ancestral gene gained efficiency through duplication and division of labor. The two descendant genes became optimally tuned to their jobs. They work in cahoots, but together they are better at the job the ancestral gene held alone. Natural selection has taken one gene with two functions and sculpted an assembly line with two specialized genes.
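The duplication-and-specialization story can be illustrated with a toy simulation. This is a deliberately simplified sketch, not the researchers' actual model: each "gene" is a pair of activity levels for two functions, with a fixed activity budget so that no single gene can be perfect at both jobs, and mutations are kept only when they do not reduce fitness.

```python
import random

random.seed(1)

def fitness(genes):
    # The organism needs both functions; what matters is the best available
    # activity for each function across all gene copies.
    best_a = max(g[0] for g in genes)
    best_b = max(g[1] for g in genes)
    return best_a * best_b

def mutate(genes):
    # Nudge one function of one random copy. Each copy has a fixed activity
    # budget (a + b <= 1), so a gene cannot excel at both jobs at once --
    # this trade-off is what makes division of labor pay off.
    i = random.randrange(len(genes))
    g = list(genes[i])
    j = random.randrange(2)
    g[j] = min(1.0, max(0.0, g[j] + random.uniform(-0.05, 0.05)))
    if g[0] + g[1] > 1.0:
        g[1 - j] = 1.0 - g[j]
    return genes[:i] + [tuple(g)] + genes[i + 1:]

ancestor = [(0.5, 0.5)]                 # one gene, two so-so functions
print("single-gene fitness:", fitness(ancestor))   # 0.25

genome = [(0.5, 0.5), (0.5, 0.5)]      # duplication event
for _ in range(20000):                  # keep mutations that don't hurt
    trial = mutate(genome)
    if fitness(trial) >= fitness(genome):
        genome = trial

print("copies after selection:", [tuple(round(x, 2) for x in g) for g in genome])
print("duplicated-gene fitness:", round(fitness(genome), 2))
```

Run it and the two copies drift toward opposite specialties, pushing fitness well above what one budget-limited gene could ever achieve on its own.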

Friday, October 12, 2007

Is Junk DNA really Junk?

The discovery of the structure of DNA led to the idea that genomes are merely a series of DNA sequences, or genes, that code for proteins. Yet a paradox soon emerged: some relatively simple creatures turned out to have much larger genomes than more complex ones. Why would they need more genes?

What does DNA code for? Proteins, and the traits they produce. So do simple creatures really need more genes? They don't. It rapidly became clear that in animals and plants, most DNA does not code for proteins: early genome studies showed that 98 per cent of our DNA is of the non-coding variety. But even back in the 1970s it was obvious that not all non-coding DNA is junk. Some of it is regulatory: sequences to which certain proteins bind can boost or block the expression of genes nearby. Such DNA is clearly important.

Over the years, more and more tiny bits of non-coding DNA have turned out to have a regulatory role or some other function, though until recently such sequences were believed to make up only a small part of non-coding DNA. Only in the past decade, as the genomes of more and more species have been sequenced and compared, has the bigger picture begun to emerge.

Conservation of Genes
Even though it is 450 million years since the ancestors of pufferfish and humans parted ways, everyone expected that we would still share many of the same genes - as proved to be the case. Most of the protein-coding DNA in different vertebrates is very similar or "conserved". The surprise was that even more of the non-coding DNA is conserved, too. Why did this occur?

DNA is constantly mutating due to copying mistakes and damage from chemicals and radiation. Specific sequences will be conserved only if natural selection weeds out any offspring with changes in these sequences. This will happen only if the changes are harmful, so researchers are convinced that all the conserved non-coding DNA must do something important. Why else would evolution hang on to it?
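As a minimal illustration of how "conserved" is measured, the sketch below scores percent identity between two pre-aligned sequence fragments. The sequences are invented for illustration; real comparisons use alignment tools that handle gaps and rearrangements.

```python
def percent_identity(a, b):
    """Fraction of aligned positions with the same base. Assumes equal-length,
    pre-aligned, gap-free sequences -- a deliberate simplification."""
    assert len(a) == len(b)
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return 100.0 * matches / len(a)

# Hypothetical orthologous fragments (made up for illustration).
human  = "ATGGCGTACGTTAGCCTAGGATCA"
puffer = "ATGGCATACGTTAGCCTGGGATCA"

print(f"identity: {percent_identity(human, puffer):.1f}%")   # 91.7%
```

A sequence kept above, say, 90 per cent identity across 450 million years of independent mutation is exactly the kind of signal that flags conserved, and therefore presumably functional, DNA.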

Those regions really challenge our understanding of biology. To find out what conserved non-coding DNA does, scientists recently added extra copies of some of these sequences to mice. It's like taking a few extra pages and stapling them into a book.

Ultra-conserved
The team tested copies of "ultra-conserved" sequences that are almost exactly the same, base for base, in mouse, rat and human. Nearly half of the sequences tested boosted gene expression in specific tissues, especially near genes involved in nervous system development, the team reported last year.

This suggests that much of the conserved non-coding DNA is needed to make a brain cell, say, different from a skin cell. However, conserved DNA still accounts for only a tiny proportion of the genome. Even counting the 1.2 per cent of coding DNA, the human sequences found in other mammals add up to just 5 per cent. What's the other 95 per cent for?

One possibility is that some of the DNA whose sequence is not conserved might be conserved in a different sense. Regulatory sequences are essentially binding sites for proteins, so what matters is their three-dimensional structure. And while the conventional view is that the 3D structure of DNA is closely related to its sequence, scientists have found evidence that some regulatory regions share similar structures even though their sequences are different. Looked at this way, the total amount of conserved DNA could be much higher.

The RNA transcription factor
Another line of evidence suggesting that some non-conserved DNA has a function comes from looking at which DNA sequences get transcribed into RNA. It used to be thought that, with a few exceptions, most RNAs were produced as the first step in making proteins.

Protein-coding genes contain vast stretches of non-coding DNA called introns, which make up a quarter of our genome. These introns are transcribed into RNA but immediately edited out of the "raw" RNA. The resulting "processed" RNAs represent just 2 per cent of the genome.
A few years ago, however, scientists showed that far more than 2 per cent of the genome gets transcribed into RNA. The latest estimates are that 85 to 97 per cent of the entire genome is transcribed into raw RNA, resulting in processed RNAs representing 18 per cent of the genome.

Clearly, most of this RNA is non-coding, or ncRNA. So what is it for? While some of the very small ncRNAs have a big role in the control of gene expression, most ncRNA remains mysterious.

Sunday, October 7, 2007

Science and Falsifiability

Science works on several principles. The most important is the search for truth: experiments are replicated to verify a claim as true. But equally important is the possibility of showing that a claim believed to be true is actually false. That is the falsifiability criterion. Here are some examples:

The Ptolemaic system of astronomy made some very important claims about how the solar system was structured: that Earth was the center of the solar system, maybe even of the universe, and that the stars and the planets orbited around it.


These were scientific claims and as such were subject to verification. As astronomers grew interested in the stars, they began to examine these claims. Leonardo da Vinci, Galileo, Kepler, Copernicus, Tycho Brahe and Isaac Newton all investigated them and found that these claims were false. The Earth was not the center of the solar system; the Sun was.

The claim was falsifiable. That was important. As a scientific claim it was wrong. It was shown to be wrong, and several astronomers, physicists, and mathematicians were able to verify the experiments that showed how the claim was false, and a new claim was true.

This is the essence of science.

Can the same be said about evolution? Can it be shown to be falsifiable? This is a critical question, because if it is not falsifiable, then it is like Intelligent Design: a philosophical claim that can be neither verified nor denied.

One way to show that evolution is false would be to show that there are no variations in the human or animal kingdom. But that is not the case. Just recently there was a sad case of an Indian girl who was born with four arms and four legs, showing that genetic variations are possible. But other tests are needed to probe evolution as a function of genetic mutation.

One experiment would be to expose cells to radiation. If no genetic mutations resulted, then genetic mutation would be suspect as a vehicle of evolution. But as it is, there are many cases of radiation leading to genetic mutation. This, however, is the falsifiability condition.

Friday, September 28, 2007

DNA microarray

A DNA microarray (also commonly known as gene or genome chip, DNA chip, or gene array) is a collection of microscopic DNA spots, commonly representing single genes, arrayed on a solid surface by covalent attachment to chemically suitable matrices.

DNA arrays are different from other types of microarray only in that they either measure DNA or use DNA as part of their detection system. Qualitative or quantitative measurements with DNA microarrays utilize the selective nature of DNA-DNA or DNA-RNA hybridization under high-stringency conditions and fluorophore-based detection. DNA arrays are commonly used for expression profiling, i.e., monitoring expression levels of thousands of genes simultaneously, or for comparative genomic hybridization.

Microarray technology is often used for gene expression profiling. It makes use of the sequence resources created by the genome sequencing projects and other sequencing efforts to answer the question, what genes are expressed in a particular cell type of an organism, at a particular time, under particular conditions?

For instance, they allow comparison of gene expression between normal and diseased (e.g., cancerous) cells. There are several names for this technology: DNA microarrays, DNA arrays, DNA chips, gene chips, and others. Sometimes a distinction is made between these names, but in fact they are all synonyms, as there are no standard definitions for which type of microarray technology should be called by which name.

Microarrays exploit the preferential binding of complementary nucleic acid sequences. A microarray is typically a glass slide on to which DNA molecules are attached at fixed locations (spots or features). There may be tens of thousands of spots on an array, each containing a huge number of identical DNA molecules (or fragments of identical molecules), of lengths from twenty to hundreds of nucleotides. The spots are either printed on the microarray by a robot, or synthesized by photolithography (similar to computer chip production) or by ink-jet printing. Commercial microarrays are available; however, many academic labs produce their own.
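As a sketch of what expression profiling does with the raw spot intensities, the snippet below computes log2 ratios for a few hypothetical two-colour spots. The gene names and intensity numbers are invented, and real pipelines also normalize intensities and correct for background.

```python
import math

# Hypothetical two-colour microarray readings: fluorescence intensities for
# the same spots in a "normal" and a "diseased" sample (numbers invented).
spots = {
    "geneA": (1200.0, 150.0),   # (normal, diseased)
    "geneB": (480.0, 510.0),
    "geneC": (95.0, 760.0),
}

for gene, (normal, diseased) in spots.items():
    # log2 ratio: positive = up-regulated in disease, negative = down-regulated.
    ratio = math.log2(diseased / normal)
    call = "up" if ratio > 1 else "down" if ratio < -1 else "unchanged"
    print(f"{gene}: log2 ratio {ratio:+.2f} ({call})")
```

With these numbers, geneA comes out strongly down-regulated (log2 ratio -3), geneB essentially unchanged, and geneC strongly up-regulated (log2 ratio +3), which is the kind of call list a normal-versus-cancer comparison produces for thousands of genes at once.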

Microarrays that contain all of the about 6000 genes of the yeast genome have been available since 1997. The latest generations of commercial microarrays represent the entire human genome, more than 30,000 genes, on two microarrays.

Tuesday, September 25, 2007

Genetic Carrying Handles: Cloning Vectors

In order to clone a gene, its DNA sequence must be attached to some kind of carrier, also made of DNA, that can take it into the cell. Biologists call these carriers vectors. A vector acts like a handle for the DNA, and it also contains other tools such as an origin of replication and a selective marker.

The origin of replication is a DNA sequence that the host cell recognizes, allowing it to make more copies of the cloned DNA; it is where the cell begins copying the vector and the attached clone DNA. The selective marker is a specific DNA sequence that biologists use to tell whether the clone has entered the cell; markers are usually genes that confer antibiotic resistance on the cell.

The most common medium used for this process is actually very similar to chicken soup, but the carbohydrate agarose is added to convert the medium into a semi-solid substance, since bacterial colonies are much easier to detect on a semi-solid surface. Agarose is much like gelatin, but it comes from seaweed, and unlike gelatin, most bacteria cannot digest it. Antibiotics are often added to the media to kill any cells that do not possess the antibiotic-resistance marker gene carried with the clone DNA. This way, biologists can ensure that all the remaining cells have in fact taken up the clone DNA and its vector.

These cells are called transfected cells. Antibiotics are not the only way to identify transfected cells. Biologists sometimes use selective markers that turn cells a different colour or even make them glow. Common proteins that do this include luciferase, which makes fireflies glow, and green fluorescent protein, which comes from certain species of jellyfish. Green fluorescent protein even comes in other colours!
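The logic of antibiotic selection can be sketched in a few lines. This is a toy model, not a lab protocol: the 10 per cent vector-uptake rate is an arbitrary illustrative number, not a measured transfection efficiency.

```python
import random

random.seed(0)

# Toy model: only a fraction of cells take up the vector, which carries the
# antibiotic-resistance marker. 10% uptake is an assumed, illustrative rate.
cells = [{"has_vector": random.random() < 0.1} for _ in range(1000)]

# Plating on antibiotic media: cells without the marker do not survive.
survivors = [c for c in cells if c["has_vector"]]

print(f"{len(survivors)} of {len(cells)} plated cells formed colonies")
```

Every surviving colony is guaranteed to carry the vector, which is exactly why resistance markers make such convenient genetic handles.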

Sunday, September 23, 2007

Bacterial artificial chromosome




A bacterial artificial chromosome (BAC) is a DNA construct, based on a fertility plasmid (or F-plasmid), used for transforming and cloning in bacteria, usually E. coli. F-plasmids play a crucial role because they contain partition genes that promote the even distribution of plasmids after bacterial cell division. The bacterial artificial chromosome's usual insert size is 150 kbp, with a range from 100 to 300 kbp. A similar cloning vector, called a PAC has also been produced from the bacterial P1-plasmid.


BACs are often used to sequence the genetic code of organisms in genome projects, for example the Human Genome Project. A short piece of the organism's DNA is amplified as an insert in BACs, and then sequenced. Finally, the sequenced parts are rearranged in silico, resulting in the genomic sequence of the organism.
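The "rearranged in silico" step is, at heart, an overlap-assembly problem. Below is a deliberately naive greedy sketch: real assemblers must handle sequencing errors, repeats and reverse complements, and the short reads here are invented for illustration.

```python
def overlap(a, b, min_len=3):
    """Length of the longest suffix of a that is a prefix of b."""
    for n in range(min(len(a), len(b)), min_len - 1, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def assemble(fragments):
    """Greedy merge: repeatedly join the pair with the largest overlap."""
    frags = list(fragments)
    while len(frags) > 1:
        best = (0, None, None)
        for i, a in enumerate(frags):
            for j, b in enumerate(frags):
                if i != j:
                    n = overlap(a, b)
                    if n > best[0]:
                        best = (n, i, j)
        n, i, j = best
        if n == 0:
            break  # no overlaps left; remaining fragments are disjoint
        merged = frags[i] + frags[j][n:]
        frags = [f for k, f in enumerate(frags) if k not in (i, j)] + [merged]
    return frags

# Hypothetical sequenced inserts with overlapping ends.
reads = ["ATGGCGT", "GCGTACGA", "ACGATTAC"]
print(assemble(reads))   # ['ATGGCGTACGATTAC']
```

The three overlapping reads collapse into a single contiguous sequence, which is a miniature version of stitching sequenced BAC inserts back into a genome.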

Wednesday, September 19, 2007

Bats and Humans share a common Gene for communication


Finding: The FOXP2 gene, found in both humans and bats, underlies the evolution of human language and of bat echolocation.

Discoveries that mutations in the FOXP2 gene lead to speech defects, and that the gene underwent changes around the time language evolved, both implicate FOXP2 in the evolution of human language.

Little genetic variation exists among most vertebrates
Recently, patterns of gene expression in birds, humans and rodents have suggested a wider role in the production of vocalisations. But many reports have established that FOXP2 shows very little genetic variation across even distantly related vertebrates, from reptiles to mammals, providing few extra clues as to the gene's role.

Genetic Variation does exist with echo locating bats
A new study, undertaken by a joint team of British and Chinese scientists, has found that this gene shows unparalleled variation in echolocating bats. The study reports that FOXP2 sequence differences among bat lineages correspond well to contrasting forms of echolocation.

Bats, like people, need coordination of mouth and face for communication
Like speech, bat echolocation involves producing complex vocal signals via sophisticated coordination of the mouth and face. The involvement of FOXP2 in the evolution of echolocation adds weighty support to the theory that FOXP2 functions in the sensory-motor coordination of vocalisations.

Tuesday, September 18, 2007

New Method Can Reveal Ancestry Of All Genes Across Many Different Genomes

Finding: A new method has been developed that can reveal the ancestry of all genes across many different genomes. First applied to 17 species of fungi, the approach has unearthed some surprising clues about why new genes pop up in the first place and the biological nips and tucks that bolster their survival.

The problem
The wheels of evolution turn on genetic innovation as new genes with new functions appear, allowing organisms to grow and adapt in new ways. But deciphering the history of how and when various genes appeared, for any organism, has been a difficult and largely intractable task. Having the ability to trace the history of genes on a genomic scale opens the doors to a vast array of interesting and largely unexplored scientific questions. Although the principles laid out in the study pertain to fungi, they could have relevance to a variety of other species as well.

What we know
It has been recognized for decades that new genes first arise as carbon copies of existing genes. It is thought that this replication allows one of the gene copies to persist normally, while giving the other the freedom to acquire novel biological functions. Though the importance of this so-called gene duplication process is well appreciated (it is the grist for the mill of evolutionary change), the actual mechanics have remained murky, in part because scientists have lacked the tools to study it systematically.

Genes from 17 different species
Driven by the recent explosion of whole genome sequence data, the authors of the new study were able to assemble a natural history of more than 100,000 genes belonging to a group of fungi known as the Ascomycota. From this, the researchers gained a detailed view of gene duplication across the genomes of 17 different species of fungi, including the laboratory model Saccharomyces cerevisiae, commonly known as baker's yeast.

Methodology - Synergy
The basis for the work comes from a new method termed "SYNERGY", which first author Ilan Wapinski and his coworkers developed to help them reconstruct the ancestry of each fungal gene.

By tracing a gene's lineage through various species, the method helps determine in which species the gene first arose, and if -- and in what species -- it became duplicated or even lost altogether. SYNERGY draws its strength from the use of multiple types of data, including the evolutionary or "phylogenetic" tree that depicts how species are related to each other, and the DNA sequences and relative positions of genes along the genome.

Perhaps most importantly, the method does not tackle the problem of gene origins in one fell swoop, as has typically been done, but rather breaks it into discrete, more manageable bits. Instead of treating all species at once, SYNERGY first focuses on a pair of the most recently evolved species -- those at the outer branches of the tree -- and works, two-by-two, toward the more ancestral species that comprise the roots.
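In spirit, that bottom-up, two-by-two traversal can be sketched as follows. This is a drastic simplification of SYNERGY: gene families are matched by name rather than by sequence and genome position, and the ancestral-state rule is a crude parsimony guess. The species names are real Ascomycota fungi, but the gene-family names and copy numbers are invented.

```python
# Each species maps gene-family name -> copy number (numbers invented).
species_genes = {
    "S. cerevisiae": {"HXT": 2, "GAL": 1},
    "S. paradoxus":  {"HXT": 2, "GAL": 1},
    "C. albicans":   {"HXT": 1, "GAL": 1},
    "Y. lipolytica": {"HXT": 1},
}

# Phylogeny as nested pairs: the most recently diverged species pair up first.
tree = (("S. cerevisiae", "S. paradoxus"), ("C. albicans", "Y. lipolytica"))

def reconstruct(node):
    """Work from the leaves toward the root, two branches at a time."""
    if isinstance(node, str):                 # leaf: an observed genome
        return dict(species_genes[node])
    left, right = (reconstruct(child) for child in node)
    ancestor = {}
    for fam in sorted(set(left) | set(right)):
        l, r = left.get(fam, 0), right.get(fam, 0)
        if l and r:
            ancestor[fam] = min(l, r)   # extra copies: later duplications
        else:
            ancestor[fam] = max(l, r)   # assume present, then lost once
        if l != r:
            print(f"{fam}: copy number differs between branches ({l} vs {r})")
    return ancestor

root = reconstruct(tree)
print("inferred root genome:", root)
```

Even this cartoon version shows the payoff of the pairwise strategy: each duplication or loss is localized to a branch of the tree instead of being untangled across all 17 genomes at once.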

From this analysis scientists were able to identify a set of core principles that govern gene duplication in fungi. The findings begin to paint a picture of how new genes are groomed over hundreds of millions of years of evolution.

Monday, September 17, 2007

History of Bacteria evolution

Finding: The evolutionary history of bacteria has been worked out.

Unlike other organisms, which tend to pass their genes on to the next generation of their own species, bacteria often exchange genetic material with totally unrelated species, a process called lateral gene transfer.

Researchers once doubted that they could work out the evolutionary history of bacteria. But now, thanks to the availability of sequenced genomes for groups of related bacteria and a new analytical approach, researchers have been able to demonstrate that constructing a bacterial family tree is indeed possible.

Here is how they do it
Scientists propose an approach that begins by scouring genomes for a set of genes that serve as reliable indicators of bacterial evolution. This method has important implications for biologists studying the evolutionary history of organisms by establishing a foundation for charting the evolutionary events, such as lateral gene transfer, that shape the structure and substance of genomes.

In this study, the researchers chose the ancient bacterial group called gamma Proteobacteria, an ecologically diverse group (including Escherichia coli and Salmonella species) with the most documented cases of lateral gene transfer and the highest number of species with sequenced genomes.

Why use bacteria at all?
Bacteria promise to reveal a wealth of information about genomic evolution, because so many clusters of related bacterial genomes have been sequenced--allowing for broad comparative analysis among species--and because their genomes are small and compact.

The results support the ability of their method to reconstruct the important evolutionary events affecting genomes. Their approach promises to elucidate not only the evolution of bacterial genomes but also the diversification of bacterial species, events that have occurred over the course of about a billion years of evolution.

Sunday, September 16, 2007

Human - Chimpanzee split occurred 5-7 million years ago

Finding: New research indicates that the split between chimpanzees and humans occurred 5 to 7 million years ago. This narrows the estimate, which previously spanned a 10-million-year range (3 to 13 million years ago), down to a window of just 2 million years.

How it was done:
Scientists analyzed the largest data set yet of genes that code for proteins and also used an improved computational approach that they developed, which takes into account more of the variability (or statistical error) in the data than any previous study. Gene studies are needed to address this problem because the interpretation of the earliest fossils of humans at the ape/human boundary is controversial and because almost no fossils of chimpanzees have been discovered.

The science team examined 167 different gene sequence sets from humans, chimpanzees, macaques, and mice.

No previous study has taken into account all of the error involved in estimating time with the molecular-clock method. The new statistical technique is a multifactor bootstrap-resampling approach.

Nucleotide arrangement
The scientists estimated the time of divergence between species by studying the sequential arrangement of nucleotides that make up the chain-like DNA molecules of each species. The number of mutations in the DNA sequence of a species, compared with other species, is a gauge of its rate of evolutionary change.

Calibration - comparing the rate of one species with that of another
The minimum time of divergence
By calibrating this rate with the known time of divergence of a species on another branch of the tree-like diagram that shows relationships among species, scientists can estimate the time when the species they are studying evolved. In this case, the calibration time the scientists used was the split of Old World monkeys -- including baboons, macaques, and others -- from the branch of the phylogenetic tree that led to humans and apes, which fossil studies have shown is at least 24 million years ago. Using this calibration time, the team estimated that the human-chimp divergence occurred at least 5 million years ago, proportionally about one-fifth of the calibration time.

Other supporting evidence
The maximum time of divergence
This time is consistent with the findings of several research groups that have used the molecular-clock method to estimate the split of humans and chimpanzees since the first attempt in 1967. But this is only a minimum estimate, because it was based on a minimum calibration time. To obtain a maximum limit on the human-chimp divergence, the team used as a calibration point the maximum estimate, based on fossil studies, of the divergence of Old-World monkeys and the branch leading to humans -- 35 million years ago. Calculations using this date yielded a time for the human-chimp split of approximately 7 million years ago, which again was proportionally about one-fifth of the calibration time.
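The calibration arithmetic itself is simple proportionality, sketched below. The one-fifth ratio is the article's approximate figure for the human-chimp genetic distance relative to the Old World monkey calibration point.

```python
# Molecular-clock calibration sketch (illustrative round numbers).
# The human-chimp genetic distance is roughly one-fifth of the distance
# to the Old World monkey calibration point.
proportion = 1 / 5

min_calibration = 24e6   # minimum Old World monkey split from fossils, years
max_calibration = 35e6   # maximum fossil-based estimate, years

min_split = proportion * min_calibration
max_split = proportion * max_calibration
print(f"human-chimp split: {min_split / 1e6:.1f} to {max_split / 1e6:.1f} Myr ago")
```

Plugging in the two calibration points gives roughly 4.8 and 7.0 million years, matching the article's bracket of at least 5 to at most about 7 million years ago.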

What else can be gathered from knowing the origin of the divergence?
Besides telling us when we diverged, a fact worth knowing in its own right, this divergence time also has considerable importance because it is used to establish how fast genes mutate in humans and to date the historical spread of our species around the globe.

Knowing the timescale of human evolution, and how we changed through time in relation to our environment, could provide valuable clues for understanding the evolution of intelligent life.

This research does not pinpoint the precise time of the split, but it tells us that proportional differences on branches in family trees should be considered when proposing new times. For example, we now know that a 10-to-12-million-year human-chimp split would imply a divergence of Old World monkeys from our lineage that is too old (50 to 60 million years ago) to reconcile with the current fossil record of primates.

What then is the next step?
Although some additional improvement is possible by including more genes and more species, the greatest opportunity now for further narrowing this estimate of 5-to-7-million years will be the discovery of new fossils and the improvement in geologic dating of existing fossils.