Looking Back in Evolutionary Time

Researchers in Japan are studying how one monkey manages to see in the dark. Their work might change our entire perception of how the primate family tree evolved.

Most primate researchers agree that the common ancestor of today’s monkeys was nocturnal. Strangely, most modern primates are active during the day and have poor night vision. Azara’s owl monkey, which is active at night and has good night vision, is the notable exception.

This exception has led to a controversy in the field. Primate researchers cannot agree on whether all monkeys lost the ability to see at night before the ancestors of the owl monkey regained it, or whether the owl monkey’s ancestors were simply lucky and never lost their night vision in the first place.

Enter Akihiko Koga and his team at Kyoto University in Japan. The Koga group is a team of geneticists who think that the owl monkey is in the process of regaining the night vision that was lost by the early ancestors of today’s monkeys.

But peering back into time is hard to do. So Koga and his team needed a way to determine whether the owl monkeys are evolving better night vision or are already optimized for night sight.

They relied on a finding from another group showing that the rods–the light-sensing cells of the eye that work in low light–of nocturnal animals package their DNA in a different way than most cells. Most cells keep gene-rich DNA in the center of the nucleus. This sets up a little hotspot where cellular machinery can get in and turn on genes. But the rods of nocturnal animals pack their gene-rich DNA around the edges of the nucleus. This DNA packaging pattern seems to make it easier for light to pass through the rod cell in nocturnal animals.

When Koga and his group looked at the rods of the owl monkey eye, they found that the DNA is packed in a manner that is halfway between that of nocturnal and daytime animals. This suggests that owl monkeys have decent night vision, but they probably are not the best at seeing in the dark.

The team still wanted to know if the owl monkey was on the evolutionary path to regaining night vision or if it had been stuck that way since the early nocturnal ancestor of primates. They found that a piece of repetitive DNA that is usually packed into the center of rod nuclei in nocturnal animals has been expanding like a virus into new locations throughout the owl monkey genome. This could explain how the DNA packing has been changing over evolutionary time in the owl monkey nuclei. Most importantly, it could mean that the monkeys are regaining the night vision that was lost by their ancestors. Maybe they are still evolving toward better night vision.

We are all just monkeys with bad night vision. It is interesting to think that we have some DNA lurking in us–because we share most of our DNA with our primate cousins–that has been co-opted in the owl monkey to bring back night vision. More importantly, this is a new way that biologists have been able to go against the grain of time and peer backward in evolution.

Image: Aotus azarae by Rich Hoyer, Flickr


Resistant Germs Beat Antibiotics to the Punch

Microbiologists reported that bacterial resistance to the antibiotic methicillin predates the use of the antibiotic in patients. This finding underscores the need for entirely new types of antibiotics.

Methicillin and its parent antibiotic, penicillin, belong to a group of antibiotics that share a characteristic molecular structure. Methicillin was once widely used to treat Staphylococcus aureus infections, but resistance has since arisen, giving us methicillin-resistant S. aureus (MRSA).

Researchers at the University of St Andrews School of Medicine in the United Kingdom wanted to figure out when S. aureus became MRSA. They sequenced the genomes of 209 frozen samples of MRSA collected from patients going back to the 1960s. This allowed them to analyze minor genetic differences between the strains that researchers in the 1960s could only dream of detecting.

The researchers found that strains from the 1960s contained genes that made them resistant to methicillin. Since methicillin was first used in patients in 1959, resistance would have had to develop soon afterward.

However, the team thought that methicillin resistance might have arisen earlier. When they compared the earlier MRSA strains, they found genetic differences between them that would have taken evolutionary time to develop. The simplest way to explain these differences was to conclude that the early MRSA strains shared a resistant ancestor that lived years before the introduction of methicillin.

Their estimates put the emergence of MRSA not long after the first use of penicillin, which debuted in the 1940s. The team concluded that widespread penicillin use drove the emergence of both penicillin and methicillin resistance at around the same time, owing to the antibiotics’ similar molecular structures.

Today, tinkering with the same general structure to develop new antibiotics is a common practice. Studies like this highlight how we need to get more creative to keep our upper hand over germs.

Image: S. aureus, from the CDC Public Health Image Library

Single Cells Seize the Means of (Food) Production

(Dinophysis acuminata. Photo credit: fjouenne, http://planktonnet.awi.de/)

Scientists report that an organism eats chloroplasts for breakfast…but does not digest them until it has had the chance to use them first. The phenomenon might explain how the ancestor of all plant cells came to be over a billion years ago.

The study, published in the journal PLoS One, looked at single-celled organisms of the genus Dinophysis. Dinophysis species eat other single-celled organisms that contain chloroplasts, the green-pigment-containing organelles that enable plants to make their own food from sunlight.

Usually, the story would end here; most organisms just digest what they eat, and that is the end of it. But researchers in the past noticed that sometimes Dinophysis does not immediately digest chloroplasts. Instead, it keeps them around.

This was an interesting observation because of an idea that is popular in biology that I will call the “endosymbiotic hypothesis.” The hypothesis is a proposed explanation for why complex cells–like human and plant cells–contain chloroplasts and/or mitochondria, while bacterial cells do not contain them.

The endosymbiotic hypothesis starts from the premise that chloroplasts and mitochondria–which make food and energy, respectively–look very much like bacteria. If you compare them under a microscope, they are around the same size. They grow and divide within our cells independently. They even contain their own DNA genomes that look like bacterial genomes.

The proposed conclusion is that simple cells that lacked chloroplasts and mitochondria could have eaten the bacterial ancestors of chloroplasts and mitochondria. Instead of digesting them, though, they could have just saved them to use as little energy factories.

This is a nice hypothesis, but to really buy it we would want to see single-celled organisms that can eat other organisms and then save their chloroplasts. So back to Dinophysis. Other scientists had already shown that Dinophysis can steal chloroplasts from its prey, but the team from Denmark wanted to track the chloroplasts inside Dinophysis cells over a long period of time.

So they designed an experiment: they grew Dinophysis in sea water, feeding the cells chloroplast-containing microorganisms under plenty of light. At a certain point, the researchers stopped the flow of microorganism prey but left the lights on. They kept this up for many days and tracked the growth of the Dinophysis. Each time the number of Dinophysis in the sea water doubled, the scientists took a sample, examined it under a microscope, and noted the number and size of the stolen chloroplasts remaining.

Keep in mind, the Dinophysis ate these chloroplasts. Based on that, you would expect them to be digested before long. But we already knew that Dinophysis can keep chloroplasts around for a while. The researchers were surprised, though: the chloroplasts were not just sticking around; they were growing and dividing in the Dinophysis! It was as though the chloroplasts had taken up residence in the Dinophysis and made it look like a plant cell.

Importantly, this is what we would expect to have happened at some point early in the evolution of complex cells. Remember, the endosymbiotic hypothesis predicts that there was once a single-celled organism that gobbled up the ancestors of chloroplasts and kept them around for a while to exploit them for a food source. The fact that we can see something like this happening in Dinophysis–and that the chloroplasts are even able to take up residence in their new host and start growing and dividing–adds evidence in favor of the endosymbiotic hypothesis.

This study does leave some unanswered questions, though. While the researchers showed convincing evidence that Dinophysis is not digesting the chloroplasts and that the chloroplasts are in fact dividing over the course of weeks within their new hosts, they did not show how this benefits the Dinophysis. The endosymbiotic hypothesis predicts that the Dinophysis would gain a source of food from the new chloroplasts, but evidence of this will have to come later.

In the meantime, this study could offer an interesting look back in time. We will never be able to turn the clock back on evolution, but experiments like this allow us to see how it may have proceeded all of those eons ago.

How the Eel Crossed the Atlantic

Anyone who has let their cat roam free over the neighborhood knows how keen a cat’s sense of direction can be. After all, they always find their way home–or so the saying goes. The European eel can do something quite similar, but its neighborhood stretches across the entire North Atlantic Ocean. Scientists are finally beginning to understand how these eels coordinate such a fantastic feat of navigation.

Scientists have long marveled over the young eels’ ability to find their way from their birthplace in the Sargasso Sea–off the coast of the southern United States–to the regions on the other side of the Atlantic where they spend most of their adult lives. They began to suspect that the eels, like plenty of other animals, use the earth’s magnetic field to navigate.

The earth’s churning hot metal core creates a magnetic field that surrounds our planet, providing practical benefits like protecting us from solar wind and allowing us to use a compass to figure out which direction is north. Some animals can even tap into this magnetic field to navigate, since each spot on earth’s surface is exposed to a different magnetic field due to its unique location between the earth’s poles.

Which leads us back to the eels. According to results published in the journal Current Biology, a group of researchers wanted to know how the eels would swim when exposed to different magnetic fields. To test this, they grew the eels in large tanks and exposed them to magnetic fields characteristic of different geographic locations that the eels could occupy during their life cycle. Based on the results recorded in the tank, the scientists could infer how the eels would swim in the open ocean at the spot with the corresponding natural magnetic field.

Interestingly, this experiment showed that the magnetic fields would sweep the eels right into the Gulf Stream. We have all heard about the Gulf Stream, especially on the Weather Channel. What matters here, though, is that the Gulf Stream is part of a system of ocean currents that circulates around the North Atlantic in a clockwise direction.

The researchers concluded that the magnetic field would lead the young eels to the Gulf Stream, then the clockwise gyre of the Gulf Stream would lead them from the Sargasso Sea to Europe to spend their adult lives. Later in their lives, when it is time to return to the Sargasso to spawn, the eels would use the magnetic map to find the Gulf Stream again to return to where they started.

The researchers noted that this is the first evidence that eels use a magnetic map to coordinate their massive migrations, adding another species to the list of animals finding their way around earth with their very own internal compass.

Is the CRISPR Craze a Rerun?

Some years ago there was a basic science discovery that took the biomedical field by storm. Scientists working in a model organism had found a way to selectively target nucleic acids in the cell, shutting down gene expression. There was a ton of hype over the next several years, with everyone imagining the therapies that would start to help patients in no time.

You might think that I am talking about CRISPR; everyone else is, after all. But I am talking about RNAi, which was once touted as the discovery that would revolutionize medicine forever. A colleague of mine, a bigwig in the CRISPR field, was speculating about the future of his field when he said something that shocked me at first. He suggested that CRISPR will not be the revolutionary clinical discovery that some people expect it to be. When I pressed him, he compared it to the hype behind RNAi a decade ago. Given this perspective, a couple of questions started to float around in my head. How similar was the hype behind RNAi to that of CRISPR/Cas9 today? Could CRISPR lead to the same letdown?

I did not know much about the RNAi craze–I know RNAi as a handy lab technique, but I never thought of it as a viable clinical treatment–so I went back and did some Googles. RNAi, which stands for “RNA interference,” is a set of cellular systems that cut up RNA and use the pieces to target and attack matching RNA transcripts in the cell. This turns down the expression of certain genes, which can be an effective way of doing genetic experiments in the lab.

It did not take much imagination to dream of how RNAi could be useful in treating human disease. Since plenty of diseases are due to the expression of disease-causing genes, doctors could treat the disease by giving the patient a drug to mobilize the RNAi system against the disease gene.

But in practice, RNAi ended up being difficult to use in patients. Hopes for RNAi therapy peaked during the mid-2000s and started to ebb over the next few years, after human trials showed no real benefit to patients or triggered unintended immune responses.

Some people were afraid that RNAi would never live up to its promise. Biotechnology companies shuttered their RNAi research divisions. Human trials slowed down. Luckily, things did bounce back. There are still companies working on RNAi therapies today. It would seem that RNAi was over-hyped, nearly crashed, and then became what it was always going to be: a therapy with some promise, but no miracle.

Today, CRISPR is just as hyped as RNAi was back then…if not more. CRISPR genome editing has become a staple of popular science. In many ways, the lay public believes that this will be the century of biology: we will crack the mysteries of aging, we will edit human embryos to eliminate genetic disorders, we will cure all of the diseases. CRISPR genome editing is at the center of these hopes. But there are lessons to learn from the original breakthrough to end all breakthroughs. RNAi was not a complete failure, but we were certainly naive about its potential.

Part of what we got wrong was being unrealistic about the limitations of RNAi technology. Living cells mount strong negative reactions to double-stranded RNA, a necessary intermediate in the RNAi pathway. Delivery systems proved hard to engineer, echoing the problems that still plague gene therapy. Finally, there is something that RNAi and CRISPR have in common: off-target effects.

Both RNAi and CRISPR depend on nucleic acids lining up and binding to each other in a pairwise manner before they can have their effect. Since the RNA sequences that bind to targets in RNAi and CRISPR are short and there is quite a bit of nucleic acid sequence in the cell, there is a possibility that you will get your molecule pairing up with an unintended target. It is like taking a short sentence fragment at random from a book and then searching the book for that fragment. You can find the target that you are looking for, but you might also find other perfect or near perfect matches elsewhere in the book, especially when you are searching through a large, complex book.
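The book-search analogy can be made concrete with a short sketch. The snippet below scans a made-up "genome" for a short guide sequence, counting perfect matches and near-matches (one letter off); the near-matches stand in for off-target sites. All sequences here are invented for illustration, and the guide is deliberately short so that chance matches actually turn up, just as a short sentence fragment turns up repeatedly in a long book.

```python
import random

random.seed(1)

# A made-up "genome": a long random string of DNA letters.
genome = "".join(random.choice("ACGT") for _ in range(100_000))

# A short guide sequence, copied out of the genome itself so it has
# at least one intended, perfect-match target.
guide = genome[5_000:5_008]  # 8 letters (real guides are longer)

def scan(genome, guide, max_mismatches):
    """Count windows of the genome that match the guide with at most
    max_mismatches differing letters (a crude model of off-target binding)."""
    k = len(guide)
    hits = 0
    for i in range(len(genome) - k + 1):
        mismatches = sum(a != b for a, b in zip(genome[i:i + k], guide))
        if mismatches <= max_mismatches:
            hits += 1
    return hits

perfect = scan(genome, guide, 0)         # the intended target (plus chance duplicates)
near = scan(genome, guide, 1) - perfect  # one-letter-off sites: potential off-targets

print(f"perfect matches: {perfect}, near matches: {near}")
```

Even with a tiny toy genome, the near-match count dwarfs the single intended target, which is the crux of the off-target problem: the longer and more complex the "book," the more spurious matches a short query collects.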

When these off-target effects happen with RNAi, you could shut down the expression of another gene. If that other gene is important, you risk harming the cell. The same thing can happen with CRISPR. In fact, CRISPR has the potential for more dire off-target effects: CRISPR changes the DNA archive, rather than the RNA copy, which can lead to irreversible changes to the cell.

Luckily, it does seem like CRISPR researchers have taken this to heart. Research into CRISPR’s off-target potential is an active field. I even blogged about a system that might be able to fine-tune the activity of CRISPR/Cas9 with the goal of reducing off-target effects in CRISPR therapy.

To be fair, CRISPR is at least a decade away from the clinic. But there are reasons to be concerned. Scientists have edited human embryos, and ethicists are scrambling to come up with rules to inform how we use this technology. If we learned anything from the RNAi experience, we should carry it over to CRISPR/Cas9. These systems seem to break out onto the scene with a ton of potential and bold claims. Eventually, we might be disappointed. There might be CRISPR trials somewhere down the road that will have to stop, with patients who thought they might be helped instead left wondering what all the hype was about. But if we have learned anything, it is that these systems will change our world. We will end up better off because of CRISPR. We just have to be willing to take the time to figure it out first.

Did Viruses Teach Us Sex?

Sex is a weird thing. At its core, the process involves a cell from one organism meeting up with a cell from another. These two cells have to become one when they collide, and they do this by fusing. New evidence suggests that the molecules responsible for this fusion might have come from viruses.

A new paper in the journal Current Biology reports that proteins called fusogens in a single-celled organism are remarkably similar to a group of proteins produced by several types of viruses. Both types of fusogens are responsible for cell fusion in their respective organism or virus. In the single-celled organism, called Tetrahymena, fusogens dot the outside of the cell and allow cells to fuse in a primitive version of sex. Viruses, on the other hand, use fusogens to invade their cellular hosts.

Sex in Tetrahymena — Image: Jmf368w (CC BY-SA 4.0 via Wikimedia Commons)

The researchers involved in the present study were surprised when they saw just how similar Tetrahymena and viral fusogens look. Since proteins are just long strings of amino acids folded into complex shapes, we can represent a protein as a string of letters, much as is done with DNA sequences. Then we can use a computer program to align the protein strings by similarity. If two protein sequences align with a great deal of similarity–such that there is relatively little difference between their amino acid sequences–it is often inferred that the proteins share a recent ancestor in evolutionary time. This is because evolutionary change happens at the DNA level, and changes in DNA ultimately determine changes in the protein. When the viral and Tetrahymena fusogens were aligned, they appeared to be closely related based on similarity.
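A toy version of this string-comparison idea can be sketched with Python's standard library. The amino acid sequences below are invented for illustration (real analyses use dedicated alignment tools like BLAST, with proper substitution scoring, not `difflib`), but the logic is the same: score how much of two letter-strings can be lined up, and treat a high score as a hint of shared ancestry.

```python
from difflib import SequenceMatcher

# Toy amino-acid sequences, invented for this sketch. protein_b differs
# from protein_a by a few substitutions; "unrelated" is an arbitrary string.
protein_a = "MKTLLVAGGFDEWSARHQNPLKIV"
protein_b = "MKTLIVAGGYDEWSSRHQNPLKIV"
unrelated = "GGSAQPRTNWEYHHKCFCDQEWTA"

def similarity(p, q):
    """Fraction of characters that line up under difflib's
    longest-matching-block alignment (1.0 means identical)."""
    return SequenceMatcher(None, p, q).ratio()

print(f"a vs b:         {similarity(protein_a, protein_b):.2f}")
print(f"a vs unrelated: {similarity(protein_a, unrelated):.2f}")
```

The closely related pair scores far higher than the unrelated pair, which is the kind of signal that, in the real study, supported a shared origin for the viral and Tetrahymena fusogens.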

Not only did viral and Tetrahymena fusogens look strikingly similar, but the researchers were also able to show that they behaved quite similarly in a test tube. Specifically, they were similar in how they interacted with the chemicals that make up the exteriors of cells. The researchers concluded from this that both the structure and function of fusogens are conserved between viruses and Tetrahymena. When we see conservation of structure and function in biology, it usually suggests that the structures share an evolutionary origin.

So could viruses have passed sex on to us by leaving behind fusogens in our ancestors’ cells? Maybe, but the team that wrote this paper is not sure, and even admits that it might have happened the other way around. The bottom line is that we will need more evidence to know for sure, but this is certainly good circumstantial evidence that sexually reproducing organisms might owe a debt of gratitude to our infectious viral frenemies.

Sequencing the World

It looks like the beginnings of a consortium are taking shape, with the goal of sequencing all life on earth. As something of a genomicist, I am psyched by the goal, unattainable as it may be. I also want to explain why lofty goals are helpful, and why this one will be too.

The Human Genome Project took years to finish, and ended up costing about a dollar per base-pair–the chemical “letters” that make up the genetic code. Since then, sequencing has become orders of magnitude cheaper. The current genome sequencing leader, Illumina, famously announced that a genome could be sequenced for a thousand dollars. Compared to the investment required for the human sequence, we have certainly made strides.

This is due to the technology we use to sequence genomes. The most popular approach today is to take a sample of DNA from an organism–DNA that naturally exists in long stretches called chromosomes–and break it into short fragments. Since the sample contains many copies of the DNA, we end up with more than one copy of each letter of the genome. Using the powerful sequencers we have developed, we can read a little bit of each of these fragments before using a computer program to assemble the short reads into a contiguous sequence. If you can imagine taking a few hundred copies of “Moby Dick,” randomly cutting out stretches of letters, and then trying to reassemble the book by looking for overlap between random fragments, then you understand the basic strategy of genome sequencing today.
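The Moby Dick analogy maps directly onto a toy assembler. The sketch below shreds a tiny, made-up "genome" into random overlapping reads, then greedily merges the pair of sequences with the largest suffix-prefix overlap until no solid overlaps remain. This is a deliberately naive sketch: real assemblers use far more sophisticated graph-based methods, and all the sizes and thresholds here are arbitrary choices for the demo.

```python
import random

random.seed(0)

# A made-up genome, and a pile of random overlapping reads sampled from it
# (the "shotgun" step).
genome = "".join(random.choice("ACGT") for _ in range(200))
reads = []
for _ in range(60):
    start = random.randrange(len(genome) - 40)
    reads.append(genome[start:start + 40])
reads = sorted(set(reads))  # drop exact duplicates; sort for determinism

def overlap(a, b):
    """Length of the longest suffix of a that is also a prefix of b."""
    for n in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:n]):
            return n
    return 0

# Greedy assembly: repeatedly merge the pair of sequences with the biggest
# overlap, requiring a solid minimum so chance matches don't mislead us.
MIN_OVERLAP = 15
contigs = reads[:]
while True:
    best = (0, None, None)
    for i, a in enumerate(contigs):
        for j, b in enumerate(contigs):
            if i != j:
                n = overlap(a, b)
                if n > best[0]:
                    best = (n, i, j)
    n, i, j = best
    if n < MIN_OVERLAP:
        break
    merged = contigs[i] + contigs[j][n:]
    contigs = [c for k, c in enumerate(contigs) if k not in (i, j)] + [merged]

longest = max(contigs, key=len)
print(f"{len(contigs)} contig(s); longest spans {len(longest)} of {len(genome)} letters")
```

With enough reads covering each position several times over, the greedy merges stitch most of the genome back together; the leftover, unmerged pieces are the "contigs" and gaps that the draft-versus-gold-standard discussion below is about.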

In spite of the cutting-edge technology, it still takes a ton of work to go from a draft genome assembly–which is what you could immediately get after putting a thousand dollars into an Illumina machine and plugging the resulting reads into an assembly program–to the kind of gold-standard genome assemblies that we have for well-studied organisms like mice and humans. Typically, more work has to be put in to fill gaps in the assembly caused by highly repetitive DNA, which confounds assemblers. Scientists sometimes have to do follow-up experiments to prove that their genome assembly is real and not just a computer error. Finally, the genome sequence is of little use until you start to figure out where the genes and other features lie. This means more follow-up experiments and comparisons of the genome to those of other related organisms.

All of this takes a significant investment of time and treasure, and there is no way that we could do it for all life on earth. You would never be able to produce a gold-standard genome assembly for every organism on the planet. Much like the oft-told anecdote about restaurants in New York City–where it is said that you could never eat at every restaurant in the city because new ones are opening and going out of business faster than you could visit them all–new organisms are evolving and going extinct all the time. The idea of producing something as polished as the fruit fly genome, let alone the mouse or human genome, for every species is laughable if you start to think about it. But even rough draft genomes would allow researchers to gain an appreciation for the diversity of life on earth, specifically at the DNA level. Just having fractions of the genomes of most of the species on earth would help us better understand the evolutionary relationships among all life on earth.

As for this goal being a little too big to handle, big goals are important because they push us to new heights. Getting to the moon seemed ridiculous at the time, and sequencing the human genome seemed impossible when we first started to plan how to do it. These goals ended up being attainable, but just imagine if they had not been. Even if we had never made it to the moon, we would still have developed the kind of technology that allows us to put satellites into orbit to power our ubiquitous mobile devices. Even if the human genome had proved intractable, we would still have ended up with improved sequencing technology. Setting lofty goals pushes us to achieve things we would never have thought to attempt otherwise. If we set out to sequence all life on earth, just imagine what we might find we can do along the way.

*I found a post by professor/blogger Jeff Ollerton, who had his own take on the proposal. While he and I do not agree, his take is an interesting one that I enjoyed reading. It should also be said that he has more expertise in this area than I do.