Tuesday, 26 July 2011

Induced pluripotent stem cells

It was one of the top science stories of 2007: number 2 on Science's list – reprogramming ordinary adult body cells (of mice and humans) to act like embryonic stem cells.
The riddle of Dolly the Sheep has puzzled biologists for more than a decade: What is it about the oocyte that rejuvenates the nucleus of a differentiated cell, prompting the genome to return to the embryonic state and form a new individual? This year, scientists came closer to solving that riddle. In a series of papers, researchers showed that by adding just a handful of genes to skin cells, they could reprogram those cells to look and act like embryonic stem (ES) cells.
The story really began in October 2006, when a team at Kyoto University in Japan, led by Shinya Yamanaka, announced that they had reprogrammed mouse skin cells into cells that closely resembled embryonic stem cells, judging by the characteristic genes the cells expressed. The reprogramming was done by introducing genes for four important stem cell transcription factors – Oct3/4 (two names for the same factor), Sox2, c-Myc, and Klf4 – into the skin cells with the help of a genetically engineered retrovirus.

But the team could not at that time demonstrate that these reprogrammed cells would differentiate into a variety of adult cells after having been introduced into a mouse embryo which then developed into an adult mouse. Being able to do this would verify the pluripotency of the reprogrammed cells. (Pluripotency is the ability of a cell to develop into any type of fetal or adult cell. It is characteristic of embryonic stem cells.) The reprogrammed cells are called induced pluripotent stem (iPS) cells.
However, in June 2007 Yamanaka's team, along with two others, reported that they had been able to provide the missing demonstration of pluripotency. The second team that joined in reporting this accomplishment was led by Rudolf Jaenisch at MIT's Whitehead Institute for Biomedical Research. The third team was led jointly by Konrad Hochedlinger of the Harvard Stem Cell Institute and Kathrin Plath of the UCLA Institute for Stem Cell Biology and Medicine.

Recently (mid-February), Kathrin Plath's team at UCLA also announced success in reprogramming human skin cells, using the same techniques as previously reported. They have also verified that the induced pluripotent cells are very similar to embryonic stem cells:
Human Skin Cells Reprogrammed Into Embryonic Stem Cells
The reprogrammed cells were not just functionally identical to embryonic stem cells. They also had identical biological structure, expressed the same genes and could be coaxed into giving rise to the same cell types as human embryonic stem cells.
As we've noted, there are some potential problems with this work. First, any process that activates c-Myc (directly or indirectly) risks promoting cancerous tumors. Second, the processes have used retroviruses to introduce the necessary genetic material into the cells to be reprogrammed, which also carries a risk of inducing cancer.
So Konrad Hochedlinger's team has come along with work in mice to reduce or remove these cancer-causing risks:
Discovery Could Help Reprogram Adult Cells To Embryonic Stem Cell-like State
Harvard Stem Cell Institute (HSCI) and Massachusetts General Hospital (MGH) researchers have taken a major step toward eventually being able to reprogram adult cells to an embryonic stem cell-like state without the use of viruses or cancer-causing genes.
In a paper released online today by the journal Cell Stem Cell, Konrad Hochedlinger and colleagues report that they have discovered how long adult cells need to be exposed to reprogramming factors before they convert to an embryonic-like state, and have “defined the sequence of events that occur during reprogramming.”
This work on adult mouse skin cells should help researchers narrow the field of candidate chemicals and proteins that might be used to safely turn these processes on and off. This is particularly important because at this stage in the study of these induced pluripotent stem (iPS) cells, researchers are using cancer-causing genes to initiate the process, and are using retroviruses, which can activate cancer genes, to insert the genes into the target cells. As long as the work involves the use of either oncogenes or retroviruses, it would not be possible to use these converted cells in patients.
And hard on their heels, other teams are announcing similar findings:
Stem cell breakthrough may reduce cancer risk
The main obstacle to using "reprogrammed" human stem cells – the danger that they might turn cancerous – has been solved, claims a US company.
PrimeGen, based in Irvine, California, says that its scientists have converted specialised adult human cells back to a seemingly embryonic state – using methods that are much less likely to trigger cancer than those deployed previously.
The company also claims to be able to produce reprogrammed cells faster and much more efficiently than other scientists.

Saturday, March 15, 2008

Alternative energy sources

The outlook on energy alternatives to fossil fuels is looking a little bleak.
There have been several recent studies or reports casting significant doubt on the economic and/or environmental viability, at least for the near and intermediate future, of some of the leading contenders to supplant fossil fuels.
First up: nuclear power. Of course, environmentalists and others have had grave doubts about nuclear for decades, because of problems with safe disposal of spent nuclear fuel and the dangers of diversion of enriched uranium to manufacture of weapons. On top of that, there is the argument that replacing generation of power from burning fossil fuels with generation from nuclear sources may well contribute more to release of CO2 into the atmosphere than continuing to use fossil fuels. This comes about because so much power (generated from burning of fossil fuels) will need to be expended simply to build from scratch many new nuclear power plants and sharply increase the mining and purification of uranium:
Nuclear Power Not Efficient Enough To Replace Fossil Fuels, Study Finds
Nuclear energy production must increase by more than 10 percent each year from 2010 to 2050 to meet all future energy demands and replace fossil fuels, but this is an unsustainable prospect. According to a report published in Inderscience's International Journal of Nuclear Governance, Economy and Ecology, such a large growth rate will require a major improvement in nuclear power efficiency; otherwise each new power plant will simply cannibalize the energy produced by earlier nuclear power plants.
Here's another way to look at this. If you consider just the marginal costs of producing a kWh of energy from nuclear fuel vs. fossil fuel – counting (if you can) both direct economic costs and costs due to release of CO2 into the atmosphere – nuclear energy might be superior. However, if you also consider the capital expense (both direct and indirect) required to build enough new nuclear facilities to replace existing conventional facilities and also meet increased demand, then (according to the study) nuclear loses.
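Just to get a feel for what that growth rate implies, consider the compounding. Here's a quick back-of-the-envelope calculation (my own, not from the study):

```python
# Back-of-the-envelope: what sustained 10% annual growth in nuclear
# energy production implies over 2010-2050 (rate taken from the quote).
years = 2050 - 2010     # 40 years of growth
rate = 0.10             # 10% per year

growth_factor = (1 + rate) ** years
print(f"Output multiple after {years} years: {growth_factor:.0f}x")  # ~45x
# Roughly 45 times today's nuclear output -- thousands of new plants,
# each of whose construction consumes (mostly fossil-fueled) energy
# before it ever generates any.
```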
So what about using other energy sources as alternatives to fossil fuels, in order to significantly reduce dependency on fossil fuels and release of CO2? Like hydrogen, for example. Of course, this depends on further developing a lot of technology that's either not cost-competitive yet (fuel cells) or not even available yet (practical and safe means of storing and transporting hydrogen). To say nothing of the capital costs (as above) needed to build hydrogen infrastructure if and when the technology is available.
Even if technology can solve the difficult problems of storing and transporting hydrogen, there's another fundamental problem. Hydrogen itself is more of a form of energy suitable for transport and storage than it is a readily available source of energy (like sunlight or fossil fuels) that can be acquired or extracted (relatively) cheaply. There's no hydrogen just sitting around (like natural gas) waiting to be mined and distributed. Energy has to be consumed in order to separate hydrogen from oxygen, which together make up H2O. This energy has to come from some other source, as input to the electrical/chemical process that separates out hydrogen (or recombines it to make another fuel such as methane). This energy is regained later – but always with some percentage loss – when hydrogen is chemically recombined with oxygen (as in a fuel cell).
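A rough way to see the size of those losses is just to multiply the efficiencies of the conversion steps. The figures below are ballpark assumptions on my part (real values vary a lot by technology), not measurements:

```python
# Rough round-trip efficiency of hydrogen as an energy carrier.
# Every figure here is an assumed ballpark value, not a measurement.
electrolysis = 0.70   # electricity -> hydrogen (assumed)
handling     = 0.90   # compression/storage/transport overhead (assumed)
fuel_cell    = 0.50   # hydrogen -> electricity (assumed)

round_trip = electrolysis * handling * fuel_cell
print(f"Energy recovered per unit invested: {round_trip:.0%}")  # ~32%
# Roughly two-thirds of the input energy is lost along the way -- which
# is why hydrogen is a (lossy) storage medium, not an energy source.
```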
There really isn't any energy advantage to hydrogen at all, except for the (presumed) advantage over batteries in storage and transport. Of course, energy in a storable form is required for use in vehicles like cars and airplanes, in spite of the unavoidable losses along the way. The following essay goes into all of this in more detail.
The Hydrogen Economy
Skeptics scoff at perpetual motion, free energy, and cold fusion, but what about energy from hydrogen? Before we invest trillions of dollars in a hydrogen economy, we should examine the science and pseudoscience behind the hydrogen hype.
There are some problems with the essay. First, one does not "make" hydrogen. It is extracted from chemical compounds like water, hydrocarbons (fossil fuels except coal), or biomass (carbohydrates, cellulose, etc.). Energy has to be input to the process in order to break the chemical bonds between hydrogen and other elements (carbon or oxygen). You get the energy back out when hydrogen recombines with oxygen or carbon (in a fuel cell, combustion chamber, etc.) – but always at some loss.
Second, the essay mostly assumes hydrogen will be stored and transported in liquid form, which is difficult and expensive, since liquid hydrogen boils at an ultracold -253°C. There is some hope that technology can be developed to store gaseous hydrogen in exotic solid materials at reasonable temperatures and pressures. (Recent examples: here, here.) However, at this point that's still conjectural. The larger point is that a practical "hydrogen economy" is still, at best, not in the near future.
So hydrogen is not an energy source, and it is even very problematical as a way to store energy in a portable form for use in cars and airplanes. Fortunately, there are other ways to make energy portable, such as batteries. A Toyota Prius uses nickel metal hydride batteries to store energy from the regenerative braking system, and it seems to be an economically successful product. Lithium ion batteries, such as are used in laptop computers, have a higher energy density than the nickel metal hydride type. They have problems of their own, but significant improvements are being made. (See here, here, here.)
That still leaves the problem of developing additional actual sources of energy, that are alternatives to fossil fuels. Ethanol (grain alcohol) is getting a lot of publicity these days. It's politically popular with the agricultural industry, for obvious reasons. Ethanol partially solves one problem with fossil hydrocarbon fuels – by removing some dependence on politically unstable areas as a fuel source. But ethanol does nothing for the problem of CO2 emissions.
And it creates serious problems of its own, such as driving up the cost of agricultural products needed to feed people. Further, as with hydrogen, it takes a lot of energy to extract ethanol (or other energy carriers such as other biofuels or methane) from agricultural crops or biomass. Critiques of ethanol and other biofuels are not new, though they don't seem to get the attention they deserve.
Other alternatives? There's always solar (photovoltaic) energy. Of the new but currently available alternatives to fossil fuels (whether oil, natural gas, or coal), solar seems to be the most economical, especially taking reduced CO2 emissions into account.
But of course, solar has its problems too. These include capital costs for building infrastructure to capture solar energy and to store it (for peak or nighttime use) or transmit it from the sunniest areas with low land prices. It's these capital costs (initial construction and eventual replacement) that hurt, since the marginal cost of each kWh is almost nil.
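A crude way to see why the capital costs dominate is to amortize the up-front cost of a system over the energy it produces in its lifetime. All the numbers in this sketch are illustrative assumptions, not data:

```python
# Crude amortized cost per kWh of a photovoltaic system.
# Every number below is an illustrative assumption.
capital_cost    = 8000.0   # installed cost of a 1 kW system, dollars (assumed)
rated_kw        = 1.0      # nameplate capacity, kW
capacity_factor = 0.18     # average output as fraction of nameplate (assumed)
lifetime_years  = 25       # service life before replacement (assumed)

lifetime_kwh = rated_kw * capacity_factor * lifetime_years * 365 * 24
print(f"Lifetime output: {lifetime_kwh:,.0f} kWh")                # ~39,000 kWh
print(f"Amortized cost: ${capital_cost / lifetime_kwh:.2f}/kWh")  # ~$0.20/kWh
# Nearly all of that cost is up-front construction (and eventual
# replacement); the marginal cost of each additional kWh is ~0.
```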
However, making detailed economic comparisons with traditional energy sources is rather difficult, as this study argues: Cloudy Outlook For Solar Panels: Costs Substantially Eclipse Benefits.
It would seem that the real difficulty of economic analysis lies in predicting the future costs of conventional energy sources – fossil fuels, especially oil. Some of the problems:

• How to estimate costs associated with CO2 emissions, given that the idea of global warming itself is so controversial (especially in the minds of economists and political officials, if not atmospheric scientists). To say nothing of estimating social costs of conjectural side effects, such as sea level rise, serious water shortages, detrimental impact on human and animal health, impact on agricultural production, etc.
• How to estimate the foreseeable rise in price of fossil fuels (especially oil) due to political instability, rising extraction costs (deep ocean sources), depletion of supplies, and rapid increase in demand from developing parts of the world. (There are large uncertainties in all of these factors, and some cost has to be allocated to this uncertainty itself.)
• How to handle the issue of proper pricing for energy at times of peak demand, as opposed to off-hours. (The report just mentioned discusses this.)

At present, the cost of solar energy, taking into account such things as installation costs, depreciation, etc., might well be two to four times the cost of energy from fossil fuels. But at least the cost of solar is pretty certain to decline, while the cost of energy from fossil fuels can only increase – and at a worrisomely unpredictable rate, in view of the uncertainties just listed.


 

Embryonic stem cells and Klf4

There's now some additional information on one of the transcription factors written about here, which are able to reprogram adult skin cells into cells much like embryonic stem cells. To review, one of the teams responsible for this research used Oct3/4, Sox2, c-Myc, and Klf4 for the reprogramming, while another team used Oct3/4, Sox2, Nanog, and Lin28.
Of the transcription factors in the first list, all but Klf4 have been well-studied. So it is of some interest to know more about Klf4, and why it seems to be somewhat less essential than the others.
Some of the interesting details are reported on here: Molecular Alliance That Sustains Embryonic Stem Cell State Identified.
Klf4 is normally active in real embryonic stem cells. To investigate the role Klf4 might be playing in the reprogramming of skin cells, the researchers investigated embryonic stem cells that had been artificially depleted of Klf4. To their surprise, the team found that the cells maintained their pluripotency.
The question then was how to explain this. It turned out that two closely related transcription factors – Klf2 and Klf5 – took over the role of Klf4:
"Most important, the data showed that the other Klfs were bound to the target sites when one of them was depleted." said Dr. Ng. "These Krüppel-like factors form a very powerful alliance that work together on regulating common targets. The impact of losing one of them is masked by the other two sibling molecules."
This family of transcription factors, the Krüppel-like factors, gets its name from its homology to the Drosophila Krüppel protein. Members of this family have been studied for their roles in cell proliferation, differentiation and survival, especially in the context of cancer.
Interestingly enough, according to the research press release,
Klfs were found to regulate the Nanog gene and other key genes that must be active for ES cells to be pluripotent, or capable of differentiating into virtually any type of cells. Nanog gene is one of the key pluripotency genes in ES cells.
"We suggest that Nanog and other genes are key effectors for the biological functions of the Klfs in ES cells," Dr. Ng said.
"Together, our study provides new insight into how the core Klf circuitry integrates into the Nanog transcriptional network to specify gene expression unique to ES cells.

Nanog, of course, is one of the transcription factors in the alternative reprogramming set – the one that was used, in place of the set containing Klf4, to reprogram adult cells.
The Nanog protein, too, is known to be critically important in pluripotent stem cells. It is a homeobox transcription factor that appears to play an essential role in self-renewal of undifferentiated embryonic stem cells. It also appears to be connected with cancer, because (according to Wikipedia) "It has been shown that the tumour suppressor p53 binds to the promoter of NANOG and suppresses its expression after DNA damage in mouse embryonic stem cells. p53 can thus induce differentiation of embryonic stem cells into other cell types which undergo efficient p53-dependent cell-cycle arrest and apoptosis."
The connection of Klf proteins with cancer is not only through Nanog. According to Wikipedia, "Klf4 also interacts with the p300/CBP transcription co-activators." The closely-related p300 and CBP "interact with numerous transcription factors and act to increase the expression of their target genes." And they too are involved with cancer:
Mutations in the p300 gene have been identified in several other types of cancer. These mutations are somatic, which means they are acquired during a person's lifetime and are present only in certain cells. Somatic mutations in the p300 gene have been found in a small number of solid tumors, including cancers of the colon and rectum, stomach, breast and pancreas. Studies suggest that p300 mutations may also play a role in the development of some prostate cancers, and could help predict whether these tumors will increase in size or spread to other parts of the body. In cancer cells, p300 mutations prevent the gene from producing any functional protein. Without p300, cells cannot effectively restrain growth and division, which can allow cancerous tumors to form.
The p300 protein also figures in efforts to keep stem cells pluripotent, via the Wnt signaling pathway. From a report on a small molecule called IQ-1:

What IQ-1 does, Kahn explains, is to block one arm of a cell-signaling pathway called the Wnt pathway, while enhancing the signal coming from the other arm of the Wnt pathway. The Wnt pathway is known to have dichotomous effects on stem cells, i.e. both proliferative and differentiative. More specifically, IQ-1 blocks the coactivator p300 from interacting with the protein β-catenin; this prevents the stem cells from being 'told' to differentiate into a more specific cell type.

Cheating

It's certainly appropriate – as well as hilarious – to draw the analogy between humans and slime molds. Kurt Vonnegut, Samuel Clemens, and H. L. Mencken would approve. But there's serious truth in it:
Some cheaters can keep it in their genes
A new study examining social behaviour suggests certain individuals are genetically programmed to cheat and often will do – providing they can get away with it.
The researchers looked at slime moulds - microscopic single-cell organisms or amoebae that are forced to cooperate with one another when food is in short supply. Studying slime moulds at the cellular level provides the scientists with a unique insight into the genes that may also influence human behaviour.
The international team, including biologists from The University of Manchester, found that some amoebae have the ability to use cheating tactics to give them a better chance of survival. The research - published in the journal Nature - not only demonstrates that cheating is a natural phenomenon governed by our genes but that it may be widespread among social creatures.

This is familiar territory. I wrote about it here, where the subject (among other things) was the evolutionary origins of altruism and cooperation. One needs to read that (or be familiar with the viewpoint of evolutionary psychology on the origins of morality and ethics) in order to see how the following speculations fit in.
Apparently, in many social species, there is a tendency for populations to evolve with an equilibrium mixture of cheaters and non-cheaters ("altruists"). Although cooperation increases the probability of group survival, some individuals in any group can gain an advantage by cheating, so they will tend to persist in groups as time goes on. But they can't become too numerous without harming the group's survival. So eventually some equilibrium is reached.
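A toy simulation shows how such an equilibrium can arise. This is my own illustrative sketch, not the model from the Nature paper: cheaters get a fixed individual advantage, but as they become common, degraded group survival eats into that advantage:

```python
# Toy replicator model of a cheater/altruist equilibrium.
# My own illustration -- not the model from the slime mould study.
s = 0.10   # individual payoff advantage of cheating (assumed)
d = 0.40   # group-survival penalty, scaling with cheater frequency (assumed)

x = 0.01   # initial fraction of cheaters in the population
for generation in range(500):
    w_cheat = 1.0 + s - d * x           # cheating pays less as x rises
    w_coop  = 1.0                       # altruist baseline fitness
    mean_w  = x * w_cheat + (1 - x) * w_coop
    x = x * w_cheat / mean_w            # standard replicator update

print(f"Equilibrium cheater fraction: {x:.2f}")   # -> s/d = 0.25
# Cheaters invade when rare, but stabilize at the frequency where the
# group-level cost exactly cancels the individual advantage.
```

Raise the penalty d – a harsher environment, or warfare – and the equilibrium fraction of cheaters falls, which is the intuition behind what follows.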
In the simulation of intergroup warfare I discussed in my earlier post, it was the warfare which worked against survival, so that under such conditions, there were pressures against a large equilibrium fraction of cheaters. These pressures were manifested in such things as religion and moral/ethical codes of behavior, together with formalized punishment of cheaters.
But warfare isn't the only factor that can put pressure on group survival. Simply living in a hostile or marginal environment can do it. This seems to be what happens with slime molds. Individuals can be, well, individualists until there is an existential threat.
One wonders whether this isn't what happened to the Neanderthals. Their environment was harsh. They must have migrated to that environment during favorable conditions (otherwise, why stay?), but eventually conditions got worse. If they were not able to evolve (biologically and/or socially) fast enough to reduce the percentage of cheaters, it's reasonable to suppose all would die. Modern humans living around the same time in similar environments – and who survived – perhaps were able to evolve faster. Or else they already had better capabilities for intragroup cooperation to deter cheating: things like brain mechanisms for cheater detection, a "theory of mind", and ethical reasoning.
Other considerations suggest that worsening environmental conditions lead to more intergroup warfare (if population density is high enough that there is competition for resources, not merely a struggle to survive, as on an island without competing groups). Such warfare would also promote cooperation and intragroup altruism over cheating.
What kind of cooperation is helpful in the non-warfare scenario? Sharing of resources (food, shelter, tools, clothing, etc.). Also communal support for raising orphaned children. Groups that had such customs and low proportions of cheaters would be more likely to survive at all.
Incidentally, one of the principal investigators (Chris Thompson) in the slime mold study seems to know his subject pretty well. Here's another item about his slime mold research.

A couple of things about memory

Children's Memory May Be More Reliable Than Adults' In Court Cases
Researchers Valerie Reyna, human development professor, and Chuck Brainerd, human development and law school professor – both from Cornell University – argue that like the two-headed Roman god Janus, memory is of two minds – that is, memories are captured and recorded separately and differently in two distinct parts of the mind.
They say children depend more heavily on a part of the mind that records "what actually happened," while adults depend more on another part of the mind that records "the meaning of what happened." As a result, they say, adults are more susceptible to false memories, which can be extremely problematic in court cases.
The implications of these results for legal testimony are not what I find especially interesting here. In fact, there are reasons why the testimony of children has sometimes been found to be less reliable than that of adults: namely, in some cases, the techniques used to interview the children (before trial) have been improperly coercive or suggestive of particular interpretations.
What does seem interesting is the hypothesis that in adults memories of the same event tend to be stored in two distinct forms: literal details of "what happened", and interpretive judgments about the "meaning" of an event. But that in children it is primarily the actual details that are stored.
Reyna and Brainerd's Fuzzy Trace Theory hypothesizes that people store two types of experience records or memories: verbatim traces and gist traces.
Verbatim traces are memories of what actually happened. Gist traces are based on a person's understanding of what happened, or what the event meant to him or her. Gist traces stimulate false memories because they store impressions of what an event meant, which can be inconsistent with what actually happened.
The researchers have experimental evidence to support their conclusions. Some of this is noted in earlier accounts, such as this:
Children Less Prone To False Memories, Implications For Eyewitness Testimony, Study Shows
In a study published in the May issue of Psychological Science, Brainerd and Reyna presented a list of words for groups of first, fifth and ninth graders. Many of the words from this "study list" were related to each other (by belonging to certain categories such as animals, furniture, men's names) while others were unrelated "filler" words.
After a short break, the students were presented with a new "test list" composed of study list words, new words belonging to the aforementioned categories (animals, furniture, etc.), and distracter words that were new and entirely unrelated to the categories or the study list. Their task was to identify whether they had previously heard a word or not.
As predicted, if the test list provided a new word with a closely related meaning (a "semantic relation") to a word from the study list, older children were more likely to assert that they had heard it before. Simply put, the older children had more false memories in this case than younger children.
One can speculate about what's going on here. As people mature through childhood, they are constantly learning about the interrelationship of isolated details and events. (For instance, "Dad acts more scary after he's been drinking beer.") In addition, the accumulation of details makes more literal forms of memory cumbersome (and liable to confusion), so people learn to make abstractions and interpretations that summarize details and make storage easier by associating similar details in more general categories. However, this kind of fuzzy storage (or "fuzzy traces" as Brainerd and Reyna call it) can misrepresent the facts. (For instance, "Dad was drunk when he hit me" – which might not actually be true.)
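Here's a toy version of that idea in code – my own sketch, not Brainerd and Reyna's model. A recognizer accepts a test word if its verbatim trace matches, or, with some probability representing reliance on gist, if the word merely fits a stored meaning category; increasing gist reliance reproduces the rise in false alarms on related lures:

```python
# Toy sketch of Fuzzy Trace Theory: verbatim vs. gist recognition.
# My own illustration, not Brainerd & Reyna's actual model or data.
import random
random.seed(1)

study = {("animal", w) for w in ["dog", "cat", "horse", "cow"]} | \
        {("furniture", w) for w in ["chair", "table", "sofa"]}
gist = {category for category, _ in study}     # meaning-level traces

# Related-but-unstudied lures, like the "semantic relation" test words.
lures = [("animal", "pig"), ("animal", "goat"),
         ("furniture", "bed"), ("furniture", "desk")]

def says_old(category, word, p_gist):
    """Accept on an exact (verbatim) match, or -- with probability
    p_gist -- on a mere meaning (gist) match."""
    if (category, word) in study:
        return True
    return category in gist and random.random() < p_gist

trials = 10_000
for label, p_gist in [("younger child", 0.2), ("adult", 0.7)]:
    hits = sum(says_old(c, w, p_gist)
               for _ in range(trials) for c, w in lures)
    print(f"{label}: false-alarm rate on related lures = "
          f"{hits / (trials * len(lures)):.2f}")
# Unrelated distracters (e.g. ("tool", "hammer")) fail both tests
# regardless of age, matching the experimental pattern.
```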
Second, and not directly related to this, there are two quite independent studies that show something about the relationship between memory and the experience of stress.
The first item concerns observation of squirrels:
Correct Levels Of Stress Hormones Boost Learning, Squirrel Study Suggests
                                            Tests on the influence that a stress-related hormone has on learning in ground squirrels could have an impact on understanding how it influences human learning, according to a University of Chicago researcher.
Jill Mateo, Assistant Professor in Comparative Human Development, has found that when they perform normal survival tasks, ground squirrels learn more quickly if they have a modest amount of cortisol, a hormone produced in response to stress, than those with either high or low levels of cortisol.
In humans, cortisol production is also related to stress and is known to have an impact on learning, but that impact is not well understood, Mateo said.
This should sound familiar to anyone who's been through even a few moderately difficult college courses. Namely, if the work in a particular course isn't difficult enough to cause at least a little stress, retention of the details may not be very complete. Without some stress, the material just doesn't seem "important" enough, even if it's new to the student, to compel the student's attention to the details and the complexity. But of course, if the material is difficult enough to cause excessive stress, anxiety can get in the way of successfully organizing the material in the student's mind.
The second study looked at the actual neurobiology of learning under conditions of acute stress:
Short-term Stress Can Affect Learning And Memory
Short-term stress lasting as little as a few hours can impair brain-cell communication in areas associated with learning and memory, University of California, Irvine researchers have found.
It has been known that severe stress lasting weeks or months can impair cell communication in the brain's learning and memory region, but this study provides the first evidence that short-term stress has the same effect.
As it turns out, another stress-related hormone besides cortisol is involved, corticotropin releasing hormone (CRH), and the latter is more significant under conditions of acute stress:
In their study, Baram and her UC Irvine colleagues identified a novel process by which stress caused these effects. They found that rather than involving the widely known stress hormone cortisol, which circulates throughout the body, acute stress activated selective molecules called corticotropin releasing hormones, which disrupted the process by which the brain collects and stores memories.
Learning and memory take place at synapses, which are junctions through which brain cells communicate. These synapses reside on specialized branchlike protrusions on neurons called dendritic spines.
In rat and mouse studies, Baram's group saw that the release of CRH in the hippocampus, the brain's primary learning and memory center, led to the rapid disintegration of these dendritic spines, which in turn limited the ability of synapses to collect and store memories.
The researchers discovered that blocking the CRH molecules' interaction with their receptor molecules eliminated stress damage to dendritic spines in the hippocampal cells involved with learning and memory.
The role of cortisol, in learning under conditions of moderate stress, remains somewhat less clear. In addition to the squirrel study, anecdotal experience with so-called "flashbulb memories" supports the idea that some degree of stress can assist the formation of memories. The Wikipedia article states, without references, "Some biologists believe that the hormone cortisol, which is released in response to stressful incidents, cooperate with epinephrine (adrenaline) to cause the formation of flashbulb memories by the brain, functioning to help remembering things to avoid in the future." The squirrel study suggests cortisol actually has some role in memory formation, rather than being just a coincidental byproduct of stress. (See also the article on Emotion and memory.)



Memory and BDNF

Following up on part of this note, where I discussed relationships between memory and stress: it turns out that some interesting related findings involve a "neurotrophic factor" called BDNF.
In fact, there's quite a lot to say. Let's begin with an explanation of some terms, a little about BDNF, and a look at some research from the past several years on the relationship between BDNF and memory. Later we'll take up more on how stress and depression enter the picture.
A neurotrophic factor is a growth factor that acts specifically on neurons, promoting their survival – which is in general a pretty good thing. (We'll get to examples in a moment.) The neurotrophins are one well-studied family of neurotrophic factors.
More generally, a growth factor is a protein that signals certain types of cells to survive, differentiate, or grow. A growth factor that helps a cell survive does so by inhibiting programmed cell death. Other growth factors promote cell division, which results in growth of the tissue that contains the affected cells. Yet other growth factors may induce cells to differentiate into cells of a more specialized type.
An important example of a general growth factor is IGF-1, also known as "insulin-like growth factor 1", which we'll be looking at more extensively in upcoming posts.
In this post we're going to consider the specific neurotrophic factor known as BDNF, the brain-derived neurotrophic factor.
Research has shown that BDNF plays a role in memory formation and in the connection between stress and depression. For example, in rats the stress hormone corticosterone seems to decrease the expression of BDNF, and if stress is persistent, this eventually leads to the atrophy of the hippocampus. Since the hippocampus plays an important role in long term memory, this is one way in which stress can negatively impact memory.
Atrophy of the hippocampus has also been found in humans suffering from chronic depression. There is evidence that suggests a deficiency of BDNF may be at least in part implicated in such depression. For example, various factors (such as the neurotransmitter glutamate, exercise, calorie restriction, and antidepressant drugs) are known to stimulate expression of BDNF – and often ameliorate depression as well.
There's a lot of science behind all this. Let's just look at a few research announcements from the past several years to get a feel for the interactions of BDNF and memory.
Key Pathway In Synaptic Plasticity Discovered
The researchers studied a major developmental event in newborn rodents: a rapid increase in synapse strength and visual circuit refinement that occurs soon after the animal's eyes first open. It was already known that the PSD-95 protein rushes to visual system synapses soon after eye opening. PSD-95 is a scaffold protein that anchors several types of receptors. Some of these receptors are for the neurotransmitter glutamate, and there is also the TrkB receptor for BDNF (and other neurotrophins).
A positive feedback loop is initiated, in which the NMDA glutamate receptor activates BDNF. BDNF then triggers a signaling pathway involving the kinases PI3K and Akt. This pathway leads to more PSD-95 production, completing the loop. The net result is to make synapses more responsive to BDNF, followed by production of additional PSD-95. Once this loop is started at just a few of a neuron's synapses, the rush of PSD-95 to other excitatory synapses of the neuron is on. In this way a few very active synapses can prime larger regions of a neuron for long-term synaptic strengthening in response to subsequent stimulation in the newborn animal.
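To see why a loop like that can act as a switch, here's a toy dynamical sketch – my own illustration with invented parameters, not a model from the paper. Production of PSD-95 feeds back on itself (via BDNF responsiveness), so a brief stimulus can tip a synapse into a persistently strengthened state:

```python
# Toy bistable switch standing in for the PSD-95/BDNF feedback loop.
# Illustration only: the parameters are invented, not from the paper.

def step(p, stimulus, dt=0.01):
    # Production has a cooperative feedback term (more PSD-95 ->
    # more BDNF responsiveness -> more PSD-95); decay is plain turnover.
    production = stimulus + p**2 / (0.25 + p**2)
    decay = 0.9 * p
    return p + dt * (production - decay)

p = 0.0                                  # synapse starts unstrengthened
for t in range(60_000):
    stimulus = 0.2 if 5_000 <= t < 10_000 else 0.0   # brief input burst
    p = step(p, stimulus)

print(f"PSD-95 level long after the stimulus ended: {p:.2f}")   # ~0.80
# The loop has latched: a transient burst of activity leaves the
# synapse in a stable, strengthened state.
```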
Proteins Necessary For Brain Development Found To Be Critical For Long-term Memory
This research indicates that BDNF, which is crucial for the growth of brain cells during development, is also equally important for the formation of long-term memories. The study was performed on the common marine snail Aplysia. When the snails are electrically shocked, the neurotransmitter serotonin is released and promotes the formation of long-term memories associated with the shocks. But when the researchers blocked interaction between BDNF and its TrkB receptor, long-term memories did not form, even though serotonin was still released at synapses. This indicates that serotonin alone was not sufficient for long-term memory formation. Short-term memory formation was not affected. Further investigation showed that interfering with the BDNF receptors blocked long-term enhancement of the connections between the brain cells in the reflex circuit normally induced by the shock treatment.
Drug Triggers Body's Mechanism To Reverse Aging Effect On Memory Process
A class of drugs known as "ampakines" (so-called because they target AMPA receptors) has been under study and development since the early 1990s to deal with neurological conditions, such as schizophrenia, problems of attention span and alertness, and memory impairment associated with dementia and Alzheimer's disease. The research reported here was conducted by a team that included Gary Lynch, who has long been associated with investigation of the biological bases of learning and memory. (See here for more about Lynch and long-term memory.)
In this study, rats were treated for four days with an ampakine drug. Of particular interest was the effect of the drug on the hippocampus of the brain, because of its known importance in the formation of long-term memories. In the hippocampus areas of rats treated with the drug, it was found that (compared to controls) there was a significant increase both of levels of BDNF and of long-term potentiation (LTP) of synapses (an indicator of memory formation). Further, even though the drug had a known half-life of only 15 minutes, elevated levels of BDNF and LTP were observed as long as 18 hours after drug administration was stopped.
Tiny RNA Molecules Fine-tune The Brain's Synapses
Synapses between two neurons are formed between locations at the tip of an axon of one neuron (the "presynaptic" neuron) and a dendrite on the body of another neuron (the "postsynaptic" neuron). In order to form a complete synapse, it is necessary for there to be protrusions called "dendritic spines" on dendrites of the postsynaptic neuron. In the process of synaptic signaling, it is these spines that absorb neurotransmitter molecules released by the axon of the presynaptic neuron. Consequently, any mechanism that affects the density of spines on dendrites will affect the total number of synapses that can form between neurons.
It had previously been established that BDNF activates a protein kinase called Limk1, which in turn promotes the growth of dendritic spines and hence the ability of synapses to form. This research on rats studied the effect of the microRNA miR-134 on growth of dendritic spines of hippocampal neurons. It was found that when neurons were exposed to miR-134, spine volume significantly decreased, and synapses weakened. Conversely, when miR-134 was inhibited, spines increased in size, strengthening synapses. However, increased levels of BDNF negated the effects of miR-134, indicating that miR-134 achieved its effect by suppressing Limk1.

More about alternative energy

About a month ago, I wrote about the shortcomings of various alternative energy sources. That was mainly about a variety of problems with nuclear energy, solar energy (photovoltaics), and hydrogen.
I didn't even get into the subject of biofuels, but I should have, because the problems in that area are becoming painfully obvious.
Ordinarily I would not expect to find much significant reporting on a scientific/technical subject in Time magazine, especially something that challenges "conventional wisdom". But via DarkSyde at Kos I see there's an interesting article on the problems of "biofuel": The Clean Energy Scam
Several new studies show the biofuel boom is doing exactly the opposite of what its proponents intended: it's dramatically accelerating global warming, imperiling the planet in the name of saving it. Corn ethanol, always environmentally suspect, turns out to be environmentally disastrous. Even cellulosic ethanol made from switchgrass, which has been promoted by eco-activists and eco-investors as well as by President Bush as the fuel of the future, looks less green than oil-derived gasoline.
Meanwhile, by diverting grain and oilseed crops from dinner plates to fuel tanks, biofuels are jacking up world food prices and endangering the hungry.
The Time article focuses on the loss of rainforest, and consequently the loss of its ability to soak up and sequester CO2. When the forest is gone, CO2 will still be incorporated in biomass (crops of some sort). But then that is converted to biofuel, and released back into the atmosphere when it's burned. (To say nothing of the energy that's just wasted along with release of CO2 when the forest biomass is burned to clear it away.) Given all the energy that has to be expended to grow and harvest biofuel crops, with resulting additional release of CO2, we are worse off in terms of greenhouse gas emissions than if we just burned oil (or even coal).
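The arithmetic of that "carbon debt" can be sketched roughly. The numbers below are illustrative assumptions in the spirit of the studies Time cites, not figures taken from them:

```python
# Rough 'carbon debt' payback time for clearing forest to grow biofuel.
# Both numbers are illustrative assumptions, not from the cited studies.
carbon_released = 700.0   # tonnes CO2/hectare emitted by clearing (assumed)
annual_savings  = 8.0     # tonnes CO2/hectare/year saved by substituting
                          # the biofuel for fossil fuel (assumed, generously)

print(f"Years before net CO2 benefit: {carbon_released / annual_savings:.0f}")
# ~90 years -- until then the cleared land is a net CO2 source compared
# with simply burning the fossil fuel.
```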
But that's not the only serious problem. Crops that are grown to make fuel (from sugar cane, corn, switchgrass, or whatever) use land where food crops (for people and animals) could be grown instead, driving up the cost of food for everyone on the planet. (Have you checked the price of bread or eggs at the market recently?)
Economists have spoken out about this problem for several years, when the hype for biofuels and ethanol was just beginning to build. For instance, we have from Howard Simons in early 2006: Making Our Food Fuel Isn't the Answer
If high prices strengthen energy's claim on food supplies, governments everywhere will intervene on behalf of their hungry citizens. If low prices torpedo biofuels' economics, governments everywhere will respond with subsidies for these industries. Only an elimination of current mandates and subsidies today will avoid these problems tomorrow, but the likelihood of this happening is near zero. Somehow I believe we will rue the day when we decided to make food and fuel substitutes at the margin.
In early 2007 Paul Krugman picked up the story: The Sum of All Ears
There is a place for ethanol in the world’s energy future — but that place is in the tropics. Brazil has managed to replace a lot of its gasoline consumption with ethanol. But Brazil’s ethanol comes from sugar cane.
In the United States, ethanol comes overwhelmingly from corn, a much less suitable raw material. In fact, corn is such a poor source of ethanol that researchers at the University of Minnesota estimate that converting the entire U.S. corn crop — the sum of all our ears — into ethanol would replace only 12 percent of our gasoline consumption.
So ethanol doesn't even help the U. S. all that much in terms of dependence on foreign oil. And this February Krugman returned to the subject here, linking to this: Ethanol Demand in U.S. Adds to Food, Fertilizer Costs
About 33 percent of U.S. corn will be used for fuel during the next decade, up from 11 percent in 2002, the Agriculture Department estimates. Corn rose 20 percent to a record on the Chicago Board of Trade since Dec. 19, the day President George W. Bush signed a law requiring a fivefold jump in renewable fuels by 2022.
Increased demand for the grain helped boost food prices by 4.9 percent last year, the most since 1990, and will reduce global inventories of corn to the lowest in 24 years, government data show. While advocates say ethanol is cleaner than gasoline, a Princeton University study this month said it causes more environmental harm than fossil fuels.
And then last week Krugman had even more: Grains Gone Wild
The subsidized conversion of crops into fuel was supposed to promote energy independence and help limit global warming. But this promise was, as Time magazine bluntly put it, a “scam.”
This is especially true of corn ethanol: even on optimistic estimates, producing a gallon of ethanol from corn uses most of the energy the gallon contains. But it turns out that even seemingly “good” biofuel policies, like Brazil’s use of ethanol from sugar cane, accelerate the pace of climate change by promoting deforestation.
And meanwhile, land used to grow biofuel feedstock is land not available to grow food, so subsidies to biofuels are a major factor in the food crisis. You might put it this way: people are starving in Africa so that American politicians can court votes in farm states.
Here's a report of a scientific study on the issue: Some Biofuels Risk Biodiversity And Could End Up Harming Environment
Corn-based ethanol is currently the most widely used biofuel in the United States, but it is also the most environmentally damaging among crop-based energy sources.
Finally, to bring this back to a solid scientific foundation, Sean at Cosmic Variance reminds us that Energy Doesn’t Grow on Trees
In particular, biofuels (such as ethanol) and hydrogen are not actually sources of energy – given the vagaries of thermodynamics, it costs more energy to create them than we can get by actually using them, as there will inevitably be some waste heat and entropy produced.
Although all this bad news about just about every prospective near-term form of alternative energy is discouraging, there are a few other options that may become available in the slightly more distant future. There's the old perennial, controlled nuclear fusion. Even though work on that is more active than ever, it's still at least several decades away.
But there's another significant option that's often overlooked: solar power satellites. This technology uses very large arrays of photovoltaic panels high in orbit around the earth. The energy is beamed back to the ground in the form of microwaves. (So this should not be confused with simply using mirrors to redirect additional sunlight, which presents serious problems of its own.)
Solar power satellites also have many uncertainties and potential problems, but the largest is simply boosting enough of them into orbit, and maintaining them. A possible approach to those problems involves space elevators. But those, again, present a whole additional set of challenges.
For now, here are a couple of articles from last fall with more details:

Consciousness, free will, etc.

For quite a long time, the subject of "consciousness" was not considered especially suitable for scientific investigation. A tremendous amount has been written about it, of course. But most of that has been written by philosophers rather than working scientists. Quite a few, if not a substantial majority, of philosophers seem to think that consciousness is a "mystery" that isn't likely to be understood scientifically anytime soon, if ever.
Well, on one hand, when a subject is not very amenable to scientific investigation, it usually is philosophers, rather than scientists, who write and speak about it. Only when that situation changes – when actual experimental scientists are working on the problem, with the technology to conduct meaningful experiments – is it time to regard the subject as fair game for science.
At one time even physics was a subject discussed more by philosophers than by scientists. But that changed when people like Galileo and Newton came along, followed by many others. In regard to consciousness, perhaps the technology known as "functional MRI" may be the factor that tips the scales towards real science.
On the other hand, my general opinion of philosophers is well expressed by a quote attributed to one of the all time greatest scientists, C. F. Gauss: "When a philosopher says something that is true, then it is trivial. When he says something that is not trivial, then it is false."
However that may be, we are now in fact seeing real scientific work being done on the subject of consciousness. It's a vast field; I can only scratch the surface here, and I won't go into much detail about any of it. But perhaps a good example of relevant recent research on consciousness is this:
Brain scanner predicts your future moves
Long before you decided to read this story, your brain may have already said "click that link".
By scanning the brains of test subjects as they pressed one button or another – though not a computer mouse – researchers pinpointed a signal that divulged the decision about seven seconds before people ever realised their choice. The discovery has implications for mind-reading, and the nature of free will.

Seven seconds. Think about that for a bit. Where, exactly, is "free will", if the brain has already made, or almost made, a decision seven seconds before one is even conscious of a decision having been made?
Sure, one can argue that "free will" comes into play near the final stage of the process, when a course of action is consciously considered before commitment is made to it. Perhaps. But consider something like drug addiction. Just how much "free will" does an addict actually have to resist his/her cravings? Rather little, no? There's quite a bit of research into addiction now, and it is supporting the idea that addictive behavior is largely driven by brain chemistry. I could cite dozens of recent reports in this area, but I need to move on to other stuff right now. Maybe more on addiction later.
Oh yes, and there are a number of other brain states that appear to significantly affect behavior much more strongly than does conscious deliberation. States such as pain, extreme hunger or thirst, the experience of being tortured by agents of the U. S. government, etc.
There's another area of research that comes to mind in connection with significant unconscious influences on people's thoughts and behavior. Psychologists have a term for it: "priming". What it means is that one can manipulate, at least on a statistical basis, the behavior of another person, such as an experimental subject, a voter, or a "consumer", by subtly exposing the subject to various kinds of cues before eliciting the behavior to be manipulated. The subject may be consciously aware of the cues, but not of how they affect subsequent behavior.
Again, I could probably come up with many examples of studies on this if I had more time. Maybe later. Just two examples now, this story that appeared last year: Who’s Minding the Mind?. I wrote about that here, along with another, somewhat older story (A New Study Suggests A Relationship Between Fear Of Death And Political Preferences). Something to think about in this election year.
Moving right along, this all sets the stage for the following very recent report:
Blind to Change, Even as It Stares Us in the Face
The phenomenon that Dr. Wolfe’s Pop Art quiz exemplified is known as change blindness: the frequent inability of our visual system to detect alterations to something staring us straight in the face. The changes needn’t be as modest as a switching of paint chips. At the same meeting, held at the Italian Academy for Advanced Studies in America at Columbia University, the audience failed to notice entire stories disappearing from buildings, or the fact that one poor chicken in a field of dancing cartoon hens had suddenly exploded. In an interview, Dr. Wolfe also recalled a series of experiments in which pedestrians giving directions to a Cornell researcher posing as a lost tourist didn’t notice when, midway through the exchange, the sham tourist was replaced by another person altogether.
The article is a report by Natalie Angier on research into consciousness and attention by neuroscientist Jeremy Wolfe. I won't attempt to say much about it, except that it concerns the interplay of consciousness and attention, and how in some sense we are not really conscious of as much as we think we are. The sense data are streaming into our brains, and they do register at some level that appears to be conscious. But in fact, only the part of the information stream that is the focus of "attention" actually seems to matter. "Attention" is the brain's mechanism for limiting actual processing to just a part of the data stream that, somehow, seems most important.
I won't attempt to summarize more than that. Angier is a pretty good writer. Read the article for yourself. Actually, speaking of Angier, I should remark that I have expressed reservations before about her writing style (here). There I described her style as "too flowery and gaudy for my taste". The article mentioned above is fairly tame in that regard. But sometimes what she writes seems, to me anyhow, to be best described as somewhat twee.
Now, "twee" is a word you may not be familiar with if you have been exposed mainly to U. S. English (and you don't do crossword puzzles). It's a British word; a rough American equivalent might be "affected". However, I used "twee" to make a point: namely, that if you didn't know the meaning but looked it up, I'm pretty sure you wouldn't have much trouble remembering it. Why? Because I brought it to your attention.
So you see, although you were probably conscious of the word when you read it, having had your attention focused on it will tend to be the catalyst that gets it added to your memory.
The brain has other mechanisms for directing consciousness in certain ways and for raising the probability that certain kinds of data get remembered. Emotion is one psychological mechanism that has this effect. I touched on this tangentially when I wrote about stress and memory here. Also think about "flashbulb memories".
The point is that sensory data that is accompanied by significant emotional valence tends to be preferentially stored in memory. Various hormones and growth factors (such as BDNF) seem to play a role in this process. Consciousness per se... not so much. You have been conscious of quite a lot that you have already forgotten the next day (such as, perhaps, what you had for breakfast a day or two ago).
Here's another report of recent research that tends in the same direction. It concerns the role of emotion in the sense of smell, and which olfactory experiences tend to be stored in memory:
One Bad Experience Linked To Sniffing Out The Danger
                                             Each human nose encounters hundreds of thousands of scents in its daily travels perched front and center on our face. Some of these smells are nearly identical, so how do we learn to tell the critical ones apart?
Something bad has to happen. Then the nose becomes a very quick learner.
New research from Northwestern University's Feinberg School of Medicine shows a single negative experience linked to an odor rapidly teaches us to identify that odor and discriminate it from similar ones.

I don't think I need to comment further on that, with respect to how the emotional circuitry of the brain influences the memory storage circuitry.
Time to wrap up now. I'm going to indulge in a bit of anecdotal reporting on a few of my own observations on consciousness. This is just speculation on my part, not scientific data at all, of course. Think about your own experiences and see whether they aren't consistent with the possibility that consciousness is just another brain mechanism, which serves various purposes, and interacts in different ways with other brain mechanisms. If this is so, eventually we should be able to understand scientifically what the biology of consciousness is.
First off, I'll note that I recently had a colonoscopy. (Nothing bad found. Thanks for asking.) The most interesting part of the experience was how readily consciousness can be turned on and off. Before the procedure began, I was injected with two drugs: Demerol and Versed. The former blocks pain. The latter is a strong sedative that basically turns off consciousness for a brief, fairly predictable period of time.
For a couple of minutes after receiving the Versed, I didn't notice anything unusual, no drowsiness, no disorientation, nada. I just looked around and noticed various features of the immediate environment, such as the staff and the monitoring instruments. I can still picture that stuff fairly clearly. Of the next half hour or so while the procedure was going on, I can recall nothing at all. And then I was awake again, and everything seemed pretty normal. I can recall that period clearly too.
What I conclude from this is that consciousness and memory formation are both processes and/or mechanisms that can be turned off pretty mechanically with a simple chemical. Probably memory formation is turned off first, so that the intermediate state of drowsiness (if it even occurred) was not remembered. And then, as soon as the chemical has been metabolized away, consciousness and memory formation resume with little aftereffect. (Physicians maintain that a patient's judgment can be compromised for several hours afterwards. Perhaps so, but I could detect little evidence of that.) I was also told that patients under the effects of the drugs are still conscious enough to follow verbal instructions. But of course, I have no memory of whether this is true.
This experience was not like falling asleep normally, when there is usually a definite period of drowsiness that one can often remember the next day. Nevertheless, once asleep one is no longer conscious in the usual sense. Again this suggests that consciousness is just a mechanism that can be turned off (though not by means of volition, unfortunately) at appropriate times.
But we all realize that many mental processes do not cease when we are asleep. For instance, perceptual data is still coming in and being processed by the brain to some extent. Noises, especially, can and do wake us up. Then there is dreaming, which seems to engage large parts of the machinery of consciousness, including emotional subsystems (fear and pleasure) and the perceptual processing systems of vision and hearing – only operating on internally generated rather than external sensory data. Sometimes external perceptual data becomes part of the dream consciousness, but usually not.
Another interesting aspect of dreaming is how it interacts with memory. We all know how quickly, after we wake up, we forget what we may have been dreaming. This suggests that the short term memory system has been functioning, but loses its content more or less as usual, while intermediate and long term memory systems seem to be shut down. It's very unusual (in my experience) to remember any dreams several hours later, unless I happen to have thought of them immediately after waking up.
Only, this doesn't seem to be entirely true. Not infrequently I've had the experience, within a dream, of "remembering" details of being in places and situations, and in the company of people, that I've encountered in other dreams, perhaps not at all recently. So it seems that there is some sort of long term memory mechanism and storage capability that is specifically dedicated for use while dreaming. Or perhaps this is just an aspect of the déjà vu effect.
Another interesting question about the state of consciousness during sleep is whether, or to what extent, the brain continues to engage in creative problem solving. We all know the advice, when one has a difficult problem, to "sleep on it". This does seem to help a little, but probably it's more a case of letting sleep restore the freshness and alertness of one's mind. Ceasing to think about problems for a while (whether hours or days) often has similar benefits. I'd say that I experience creative insights into problems rather more often in some place like the shower or out on a walk than I do right after awakening. So it doesn't seem to me that a great deal of actual ratiocination is going on during sleep. At least in my experience.
One last sort of observation, concerning how attention facilitates memory. I think most people find it a little difficult to recall what they've had for lunch even a day or two ago, let alone a week or a month ago. Unless, that is, there was something unusual about the circumstances of the meal, such as eating something one hasn't tried before (or at least not for a long time), or having a meal in a restaurant or location one hasn't eaten in often (or ever). In such cases, one tends to recall many little details of the experience, not just the items in the meal itself. For instance, one tends to remember noticing specific ingredients in what was eaten, and which flavors seemed especially pleasant or unpleasant.
When I think about it, I can still recall where I was the first time I had a cola beverage, and how I enjoyed it. I was only six or so. Now, perhaps what I can remember was not actually the first time. And perhaps I'm only remembering past experiences of having that memory. But still... this is about an experience that was decades ago, and without particularly intense related emotions, just general pleasure. But the experience was marked by having my full attention at the time. Needless to say, I can't summon up the perceptual experience of (probably) any other consumption of a similar beverage... except the one I had today.
From that experience, I draw the conclusion that attention is a mechanism which is separate from consciousness, yet which regulates it in such a way that essentially permanent long term memories can be formed. I don't think it's too much of a stretch to imagine that specific biological features will eventually be identified that implement the mechanisms of attention and consciousness in general.

Induced pluripotent stem cells II

In this article from the April 4 Science, which I mentioned here, several research reports dealing with induced pluripotent stem cells were discussed. One of these I covered in the post I just noted.
Another, just as important, report apparently has not yet been formally published, but it has been (at least temporarily) available online at Science Express since February 14:
Generation of Pluripotent Stem Cells from Adult Mouse Liver and Stomach Cells
                                                         Induced pluripotent stem (iPS) cells have been generated from mouse and human fibroblasts by the retroviral transduction of four transcription factors. However, the cell origins and molecular mechanisms of iPS cell induction remain elusive. This report describes the generation of iPS cells from adult mouse hepatocytes and gastric epithelial cells. These iPS cell clones appear to be equivalent to ES cells in gene expression and are competent to generate germ-line chimeras.
It's not surprising that this is significant research, as it comes from the same team, led by Shinya Yamanaka, that was the first to report the successful creation of induced pluripotent stem cells. (See here.)
So what is this research about? Well, the investigators used the same four transcription factors (Oct3/4, Sox2, Klf4, and c-Myc) as employed in the majority of previous iPS studies. However, instead of applying the transcription factors to fibroblasts, they applied them to two types of epithelial cells.
Fibroblasts are part of the body's connective tissue. They are involved in structure and support for other tissues and produce large amounts of the protein collagen. For the most part they divide relatively infrequently, so it is especially significant that it was possible to reprogram them into a stem-cell-like state at all.
Epithelial cells, on the other hand, line the inner and outer surfaces of various body structures, including the skin and the gastrointestinal tract. Such cells divide more frequently. They have to, in order to replace cells of the same kind that are lost to hostile environments. Epithelial cells also tend to adhere more strongly to other cells, because they express higher levels of an adhesion protein called E-cadherin.
In some sense, then, epithelial cells are a little more like stem cells to begin with, so one might expect better results when attempting to reprogram them.
This expectation seems to have been met. One of the key differences the researchers found is that reprogrammed epithelial cells had less tendency to form cancerous tumors in the mice into which they were introduced. Certainly not an inconsiderable advantage. This characteristic may be related to the finding that c-Myc seems to play a less essential role in reprogramming epithelial cells.
Specifically, reprogramming of epithelial cells was almost as efficient when c-Myc was not used as when it was included with the other three transcription factors. Yet it was not possible to accomplish reprogramming if any of the other three factors was omitted. In contrast, the efficiency of reprogramming fibroblasts dropped by 90% when c-Myc was omitted.
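To put rough numbers on that comparison (the colony counts below are invented purely to illustrate the reported ratios; they are not the paper's actual data):

    # Hypothetical iPS colony counts per fixed number of transduced cells.
    # Invented numbers, chosen only to mirror the ratios described above.
    colonies = {
        ("fibroblast", "all 4 factors"): 100,
        ("fibroblast", "minus c-Myc"):    10,  # ~90% drop
        ("epithelial", "all 4 factors"): 100,
        ("epithelial", "minus c-Myc"):    85,  # almost as efficient
    }
    for cell_type in ("fibroblast", "epithelial"):
        full = colonies[(cell_type, "all 4 factors")]
        reduced = colonies[(cell_type, "minus c-Myc")]
        drop = 100 * (full - reduced) / full
        print(f"{cell_type}: {drop:.0f}% fewer colonies without c-Myc")

The practical upshot: with epithelial cells one can plausibly drop the most dangerous of the four factors and still get usable reprogramming efficiency.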
Another intriguing difference was that reprogrammed epithelial cells expressed higher levels of β-catenin than reprogrammed fibroblasts did. (You may recall – see here – that β-catenin is an important part of the Wnt signaling pathway.) In this regard, the reprogrammed epithelial cells are more like true embryonic stem cells than reprogrammed fibroblasts are. It's probably not a coincidence that expression of Nanog is stimulated by β-catenin (see here), since Nanog is considered important for maintaining stem cell pluripotency.
A further advantage of the use of epithelial cells is that many fewer retroviral "integration sites" were needed to insert the transcription factor genes into the cell genome, in comparison with fibroblasts. This is another way the risk of cancer is reduced.

TOR signaling and cancer

Another recent development pertinent to the discussion of TOR signaling and cancer (see here) is the announcement of preclinical findings about a potential anti-cancer drug that may act against ovarian cancer. The drug works by inhibiting the mTOR signaling pathway. (mTOR is the mammalian form of TOR.)
This is not at all the first anti-cancer drug to come along with a similar mechanism of action. But it's still interesting, because any drug that affects TOR signaling has the potential to cause unwanted side effects as well, since TOR signaling is involved in so many cell processes. Presumably some effort has been made to find reasons why the effect of the drug should be limited to cancer cells.
The drug is called NV-128, and it has been developed by an Australian biotech company called Novogen. Since the drug hasn't yet entered clinical trials in humans, it could take a decade or so (as usual) to perform enough testing to determine whether NV-128 is actually effective and relatively safe.
Anyhow, here's the news release:
Drug Compound Leads To Death Of Ovarian Cancer Cells Resistant To Chemotherapy
                                                                       In a discovery that may be useful for maintaining remission in chemo-resistant ovarian cancer, Yale scientists report that pre-clinical studies have shown the drug compound NV-128 can induce the death of ovarian cancer cells by halting the activation of a protein pathway called mTOR.
Many traditional cancer drugs work by triggering cell death via apoptosis. Unfortunately, classical apoptosis depends on enzymes called caspases, as explained here, and cancer cells may develop a way around this mechanism by turning down their production of the caspases that normally carry out the death program once the mitochondria signal for it. NV-128, however, is able to overcome this problem by triggering caspase-independent cell death.
In cancer cells, mTOR signals enhance tumor growth and may be associated with resistance to conventional therapies. Inhibition of mTOR could shut down many of these survival pathways, including proteins that protect the mitochondria of cancer cells.
Here's the Novogen press release:
Novogen’s NV-128 shown to target the akt-mTOR receptor in chemoresistant cancer cells
                                                                            NV-128 is unique in that it does not induce caspase-mediated apoptosis which can be non-functional in chemoresistant cancer cells due to accumulated mutations in tumour suppressor/promoter genes and over-expression of anti-apoptotic proteins. Rather, NV-128 uncouples the akt-mTOR­P70S6K signal transduction cascade which has a key role in driving protein translation and uncontrolled cancer cell proliferation. Further, NV-128 induces mitochondrial depolarization via a novel pathway involving the autophagy protein Beclin-1 and Bcl-2, thereby resulting in endonuclease G translocation to the nucleus and cell death.
The same research group that presented the findings just mentioned has also done work on ovarian cancer itself, and been able to locate cancer stem cells for this type of cancer:
Ovarian Cancer Stem Cells Identified, Characterized
                                                    Researchers at Yale School of Medicine have identified, characterized and cloned ovarian cancer stem cells and have shown that these stem cells may be the source of ovarian cancer's recurrence and its resistance to chemotherapy.
As already mentioned, NV-128 is not the only drug under investigation for attacking cancer by targeting the TOR pathway. In fact, almost a year ago the first anti-cancer mTOR inhibitor received FDA approval: Torisel (temsirolimus), an intravenous drug from Wyeth Pharmaceuticals, for kidney cancer. Novartis has an oral drug (everolimus) for kidney cancer in Phase III trials. (It's already been approved by the FDA as an immunosuppressant to prevent rejection of organ transplants.) Interestingly, and unsurprisingly, everolimus is a derivative of rapamycin (sirolimus) – an anti-fungal and immunosuppressive compound whose study led to the original discovery of mTOR. Everolimus works similarly to rapamycin as an mTOR inhibitor.
The American biotech company Ariad Pharmaceuticals has a small-molecule anti-cancer mTOR inhibitor called deforolimus in mid-stage clinical trials for a variety of solid cancers, such as sarcomas and endometrial, prostate, breast, and non-small-cell lung cancers. The company describes the drug as "a novel small-molecule inhibitor of the protein mTOR, a “master switch” in cancer cells. Blocking mTOR creates a starvation-like effect in cancer cells by interfering with cell growth, division, metabolism, and angiogenesis." Last summer Ariad entered into a major partnership with Merck to develop and test the drug, which is an indication that the drug has definite promise.
Ariad has a nice video you can download, which explains a bit about how their drug works and about TOR signaling in general. I highly recommend having a look at it, since it covers the upstream signals that activate mTOR (growth factors, amino acids, oxygen, energy) and the downstream effects (synthesis of proteins for cell growth, cell division, metabolism, and angiogenesis). It notes that defects or overactivity in certain other signaling proteins (PTEN, Akt, PI3K) can cause overactivation of mTOR, and it points out that mTOR stimulates the production of the cell division protein cyclin D.
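Purely as a schematic way of summarizing that description (my own toy sketch, not Ariad's model, and certainly not real biochemistry), mTOR can be pictured as a logic gate that needs nutrient, oxygen, and energy inputs plus an active growth-factor arm, with PTEN normally restraining that arm:

    # Toy logic-gate sketch of mTOR signaling as summarized above.
    # All logic is illustrative; real signaling is graded, not boolean.
    def mtor_active(growth_factors, amino_acids, oxygen, energy,
                    pten_lost=False, mtor_inhibitor=False):
        # PI3K relays growth-factor signals to Akt; PTEN normally damps
        # this arm, so losing PTEN leaves Akt effectively stuck "on".
        akt_on = growth_factors or pten_lost
        # mTOR integrates the Akt arm with nutrient/oxygen/energy status.
        active = akt_on and amino_acids and oxygen and energy
        # Rapamycin-like drugs (temsirolimus, everolimus, deforolimus)
        # block the output regardless of upstream signals.
        return active and not mtor_inhibitor

    def downstream_effects(mtor_on):
        # The outputs named in the video: growth-related protein synthesis,
        # cell division, metabolism, angiogenesis.
        return ["protein synthesis", "cell division",
                "metabolism", "angiogenesis"] if mtor_on else []

    # A PTEN-deficient cancer cell keeps mTOR on even without growth factors...
    print(downstream_effects(mtor_active(False, True, True, True, pten_lost=True)))
    # ...and an mTOR inhibitor shuts the whole output side down.
    print(downstream_effects(mtor_active(False, True, True, True,
                                         pten_lost=True, mtor_inhibitor=True)))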


Black holes exist

It may come as a surprise to lay readers to learn that even as recently as 10 years ago there were prominent physicists who still doubted the existence of black holes.
For example, there's Sir John Maddox, a trained chemist and physicist who was editor of Nature for 22 years. In a book published in 1998 (What Remains to be Discovered), he wrote, "The concept of black holes raises serious difficulties of a philosophical character." (p. 43) (Remember what C. F. Gauss said about philosophers.) And, "The habit of others in referring to black holes as 'putative' seems to imply a collective uneasiness about the concept." (p. 112)
But recent skeptics of black holes are in good company. Einstein, for one, vigorously objected to the idea. Around 1935 Subrahmanyan Chandrasekhar, later a Nobel Prize winner and now celebrated, but only in his mid-20s at the time, had the audacity to argue that white dwarf stars more than about 1.44 times as heavy as the Sun could not support themselves against gravity and must collapse indefinitely – and he nearly had his career wrecked by Sir Arthur Eddington's sharp criticism.
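For reference, that 1.44 figure is the Chandrasekhar limit. In its standard textbook form (treat the constants here as approximate) it works out to

\[
M_{\mathrm{Ch}} \;=\; \frac{\omega_3^0 \sqrt{3\pi}}{2}
\left(\frac{\hbar c}{G}\right)^{3/2} \frac{1}{(\mu_e m_H)^2}
\;\approx\; \frac{5.8}{\mu_e^{2}}\, M_{\odot}
\;\approx\; 1.4\, M_{\odot}
\quad \text{for } \mu_e \approx 2,
\]

where $\mu_e$ is the mean molecular weight per electron (about 2 for helium, carbon, or oxygen white dwarf material), $m_H$ is the mass of a hydrogen atom, and $\omega_3^0 \approx 2.018$ is a constant from the Lane–Emden equation. Above this mass, electron degeneracy pressure can no longer halt gravitational collapse.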
Even very recently one still sees theoretical studies that offer alternatives to the standard relativistic model of black holes.
Nevertheless, to the consternation of skeptics, the evidence for the correctness of the standard model of black holes just continues to pile up:
Quasar tests general relativity to the limit
                                                        [T]eam leader Mauri Valtonen of Tuorla Observatory in Finland claims the work provides the first hard evidence for black holes, which are so massive that space–time is predicted to completely curve in on itself: "People refer to the concept of black holes all the time, but strictly speaking one first has to prove that general relativity holds in strong gravitational fields before we can be sure that black holes exist," he told physicsworld.com.
And actually, the more significant part of that research is its validation of general relativity in very strong gravitational fields:
Astronomers have obtained the most compelling evidence yet that massive objects dramatically warp space–time, as predicted by Einstein's general theory of relativity. Although the geometric nature of gravity was first demonstrated in 1919, when Arthur Eddington famously detected the subtle warping effect of the Sun on the light from distant stars, the new results provide the first test of Einstein's theory in much stronger gravitational fields.
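For scale, the effect Eddington measured in 1919 is the weak-field deflection of light by the Sun, which general relativity predicts to be

\[
\delta\theta \;=\; \frac{4 G M_{\odot}}{c^{2} b} \;\approx\; 1.75''
\]

for a ray grazing the solar limb (where $b$ is the impact parameter, here the Sun's radius) – twice the value a naive Newtonian calculation gives. The quasar observations probe gravitational fields enormously stronger than that.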

The amino acid chirality mystery

If the analysis here is correct, it solves one of the more puzzling mysteries of life on Earth – namely, the fact that the amino acids found in biological proteins (all of the standard 20 except glycine, which is achiral) occur only in the "left-handed" form.
Meteorites Delivered The 'Seeds' Of Earth's Left-hand Life, Experts Argue
                                                                   In a report at the 235th national meeting of the American Chemical Society, Ronald Breslow, Ph.D., University Professor, Columbia University, and former ACS President, described how our amino acid signature came from outer space.
Chains of amino acids make up the protein found in people, plants, and all other forms of life on Earth. There are two orientations of amino acids, left and right, which mirror each other in the same way your hands do. This is known as "chirality." In order for life to arise, proteins must contain only one chiral form of amino acids, left or right, Breslow noted.
"If you mix up chirality, a protein's properties change enormously. Life couldn't operate with just random mixtures of stuff," he said.
Recall that a carbon atom can form up to four bonds with other atoms. (Sometimes there are two or more bonds to the same atom, such as a double bond to another carbon atom.) When the four groups bonded to a carbon atom are all different, they can be arranged in space in two distinct, mirror-image ways. Such a carbon is a chiral center, and it is what gives amino acids (other than glycine) their handedness.
Breslow and Columbia chemistry grad student Mindy Levine found that these cosmic amino acids could directly transfer their chirality to simple amino acids found in living things. Thus far, Breslow's team is the first to demonstrate that this kind of handedness transfer is possible under these conditions.
On the prebiotic Earth, this transfer left a slight excess of left-handed amino acids, Breslow said. His next experiment replicated the chemistry that led to the amplification and eventual dominance of left-handed amino acids.
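Chemists quantify such an excess as the enantiomeric excess (ee):

\[
\mathrm{ee} \;=\; \frac{[\mathrm{L}] - [\mathrm{D}]}{[\mathrm{L}] + [\mathrm{D}]} \times 100\%
\]

As a worked example with made-up round numbers (not Breslow's measurements), a mixture that is 52.5% left-handed and 47.5% right-handed has an ee of only 5%. The amplification step is what turns a small initial excess like that into the essentially 100% ee seen in biology; one commonly proposed route exploits the fact that for many amino acids the 50:50 (racemic) crystal form is the least soluble, so crystallization removes L and D molecules in equal numbers and leaves the remaining solution strongly enriched in whichever form started out in excess.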
That's where things stand now. We have as yet no way of knowing whether this is the scenario that actually occurred. But it is the most credible scenario yet devised to explain the otherwise astonishing fact that essentially all life on Earth uses only left-handed amino acids.