Wednesday, June 24, 2015
What happened to the Neanderthals, that now-extinct group of stocky, big-boned ancient humans who lived in Europe and Asia until about 30,000 years ago? Were they wiped out by conflict with the modern humans who later arrived in Europe, or by unfavorable environmental conditions? Did they interbreed with modern humans, ultimately "disappearing" as a distinct group because their genes were incorporated into the much larger gene pool of modern humans?
Several years ago I reported (this blog May 12, 2010) that Neanderthals shared a number of genetic variations with modern Europeans that they did not share with modern Africans. These findings at least suggested, but did not prove, that interbreeding occurred when Neanderthals met modern humans.
More recently, DNA analysis has revealed that up to 3% of the DNA of modern Eurasians is similar to the DNA of Neanderthals. And according to a paper published this week in Nature, DNA analysis of a bone from a modern human who lived in present-day Romania around 40,000 years ago revealed that 6-9% of his genes came from Neanderthals. In other words, he may have had a great-great-grandparent who was a Neanderthal.
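To see why 6-9% points to such a recent ancestor, remember that each generation back contributes, on average, about half as much of a person's genome. The short calculation below is just a back-of-the-envelope sketch of that arithmetic (it ignores the randomness of recombination, which makes real contributions vary around these averages).

    # Expected share of the genome contributed by a single ancestor
    # n generations back is roughly (1/2)**n. Real values vary around
    # this average because of the randomness of recombination.
    labels = ["parent", "grandparent", "great-grandparent",
              "great-great-grandparent", "great-great-great-grandparent"]
    for n, label in enumerate(labels, start=1):
        print(f"{label}: about {0.5 ** n:.2%}")
    # A great-great-grandparent (4 generations back) contributes about 6.25%,
    # at the low end of the 6-9% reported for the Romanian individual.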
Of course, this does not prove that interbreeding was the cause of the disappearance of the Neanderthals. Interbreeding can occur between groups in conflict, as the behavior of modern-day humans clearly shows. But it does indicate that interbreeding occurred, at least occasionally.
Tuesday, June 16, 2015
Breast-Feeding Correlates with Reduced Risk of Leukemia
Yet another reason to consider breast-feeding your baby: a study just published in JAMA Pediatrics reports that children who are breast-fed for at least 6 months have a 19% lower risk of developing childhood leukemia compared to children who are not breast-fed at all. Childhood leukemia, a type of blood cancer, is a leading cause of death among children and adolescents.
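To get a feel for what a 19% relative reduction means in absolute terms, here is a quick back-of-the-envelope calculation; the baseline risk below is a made-up round number for illustration, not a figure from the study.

    # Illustration only: what a 19% relative risk reduction means in absolute terms.
    # The baseline risk is a hypothetical round number, NOT taken from the study.
    baseline_risk = 1 / 2000          # hypothetical risk of childhood leukemia
    relative_reduction = 0.19         # the 19% lower risk reported in the study

    risk_if_breastfed = baseline_risk * (1 - relative_reduction)
    print(f"Hypothetical risk, never breast-fed:     {baseline_risk:.4%}")
    print(f"Hypothetical risk, breast-fed 6+ months: {risk_if_breastfed:.4%}")
    print(f"Absolute difference:                     {baseline_risk - risk_if_breastfed:.4%}")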
Scientists have long known that breast milk contains antibodies and immune-strengthening compounds that help support the infant's immature immune system. The American Academy of Pediatrics recommends that women breast-feed their infants because it lowers the risk of infections and allergies, among other known benefits. But this is the first time that a correlation between breast-feeding and reduced risk of leukemia has been documented.
Of course, the finding of a correlation between breast-feeding and reduced leukemia risk does not prove that it is actually breast-feeding (or breast milk) that lowers the risk of leukemia. Remember, a correlation does not prove cause and effect; the real explanation could be something else entirely. The finding does fit, though, with what we know about breast milk and the role of the immune system in recognizing and destroying abnormal (cancerous) cells.
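As a concrete illustration of how a correlation can arise with no direct effect at all, the toy simulation below invents a hidden factor that independently makes breast-feeding more likely and leukemia less likely; every number in it is made up.

    import numpy as np

    # Toy simulation: a hidden confounder produces a breast-feeding/leukemia
    # correlation even though breast-feeding has NO direct effect here.
    # All probabilities are invented for illustration.
    rng = np.random.default_rng(0)
    n = 500_000

    confounder = rng.random(n) < 0.5                    # some unmeasured factor
    p_breastfed = np.where(confounder, 0.8, 0.4)        # raises breast-feeding rates
    p_leukemia = np.where(confounder, 0.0004, 0.0008)   # and lowers leukemia risk

    breastfed = rng.random(n) < p_breastfed
    leukemia = rng.random(n) < p_leukemia               # independent of breastfed

    risk_ratio = leukemia[breastfed].mean() / leukemia[~breastfed].mean()
    print(f"Observed risk ratio, breast-fed vs not: {risk_ratio:.2f}")
    # Prints a ratio below 1.0 despite zero causal effect of breast-feeding.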
Perhaps someday we'll know for sure whether breast milk really does lower leukemia risk. But for now, the study's authors can only conclude that breast-feeding may lower the risk of childhood leukemia.
Topics: cancer, development and aging, immune system
Saturday, June 13, 2015
Still No Known Benefits From Eating Placenta
Eight years ago I reported on what was then a relatively new phenomenon - eating one's own placenta (see this blog July 19, 2007.) I pointed out that the reported health benefits of this practice were based on anecdotal reports, meaning that they were scientifically untested. My hope was that some day someone would look into this phenomenon and be able to tell us whether it is just a fad or whether there really is something to it.
Flash forward to the present day. A recent search of the scientific literature on the subject found 49 papers, only 10 of which were deemed of sufficient quality to be included in the final analysis. None of the 10 papers offered conclusive evidence of a health benefit of eating the placenta. The bottom line is that eight years on, there is still no good evidence that eating the placenta after birth has any health benefits at all.
According to the study's authors, the health benefits and risks of eating the placenta "require further investigation." But at this point I'm wondering, why bother - why not just refrain from eating placenta in the first place? Problem solved.
Wednesday, June 10, 2015
GMOs: the Fear is Not Backed by Good Science
It's trendy these days for food companies and even restaurants to label their foods as free of GMOs (genetically modified organisms) wherever possible. Some states are passing laws requiring that all foods containing GMOs be labeled as such. It's happening not because there is good science to show that GMOs are bad, but because there is a perception that they are bad. According to a recent article in The New York Times, the antipathy toward GMOs "reflects a poor public understanding of the science behind them, along with a rebellion against the dominance of food and agricultural conglomerates."
The article goes on to list some of the facts about genetic modification and genetic engineering, and then points out some of the successes of the technique, including a rice that produces the precursor to vitamin A, which could significantly reduce blindness in some countries. But then the article suggests something new: perhaps genetic engineering could be used to remove specific naturally occurring genes from certain foods, such as peanuts or shellfish, that cause serious, life-threatening allergies in some people. By the way, no GMO food has ever been proven to cause a life-threatening allergy.
It's unfortunate that the first two major GMOs were corn and soybeans that were genetically engineered to be resistant to insect pests or to a commonly-used herbicide - both of which primarily benefit large agricultural seed companies and big farmers. Would there still be objections to GMOs if the very first ones had prevented certain nutritional deficiency diseases or severe food allergies? I wonder.
I'm not necessarily in favor of some mega-farms' practices, or of the unnamed seed company that has a near-lock on the GMO seed market, either. But those are a whole different set of issues from the safety of GMO foods.
Monday, June 8, 2015
GINA Applies to DNA Tests for Identification Purposes
The police can use DNA tests to identify suspects, but apparently private companies cannot, according to a recent court case. The case involved a company that wanted to identify the "devious defecator," an employee who had been defecating regularly in its warehouse. To solve the mystery, the company asked several employees to provide samples for DNA testing. The employees were cleared, but they later sued the company.
Last month a federal court in Atlanta ruled in favor of the employees, finding that a DNA test of an employee by a private company violates the Genetic Information Nondiscrimination Act (GINA). As you may recall, GINA protects employees from employment discrimination by making it illegal "for an employer to request, require, or purchase genetic information with respect to an employee." In its decision, the court ruled that although the employees' DNA samples were not used in this case to test for specific genes, they could have been used for that purpose, and hence the mere act of asking the employees for a DNA sample violated GINA.
It's a surprisingly broad interpretation of GINA, especially since GINA does not protect you against discrimination in other areas, such as life, disability, and long-term care insurance (see this blog Apr. 17, 2014.) It will be interesting to see whether this interpretation holds up in future cases.
Tuesday, June 2, 2015
Beta-Amyloid Accumulation Precedes Alzheimer Disease
In patients with Alzheimer disease, the most common cause of dementia, deposits of an abnormal protein called beta-amyloid accumulate around neurons in the brain. It is thought that these beta-amyloid deposits choke off neurons, eventually killing them. When there are too few neurons left, cognitive impairment and eventually dementia set in. But when exactly do these beta-amyloid deposits begin to accumulate, and could their accumulation and the subsequent development of Alzheimer disease be prevented?
Attempts to find drugs that would prevent the development of Alzheimer disease have been hampered by the difficulty of finding the right people to test the drugs on - people who are still seemingly normal but who are likely to develop Alzheimer disease later in life. By the time Alzheimer disease becomes evident, it's too late to prevent it; all one can do is treat the symptoms.
Now a new study offers hope. It's a meta-analysis (remember, a meta-analysis is a study that pools the results of previous studies) of 55 previous studies, involving over 7,500 persons who were either cognitively normal or had subjective or mild cognitive impairment - none had yet reached the level of Alzheimer disease. All of the studies looked at the prevalence of beta-amyloid pathology, either by positron emission tomography (PET) scan or by analysis of cerebrospinal fluid.
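For readers curious about what "pooling" results across studies looks like, here is a minimal sketch of a simple inverse-variance-weighted (fixed-effect) pooled prevalence. The three example studies and their numbers are invented, and the actual paper used more sophisticated statistical models; this only illustrates the general idea.

    import math

    # Minimal sketch of inverse-variance (fixed-effect) pooling of prevalence
    # estimates. The studies below are invented, purely for illustration.
    studies = [  # (number amyloid-positive, number of participants)
        (30, 120),
        (45, 150),
        (80, 300),
    ]

    weights, estimates = [], []
    for positive, total in studies:
        p = positive / total
        var = p * (1 - p) / total          # binomial variance of the prevalence
        weights.append(1 / var)
        estimates.append(p)

    pooled = sum(w * p for w, p in zip(weights, estimates)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    print(f"Pooled prevalence: {pooled:.1%} "
          f"(95% CI roughly {pooled - 1.96 * se:.1%} to {pooled + 1.96 * se:.1%})")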
The study found that the prevalence of beta-amyloid pathology increased with age and with declining cognitive function, ranging from 10% in 50-year-olds with normal cognition to over 70% in 90-year-olds with mild cognitive impairment.
Most importantly, the study suggests that beta-amyloid pathology begins to develop as early as 20 to 30 years before the onset of dementia. This is good news for researchers seeking a drug to prevent Alzheimer disease. By screening many seemingly normal individuals and enrolling only those with evidence of beta-amyloid pathology (persons at high risk for Alzheimer disease) in their studies, researchers are more likely to find a potential Alzheimer prevention drug that works.
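A rough sketch of why that kind of enrollment enrichment matters: if a prevention drug can only slow progression in people who would otherwise progress, then the more likely progressors a trial enrolls, the fewer participants it needs to detect an effect. All of the progression rates below are hypothetical, chosen just to show the shape of the argument.

    from math import ceil, sqrt

    def n_per_arm(p_control, p_treated, z_alpha=1.96, z_beta=0.84):
        """Approximate participants per arm to compare two proportions
        (80% power, two-sided alpha = 0.05)."""
        p_bar = (p_control + p_treated) / 2
        num = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
               + z_beta * sqrt(p_control * (1 - p_control)
                               + p_treated * (1 - p_treated))) ** 2
        return ceil(num / (p_control - p_treated) ** 2)

    # Hypothetical: suppose a drug cuts progression to dementia by one third.
    # Unscreened volunteers: few progress during the trial, so the signal is small.
    print("Unscreened volunteers: ", n_per_arm(0.06, 0.04), "per arm")
    # Amyloid-positive only: many more would progress, so the same relative
    # drug effect is detectable with far fewer participants.
    print("Amyloid-positive only: ", n_per_arm(0.30, 0.20), "per arm")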
For those of us unlikely to enroll in a research study, however, it's not a particularly helpful finding. Yes, we could be tested to see whether we already show signs of beta-amyloid accumulation, but what would be the point, if there is no cure?