2/22/2012

Step Forward in Effort to Regenerate Damaged Nerves


The carnage evident in disasters like car wrecks or wartime battles is often mirrored within the bodies of the people involved. A severe wound can leave blood vessels and nerves severed, bones broken, and cellular wreckage strewn throughout -- a debris field within the body itself. 

Thriving DRG cells [Credit: University of Rochester Medical Center]
It's scenes like this that neurosurgeon Jason Huang, M.D., confronts every day. For Huang and his colleagues, severe nerve damage is among the most challenging wounds to treat. It's a type of wound suffered by people who are the victims of gunshots or stabbings, by those who have been involved in car accidents -- or by soldiers injured on the battlefield, like those whom Huang treated in Iraq. 

Now, back in his university laboratory, Huang and his team have taken a step forward toward the goal of repairing nerves in such patients more effectively. In a paper published in the journal PLoS ONE, Huang and colleagues at the University of Rochester Medical Center report that a surprising set of cells may hold potential for nerve transplants. 

In a study in rats, Huang's group found that dorsal root ganglion neurons, or DRG cells, help create thick, healthy nerves, without provoking unwanted attention from the immune system. 

The finding is one step toward better treatment for the more than 350,000 patients each year in the United States who have serious injuries to their peripheral nerves. Huang's laboratory is one of a handful developing new technologies to treat such wounds. 

"These are very serious injuries, and patients really suffer, many for a very long time," said Huang, associate professor of Neurosurgery and chief of Neurosurgery at Highland Hospital, an affiliate of the University of Rochester Medical Center. "There are a variety of options, but none of them is ideal. 

"Our long-term goal is to grow living nerves in the laboratory, then transplant them into patients and cut down the amount of time it takes for those nerves to work," added Huang, whose project was funded by the National Institute of Neurological Disorders and Stroke and by the University of Rochester Medical Center. 

For a damaged nerve to repair itself, the two disconnected but healthy portions of the nerve must somehow find each other through a maze of tissue and connect together. This happens naturally for a very small wound -- much like our skin grows back over a small cut -- but for some nerve injuries, the gap is simply too large, and the nerve won't grow back without intervention. 

For surgeons like Huang, the preferred option is to transplant nerve tissue from elsewhere in the patient's own body -- for instance, a section of a nerve in the leg -- into the wounded area. The transplanted nerve serves as scaffolding, a guide of sorts for a new nerve to grow and bridge the gap. Since the tissue comes from the patient, the body accepts the new nerve and doesn't attack it. 

But for many patients, this treatment isn't an option. They might have severe wounds to other parts of the body, so that extra nerve tissue isn't available. Alternatives can include a nerve transplant from a cadaver or an animal, but those bring other challenges, such as the lifelong need for powerful immunosuppressant drugs, and are rarely used. 

One technology used by Huang and other neurosurgeons is the NeuraGen Nerve Guide, a hollow, absorbable collagen tube through which nerve fibers can grow and find each other. The technology is often used to repair nerve damage over distances of less than half an inch. 

In the PLoS One study, Huang's team compared several methods to try to bridge a nerve gap of about half an inch in rats. The team transplanted nerve cells from a different type of rat into the wound site and compared results when the NeuraGen technology was used alone or when it was paired with DRG cells or with other cells known as Schwann cells. 

After four months, the team found that the tubes equipped with either DRG or Schwann cells helped bring about healthier nerves. In addition, the DRG cells provoked less unwanted attention from the immune system than the Schwann cells, which attracted twice as many macrophages and more of the immune compound interferon gamma. 

While both Schwann and DRG cells are known players in nerve regeneration, Schwann cells have been considered more often as potential partners in the nerve transplantation process, even though they pose considerable challenges because of the immune system's response to them. 

"The conventional wisdom has been that Schwann cells play a critical role in the regenerative process," said Huang, who is a scientist in the Center for Neural Development and Disease. "While we know this is true, we have shown that DRG cells can play an important role also. We think DRG cells could be a rich resource for nerve regeneration." 

In a related line of research, Huang, along with colleagues in the laboratory of Douglas H. Smith, M.D., at the University of Pennsylvania, is creating DRG cells in the laboratory by stretching them, which coaxes them to grow about one inch every three weeks. The idea is to grow nerves several inches long in the laboratory, then transplant them into the patient, instead of waiting months after surgery for the nerve endings to travel that distance within the patient to ultimately hook up. 

The first author of the PLoS One paper is research associate Weimin Liu, Ph.D. Other authors, in addition to Huang and Smith, are graduate students Yi Ren and Xiaowei Wang; post-doctoral associate Samantha Dayawansa, Ph.D.; undergraduate Adam Bossert; neurologist Handy Gelbard, M.D., Ph.D.; Jing Tong, M.D., formerly of the Huang laboratory and now a neurosurgeon at Hebei Medical University in China; and Xiaoshen He, M.D., a neurosurgeon at Fourth Military Medical University in China. 

Source: University of Rochester Medical Center [February 21, 2012]

2/21/2012

Cocaine and the Teen Brain: New Insights Into Addiction


When first exposed to cocaine, the adolescent brain launches a strong defensive reaction designed to minimize the drug's effects, Yale and other scientists have found. Now two new studies by a Yale team identify key genes that regulate this response and show that interfering with this reaction dramatically increases a mouse's sensitivity to cocaine. 


The findings may help explain why the risk of drug abuse and addiction increases so dramatically when cocaine use begins during the teenage years. 

The results were published in the Feb. 14 and Feb. 21 issues of the Journal of Neuroscience. 

Researchers, including those at Yale, have shown that vulnerability to cocaine is much higher in adolescence, when the brain is shifting from an explosive and plastic growth phase to the more settled and refined neural connections characteristic of adults. Past studies at Yale have shown that neurons and their synaptic connections in adolescence change shape when first exposed to cocaine, through a molecular pathway regulated by the gene integrin beta1, which is crucial to the development of the vertebrate nervous system. 

"This suggests that these structural changes observed are probably protective of the neurocircuitry, an effort of the neuron to protect itself when first exposed to cocaine," said Anthony Koleske, professor of molecular biophysics and biochemistry and of neurobiology and senior author of both papers. 

In the latest study, Yale researchers report that when they knocked out this pathway, mice needed approximately one-third as much cocaine to induce behavioral changes as mice with an intact pathway. 

The research suggests that the relative strength of the integrin beta1 pathway among individuals may explain why some cocaine users end up addicted to the drug while others escape its worst effects, Koleske theorized. 

"If you were to become totally desensitized to cocaine, there is no reason to seek the drug," he said. 

Koleske and Jane R. Taylor, professor of psychiatry and psychology and an author of the Feb. 14 paper, are teaming up with other Yale researchers to look for other genes that may play a role in protecting the brain from effects of cocaine and other drugs of abuse. 

Shannon Gourley, now of Emory University, who worked with Koleske and Taylor, is lead author on the Feb. 14 paper detailing how the structural response to cocaine protects against cocaine sensitivity. Anastasia Oleveska and Michael S. Warren are other Yale authors on this paper. Warren and William D. Bradley of Yale are co-lead authors of the latest Journal of Neuroscience paper, which describes the role of integrin beta1 in the control of adolescent synapse and dendrite refinement and stability. Yu-Chih Lin, Mark A. Simpson and Charles A. Greer are other Yale-affiliated authors. 

Author: Bill Hathaway | Source: Yale University [February 21, 2012]

2/20/2012

Honeycomb structure responsible for bacteria's extraordinary sense


Cornell researchers have peered into the complex molecular network of receptors that give one-celled organisms like bacteria the ability to sense their environment and respond to chemical changes as small as 1 part in 1,000. 

Honeycomb-like hexagonal lattice formed by the network of receptor molecules (pink), associated enzymes (blue) and coupling proteins (green) in motile bacteria. Such a cooperative lattice arranged across their cell membranes helps bacteria sense their environment with extreme sensitivity. Inset shows the structure of a unit cell of the lattice in detail (top left). [Credit: Crane Lab]
Just as humans use five senses to navigate through surroundings, bacteria employ an intricate structure of thousands of receptor molecules, associated enzymes and linking proteins straddling their cell membranes that trigger responses to external chemical changes. 

Researchers in the lab of Brian Crane, professor of chemistry and chemical biology, with collaborators in the lab of Grant Jensen at the California Institute of Technology, have mapped out the honeycomb-like hexagonal arrangement of these receptor complexes in unprecedented detail. 

They report their discovery in the Feb. 20 issue of the Proceedings of the National Academy of Sciences. 

"The highlight of the paper is that by using a combination of [techniques], we've been able to image these complex arrays in live cells and determine how they are structured -- they are a very unique biological assembly that is conserved across nearly all classes of motile bacteria," said Crane. 

The findings could have applications in engineering biologically inspired synthetic molecular devices to detect specific chemicals with high sensitivity over a wide dynamic range; they also could shed light on the pathogenesis of various bacterial diseases like syphilis, cholera and Lyme disease. They may also give some insight into the functioning of the human immune system, where special cells may be employing similar cooperative networks of receptors to recognize and fight against foreign antigens, Crane said. 

Scientists are trying to determine how one receptor -- on detecting nutrients, oxygen or acidity, for example -- triggers communication through multiple enzymes in the complex, setting off a chain reaction. This cooperation leads to a gain or amplification in the signal that is finally communicated to the tails (flagella) of the bacterium, affecting the way they spin. Such a response allows the bacterium to move toward food sources or away from toxic environments.  

The latest work takes a big step toward understanding this mechanism by demonstrating for the first time how the individual pieces of receptors, enzymes and coupling proteins fit together to generate an extended network. It builds on previous work in the Crane lab done in collaboration with Jack Freed, Cornell professor of chemistry and chemical biology. 

"We were able to see how the enzymes and proteins are linked together in the complex array and also their interactions with the receptors," said Xiaoxiao Li, graduate student and joint lead author on the study. 

High-resolution X-ray images of the bacterial membrane receptor complex were obtained after extraction, purification and crystallization at the Cornell High Energy Synchrotron Source (CHESS). The scientists reconstructed the complicated molecular structure of the complex with an accuracy that is within a nanometer (a billionth of a meter). Pictures of the structure were also captured inside living cells, by first "flash-freezing" the cells and then scattering electrons off them. 

The researchers deduced that trimers (groups of three) of receptor molecules are arranged at the vertices of hexagons surrounding rings of enzymes and coupling proteins. These rings linked by proteins form the backbone of the extended honeycomb-like network. 

Author: Vivek Venkataraman | Source: Cornell University [February 20, 2012]

2/16/2012

Nanoparticles in Food, Vitamins Could Harm Human Health, Researchers Warn


Billions of engineered nanoparticles in foods and pharmaceuticals are ingested by humans daily, and new Cornell research warns they may be more harmful to health than previously thought. 

An intestinal cell monolayer after exposure to nanoparticles, shown in green [Credit: Cornell University]
A research collaboration led by Michael Shuler, the Samuel B. Eckert Professor of Chemical Engineering and the James and Marsha McCormick Chair of Biomedical Engineering, studied how large doses of polystyrene nanoparticles -- a common, FDA-approved material found in substances from food additives to vitamins -- affected how well chickens absorbed iron, an essential nutrient, into their cells. 

The results were reported online Feb. 12 in the journal Nature Nanotechnology. 

According to the study, high-intensity, short-term exposure to the particles initially blocked iron absorption, whereas longer-term exposure caused intestinal cell structures to change, allowing for a compensating uptick in iron absorption. 

The researchers tested both acute and chronic nanoparticle exposure using human gut cells in petri dishes as well as live chickens and reported matching results. They chose chickens because these animals absorb iron into their bodies similarly to humans, and they are also similarly sensitive to micronutrient deficiencies, explained Gretchen Mahler, Ph.D. '08, the paper's first author and former Cornell graduate student and postdoctoral associate. 

The researchers used commercially available, 50-nanometer polystyrene carboxylated particles that are generally considered safe for human consumption. They found that following acute exposure, a few minutes to a few hours after consumption, both the absorption of iron in the in vitro cells and the chickens decreased. 

But following exposure of 2 milligrams per kilogram for two weeks -- a slower, more chronic intake -- the structure of the intestinal villi began to change and increase in surface area. This was an effective physiological remodeling that led to increased iron absorption. 

"This was a physiological response that was unexpected," Mahler said. 

Shuler noted that in some sense this intestinal villi remodeling was positive because it shows the body adapts to challenges. But it serves to underscore how such particles, which have been widely studied and considered safe, cause barely detectable changes that could lead to, for example, over-absorption of other, harmful compounds. 

Human exposure to nanoparticles is only increasing, Shuler continued. 

"Nanoparticles are entering our environment in many different ways," Shuler said. "We have some assurance that at a gross level they are not harmful, but there may be more subtle effects that we need to worry about." 

The paper included Cornell co-authors Mandy Esch, a research associate in biomedical engineering; Elad Tako, a research associate at the Robert W. Holley Center for Agriculture and Health; Teresa Southard, assistant professor of biomedical sciences; Shivaun Archer, senior lecturer in biomedical engineering; and Raymond Glahn, senior scientist with the USDA Agricultural Research Service and courtesy associate professor in the Department of Food Science. The work was supported by the National Science Foundation; New York State Office of Science, Technology and Academic Research; Army Corps of Engineers; and U.S. Department of Agriculture. 

Author: Anne Ju | Source: Cornell University [February 16, 2012]

2/15/2012

Smoking Zaps Healthy Bacteria In the Mouth


According to a new study, smoking causes the body to turn against its own helpful bacteria, leaving smokers more vulnerable to disease. 


Despite the daily disturbance of brushing and flossing, the mouth of a healthy person contains a stable ecosystem of healthy bacteria. New research shows that the mouth of a smoker is a much more chaotic, diverse ecosystem -- and is much more susceptible to invasion by harmful bacteria. 

As a group, smokers suffer from higher rates of oral diseases -- especially gum disease -- than do nonsmokers, which is a challenge for dentists, according to Purnima Kumar, assistant professor of periodontology at Ohio State University. She and her colleagues are involved in a multi-study investigation of the role the body's microbial communities play in preventing oral disease. 

"The smoker's mouth kicks out the good bacteria, and the pathogens are called in," said Kumar. "So they're allowed to proliferate much more quickly than they would in a non-smoking environment." 

The results suggest that dentists may have to offer more aggressive treatment for smokers and would have good reason to suggest quitting smoking, Kumar said. 

"A few hours after you're born, bacteria start forming communities called biofilms in your mouth," said Kumar. "Your body learns to live with them, because for most people, healthy biofilms keep the bad bacteria away." 

She likens a healthy biofilm to a lush, green lawn of grass. "When you change the dynamics of what goes into the lawn, like too much water or too little fertilizer," she said, "you get some of the grass dying, and weeds moving in." For smokers, the "weeds" are problem bacteria known to cause disease. 

In a new study, Kumar's team looked at how these bacterial ecosystems regrow after being wiped away. For 15 healthy nonsmokers and 15 healthy smokers, the researchers took samples of oral biofilms one, two, four and seven days after professional cleaning. 

The researchers were looking for two things when they swabbed subjects' gums. First, they wanted to see which bacteria were present by analyzing DNA signatures found in dental plaque. They also monitored whether the subjects' bodies were treating the bacteria as a threat. If so, the swab would show higher levels of cytokines, compounds the body produces to fight infection. 

The results of the study were published in the journal Infection and Immunity. 

"When you compare a smoker and nonsmoker, there's a distinct difference," said Kumar. "The first thing you notice is that the basic 'lawn,' which would normally contain thriving populations made of just a few types of helpful bacteria, is absent in smokers." 

The team found that for nonsmokers, bacterial communities regain a similar balance of species to the communities that were scraped away during cleaning. Disease-associated bacteria are largely absent, and low levels of cytokines show that the body is not treating the helpful biofilms as a threat. 

"By contrast," said Kumar, "smokers start getting colonized by pathogens -- bacteria that we know are harmful -- within 24 hours. It takes longer for smokers to form a stable microbial community, and when they do, it's a pathogen-rich community." 

Smokers also have higher levels of cytokines, indicating that the body is mounting defenses against infection. Clinically, this immune response takes the form of red, swollen gums -- called gingivitis -- that can lead to the irreversible bone loss of periodontitis. 

In smokers, however, the body is not just trying to fight off harmful bacteria. The types of cytokines in smokers' gum swabs showed the researchers that smokers' bodies were treating even healthy bacteria as threatening. 

Although they do not yet understand the mechanisms behind these results, Kumar and her team suspect that smoking is confusing the normal communication that goes on between healthy bacterial communities and their human hosts. 

Practically speaking, these findings have clear implications for patient care, according to Kumar. 

"It has to drive how we treat the smoking population," she said. "They need a more aggressive form of treatment, because even after a professional cleaning, they're still at a very high risk for getting these pathogens back in their mouths right away. 

"Dentists don't often talk to their patients about smoking cessation," she continued. "These results show that dentists should take a really active role in helping patients to get the support they need to quit." 

For Kumar, who is a practicing periodontist as well as a teaching professor, doing research has changed how she treats her patients. "I tell them about our studies, about the bacteria and the host response, and I say, 'Hey -- I'm really scared for you.' Patients have been more willing to listen, and two actually quit." 

Kumar's collaborators include Chad Matthews and Vinayak Joshi of Ohio State's College of Dentistry as well as Marko de Jager and Marcelo Aspiras of Philips Oral Healthcare. The research was sponsored by a grant from Philips Oral Healthcare. 

Author: Maureen Langlois | Source: Ohio State University [February 15, 2012]

2/14/2012

Lust makes you smarter and seven deadly sins are good for you


Good news for lovers - the seven deadly sins, including Lust, are good for you. University of Melbourne social psychologist Dr Simon Laham uses modern research to make a compelling case for the virtues of living a sinful life in his latest book The Joy of Sin: The Psychology of the Seven Deadlies (And Why They Are So Good For You). 


Dr Laham argues that human behavior is more complex than simple “good” or “evil” and shows that Pride, Lust, Gluttony, Greed, Envy, Sloth and Anger are not soul-condemning offenses but ever-present and, if indulged wisely, largely functional human tendencies. 

In particular, for lovers intent on indulging in a bit of lust this Valentine’s Day, Dr Laham reveals: 

Lust can make you smarter. Research shows that people with sex on the brain are better at solving ‘analytic thinking’ problems. Lust triggers us to become focused on the present and the details of satisfying a rather pressing current goal, namely sex. 

Lust makes you helpful. Lust is so well designed to fulfill its function of getting people into bed, that it leads us to behave in ways that potential partners will find more attractive. 

Lust builds love. Research shows that lustful participants are more likely to display a range of loving, relationship maintenance strategies – like adopting constructive conflict resolution strategies – to increase the chances of sex in the future. 

Gluttony - People who have eaten a piece of cake are more likely to donate to charity. 

Greed – Money can buy you happiness as long as you spend it the right way. Studies show that people are happier when they spend their money on experiences rather than material possessions. 

Sloth - The ultimate slothful states, sleep and even napping, improve your memory and make you more insightful. Research has also shown that slowing down makes you more helpful. Studies show that in cities where people walk more slowly, such as Bakersfield, California, pedestrians are more likely to stop and offer help. 

Anger - Anger triggers an oppositional mindset, which makes people more willing to entertain beliefs contrary to their own. In addition, angry negotiators tend to be more likely to get what they want in a negotiation. 

Envy - Comparing yourself to those better off than you can lead to boosts in mood, self-image and creativity. School students who compared themselves to superior students got better grades. 

Pride – Proud people persist longer at difficult tasks and adopt leadership roles. Studies show that the proud are more liked. 

Dr Laham said that when you take a look at the evidence, the seven deadly sins can really serve us quite well despite being told for centuries they are bad for us. 

“This is great news for Australians as a recent BBC poll deemed Australia the most sinful country on earth,” he said.
 

So research now shows that it’s ok to indulge in a bit of Lust this Valentine’s Day and you’ll be better off for it. In fact, indulge in all seven deadly sins and you might just be a little smarter, happier and more successful. 

Source: University of Melbourne [February 13, 2012]

2/13/2012

Computer programs that think like humans


Intelligence – what does it really mean? In the 1800s, it meant that you were good at memorising things, and today intelligence is measured through IQ tests where the average score for humans is 100. Researchers at the Department of Philosophy, Linguistics and Theory of Science at the University of Gothenburg, Sweden, have created a computer program that can score 150. 


IQ tests are based on two types of problems: progressive matrices, which test the ability to see patterns in pictures, and number sequences, which test the ability to see patterns in numbers. The most common math computer programs score below 100 on IQ tests with number sequences. For Claes Strannegård, researcher at the Department of Philosophy, Linguistics and Theory of Science, this was a reason to try to design 'smarter' computer programs. 

"We're trying to make programs that can discover the same types of patterns that humans can see," he says. 

The research group, which consists of Claes Strannegård, Fredrik Engström, Rahim Nizamani and three students working on their degree projects, believes that number sequence problems are only partly a matter of mathematics – psychology is important too. Strannegård demonstrates this point: 

"1, 2, …, what comes next? Most people would say 3, but it could also be a repeating sequence like 1, 2, 1 or a doubling sequence like 1, 2, 4. Neither of these alternatives is more mathematically correct than the others. What it comes down to is that most people have learned the 1-2-3 pattern." 
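The ambiguity Strannegård describes can be sketched in a few lines of code. The snippet below is an illustration only, not the Gothenburg group's actual program: it checks a short sequence against three simple pattern hypotheses and shows that each yields a different, equally "correct" continuation of 1, 2, ...

```python
# Illustrative sketch only -- not the Gothenburg group's program.
# For a short number sequence, list the next value predicted by a few
# simple pattern hypotheses. Several hypotheses can fit at once, which
# is why "1, 2, ..." has no single mathematically correct continuation.

def candidate_continuations(seq):
    candidates = {}
    # Arithmetic hypothesis: a constant difference between terms.
    diffs = {b - a for a, b in zip(seq, seq[1:])}
    if len(diffs) == 1:
        candidates["arithmetic"] = seq[-1] + diffs.pop()
    # Geometric hypothesis: a constant ratio between terms.
    if all(x != 0 for x in seq):
        ratios = {b / a for a, b in zip(seq, seq[1:])}
        if len(ratios) == 1:
            candidates["geometric"] = seq[-1] * ratios.pop()
    # Repetition hypothesis: the sequence cycles back to its start.
    candidates["repeating"] = seq[0]
    return candidates

print(candidate_continuations([1, 2]))
# arithmetic gives 3, geometric gives 4.0 (doubling), repeating gives 1
```

Picking "3" over the other candidates is a matter of which hypothesis a person finds most natural, not of mathematics, which is the psychological component the researchers build into their programs.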

The group is therefore using a psychological model of human patterns in their computer programs, into which they have integrated a mathematical model of human-like problem solving. The program that solves progressive matrices scores IQ 100 and has the unique ability to solve the problems without access to any response alternatives. The group has improved the program that specialises in number sequences to the point where it can now ace the tests, implying an IQ of at least 150. 

"Our programs are beating the conventional math programs because we are combining mathematics and psychology. Our method can potentially be used to identify patterns in any data with a psychological component, such as financial data. But it is not as good at finding patterns in more science-type data, such as weather data, since then the human psyche is not involved," says Strannegård. 

The research group has recently started collaborating with the Department of Psychology at Stockholm University, with the goal of developing new IQ tests with different levels of difficulty. 

"We have developed a pretty good understanding of how the tests work. Now we want to divide them into different levels of difficulty and design new types of tests, which we can then use to design computer programs for people who want to practice their problem solving ability," says Strannegård. 

Source: University of Gothenburg [February 13, 2012]

2/10/2012

What does love look like?


What does love look like? A dozen roses delivered on an ordinary weekday? Breakfast in bed? Or just a knowing glance between lovers? 


While outward displays of love are fairly easy to discern, a researcher in the College of Behavioral and Social Sciences is taking a decidedly "inward" approach to documenting this most complex of human emotions. 

Sandra Langeslag, an expert in biological psychology, is using brain-imaging tools such as electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) to identify and assess the neurocognition of romantic love. 

"I want to understand how the brain works when humans are attracted to one another," says Langeslag, a postdoctoral research fellow in the Laboratory of Cognition and Emotion. 

By associating specific brain impulses with psychological data related to romantic love, Langeslag is at the forefront of research bridging the gap between emotion -- a state of mind derived from one's circumstances -- and cognition, which is acquiring knowledge through thought and experience. 

"The traditional [scientific] view is that cognition and emotion are two different things," says Luiz Pessoa, a professor of psychology and director of the lab. "But we believe they interact to a great extent -- that there aren't two completely separate sets of brain areas related to each." 

Langeslag's research began a decade ago, when she was still an undergraduate in her native Netherlands. She already had a strong interest in biological psychology, and decided to explore the biological aspects of romantic infatuation and attachment when she herself fell in love. 

"That got me to thinking, 'Wow, what's happening to me? Why do I feel the way I feel?'" she recalls. 

Langeslag wrote her master's thesis on the physiological traits of romantic love, conducting research on fellow students in the Netherlands who had identified themselves as being in love. The testing involved a battery of questionnaires combined with brain imaging data supplied by EEG, which measures electrical impulses, and fMRI, which measures precise areas of blood flow. 

By showing the test participants a series of images, whether of their beloved, a friend or a stranger, Langeslag was able to detect a specific brainwave pattern called P3 that represents "motivated attention." The P3 brainwave was very pronounced when subjects were shown pictures of their own partner, she says, leading her to believe that people pay close attention to their sweethearts, even in the face of distractions like friends dropping in or an attractive stranger entering their field of view. 

Langeslag did similar research while pursuing her doctorate in emotional memory, also in the Netherlands. Now at Maryland, she continues to analyze her data collected from overseas while she and Pessoa examine another aspect of cognition and emotion: how a person's reasoning ability is affected by negative stimuli, such as the threat of receiving an electrical shock. 

Ultimately, Langeslag believes her neurocognitive research on romantic love will help other scientists studying the biological aspects of romantic behavior. 

"Love is not a psychiatric disorder, but people that are in love are kind of crazy," she says. "Although we're not looking to cure anyone from being in love, we should continue to explore every aspect of this that we can." 

Author: Tom Ventsias | Source: University of Maryland [February 10, 2012]

2/08/2012

Right hand or left? How the brain solves a perceptual puzzle


When you see a picture of a hand, how do you know whether it's a right or left hand? This "hand laterality" problem may seem obscure, but it reveals a lot about how the brain sorts out confusing perceptions. Now, a study which will be published in a forthcoming issue of Psychological Science, a journal published by the Association for Psychological Science, challenges the long-held consensus about how we solve this problem. 


"For decades, the theory was that you use your motor imagination," says Shivakumar Viswanathan, who conducted the study with University of California Santa Barbara colleagues Courtney Fritz and Scott T. Grafton. Judging from response times, psychologists thought we imagine flipping a mental image of each of our own hands to find the one matching the picture. These imagined movements were thought to recruit the same brain processes used to command muscles to move—a high-level cognitive feat. 

The study, however, finds that the brain is adept at decoding a left or right hand without these mental gymnastics. Judging laterality is "a low-level sensory problem that uses processes that bring different senses into register"— a process called binding, says Viswanathan. Seeing a hand of unknown laterality leads the brain to bind the seen hand to the correct felt hand. If they are still out of register because of their conflicting positions, an illusory movement arises from the brain's attempt to bring the seen and felt hand into the same position. But "this feeling of moving only comes after you already know which hand it is." 

In the study, participants couldn't see their own hands, which were held palm down. They saw hand shapes tilted at different angles, with a colored dot on them indicating a palm-up or down posture. One group of participants saw the shape first and then the dot; and the other, the dot first. Participants in both groups put the shape and dot together mentally and indicated which hand it was by pushing a button with that hand. 

However, when the shape and dot were shown simultaneously, participants in the first group felt movements of their right hands when seeing a left hand and vice versa; the other group always felt a movement of the correct hand. This behavioral difference (which experimenters gleaned from response time) was due to differences in participants' perception of the seen hand—establishing that an earlier sensory process made the decision. 

In a second experiment, participants were told which hand it was and had to judge whether its palm was down or up, indicating their answer with one hand only. This time, the illusory hand-movement occurred only when the seen hand-shape matched that of the participant's own palm-down responding hand, but not otherwise. Even though no right/left judgments were required, the response was dominated by an automatic binding of the seen and felt hands, and the illusory movement followed, says Viswanathan. 

The study helps us understand the experience of amputees, who sometimes sense an uncontrollable itch or clenching in the "phantom" body part. Showing the opposite hand or leg in a mirror allows the patient to "feel" the absent limb and mentally relieve the discomfort—a "binding" of vision and feeling. 

Source: Association for Psychological Science [February 08, 2012]

2/04/2012

Stressed kids more likely to become obese


The more ongoing stress children are exposed to, the greater the odds they will become obese by adolescence, reports Cornell environmental psychologist Gary Evans in the journal Pediatrics (129:1). 


Nine-year-old children who were chronically exposed to such stressors as poverty, crowded housing and family turmoil gained more weight and were significantly heavier by age 13 than they would have been otherwise, the study found. The reason, Evans and his co-authors suggest, is that ongoing stress makes it tougher for children to control their behavior and emotions -- or self-regulate. That, in turn, can lead to obesity by their teen years. 

"These children are heavier, and they gain weight faster as they grow up. A very good predictor of adults' ability to follow healthy habits is their ability to self-regulate. It seems reasonable that the origins of that are probably in childhood. This [research] is starting to lay that out," said Evans, the Elizabeth Lee Vincent Professor of Human Ecology in the Departments of Design and Environmental Analysis and of Human Development in Cornell's College of Human Ecology. 

Evans conducted the study with former students Thomas Fuller-Rowell, Ph.D. '10, now a Robert Wood Johnson postdoctoral fellow at the University of Wisconsin-Madison, and Stacey Doan, Ph.D. '10, an assistant professor of psychology at Boston University. 

The researchers measured the height and weight of 244 9-year-olds in rural New York state and calculated their various physical and psycho-social stressors -- for example, exposure to violence, living in a substandard house or having no access to such resources as books. They also measured the children's ability to delay gratification by offering them a choice between waiting for a large plate of candy versus having a medium plate immediately. The researchers measured the children's height and weight again four years later. 

While the study doesn't prove that a child's inability to delay gratification causes her to gain weight, there's strong evidence to suggest that it does, Evans said. First, previous studies have shown that chronic stress is linked to weight gain in children and teenagers, and that children eat more sugary, fatty foods when stressed. 

Second, there's a plausible neurocognitive mechanism that may help better understand this behavior, Evans said. "There's some evidence that parts of the brain that are vulnerable and sensitive to stress, particularly early in life, are some of the same parts involved in this self-regulatory behavior." 

The study has implications for education policies such as No Child Left Behind that emphasize testing cognitive abilities but ignore children's ability to control their behavior and emotions, Evans said. 

"A child's ability to self-regulate is not just predictive of things like whether you're going to have trouble with weight -- it predicts grades, graduating from high school. A 4-year-old's ability to self-regulate even predicts SAT scores. This is a very powerful phenomenon," he said. 

The findings also have implications for interventions and policies aimed at reducing individual stressors. "If it's the cumulative impact of stress on these families that is important, that means an intervention that only looks at one stressor -- say, just drug abuse, which is how most interventions are designed -- is doomed to fail," Evans concluded. 

Author: Susan Kelley | Source: Cornell University [January 21, 2012]

2/02/2012

Could brain size determine whether you are good at maintaining friendships?


Researchers are suggesting that there is a link between the number of friends you have and the size of the region of the brain – known as the orbital prefrontal cortex – that is found just above the eyes. A new study, published today in the journal Proceedings of the Royal Society B, shows that this brain region is bigger in people who have a larger number of friendships. 


The research was carried out as part of the British Academy Centenary ‘Lucy to Language’ project, led by Professor Robin Dunbar of the University of Oxford in a collaboration with Dr Penny Lewis at The University of Manchester, Dr Joanne Powell and Dr Marta Garcia-Finana at Liverpool University, and Professor Neil Roberts at Edinburgh University. 

The study suggests that we need to employ a set of cognitive skills to maintain a number of friends (and the keyword is ‘friends’ as opposed to just the total number of people we know). These skills are described by social scientists as ‘mentalising’ or ‘mind-reading’ – a capacity to understand what another person is thinking, which is crucial to our ability to handle our complex social world, including the ability to hold conversations with one another. This study, for the first time, suggests that our competency in these skills is determined by the size of key regions of our brains (in particular, the frontal lobe). 

Professor Dunbar, from the Institute of Cognitive and Evolutionary Anthropology, explained: “’Mentalising’ is where one individual is able to follow a natural hierarchy involving other individuals’ mind states. For example, in the play ‘Othello’, Shakespeare manages to keep track of five separate mental states: he intended that his audience believe that Iago wants Othello to suppose that Desdemona loves Cassio (each verb of intention marking a different mind state). Being able to maintain five separate individuals’ mental states is the natural upper limit for most adults.” 

The researchers took anatomical MR images of the brains of 40 volunteers at the Magnetic Resonance and Image Analysis Research Centre at the University of Liverpool to measure the size of the prefrontal cortex, the part of the brain used in high-level thinking. Participants were asked to make a list of everyone they had had social, as opposed to professional, contact with over the previous seven days. They also took a test to determine their competency in mentalising.  

Professor Dunbar said: “We found that individuals who had more friends did better on mentalising tasks and had more neural volume in the orbital frontal cortex, the part of the forebrain immediately above the eyes. Understanding this link between an individual’s brain size and the number of friends they have helps us understand the mechanisms that have led to humans developing bigger brains than other primate species. The frontal lobes of the brain, in particular, have enlarged dramatically in humans over the last half million years.” 

Dr Penny Lewis, from the School of Psychological Sciences at The University of Manchester, said: “Both the number of friends people had and their ability to think about other people’s feelings predicted the size of this same small brain area. This not only suggests that we’ve found a region which is critical for sociality, it also shows that the link between brain anatomy and social success is much more direct than previously believed.” 

Dr Joanne Powell, from the Department of Psychology, University of Liverpool, said: “Perhaps the most important finding of our study is that we have been able to show that the relationship between brain size and social network size is mediated by mentalising skills. What this tells us is that the size of your brain determines your social skills, and it is these that allow you to have many friends.” 

Dr Lewis added: “This research is particularly important because it provides the strongest support to date for the social brain hypothesis – that is, the idea that human brains evolved to accommodate the social demands of living in a big group. Cross-species comparisons between various monkey brains have already supported this, but our work is some of the first to show that people with larger social groups actually have more neural matter in this particular bit of cortex. It looks as though size really does matter when it comes to social success.” 

Source: University of Manchester [February 02, 2012]

Men Behaving Nicely: Selfless Acts by Men Increase When Attractive Women Are Nearby


Men put on their best behaviour when attractive ladies are close by. When the scenario is reversed, however, the behaviour of women remains the same. These findings were published February 2, 2012, in the British Psychological Society's British Journal of Psychology via the Wiley Online Library. 


The research, which also found that the number of kind and selfless acts by men corresponded to the attractiveness of ladies, was undertaken by Dr Wendy Iredale of Sheffield Hallam University and Mark Van Vugt of the VU University in Amsterdam and the University of Oxford. 

Two experiments were undertaken. For the first, 65 men and 65 women, all of an average age of 21, anonymously played a cooperation game where they could donate money via a computer program to a group fund. Donations were selfless acts, as all other players would benefit from the fund, whilst the donor wouldn't necessarily receive anything in return. 

Players did not know who they were playing with. They were observed by either someone of the same sex or the opposite sex -- two physically attractive volunteers, one man and one woman. Men were found to do significantly more good deeds when observed by the opposite sex, whilst the number of good deeds done by women didn't change regardless of who was observing. 

For the second experiment, groups of men were asked to make a number of public donations. Donations increased when the men were observed by an attractive woman, with the men actively competing with one another; when the observer was another man, donations didn't increase. 

Dr Iredale said: "The research shows that good deeds among men increase when presented with an opportunity to copulate. Theoretically, this suggests that a good deed is the human equivalent of the peacock's tail. Practically, this research shows how societies can encourage selfless acts." 

Source: British Psychological Society (BPS) [February 02, 2012]

Understanding how bacteria come back from the dead


Salmonella remains a serious cause of food poisoning in the UK and throughout the EU, in part due to its ability to thrive and quickly adapt to the different environments in which it can grow. New research involving a team of IFR scientists, funded by BBSRC, has taken the first detailed look at what Salmonella does when it enters a new environment, which could provide clues to finding new ways of reducing transmission through the food chain and preventing human illness. 


Bacteria can multiply rapidly, potentially doubling every 20 minutes in ideal conditions. However, this exponential growth phase is preceded by a period known as lag phase, where no increase in cell number is seen. Lag phase was first described in the 19th Century, and was assumed to be needed by bacteria to prepare to exploit new environmental conditions. Beyond this, surprisingly little was known about lag phase, other than that bacteria are metabolically active in this period. But exactly what are bacteria doing physiologically during this period? 
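The growth kinetics described above, a flat lag phase followed by doubling roughly every 20 minutes, can be captured in a simple lag-then-exponential model. The two-hour lag comes from the IFR system described below and the 20-minute doubling time from the text; the starting population is an arbitrary illustrative value:

```python
def cell_count(t_min, n0=1000.0, lag_min=120, doubling_min=20):
    """Cell count at time t (minutes): constant during lag phase,
    then doubling every `doubling_min` minutes thereafter."""
    if t_min <= lag_min:
        return n0
    return n0 * 2 ** ((t_min - lag_min) / doubling_min)

print(cell_count(60))   # still in lag phase: population unchanged
print(cell_count(180))  # 1 h after lag ends: 3 doublings
```

The model makes the key experimental point visible: viable counts alone cannot reveal what happens during the first 120 minutes, which is why the gene-expression and nutrient-uptake measurements described next were needed.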

To fill in this knowledge gap researchers at IFR, along with colleagues at Campden BRI, a membership-based organisation carrying out research and development for the food and drinks industry, have developed a simple and robust system for studying the biology of Salmonella during lag phase. In this system, lag phase lasts about two hours, but the cells sense their new environment remarkably quickly, and within four minutes switch on a specific set of genes, including some that control the uptake of specific nutrients. 

For example, one nutrient accumulated is phosphate which is needed for many cellular processes, and a gene encoding a phosphate transporter was the most upregulated gene during the first four minutes of lag phase. The cellular uptake mechanisms for iron were also activated during lag phase, and are needed for key aspects of bacterial metabolism. This increase in iron leads to a short term sensitivity to oxidative damage. Manganese and calcium are also accumulated in lag phase, but are lost from the cell during exponential growth. 

This new understanding of Salmonella metabolism during lag phase shows how rapidly Salmonella senses favourable conditions and builds up the materials needed for growth. The study was carried out through two BBSRC-CASE studentships, which were partially funded by Campden BRI. 

Future research to work out the regulatory mechanisms behind these processes and the switch from lag phase to exponential growth will tell us more about how Salmonella can flourish in different environments, and could point to new ways of controlling its transmission in the food chain.  

Source: Norwich BioScience Institutes [February 02, 2012]

2/01/2012

Why the brain is more reluctant to function as we age


New findings, led by neuroscientists at the University of Bristol and published this week in the journal Neurobiology of Aging, reveal a novel mechanism through which the brain may become more reluctant to function as we grow older. 


It is not fully understood why the brain's cognitive functions such as memory and speech decline as we age, although work published this year suggests cognitive decline can be detectable before 50 years of age. The research, led by Professor Andy Randall and Dr Jon Brown from the University's School of Physiology and Pharmacology, identified a novel cellular mechanism underpinning changes to the activity of neurones which may underlie cognitive decline during normal healthy aging. 

The brain largely uses electrical signals to encode and convey information. Modifications to this electrical activity are likely to underpin age-dependent changes to cognitive abilities. 

The researchers examined the brain's electrical activity by making recordings of electrical signals in single cells of the hippocampus, a structure with a crucial role in cognitive function. In this way they characterised what is known as "neuronal excitability" — this is a descriptor of how easy it is to produce brief, but very large, electrical signals called action potentials; these occur in practically all nerve cells and are absolutely essential for communication within all the circuits of the nervous system. 

Action potentials are triggered near the neurone's cell body and once produced travel rapidly through the massively branching structure of the nerve cell, along the way activating the synapses the nerve cell makes with the numerous other nerve cells to which it is connected. 

The Bristol group identified that in the aged brain it is more difficult to make hippocampal neurones generate action potentials. Furthermore, they demonstrated that this relative reluctance to produce action potentials arises from changes to the activation properties of membrane proteins called sodium channels, which mediate the rapid upstroke of the action potential by allowing a flow of sodium ions into neurones. 

Professor Randall, Professor in Applied Neurophysiology said: "Much of our work is about understanding dysfunctional electrical signalling in the diseased brain, in particular Alzheimer's disease. We began to question, however, why even the healthy brain can slow down once you reach my age. Previous investigations elsewhere have described age-related changes in processes that are triggered by action potentials, but our findings are significant because they show that generating the action potential in the first place is harder work in aged brain cells. 

"Also by identifying sodium channels as the likely culprit for this reluctance to produce action potentials, our work even points to ways in which we might be able to modify age-related changes to neuronal excitability, and by inference cognitive ability." 

Source: University of Bristol [February 01, 2012]

Scientists decode how the brain hears words


US scientists said Wednesday they have found a way to decode how the brain hears words, in what researchers described as a major step toward one day helping people communicate after paralysis or stroke. 

Positron emission tomography scans of the brains of healthy adults, showing low (L) and high (R) levels of beta-amyloid protein [Credit: Center for Vital Longevity, The University of Texas at Dallas/AFP]
By placing electrodes on the brains of research subjects and then having them listen to conversations, scientists were able to analyze the sound frequencies registered and figure out which words they were hearing. 

"We were focused on how the brain processes the sounds of speech," researcher Brian Pasley of the Helen Wills Neuroscience Institute at the University of California Berkeley told AFP. 

"Most of the information in speech is between 1 and 8,000 hertz. Essentially the brain analyzes those different sound frequencies in somewhat separate locations." 

By tracking how and where the brain registered sounds in the temporal lobe -- the center of the auditory system -- scientists were able to map out the words and then recreate them as heard by the brain. 

"When a particular brain site is being activated, we know that roughly corresponds to some sound frequency that the patient is actually listening to," Pasley said. 

"So we could map that out to an extent that would allow us to use that brain activity to resynthesize the sound from the frequencies we were guessing." 

One word the researchers mapped was "structure." The high-frequency "s" sound showed up as a certain pattern in the brain, while the lower harmonics of the "u" sound appeared as a different pattern. 

"There is to some extent a correspondence between these features of sound and the brain activity that they cause," and putting together the physical registry in the brain helped rebuild the words, Pasley explained. 
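The approach Pasley describes, mapping recorded neural activity back onto sound frequencies and resynthesising from them, can be framed as a linear reconstruction problem. The sketch below uses entirely synthetic data and ordinary least squares, not the authors' actual model or recordings, to show the core idea of fitting a decoder from electrode signals to a spectrogram:

```python
import numpy as np

rng = np.random.default_rng(1)
n_time, n_elec, n_freq = 500, 16, 8

# Synthetic "spectrogram": power in 8 frequency bands over time.
spec = rng.random((n_time, n_freq))

# Pretend each electrode responds linearly to the frequency bands,
# plus noise; the decoder must invert this forward mapping.
true_w = rng.normal(size=(n_freq, n_elec))
neural = spec @ true_w + 0.1 * rng.normal(size=(n_time, n_elec))

# Fit a linear reconstruction (least squares): neural -> spectrogram.
w_hat, *_ = np.linalg.lstsq(neural, spec, rcond=None)
recon = neural @ w_hat

# How well does the reconstruction match the original spectrogram?
r = np.corrcoef(spec.ravel(), recon.ravel())[0, 1]
print(f"reconstruction correlation r = {r:.2f}")
```

In the real study the reconstructed spectrogram, not the synthetic one here, is what gets turned back into audible sound, which is what allowed the researchers to "hear" the words the brain registered.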

The work builds on previous research in ferrets, in which scientists read to the animals and recorded their brain activity. 

They were able to decode which words the creatures heard even though the ferrets themselves didn't understand the words. 

The next step for researchers is to figure out just how similar the process of hearing sounds may be to the process of imagining words and sounds. 

That information could one day help scientists determine what people want to say when they cannot physically speak. 

Some previous research has suggested there may be similarities, but much more work needs to be done, Pasley said. 

"This is huge for patients who have damage to their speech mechanisms because of a stroke or Lou Gehrig's disease and can't speak," co-author Robert Knight, a UC Berkeley professor of psychology and neuroscience, said in a statement. 

"If you could eventually reconstruct imagined conversations from brain activity, thousands of people could benefit." 

Participating researchers came from the University of Maryland, UC Berkeley and Johns Hopkins University in Baltimore, Maryland. 

The study appears in the January 31 edition of the open access journal PLoS Biology. 

Author: Kerry Sheridan | Source: AFP [February 01, 2012]
