Sunday, January 23, 2011

Long-Distance Migration May Help Reduce Infectious Disease Risks for Many Animal Species

It's a common assumption that animal migration, like human travel across the globe, can transport pathogens long distances, in some cases increasing disease risks to humans. West Nile virus, for example, spread rapidly along the East Coast of the U.S., most likely due to the movements of migratory birds.
But in a paper just published in the journal Science, researchers in the University of Georgia Odum School of Ecology report that in some cases, animal migrations could actually help reduce the spread and prevalence of disease and may even promote the evolution of less-virulent disease strains.
Every year, billions of animals migrate, some taking months to travel thousands of miles across the globe. Along the way, they can encounter a broad range of pathogens while using different habitats and resources. Stopover points, where animals rest and refuel, are often shared by multiple species in large aggregations, allowing diseases to spread among them.
But, according to Odum School associate professor Sonia Altizer and her co-authors, Odum School postdoctoral associates Rebecca Bartel and Barbara Han, migration can also help limit the spread of some pathogens.
Some kinds of parasites have transmission stages that can build up in the environment where host animals live, and migration allows the hosts to periodically escape these parasite-laden habitats. While hosts are gone, parasite numbers become greatly reduced so that the migrating animals find a largely disease-free habitat when they return. Long migratory journeys can also weed infected animals from the population: imagine running a marathon with the flu. This not only prevents those individuals from spreading disease to others, it also helps to eliminate some of the most virulent strains of pathogens.
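The two mechanisms described here are sometimes called "migratory escape" (leaving parasite-laden habitat behind) and "migratory culling" (the loss of infected animals along the way). As a purely conceptual illustration, the toy simulation below shows how these effects can keep infection low in a migratory population; every parameter value in it is an invented assumption, not part of the authors' analysis.

    # Toy model of "migratory escape" and "migratory culling".
    # All numbers are invented for illustration; this is not the model in the paper.

    def simulate(migratory, years=5):
        buildup = 0.4          # parasite accumulation per month hosts are present
        decay = 0.3            # monthly decay of parasites left in the environment
        infection_rate = 0.05  # new infections per unit of environmental parasite load
        migration_loss = 0.8   # fraction of infected hosts that fail the journey
        parasites, infected = 1.0, 0.10
        for year in range(years):
            for month in range(12):
                resident = (not migratory) or (3 <= month < 9)  # breeding season only
                if resident:
                    parasites += buildup * (1 + infected)       # hosts shed parasites
                    infected = min(1.0, infected + infection_rate * parasites)
                parasites *= (1 - decay)                        # environmental decay
            if migratory:
                infected *= (1 - migration_loss)                # migratory culling
        return infected

    print("migratory population, infected fraction:", round(simulate(True), 2))
    print("resident population, infected fraction: ", round(simulate(False), 2))

Run with these made-up numbers, the resident population saturates with infection while the migratory one stays largely disease-free, mirroring the pattern reported for monarch butterflies later in the article.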
"By placing disease in an ecological context," said Odum School dean John Gittleman, "you not only see counterintuitive patterns but also understand advantages to disease transmission. This is a classic example of disease ecology at its best."
Altizer's long-term research on monarch butterflies and a protozoan parasite that infects them provides an excellent demonstration of migration's effects on the spread of infectious disease. Monarchs in eastern North America migrate long distances, from as far north as Canada, to central Mexico, where they spend the winter. Monarchs in other parts of the world migrate shorter distances. In locations with mild year-round climates, such as southern Florida and Hawaii, monarchs do not migrate at all. Work by Altizer and others in her lab showed that parasite prevalence is lowest in the eastern North American population, which migrates the farthest distance, and highest in non-migratory populations. This could be because infected monarchs do not migrate successfully, as suggested by tethered-flight experiments with captive butterflies, or because parasites build up in habitats where monarchs breed year-round. Other work showed that parasites isolated from monarchs that flew the longest were less virulent than those found in monarchs that flew shorter distances or didn't migrate at all, suggesting that monarchs with highly virulent parasites didn't survive the longest migrations.
"Taken together, these findings tell us that migration is important for keeping monarch populations healthy -- a result that could apply to many other migratory animal species," said Altizer.
But for monarchs, and many other species, migration is now considered an endangered phenomenon. Deforestation, urbanization and the spread of agriculture have eliminated many stopover sites, and artificial barriers such as dams and fences have blocked migration routes for other species. These changes can artificially elevate animal densities and facilitate contact between wildlife, livestock and humans, increasing the risk that pathogens will spread across species. As co-author Han noted, "A lot of migratory species are unfairly blamed for spreading infections to humans, but there are just as many examples suggesting the opposite -- that humans are responsible for creating conditions that increase disease in migratory species."
And as the climate warms, species like the monarch may no longer need to undertake the arduous migratory journey to their wintering grounds. With food resources available year-round, some species may shorten or give up their migrations altogether -- prolonging their exposure to parasites in the environment, raising the rates of infection and favoring the evolution of more virulent disease strains. "Migration is a strategy that has evolved over millions of years in response to selection pressures driven by resources, predators and lethal parasitic infections -- any changes to this strategy could translate to changes in disease dynamics," said Han.
"There is an urgent need for more study of pathogen dynamics in migratory species and how human activities affect those dynamics," Altizer said. The paper concludes with an outline of challenges and questions for future research. "We need to learn more in order to make decisions about the conservation and management of wildlife and to predict and mitigate the effects of future outbreaks of infectious diseases."

Friday, January 14, 2011

Painkillers 'cause kidney damage'

Be it body pain, a headache or the pain of a wound, our usual response is to pop a painkiller. Soon the pain subsides and you drift off to a peaceful sleep, unaware that the painkiller is playing tricks on your body's organs.
Experts say that occasional use of painkillers does not cause harm, but regular use may lead to serious health conditions. Surveys have documented the fast-growing problem of painkiller addiction, and many of those affected are not even aware that they are addicted.
Painkillers do not act on any specific body part. They simply dampen the pain messages sent to the brain and blunt the body's reaction. Painkillers suppress pain, but they never cure it: if the pain is the result of an ongoing health condition, it will return after a few hours.
The two major organs damaged by painkillers are the kidneys and the heart.
Heart – According to a recent survey, excess consumption of painkillers can lead to cardiac arrest, the condition in which the heart stops circulating blood to the body. Excess consumption of painkillers can also hamper normal breathing, which in turn causes a drop in oxygen supply.
This low oxygen level can disturb the heart's rhythm, producing a condition called ventricular fibrillation, in which the heart continues to work but does not supply enough blood to the body.
Kidney – Pain-relieving medicines are called analgesics, and some analgesics, such as aspirin and ibuprofen, are available without a prescription. Overuse of these drugs can lead to kidney damage: rather than being fully broken down by the liver or the digestive system, they are excreted through the kidneys, and this burden is what causes the damage.
Analgesics can cause two types of kidney damage – acute renal failure and a chronic kidney disease known as analgesic nephropathy.

Other Side Effects Of Painkillers
1. Constipation – Painkillers can disturb your bowel function. If not addressed in time, the resulting constipation can be very painful and lead to other serious conditions.
2. Dizziness – Painkillers sedate the brain and generally make you feel sleepy. With constant use this drowsiness can become persistent, and constant heavy use can lead to mental dullness and depression.
3. Nausea – Some painkillers contain a dose of morphine, which some people do not tolerate well. This may cause nausea that eventually subsides, but continued use of these painkillers may cause more serious problems.
In a few situations, such as after an operation or in times of unbearable pain, painkillers can be used, but if the pain persists it is better to seek medical advice. Masking such persistent problems with painkillers is likely to lead to serious health conditions.

Wednesday, January 12, 2011

Water on Moon Originated from Comets

Researchers at the University of Tennessee, Knoxville, continue to chip away at the mysterious existence of water on the moon -- this time by discovering the origin of lunar water.
Last year, Larry Taylor, a distinguished professor in the Department of Earth and Planetary Sciences, discovered trace amounts of water on the moon. That discovery debunked beliefs held since the return of the first Apollo rocks that the moon was bone-dry.
Then he discovered that water is actually fairly abundant and ubiquitous -- enough so that a human settlement on the moon is not out of the question.
Now, Taylor and a team of researchers have determined the lunar water may have originated from comets smashing into the moon soon after it formed.
His findings will be posted online in the article "Extraterrestrial Hydrogen Isotope Composition of Water in Lunar Rocks" on the website of the scientific journal Nature Geoscience.
Taylor and his fellow researchers conducted their study by analyzing rocks brought back by the Apollo missions. Using secondary ion mass spectrometry, they measured the samples' "water signatures," which point to the possible origin of the water -- and made the surprising discovery that the water on the Earth and the moon are different.
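The "water signature" in question is the hydrogen isotope composition, which geochemists conventionally report in delta notation relative to standard ocean water. The sketch below shows only that standard bookkeeping; the sample ratio in it is a made-up value, not one of the paper's measurements.

    # Delta notation for a hydrogen isotope "water signature".
    # The sample D/H ratio below is invented purely for illustration.

    R_VSMOW = 155.76e-6          # D/H ratio of Vienna Standard Mean Ocean Water

    def delta_D(r_sample):
        """Deviation of a sample's D/H ratio from ocean water, in parts per thousand."""
        return (r_sample / R_VSMOW - 1.0) * 1000.0

    r_sample = 170.0e-6          # hypothetical D/H ratio measured in a lunar sample
    print(f"delta-D = {delta_D(r_sample):+.0f} per mil")
    # A nonzero delta-D means the water's isotopic signature differs from Earth's
    # oceans, which is the kind of difference used to argue for a cometary source.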
"This discovery forces us to go back to square one on the whole formation of the Earth and moon," said Taylor. "Before our research, we thought the Earth and moon had the same volatiles after the Giant Impact, just at greatly different quantities. Our work brings to light another component in the formation that we had not anticipated -- comets."
Scientists believe the moon formed when the nascent Earth collided with a Mars-sized object called Theia, a giant impact that threw material outward to aggregate into the moon. Taylor's article theorizes that at this time there was a great flux of comets, or "dirty icebergs," hitting both the Earth and the moon. The Earth, which already had plenty of water and other volatiles, did not change much. The moon, however, being bone-dry, acquired much of its water supply from these comets.
Taylor's research shows that water has been present throughout the moon's history -- some supplied externally by the solar wind and post-formation comets, and the rest supplied internally during the moon's original formation.
"The water we are looking at is internal," said Taylor. "It was put into the moon during its initial formation, where it existed like a melting pot in space, where cometary materials were added in at small yet significant amounts."
To be precise, the lunar water he has found does not consist of "water" -- the molecule H2O -- as we know it on Earth. Rather, it contains the ingredients for water -- hydrogen and oxygen -- which are liberated to create water when the rocks are heated. The existence of hydrogen and oxygen -- water's ingredients -- on the moon can literally serve as a launch pad for further space exploration.
"This water could allow the moon to be a gas station in the sky," said Taylor. "Spaceships use up to 85 percent of their fuel getting away from Earth's gravity. This means the moon can act as a stepping stone to other planets. Missions can fuel up at the moon, with liquid hydrogen and liquid oxygen from the water, as they head into deeper space, to other places such as Mars."
Taylor collaborated with James P. Greenwood at Wesleyan University in Middletown, Conn.; Shoichi Itoh, Naoya Sakamoto and Hisayoshi Yurimoto at Hokkaido University in Japan; and Paul Warren at the University of California in Los Angeles.

Couch Potatoes Beware: Too Much Time Spent Watching TV Is Harmful to Heart Health

Spending too much leisure time in front of a TV or computer screen appears to dramatically increase the risk for heart disease and premature death from any cause, perhaps regardless of how much exercise one gets, according to a new study published in the January 18, 2011, issue of the Journal of the American College of Cardiology.
Data show that compared to people who spend less than two hours each day on screen-based entertainment like watching TV, using the computer or playing video games, those who devote more than four hours to these activities are more than twice as likely to have a major cardiac event that involves hospitalization, death or both.
The study -- the first to examine the association between screen time and non-fatal as well as fatal cardiovascular events -- also suggests metabolic factors and inflammation may partly explain the link between prolonged sitting and the risks to heart health.
"People who spend excessive amounts of time in front of a screen -- primarily watching TV -- are more likely to die of any cause and suffer heart-related problems," said Emmanuel Stamatakis, PhD, MSc, Department of Epidemiology and Public Health, University College London, United Kingdom. "Our analysis suggests that two or more hours of screen time each day may place someone at greater risk for a cardiac event."
In fact, compared with those spending less than two hours a day on screen-based entertainment, there was a 48% increased risk of all-cause mortality in those spending four or more hours a day and an approximately 125% increase in the risk of cardiovascular events in those spending two or more hours a day. These associations were independent of traditional risk factors such as smoking, hypertension, BMI and social class, as well as of exercise.
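For readers less used to epidemiological phrasing, those percentage increases correspond directly to hazard ratios. The snippet below simply restates the press release's headline numbers as ratios; it is not a reanalysis of the study data.

    # Restating the reported risk increases as hazard ratios (no reanalysis involved).

    def hazard_ratio(percent_increase):
        return 1.0 + percent_increase / 100.0

    hr_mortality = hazard_ratio(48)   # all-cause mortality, 4+ hours of screen time per day
    hr_cvd = hazard_ratio(125)        # cardiovascular events, 2+ hours per day

    print(f"all-cause mortality: {hr_mortality:.2f} times the risk of the under-2-hour group")
    print(f"cardiovascular events: {hr_cvd:.2f} times the risk of the under-2-hour group")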
The findings have prompted the authors to advocate for public health guidelines that expressly address recreational sitting (defined as sitting during non-work hours), especially as a majority of working-age adults spend long periods inactive while commuting or slouched over a desk or computer.
"It is all a matter of habit. Many of us have learned to go back home, turn the TV set on and sit down for several hours -- it's convenient and easy to do. But doing so is bad for the heart and our health in general," said Dr. Stamatakis. "And according to what we know so far, these health risks may not be mitigated by exercise, a finding that underscores the urgent need for public health recommendations to include guidelines for limiting recreational sitting and other sedentary behaviors, in addition to improving physical activity."
Biological mediators also appear to play a role. Data indicate that one fourth of the association between screen time and cardiovascular events was explained collectively by C-reactive protein (CRP), body mass index and high-density lipoprotein cholesterol, suggesting that inflammation and dysregulation of lipids may be one pathway through which prolonged sitting increases the risk of cardiovascular events. CRP, a well-established marker of low-grade inflammation, was approximately two times higher in people spending more than four hours of screen time per day compared to those spending less than two hours a day.
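A common way to arrive at a "proportion explained" figure like this is to compare the strength of the association before and after adjusting for the candidate mediators and see how much it shrinks. The sketch below illustrates that percentage-attenuation arithmetic with hypothetical hazard ratios; it is not necessarily the exact method used in the paper.

    # Percentage-attenuation illustration for mediation by CRP, BMI and HDL cholesterol.
    # Both hazard ratios are hypothetical, chosen only to demonstrate the arithmetic.
    import math

    hr_unadjusted = 2.25   # hypothetical association before adjusting for the mediators
    hr_adjusted = 1.83     # hypothetical association after adjusting for them

    explained = (math.log(hr_unadjusted) - math.log(hr_adjusted)) / math.log(hr_unadjusted)
    print(f"proportion of the association explained: {explained:.0%}")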
Dr. Stamatakis says the next step will be to try to uncover what prolonged sitting does to the human body in the short- and long-term, whether and how exercise can mitigate these consequences, and how to alter lifestyles to reduce sitting and increase movement and exercise.
The present study included 4,512 adults who were respondents of the 2003 Scottish Health Survey, a representative, household-based survey. A total of 325 all-cause deaths and 215 cardiac events occurred during an average of 4.3 years of follow up.
Measurement of "screen time" included self-reported TV/DVD watching, video gaming, as well as leisure-time computer use. The authors also included multiple measures to rule out the possibility that ill people spend more time in front of the screen, as opposed to the other way around. They excluded those who reported a previous cardiovascular event (before baseline) and those who died during the first two years of follow-up, in case underlying disease had forced them to stay indoors and watch TV more often. Dr. Stamatakis and his team also adjusted analyses for indicators of poor health (e.g., diabetes, hypertension).

Sunday, January 9, 2011

Emotional Signals Are Chemically Encoded in Tears, Researchers Find

Emotional crying is a universal, uniquely human behavior. When we cry, we clearly send all sorts of emotional signals. In a paper published online January 6 in Science Express, scientists at the Weizmann Institute have demonstrated that some of these signals are chemically encoded in the tears themselves. Specifically, they found that merely sniffing a woman's tears -- even when the crying woman is not present -- reduces sexual arousal in men.
Humans, like most animals, expel various compounds in body fluids that give off subtle messages to other members of the species. A number of studies in recent years, for instance, have found that substances in human sweat can carry a surprising range of emotional and other signals to those who smell them.
But tears are odorless. In fact, in a first experiment led by Shani Gelstein, Yaara Yeshurun and their colleagues in the lab of Prof. Noam Sobel in the Weizmann Institute's Neurobiology Department, the researchers first obtained emotional tears from female volunteers watching sad movies in a secluded room and then tested whether men could discriminate the smell of these tears from that of saline. The men could not.
In a second experiment, male volunteers sniffed either tears or a control saline solution, and then had these applied under their nostrils on a pad while they made various judgments regarding images of women's faces on a computer screen. The next day, the test was repeated -- the men who were previously exposed to tears getting saline and vice versa. The tests were double blinded, meaning neither the men nor the researchers performing the trials knew what was on the pads. The researchers found that sniffing tears did not influence the men's estimates of sadness or empathy expressed in the faces. To their surprise, however, sniffing tears negatively affected the sex appeal attributed to the faces.
To further explore the finding, male volunteers watched emotional movies after similarly sniffing tears or saline. Throughout the movies, participants were asked to provide self-ratings of mood as they were being monitored for physiological measures of arousal such as skin temperature and heart rate. Self-ratings showed that the subjects' emotional responses to sad movies were no more negative when exposed to women's tears, and the men "smelling" tears showed no more empathy. They did, however, rate their sexual arousal a bit lower. The physiological measures told a clearer story: they revealed a pronounced tear-induced drop in arousal, including a significant dip in testosterone -- a hormone related to sexual arousal.
Finally, in a fourth trial, Sobel and his team repeated the previous experiment within an fMRI machine that allowed them to measure brain activity. The scans revealed a significant reduction in activity levels in brain areas associated with sexual arousal after the subjects had sniffed tears.
Sobel said, "This study raises many interesting questions. What is the chemical involved? Do different kinds of emotional situations send different tear-encoded signals? Are women's tears different from, say, men's tears? Children's tears? This study reinforces the idea that human chemical signals -- even ones we're not conscious of -- affect the behavior of others."
Human emotional crying was especially puzzling to Charles Darwin, who identified functional antecedents to most emotional displays -- for example, the tightening of the mouth in disgust, which he thought originated as a response to tasting spoiled food. But the original purpose of emotional tears eluded him. The current study has offered an answer to this riddle: Tears may serve as a chemosignal. Sobel points out that some rodent tears are known to contain such chemical signals. "The uniquely human behavior of emotional tearing may not be so uniquely human after all," he says.

Major Advance in MRI Allows Much Faster Brain Scans

An international team of physicists and neuroscientists has reported a breakthrough in magnetic resonance imaging that allows brain scans more than seven times faster than currently possible.
 In a paper that appeared Dec. 20 in the journal PLoS ONE, a University of California, Berkeley, physicist and colleagues from the University of Minnesota and Oxford University in the United Kingdom describe two improvements that allow full three-dimensional brain scans in less than half a second, instead of the typical 2 to 3 seconds.
"When we made the first images, it was unbelievable how fast we were going," said first author David Feinberg, a physicist and adjunct professor in UC Berkeley's Helen Wills Neuroscience Institute and president of the company Advanced MRI Technologies in Sebastopol, Calif. "It was like stepping out of a prop plane into a jet plane. It was that magnitude of difference."
For neuroscience, in particular, fast scans are critical for capturing the dynamic activity in the brain.
"When a functional MRI study of the brain is performed, about 30 to 60 images covering the entire 3-D brain are repeated hundreds of times like the frames of a movie but, with fMRI, a 3-D movie," Feinberg said. "By multiplexing the image acquisition for higher speed, a higher frame rate is achieved for more information in a shorter period of time."
"The brain is a moving target, so the more refined you can sample this activity, the better understanding we will have of the real dynamics of what's going on here," added Dr. Marc Raichle, a professor of radiology, neurology, neurobiology, biomedical engineering and psychology at Washington University in St. Louis who has followed Feinberg's work.
Because the technique works on all modern MRI scanners, the impact of the ultrafast imaging technique will be immediate and widespread at research institutions worldwide, Feinberg said. In addition to broadly advancing the field of neural-imaging, the discovery will have an immediate impact on the Human Connectome Project, funded last year by the National Institutes of Health (NIH) to map the connections of the human brain through functional MRI (fMRI) and structural MRI scans of 1,200 healthy adults.
"At the time we submitted our grant proposal for the Human Connectome Project, we had aspirations of acquiring better quality data from our study participants, so this discovery is a tremendous step in helping us accomplish the goals of the project," said Dr. David Van Essen, a neurobiologist at Washington University and co-leader of the project. "It's vital that we get the highest quality imaging data possible, so we can infer accurately the brain's circuitry -- how connections are established, and how they perform."
The faster scans are made possible by combining two technical improvements invented in the past decade that separately boosted scanning speeds two to four times over what was already the fastest MRI technique, echo planar imaging (EPI). Physical limitations of each method prevented further speed improvements, "but together their image accelerations are multiplied," Feinberg said. The team can now obtain brain scans substantially faster than the time reductions reported in their paper and many times faster than the capabilities of today's machines.
Magnetic resonance imaging works by using a magnetic field and radio waves to probe the environment of hydrogen atoms in water molecules in the body. Because hydrogen atoms in blood, for example, respond differently than atoms in bone or tissue, computers can reconstruct the body's interior landscape without the use of penetrating X-rays.
Nearly 20 years ago, however, a new type of MRI called functional MRI (fMRI) was developed to highlight areas of the brain that are using oxygen, and thus presumably engaged in neuronal activity such as thinking. Using echo planar imaging (EPI), fMRI vividly distinguishes oxygenated blood funneling into working areas of the brain from deoxygenated blood in less active areas.
As with standard MRI, fMRI machines create magnetic fields that vary slightly throughout the brain, providing a different magnetic environment for hydrogen atoms in different areas. The differing magnetic field strengths make the spin of each hydrogen atom precess at different rates, so that when a pulse of radio waves is focused on the head, the atoms respond differently depending on location and on their particular environment. Those that absorb radio energy and then release the energy are detected by magnetic coils surrounding the head, and these signals, or "echoes," are used to produce an image of the brain.
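The precession rate referred to here follows the Larmor relation: frequency is proportional to the local magnetic field, which is why a field that varies across the head encodes position as frequency. The short sketch below evaluates that relation for hydrogen; the field strength and offsets are example values, not parameters from the study.

    # Larmor relation for hydrogen: precession frequency scales with field strength,
    # so a spatially varying field maps position onto frequency. Values are examples.

    GAMMA_BAR = 42.58e6     # hertz per tesla for hydrogen (gyromagnetic ratio / 2*pi)

    def larmor_hz(b_tesla):
        return GAMMA_BAR * b_tesla

    b0 = 3.0                                  # tesla, a typical research scanner
    for offset in (-1e-5, 0.0, 1e-5):         # small gradient-induced field offsets, in tesla
        print(f"B = {b0 + offset:.5f} T  ->  {larmor_hz(b0 + offset) / 1e6:.4f} MHz")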
With EPI, a single pulse of radio waves is used to excite the hydrogen atoms, but the magnetic fields are rapidly reversed several times to elicit about 50 to 100 echoes before the atoms settle down. The multiple echoes provide a high-resolution picture of the brain.
In 2002, Feinberg proposed using a sequence of two radio pulses to obtain twice the number of images in the same amount of time. Dubbed simultaneous image refocusing (SIR) EPI, it has proved useful in fMRI and for 3-D imaging of neuronal axonal fiber tracks, though the improvement in scanning speed is limited because with a train of more than four times as many echoes, the signal decays and the image resolution drops.
Another acceleration improvement, multiband excitation of several slices using multiple coil detection, was proposed in the U.K. at about the same time by David Larkmann for spinal imaging. The technique was recently used for fMRI by Steen Moeller and colleagues at the University of Minnesota. This technique, too, had limitations, primarily because the multiple coils are relatively widely spaced and cannot differentiate very closely spaced images.
In collaboration with Essa Yacoub, senior author on the paper, and Kamil Ugurbil, director of the University of Minnesota's Center for Magnetic Resonance Research and co-leader of the Human Connectome Project, Feinberg combined these techniques to get significantly greater acceleration than either technique alone while maintaining the same image resolution.
"With the two methods multiplexed, 10, 12 or 16 images the product of their two acceleration factors were read out in one echo train instead of one image," Feinberg said. "The new method is in the optimization phase and is now substantially faster than the scan times reported in this paper."
The ability to scan the brain in under 400 milliseconds moves fMRI closer to electroencephalography (EEG) for capturing very rapid sequences of events in the brain.
"Other techniques which capture signals derived from neuronal activity, EEG or MEG, have much higher temporal resolution; hundred microsecond neuronal changes. But MRI has always been very slow, with 2 second temporal resolution," Feinberg said. "Now MRI is getting down to a few hundred milliseconds to scan the entire brain, and we are beginning to see neuronal network dynamics with the high spatial resolution of MRI."
The development will impact general fMRI as well as diffusion imaging of axonal fibers in the brain, both of which are needed to achieve the main goal of the Human Connectome Project. Diffusion imaging reveals the axonal fiber networks that are the main nerve connections between areas of the brain, while fMRI shows which areas of the brain are functionally connected, that is, which areas are active together or sequentially during various activities.
"While it simply is not possible to show the billions of synaptic connections in the live human brain, the hope is that understanding patterns of how the normal brain is functionally interacting and structurally connected will lead to insights about diseases that involve miswiring in the brain," Feinberg said.
"We suspect several neurologic and psychiatric disorders, such as autism and schizophrenia, may be brain connectivity disorders, but we don't know what normal connectivity is," Feinberg added. "Although the fMRI and neuronal fiber images do not have the resolution of an electron microscope, the MRI derived Connectome reveals the live human brain and can be combined with genetic and environmental information to identify individual differences in brain circuitry."
Raichle, a collaborator in the NIH Human Connectome Project, is one of the pioneers of "resting state" MRI, in which brain scans are taken of patients not involved in any specific task. He believes that the ongoing spontaneous activity discovered during such scans will tell us about how the brain remains flexible and maintains a degree of homeostasis so that "you know who you are."
"Being able to sample this ongoing activity at increasing temporal fidelity and precision becomes really important for understanding how the brain is doing this," Raichle said. "David is superclever at this kind of technical stuff, and I have been cheering him along, saying that the faster we can go, the better we can understand the brain's spontaneous activity."
The other authors of the PLoS ONE paper are Steen Moeller and Edward Auerbach of the Center for Magnetic Resonance Research at the University of Minnesota Medical School; Sudhir Ramanna of Advanced MRI Technologies; Matt F. Glasser of Washington University; and Karla L. Miller and Stephen M. Smith of the Oxford Centre for Functional MRI of the Brain at the University of Oxford. Feinberg is also affiliated with the UC San Francisco Department of Radiology.
The work was supported by the NIH's Human Connectome Project and by other grants from the NIH and from Advanced MRI Technologies.

Friday, January 7, 2011

Lice DNA Study Shows Humans First Wore Clothes 170,000 Years Ago

A new University of Florida study following the evolution of lice shows modern humans started wearing clothes about 170,000 years ago, adopting a technology that enabled them to migrate successfully out of Africa.

Principal investigator David Reed, associate curator of mammals at the Florida Museum of Natural History on the UF campus, studies lice in modern humans to better understand human evolution and migration patterns. His latest five-year study used DNA sequencing to calculate when clothing lice first began to diverge genetically from human head lice.
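The logic behind such a date is a molecular clock: the more substitutions that separate clothing-louse DNA from head-louse DNA, the longer ago the two lineages diverged. The back-of-the-envelope sketch below illustrates only that basic idea with invented numbers; the actual study relied on far more careful models of mutation rate and population history.

    # Back-of-the-envelope molecular clock: divergence time from sequence difference.
    # Both numbers are invented for illustration, not taken from the study.

    substitution_rate = 1.5e-8    # substitutions per site per year (assumed)
    pairwise_divergence = 0.0051  # fraction of sites differing between the lineages (assumed)

    # Differences accumulate along both diverging lineages, hence the factor of two.
    divergence_time = pairwise_divergence / (2 * substitution_rate)
    print(f"rough divergence estimate: {divergence_time:,.0f} years ago")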
Funded by the National Science Foundation, the study is available online and appears in this month's print edition of Molecular Biology and Evolution.
"We wanted to find another method for pinpointing when humans might have first started wearing clothing," Reed said. "Because they are so well adapted to clothing, we know that body lice or clothing lice almost certainly didn't exist until clothing came about in humans."
The data shows modern humans started wearing clothes about 70,000 years before migrating into colder climates and higher latitudes, which began about 100,000 years ago. This date would be virtually impossible to determine using archaeological data because early clothing would not survive in archaeological sites.
The study also shows humans started wearing clothes well after they lost body hair, which genetic skin-coloration research pinpoints at about 1 million years ago, meaning humans spent a considerable amount of time without body hair and without clothing, Reed said.
"It's interesting to think humans were able to survive in Africa for hundreds of thousands of years without clothing and without body hair, and that it wasn't until they had clothing that modern humans were then moving out of Africa into other parts of the world," Reed said.
Lice are studied because unlike most other parasites, they are stranded on lineages of hosts over long periods of evolutionary time. The relationship allows scientists to learn about evolutionary changes in the host based on changes in the parasite.
Applying unique data sets from lice to human evolution has only developed within the last 20 years, and provides information that could be used in medicine, evolutionary biology, ecology or any number of fields, Reed said.
"It gives the opportunity to study host-switching and invading new hosts -- behaviors seen in emerging infectious diseases that affect humans," Reed said.
A study of clothing lice in 2003 led by Mark Stoneking, a geneticist at the Max Planck Institute in Leipzig, Germany, estimated humans first began wearing clothes about 107,000 years ago. But the UF research includes new data and calculation methods better suited for the question.
"The new result from this lice study is an unexpectedly early date for clothing, much older than the earliest solid archaeological evidence, but it makes sense," said Ian Gilligan, lecturer in the School of Archaeology and Anthropology at The Australian National University. "It means modern humans probably started wearing clothes on a regular basis to keep warm when they were first exposed to Ice Age conditions."
The last Ice Age occurred about 120,000 years ago, but the study's date suggests humans started wearing clothes in the preceding Ice Age 180,000 years ago, according to temperature estimates from ice core studies, Gilligan said. Modern humans first appeared about 200,000 years ago.
Because archaic hominins did not leave descendants of clothing lice for sampling, the study does not explore the possibility archaic hominins outside of Africa were clothed in some fashion 800,000 years ago. But while archaic humans were able to survive for many generations outside Africa, only modern humans persisted there until the present.
"The things that may have made us much more successful in that endeavor hundreds of thousands of years later were technologies like the controlled use of fire, the ability to use clothing, new hunting strategies and new stone tools," Reed said.
Study co-authors were Melissa Toups of Indiana University and Andrew Kitchen of The Pennsylvania State University, both previously with UF. Co-author Jessica Light of Texas A&M University was formerly a post-doctoral fellow at the Florida Museum. The researchers completed the project with the help of Reed's NSF Faculty Early Career Development Award, which is granted to researchers who exemplify the teacher-researcher role.