Women have a new weapon against postpartum depression, but it’s costly

Approval of the first and only treatment in the United States specifically targeting postpartum depression offers hope for millions of women each year who suffer from the debilitating mental health disorder after giving birth.

The new drug brexanolone — marketed under the name Zulresso and approved on March 19 by the U.S. Food and Drug Administration — is expected to become available to the public in late June. Developed by Sage Therapeutics, based in Cambridge, Mass., the drug is costly and treatment is intensive: It’s administered in the hospital as a 60-hour intravenous drip, and a treatment runs between $20,000 and $35,000. But researchers say that it could help many of the estimated 11.5 percent of U.S. new moms each year who experience postpartum depression, which can interfere with normal bonding between mothers and infants and lead to feeling hopeless, sad or overly anxious.
Here’s a closer look at the drug, its benefits and some potential drawbacks.

How does the new drug work?
How exactly brexanolone works is not known. But because the drug’s chemical structure is nearly identical to the natural hormone allopregnanolone, it’s thought that brexanolone operates in a similar way.

Allopregnanolone enhances the effects of a neurochemical called gamma-aminobutyric acid, or GABA, which stops nerve cells in the brain from firing. Ultimately this action helps quell a person’s anxiety or stress.
During pregnancy, the concentration of allopregnanolone in a woman’s brain spikes. This leads some neurons to compensate by temporarily tuning out GABA so that the nerve cells don’t become too inactive. Levels of the steroid typically return to normal quickly after a baby is born, and the neurons once again respond to GABA shortly thereafter. But for some women, this process can take longer, possibly resulting in postpartum depression.

Brexanolone temporarily elevates the brain’s allopregnanolone levels again, which results in a patient’s mood improving. But it’s still not clear exactly why the drug has this effect, says Samantha Meltzer-Brody, a reproductive psychiatrist at the University of North Carolina School of Medicine in Chapel Hill and the lead scientist of the drug’s clinical trials. Nor is it clear whether allopregnanolone’s, and thus possibly brexanolone’s, influence on GABA affects only postpartum depression. But the drug clearly “has this incredibly robust response,” she says, “unlike anything currently available.”

How effective was the drug in clinical trials?
Brexanolone went through three separate clinical trials in which patients were randomly given either the drug or a placebo: one Phase II trial, which tested the drug’s effectiveness and proper dosage, and two Phase III trials, which tested the drug’s effects on moderate or severe postpartum depression and were necessary to gain approval for the drug’s commercial use in people.

Of 234 people who completed the trials, 94 received the suggested dosage of brexanolone. About 70 of those patients, or 75 percent, had what Meltzer-Brody described as a “robust response” to just one course of treatment. And of those patients with positive responses, 94 percent continued to feel well 30 days after the treatment. The results suggest that the drug may be most effective for those with severe postpartum depression; among those with moderate symptoms, the drug and the placebo had a fairly similar impact.

Can people take the drug again?
“There’s nothing prohibiting” a second course of brexanolone, but the effects of a repeat course have not been studied, Meltzer-Brody says. The drug was designed to be taken in tandem with the start of antidepressants, which take effect after about two to four weeks. So by the time the brexanolone wears off, the antidepressants would have kicked in.

It’s not clear yet if some patients could need a second dose. The clinical trials compared a group of women taking both antidepressants and brexanolone with another group taking only brexanolone and found no difference in the two groups’ responses 30 days after tests ended, Meltzer-Brody says. Because the study ended at 30 days, it’s unclear if the effects of brexanolone on its own last longer.

Can women breastfeed while taking brexanolone?
As a precaution, treated women did not breastfeed until six days after taking the drug. But in tests of breastmilk from 12 treated, lactating women, concentrations of brexanolone in breastmilk were negligible — less than 10 nanograms per milliliter — in most of the women 36 hours after they received the infusion, according to Sage’s briefing document for the FDA. The FDA has yet to issue guidance on breastfeeding.

Are there side effects?
About a third of the trial patients experienced sleepiness, sedation or headaches. The possibility of drowsiness led to the FDA’s requirement that the drug be administered by IV drip in a supervised setting. “If someone isn’t supervised, then there would be the risk that someone could get sleepy and pass out,” Meltzer-Brody says.

Are there plans for different versions of the drug?
Sage Therapeutics is developing a pill version of a drug called SAGE-217. It’s chemically similar to brexanolone and has similar antidepressant effects. Early results from a Phase III trial reported by the company in January show that, of 78 women treated with the pill, 72 percent responded favorably within two weeks, and 53 percent had not experienced a recurrence of symptoms four weeks later.

Is it worth the price and time?
Setting aside 60 hours to be hospitalized for an expensive drug could be discouraging for some. “It’s going to be very important for insurance to cover it in order for it to be accessible,” Meltzer-Brody says. “I’m hoping that will be the case.” But based on the reaction of women with severe postpartum depression who participated in the trials, “two-and-a-half days seems like nothing if your debilitating, depressive symptoms will be gone.”

The delight of discovering an asteroid that spits

These are wondrous times for space exploration. Just when you think exploring the cosmos couldn’t possibly get more fun, another discovery delivers a new “oh wow” moment.

Consider the asteroid Bennu. It’s an unprepossessing space rock that drew scientists’ curiosity because it is among the most pristine objects in our solar system, and it might provide clues to the origins of life. But checking out Bennu is no trip to Paris; it’s about 130 million kilometers from Earth. NASA launched its OSIRIS-REx probe to Bennu in 2016, and it didn’t arrive until last December. The spacecraft is currently orbiting its quarry in preparation for an attempt at gathering samples from the asteroid’s surface in 2020 and then toting them back to Earth. Estimated delivery date: September 24, 2023. Clearly, asteroid science is not a discipline for those with short attention spans.
So imagine scientists’ delight when OSIRIS-REx already had news to share: Bennu is squirting jets of dust into space. It’s an asteroid behavior no one had ever seen before. Astronomy writer Lisa Grossman learned all about Bennu’s surprise jets while attending the Lunar and Planetary Science Conference in March. She reports that the dusty fountains may be the work of volatile gases beneath Bennu’s surface. The presence of volatiles would suggest that the rock wandered into the inner solar system relatively recently. But astronomers still have a lot to figure out about Bennu’s history, and they couldn’t be happier.

In other surprising space rock news from the conference, astronomers analyzing the much-more-distant object dubbed Ultima Thule now think it’s an agglomeration of mini-worlds that stuck together in the early days of the solar system — as Grossman terms it, a “Frankenworld.” That’s just the latest unexpected news from this Kuiper Belt denizen. If you’re as space rock obsessed as we are, you may recall that the first fuzzy images from NASA’s New Horizons spacecraft, which flew by Ultima Thule on January 1, suggested that the rock looked like a bowling pin or a snowman spinning in space. More recent images reveal not a snowman, but instead two pancakes or hamburger patties glued end to end (SN: 3/16/19, p. 15). That has scientists scrambling to figure out what forces could create such an oddly shaped object.

We’ll be hearing more about Bennu, Ultima Thule and other residents of our solar system in the months to come. I’m particularly looking forward to news from the Parker Solar Probe, which is tightening its orbit around the sun. I’m the one who is going to have to be patient in this case, though that’s not an attribute typically associated with journalists. The spacecraft won’t make its closest encounter with the sun until 2024, before ending its mission the following year. But the probe will be reporting in, and we’ll be reporting, too, as it makes this historic journey (SN: 1/19/19, p. 7).

Open your Web browser or your trusty print magazine and join us for the adventure. We hope you’ll enjoy the journey as much as we do.

Testing mosquito pee could help track the spread of diseases

There are no teensy cups. But a urine test for wild mosquitoes has for the first time proved it can give an early warning that local pests are spreading diseases.

Mosquito traps remodeled with a pee-collecting card picked up telltale genetic traces of West Nile and two other worrisome viruses circulating in the wild, researchers in Australia report April 4 in the Journal of Medical Entomology.

The tests were based on an innovative saliva monitoring system unveiled in 2010: traps that lure mosquitoes into tasting honey-coated cards. Among its advantages, this card-based medical testing doesn’t need the constant refrigeration that checking whole mosquitoes does. And it’s not as labor intensive as monitoring sentinel chickens or pigs for signs of infection.
But testing traces of mosquito saliva left on these cards comes close to the limits of current molecular methods for detecting viruses. In part, it’s an issue of volume. A mosquito drools fewer than five nanoliters of saliva when it tastes a card. In comparison, mosquitoes excrete about 1.5 microliters of liquid per pee, offering a veritable flood of material. So Dagmar Meyer of James Cook University in Cairns, Australia, and her colleagues created urine collectors using standard overnight light traps and longer-standing traps that exhale delicious carbon dioxide, a mosquito come-hither.

The team set out 29 urine traps in two insect-rich spots in Queensland along with traps equipped to catch mosquito saliva. When mosquitoes fell for the trick and entered a urine trap, their excretions dripped through a mesh floor onto a collecting card. Adding a moist wick of water kept trapped mosquitoes alive and peeing longer, thus improving the sample. Pee traps picked up three viruses — West Nile, Ross River and Murray Valley encephalitis — while the saliva ones detected two, the researchers report.

Hayabusa2 has blasted the surface of asteroid Ryugu to make a crater

Hayabusa2 has blasted the asteroid Ryugu with a projectile, probably adding a crater to the small world’s surface and stirring up dust that scientists hope to snag.

The projectile, a two-kilogram copper cylinder, separated from the Hayabusa2 spacecraft at 9:56 p.m. EDT on April 4, JAXA, Japan’s space agency, reports.

Hayabusa2 flew to the other side of the asteroid to hide from debris that would have been ejected when the projectile hit (SN: 1/19/19, p. 20). Scientists won’t know for sure whether the object successfully made a crater, and, if so, how big it is, until the craft circles back. But by 10:36 p.m. EDT, Hayabusa2’s cameras had captured a blurry shot of a dust plume spurting up from Ryugu, so the team thinks the attempt worked.
“This is the world’s first collision experiment with an asteroid!” JAXA tweeted.

Hayabusa2 plans to briefly touch down inside the crater to pick up a pinch of asteroid dust. The spacecraft has already grabbed one sample of Ryugu’s surface (SN Online: 2/22/19). But dust exposed by the impact will give researchers a look at the asteroid’s subsurface, which has not been exposed to sunlight or other types of space radiation for up to billions of years.

If all goes as planned, Hayabusa2 will return to Earth with both samples in late 2020. A third planned sample pickup has been scrapped because Ryugu’s boulder-strewn surface is so hazardous for the spacecraft.
Comparing the two samples will reveal details of how being exposed to space changes the appearance and composition of rocky asteroids, and will help scientists figure out how Ryugu formed (SN Online: 3/20/19). Scientists hope that the asteroid contains water and organic material that might help explain how life got started in the solar system.

A Greek skull may belong to the oldest human found outside of Africa

A skull found in a cliffside cave on Greece’s southern coast in 1978 represents the oldest Homo sapiens fossil outside Africa, scientists say.

That skull, from an individual who lived at least 210,000 years ago, was encased in rock that also held a Neandertal skull dating to at least 170,000 years ago, contends a team led by paleoanthropologist Katerina Harvati of the University of Tübingen in Germany.

If these findings, reported online July 10 in Nature, hold up, the ancient Greek H. sapiens skull is more than 160,000 years older than the next oldest European H. sapiens fossils (SN Online: 11/2/11). It’s also older than a proposed H. sapiens jaw found at Israel’s Misliya Cave that dates to between around 177,000 and 194,000 years ago (SN: 2/17/18, p. 6).

“Multiple Homo sapiens populations dispersed out of Africa starting much earlier, and reaching much farther into Europe, than previously thought,” Harvati said at a July 8 news conference. African H. sapiens originated roughly 300,000 years ago (SN: 7/8/17, p. 6).
A small group of humans may have reached what’s now Greece more than 200,000 years ago, she suggested. Neandertals who settled in southeastern Europe not long after that may have replaced those first H. sapiens. Then humans arriving in Mediterranean Europe tens of thousands of years later would eventually have replaced resident Neandertals, who died out around 40,000 years ago (SN Online: 6/26/19).

But Harvati’s group can’t exclude the possibility that H. sapiens and Neandertals simultaneously inhabited southeastern Europe more than 200,000 years ago and sometimes interbred. A 2017 analysis of ancient and modern DNA concluded that humans likely mated with European Neandertals at that time.

The two skulls were held in a small section of wall that had washed into Greece’s Apidima Cave from higher cliff sediment and then solidified roughly 150,000 years ago. Since one skull is older than the other, each must originally have been deposited in different sediment layers before ending up about 30 centimeters apart on the cave wall, the researchers say.
Earlier studies indicated that one Apidima skull, which retains the face and much of the braincase, was a Neandertal that lived at least 160,000 years ago. But fossilization and sediment pressures had distorted the skull’s shape. Based on four 3-D digital reconstructions of the specimen, Harvati’s team concluded that its heavy brow ridges, sloping face and other features resembled Neandertal skulls more than ancient and modern human skulls. An analysis of the decay rate of radioactive forms of uranium in skull bone fragments produced an age estimate of at least 170,000 years.

A second Apidima fossil, also dated using uranium analyses, consists of the back of a slightly distorted braincase. Its rounded shape in a digital reconstruction characterizes H. sapiens, not Neandertals, the researchers say. A bunlike bulge often protrudes from the back of Neandertals’ skulls.
But without any facial remains to confirm the species identity of the partial braincase, “it is still possible that both Apidima skulls are Neandertals,” says paleoanthropologist Israel Hershkovitz of Tel Aviv University. Hershkovitz led the team that discovered the Misliya jaw and assigned it to H. sapiens.

Harvati and her colleagues will try to extract DNA and species-distinguishing proteins (SN: 6/8/19, p. 6) from the Greek skulls to determine their evolutionary identities and to look for signs of interbreeding between humans and Neandertals.

The find does little to resolve competing explanations of how ancient humans made their way out of Africa. Harvati’s suggestion that humans trekked from Africa to Eurasia several times starting more than 200,000 years ago is plausible, says paleoanthropologist Eric Delson of City University of New York’s Lehman College in an accompanying commentary. And the idea that some H. sapiens newcomers gave way to Neandertals probably also applied to humans who reached Misliya Cave and nearby Middle Eastern sites as late as around 90,000 years ago, before Neandertals occupied the area by 60,000 years ago, Delson says.

Hershkovitz disagrees. Ancient humans and Neandertals lived side-by-side in the Middle East for 100,000 years or more and occasionally interbred, he contends. Misliya Cave sediment bearing stone tools dates to as early as 274,000 years ago, Hershkovitz says. Since only H. sapiens remains have been found in the Israeli cave, ancient humans probably made those stone artifacts and could have been forerunners of Greek H. sapiens.

Both fish and humans have REM-like sleep

No one should have to sleep with the fishes, but new research on zebrafish suggests that we sleep like them.

Sleeping zebrafish have brain activity similar to both deep slow-wave sleep and rapid eye movement, or REM, sleep that’s found in mammals, researchers report July 10 in Nature. And the team may have tracked down the cells that kick off REM sleep.

The findings suggest that the basics of sleep evolved at least 450 million years ago in zebrafish ancestors, before the evolution of animals that give birth to live young instead of laying eggs. That’s 150 million years earlier than scientists thought when they discovered that lizards sleep like mammals and birds (SN: 5/28/16, p. 9).

What’s more, sleep may have evolved underwater, says Louis C. Leung, a neuroscientist at Stanford University School of Medicine. “These signatures [of sleep] really have important functions — even though we may not know what they are — that have survived hundreds of millions of years of evolution.”
In mammals, birds and lizards, sleep has several stages characterized by specific electrical signals. During slow-wave sleep, the brain is mostly quiet except for synchronized waves of electrical activity. The heart rate decreases and muscles relax. During REM or paradoxical sleep, the brain lights up with activity almost like it’s awake. But the muscles are paralyzed (except for rapid twitching of the eyes) and the heart beats erratically.

For many years, scientists have known that fruit flies, nematodes, fish, octopuses and other creatures have rest periods reminiscent of sleep. But until now, no one could measure the electrical activity of those animals’ brains to see if that rest is the same as mammals’ snoozing.

Leung and colleagues developed a system to do just that in zebrafish by genetically engineering them to make a fluorescent molecule that lights up when it encounters calcium, which is released when nerve cells and muscles are active. By following the flashes of light using a light sheet microscope, the researchers tracked brain and muscle activity in the naturally transparent fish larvae.

The next task was to lull fish asleep under the microscope. In some experiments, the team added drugs that trigger either slow-wave or REM sleep in mammals to the fish’s water. In others, researchers deprived fish of sleep for a night or tuckered the fish out with lots of activity during the day. Results from all the snooze-inducing methods were the same.

Sleeping fish have two distinct types of brain activity while sleeping, the team found. One, similar to slow-wave sleep, was characterized by short bursts of activity in some nerve cells in the brain. The researchers call that state slow-bursting sleep. REM-like sleep, which the researchers dubbed “propagating-wave sleep,” was characterized by frenzied brain activity that spreads like a wave through the brain. The researchers aren’t calling the sleep phases REM or slow-wave sleep because there are some minor differences between the way fish and mammals sleep.
A group of cells that line hollow spaces called ventricles deep in the brain seems to trigger that wave of REM-like brain activity. These ependymal cells dip fingerlike cilia into the cerebral spinal fluid that bathes the ventricles and the central nervous system. The cells appear to beat their cilia faster as amounts of a well-known, sleep-promoting hormone called melanin-concentrating hormone in the fluid increases, the researchers discovered.
It’s unclear how the ependymal cells communicate with the rest of the brain to set off REM-like activity. Such cells are also present in mammals, but no one has yet been able to see that deeply into the brains of sleeping mammals to determine whether the cells play a role in sleep. But knowing about these cells may help researchers develop better sleep aids, Leung says.

Just as in mammals, zebrafish’s whole bodies are affected during sleep. Their muscles relax during sleep and their hearts slow from about 200 beats per minute when awake to about 110 to 120 beats per minute while asleep during the slow-wave–like sleep. During the REM-like sleep, the heart slows even more to about 90 beats per minute and loses its regular rhythm. And the fish’s muscles also go completely slack. The one characteristic that the fish lack is rapid eye movement. Instead, the eyes roll back into their sockets, says study coauthor Philippe Mourrain, a biologist at Stanford University School of Medicine.

Lack of eye movement could indicate that emotion-processing parts of the brain, such as the amygdala, aren’t as active in zebrafish as they are in mammals, says sleep researcher Allan Pack of the University of Pennsylvania Perelman School of Medicine. With their brain-activity monitoring, the researchers have taken sleep research “to the next level,” says Pack, and “they present pretty compelling evidence” of slow-wave and REM-like sleep in the fish.

The whole-body involvement that the researchers documented solidifies the argument that fish sleep is similar to mammals, says neuroscientist Paul Shaw of Washington University School of Medicine in St. Louis. In all organisms known to snooze, “sleep is manifest everywhere” in the body, he says.

Future experiments may show why poor sleep or a lack of Zs contributes to health problems in people, such as obesity, heart disease and diabetes.

Ancient DNA unveils disparate fates of Ice Age hunter-gatherers in Europe

Ice sheets expanded across much of northern Europe from around 25,000 to 19,000 years ago, making a huge expanse of land unlivable. That harsh event set in motion a previously unrecognized tale of two human populations that played out at opposite ends of the continent.

Western European hunter-gatherers outlasted the icy blast. Easterners were replaced by migrations of newcomers.

That’s the implication of the largest study to date of ancient Europeans’ DNA, covering a period before, during and after what’s known as the Last Glacial Maximum, paleogeneticist Cosimo Posth and colleagues report March 1 in Nature.
As researchers have long thought, southwestern Europe provided refuge from the last Ice Age’s big chill for hunter-gatherers based in and near that region, the scientists say. But it turns out that southern Europe’s Italian peninsula did not offer lasting respite from the cold for nearby groups, as previously assumed.

Instead, those people were replaced by genetically distinct hunter-gatherers who presumably had lived just to the east along the Balkan Peninsula. Those people, who carried ancestry from parts of southwestern Asia, began trekking into what’s now northern Italy by about 17,000 years ago, as the Ice Age began to wane.

“If local [Ice Age] populations in Italy did not survive and were replaced by groups from the Balkans, this completely changes our interpretation of the archaeological record,” says Posth, of the University of Tübingen in Germany.

Posth and colleagues’ conclusions rest on analyses of DNA from 356 ancient hunter-gatherers, including new molecular evidence for 116 individuals from 14 countries in Europe and Asia. Excavated human remains that yielded DNA dated to between about 45,000 and 5,000 years ago (SN: 4/7/21).

Comparisons of sets of gene variants inherited by these hunter-gatherers from common ancestors enabled the researchers to reconstruct population movements and replacements that shaped ancient Europeans’ genetic makeup. For the first time, ancient DNA evidence included individuals from what’s known as the Gravettian culture, which dates from about 33,000 to 26,000 years ago in central and southern Europe, and from southwestern Europe’s Solutrean culture, which dates to between about 24,000 and 19,000 years ago.
Contrary to expectations, makers of Gravettian tools came from two genetically distinct groups that populated western and eastern Europe for roughly 10,000 years before the Ice Age reached its peak, Posth says. Researchers have traditionally regarded Gravettian implements as products of a biologically uniform population that occupied much of Europe.

“What we previously thought was one genetic ancestry in Europe turned out to be two,” says paleogeneticist Mateja Hajdinjak of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, who did not participate in the new study. And “it seems that western and southwestern Europe served as a [refuge from glaciation] more than southeastern Europe and Italy.”

Descendants of the western Gravettian population, who are associated with Solutrean artifacts and remnants of another ancient culture in western Europe that ran from about 19,000 to 14,000 years ago, outlasted the Ice Age before spreading northeastward across Europe, the researchers say.

Further support for southwestern Europe as an Ice Age refuge comes from DNA extracted from a pair of fossil teeth that belonged to an individual linked to the Solutrean culture in southern Spain. That roughly 23,000-year-old adult was genetically similar to western European hunter-gatherers who lived before and after the Last Glacial Maximum, Max Planck paleogeneticist Vanessa Villalba-Mouco and colleagues, including Posth, report March 1 in Nature Ecology & Evolution.

Meanwhile, the genetic evidence suggests that hunter-gatherers in what’s now Italy were replaced by people from farther east, probably based in the Balkan region. Those newcomers must have brought with them a distinctive brand of stone artifacts, previously excavated at Italian sites and elsewhere in eastern Europe, known as Epigravettian tools, Posth says. Many archaeologists have suspected that Epigravettian items were products of hunter-gatherers who clustered in Italy during the Ice Age’s peak freeze.

But, Hajdinjak says, analyses of DNA from fossils of Ice Age Balkan people are needed to clarify what groups moved through Italy, and when those migrations occurred.

Ultimately, descendants of Ice Age migrants into Italy reached southern Italy and then western Europe by around 14,000 years ago, Posth and colleagues say. Ancient DNA evidence indicates that, during those travels, they left a major genetic mark on hunter-gatherers across Europe.

How meningitis-causing bacteria invade the brain

Bacteria can slip into the brain by commandeering cells in the brain’s protective layers, a new study finds. The results hint at how a deadly infection called bacterial meningitis takes hold.

In mice infected with meningitis-causing bacteria, the microbes exploit previously unknown communication between pain-sensing nerve cells and immune cells to slip by the brain’s defenses, researchers report March 1 in Nature. The results also hint at a new way to possibly delay the invasion — using migraine medicines to interrupt those cell-to-cell conversations.
Bacterial meningitis is an infection of the protective layers, or meninges, of the brain that affects 2.5 million people globally per year. It can cause severe headaches and sometimes lasting neurological injury or death.

“Unexpectedly, pain fibers are actually hijacked by the bacteria as they’re trying to invade the brain,” says Isaac Chiu, an immunologist at Harvard Medical School in Boston. Normally, one might expect pain to be a warning system for us to shut down the bacteria in some way, he says. “We found the opposite…. This [pain] signal is being used by the bacteria for an advantage.”

It’s known that pain-sensing neurons and immune cells coexist in the meninges, particularly in the outermost layer called the dura mater (SN: 11/11/20). So to see what role the pain and immune cells play in bacterial meningitis, Chiu’s team infected mice with two of the bacteria known to cause the infection in humans: Streptococcus pneumoniae and S. agalactiae. The researchers then observed where the bacteria ended up in mice genetically tweaked to lack pain-sensing nerve cells, and compared those resting spots with the bacteria’s locations in mice with the nerve cells intact.

Mice without pain-sensing neurons had fewer bacteria in the meninges and brain than those with the nerve cells, the team found. This contradicts the idea that pain in meningitis serves as a warning signal to the body’s immune system, mobilizing it for action.

Further tests showed that the bacteria triggered a chain of immune-suppressing events, starting with the microbes secreting toxins in the dura mater.

The toxins hitched onto the pain neurons, which in turn released a molecule called CGRP. This molecule is already known to bind to a receptor on immune cells, where it helps control the dura mater’s immune responses. Injecting infected mice with more CGRP lowered the number of dural immune cells and helped the infection along, the researchers found.

The team also looked more closely at the receptor that CGRP binds to. In infected mice bred without the receptor, fewer bacteria made it into the brain. But in ones with the receptor, immune cells that would otherwise engulf bacteria and recruit reinforcements were disabled.
The findings suggest that either preventing the release of CGRP or preventing it from binding to immune cells might help delay infection.

In humans, neuroscientists know that CGRP is a driver of headaches — it’s already a target of migraine medications (SN: 6/5/18). So the researchers gave five mice the migraine medication olcegepant, which blocks CGRP’s effects, and infected them with S. pneumoniae. After infection, the medicated mice had fewer bacteria in the meninges and brain, took longer to show symptoms, didn’t lose as much weight and survived longer than mice that were not given the medication.

The finding suggests olcegepant slowed the infection. Even though it only bought mice a few extra hours, that’s crucial in meningitis, which can progress in a matter of hours. Were olcegepant to work the same way in humans, it might give doctors more time to treat meningitis. But the effect is probably not as dramatic in people, cautions Michael Wilson, a neurologist at the University of California, San Francisco who wasn’t involved with the work.

Scientists still need to determine whether pain-sensing nerve cells and immune cells have the same rapport in human dura mater, and whether migraine drugs could help treat bacterial meningitis in people.

Neurologist Avindra Nath has doubts. Clinicians think the immune response and inflammation damage the brain during meningitis, says Nath, who heads the team investigating nervous system infections at the National Institute of Neurological Disorders and Stroke in Bethesda, Md. So treatment involves drugs that suppress the immune response, rather than enhance it as migraine medications might.

Chiu acknowledges this but notes there might be room for both approaches. If dura mater immune cells could head the infection off at the pass, they might keep some bacteria from penetrating the brain’s defenses, minimizing inflammation.

This study might not ultimately change how clinicians treat patients, Wilson says. But it still reveals something new about one of the first lines of defense for the brain.

Fungi don’t turn humans into zombies. But The Last of Us gets some science right

Like so many others, I’ve been watching the HBO series The Last of Us. It’s a classic zombie apocalypse drama following Joel (played by Pedro Pascal) and Ellie (Bella Ramsey) as they make their way across the former United States (now run by a fascist government called Fedra).

I’m a big fan of zombie and other post-apocalyptic fiction. And my husband had told me how good the storyline is in the video game that inspired the series, so I was prepared for interesting storytelling. What I didn’t expect was to be so intrigued by the science behind the sci-fi.

In the opening minutes of the series, two scientists on a fictional 1968 talk show discuss the microbes that give them pandemic nightmares. One says it’s fungi — not viruses or bacteria — that keep him awake. Especially worrisome, he says, are the fungi that control rather than kill their hosts. He gives the example of a fungus that turns ants into living zombies, puppeteering the insects by flooding their brains with hallucinogens.

He goes on to warn that even though human body temperature keeps us fungus-free, that might not be true if the world got a little bit warmer. He predicts that as the thermostat climbs, a fungus that hijacks insects could mutate a gene allowing it to burrow into human brains and take control of our minds. Such a fungus could induce its human puppets to spread the fungus “by any means necessary,” he says. What’s worse, there are no preventatives, treatments or cures, nor any way to make them.

It’s a brief segment, but it had me hooked. It all sounded so chilling and … plausible. After all, fungi like ones that cause nail infections, yeast infections and ringworm already infect people.

So I consulted some experts on fungal infections to find out whether this could actually happen.

I’ve got good news and bad news.

First, the bad news.

Bad news: Climate change has already helped one fungus mutate to infect humans
I wanted to know if warming has spurred any fungi to mutate and become infectious. So I called Arturo Casadevall. He has been thinking about fungi and heat for a long time. He’s proposed that the need to avoid fungal infections may have provided the evolutionary pressure that drove mammals and birds to evolve warm-bloodedness (SN: 12/3/10).

Most fungal species simply can’t reproduce at human body temperature (37° Celsius, or 98.6° Fahrenheit). But as the world warms, “these strains either have to die or adapt,” says Casadevall, a microbiologist who specializes in fungal infections at Johns Hopkins Bloomberg School of Public Health. That raises the possibility that fungi that now infect insects or reptiles could evolve to grow at temperatures closer to human body temperature.

At the same time, humans’ average body temperature has been falling since the 19th century, at least in high-income countries, researchers reported in eLife in 2020. One study from the United Kingdom pegs average body temperature at 36.6° C (97.9° F). And some of us are even cooler.

Fungi’s possible adaptation to higher heat and humans’ cooling body temperature are on a collision course, Casadevall says.
He and colleagues presented evidence of one such crash. Climate change may have allowed a deadly fungus called Candida auris to acclimate to human body temperatures (SN: 7/26/19). A version of the fungus that could infect humans independently emerged on three continents from 2012 to 2015. “It’s not like someone took a plane and spread it. These things came out of nowhere simultaneously,” Casadevall says.

Some people argue that the planet hasn’t warmed enough to make fungi a problem, he says. “But you have to think about all the really hot days [that come with climate change]. Every really hot day is a selection event,” in which many fungi will die. But some of those fungi will have mutations that help them handle the heat. Those will survive, and their offspring may be able to withstand even hotter future heat waves, until human body temperature is no challenge.

Fungi that infect people are usually not picky about their hosts, Casadevall says. They will grow in soil or — if given an opportunity — in people, pets or other animals. The reason fungi don’t infect people more often is that “the world is much colder than we are, and they have no need of us,” he says.

When people do get infected, the immune system usually keeps the fungi in check. But fungal infections can cause serious illness or be deadly, particularly to people with weakened immune systems (SN: 11/29/21; SN: 1/10/23).

The second episode of The Last of Us reveals that the zombie-creating fungi initially spread through people eating contaminated flour. Then, the infected people attack and bite others, spreading the fungus.

In real life, most human infections arise from breathing in spores. But Casadevall says it’s “not implausible” that people could get infected by eating spores or by being bitten.

Also bad: Fungal genes can adapt to higher heat
I also wondered exactly how a fungus could evolve in response to heat. Asiya Gusa, a fungal researcher at Duke University School of Medicine, has published one possibility.

In 2020, she and colleagues reported in the Proceedings of the National Academy of Sciences on how one fungus mutated at elevated temperature to become harder to fight.

Cryptococcus deneoformans, which already infects humans (though it’s no zombie-maker), became resistant to some antifungal drugs when grown at human body temperature. The resistance was born when mobile bits of DNA called transposons (often called jumping genes) hopped into a few genes needed for the antifungals to work.

In a follow-up study, Gusa and colleagues grew C. deneoformans at either 30° C or 37° C for 800 generations, long enough to detect multiple changes in their DNA. Fungi had no problem growing at the balmy 30° C (86° F), the temperature at which researchers typically grow fungi in the lab. But their growth slowed at the higher temperature, a sign that the fungi were under stress from the heat.

In C. deneoformans, that heat stress really got things jumping. One type of transposon accumulated a median of 12 extra copies of itself in fungi grown at body temperature. By contrast, fungi grown at 30° C tended to pick up a median of only one extra copy of the transposon. The team reported those results January 20 in PNAS. The researchers don’t yet know the effect the transposon hops might have on the fungi’s ability to infect people, cause disease or resist fungus-fighting drugs.

So yeah, the bad news is not great. Fungi are mutating in the heat and at least one species has gained the ability to infect people thanks to climate change. Other fungi that infect people are more widespread than they were in the 1950s and 1960s, also thanks to a warming world (SN: 1/4/23).

But I promised good news. And here it is.

Good news: Human brains may resist zombification
It may not be our body temperature, but our brain chemistry, that protects us from being hijacked by zombifying fungi.

I consulted Charissa de Bekker and Jui-Yu Chou, two researchers who study the Ophiocordyceps fungi that are the model for the TV show’s fungal menace. These fungi infect ants, flooding the insects with a cocktail of chemicals that steer the ants to climb plants. Once in position, the ants chomp down and the chemicals keep the jaw muscles locked in place (SN: 7/17/19).

Unlike most fictional zombies, the ants are alive during this process. “A lot of people get the misconception that we work on undead ants,” says de Bekker, a microbiologist at Utrecht University in the Netherlands. She’s glad to see the show “stick to the story of the host being very much alive while its behaviors change.” The fungi even help preserve the ant, keeping it alive even while feeding on it. But eventually the ant dies. Then a mushroom rises from the corpse, showering spores onto the ground where other ants may become infected.

Related species of Ophiocordyceps infect various species of ants and other insects. But each fungal species is highly specific to its host. That’s because the fungi had to tailor the chemicals they use to control each particular species they infect. The ability to manipulate behavior comes at the cost of not being able to infect multiple species.

A fungus that specializes in infecting ants probably can’t get past humans’ immune systems, says Chou, a fungal researcher at the National Changhua University of Education in Taiwan. “Think of a key that fits into a specific lock. It is only this unique combination that will trigger the lock to open,” he says.

Even if the fungi evolved to withstand human body temperature and immune system attacks, they probably couldn’t take control of our minds, de Bekker says. “Manipulation is like a whole different ballgame. You need a ton of additional tools to get there.” It took millions of years of coevolution for the fungi to master piloting ants, after all.

While fungi do make mind-altering chemicals that can affect human behavior (LSD and psilocybin, for instance), Casadevall agrees that fungi that mind control insects probably won’t turn humans into zombies. “It’s not one of my worries,” he says.

Infected ants don’t turn into vicious, biting zombies either, de Bekker says. “If anything, we actually see the healthy ants being aggressive toward infected individuals, once they figure out that they’re infected, to basically get rid of them.” That “social immunity” helps protect the rest of the nest from infection.

Also good: Humans are innovative enough to develop treatments
The fictional scientist’s assertion that we couldn’t prevent, treat or cure these fungal infections is also a stretch.

Antifungal drugs exist, and they cure many fungal infections, though some infections may persist. Those that spread to the brain can be particularly difficult to clear. Some fungi are also evolving resistance to the drugs. And a few fungal vaccines are in the works, although they may not be ready for years.

The experts I talked to say they hope the show will bring attention to real fungal diseases.

Gusa was especially glad to see fungi in the limelight. And she shares my fondness for that retro series opening in which the scientist predicts climate change could spawn mind-controlling fungi bent on infecting every person on the planet.

“I was pretty much yelling at the TV when I watched the [show’s] intro,” in an excited kind of way, she says. “This is the foundation of a lot of my grant funding … the threat of thermal adaptation of fungi.… To see it played out on the screen was something kind of fun.”

The Milky Way may be spawning many more stars than astronomers had thought

The Milky Way is churning out far more stars than previously thought, according to a new estimate of its star formation rate.

Gamma rays from aluminum-26, a radioactive isotope that arises primarily from massive stars, reveal that the Milky Way converts four to eight solar masses of interstellar gas and dust into new stars each year, researchers report in work submitted to arXiv.org on January 24. That range is two to four times the conventional estimate and corresponds to an annual birthrate in our galaxy of about 10 to 20 stars, because most stars are less massive than the sun.

At this rate, every million years — a blink of the eye in astronomical terms — our galaxy spawns 10 million to 20 million new stars. That’s enough to fill roughly 10,000 star clusters like the beautiful Pleiades cluster in the constellation Taurus. In contrast, many galaxies, including most of the ones that orbit the Milky Way, make no new stars at all.
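The arithmetic behind those counts is easy to check. The sketch below redoes it in Python; the mean stellar mass (about 0.5 solar masses) and the Pleiades member count (about 1,000 stars) are assumed round numbers for illustration, not figures from the study.

```python
# Back-of-envelope check of the star formation numbers (assumed values noted).
sfr_low, sfr_high = 4.0, 8.0     # solar masses of gas converted to stars per year
mean_star_mass = 0.5             # assumed typical stellar mass, in solar masses
                                 # (most stars are less massive than the sun)

stars_per_year = (sfr_low / mean_star_mass, sfr_high / mean_star_mass)
print(stars_per_year)            # (8.0, 16.0) -- close to the quoted 10 to 20

million_years = 1_000_000
stars_per_myr = tuple(n * million_years for n in stars_per_year)
print(stars_per_myr)             # roughly 8 million to 16 million new stars

pleiades_members = 1_000         # assumed rough member count of the Pleiades
clusters = stars_per_myr[0] / pleiades_members
print(clusters)                  # ~8,000 clusters, near the article's "roughly 10,000"
```

Nudging the assumed mean stellar mass down slightly reproduces the quoted 10 to 20 stars per year exactly; the point is only that the quoted ranges are mutually consistent.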

“The star formation rate is very important to understand for galaxy evolution,” says Thomas Siegert, an astrophysicist at the University of Würzburg in Germany. The more stars a galaxy makes, the faster it enriches itself with oxygen, iron and the other elements that stars create. Those elements then alter star-making gas clouds and can change the relative number of large and small stars that the gas clouds form.

Siegert and his colleagues studied the observed intensity and spatial distribution of emission from aluminum-26 in our galaxy. A massive star creates this isotope during both life and death. During its life, the star blows the aluminum into space via a strong wind. If the star explodes when it dies, the resulting supernova forges more. The isotope, with a half-life of 700,000 years, decays and gives off gamma rays.
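That half-life is what ties today’s gamma-ray glow to recent star formation: after a few half-lives, almost none of the aluminum-26 is left, so only recently made isotope still shines. A minimal sketch of that falloff, using the standard radioactive decay law:

```python
import math

HALF_LIFE = 7.0e5                      # years, aluminum-26
decay_const = math.log(2) / HALF_LIFE  # lambda in N(t) = N0 * exp(-lambda * t)

def fraction_remaining(t_years):
    """Fraction of an aluminum-26 sample that has not yet decayed after t years."""
    return math.exp(-decay_const * t_years)

for t in (7.0e5, 1.4e6, 2.1e6):
    print(t, round(fraction_remaining(t), 3))  # 0.5, 0.25, 0.125
```

After roughly ten half-lives (about 7 million years, still brief on galactic timescales), less than a thousandth of the original isotope remains, which is why the gamma-ray signal traces ongoing star formation.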

Like X-rays, and unlike visible light, gamma rays penetrate the dust that cloaks the youngest stars. “We’re looking through the entire galaxy,” Siegert says. “We’re not X-raying it; here we’re gamma-raying it.”

The more stars our galaxy spawns, the more gamma rays emerge. The best match with the observations, the researchers find, is a star formation rate of four to eight solar masses a year. That is much higher than the standard estimate for the Milky Way of about two solar masses a year.

The revised rate is very realistic, says Pavel Kroupa, an astronomer at the University of Bonn in Germany who was not involved in the work. “I’m very impressed by the detailed modeling of how they account for the star formation process,” he says. “It’s a very beautiful work. I can see some ways of improving it, but this is really a major step in the absolutely correct direction.”

Siegert cautions that it is difficult to tell how far the gamma rays have traveled before reaching us. In particular, if some of the observed emission arises nearby — within just a few hundred light-years of us — then the galaxy has less aluminum-26 than the researchers have calculated, which means the star formation rate is on the lower side of the new estimate. Still, he says it’s unlikely to be as low as the standard two solar masses per year.

In any event, the Milky Way is the most vigorous star creator in the Local Group, a collection of more than 100 nearby galaxies. The largest Local Group galaxy, Andromeda, converts only a fraction of a solar mass of gas and dust into new stars each year. Among Local Group galaxies, the Milky Way ranks second in size, but its high star formation rate shows that the second-place galaxy definitely tries harder.