November 2025

IT’LL NEVER FLY, ORVILLE:

As the 2025 Atlantic hurricane season ends, the future of forecasting is AI (Greg Allen, 11/29/25, NPR: Weekend Edition)

A week before Hurricane Melissa made landfall, however, forecast models disagreed on where it would go. One model that got it right — accurately predicting Melissa’s path and its category 5 intensity — was a new one: Google’s DeepMind AI-based hurricane model.

James Franklin, a former branch chief at the National Hurricane Center, analyzed how the forecast models performed this year, and says Google DeepMind’s model outshone them all. “The model performed very, very well, which was very impressive,” he says. “It was the best guidance we saw this year.”

THE FUTURE ALWAYS HAPPENS FASTER THAN YOU EXPECT:

How to Print a Human: We desperately need new organs, and we’re running out of ways to get them (Mary Roach, November 28, 2025, Nautilus)

I ask Feinberg when he thinks medical science will arrive at the point of implanting entire functional bioprinted organs in patients. If we use the analogy of airplane flight, he puts things somewhere around the Wright brothers stage. “Of course, we don’t want a plane that goes 30 feet down the field. We want a plane that can fly around all day.”

And how far off is that? A decade plus, Feinberg says.

For medical science, that’s actually a brisk turnaround. (In an earlier phone conversation, Feinberg equated “a decade or two” with “pretty quickly.”) He adds that he thinks it could easily happen far sooner: “Because we keep coming up with new things.” Just 20 years ago, he points out, there was no gene editing, no CRISPR. “Plus AI is going to accelerate, and that’s going to change what’s possible.”

I pose the same question now to Jaci Bliley, a senior post-doctoral fellow in the lab. Bliley has just joined us in the microscope room. Two to three decades is her estimate. Like Feinberg, she says she’s surprised at how fast things are moving. She offers the example of some stand-alone beating heart ventricles, little tubular constructs that she printed as part of her Ph.D. research. “That was 2019,” she says. “Now we’re putting them into mice and they’re surviving. After six months they’re still alive and beating.”

THANKS, VLAD!:

Why Russia has come to the table (Peter Caddick-Adams, 11/28/25, Engelsberg Ideas)

Russia’s economy is imploding. Largely due to sanctions caused by the Ukraine War, this year the Economics Ministry posted a record mid-year budget deficit of 3.7 trillion roubles ($45.8 billion) and the Central Bank expects the full-year deficit to reach $55 billion, or 2 per cent of GDP. This is almost certainly the reason peace proposals with Ukraine have surfaced again. […]

Traditionally, the Kremlin has leant heavily on oil and gas exports to generate cash; in 2024, earnings from these exports contributed around 30 per cent of total federal budget revenue. However, traders report that the price of Urals crude has slid from an average listing of $71.10 per barrel in November 2022 to $36.61 per barrel three years later, owing to sanctions on Rosneft and Lukoil, Russia’s reliance on its ageing and inefficient ‘shadow tanker’ shipping fleet, and a G7-imposed price cap, with other OPEC producers replacing the lost Urals output. With key export buyers, notably China and India, threatening to look elsewhere for suppliers, by November 2025 Russian sellers had been obliged to discount their black stuff to an average of $23.52 a barrel.

Thus, the Kremlin has turned to selling assets it cannot replace.

Ukraine needs to increase its demands, starting with regime change.

…AND CHEAPER…:

Your Fridge Is Bigger and Cheaper Today, Thanks to Global Trade and Innovation (Jeremy Horpedahl, 11/26/25, Cato)

In 1984, the average hourly earnings for production and nonsupervisory workers (representing about 80 percent of the private workforce) stood at roughly $8.32. Acquiring the Kenmore would thus require approximately 163 hours of labor, equivalent to more than three full workweeks.

By contrast, a comparable 2024 model, matching the Kenmore’s size and features, retailed at a major retailer such as Home Depot for $998 in nominal terms when I last checked in 2024. That is a lower sticker price even before any inflation adjustment. With average hourly earnings in 2024 at about $29.85 for the same worker category, the labor investment drops to around 33 hours. In relative terms, refrigerators have become nearly five times more affordable, reflecting efficiencies from global supply chains, automation, and competition.

Interestingly, that fridge has increased in price sharply since 2024, almost certainly in part because of trade policy, and is currently listed at $1,658. Even so, it is still far cheaper when measured in time prices, requiring just 53 hours of labor, compared with 163 hours in 1984 (and you can probably find a Black Friday deal on it too).
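The time-price comparison above is simple division: hours of labor equal sticker price over the average hourly wage. A minimal sketch, using only the figures quoted in the excerpt (the 1984 hours figure is taken directly from the article, since the 1984 sticker price is not given here):

```python
def hours_of_labor(price_usd: float, hourly_wage_usd: float) -> float:
    """Hours a production worker must work to afford an item at a given wage."""
    return price_usd / hourly_wage_usd

# Figures as quoted in the article.
hours_1984 = 163.0                             # stated directly for the Kenmore
hours_2024 = hours_of_labor(998.0, 29.85)      # roughly 33 hours
gain = hours_1984 / hours_2024                 # "nearly five times more affordable"

print(f"2024 time price: {hours_2024:.1f} hours")
print(f"Affordability gain since 1984: {gain:.1f}x")
```

Note that the 53-hour figure for the $1,658 price implies a somewhat higher current wage than the 2024 average of $29.85; the article does not state which wage reading it uses there.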

DEFLATION WORKS:

How Milei made austerity popular (Julieta Casas, 11/20/25, Engelsberg Ideas)

How has Milei managed to maintain popular support? A comparison with the country’s past austerity administrations suggests two possible reasons. First, while the president’s ‘chainsaw’ economic policies have cut into the real incomes of broad segments of the population and led to a stagnation of economic growth, they did so early on in his administration. After a harsh initial shock, the mid-term elections coincided with a rebound of the economy, potentially aiding the president’s electoral support. What’s more, a drastic reduction of inflation has worked to his advantage.

THERE’LL BE TIME ENOUGH FOR COUNTING:

What’s next for AlphaFold: A conversation with a Google DeepMind Nobel laureate: “I’ll be shocked if we don’t see more and more LLM impact on science,” says John Jumper (Will Douglas Heaven, November 24, 2025, MIT Technology Review)

Proteins are made from strings of amino acids that chemical forces twist up into complex knots. An untwisted string gives few clues about the structure it will form. In theory, most proteins could take on an astronomical number of possible shapes. The task is to predict the correct one.

Jumper and his team built AlphaFold 2 using a type of neural network called a transformer, the same technology that underpins large language models. Transformers are very good at paying attention to specific parts of a larger puzzle.

But Jumper puts a lot of the success down to making a prototype model that they could test quickly. “We got a system that would give wrong answers at incredible speed,” he says. “That made it easy to start becoming very adventurous with the ideas you try.”

They stuffed the neural network with as much information about protein structures as they could, such as how proteins across certain species have evolved similar shapes. And it worked even better than they expected. “We were sure we had made a breakthrough,” says Jumper. “We were sure that this was an incredible advance in ideas.”

What he hadn’t foreseen was that researchers would download his software and start using it straight away for so many different things. Normally, it’s the thing a few iterations down the line that has the real impact, once the kinks have been ironed out, he says: “I’ve been shocked at how responsibly scientists have used it, in terms of interpreting it, and using it in practice about as much as it should be trusted in my view, neither too much nor too little.” […]

AlphaFold was designed to be used for a range of purposes. Now multiple startups and university labs are building on its success to develop a new wave of tools more tailored to drug discovery. This year, a collaboration between MIT researchers and the AI drug company Recursion produced a model called Boltz-2, which predicts not only the structure of proteins but also how well potential drug molecules will bind to their target.

Last month, the startup Genesis Molecular AI released another structure prediction model called Pearl, which the firm claims is more accurate than AlphaFold 3 for certain queries that are important for drug development. Pearl is interactive, so that drug developers can feed any additional data they may have to the model to guide its predictions.

AlphaFold was a major leap, but there’s more to do, says Evan Feinberg, Genesis Molecular AI’s CEO: “We’re still fundamentally innovating, just with a better starting point than before.”