The Download: monkey names, and smart masks for health monitoring

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

How machine learning is helping us probe the secret names of animals

The news: Do animals have names? It seems so: new research suggests that small monkeys called marmosets “vocally label” their monkey friends with specific sounds.

How they did it: The team used audio recorders and pattern-recognition software to analyze the animals’ high-pitched chirps and twitters. To prove they’d cracked the monkey code—and learned the secret names—the team played recordings to the marmosets through a speaker and found they responded more often when their label, or name, was in the recording.

Why it matters: Until now, only humans, dolphins, elephants, and probably parrots had been known to use specific sounds to call out to other individuals. This sort of research could provide clues to the origins of human language, arguably the most powerful innovation in our species’ evolution. Read the full story.

—Antonio Regalado

A new smart mask analyzes your breath to monitor your health

Your breath can give away a lot about you. Each exhalation contains all sorts of compounds, including possible biomarkers for disease or lung conditions, that could give doctors a valuable insight into your health.

Now a new smart mask could help doctors check your breath for these signals continuously and in a noninvasive way. A patient could wear the mask at home, measure their own levels, and then go to the doctor if a flare-up is likely. Read the full story.

—Scott J Mulligan

A new way to build neural networks could make AI more understandable

A tweak to the way artificial neurons work in neural networks could make AIs easier to decipher.

Artificial neurons—the fundamental building blocks of deep neural networks—have survived almost unchanged for decades. While these networks give modern artificial intelligence its power, they are also inscrutable. 

Existing artificial neurons, used in large language models like GPT-4, work by taking in a large number of inputs, adding them together, and converting the sum into an output using another mathematical operation inside the neuron. Combinations of such neurons make up neural networks, and their combined workings can be difficult to decode. 
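To make that concrete, here is a minimal sketch of a conventional artificial neuron—an illustration of the general idea, not the researchers’ code: each input is weighted, the results are summed, and a fixed mathematical operation (here, ReLU) turns that sum into the output.

```python
import numpy as np

# Illustrative sketch of a conventional artificial neuron (assumed, not from
# the article): weight each input, add them up, then pass the sum through a
# fixed nonlinearity to produce the output.
def neuron(inputs: np.ndarray, weights: np.ndarray, bias: float) -> float:
    weighted_sum = float(np.dot(weights, inputs)) + bias  # combine the inputs
    return max(0.0, weighted_sum)                         # fixed activation (ReLU)

# Example: three inputs flow into a single neuron
print(neuron(np.array([0.5, -1.2, 3.0]), np.array([0.8, 0.1, -0.4]), bias=0.2))
```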

But the new way to combine neurons works a little differently—and should be easier to make sense of. Read the full story.

—Anil Ananthaswamy

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 The arrest of Telegram’s founder is unsettling Silicon Valley
It’s opened encryption up to new levels of scrutiny. (NYT $)
+ Defenders of encryption fear the case will embolden authorities to attack it. (WP $)

2 Publishers are opting out of Apple’s AI scraping
Apple gave major sites the choice to hand over their data, and many have said no. (Wired $)
+ Apple is also reportedly investing in AI giant OpenAI. (WSJ $)
+ Apple is poised to release new AI features in its next iOS update. (MIT Technology Review)

3 Brazil is going after Elon Musk
A judge has vowed to shut down X and has blocked Starlink’s bank accounts. (Bloomberg $)
+ Musk is waging an ongoing battle with Brazil’s Supreme Court justice. (FT $)

4 Schools are still grappling with AI
Teachers are split over whether using the tools constitutes cheating or not. (New Yorker $)
+ ChatGPT is going to change education, not destroy it. (MIT Technology Review)

5 US regulators are rethinking cancer drug dosing in clinical trials
The FDA wants drugmakers to reexamine their dosing. Startups are worried. (WSJ $)
+ Cancer vaccines are having a renaissance. (MIT Technology Review)

6 We’re learning more about the proteins that regulate our genes
It looks as though they’ve been secretly managing our cells, too. (Knowable Magazine)

7 This company teaches owners of gas-fueled cars how to convert them into EVs
But retrofitting vehicles comes with some pretty major risks. (Rest of World)
+ Why EV charging needs more than Tesla. (MIT Technology Review)

8 Meta’s AI assistant is steadily growing more popular
It’s got around 400 million monthly users. (The Information $)
+ But archrival OpenAI has around 200 million weekly users. (Axios)

9 LA’s new arena is fully digitized  
Facial recognition cameras are everywhere, and good luck buying anything without its official app. (The Atlantic $)

10 Algorithm-driven music recommendations are hit and miss
Here are some different ways to find new tunes. (WP $)
+ How to break free of Spotify’s algorithm. (MIT Technology Review)

Quote of the day

“You’re going to be left with crypto scams and rapid weight-loss adverts.”

—An insider tells the Financial Times how Telegram’s recent legal troubles are likely to deter advertisers from wanting to work with the platform.

The big story

What’s next for the world’s fastest supercomputers

September 2023

When the Frontier supercomputer came online last year, it marked the dawn of so-called exascale computing, with machines that can execute an exaflop—or a quintillion (10¹⁸) floating point operations a second.

Since then, scientists have geared up to make more of these blazingly fast computers: several exascale machines are due to come online in the US and Europe in 2024.

But speed itself isn’t the endgame. Researchers hope to pursue previously unanswerable questions about nature—and to design new technologies in areas from transportation to medicine. Read the full story.

—Sophia Chen

We can still have nice things
A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or tweet ’em at me.)

+ If you’ve ever struggled to erect a tent, just know the world record for putting one up is one minute and seven seconds.
+ Are we all beer girls now?
+ Traveling art exhibitions are big business these days.
+ Brace yourself: fall is coming.
