This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.
The ad reads like an offer of salvation: Cancer kills many people. But there is hope in Apatone, a proprietary vitamin C–based mixture that is “KILLING cancer.” The substance, an unproven treatment that is not approved by the FDA, is not available in the United States. If you want Apatone, the ad suggests, you need to travel to a clinic in Mexico.
If you’re on Facebook or Instagram and Meta has determined you may be interested in cancer treatments, it’s possible you’ve seen this ad. It is part of a pattern on Facebook of ads that make misleading or false health claims, targeted at cancer patients.
Evidence from Facebook and Instagram users, medical researchers, and Meta’s own Ad Library suggests that the company’s platforms are rife with ads making sensational health claims, which Meta directly profits from, and that some misleading ads remain unchallenged for months and even years. Read the full story.
The hacking industry faces the end of an era
The news: NSO Group, the world’s most notorious hacking company, could soon cease to exist. The Israeli firm, still reeling from US sanctions, has been in talks about a possible acquisition by the American military contractor L3Harris. The deal is far from certain, but if it goes through, it’s likely to involve the dismantling of NSO Group and the end of an era.
Industry-wide turbulence: No matter what happens to NSO, the changes afoot in the global hacking industry are far bigger than any single company. That’s mostly down to two major changes: the US sanctioned NSO in late 2021, and days later the Israeli government severely restricted its hacking industry, cutting the number of countries firms can sell to from over 100 to just 37.
But… The industry is adjusting rather than disappearing. One thing we’re learning is that a vacuum can’t last long in a market where demand is so high. Read the full story.
—Patrick Howell O’Neill
We need smarter cities, not “smart cities”
The term “smart cities” originated as a marketing strategy for large IT vendors. It has now become synonymous with urban uses of technology, particularly advanced and emerging technologies. But cities are more than 5G, big data, driverless vehicles, and AI, and a focus on building “smart cities” risks turning cities into technology projects.
Truly smart cities recognize the ambiguity of lives and livelihoods, and they are driven by outcomes far beyond the implementation of “solutions.” They are defined by their residents’ talents, relationships, and sense of ownership—and not by the technology deployed there. Read the full story.
—Riad Meddeb and Calum Handforth
Coming soon: The TR35 list of innovators for 2022
On Wednesday, we’re announcing this year’s list of 35 Innovators Under 35: a chance to look not just at where technology is now, but at where it’s going and the brilliant young minds who are making it happen.
The full list is in the latest issue of our print magazine and online from 29 June. You can subscribe here.
I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.
1 Period tracking apps are rushing to anonymize their user data
Following the Supreme Court’s decision to overturn Roe v. Wade, experts are concerned menstrual data could be exploited to incriminate people seeking abortions. (WSJ $)
+ How people seeking abortions can avoid leaving a digital trail. (WP $)
+ Roe discussions among Big Tech workers quickly soured last week. (Bloomberg $)
+ It’s mostly safe to store abortion pills for later use. (New York Mag)
+ High-quality sex education is also under threat. (Vox)
+ Where to get abortion pills and how to use them. (MIT Technology Review)
2 China’s surveillance network is predicting crime and dissent before they happen
And surveilling vulnerable people, including those experiencing mental illnesses. (NYT $)
3 Inflation isn’t going away any time soon
But falling prices could provide a welcome respite. (Economist $)
4 Crypto’s elites don’t care about you
They also don’t care if you lose your life savings investing in their dodgy wares. (The Atlantic $)
+ Singapore has had enough of crypto cowboys. (The Register)
+ Crypto is weathering a bitter storm. Some still hold on for dear life. (MIT Technology Review)
5 The US is bungling its big semiconductor opportunity
And the chance to create thousands of jobs in the process. (WP $)
+ Taiwan, the world’s biggest chip producer, is facing a spike in power costs. (Bloomberg $)
+ Meanwhile, Japan is worrying about a shortage of specialist chip engineers. (FT $)
+ The chip boom could be coming to an end. (The Register)
6 Social media is a minefield for therapists
Professionals are conflicted about whether jumping on trends compromises their expertise. (Slate $)
7 Amazon’s product results are all over the place
Basically, it’s down to the company prioritizing US companies over Chinese firms. (WSJ $)
+ China’s seller community isn’t best pleased about it. (SCMP)
8 AI learns in virtual worlds
But when it comes to robots, a physical environment is still essential. (Quanta)
+ DALL-E, a sophisticated image-generating AI, has started producing fake human faces. (Motherboard)
+ Revealing how AI programs answer questions like humans destroys their mystique. (Wired $)
+ A quick guide to the most important AI law you’ve never heard of. (MIT Technology Review)
9 You smell like your friends
Whether you realize it or not. (Economist $)
Quote of the day
“I went from no one knowing who I was to all of the worst people on the internet knowing who I am.”
—Clara Sorrenti, a trans Twitch star known as Keffals, tells the Washington Post how she became a target of anti-trans activists and other trolls after she started discussing politics on the streaming platform.
The big story
AI has the potential to help us deal with vast societal challenges, like health inequalities, racial biases, and political polarization. However, its risks have become increasingly apparent, including opacity, lack of explainability, and design choices that result in bias. Whether AI is developed and used in good or harmful ways will depend in large part on the legal frameworks governing and regulating it.
There should be a new guiding tenet for AI regulation: a principle of legal neutrality, which asserts that the law should not discriminate between AI and human behavior. Currently, the legal system is not neutral—for example, an AI that is significantly safer than a person may be the best choice for driving a vehicle, but existing laws may prohibit driverless vehicles. Neutral legal treatment would ultimately benefit human wellbeing by helping the law better achieve its underlying policy goals. Read the full story.
We can still have nice things
+ Better Call Saul’s creators discuss some of the most crucial moments from the much-loved series, which comes to an end next month.
+ This 104-year-old man has turned to YouTube in search of “fellowship.”
+ I too would like to know the answer to this.
+ Are we witnessing the end of music genres? Answers on a postcard, please.
+ These 35 summer reads are bound to leave you captivated.