The biggest technology failures of 2019 - MIT Technology Review
What would Technology Review be without our annual list of the year’s sorriest tech fails? This year’s list includes the deadly, the dishonest, and the simply daft. Read on.
Boeing’s out-of-control autopilot
First one brand-new Boeing 737 Max, Lion Air Flight 610, crashed shortly after takeoff. Then another did the same. Everyone aboard died. In each case, pilots had struggled against an autopilot system that took over and plunged the planes to their doom.
Fake food computer
The MIT Media Lab has been called the “future factory”—but its “food computer” likely won’t be part of it.
In a 2015 TED Talk that gathered 1.8 million views, architect Caleb Harper introduced hydroponic boxes stuffed with electronics and AI, which he said would measure millions of combinations of light, temperature, and humidity. His Open Agriculture project, he said, was pioneering “cyber agriculture.”
Really? The food computer, it turns out, was nothing more than a glorified grow box that didn’t work very well. But by fertilizing the project with buzzwords—“climate hacking,” “open source,” “microbiome”—the Media Lab kept winning attention and funding for it. Claims for the contraption reached an absurd apex in April, when Harper said “machine learning” had been employed to grow basil that an MIT news release called “likely more delicious” than any ever tasted.
In September, workers stepped forward to blow the whistle, telling the media about fake photo shoots (the plants were purchased), smoke-and-mirror tactics, and environmental violations. By October, MIT officials had “halted most of the work” by the OpenAg group, according to the Boston Globe.
The gaydar app
Within weeks of a major study identifying genes associated with homosexual behavior, a programmer had launched an app called “How Gay Are You?”
For $5.50, the app purported to use those research findings to calculate the gayness level of anyone, using results from a DNA test like those sold by 23andMe.
Controversy ensued. Was the app a “dangerous mischaracterization” of the science, or did it accurately underscore the main point, which is that there’s no one gene for being gay? Alternatively, did it show that the original research project to explain homosexual behavior was ill-conceived?
The gaydar app is now gone (it didn’t survive the controversy), but the promise—or the problem—of genetic predictions isn’t going away. Gene scientists have new ways to link small genetic differences not only to a person’s risk of disease, but to traits like height, intelligence, or earning potential.
Tardigrades on the moon
This year, an Israeli company launched that country’s first lunar lander, which unfortunately crash-landed on the moon in April. Luckily, no one was onboard. Unfortunately, something was.
It turned out that a US nonprofit called Arch Mission Foundation had secretly added to the mission payload a capsule full of tardigrades, or water bears. The microscopic, eight-legged creatures can survive in a dormant state through harsh conditions, and maybe even on the moon.
Planetary protection is the idea that we shouldn’t pollute other worlds with earthly life. There’s the worry over contamination, and what’s more, if you ever do discover life on another world, you’d like to be sure you didn’t put it there.
Without some water, the tardigrades aren’t likely to revive and spread. Still, the episode shows that today’s honor system might not be enough to ensure planetary protection.
Why did Arch do it? The foundation’s mission is to create a backup of planet Earth, and so it tests out technologies for long-lasting archives, like securing information in DNA strands or encapsulating insects in artificial amber. Its payload on the Israeli mission included nickel sheets nanopatterned with 60,000 pages of Wikipedia and other texts.
In a last-minute switch-up, Arch and its cofounder Nova Spivack decided to add some human hair, blood cells, and thousands of tardigrades. “We didn’t tell them we were putting life in this thing,” Spivack said. “We just decided to take the risk.”
Apple’s biased credit card
Why would a wealthy tech entrepreneur get a credit limit 10 times as high as his wife’s on the new Apple Card, even though their assets are held in common? When he complained, a rep told him, “It’s just the algorithm.” A sexist algorithm! Steve Wozniak, Apple’s cofounder, said it happened to his wife, too.
But what is the program, and what does it do? Apple and Goldman Sachs, the bank backing the card, didn’t say. And that’s the problem. Computerized bias exists, but it’s hard to hold anyone, or anything, accountable. Facebook this year reached a settlement to stop letting advertisers intentionally discriminate in housing and job ads, yet research shows that unseen algorithms are still skewing results: ads for taxi-driver jobs on Facebook were automatically shown more often to minorities, and supermarket jobs to women.