The best way to avoid killer robots and other dystopian uses for AI is to focus on all the ...

  • Date: 2018-09-24 05:48:57

When it comes to how artificial-intelligence technology might affect society, there are a host of things to worry about, including the massive loss of jobs and killer robots.

But the best way to avoid such negative outcomes may be to ignore them, more or less.

That's the advice of Phil Libin, CEO of All Turtles, a startup that focuses on turning AI-related ideas into commercial products and companies. In a recent conversation with Business Insider, Libin likened the situation to some advice he received when he was learning to ride a motorcycle.

His instructor taught him that if an accident happened in front of him while he was riding on the highway, such as a semi truck flipping over, the worst thing to do would be to stare at the truck. Instead, his instructor said, he should focus on the point he needed to reach in order to avoid colliding with it.

"If you look at what you're trying to avoid, then you're going to run into it," said Libin, who previously founded Evernote. "You've got to look at where you want to be."

The tech industry would do well to follow that admonition when it comes to developing artificial intelligence, he said.

Years in the making, AI is starting to progress rapidly. It's being used by consumers in the form of intelligent assistants such as Amazon's Alexa to answer trivia questions and make purchases. And it's being used by corporations to help them make business decisions.

AI has the potential for good — and evil

Many observers think the technology could transform society in profound ways, and not necessarily for the better.

Indeed, there are some potentially dangerous and dystopian outcomes and uses of AI. It's already starting to be used in China as part of a mass surveillance scheme. It could be used to track people essentially from birth onward, collecting intimate insights into their every thought and desire. It could be used to perpetuate or worsen discrimination against particular people or groups. And it could be used to power terrifying new weapons.

Technologists and policymakers ought not to ignore such potential uses of the technology, Libin said. They should be aware of them. But the best way to avoid them, he said, is to concentrate on developing socially beneficial applications of AI.

"There really is a flipped-over truck, and there's all sorts of bad things that can happen. And we definitely need to work towards not hitting it," Libin said. "But the best way to do that is to [say] … this is where we want to go. Here's a vision of certain products that are like obviously good, and virtuous, and the world needs them, and they solve real problems, and let's make those products."

Indeed, that's what he sees as a big part of All Turtles' mission. One of the first projects the company helped incubate is a chatbot called Spot that is designed to make it easier for employees to document and report incidents of sexual harassment and discrimination. Another is Disco, a plug-in for the collaboration software Slack that helps employers give timely positive feedback to workers.

The work All Turtles takes on "is all stuff that we should be able to, right from the beginning, right by design, feel good about," he said.