Mr. Musk recently warned of A.I.’s dangers on Bill Maher’s show and in a sit-down with Senator Chuck Schumer, Democrat of New York. Mr. Hoffman has explained the technology’s potential to Vice President Kamala Harris, Commerce Secretary Gina Raimondo and Transportation Secretary Pete Buttigieg. Last week, Mr. Altman said in a congressional hearing that “the benefits of the tools we have deployed so far vastly outweigh the risks.”
In Mr. Hoffman’s view, warnings about A.I.’s existential risk to humanity overstate what the technology can do. And he believes that other potential issues caused by A.I. — job loss, destruction of democracy, disruption of the economy — have an obvious fix: more technology.
“The solutions live in the future, not by enshrining the past,” he said.
That’s a tough pitch to a public that has seen tech’s harmful effects over the last decade, including social media misinformation and autonomous vehicle crashes. And this time, the risks are even larger, said Oded Netzer, a professor at Columbia Business School.
“It’s not just the risks, it’s how fast they are moving,” Mr. Netzer said of tech companies’ handling of A.I. “I don’t think we can hope or trust that the industry will regulate itself.”
Mr. Hoffman’s pro-A.I. campaign, he said, is meant to foster trust where it’s broken. “It’s not to say that there won’t be some harms in some areas,” he said. “The question is could we learn and iterate to a much better state?”
Mr. Hoffman has been thinking about that question since he studied symbolic systems at Stanford University in the late 1980s. There, he imagined how A.I. would facilitate “our Promethean moment,” he said in a YouTube video from March. “We can make these new things and we can journey with them.”
After working at PayPal and co-founding LinkedIn, the professional social network, in 2002, Mr. Hoffman began investing in start-ups including Nauto, Nuro and Aurora Innovation, all focused on applying A.I. to transportation. He also joined an A.I. ethics committee at DeepMind.