AI’s future could be ‘open-source’ or closed. Tech giants are divided as they lobby regulators
Tech leaders have been vocal proponents of the need to regulate artificial intelligence, but they’re also lobbying hard to make sure the new rules work in their favor.
That’s not to say they all want the same thing.
Facebook parent Meta and IBM on Tuesday launched a new group called the AI Alliance that advocates an “open science” approach to AI development, putting them at odds with rivals Google, Microsoft and ChatGPT-maker OpenAI.
These two diverging camps — the open and the closed — disagree about whether to build AI in a way that makes the underlying technology widely accessible. Safety is at the heart of the debate, but so is who gets to profit from AI’s advances.
Open advocates favor an approach that is “not proprietary and closed,” said Darío Gil, a senior vice president at IBM who directs its research division. “So it’s not like a thing that is locked in a barrel and no one knows what they are.”
WHAT’S OPEN-SOURCE AI?
The term “open-source” comes from a decades-old practice of building software in which the code is free or widely accessible for anyone to examine, modify and build upon.
Open-source AI involves more than just code, and computer scientists differ on how to define it depending on which components of the technology are publicly available and whether there are restrictions limiting its use. Some use the term open science to describe the broader philosophy.
The AI Alliance — led by IBM and Meta and including Dell, Sony, chipmakers AMD and Intel and several universities and AI startups — is “coming together to articulate, simply put, that the future of AI is going to be built fundamentally on top of the open scientific exchange of ideas and on open innovation, including open source and open technologies,” Gil said in an interview with The Associated Press ahead of its unveiling.
Part of the confusion around open-source AI is that despite its name, OpenAI — the company behind ChatGPT and the image-generator DALL-E — builds AI systems that are decidedly closed.
“To state the obvious, there are near-term and commercial incentives against open source,” said Ilya Sutskever, OpenAI’s chief scientist and co-founder, in a video interview hosted by Stanford University in April. But there’s also a longer-term worry involving the potential for an AI system with “mind-bendingly powerful” capabilities that would be too dangerous to make publicly accessible, he said.
To make his case for open-source dangers, Sutskever posited an AI system that had learned how to start its own biological laboratory.
IS IT DANGEROUS?
Even current AI models pose risks and could be used, for instance, to ramp up disinformation campaigns to disrupt democratic elections, said University of California, Berkeley scholar David Evan Harris.
“Open source is really great in so many dimensions of technology,” but AI is different, Harris said.
“Anyone who watched the movie ‘Oppenheimer’ knows this, that when big scientific discoveries are being made, there are lots of reasons to think twice about how broadly to share the details of all of that information in ways that could get into the wrong hands,” he said.
The Center for Humane Technology, a longtime critic of Meta’s social media practices, is among the groups drawing attention to the risks of open-source or leaked AI models.
“As long as there are no guardrails in place right now, it’s just completely irresponsible to be deploying these models to the public,” said the group’s Camille Carlton.
IS IT FEAR-MONGERING?
An increasingly public debate has emerged over the benefits or dangers of adopting an open-source approach to AI development.
Meta’s chief AI scientist, Yann LeCun, this fall took aim on social media at OpenAI, Google and startup Anthropic for what he described as “massive corporate lobbying” to write the rules in a way that benefits their high-performing AI models and could concentrate their power over the technology’s development. The three companies, along with OpenAI’s key partner Microsoft, have formed their own industry group called the Frontier Model Forum.
LeCun said on X, formerly Twitter, that he worried that fearmongering from fellow scientists about AI “doomsday scenarios” was giving ammunition to those who want to ban open-source research and development.
“In a future where AI systems are poised to constitute the repository of all human knowledge and culture, we need the platforms to be open source and freely available so that everyone can contribute to them,” LeCun wrote. “Openness is the only way to make AI platforms reflect the entirety of human knowledge and culture.”
For IBM, an early supporter of the open-source Linux operating system in the 1990s, the dispute feeds into a much longer competition that precedes the AI boom.
“It’s sort of a classic regulatory capture approach of trying to raise fears about open-source innovation,” said Chris Padilla, who leads IBM’s global government affairs team. “I mean, this has been the Microsoft model for decades, right? They always opposed open-source programs that could compete with Windows or Office. They’re taking a similar approach here.”
WHAT ARE GOVERNMENTS DOING?
It was easy to miss the “open-source” debate in the discussion around U.S. President Joe Biden’s sweeping executive order on AI.
That’s because Biden’s order described open models with the highly technical name of “dual-use foundation models with widely available weights” and said they needed further study. Weights are numerical parameters that influence how an AI model performs.
“When the weights for a dual-use foundation model are widely available — such as when they are publicly posted on the Internet — there can be substantial benefits to innovation, but also substantial security risks, such as the removal of safeguards within the model,” Biden’s order said. He gave U.S. Commerce Secretary Gina Raimondo until July to talk to experts and come back with recommendations on how to manage the potential benefits and risks.
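A minimal sketch of what “widely available weights” means in practice. This is not from the executive order or any real model release; the toy model and file name are purely illustrative. The point is that a model’s behavior is fully determined by its numerical parameters, so publishing the weights lets anyone reproduce the model exactly, and also modify it, which is the safeguard-removal risk the order describes.

```python
# Illustrative sketch only: a toy "model" whose behavior is entirely
# captured by its numerical weights. Not any real released model.
import json

import numpy as np

rng = np.random.default_rng(0)

# One linear layer; these numbers *are* the model.
weights = {"w": rng.normal(size=(4, 2)).tolist(), "b": [0.0, 0.0]}

# "Publicly posting the weights" amounts to publishing a file like this;
# anyone who downloads it can rebuild the model.
with open("toy_model_weights.json", "w") as f:
    json.dump(weights, f)

# Anyone with the file can reload the parameters...
with open("toy_model_weights.json") as f:
    loaded = json.load(f)
w, b = np.array(loaded["w"]), np.array(loaded["b"])

def predict(x):
    """Toy forward pass: the output depends only on the loaded weights."""
    return x @ w + b

print(predict(np.ones(4)))

# ...and can also alter them (fine-tune, or strip out built-in safeguards),
# which is the security concern regulators are weighing.
w += rng.normal(scale=0.01, size=w.shape)
print(predict(np.ones(4)))
```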
The European Union has less time to figure it out. In negotiations coming to a head Wednesday, officials working to finalize passage of world-leading AI regulation are still debating a number of provisions, including one that could exempt certain “free and open-source AI components” from rules affecting commercial models.