Artists Urge Action on AI, but Congress Is Slow to Respond
22.05.2024 - 18:37
/ variety.com
Gene Maddaus, Senior Media Writer

Scarlett Johansson called out OpenAI this week for mimicking her voice for a new chatbot, underscoring the urgency that many artists feel about regulating artificial intelligence. But in Congress, taking on AI is starting to resemble wrestling an octopus. It’s so sprawling, it’s hard to know how to begin.
The Senate AI Working Group issued a “roadmap” last week, but the document left it vague where Congress is going or when it will get there. Some lawmakers have warned against stifling innovation.
And while some regulation is likely to happen eventually, it appears it will be done piecemeal — one tentacle at a time. “From the House perspective, part of the challenge is that we’re under Republican control,” Democratic Rep. Ted Lieu of California, co-chair of the House task force on AI, tells Variety.
“It’s been very chaotic. We’re just trying to stop stupid stuff.”

Hollywood unions and artists’ organizations argue that AI trains on artists’ work, and — if left unchecked — will churn out cheap imitations that will steal their jobs. They are lobbying for proposals to address copyright concerns and to outlaw nonconsensual AI “deepfakes,” which threaten performers.
Some advocates envision a licensing system like ASCAP or BMI, in which AI companies would pay royalties to train on copyrighted work, with artists having the ability to opt out. “Our interest is in making sure individual creators are being paid for the ingestion of their content into AI platforms, for which they are not currently remunerated despite the fact that their souls are being stolen,” says James Silverberg, CEO of the American Society for Collective Rights Licensing, which distributes royalties to illustrators and photographers.