FACT FOCUS: Zoom says it isn’t training AI on calls without consent. But other data is fair game
An update to Zoom’s terms of service is worrying some online that the company now has permission to use their videos and chat logs for artificial intelligence training with no ability to opt out
An update to Zoom’s terms of service is raising alarm bells on social media, with users claiming it reveals the videoconferencing company is now tapping their online doctor visits and virtual happy hours to train artificial intelligence models.
“Zoom terms of service now require you to allow AI to train on ALL your data — audio, facial recognition, private conversations — unconditionally and irrevocably, with no opt out,” read one widely shared tweet this week that has since been deleted. “Don’t try to negotiate with our new overlords.”
The company quickly responded with a blog post on Monday stressing that it “will not use audio, video, or chat customer content to train our artificial intelligence models without your consent,” and adding a line to the terms to make this clearer.
Online privacy experts say that this policy is now accurately reflected in the document. However, the terms do still allow Zoom to train AI on other data, such as how customers behave — and they question how much choice some meeting participants will have to opt out if, say, their boss decides otherwise.
Here’s a closer look at the facts.
CLAIM: Zoom’s terms of service give the company permission to use all customer data, including private conversations, for training artificial intelligence, with no ability to opt out.
THE FACTS: That is not accurate, at least now that Zoom has added language to the terms to make its policy clearer, experts say.
The current terms would not allow the company to tap user-generated content like video and chat for AI training without a customer opting in. However, once a meeting host agrees, other participants would have to leave if they don’t want to consent. The terms also allow Zoom to use other data, including information about user behavior, without additional permission.
“The face of these terms of service does now assure the user that Zoom is not going to use their customer content for the purpose of training artificial intelligence models without their consent,” John Davisson, director of litigation and senior counsel at the Electronic Privacy Information Center, told The Associated Press.
At issue is language Zoom added to its terms in March. The document differentiates between two types of data: “service generated data,” such as what features customers use and what part of the world they are in, and “customer content,” which is the data created by users themselves, such as audio or chat transcripts.
The terms state that service-generated data can be used for “machine learning or artificial intelligence (including the purposes of training and tuning algorithms and models).” Zoom's blog post says the company considers such data “to be our data,” and experts confirm this language would allow the company to use this data for AI training without obtaining additional consent.
Separately, the terms say that customer content may also be used “for the purpose” of machine learning or AI.
After this was highlighted on social media this week, the company clarified in its post that this refers to new generative AI features that users must agree to, which create things like automated meeting summaries for customers. Zoom said in a statement to the AP that in addition to enabling the features for themselves, users must separately consent to sharing this data with the company.
Experts said that the language in the March update was wide-reaching and could have opened the door for the company to use that data without additional permission if it wanted to.
But Zoom added a more explicit caveat to the terms on Monday, saying: “Notwithstanding the above, Zoom will not use audio, video or chat Customer Content to train our artificial intelligence models without your consent.”
With this language, Davisson said that using such data to train AI without a user consenting would now constitute a violation of the terms on Zoom’s part, opening the company up to litigation.
However, experts said the way this function works could still pose problems for some participants in Zoom calls if the host opts into the generative AI features.
Zoom says that if a meeting organizer decides to use the meeting summary feature, participants are sent a notification alerting them that an AI feature has been enabled and that their data may be shared for AI training. They are prompted to either proceed with the meeting or to leave.
Although this in theory offers all participants the ability to control how their data is used, it may not be possible for someone to opt out of a meeting or forgo Zoom altogether if they disagree, said Katharine Trendacosta, director of policy and advocacy at the Electronic Frontier Foundation.
“If the administrator consents and it’s your boss at your work who requires you to use Zoom, how is that really consent?” she asked.
Davisson raised similar concerns.
“That sort of just-in-time acquisition of ‘consent’ is not real consent,” he said. “And so that’s a pretty misleading caveat from them to introduce. It claims to rest on consent, but what it’s resting on really is just sort of what the meeting organizer or system administrator has decided to do.”
___
This is part of AP’s effort to address widely shared misinformation, including work with outside companies and organizations to add factual context to misleading content that is circulating online. Learn more about fact-checking at AP.