TechCrunch Minute: Slack may be training its AI off of your messages — and opting out is harder than you’d think

Written by
Amanda Silberling
Published on
May 20, 2024, 4:04 p.m.

Slack is under fire for its shady policies around using customer data to train its AI.

According to Slack’s privacy principles, customer data like messages and files can be used to train Slack’s AI and machine learning models. That sounds pretty alarming, but it gets worse: if you don’t want your data to become part of a training set, you have to email Slack to opt out.

On Threads, a machine learning engineer at Slack, Aaron Maurer, said, “We don’t train or fine-tune LLMs on your messages, if that’s what you’re worried about.” That is, in fact, exactly what people are worried about. But even if Maurer is right and Slack is not currently training its AI on our data, customers worry that the privacy principles, as written, give Slack license to do so.

Slack told TechCrunch that it does not “build these models in such a way that they could learn, memorize, or be able to reproduce some part of customer data.”

Slack’s comments have only added to the confusion, because they don’t square with what its policy actually says.

The Salesforce-owned company’s fumble underscores how uncomfortable some consumers are with AI. Whether we like it or not, we’ve all probably had our data used to train AI already, because some of the largest LLMs are trained on datasets that essentially scrape the entire internet.

The Slack saga shows how hard it is for the average consumer to know what data any company is collecting from them at any given moment. That information doesn’t need to be so opaque.
