
User Outcry as Slack Scrapes Customer Data for AI Model Training

Slack reveals it has been training AI/ML models on customer data, including messages, files and usage information. Customers are included by default and must email Slack to opt out.


Enterprise workplace collaboration platform Slack has sparked a privacy backlash with the revelation that it has been scraping customer data, including messages and files, to develop new AI and ML models.

By default, and without requiring users to opt in, Slack said its systems have been analyzing customer data and usage information (including messages, content and files) to build AI/ML models to improve the software.

The company insists it has technical controls in place to block Slack from accessing the underlying content and promises that data will not leak across workspaces, but despite these assurances, corporate Slack admins are scrambling to opt out of the data scraping.

This line in Slack’s communication sparked a social media controversy, with the realization that content in direct messages and other sensitive material posted to Slack was being used to develop AI/ML models and that opting out would require sending an e-mail request:

“If you want to exclude your Customer Data from Slack global models, you can opt out. To opt out, please have your org, workspace owners or primary owner contact our Customer Experience team at [email protected] with your workspace/org URL and the subject line ‘Slack global model opt-out request’. We will process your request and respond once the opt-out has been completed.”

Multiple CISOs polled by SecurityWeek said they’re not surprised to hear that Slack, like many big-tech vendors, is developing AI/ML models on data flowing through its platform, but grumbled that customers should not bear the burden of opting out of this data scraping.

In a social media post in response to critics, Slack said it has platform-level machine-learning models for things like channel and emoji recommendations and search results and insists that customers can exclude their data from helping train those (non-generative) ML models. 

The company said Slack AI – which is a gen-AI experience natively built in Slack – is a separately purchased add-on that uses Large Language Models (LLMs) but does not train those LLMs on customer data. “Because Slack AI hosts the models on its own infrastructure, your data remains in your control and exclusively for your organization’s use. It never leaves Slack’s trust boundary and no third parties, including the model vendor, will have access to it,” the company said.


In its documentation, Slack said data will not leak across workspaces. “For any model that will be used broadly across all of our customers, we do not build or train these models in such a way that they could learn, memorize, or be able to reproduce some part of customer data.”

Related: Slack Says Hackers Stole Private Source Code Repositories

Related: Slack Forces Password Resets After Discovering Software Flaw

Related: New Slack Connect DM Feature Raises Security Concerns

Related: Slack Pays Bounty for Critical Vulnerability in Desktop App

Written By

Ryan Naraine is Editor-at-Large at SecurityWeek and host of the popular Security Conversations podcast series. He is a security community engagement expert who has built programs at major global brands, including Intel Corp., Bishop Fox and GReAT. Ryan is a founding-director of the Security Tinkerers non-profit, an advisor to early-stage entrepreneurs, and a regular speaker at security conferences around the world.
