Enterprise workplace collaboration platform Slack has sparked a privacy backlash with the revelation that it has been scraping customer data, including messages and files, to develop new AI and ML models.
By default, and without requiring users to opt in, Slack said its systems have been analyzing customer data and usage information to build AI/ML models to improve the software.
The company insists it has technical controls in place to block Slack from accessing the underlying content, and it promises that data will not leak across workspaces. Despite these assurances, corporate Slack admins are scrambling to opt out of the data scraping.
Multiple CISOs polled by SecurityWeek say they're not surprised to hear that Slack - like many big-tech vendors - is developing AI/ML models on data flowing through its platform, but they grumbled that customers should not bear the burden of opting out of this data scraping.
In a social media post responding to critics, Slack said it uses platform-level machine-learning models for features like channel and emoji recommendations and search results, and it insists that customers can exclude their data from training those models.
The company said Slack AI - a generative-AI experience built natively into Slack - is a separately purchased add-on that uses large language models (LLMs) but does not train those LLMs on customer data.
In its documentation, Slack said data will not leak across workspaces.
Published on www.securityweek.com on Sat, 18 May 2024 08:43:05 +0000.