Privacy Principles: Search, Learning and Artificial Intelligence
Our mission is to build a product that makes work life simpler, more pleasant and more productive. Our guiding principle as we build this product is that the privacy and security of Customer Data are sacrosanct, as detailed in our Privacy Policy, Security Documentation, SPARC and the Slack Terms.
Machine Learning (ML) and Artificial Intelligence (AI) are useful tools that we use in limited ways to enhance our product mission. We do not develop LLMs or other generative models using Customer Data. To develop non-generative AI/ML models for features such as emoji and channel recommendations, our systems analyze Customer Data (e.g., messages, content, and files) submitted to Slack as well as Other Information (including usage information) as defined in our Privacy Policy and in your customer agreement. To ensure the privacy and security of Customer Data in this particular context, we have a few guiding principles:
- Data will not leak across workspaces. For any model that will be used broadly across our customers, we do not build or train it in such a way that it could learn, memorize, or be able to reproduce any part of Customer Data.
- We have technical controls in place to prevent access. When developing AI/ML models or otherwise analyzing Customer Data, Slack can’t access the underlying content. We have various technical measures preventing this from occurring. Please read our Security White Paper for more information on the controls that protect the confidentiality and security of Customer Data.
- We offer Customers a choice around these practices. If you want to exclude your Customer Data from helping train Slack global models, you can opt out. If you opt out, Customer Data on your workspace will only be used to improve the experience on your own workspace and you will still enjoy all of the benefits of our globally trained AI/ML models without contributing to the underlying models.
Contact us to opt out. If you want to exclude your Customer Data from Slack global models, you can opt out. To do so, please have your Org or Workspace Owners or Primary Owner contact our Customer Experience team at [email protected] with your Workspace/Org URL and the subject line “Slack Global model opt-out request.” We will process your request and respond once the opt-out has been completed.
Customer Data and Other Information
How Slack may use Customer Data (e.g. messages, content, files) and Other Information to update our services
Working from the above principles, here are a few examples of improvements and privacy-protective techniques that our product and analytics teams may use to develop, update and improve Slack:
- Channel Recommendations: We may use insights to recommend that a user join a new public channel in their company. We make these suggestions based on channel membership, activity, and topic overlaps. Our model learns from previous suggestions and whether or not a user joins the channel we recommend. We protect privacy while doing so by separating our model from Customer Data: we use external models (not trained on Slack messages) to evaluate topic similarity, outputting numerical scores, and our global model makes recommendations based only on these numerical scores and non-Customer Data (see the first sketch after this list). For more technical details, please visit our Engineering Blog.
- Search Results: Our search machine learning models help users find what they're seeking by identifying the right results for a particular query. We do this based on historical search results and previous engagements, without learning from the underlying text of the search query, the result, or any proxy for that text. Simply put, our model can't reconstruct the search query or result. Instead, it learns from team-specific, contextual information like the number of times a message has been clicked in a search or the number of words shared by the query and the recommended message (see the second sketch after this list).
- Autocomplete: Slack might make suggestions to complete search queries or other text, for example autocompleting the phrase "Customer Support" after a user types the first several letters of this phrase. These suggestions are local and sourced from common public message phrases in the user’s workspace. The algorithm that picks from potential suggestions is trained globally on previously suggested and accepted completions. We protect data privacy by scoring the similarity between the typed text and each suggestion in various ways, and by feeding only those numerical scores and counts of past interactions into the algorithm (see the third sketch after this list).
- Emoji Suggestion: Slack might suggest emoji reactions to messages using the content and sentiment of the message, the historical usage of the emoji, and how frequently the emoji is used on the team in various contexts. For instance, if 🎉 is a common reaction to celebratory messages in a particular channel, we will suggest that users react to new, similarly positive messages with 🎉. To do this while protecting Customer Data, we might use an external model (not trained on Slack messages) to classify the sentiment of the message. Our model would then suggest an emoji considering only the frequency with which a particular emoji has been associated with messages of that sentiment in that workspace (see the final sketch after this list).
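The first sketch below illustrates the score-only pattern described in the channel-recommendation example. It is a minimal Python sketch, not Slack's actual implementation: the signal names, weights, and channel data are hypothetical, and the point is simply that only numerical scores and non-content signals ever reach the recommendation model.

```python
# Hypothetical sketch: a channel-recommendation score built only from
# numerical signals, never from message text itself.
from dataclasses import dataclass

@dataclass
class ChannelSignals:
    topic_similarity: float    # score from an external model not trained on Slack messages
    membership_overlap: float  # share of the channel's members the user already works with
    activity_level: float      # normalized recent activity in the channel

# Illustrative weights; in practice they would be learned from whether users
# joined previously recommended channels.
WEIGHTS = {"topic_similarity": 0.5, "membership_overlap": 0.3, "activity_level": 0.2}

def recommendation_score(signals: ChannelSignals) -> float:
    """Combine only numerical, non-content signals into a single ranking score."""
    return (WEIGHTS["topic_similarity"] * signals.topic_similarity
            + WEIGHTS["membership_overlap"] * signals.membership_overlap
            + WEIGHTS["activity_level"] * signals.activity_level)

candidates = {
    "#design-systems": ChannelSignals(0.82, 0.40, 0.65),
    "#random": ChannelSignals(0.10, 0.90, 0.95),
}
ranked = sorted(candidates, key=lambda name: recommendation_score(candidates[name]), reverse=True)
print(ranked)  # the global model sees only the scores, not channel content
```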
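The second sketch shows the same idea for search ranking, again with hypothetical feature names and weights: the ranker consumes engagement counts and a word-overlap score derived from the query and candidate message, never the text itself.

```python
# Hypothetical sketch: search-ranking features derived from engagement
# counts and word overlap, so the ranking model never sees the raw text.
def overlap_count(query: str, message: str) -> int:
    """Number of distinct words the query and a candidate message share."""
    return len(set(query.lower().split()) & set(message.lower().split()))

def ranking_features(query: str, message: str, historical_clicks: int) -> list[float]:
    # Only counts and scores leave this function; the raw text does not.
    return [float(historical_clicks), float(overlap_count(query, message))]

def rank_score(features: list[float], weights=(0.7, 0.3)) -> float:
    # Illustrative weights; a real ranker would learn them from past engagements.
    return sum(w * f for w, f in zip(weights, features))

print(rank_score(ranking_features("quarterly report", "Q3 quarterly report draft", historical_clicks=12)))
```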
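The third sketch covers autocomplete under the same assumptions: candidate phrases and acceptance counts stay local to the workspace, and the globally trained component sees only similarity scores and counts. All names and numbers here are illustrative.

```python
# Hypothetical sketch: completions are sourced locally from the workspace,
# while the global scorer sees only similarity scores and acceptance counts.
def prefix_similarity(typed: str, candidate: str) -> float:
    """Crude similarity: fraction of the candidate phrase already typed."""
    typed, candidate = typed.lower(), candidate.lower()
    return len(typed) / len(candidate) if candidate.startswith(typed) else 0.0

def score_completion(typed: str, candidate: str, times_accepted: int) -> float:
    # The globally trained component consumes only these numbers,
    # not the workspace phrases themselves.
    return 0.6 * prefix_similarity(typed, candidate) + 0.4 * min(times_accepted / 100, 1.0)

local_phrases = {"Customer Support": 42, "Customer Success": 7}  # per-workspace acceptance counts
typed = "Customer Su"
best = max(local_phrases, key=lambda p: score_completion(typed, p, local_phrases[p]))
print(best)  # "Customer Support"
```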
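The final sketch walks through the emoji-suggestion flow as described above, with a stand-in for the external sentiment classifier and hypothetical per-workspace reaction counts.

```python
# Hypothetical sketch: an emoji suggestion chosen from per-workspace reaction
# frequencies keyed by a sentiment label from an external classifier.
from collections import Counter

def classify_sentiment(message: str) -> str:
    """Stand-in for an external model not trained on Slack messages."""
    celebratory_words = ("congrats", "shipped", "launch")
    return "celebratory" if any(w in message.lower() for w in celebratory_words) else "neutral"

# Per-workspace counts of how often each emoji reacted to messages of each sentiment.
emoji_by_sentiment = {
    "celebratory": Counter({"🎉": 120, "🚀": 45}),
    "neutral": Counter({"👍": 300, "👀": 80}),
}

def suggest_emoji(message: str) -> str:
    sentiment = classify_sentiment(message)
    # The suggestion step sees only the sentiment label and the counts.
    return emoji_by_sentiment[sentiment].most_common(1)[0][0]

print(suggest_emoji("We shipped the new feature, congrats team!"))  # 🎉
```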
These types of thoughtful personalizations and improvements are only possible if we study and understand how our users interact with Slack.
Slack takes privacy seriously and our confidentiality obligations described in our customer agreements and Privacy Policy apply in each of these scenarios. Customers own their own Customer Data. Slack aggregates and disassociates Customer Data such that Slack’s use of Customer Data to update the Services will never identify any of our customers or individuals as the source of any of these improvements to any third party, other than to Slack’s affiliates or sub-processors.
Generative AI
Generative AI is a newer category of AI systems that can generate content, such as text, in response to prompts a user enters. This AI category includes Large Language Models (LLMs). Slack uses generative AI in its Slack AI product offering, leveraging third-party LLMs.
Customers purchase Slack AI as an add-on, and the generative AI functionality is not included in the standard Slack offering. No Customer Data is used to train third-party LLMs. Slack does not train LLMs or other generative models on Customer Data, or share Customer Data with any LLM providers. Learn more about How We Built Slack AI To Be Secure and Private.
Slack AI uses off-the-shelf LLMs; the models are not updated with Customer Data and do not otherwise retain it after a request. Additionally, because Slack AI hosts these models on its own AWS infrastructure, Customer Data never leaves Slack’s trust boundary, and the LLM providers never have any access to the Customer Data.
Slack AI Generative Search Answers
Slack AI’s Search functionality pulls from the following sources: Slack features (canvases, huddles canvas notes, clip transcripts and text snippets); files uploaded to Slack (PDFs, emails, docx, pptx, and Keynote); linked documents from Google Drive (Docs, Slides) and SharePoint/OneDrive (Word, PowerPoint); and file storage partner apps like Box (when installed). The user must be authenticated via the integration with Slack to access these files. Admin settings are available to disable all file results or to disable sourcing from externally hosted files. Learn more in the Help Center or admin guide deck.