Data retention policy
We retain user data, including search queries and interaction logs, for as long as necessary to deliver our AI-powered search functionality and optimize the user experience. Data is stored for a maximum of 30 days, unless otherwise required by law or agreed upon with the user. Regular reviews are conducted to delete data no longer needed for active services.
Data archiving and removal policy
User data from connected systems like Google Drive is archived when it is no longer actively required but may still be needed for legal, compliance, or auditing purposes. Archived data is securely stored and access is restricted to authorized personnel. Upon user request, or at the end of the applicable data retention period, data is permanently deleted from our systems within 30 days, unless legal obligations require otherwise.
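As a minimal sketch of how a scheduled 30-day purge could be implemented (not a description of our actual systems), the following Python example deletes records older than the retention window; the database file, table, and column names are hypothetical placeholders.

```python
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 30  # maximum storage period described in the policy above

def purge_expired_records(db_path: str) -> int:
    """Permanently delete rows older than the retention window.

    The table and column names (interaction_logs, created_at) are
    hypothetical placeholders used only for this example.
    """
    cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
    with sqlite3.connect(db_path) as conn:
        cursor = conn.execute(
            "DELETE FROM interaction_logs WHERE created_at < ?",
            (cutoff.isoformat(),),
        )
        conn.commit()
        return cursor.rowcount  # rows permanently removed

if __name__ == "__main__":
    print(f"Purged {purge_expired_records('app_data.db')} expired records")
```

A job like this would typically run on a recurring schedule (e.g., daily) so that expired data never outlives the retention window by more than one cycle.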
Data storage policy
All data is securely stored in our cloud provider's data centers (e.g., AWS) with encryption applied both at rest and in transit. Data from connected services (e.g., Google Drive, SharePoint) is accessed securely using OAuth or other approved authorization mechanisms, and is retained only for as long as is needed to process user queries or, at most, for as long as the user maintains an account with us. We implement robust access controls so that only authorized personnel can access data, and we regularly audit our infrastructure for security and compliance with industry standards.
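To illustrate OAuth-based, on-demand access to a connected service, here is a hedged Python sketch that lists Google Drive file metadata using a read-only scope. The token path and stored-credential flow are assumptions for the example and do not describe our production integration.

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Read-only scope: the app can read files and metadata but cannot modify anything.
SCOPES = ["https://www.googleapis.com/auth/drive.readonly"]

def list_drive_files(token_path: str) -> list[dict]:
    """Fetch file metadata on demand; nothing here is written to long-term storage."""
    # token_path points to a previously issued OAuth user token (hypothetical path).
    creds = Credentials.from_authorized_user_file(token_path, SCOPES)
    drive = build("drive", "v3", credentials=creds)
    response = (
        drive.files()
        .list(pageSize=10, fields="files(id, name, mimeType)")
        .execute()
    )
    return response.get("files", [])
```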
Data center location(s)
United States
Data hosting details
Cloud-hosted
App/service has sub-processors
Yes
App/service uses large language models (LLM)
Yes
LLM model(s) used
text-embedding-3-small, gpt-4-turbo, rerank-english-v3.0
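For illustration only, the sketch below shows one way these three models could fit together in a retrieval flow (embed the query, rerank candidate passages with Cohere, then generate an answer with gpt-4-turbo). The function names, prompt, and pipeline shape are assumptions for the example, not a description of our actual implementation.

```python
import os
import cohere
from openai import OpenAI

openai_client = OpenAI()  # reads OPENAI_API_KEY from the environment
cohere_client = cohere.Client(api_key=os.environ["CO_API_KEY"])

def answer_query(query: str, candidate_passages: list[str]) -> str:
    """Embed the query, rerank candidate passages, then generate an answer."""
    # 1. Query embedding (would feed a vector search upstream; not shown here).
    embedding_response = openai_client.embeddings.create(
        model="text-embedding-3-small", input=query
    )
    _query_vector = embedding_response.data[0].embedding

    # 2. Rerank the candidate passages with Cohere.
    reranked = cohere_client.rerank(
        model="rerank-english-v3.0",
        query=query,
        documents=candidate_passages,
        top_n=3,
    )
    top_passages = [candidate_passages[r.index] for r in reranked.results]

    # 3. Generate the final answer with gpt-4-turbo, grounded in the top passages.
    completion = openai_client.chat.completions.create(
        model="gpt-4-turbo",
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{top_passages}\n\nQuestion: {query}"},
        ],
    )
    return completion.choices[0].message.content
```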
LLM retention settings
Data sent to OpenAI and Cohere is not stored after processing. Neither provider retains the data for training or long-term storage. Queries are temporarily cached only for the duration of processing and are deleted after use, in line with each provider's retention policy.
LLM data tenancy policy
Our application ensures data isolation between users. Data sent to OpenAI and Cohere models is processed independently for each user and is not shared between users. No user data is retained by the LLMs beyond the scope of the query, ensuring strict tenant separation.
LLM data residency policy
Data processed by OpenAI and Cohere models is subject to those providers' data residency policies. While we do not control data residency during LLM processing, both providers comply with global privacy standards, including GDPR, ensuring data is handled securely.