Does the Pro edition's local LLM feature share any data with OpenAI or any third party?

This page addresses a frequently asked question.

When you use the local LLM feature with one of the pre-built model options from the Hugging Face Transformers library, the library downloads the pre-trained model weights and associated files from the Hugging Face Model Hub. This download does not send any of your data to Hugging Face or any other external service. Text generation itself runs locally on your machine, so your prompts and the generated text never leave your system.
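If you want to verify this behaviour yourself, the Hugging Face libraries support an offline mode. Once the model files have been cached by the first run, you can set the environment variables below (documented by Hugging Face) so that any unexpected network request fails with an error instead of going out silently. This is a sketch that assumes the feature honours standard Transformers/Hub configuration; your setup may differ.

```shell
# After the first run has cached the model files locally, force offline mode.
# With these set, the Transformers and Hub libraries raise an error on any
# attempted network access rather than contacting an external service.
export HF_HUB_OFFLINE=1
export TRANSFORMERS_OFFLINE=1
```

If generation still works with these variables set, you have direct evidence that everything needed is already on your machine and no network calls are being made.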

To summarise: using the local LLM feature only downloads (and then caches) the model files from the Hugging Face Model Hub, and no data is sent to any external service during text generation.
