BurpGPT

Does the Pro edition's local LLM feature share any data with OpenAI or any third party?

This page addresses a frequently asked question.

When you use the local LLM feature with the pre-built model type options from the Hugging Face Transformers library, it downloads the pre-trained model and associated files from the Hugging Face Model Hub. However, this process does not send any of your data to Hugging Face or any other external service. Text generation itself is performed locally on your machine, so your prompts and the generated output stay within your system.
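
For illustration, the following Python sketch shows how a Transformers-based local model behaves in practice; the model name (gpt2) and the prompt are arbitrary examples rather than BurpGPT defaults. The first call downloads and caches the model files from the Hugging Face Model Hub, and generation then runs entirely on your machine.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# First use downloads the model files from the Hugging Face Model Hub and
# caches them locally (by default under ~/.cache/huggingface).
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Generation is performed locally: the prompt and the generated text never
# leave your machine.
inputs = tokenizer("Summarise the HTTP response below:", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Once the files are cached, you can also set the TRANSFORMERS_OFFLINE=1 environment variable to block any further Hub requests and confirm that generation works without network access.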

To summarise, the local LLM feature only downloads and caches the model files from the Hugging Face Model Hub; no data is sent to any external service during the actual text generation process.
