BurpGPT
Use supported local model providers

  • With Hugging Face
  • With the Ollama provider
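Before configuring the Ollama provider in BurpGPT, it can help to confirm that a local Ollama instance is installed, running, and serving a model. A minimal sketch using Ollama's standard CLI and default port (11434); the model name `llama3` is only an example, not a BurpGPT requirement:

```shell
# Download an example model into the local Ollama store.
ollama pull llama3

# Start the Ollama server if it is not already running
# (it listens on http://localhost:11434 by default).
ollama serve &

# Confirm the server is reachable and list locally available models.
curl http://localhost:11434/api/tags
```

If the `curl` call returns a JSON list that includes the pulled model, the local endpoint is ready to be entered in BurpGPT's provider settings.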