BurpGPT
Supported Model Providers

BurpGPT currently supports the following LLM providers. For Local or Custom models, communication uses the standard OpenAI API schema:

  • Anthropic

  • Google AI Gemini

  • Mistral AI

  • Ollama

  • OpenAI

  • Local / Custom
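The OpenAI API schema mentioned above means a Local or Custom endpoint accepts the same request body as OpenAI's `/v1/chat/completions` route. Below is a minimal sketch of such a request body; the base URL and model name are assumptions for illustration (the URL shown is Ollama's default OpenAI-compatible address), so substitute whatever your local server exposes.

```python
import json

# Hypothetical local endpoint (Ollama's default OpenAI-compatible address);
# adjust for your own setup.
BASE_URL = "http://localhost:11434/v1"

# Request body following the OpenAI chat-completions schema.
# "llama3" is a hypothetical local model name.
payload = {
    "model": "llama3",
    "messages": [
        {"role": "system", "content": "You are a security analyst."},
        {"role": "user", "content": "Summarise this HTTP response."},
    ],
    "temperature": 0.2,
}

# POSTing this JSON to f"{BASE_URL}/chat/completions" with a
# 'Content-Type: application/json' header is all an OpenAI-compatible
# server needs; no OpenAI account or API key is involved.
body = json.dumps(payload)
print(body)
```

Because the schema is identical, any client or library that speaks the OpenAI API can be pointed at a local server simply by overriding its base URL.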

