Use the OpenAI API feature

This page outlines the steps involved in using the OpenAI API integration.


To use this feature, you need an active OpenAI account with sufficient credits for API consumption. Note that creating an OpenAI account and assigning credits to it are beyond the scope of this guide.

For more information on pricing and how to acquire credits, please refer to OpenAI Pricing.

It is important to note that using this feature involves sharing data with OpenAI. We therefore highly recommend that you read OpenAI's privacy policy and ensure that you are comfortable with it. If you work with clients or sensitive projects, or are concerned about data privacy, we recommend upgrading to BurpGPT Pro and using its Local LLM feature instead to keep your data private and secure.

Configuration

  1. Go to the Azure/OpenAI API tab.

  2. Enter your OpenAI API key into the API key field. You can obtain a key from your OpenAI account dashboard.

Keep your API key confidential and do not share it with anyone. The API key is linked to your account and can affect your billing. This key is necessary for communicating with the OpenAI API.
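Because the key is tied to your account and billing, avoid hard-coding it in scripts or notes that might be shared. A minimal sketch of one common approach, assuming the key is exported as an environment variable (the `OPENAI_API_KEY` name and `mask` helper are illustrative, not part of BurpGPT):

```python
import os

def load_api_key(env_var: str = "OPENAI_API_KEY") -> str:
    """Read the API key from the environment instead of hard-coding it."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"{env_var} is not set; refusing to continue")
    return key

def mask(key: str) -> str:
    """Show only the first and last few characters when logging the key."""
    return key[:3] + "..." + key[-4:] if len(key) > 8 else "***"
```

For example, `mask("sk-test-1234567890")` returns `"sk-...7890"`, which is safe to write to a log file.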

If you want to avoid subscription fees for the OpenAI API, consider using the Local LLM feature. With this feature, you can issue unlimited queries without relying on the OpenAI API.

  3. Select one of the pre-built models from the Model dropdown field. The number of data points used to train the selected model is displayed under the Model size field.

  4. To optimise the performance and usage of OpenAI's models, set a Max prompt length. This parameter controls how much information you can provide to the model in a single query.

    • Max prompt length: determines the maximum size of your prompt once the placeholders have been replaced.

  5. Choose a Role from the following options:

    • Assistant: This role represents the model’s responses to user messages. The assistant role generates the actual reply to the user’s query.

    • System: This role is used to specify the way the model answers questions. For example, if the model is designed to be a helpful assistant, the system role would be "You are a helpful assistant".

    • User: This role is equivalent to the queries made by the user. It is used to provide the model with the necessary context to generate a response.
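The roles above correspond to the `messages` array of OpenAI's chat completions API. A sketch of how a prompt might be assembled under these settings, assuming the Max prompt length is measured in characters and applied after placeholder substitution (BurpGPT's exact internals may differ; the names below are illustrative):

```python
# Illustrative cap mirroring the extension's "Max prompt length" setting.
MAX_PROMPT_LENGTH = 2048

def build_messages(system_prompt: str, user_prompt: str) -> list[dict]:
    """Map the System and User roles onto a chat completions payload."""
    # Truncate the user prompt once placeholders have been replaced,
    # so the final payload stays within the configured limit.
    truncated = user_prompt[:MAX_PROMPT_LENGTH]
    return [
        {"role": "system", "content": system_prompt},  # how the model should answer
        {"role": "user", "content": truncated},        # the query plus HTTP context
    ]

messages = build_messages(
    "You are a helpful assistant.",
    "Analyse this HTTP response for security issues: ...",
)
# The "assistant" role appears in the API *response* rather than the
# request, e.g. {"role": "assistant", "content": "..."}.
```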

Test model

After configuring the model settings, you can test the selected model by clicking the Test button. This sends a test query to the OpenAI API.

The results of your query will be displayed in a dialog box.
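For reference, a test query of this kind is a POST to OpenAI's `/v1/chat/completions` endpoint. A sketch using only the standard library, which builds the request without sending it (the model name and prompt are placeholders; BurpGPT's actual test query may differ):

```python
import json
import urllib.request

def make_test_request(api_key: str, model: str = "gpt-3.5-turbo"):
    """Build (but do not send) a minimal chat completions request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": "Say hello."}],
    }
    return urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# To actually send it (requires a valid key and network access):
# with urllib.request.urlopen(make_test_request(key)) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```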

Analyse HTTP traffic

Finally, to scan your HTTP traffic against your model and prompt, you can either:

  • Instruct BurpGPT to use the selected OpenAI model when performing passive scans with Burp Suite by clicking the Passive Scan: OpenAI's API button.

  • Use the custom context menu actions to send relevant requests for analysis: right-click a request/response and select Extensions -> BurpGPT Pro -> Send to OpenAI's API.

View GPT-generated insights

A new Information-level severity issue, named GPT-generated insights, will appear under Target -> Site map and Dashboard -> Issue Activity.

This issue provides detailed information about the query made to the selected model and also includes the model's response.
