Summarize

POST

Given a text, the model will return a summary.

Optionally include instructions to influence the way the summary is generated.

If use_context is set to true, the model will also use the content of the ingested documents when generating the summary. The documents used can be filtered by their metadata with the context_filter; the metadata of ingested documents can be retrieved from the /ingest/list endpoint. To use all ingested documents, omit context_filter altogether.

If prompt is set, it will be used as the prompt for the summarization; otherwise, the default prompt will be used.
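For illustration, the sketch below sends a non-streaming request with Python's requests library and prints the returned summary. The base URL, the /v1/summarize path, and the docs_ids field inside context_filter are assumptions made for this example; verify them against your deployment's API schema, and take real document IDs from the /ingest/list endpoint.

import requests

# Minimal sketch, not an official client. The base URL, the /v1/summarize
# path and the docs_ids field of context_filter are assumptions; verify
# them against your deployment's API schema.
BASE_URL = "http://localhost:8001"

payload = {
    "text": "Long document text to be summarized ...",
    "use_context": True,
    # Restrict retrieval to specific ingested documents. The ID below is a
    # placeholder; real IDs come from the /ingest/list endpoint. Omit
    # context_filter entirely to use all ingested documents.
    "context_filter": {"docs_ids": ["<doc-id-from-ingest-list>"]},
    "instructions": "Summarize in three concise bullet points.",
    "stream": False,
}

response = requests.post(f"{BASE_URL}/v1/summarize", json=payload, timeout=120)
response.raise_for_status()
print(response.json()["summary"])

A custom prompt can be passed in the same payload to replace the default summarization prompt, as described above.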

When stream is set to true, the API returns data chunks following OpenAI’s streaming format:

{"id":"12345","object":"completion.chunk","created":1694268190,
"model":"private-gpt","choices":[{"index":0,"delta":{"content":"Hello"},
"finish_reason":null}]}

Request

This endpoint expects an object.
text (string, optional): the text to summarize.
use_context (boolean, optional): whether to also use content from the ingested documents.
context_filter (object, optional): metadata filter restricting which ingested documents are used.
prompt (string, optional): custom prompt to use for the summarization instead of the default one.
instructions (string, optional): instructions that influence the way the summary is generated.
stream (boolean, optional): when true, the response is returned as data chunks following OpenAI’s streaming format.

Response

This endpoint returns an object.
summary (string): the generated summary.