Ollama
Get up and running with large language models locally.

Integrate with open source LLMs using Ollama. With an instance of Ollama running locally, you can use this integration to have a conversation in an Incident, download models, and create new models.
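The integration talks to an Ollama server that you run yourself; by default Ollama listens on http://localhost:11434. The sketch below shows what a single non-streaming chat turn against that local REST API looks like, independent of the integration's own command layer; the model name and prompt are placeholder assumptions.

```python
import requests

# Default base URL of a locally running Ollama server.
OLLAMA_URL = "http://localhost:11434"


def chat(messages, model="llama3"):
    """Send one chat turn to the local Ollama server and return the reply text.

    `model` is a placeholder; any model already pulled into Ollama works.
    """
    resp = requests.post(
        f"{OLLAMA_URL}/api/chat",
        json={
            "model": model,
            "messages": messages,
            "stream": False,  # return one JSON object instead of a token stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]


if __name__ == "__main__":
    history = [{"role": "user", "content": "Summarize what a phishing incident is."}]
    print(chat(history))
```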
Integrations
Name | Description |
---|---|
Ollama (Community Contribution) | Integrate with open source LLMs using Ollama. With an instance of Ollama running locally, you can use this integration to have a conversation in an Incident, download models, and create new models. |
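Downloading and creating models, the other two capabilities the integration lists, correspond to Ollama's /api/pull and /api/create endpoints. A minimal sketch against the same assumed local server follows; the model names and system prompt are placeholders, and the /api/create payload uses the older Modelfile form, which newer Ollama releases replace with a different schema.

```python
import requests

OLLAMA_URL = "http://localhost:11434"  # assumed local Ollama instance


def pull_model(name: str) -> None:
    """Download a model from the Ollama library onto the local server."""
    resp = requests.post(
        f"{OLLAMA_URL}/api/pull",
        json={"name": name, "stream": False},
        timeout=600,  # model downloads can take a while
    )
    resp.raise_for_status()


def create_model(name: str, base: str, system_prompt: str) -> None:
    """Create a derived model that bakes in a custom system prompt."""
    # Modelfile-style payload; recent Ollama versions expect a different
    # /api/create schema, so adjust this for your server version.
    modelfile = f"FROM {base}\nSYSTEM {system_prompt}"
    resp = requests.post(
        f"{OLLAMA_URL}/api/create",
        json={"name": name, "modelfile": modelfile, "stream": False},
        timeout=600,
    )
    resp.raise_for_status()


if __name__ == "__main__":
    pull_model("llama3")
    create_model("incident-helper", "llama3",
                 "You assist SOC analysts with incident triage.")
```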
Required Content Packs (1)
Pack Name | Published By |
---|---|
Base | Cortex XSOAR |
Optional Content Packs (0)
None
All-Level Dependencies (1)
Pack Name | Published By |
---|---|
Base | Cortex XSOAR |
Version History
1.0.0 - 1741355 (May 8, 2024)
PLATFORMS
Cortex XSOAR, Cortex XSIAM
INFO
Supported By | Community
Created | May 8, 2024
Last Release | December 4, 2024