Ollama

Get up and running with large language models locally.

Integrate with open-source LLMs using Ollama. With an instance of Ollama running locally, you can use this integration to have a conversation within an incident, download models, and create new models.
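The integration talks to the REST API of a locally running Ollama instance (default `http://localhost:11434`). As a rough illustration of the operations mentioned above, the following is a minimal Python sketch that calls Ollama's documented `/api/pull`, `/api/chat`, and `/api/tags` endpoints directly rather than going through the content pack; the model name `llama3`, the timeouts, and the helper names are assumptions for the example, not part of the pack. Model creation uses the `/api/create` endpoint, whose request format differs between Ollama versions, so it is omitted here.

```python
import requests

# Base URL of a locally running Ollama instance (11434 is Ollama's default port).
OLLAMA_URL = "http://localhost:11434"


def pull_model(name: str) -> None:
    """Download a model into the local Ollama instance."""
    resp = requests.post(
        f"{OLLAMA_URL}/api/pull",
        json={"model": name, "stream": False},
        timeout=600,
    )
    resp.raise_for_status()


def list_models() -> list:
    """Return the names of models already available locally."""
    resp = requests.get(f"{OLLAMA_URL}/api/tags", timeout=30)
    resp.raise_for_status()
    return [m["name"] for m in resp.json().get("models", [])]


def chat(model: str, prompt: str) -> str:
    """Send a single-turn chat message and return the assistant's reply."""
    resp = requests.post(
        f"{OLLAMA_URL}/api/chat",
        json={
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]


if __name__ == "__main__":
    pull_model("llama3")
    print(list_models())
    print(chat("llama3", "Summarize what a phishing incident is in one sentence."))
```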
Integrations
| Name | Description |
|---|---|
| Ollama (Community Contribution) | Integrate with open-source LLMs using Ollama. With an instance of Ollama running locally, you can use this integration to have a conversation within an incident, download models, and create new models. |
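The exact command names and arguments exposed by the Ollama (Community Contribution) integration are not listed on this page. As a hedged sketch only, the snippet below shows how an XSOAR integration command could wrap the chat call above using standard CommonServerPython conventions; the command name `ollama-chat`, the `url` parameter, the `model`/`prompt` arguments, and the `Ollama.Chat` context path are all hypothetical, not the pack's actual interface.

```python
import requests

import demistomock as demisto  # noqa: F401
from CommonServerPython import *  # noqa: F401,F403


def ollama_chat_command(args: dict, base_url: str) -> CommandResults:
    """Hypothetical command wrapper: send a prompt to a local Ollama model."""
    model = args.get("model", "llama3")   # assumed default model
    prompt = args.get("prompt", "")
    resp = requests.post(
        f"{base_url}/api/chat",
        json={"model": model, "messages": [{"role": "user", "content": prompt}], "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    reply = resp.json()["message"]["content"]
    return CommandResults(
        outputs_prefix="Ollama.Chat",     # hypothetical context path
        outputs={"model": model, "prompt": prompt, "reply": reply},
        readable_output=reply,
    )


def main():
    params = demisto.params()
    base_url = params.get("url", "http://localhost:11434")  # assumed parameter name
    command = demisto.command()
    try:
        if command == "ollama-chat":      # hypothetical command name
            return_results(ollama_chat_command(demisto.args(), base_url))
        else:
            raise NotImplementedError(f"Command {command} is not implemented")
    except Exception as e:
        return_error(f"Failed to execute {command}. Error: {str(e)}")


if __name__ in ("__main__", "__builtin__", "builtins"):
    main()
```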
Required Content Packs (1)
| Pack Name | Pack By |
|---|---|
| Base | Cortex XSOAR |
Optional Content Packs (0)
None.
All-level dependencies (1)
| Pack Name | Pack By |
|---|---|
| Base | Cortex XSOAR |
PLATFORMS
Cortex XSOAR, Cortex XSIAM
INFO
| Field | Value |
|---|---|
| Supported By | Community |
| Created | May 8, 2024 |
| Last Release | October 29, 2025 |