Ollama
Get up and running with large language models locally.
Integrations
Name | Description |
---|---|
Ollama (Community Contribution) | Integrate with open source LLMs using Ollama. With an instance of Ollama running locally you can use this integration to have a conversation in an Incident, download models, and create new models. |
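
The integration's commands are built on Ollama's local REST API. As a minimal sketch, assuming a default local instance on Ollama's standard port (http://localhost:11434), the two use cases called out above — downloading a model and holding a conversation — map onto the `/api/pull` and `/api/chat` endpoints. The helper names below are illustrative, not the pack's actual command names.

```python
# Minimal sketch of the Ollama REST API calls this integration builds on.
# Assumes a default local Ollama instance at http://localhost:11434;
# the endpoint paths are Ollama's documented API, while the helper
# function names here are illustrative only.
import requests

OLLAMA_URL = "http://localhost:11434"


def pull_model(name: str) -> None:
    """Download a model to the local Ollama instance ("download models")."""
    resp = requests.post(
        f"{OLLAMA_URL}/api/pull",
        json={"model": name, "stream": False},
    )
    resp.raise_for_status()


def chat(model: str, messages: list[dict]) -> str:
    """Send a conversation turn and return the assistant's reply
    (the "conversation in an Incident" use case)."""
    resp = requests.post(
        f"{OLLAMA_URL}/api/chat",
        json={"model": model, "messages": messages, "stream": False},
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]


if __name__ == "__main__":
    pull_model("llama3")
    reply = chat(
        "llama3",
        [{"role": "user", "content": "Summarize this incident in one line."}],
    )
    print(reply)
```

Creating new models is handled similarly through Ollama's `/api/create` endpoint; its request format varies between Ollama versions, so consult the Ollama API documentation for the exact payload.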
PLATFORMS
Cortex XSOAR | Cortex XSIAM
INFO
Supported By | Community |
Created | May 8, 2024 |
Last Release | December 4, 2024 |