
[3.9] Optional Ollama integration #9

@mikejmorgan-ai

Description

Implement an optional Ollama runtime backend alongside llama.cpp: model management via Ollama (pull/store/list), a local OpenAI-compatible REST API, backend selection with fallback, Debian packaging, and security controls (localhost binding, sandboxing).
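The backend selection/fallback behavior described above could be sketched as follows. This is a minimal illustration, not the epic's actual implementation: it assumes Ollama's default localhost port 11434, and the function names (`ollama_available`, `select_backend`) are hypothetical.

```python
import socket

# Localhost-only binding, per the security controls in this epic.
OLLAMA_HOST = "127.0.0.1"
OLLAMA_PORT = 11434  # Ollama's default listening port


def ollama_available(host: str = OLLAMA_HOST, port: int = OLLAMA_PORT,
                     timeout: float = 0.25) -> bool:
    """Return True if something is accepting TCP connections on the Ollama port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


def select_backend(probe=ollama_available) -> str:
    """Prefer the Ollama backend when reachable; otherwise fall back to llama.cpp."""
    return "ollama" if probe() else "llama.cpp"
```

The probe is injected as a parameter so the fallback logic can be exercised without a running server, e.g. `select_backend(probe=lambda: False)` yields `"llama.cpp"`.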

Scope

This epic covers 12 decisions and 11 tasks from the Cortex Linux planning system.

Source

  • Planning Tool: Skilliks
  • Module: See internal planning documentation

Tasks

Tasks will be added as sub-issues or checklist items as the specification is refined.


Epic generated from Cortex Linux strategic planning

Metadata

Assignees

No one assigned

Labels

  • P2-medium (v1.0 features - medium priority)
  • epic (Epic: major feature area with subtasks)

Type

No type

Projects

No projects

Milestone

Relationships

None yet

Development

No branches or pull requests
