# openwebui.nvim

Chat with your Open WebUI LLM from neovim.
> [!NOTE]
> Not really proud of the quality, but it somewhat works.
## Installation

Using [lazy.nvim](https://github.com/folke/lazy.nvim):

```lua
{
  "Speiser/openwebui.nvim",
  dependencies = {
    "nvim-lua/plenary.nvim",
  },
  config = function()
    local openwebui = require("openwebui")
    openwebui.setup({
      api_endpoint = "https://localhost:8000/api/chat/completions",
      model = "your-model",
      api_key = "your-api-key",
    })
    vim.keymap.set("n", "<leader>ai", openwebui.start_chat, { noremap = true, silent = true })
  end,
}
```
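Hardcoding `api_key` in your dotfiles risks committing the secret to version control. A safer variant reads it from an environment variable instead; this is a minimal sketch, and `OPENWEBUI_API_KEY` is an assumed variable name, not something the plugin defines:

```lua
-- Same setup call, but the key is read from the environment.
-- OPENWEBUI_API_KEY is an assumed variable name (export it in your shell first).
local openwebui = require("openwebui")
openwebui.setup({
  api_endpoint = "https://localhost:8000/api/chat/completions",
  model = "your-model",
  -- os.getenv returns nil if the variable is unset
  api_key = os.getenv("OPENWEBUI_API_KEY"),
})
```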
