feat(coding-agent): add standalone streaming codegen CLI and docs #2
Title
feat(coding-agent): add standalone streaming codegen CLI and docs
Summary
Introduces a self-contained coding agent example that streams code generation from local Ollama models via LangChain. This lives outside the RAG pipeline and is runnable with a single command.
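The streaming loop at the heart of such a CLI can be sketched as follows. Here `fake_stream` is a hypothetical stand-in for `ChatOllama.stream()` so the sketch runs without an Ollama server; the real `run.py` wires the same print-as-you-go loop to the model:

```python
from typing import Iterator

def fake_stream(prompt: str) -> Iterator[str]:
    """Stand-in for ChatOllama.stream(); yields token-sized chunks."""
    for token in ["def ", "add(a, b):", "\n    ", "return a + b\n"]:
        yield token

def generate(prompt: str) -> str:
    """Print each chunk as it arrives and return the full completion."""
    parts = []
    for chunk in fake_stream(prompt):
        print(chunk, end="", flush=True)  # incremental output, no buffering
        parts.append(chunk)
    return "".join(parts)
```

Swapping `fake_stream` for a real model client keeps the CLI responsive: output appears token by token instead of after the whole completion finishes.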
Motivation
What’s included
- `coding_agent_example/run.py`: simple streaming code generator using `ChatOllama` and a minimal prompt.
- `coding_agent_example/start.sh`: one-liner wrapper with sensible defaults.
- `coding_agent_example/tools.py` and `coding_agent_example/agent.py`: scaffolding for future tool/agent work (not used by the simple CLI).
- `coding_agent_example/requirements.txt`: minimal deps for the example.
- `coding_agent_example/README.md`: quickstart, advanced usage, and next-steps plan.
How to run
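Invocations along these lines follow from the file list above; the exact flags and model name are illustrative assumptions, not confirmed interfaces:

```shell
# Install the example's dependencies
pip install -r coding_agent_example/requirements.txt

# One-liner wrapper with sensible defaults
./coding_agent_example/start.sh

# Direct invocation, overriding the default model (model name is an example)
python coding_agent_example/run.py --model qwen2.5-coder "write a quicksort in python"
```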
Notes
A default model is used when `--model` isn't provided.
Next steps (planned follow-up PRs)