A simple LangGraph project demonstrating basic workflow creation and state management with a chatbot example using an LMStudio local LLM. Package management uses UV, an extremely fast package manager for Python.
- Simple Chatbot Workflow: Interactive conversation with state management
- Demo Mode: Predefined conversation flow for testing
- State Management: Maintains conversation history using LangGraph's state system
- Error Handling: Graceful handling of API errors and missing configurations
- Python 3.12 or higher
- UV installed
- LMStudio installed and running
- qwen3-4b-2507 model loaded in LMStudio; if your computer has enough resources, you can also try gpt-oss-20b
- Clone the repository (if not already done):

  ```bash
  git clone <your-repo-url>
  cd hello-langgraph
  ```
- Install dependencies:
  - LangGraph CLI is a tool for building and running a LangGraph API server locally.
  - The official documentation installs `langgraph-cli` using `pip` or Homebrew; we use UV to install it here.
  - A UV tool is a self-contained executable that can be installed and run from the command line. We install the `langgraph-cli` tool using UV, then update the shell to make it available in the PATH.

  ```bash
  uv tool install langgraph-cli
  uv tool update-shell
  ```
- Set up LMStudio:
  - Download and install LMStudio
  - Load the qwen3-4b-2507 model in LMStudio
  - Start the local server (default port: 1234)
- [Optional] Register for a free LangSmith account to get an API key:
  - This step is optional but highly recommended if you want to learn more about LangGraph and LangSmith. LangSmith is a platform for tracing and debugging LLM workflows.
  - Follow this documentation to create an account and get an API key
- Set up environment variables:
  - Create a `.env` file in the root directory of the project (make sure you add it to `.gitignore`)
  - Add the following environment variables:

  ```bash
  # LMStudio Configuration
  LMSTUDIO_BASE_URL=http://localhost:1234/v1
  LMSTUDIO_MODEL=qwen3-4b-2507
  LMSTUDIO_API_KEY=lm-studio

  # LangSmith Configuration (optional - for tracing and monitoring)
  LANGCHAIN_TRACING_V2=true
  LANGCHAIN_API_KEY=<your-api-key>
  LANGCHAIN_PROJECT=hello-langgraph
  ```
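To make the configuration step concrete, here is a minimal stdlib-only sketch of how an application might read these variables with sensible fallbacks. The helper name `lmstudio_config` is hypothetical, not the exact code in `main.py`; the defaults mirror the values shown above.

```python
import os

# Hypothetical helper showing how main.py might read the LMStudio settings,
# falling back to the documented defaults when .env has not been loaded.
def lmstudio_config() -> dict:
    return {
        "base_url": os.getenv("LMSTUDIO_BASE_URL", "http://localhost:1234/v1"),
        "model": os.getenv("LMSTUDIO_MODEL", "qwen3-4b-2507"),
        "api_key": os.getenv("LMSTUDIO_API_KEY", "lm-studio"),
    }

cfg = lmstudio_config()
print(cfg["base_url"])  # http://localhost:1234/v1 unless overridden in .env
```

In the real project, `python-dotenv` loads `.env` into the environment before these lookups run.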
- Run the LangGraph app:

  ```bash
  uv run main.py
  ```

  This is a standalone command-line application. You'll be presented with three options:
  - Interactive Conversation: Chat with the bot in real-time; type `quit`, `exit`, `bye`, or `q` (case-insensitive) to exit
  - Demo Workflow: Run a predefined conversation sequence
  - Exit: Quit the application
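The case-insensitive quit check described above can be sketched as follows; the function and constant names are illustrative, not the exact code in `main.py`.

```python
# Illustrative quit-command check; main.py's exact implementation may differ.
QUIT_COMMANDS = {"quit", "exit", "bye", "q"}

def is_quit(user_input: str) -> bool:
    # Case-insensitive match, ignoring surrounding whitespace
    return user_input.strip().lower() in QUIT_COMMANDS

print(is_quit("  Bye "))  # True
```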
- Run the LangGraph API server and LangGraph Studio:

  If you are using the Chrome browser, run:

  ```bash
  $ langgraph dev

          Welcome to

  ╦  ┌─┐┌┐┌┌─┐╔═╗┬─┐┌─┐┌─┐┬ ┬
  ║  ├─┤││││ ┬║ ╦├┬┘├─┤├─┘├─┤
  ╩═╝┴ ┴┘└┘└─┘╚═╝┴└─┴ ┴┴  ┴ ┴

  - 🚀 API: http://127.0.0.1:2024
  - 🎨 Studio UI: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024
  - 📚 API Docs: http://127.0.0.1:2024/docs

  This in-memory server is designed for development and testing.
  For production use, please use LangGraph Platform.
  ```

  If you are using the Safari or Brave browser, they block plain HTTP traffic on localhost. You need to pass the `--tunnel` flag to use Cloudflare Tunnel to expose your local server:
  ```bash
  $ langgraph dev --tunnel

          Welcome to

  ╦  ┌─┐┌┐┌┌─┐╔═╗┬─┐┌─┐┌─┐┬ ┬
  ║  ├─┤││││ ┬║ ╦├┬┘├─┤├─┘├─┤
  ╩═╝┴ ┴┘└┘└─┘╚═╝┴└─┴ ┴┴  ┴ ┴

  - 🚀 API: https://accurately-hydrogen-adware-batman.trycloudflare.com
  - 🎨 Studio UI: https://smith.langchain.com/studio/?baseUrl=https://accurately-hydrogen-adware-batman.trycloudflare.com
  - 📚 API Docs: https://accurately-hydrogen-adware-batman.trycloudflare.com/docs
  ```
  - API is where you make API calls to the LangGraph application; e.g., we have included a `client.py` that shows how to interact through the API.
  - LangGraph Studio is the UI for interacting with the LangGraph application. For more details on how to use LangGraph Studio, see the LangGraph Studio Quickstart and the LangGraph Studio Documentation.
  - API Docs lists all the API endpoints and their parameters. You can test them directly through the API docs UI. For more details on how to use the API, see the LangGraph API Documentation.
After you are done, you can stop the LangGraph API server by pressing Ctrl+C in the terminal.
- Start the LangGraph API server, then run the client that sends requests through the API:

  You can use the LangGraph API directly to build your application; we include a `client.py` that shows how to interact through the API.

  First, start the LangGraph API server:

  ```bash
  $ langgraph dev
  ```

  Take note of the API URL; it will be something like http://localhost:2024. You need to pass it to the client.

  Then, in another terminal, start the client:

  ```bash
  $ uv run client.py
  ```

  Note: the client cannot connect to the LangGraph API server if you run the server in tunnel mode (`--tunnel`); the tunnel server name cannot be resolved and you will get an error: `Failed to connect to LangGraph API server: [Errno 8] nodename nor servname provided, or not known`. So start the LangGraph API server without tunnel mode.

  For more details on the LangGraph API, see the LangGraph Python SDK Documentation.
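To illustrate what such an API call carries, here is a stdlib-only sketch that builds the JSON body for a streaming run request. The assistant id `"agent"` and the exact field names are assumptions; check `client.py` and the API docs for the real contract.

```python
import json

# Hypothetical request body for a streaming run against the local LangGraph
# API server; the assistant id "agent" is an assumption -- match the graph
# name registered in langgraph.json.
def build_stream_payload(assistant_id: str, user_text: str) -> str:
    return json.dumps({
        "assistant_id": assistant_id,
        "input": {"messages": [{"role": "user", "content": user_text}]},
        "stream_mode": "updates",
    })

payload = build_stream_payload("agent", "Hello!")
```

In practice the LangGraph Python SDK (`langgraph_sdk`) constructs and streams these requests for you, so you rarely build the payload by hand.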
```
hello-langgraph/
├── main.py             # Main application with LangGraph workflow
├── client.py           # Client for interacting with LangGraph API server through API/SDK
├── pyproject.toml      # Project dependencies and configuration
├── .env                # Your environment variables (create this)
├── .gitignore          # Git ignore file
├── .python-version     # Python version file
├── langgraph.json      # LangGraph configuration
├── uv.lock             # UV lock file
├── README.md           # This file
└── LICENSE             # License file
```
The project demonstrates a simple LangGraph workflow with:

- State Definition: Uses `TypedDict` to define conversation state with message history
- Workflow Graph: Creates a simple linear flow: START → chatbot → END
- State Management: Automatically handles message accumulation using `add_messages`
- State Class: Defines the structure of data flowing through the workflow
- Chatbot Node: Processes messages and generates AI responses
- Workflow Creation: Builds and compiles the LangGraph workflow
- Interactive Interface: Provides user-friendly interaction modes
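The accumulation behavior of `add_messages` can be illustrated with a stdlib-only reducer. This is a simplification (LangGraph's real `add_messages` also handles message IDs and in-place updates); it is shown only to make the state-merging idea concrete.

```python
from typing import Annotated, TypedDict

# Simplified stand-in for LangGraph's add_messages reducer: each node's
# returned messages are appended to the existing history rather than
# replacing it. (The real reducer also de-duplicates by message ID.)
def add_messages(existing: list, new: list) -> list:
    return existing + new

class State(TypedDict):
    # Annotated tells LangGraph which reducer merges updates into this field
    messages: Annotated[list, add_messages]

history = add_messages([("user", "Hello")], [("assistant", "Hi there!")])
print(len(history))  # 2
```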
To extend the workflow, you can add new nodes:
```python
def new_node(state: State):
    # Your node logic here
    return {"messages": [AIMessage(content="Response from new node")]}

# Add to workflow
workflow.add_node("new_node", new_node)
workflow.add_edge("chatbot", "new_node")
workflow.add_edge("new_node", END)
```

Extend the state structure:
```python
class State(TypedDict):
    messages: Annotated[list[BaseMessage], add_messages]
    user_info: dict  # Add custom fields
    context: str
```

- LMStudio Connection Issues:
  - Ensure LMStudio is running and the server is started
  - Check that the correct model (qwen3-4b-2507) is loaded
  - Verify that the server URL in `.env` matches LMStudio's server address
- Import Errors:
  - Make sure all dependencies are installed: `pip install -e .`
  - Check Python version compatibility (3.12+)
- Model Loading Issues:
  - Ensure qwen3-4b-2507 is properly downloaded in LMStudio
  - Check LMStudio logs for any model loading errors
  - Verify sufficient system resources (RAM/VRAM) for the model
- Local LangGraph API Server Issues:
  - Safari and Brave browsers block plain HTTP traffic on localhost. When you run the LangGraph API server, use `langgraph dev --tunnel` instead of `langgraph dev`. For more details, see this troubleshooting guide and the langgraph-cli manual.
- `langgraph>=0.2.0`: Core workflow framework
- `langchain>=0.3.0`: LangChain integration
- `langchain-openai>=0.2.0`: OpenAI-compatible API integration (used for LMStudio)
- `python-dotenv>=1.0.0`: Environment variable management
- `requests>=2.31.0`: HTTP client for API calls
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
This project is licensed under the terms specified in the LICENSE file.
- LangChain Documentation
- LangGraph Documentation
- Quick start guide: run a LangGraph application locally
- LangGraph CLI reference
- LangGraph Server API Documentation
- LangGraph Python SDK Documentation
- Other LangGraph Platform Documentation
- new-langgraph-project-python template GitHub project; to use it, run:

  ```bash
  langgraph new <path_to_local_project> --template new-langgraph-project-python
  ```
- LMStudio Documentation
- Qwen Model Information
- UV package manager
- UV tools