docs: Completely restructure README for clarity and usability
Rewrote README from scratch to make installation and setup crystal clear
for users. The old README was confusing about which features to enable and
what was required for each provider type.
## Major Improvements
### 1. "Choose Your Setup" Decision Tree
Added a clear decision section at the top with 4 options:
- **Option 1: Local Setup** (ONNX + Ollama) - Free, private
- **Option 2: LM Studio** - Best Mac performance
- **Option 3: Cloud Providers** - Best quality
- **Option 4: Hybrid** - Mix and match
Each option clearly states:
- What it's best for
- Pros and cons
- Direct link to setup instructions
### 2. Step-by-Step Provider-Specific Instructions
Complete setup guides for each provider type:
**Local Setup (ONNX + Ollama)** (commands sketched after this list):
- Install Ollama
- Pull models
- Build with correct features
- Configure
- Run
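A minimal sketch of those steps, assuming a `codegraph` binary name and an illustrative Ollama model tag (the README itself has the exact commands):

```bash
# Illustrative local-setup flow; the model tag and binary path are placeholders.
curl -fsSL https://ollama.com/install.sh | sh          # install Ollama (Linux script; macOS uses the app installer)
ollama pull qwen2.5-coder:7b                           # pull a local model for Ollama to serve
cargo build --release --features "onnx,ollama,faiss"   # the "Local only" feature set shown later
./target/release/codegraph                             # run the binary after configuring
```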
**LM Studio Setup** (sketched below):
- Download LM Studio
- Download models in app
- Start server
- Build with correct features
- Configure
- Run
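Sketched below under the assumption that LM Studio exposes its usual OpenAI-compatible local server; the endpoint and binary name are illustrative:

```bash
# Illustrative LM Studio flow; install the app and download models through its UI first.
# The endpoint and binary path are assumptions, not quoted from the README.
cargo build --release --features "openai-compatible,faiss"   # the "LM Studio" feature set shown later
curl -s http://localhost:1234/v1/models                      # sanity-check that the local server is running
./target/release/codegraph                                   # point the config at the local endpoint, then run
```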
**Cloud Setup (Anthropic & OpenAI)** (sketched below):
- Get API keys
- Build with cloud features
- Use wizard or manual config
- Run
- Includes reasoning model configuration
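A rough sketch of the cloud path; the environment variable names here are conventional ones and may differ from what the README actually shows:

```bash
# Illustrative cloud flow; env var names and the binary path are assumptions.
export ANTHROPIC_API_KEY="sk-ant-..."    # key from the Anthropic console
export OPENAI_API_KEY="sk-..."           # key from the OpenAI dashboard
cargo build --release --features "anthropic,openai-llm,openai,faiss"   # the "Cloud" feature set shown later
./target/release/codegraph               # launch; configure via the wizard or ~/.codegraph/config.toml
```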
**Hybrid Setup** (example build below):
- Mix local and cloud
- Example configs
- Build command
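As a hedged example of the hybrid idea, local embeddings plus a cloud LLM could be combined in a single build by merging the relevant flags:

```bash
# Illustrative hybrid build: local ONNX embeddings and FAISS, cloud Anthropic LLM.
cargo build --release --features "onnx,faiss,anthropic"
```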
### 3. Clear Feature Flags Table
Added a comprehensive table showing:
- Feature flag name
- What it enables
- When to use it
Plus common build command examples:
```bash
# Local only
cargo build --release --features "onnx,ollama,faiss"
# LM Studio
cargo build --release --features "openai-compatible,faiss"
# Cloud
cargo build --release --features "anthropic,openai-llm,openai,faiss"
```
### 4. Improved Structure
**Before:** Confusing mix of information, hard to find what you need
**After:** Clear sections with table of contents
- Choose Your Setup (decision tree)
- Installation (step-by-step by provider)
- Configuration (with examples)
- Usage (basic commands)
- Feature Flags Reference (table)
- Performance (metrics)
- Troubleshooting (common issues)
- Advanced Features (moved to end)
### 5. Better Configuration Examples
- Removed confusing `.env` approach
- Use `~/.codegraph/config.toml` consistently
- Full working examples for each provider (sketch after this list)
- Reasoning model configuration included
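For illustration, a hypothetical `~/.codegraph/config.toml` might be created like this; the section and key names are assumptions, not the README's documented schema:

```bash
# Illustrative only: every config section and key name below is an assumption.
mkdir -p ~/.codegraph
cat > ~/.codegraph/config.toml <<'EOF'
[llm]
provider = "anthropic"        # or "ollama", "lmstudio", "openai"
model = "claude-sonnet-4"     # placeholder model id

[embedding]
provider = "onnx"             # local embeddings

[vector]
backend = "faiss"
EOF
```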
### 6. Troubleshooting Section
Added common issues and solutions (quick checks sketched below):
- Build issues (missing FAISS, feature flags)
- Runtime issues (API keys, models, connections)
- Where to get help
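A few illustrative quick checks covering those categories; the package manager, port, and key name are assumptions about a typical setup:

```bash
# Illustrative quick checks; paths, ports, and key names are assumptions.

# Build issues: is FAISS installed, and were the right feature flags used?
brew list faiss || brew install faiss
cargo build --release --features "onnx,ollama,faiss"

# Runtime issues: is the local model server reachable, and is a key configured?
curl -s http://localhost:11434/api/tags          # Ollama's default local endpoint
grep -n "api_key" ~/.codegraph/config.toml
```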
### 7. Improved Readability
- Used emojis for visual scanning
- Clear headers with consistent formatting
- Code blocks with proper syntax highlighting
- Checkmarks (✅) to mark completion
- Pros/cons for each option
- "Best for" callouts
## What Was Removed
- Confusing `.env` variable approach
- Scattered information
- Unclear prerequisites
- Mixed local/cloud instructions
- Hard-to-find feature flag info
## What Was Added
- Decision tree at the top
- Complete step-by-step guides
- Feature flags table
- Troubleshooting section
- Clear pros/cons for each option
- Common build commands
- Reasoning model examples
## User Experience Improvements
**Before:**
- User reads the entire README and is still confused
- Unclear which features to enable
- Doesn't know whether they need cloud or local
- Can't find a complete setup for their use case
**After:**
- User picks their setup in 30 seconds
- Jumps to the relevant section
- Follows the step-by-step instructions
- Has a working system
## File Structure
```
README.md (completely rewritten)
├── Overview
├── Table of Contents
├── Choose Your Setup (NEW)
│ ├── Option 1: Local
│ ├── Option 2: LM Studio
│ ├── Option 3: Cloud
│ └── Option 4: Hybrid
├── Installation (REORGANIZED)
│ ├── Prerequisites
│ ├── Local Setup
│ ├── LM Studio Setup
│ ├── Cloud Setup
│ └── Hybrid Setup
├── Configuration
├── Usage
├── Feature Flags Reference (NEW TABLE)
├── Performance
├── Troubleshooting (NEW)
├── Advanced Features
└── Learn More
```
This restructuring makes CodeGraph accessible to users of all levels while
maintaining depth for advanced users.