A sophisticated monitoring system for tracking Christmas tree water levels with unnecessary precision.
This project monitors and visualizes the water level of a Christmas tree through three main components:
- Data Sleigh - Unified MQTT logger, YoLink sensor integration, and S3 uploader with season-aware behavior
- Infrastructure - AWS CDK scripts that provision S3 bucket and IAM credentials
- Static Site - Vite-powered visualization dashboard served via GitHub Pages
```
  IoT Sensors (ESP8266, YoLink, etc.)
                   │ (MQTT messages)
                   ▼
┌─────────────────────────────────────┐
│             Data Sleigh             │
│  ┌─────────────┐  ┌──────────────┐  │
│  │ Local MQTT  │  │ YoLink Cloud │  │
│  └──────┬──────┘  └──────┬───────┘  │
│         └────────┬───────┘          │
│                  ▼                  │
│         DuckDB (normalized)         │
│                  │                  │
│           [Season Check]            │
│          ┌───────┴───────┐          │
│      IN-SEASON      OFF-SEASON      │
│          │               │          │
│      Aggregate        Monthly       │
│      & Upload         Backup        │
└──────────┬───────────────┬──────────┘
           │               │
       S3 (JSON)     S3 (backups/)
           │
           ▼
 GitHub Pages (static site)
           │
           ▼
 User's Browser (Chart.js)
```
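The season check that drives the in-season/off-season split can be sketched as a simple date-range test. This is an illustrative sketch, not Data Sleigh's actual code: the function names are assumptions, and the dates are taken from the sample payload later in this README.

```python
from datetime import date

# Dates match the sample payload in this README; names are illustrative.
SEASON_START = date(2025, 12, 1)
SEASON_END = date(2026, 1, 15)

def is_in_season(today: date) -> bool:
    """True while the tree is up and live uploads should run."""
    return SEASON_START <= today <= SEASON_END

def mode(today: date) -> str:
    # In-season: aggregate and upload JSON; off-season: monthly backup.
    return "aggregate-and-upload" if is_in_season(today) else "monthly-backup"
```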
- Python 3.11-3.13 with uv installed
- Node.js 18+ with npm (for the CDK CLI via `npx` and Vite)
- Docker
- AWS CLI configured with appropriate permissions
- MQTT broker (e.g., Mosquitto) for IoT sensor data
Data Sleigh is the unified data collection and upload service. See data_sleigh/README.md for complete documentation.
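Data Sleigh's storage step flattens nested sensor payloads into typed columns. As a rough illustration of that idea, here is a sketch using sqlite3 as a portable stand-in for DuckDB; the payload shape, table, and column names are assumptions, not Data Sleigh's actual schema.

```python
import json
import sqlite3

# sqlite3 stands in for DuckDB here; the schema below is hypothetical.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE readings (
        device_id      TEXT,
        event          TEXT,
        ts             INTEGER,
        water_level_mm REAL
    )
""")

def ingest(raw: bytes) -> None:
    """Flatten one nested MQTT payload into a normalized row."""
    msg = json.loads(raw)
    con.execute(
        "INSERT INTO readings VALUES (?, ?, ?, ?)",
        (msg["deviceId"], msg["event"], msg["time"],
         msg["data"]["waterLevel"]),
    )

ingest(b'{"deviceId": "d0", "event": "Report", "time": 1733400000,'
       b' "data": {"waterLevel": 71.25}}')
```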
Quick Start with Docker:
```sh
cd data_sleigh

# Copy and edit the example configuration
cp run-docker-data-sleigh.example.sh run-docker-data-sleigh.sh
# Edit run-docker-data-sleigh.sh with your settings

# Build and run
docker build -t data-sleigh .
./run-docker-data-sleigh.sh
```

Key Features:
- Dual MQTT support (local broker + YoLink cloud)
- Season-aware behavior (automatic in-season/off-season modes)
- Efficient DuckDB storage with normalized YoLink schema
- Gzip-compressed JSON uploads to S3
- Automatic monthly backups during off-season
- Email alerts for disk space monitoring
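The gzip-compressed upload can be illustrated with the standard library alone. This sketch shows only the compression step, not Data Sleigh's actual code or the S3 call itself; the function name is illustrative.

```python
import gzip
import json

def compress_payload(payload: dict) -> bytes:
    """Serialize and gzip a payload before it is PUT to S3.

    When the object is stored with ContentEncoding: gzip, browsers and
    HTTP clients can decompress it transparently on download.
    """
    return gzip.compress(json.dumps(payload).encode("utf-8"))

sample = {"generated_at": "2025-12-05T12:00:00Z", "measurements": []}
body = compress_payload(sample)
# Round-trip check: decompressing restores the original JSON.
assert json.loads(gzip.decompress(body)) == sample
```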
Deploy the CDK stack to create S3 bucket and IAM credentials:
```sh
cd infrastructure
uv sync
npx aws-cdk bootstrap   # First time only
npx aws-cdk deploy
```

After deployment, the stack outputs will include the IAM credentials needed for Data Sleigh.
```sh
cd site
npm install
npm run dev     # Development server
npm run build   # Production build to ../docs/
```

The static site is built to the docs/ directory and served via GitHub Pages:
- Push changes to GitHub
- Go to repository Settings → Pages
- Set source to "Deploy from a branch"
- Select `main` branch and `/docs` folder
- Site will be available at: https://treelemetry.tomlee.space
The JSON file uploaded to S3 includes:
- Season info (start date, end date, active status)
- Raw measurements (last 10 minutes)
- Aggregated data (1m, 5m, 1h intervals)
- Consumption analysis (detected segments, slopes, predictions)
- Statistics (min, max, avg, stddev)
All data is gzip-compressed for efficient transfer.
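The time-to-threshold prediction in the consumption analysis can be derived by simple linear extrapolation. This is a sketch under the assumption of a constant consumption slope; the function name is illustrative.

```python
def time_to_threshold_hours(level_mm: float, slope_mm_per_hr: float,
                            threshold_mm: float = 50.0) -> float:
    """Hours until the water level drops to threshold_mm, assuming the
    current consumption rate (slope) stays constant."""
    if slope_mm_per_hr <= 0:
        raise ValueError("expected a positive consumption rate")
    return (level_mm - threshold_mm) / slope_mm_per_hr

# With the sample payload's 2.5 mm/hr slope, a level of 71.25 mm gives
# the 8.5-hour figure shown in current_prediction.
assert time_to_threshold_hours(71.25, 2.5) == 8.5
```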
Sample structure:

```json
{
  "generated_at": "2025-12-05T12:00:00Z",
  "season": {
    "start": "2025-12-01",
    "end": "2026-01-15",
    "is_active": true
  },
  "measurements": [...],
  "agg_1m": { "data": [...] },
  "agg_5m": { "data": [...] },
  "agg_1h": { "data": [...] },
  "analysis": {
    "segments": [...],
    "current_prediction": {
      "slope_mm_per_hr": 2.5,
      "time_to_50mm_hours": 8.5
    }
  }
}
```

Project structure:

```
treelemetry/
├── data_sleigh/          # Unified MQTT logger + S3 uploader
│   ├── Dockerfile
│   ├── main.py
│   ├── config/           # Configuration templates
│   ├── src/              # Implementation
│   ├── tests/            # Test suite
│   └── tools/            # CLI utilities
├── infrastructure/       # AWS CDK for S3 & IAM
│   ├── app.py
│   ├── cdk.json
│   └── infrastructure/
├── site/                 # Vite static site
│   ├── index.html
│   ├── package.json
│   └── src/
├── docs/                 # Built site (GitHub Pages)
├── mqtt_logger/          # Legacy (replaced by data_sleigh)
└── uploader/             # Legacy (replaced by data_sleigh)
```
See PROJECT_STRUCTURE.md for complete details.
```sh
# Data Sleigh tests
cd data_sleigh
uv sync --all-extras
uv run pytest

# With coverage
uv run pytest --cov=src --cov-report=html
```

Makefile targets:

```sh
make help           # Show available targets
make install        # Install all dependencies
make build-docker   # Build Docker images
make test           # Run tests
make dev-site       # Start site dev server
make status         # Check component status
```

Data Sleigh consolidates the previous mqtt_logger and uploader components:
Benefits:
- Eliminates DuckDB locking issues (single process)
- Simplified deployment (one container vs. two)
- Season-aware behavior built-in
- Improved YoLink schema (normalized columns)
- Monthly backups during off-season
The legacy mqtt_logger/ and uploader/ directories are preserved for reference but are no longer actively maintained.
MIT
