Code for training and deploying a LightGBM model on the Titanic dataset, exposed as a FastAPI service with a Celery background worker.
```
.
├── .gitignore
├── create_model.ipynb    # Jupyter notebook for training the LightGBM model
├── docker-compose.yml    # Docker Compose configuration for deploying the API and worker
├── Dockerfile            # Dockerfile for building the API container
├── main.py               # Entry point for the FastAPI application
├── model_singleton.py    # Singleton class for loading and using the trained model
├── poetry.lock
├── pyproject.toml        # Poetry configuration file for project dependencies
├── README.md
├── schema.py             # Defines the data schema for API requests
├── tasks.py              # Celery tasks for background processing
├── data/                 # Folder containing Titanic dataset files
│   ├── gender_submission.csv
│   ├── test.csv
│   ├── train.csv
│   └── train_processed.csv
└── models/
    ├── category_mapping.pkl  # Pickle file for categorical mappings
    └── model_v0.txt          # Trained LightGBM model file
```
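The singleton in `model_singleton.py` ensures the model is loaded from disk only once and then shared. A minimal sketch of that pattern is below; the file paths follow the tree above, but the class and method names are illustrative assumptions, not the repository's actual code:

```python
# Illustrative sketch of the singleton pattern model_singleton.py implies.
# Paths come from the project tree; class/method names are assumptions.
import pickle

import lightgbm as lgb


class ModelSingleton:
    _instance = None

    def __new__(cls):
        # Load the booster and category mapping only on first access.
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            cls._instance.booster = lgb.Booster(model_file="models/model_v0.txt")
            with open("models/category_mapping.pkl", "rb") as f:
                cls._instance.category_mapping = pickle.load(f)
        return cls._instance

    def predict(self, features):
        # `features` is a 2D array-like of encoded feature rows.
        return self.booster.predict(features)
```

Because `__new__` caches the instance, every API request handler and Celery task shares one loaded model instead of re-reading the model file.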
The dataset comes from the Kaggle Titanic competition: https://www.kaggle.com/competitions/titanic/overview
Install Poetry and the project dependencies:

```bash
pip install poetry
poetry install
```

Then run the `create_model.ipynb` notebook to train the LightGBM model.
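The notebook defines the real training pipeline; the following is only a minimal sketch of the core step, assuming the standard Titanic columns. The exact preprocessing and hyperparameters in `create_model.ipynb` may differ:

```python
# Minimal sketch of the training step; create_model.ipynb's actual
# preprocessing and hyperparameters may differ.
import pickle

import lightgbm as lgb
import pandas as pd

df = pd.read_csv("data/train.csv")

# Encode the categorical columns and remember the mapping for inference.
category_mapping = {}
for col in ["Sex", "Embarked"]:
    df[col] = df[col].astype("category")
    category_mapping[col] = dict(enumerate(df[col].cat.categories))
    df[col] = df[col].cat.codes

features = ["Pclass", "Sex", "Age", "SibSp", "Parch", "Fare", "Embarked"]
train_set = lgb.Dataset(df[features], label=df["Survived"])

params = {"objective": "binary", "metric": "binary_logloss"}
booster = lgb.train(params, train_set, num_boost_round=100)

# Persist the artifacts at the paths shown in the tree above.
booster.save_model("models/model_v0.txt")
with open("models/category_mapping.pkl", "wb") as f:
    pickle.dump(category_mapping, f)
```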
Install Docker and Docker Compose, then build and start the services:

```bash
docker-compose up -d --build
```

Open http://localhost:8000/docs in a browser and test the API through the Swagger UI.
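You can also call the API directly. In the sketch below, the `/predict` path and the payload fields are assumptions; the actual request schema is defined in `schema.py`:

```python
# Example request against the running API. The /predict path and payload
# fields are assumptions; see schema.py for the real request schema.
import requests

payload = {
    "Pclass": 3,
    "Sex": "male",
    "Age": 22.0,
    "SibSp": 1,
    "Parch": 0,
    "Fare": 7.25,
    "Embarked": "S",
}

response = requests.post("http://localhost:8000/predict", json=payload)
print(response.status_code, response.json())
```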
When you are done, stop the services:

```bash
docker-compose down
```