# RedisAI
RedisAI is a Redis module for executing Deep Learning/Machine Learning models and managing their data. Its purpose is to be a "workhorse" for model serving by providing out-of-the-box support for popular DL/ML frameworks and unparalleled performance. **RedisAI both maximizes computation throughput and reduces latency by adhering to the principle of data locality**, and it simplifies the deployment and serving of graphs by leveraging Redis' production-proven infrastructure.
To read RedisAI docs, visit [redisai.io](https://oss.redis.com/redisai/). To see RedisAI in action, visit the [demos page](https://oss.redis.com/redisai/examples/).
# Quickstart
RedisAI is a Redis module. To run it you'll need a Redis server (v6.0.0 or greater), the module's shared library, and its dependencies.
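
If you already have Redis installed, you can check that its version is recent enough with:

```sh
redis-server --version
# Example output (any version >= 6.0.0 will do):
# Redis server v=6.2.5 sha=00000000:0 malloc=jemalloc-5.2.1 bits=64 build=c3504d808f2b2793
```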
The following sections describe how to get started with RedisAI.
## Docker
The quickest way to try RedisAI is by launching its official Docker container images.
### On a CPU only machine
```
docker run -p 6379:6379 redislabs/redisai:latest-cpu-x64-bionic
```
### On a GPU machine
For GPU support, you will need a machine with the Nvidia driver (CUDA 11.2 and cuDNN 8.1), nvidia-container-toolkit and Docker 19.03+ installed. For detailed information, check out the [nvidia-docker documentation](https://github.com/NVIDIA/nvidia-docker).
```
docker run -p 6379:6379 --gpus all -it --rm redislabs/redisai:latest-gpu-x64-bionic
```
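
Once the container is running, you can verify from another terminal that the module is loaded. This is a minimal sketch, assuming `redis-cli` is installed on the host and the container listens on the default port:

```sh
# RedisAI should appear in the list of loaded modules
redis-cli -p 6379 MODULE LIST
```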
## Building
You can compile and build the module from its source code. The [Developer](https://oss.redis.com/redisai/developer/) page has more information about the design and implementation of the RedisAI module and how to contribute.
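
To get the source code, clone the project's repository. This is a sketch assuming the default GitHub URL; `--recursive` pulls in the git submodules the build expects, and [git-lfs](https://git-lfs.github.com) should be installed before cloning if you want to run the examples:

```sh
git clone --recursive https://github.com/RedisAI/RedisAI
cd RedisAI
```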

Use the following script to download and build the libraries of the various RedisAI backends (TensorFlow, PyTorch, ONNXRuntime) for your platform:

```sh
bash get_deps.sh
```

Alternatively, you can run the following to fetch the backends with GPU support (note that this requires CUDA to be installed):

```sh
bash get_deps.sh gpu
```
### Building the Module
Once the dependencies have been built, you can build the RedisAI module with:
```sh
make -C opt clean ALL=1
make -C opt
```
Alternatively, run the following to build RedisAI with GPU support:
```sh
make -C opt clean ALL=1
make -C opt GPU=1
```
### Backend Dependency
RedisAI currently supports PyTorch (libtorch), TensorFlow (libtensorflow), TensorFlow Lite, and ONNXRuntime as backends. This section shows the version map between RedisAI and the supported backends. This is extremely important, since the serialization mechanism of one version might not match that of another. To make sure your model will work with a given RedisAI version, check the backend's documentation for incompatible features between the version of your backend and the version RedisAI is built with.
Note: Keras and TensorFlow 2.x are supported through graph freezing. See [this script](https://github.com/RedisAI/RedisAI/blob/master/tests/flow/test_data/tf2-minimal.py) to see how to export a frozen graph from Keras and TensorFlow 2.x.
## Loading the Module
To load the module upon starting the Redis server, simply use the `--loadmodule` command line switch, the `loadmodule` configuration directive or the [Redis `MODULE LOAD` command](https://redis.io/commands/module-load) with the path to the module's library.
For example, to load the module from the project's path with a server command line switch, use the following:
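
This is a minimal sketch that assumes a CPU build whose artifacts were installed under `install-cpu/` in the project's directory; adjust the path to wherever your build placed `redisai.so`:

```sh
redis-server --loadmodule ./install-cpu/redisai.so
```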
Once loaded, you can interact with RedisAI using redis-cli. Basic information and examples for using the module are described [here](https://oss.redis.com/redisai/intro/#getting-started).
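
For instance, a quick smoke test from `redis-cli` (a sketch assuming the server runs on the default port) stores a small tensor and reads it back:

```sh
redis-cli
# inside the redis-cli prompt:
127.0.0.1:6379> AI.TENSORSET my_tensor FLOAT 2 2 VALUES 1 2 3 4
127.0.0.1:6379> AI.TENSORGET my_tensor VALUES
```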
### Client libraries
Some languages already have client libraries that provide support for RedisAI's commands. The following table lists the known ones:
| Project | Language | License | Author | URL |
| ------- | -------- | ------- | ------ | --- |
The full documentation for RedisAI's API can be found at the [Commands page](commands.md).
## Documentation
Read the docs at [redisai.io](https://oss.redis.com/redisai/).
## Contact Us
If you have questions, want to provide feedback, or would like to report an issue or [contribute some code](contrib.md), here's where we're listening to you:
* [RedisAI Forum](https://forum.redislabs.com/c/modules/redisai)