# RedisAI
[Forum](https://forum.redislabs.com/c/modules/redisai) [Discord](https://discord.gg/rTQm7UZ)
RedisAI is a Redis module for executing Deep Learning/Machine Learning models and managing their data. It is designed to be a "workhorse" for model serving, providing out-of-the-box support for popular DL/ML frameworks and unparalleled performance. **RedisAI both maximizes computation throughput and reduces latency by adhering to the principle of data locality**, and it simplifies the deployment and serving of graphs by leveraging Redis' production-proven infrastructure.

RedisAI is a joint effort between [Redis](https://www.redis.com) and [Tensorwerk](https://tensorwerk.com).

## Where Next?
* The [Introduction](https://oss.redis.com/redisai/intro/) is the recommended starting point
* The [Quickstart](https://oss.redis.com/redisai/quickstart/) page provides information about building, installing and running RedisAI
* The [Commands](https://oss.redis.com/redisai/commands/) page is a reference of the RedisAI API
* The [Clients](https://oss.redis.com/redisai/clients/) page lists RedisAI clients by programming language
* The [Configuration](https://oss.redis.com/redisai/configuration/) page explains how to configure RedisAI
* The [Performance](https://oss.redis.com/redisai/performance/) page provides instructions for running benchmarks with RedisAI
* The [Developer](https://oss.redis.com/redisai/developer/) page has more information about the design and implementation of the RedisAI module

## Quick Links
* [Source code repository](https://github.com/RedisAI/RedisAI)
* [Releases](https://github.com/RedisAI/RedisAI/releases)
* [Docker image](https://hub.docker.com/r/redislabs/redisai/)

## Contact Us
If you have questions, want to provide feedback, or wish to report an issue or [contribute some code](contrib.md), here's where we're listening:
* [Forum](https://forum.redis.com/c/modules/redisai)
* [Repository](https://github.com/RedisAI/RedisAI/issues)

## Quickstart
1. [Docker](#docker)
2. [Build](#building)

## Docker

To quickly try out RedisAI, launch an instance using docker:

```sh
docker run -p 6379:6379 -it --rm redislabs/redisai:edge-cpu-xenial
```

For a docker instance with GPU support, launch one of the GPU-enabled images:

```sh
docker run -p 6379:6379 --gpus all -it --rm redislabs/redisai:edge-gpu-xenial
```

If you'd like to build the docker image yourself, you need a machine with the Nvidia driver (CUDA 10.0), nvidia-container-toolkit and Docker 19.03+ installed. For detailed information, check out the [nvidia-docker documentation](https://github.com/NVIDIA/nvidia-docker).

```sh
docker build -f Dockerfile-gpu -t redisai-gpu .
docker run -p 6379:6379 --gpus all -it --rm redisai-gpu
```

Note that the Redis config is located at `/usr/local/etc/redis/redis.conf`, which can be overridden with a volume mount.

### Give it a try

On the client, set the model:
```sh
redis-cli -x AI.MODELSTORE foo TF CPU INPUTS 2 a b OUTPUTS 1 c BLOB < tests/test_data/graph.pb
```

Then create the input tensors, run the model and get the output tensor (see `load_model.sh`). Note the signatures:
* `AI.TENSORSET tensor_key data_type dim1..dimN [BLOB data | VALUES val1..valN]`
* `AI.MODELEXECUTE model_key INPUTS input_count input_key1 ... OUTPUTS output_count output_key1 ...`
```sh
redis-cli
> AI.TENSORSET bar FLOAT 2 VALUES 2 3
> AI.TENSORSET baz FLOAT 2 VALUES 2 3
> AI.MODELEXECUTE foo INPUTS 2 bar baz OUTPUTS 1 jez
> AI.TENSORGET jez META VALUES
1) dtype
2) FLOAT
3) shape
4) 1) (integer) 2
5) values
6) 1) "4"
   2) "9"
```
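
The session above is easy to sanity-check by hand: the example graph (`tests/test_data/graph.pb`) multiplies its two input tensors elementwise, which is consistent with inputs `[2, 3]` and `[2, 3]` yielding `[4, 9]`. A minimal sketch in plain Python, with no Redis required (the helper name is ours, not part of any RedisAI API):

```python
# Mirror the computation performed by the example graph:
# elementwise multiplication of the two input tensors.
def elementwise_mul(a, b):
    return [x * y for x, y in zip(a, b)]

bar = [2.0, 3.0]
baz = [2.0, 3.0]
jez = elementwise_mul(bar, baz)
print(jez)  # [4.0, 9.0] -- the values returned by AI.TENSORGET above
```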

## Building

You should obtain the module's source code and its submodules using git:

```sh
git clone --recursive https://github.com/RedisAI/RedisAI
```

If you want to run the examples, make sure you have [git-lfs](https://git-lfs.github.com) installed when you clone.

Next, download the libraries for the backends (TensorFlow, PyTorch, ONNXRuntime) for your platform. Note that fetching the GPU backends requires CUDA 10.0 to be installed.

```sh
bash get_deps.sh
```

Alternatively, run the following to fetch only the CPU backends, even on GPU machines:

```sh
bash get_deps.sh cpu
```

Once the dependencies are downloaded, build the module itself. Note that CMake 3.0 or higher is required:

```sh
ALL=1 make -C opt clean build
```

Note: in order to use the PyTorch backend on Linux, at least `gcc 4.9.2` is required.

### Running the server

You will need a redis-server version 6.0 or greater. This should be available in most recent distributions:

```sh
redis-server --version
Redis server v=6.2.5 sha=00000000:0 malloc=jemalloc-5.2.1 bits=64 build=c3504d808f2b2793
```

To start Redis with the RedisAI module loaded:

```sh
redis-server --loadmodule install-cpu/redisai.so
```
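
The version requirement can also be checked programmatically by parsing the output of `redis-server --version`; a small illustrative sketch (the parsing helper is ours, not part of Redis or RedisAI):

```python
import re

def redis_version(version_output):
    """Extract (major, minor, patch) from `redis-server --version` output."""
    match = re.search(r"v=(\d+)\.(\d+)\.(\d+)", version_output)
    return tuple(int(part) for part in match.groups())

# The sample output shown above.
output = ("Redis server v=6.2.5 sha=00000000:0 malloc=jemalloc-5.2.1 "
          "bits=64 build=c3504d808f2b2793")
version = redis_version(output)
assert version >= (6, 0, 0), "RedisAI requires redis-server 6.0 or greater"
print(version)  # (6, 2, 5)
```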

## Client libraries

Some languages have client libraries that provide support for RedisAI's commands:

| Project | Language | License | Author | URL |
| ------- | -------- | ------- | ------ | --- |
| JRedisAI | Java | BSD-3 | [RedisLabs](https://redislabs.com/) | [GitHub](https://github.com/RedisAI/JRedisAI) |
| redisai-py | Python | BSD-3 | [RedisLabs](https://redislabs.com/) | [GitHub](https://github.com/RedisAI/redisai-py) |
| redisai-go | Go | BSD-3 | [RedisLabs](https://redislabs.com/) | [GitHub](https://github.com/RedisAI/redisai-go) |
| redisai-js | Typescript/Javascript | BSD-3 | [RedisLabs](https://redislabs.com/) | [GitHub](https://github.com/RedisAI/redisai-js) |
| redis-modules-sdk | TypeScript | BSD-3-Clause | [Dani Tseitlin](https://github.com/danitseitlin) | [GitHub](https://github.com/danitseitlin/redis-modules-sdk) |
| redis-modules-java | Java | Apache-2.0 | [dengliming](https://github.com/dengliming) | [GitHub](https://github.com/dengliming/redis-modules-java) |
| smartredis | C++ | BSD-2-Clause | [Cray Labs](https://github.com/CrayLabs) | [GitHub](https://github.com/CrayLabs/SmartRedis) |
| smartredis | C | BSD-2-Clause | [Cray Labs](https://github.com/CrayLabs) | [GitHub](https://github.com/CrayLabs/SmartRedis) |
| smartredis | Fortran | BSD-2-Clause | [Cray Labs](https://github.com/CrayLabs) | [GitHub](https://github.com/CrayLabs/SmartRedis) |
| smartredis | Python | BSD-2-Clause | [Cray Labs](https://github.com/CrayLabs) | [GitHub](https://github.com/CrayLabs/SmartRedis) |

## Backend Dependency

RedisAI currently supports PyTorch (libtorch), TensorFlow (libtensorflow), TensorFlow Lite, and ONNXRuntime as backends. This section shows the version map between RedisAI and the supported backends. This is extremely important, since the serialization mechanism of one version might not match that of another. To make sure your model will work with a given RedisAI version, check the backend's documentation for incompatible features between the version your model was exported with and the version RedisAI is built against.

| RedisAI | PyTorch | TensorFlow | TFLite | ONNXRuntime |
|:--------|:-------:|:----------:|:------:|:-----------:|
| 0.1.0 | 1.0.1 | 1.12.0 | None | None |
| 0.2.1 | 1.0.1 | 1.12.0 | None | None |
| 0.3.1 | 1.1.0 | 1.12.0 | None | 0.4.0 |
| 0.4.0 | 1.2.0 | 1.14.0 | None | 0.5.0 |
| 0.9.0 | 1.3.1 | 1.14.0 | 2.0.0 | 1.0.0 |
| 1.0.0 | 1.5.0 | 1.15.0 | 2.0.0 | 1.2.0 |
| master | 1.7.0 | 1.15.0 | 2.0.0 | 1.2.0 |

Note: Keras and TensorFlow 2.x are supported through graph freezing. See [this script](https://github.com/RedisAI/RedisAI/blob/master/tests/test_data/tf2-minimal.py) for how to export a frozen graph from Keras and TensorFlow 2.x. Note that a frozen graph will be executed using the TensorFlow 1.15 backend. Should any 2.x ops not be supported by the 1.15 backend after freezing, please open an issue.
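
The version map above lends itself to a quick programmatic lookup, e.g. when validating a deployment. A sketch with the values copied from the table (the dictionary and helper are illustrative, not an official RedisAI API):

```python
# Backend versions bundled with each RedisAI release, per the table above.
# None means the backend is not available in that release.
BACKEND_VERSIONS = {
    "0.1.0":  {"pytorch": "1.0.1", "tensorflow": "1.12.0", "tflite": None,    "onnxruntime": None},
    "0.2.1":  {"pytorch": "1.0.1", "tensorflow": "1.12.0", "tflite": None,    "onnxruntime": None},
    "0.3.1":  {"pytorch": "1.1.0", "tensorflow": "1.12.0", "tflite": None,    "onnxruntime": "0.4.0"},
    "0.4.0":  {"pytorch": "1.2.0", "tensorflow": "1.14.0", "tflite": None,    "onnxruntime": "0.5.0"},
    "0.9.0":  {"pytorch": "1.3.1", "tensorflow": "1.14.0", "tflite": "2.0.0", "onnxruntime": "1.0.0"},
    "1.0.0":  {"pytorch": "1.5.0", "tensorflow": "1.15.0", "tflite": "2.0.0", "onnxruntime": "1.2.0"},
    "master": {"pytorch": "1.7.0", "tensorflow": "1.15.0", "tflite": "2.0.0", "onnxruntime": "1.2.0"},
}

def backend_version(redisai_version, backend):
    """Return the backend version bundled with a RedisAI release, or None if unsupported."""
    return BACKEND_VERSIONS[redisai_version][backend]

print(backend_version("1.0.0", "tensorflow"))  # 1.15.0
```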

## Documentation

Read the docs at [redisai.io](http://redisai.io). Check out our [showcase repo](https://github.com/RedisAI/redisai-examples) for examples written using the different client libraries.

## License
RedisAI is licensed under the [Redis Source Available License Agreement](https://github.com/RedisAI/RedisAI/blob/master/LICENSE).

Copyright 2020, [Redis Labs, Inc](https://redislabs.com)