> **Redis-inference-optimization is no longer actively maintained or supported.**
>
> We are grateful to the redis-inference-optimization community for their interest and support.
>
> Previously, redis-inference-optimization was named RedisAI, but was renamed in Jan 2025 to reduce confusion around Redis' other AI offerings. To learn more about Redis' current AI offerings, visit [the Redis website](https://redis.io/redis-for-ai).
# Redis-inference-optimization
Redis-inference-optimization is a Redis module for executing Deep Learning/Machine Learning models and managing their data. Its purpose is to be a "workhorse" for model serving, providing out-of-the-box support for popular DL/ML frameworks and unparalleled performance. **Redis-inference-optimization both maximizes computation throughput and reduces latency by adhering to the principle of data locality**, and it simplifies the deployment and serving of graphs by leveraging Redis' production-proven infrastructure.
# Quickstart
Redis-inference-optimization is a Redis module. To run it you'll need a Redis server (v6.0.0 or greater), the module's shared library, and its dependencies.
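For example, once you have the module's shared library, it can be loaded when starting the server. A minimal sketch (the library name and path are illustrative and depend on where you built or installed the module):

```
# Start Redis and load the module (adjust the path to your build)
redis-server --loadmodule /path/to/redisai.so
```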
The following sections describe how to get started with redis-inference-optimization.
## Docker
The quickest way to try redis-inference-optimization is by launching its official Docker container images.
### On a CPU-only machine
```
docker run -p 6379:6379 redislabs/redisai:1.2.7-cpu-bionic
```
### On a GPU machine
For GPU support you will need a machine that has an Nvidia driver (CUDA 11.3 and cuDNN 8.1), nvidia-container-toolkit, and Docker 19.03+ installed. For detailed information, check out the [nvidia-docker documentation](https://github.com/NVIDIA/nvidia-docker).
```
docker run -p 6379:6379 --gpus all -it --rm redislabs/redisai:1.2.7-gpu-bionic
```
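With either container running, you can confirm that the module was loaded by querying the server. A minimal check, assuming the default port mapping above (the module is expected to register under the name `ai`):

```
# List loaded modules; the module should appear under the name "ai"
redis-cli -h localhost -p 6379 MODULE LIST
```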
### Backend Dependencies
Redis-inference-optimization currently supports PyTorch (libtorch), TensorFlow (libtensorflow), TensorFlow Lite, and ONNXRuntime as backends. This section shows the version map between redis-inference-optimization and supported backends. This is extremely important, since the serialization mechanism of one version might not match that of another. To make sure your model will work with a given redis-inference-optimization version, check the backend documentation for incompatible features between the version of your backend and the version redis-inference-optimization is built with.
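Each backend is shipped as a separate shared library. As a hedged sketch, a backend that was not loaded at startup can be loaded at runtime with `AI.CONFIG LOADBACKEND` (the library path below is illustrative and depends on your build layout):

```
# Load the PyTorch backend at runtime (adjust the path to your installation)
redis-cli AI.CONFIG LOADBACKEND TORCH /path/to/redisai_torch/redisai_torch.so
```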
Once loaded, you can interact with redis-inference-optimization using redis-cli.
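For example, you can store a tensor and read it back directly from the command line (a minimal sketch; `mytensor` is an arbitrary key name):

```
# Store a 2x2 FLOAT tensor under an arbitrary key, then read its values back
redis-cli AI.TENSORSET mytensor FLOAT 2 2 VALUES 1 2 3 4
redis-cli AI.TENSORGET mytensor VALUES
```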
### Client libraries
Some languages already have client libraries that provide support for redis-inference-optimization's commands. The following table lists the known ones:
## License
Redis-inference-optimization is licensed under your choice of the Redis Source Available License 2.0 (RSALv2) or the Server Side Public License v1 (SSPLv1).