> **Redis-inference-optimization is no longer actively maintained or supported.**
>
> We are grateful to the redis-inference-optimization community for their interest and support.
>
> Previously, redis-inference-optimization was named RedisAI, but was renamed in Jan 2025 to reduce confusion around Redis' other AI offerings. To learn more about Redis' current AI offerings, visit [the Redis website](https://redis.io/redis-for-ai).
# Redis-inference-optimization
Redis-inference-optimization is a Redis module for executing Deep Learning/Machine Learning models and managing their data. Its purpose is to serve as a "workhorse" for model serving by providing out-of-the-box support for popular DL/ML frameworks and unparalleled performance. **Redis-inference-optimization maximizes computation throughput and reduces latency by adhering to the principle of data locality**, and it simplifies the deployment and serving of graphs by leveraging Redis' production-proven infrastructure.
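
To make the data-locality point concrete, here is a sketch of the typical serving flow using the module's `AI.*` commands; the key names and the serialized model blob below are placeholders:

```sh
# Sketch of the serving flow: the model and its input/output tensors all live
# inside Redis, so inference runs next to the data instead of shipping tensors
# to a separate service. Key names and the model blob are placeholders.
redis-cli AI.MODELSTORE mymodel TF CPU INPUTS 1 input OUTPUTS 1 output BLOB "<serialized-graph-bytes>"
redis-cli AI.TENSORSET mymodel:in FLOAT 1 2 VALUES 2 3
redis-cli AI.MODELEXECUTE mymodel INPUTS 1 mymodel:in OUTPUTS 1 mymodel:out
redis-cli AI.TENSORGET mymodel:out VALUES
```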
# Quickstart
Redis-inference-optimization is a Redis module. To run it you'll need a Redis server (v6.0.0 or greater), the module's shared library, and its dependencies.
The following sections describe how to get started with redis-inference-optimization.
## Docker
The quickest way to try redis-inference-optimization is by launching its official Docker container images.
### On a CPU-only machine
```
docker run -p 6379:6379 redislabs/redisai:1.2.7-cpu-bionic
```

### On a GPU machine

```
docker run -p 6379:6379 --gpus all -it --rm redislabs/redisai:1.2.7-gpu-bionic
```
## Building
You can compile and build the module from its source code.
Use the following script to download and build the libraries of the various redis-inference-optimization backends (TensorFlow, PyTorch, ONNXRuntime) for CPU only:
```sh
bash get_deps.sh
```

Alternatively, run the following to also fetch the backend libraries with GPU support:

```sh
bash get_deps.sh gpu
```

### Building the Module
Once the dependencies have been built, you can build the redis-inference-optimization module with:
```sh
make -C opt clean ALL=1
make -C opt
```
Alternatively, run the following to build redis-inference-optimization with GPU support:
```sh
make -C opt clean ALL=1
make -C opt GPU=1
```

### Backend Dependency
Redis-inference-optimization currently supports PyTorch (libtorch), TensorFlow (libtensorflow), TensorFlow Lite, and ONNXRuntime as backends. This section shows the version map between redis-inference-optimization and the supported backends. This is extremely important, since the serialization mechanism of one version might not match another. To make sure your model will work with a given redis-inference-optimization version, check the backend documentation for incompatible features between the version of your backend and the version redis-inference-optimization is built with.
Note: Keras and TensorFlow 2.x are supported through graph freezing.
## Loading the Module
To load the module upon starting the Redis server, simply use the `--loadmodule` command line switch, the `loadmodule` configuration directive, or the [Redis `MODULE LOAD` command](https://redis.io/commands/module-load) with the path to the module's library.
For example, to load the module from the project's path with a server command line switch, use the following:
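
This assumes a CPU build that placed the shared library at `install-cpu/redisai.so`; adjust the path to match your build output:

```sh
# Start Redis with the module loaded at server startup.
redis-server --loadmodule ./install-cpu/redisai.so
```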
Once loaded, you can interact with redis-inference-optimization using redis-cli.
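
As a quick sanity check, the illustrative session below stores a tensor and reads it back; the key name and values are placeholders:

```sh
$ redis-cli
127.0.0.1:6379> AI.TENSORSET t FLOAT 2 VALUES 2 3
OK
127.0.0.1:6379> AI.TENSORGET t VALUES
1) "2"
2) "3"
```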
### Client libraries
Some languages already have client libraries that provide support for redis-inference-optimization's commands. The following table lists the known ones:
## License

Redis-inference-optimization is licensed under your choice of the Redis Source Available License 2.0 (RSALv2) or the Server Side Public License v1 (SSPLv1).