
Commit 75775e0

Reverted docker links to RedisAI. Removed external broken links.
1 parent: 1e95348

File tree: 1 file changed (+11, -29 lines)


README.md

+11 -29
@@ -1,37 +1,29 @@
-[![GitHub issues](https://img.shields.io/github/release/redis-inference-optimization/redis-inference-optimization.svg?sort=semver)](https://github.com/redis-inference-optimization/redis-inference-optimization/releases/latest)
-[![CircleCI](https://circleci.com/gh/redis-inference-optimization/redis-inference-optimization/tree/master.svg?style=svg)](https://circleci.com/gh/redis-inference-optimization/redis-inference-optimization/tree/master)
-[![Dockerhub](https://img.shields.io/badge/dockerhub-redislabs%2Fredis-inference-optimization-blue)](https://hub.docker.com/r/redislabs/redis-inference-optimization/tags/)
-[![codecov](https://codecov.io/gh/redis-inference-optimization/redis-inference-optimization/branch/master/graph/badge.svg)](https://codecov.io/gh/redis-inference-optimization/redis-inference-optimization)
-[![Total alerts](https://img.shields.io/lgtm/alerts/g/redis-inference-optimization/redis-inference-optimization.svg?logo=lgtm&logoWidth=18)](https://lgtm.com/projects/g/redis-inference-optimization/redis-inference-optimization/alerts/)
-[![Forum](https://img.shields.io/badge/Forum-redis-inference-optimization-blue)](https://forum.redislabs.com/c/modules/redis-inference-optimization)
-[![Discord](https://img.shields.io/discord/697882427875393627?style=flat-square)](https://discord.gg/rTQm7UZ)
-
 > [!CAUTION]
 > **Redis-inference-optimization is no longer actively maintained or supported.**
 >
 > We are grateful to the redis-inference-optimization community for their interest and support.
-> Previously, redis-inference-optimization was named RedisAI, but was renamed in Jan 2025 to reduce confusion around Redis' other AI offerings.
+> Previously, redis-inference-optimization was named RedisAI, but was renamed in Jan 2025 to reduce confusion around Redis' other AI offerings. To learn more about Redis' current AI offerings, visit [the Redis website](https://redis.io/redis-for-ai).
 
 # Redis-inference-optimization
 Redis-inference-optimization is a Redis module for executing Deep Learning/Machine Learning models and managing their data. Its purpose is being a "workhorse" for model serving, by providing out-of-the-box support for popular DL/ML frameworks and unparalleled performance. **Redis-inference-optimization both maximizes computation throughput and reduces latency by adhering to the principle of data locality**, as well as simplifies the deployment and serving of graphs by leveraging on Redis' production-proven infrastructure.
 
 # Quickstart
-redis-inference-optimization is a Redis module. To run it you'll need a Redis server (v6.0.0 or greater), the module's shared library, and its dependencies.
+Redis-inference-optimization is a Redis module. To run it you'll need a Redis server (v6.0.0 or greater), the module's shared library, and its dependencies.
 
 The following sections describe how to get started with redis-inference-optimization.
 
 ## Docker
 The quickest way to try redis-inference-optimization is by launching its official Docker container images.
 ### On a CPU only machine
 ```
-docker run -p 6379:6379 redislabs/redis-inference-optimization:1.2.7-cpu-bionic
+docker run -p 6379:6379 redislabs/redisai:1.2.7-cpu-bionic
 ```
 
 ### On a GPU machine
 For GPU support you will need a machine you'll need a machine that has Nvidia driver (CUDA 11.3 and cuDNN 8.1), nvidia-container-toolkit and Docker 19.03+ installed. For detailed information, checkout [nvidia-docker documentation](https://github.com/NVIDIA/nvidia-docker)
 
 ```
-docker run -p 6379:6379 --gpus all -it --rm redislabs/redis-inference-optimization:1.2.7-gpu-bionic
+docker run -p 6379:6379 --gpus all -it --rm redislabs/redisai:1.2.7-gpu-bionic
 ```
 
 
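As a quick check that the CPU image from the quickstart hunk above is actually serving the module, a session along the following lines should work. The exact module name reported by `MODULE LIST` depends on what the image ships, so treat this as a sketch rather than guaranteed output:

```
# Start the CPU image from the README quickstart (detached here for convenience)
docker run -d -p 6379:6379 redislabs/redisai:1.2.7-cpu-bionic

# Confirm the server answers and the inference module is loaded
redis-cli -p 6379 PING          # expected reply: PONG
redis-cli -p 6379 MODULE LIST   # should include the loaded inference/AI module
```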
@@ -87,7 +79,7 @@ make -C opt GPU=1
 
 ### Backend Dependancy
 
-redis-inference-optimization currently supports PyTorch (libtorch), Tensorflow (libtensorflow), TensorFlow Lite, and ONNXRuntime as backends. This section shows the version map between redis-inference-optimization and supported backends. This extremely important since the serialization mechanism of one version might not match with another. For making sure your model will work with a given redis-inference-optimization version, check with the backend documentation about incompatible features between the version of your backend and the version redis-inference-optimization is built with.
+Redis-inference-optimization currently supports PyTorch (libtorch), Tensorflow (libtensorflow), TensorFlow Lite, and ONNXRuntime as backends. This section shows the version map between redis-inference-optimization and supported backends. This extremely important since the serialization mechanism of one version might not match with another. For making sure your model will work with a given redis-inference-optimization version, check with the backend documentation about incompatible features between the version of your backend and the version redis-inference-optimization is built with.
 
 
 | redis-inference-optimization | PyTorch | TensorFlow | TFLite | ONNXRuntime |
@@ -109,33 +101,23 @@ redis-server --loadmodule ./install-cpu/redis-inference-optimization.so
 
 ### Give it a try
 
-Once loaded, you can interact with redis-inference-optimization using redis-cli. Basic information and examples for using the module is described [here](https://oss.redis.com/redis-inference-optimization/intro/#getting-started).
+Once loaded, you can interact with redis-inference-optimization using redis-cli.
 
 ### Client libraries
 Some languages already have client libraries that provide support for redis-inference-optimization's commands. The following table lists the known ones:
 
 | Project | Language | License | Author | URL |
 | ------- | -------- | ------- | ------ | --- |
-| Jredis-inference-optimization | Java | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/redis-inference-optimization/Jredis-inference-optimization) |
-| redis-inference-optimization-py | Python | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/redis-inference-optimization/redis-inference-optimization-py) |
-| redis-inference-optimization-go | Go | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/redis-inference-optimization/redis-inference-optimization-go) |
-| redis-inference-optimization-js | Typescript/Javascript | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/redis-inference-optimization/redis-inference-optimization-js) |
+| JredisAI | Java | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/redis-inference-optimization/JRedisAI) |
+| redisAI-py | Python | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/redisAI/redisAI-py) |
+| redisAI-go | Go | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/RedisAI/redisAI-go) |
+| redisAI-js | Typescript/Javascript | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/redisAI/redisAI-js) |
 | redis-modules-sdk | TypeScript | BSD-3-Clause | [Dani Tseitlin](https://github.com/danitseitlin) | [Github](https://github.com/danitseitlin/redis-modules-sdk) |
 | redis-modules-java | Java | Apache-2.0 | [dengliming](https://github.com/dengliming) | [Github](https://github.com/dengliming/redis-modules-java) |
 | smartredis | C++ | BSD-2-Clause | [Cray Labs](https://github.com/CrayLabs) | [Github](https://github.com/CrayLabs/SmartRedis) |
 | smartredis | C | BSD-2-Clause | [Cray Labs](https://github.com/CrayLabs) | [Github](https://github.com/CrayLabs/SmartRedis) |
 | smartredis | Fortran | BSD-2-Clause | [Cray Labs](https://github.com/CrayLabs) | [Github](https://github.com/CrayLabs/SmartRedis) |
 | smartredis | Python | BSD-2-Clause | [Cray Labs](https://github.com/CrayLabs) | [Github](https://github.com/CrayLabs/SmartRedis) |
 
-
-
-The full documentation for redis-inference-optimization's API can be found at the [Commands page](commands.md).
-
-## Contact Us
-If you have questions, want to provide feedback or perhaps report an issue or [contribute some code](contrib.md), here's where we're listening to you:
-
-* [Forum](https://forum.redis.com/c/modules/redis-inference-optimization)
-* [Repository](https://github.com/RedisAI/redis-inference-optimization/issues)
-
 ## License
-redis-inference-optimization is licensed under your choice of the Redis Source Available License 2.0 (RSALv2) or the Server Side Public License v1 (SSPLv1).
+Redis-inference-optimization is licensed under your choice of the Redis Source Available License 2.0 (RSALv2) or the Server Side Public License v1 (SSPLv1).
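For the "Give it a try" step touched in the last hunk, a minimal redis-cli session could look like the sketch below. The `AI.TENSORSET`/`AI.TENSORGET` syntax follows the old RedisAI command reference (whose link this commit removes), so it is an assumption here rather than something documented in the diff:

```
# Minimal interaction sketch, assuming the AI.* command set from the RedisAI-era docs
redis-cli -p 6379 AI.TENSORSET my_tensor FLOAT 2 2 VALUES 1 2 3 4   # store a 2x2 float tensor
redis-cli -p 6379 AI.TENSORGET my_tensor VALUES                     # read the values back
```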
