Hello,
Will it be possible to configure the model data type (the same way as the tensor dtype)?
Let's say I apply some quantization to my model: I would like to be able to use RedisAI and run inference on tensors in float16 or int8 (see the sketch below).
I think most backends already implement this mechanism, but I don't know whether it's something that would interest RedisAI?
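
To make the request concrete, here is a rough sketch of what I would like to be able to do, written against the redisai-py client from memory. The key names, the model file, and the exact keyword arguments are illustrative assumptions on my side, not a statement of the current API; the point is simply setting reduced-precision tensors and running a quantized model on them.

```python
# Sketch only: assumes a quantized ONNX model on disk and that RedisAI
# accepts reduced-precision inputs for it (which is what this issue asks).
import numpy as np
import redisai as rai

con = rai.Client(host="localhost", port=6379)

# Store an input tensor in a reduced-precision dtype (int8 here;
# float16 would be the other case I care about).
x = np.random.randint(-128, 127, size=(1, 3, 224, 224), dtype=np.int8)
con.tensorset("input", x)

# Load a model that was quantized offline (e.g. exported with int8 weights).
# "model_quantized.onnx" is a placeholder file name.
with open("model_quantized.onnx", "rb") as f:
    con.modelset("my_quantized_model", "ONNX", "CPU", f.read())

# Run inference on the int8 tensor and read the result back.
con.modelrun("my_quantized_model", inputs=["input"], outputs=["output"])
result = con.tensorget("output")
```

Even a plain redis-cli equivalent (e.g. AI.TENSORSET with a float16/int8 dtype feeding AI.MODELRUN) would cover my use case.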