Introduction to RedisAI: Serving TensorFlow, PyTorch, and ONNX Models with Redis



In this first video of a three-part series, we’ll explore RedisAI, a model serving and inference engine for Redis. We’ll explain common patterns for hosting AI models, how to serve models with RedisAI, and how RedisAI can take advantage of data locality to keep inference latency as low as possible.
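The serving flow discussed in the video can be sketched with RedisAI commands. This is a minimal illustration assuming the RedisAI 1.0 command set and a Redis instance with the module loaded; the key names, tensor shape, and model file (mymodel.pb) are hypothetical, and the lines beginning with # are explanatory comments rather than part of the command syntax:

```
# Store a 1x2 float tensor under the key "in"
AI.TENSORSET in FLOAT 1 2 VALUES 2.0 3.0

# Load a serialized TensorFlow graph; INPUTS/OUTPUTS name the graph nodes
AI.MODELSET mymodel TF CPU INPUTS a OUTPUTS c BLOB <contents of mymodel.pb>

# Run the model against the stored tensor; the result lands in the key "out"
AI.MODELRUN mymodel INPUTS in OUTPUTS out

# Read the output values back
AI.TENSORGET out VALUES
```

Because the input tensor, the model, and the output all live in Redis keys, inference runs right next to the data, which is the data-locality advantage covered in the video.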

Links:
Redis University → https://university.redislabs.com/
Redis Labs → https://www.redislabs.com/
RedisAI → https://redisai.io
Announcing RedisAI 1.0 → https://redislabs.com/blog/redisai-ai-serving-engine-for-real-time-applications/
Need a Redis cluster now? Sign up for a free Redis Cloud Essentials account → https://bit.ly/2wasiCa

Join our discord server → https://discord.gg/redis

Timeline:
00:00 Intro
00:43 Model serving patterns
00:52 1. Model in Application
01:29 2. Models in Containers
02:03 3. Models in a Model Server
2:20 Model serving with RedisAI
2:37 Tensor Definition
3:01 Model Definition
3:20 RedisAI Example
4:02 Data locality with RedisAI
5:10 Recap
