With the growing business demand for real-time predictions, companies are investing in modernizing their data architectures to support online inference. When companies need to deliver real-time ML applications that serve large volumes of online traffic, Redis is most often selected as the foundation for the online feature store because of its ability to deliver ultra-low latency with high throughput at scale. 2021 was a year of significant growth in customers building their online feature stores with Redis, and 2022 will see an increase in customers buying commercial off-the-shelf (COTS) feature store software to support low-latency, high-throughput online inference requirements.
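To make the architectural pattern concrete, here is a minimal sketch of how an online feature store on Redis typically works: an offline pipeline materializes the latest feature values into a hash keyed by entity ID, and the model service reads them back with a single lookup at inference time. The connection details, key, and feature names (driver:1001, trips_today, avg_rating) are hypothetical placeholders, not details from the talk.

```python
import redis

# Connect to the Redis online store (host/port are placeholders).
r = redis.Redis(host="localhost", port=6379, decode_responses=True)

# The offline pipeline materializes the latest feature values into a
# hash keyed by entity ID (feature names here are hypothetical).
r.hset("driver:1001", mapping={
    "trips_today": 14,
    "avg_rating": 4.8,
})

# At inference time, the serving layer fetches the full feature vector
# with a single O(1) hash read, keeping lookup latency very low.
features = r.hgetall("driver:1001")
print(features)  # {'trips_today': '14', 'avg_rating': '4.8'}
```

This read-path simplicity is why the pattern scales: online inference reduces to key-value lookups rather than joins or scans.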
In this talk, Redis will share key observations about the customers, architectural patterns, use cases, and industries adopting Redis as an online feature store. Along the way, Redis will also highlight its integrations with key partners in the feature store and MLOps ecosystem, including Feast, Microsoft Azure, and Tecton.
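As one example of these integrations, the sketch below shows what serving from a Redis-backed online store looks like with the Feast SDK. It assumes a Feast repo whose feature_store.yaml configures Redis as the online store; the feature view (driver_stats) and entity (driver_id) are hypothetical, not taken from the talk.

```python
from feast import FeatureStore

# Assumes the repo's feature_store.yaml points the online store at
# Redis, e.g.:
#   online_store:
#     type: redis
#     connection_string: "localhost:6379"
store = FeatureStore(repo_path=".")

# Fetch the latest feature values for one entity from the Redis-backed
# online store (feature view and entity names are hypothetical).
online_features = store.get_online_features(
    features=[
        "driver_stats:trips_today",
        "driver_stats:avg_rating",
    ],
    entity_rows=[{"driver_id": 1001}],
).to_dict()
print(online_features)
```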