ML Workflow Pipelines for IoT Applications on K8s at SmartDeployAI
SmartDeployAI builds data workflow pipelines for running large-scale Industrial IoT applications. Its software platform is a shared multi-tenant Kubernetes cluster in which multiple workflow pipelines can be bootstrapped and scheduled to run concurrently. Learn how IoT sensors and devices are provisioned on the platform: each provisioning step requires tracking markers in a metadata store, along with the parameters used to run the various pipeline models. This data must be persisted and kept available throughout the entire data workflow pipeline lifecycle. Learn how their journey led to ScyllaDB, and how they minimized latencies, maintained data storage isolation for each workflow pipeline in a shared Kubernetes cluster, bootstrapped pipeline artifacts and resources on demand, and reduced their resource consumption footprint.
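One way to picture the per-pipeline storage isolation described above is to give each workflow pipeline its own ScyllaDB keyspace, bootstrapped on demand, with a table for run markers and model parameters. The sketch below is a minimal illustration under that assumption; the names (`pipeline_keyspace`, the `markers` schema) are hypothetical and not SmartDeployAI's actual design.

```python
# Hypothetical sketch: isolate each workflow pipeline's metadata in its own
# keyspace on a shared ScyllaDB cluster. These helpers only build CQL strings;
# executing them against a live cluster (e.g. via the Python driver) is omitted.

def pipeline_keyspace(tenant: str, pipeline_id: str) -> str:
    """Derive an isolated keyspace name for one tenant's workflow pipeline."""
    safe = f"{tenant}_{pipeline_id}".replace("-", "_").lower()
    return f"wf_{safe}"

def create_keyspace_cql(keyspace: str, replication_factor: int = 3) -> str:
    """CQL to bootstrap the keyspace on demand when a pipeline is scheduled."""
    return (
        f"CREATE KEYSPACE IF NOT EXISTS {keyspace} "
        f"WITH replication = {{'class': 'NetworkTopologyStrategy', "
        f"'replication_factor': {replication_factor}}}"
    )

def create_markers_table_cql(keyspace: str) -> str:
    """CQL for a table holding per-run markers and model parameters."""
    return (
        f"CREATE TABLE IF NOT EXISTS {keyspace}.markers ("
        "run_id uuid, marker text, params map<text, text>, "
        "updated_at timestamp, PRIMARY KEY (run_id, marker))"
    )

ks = pipeline_keyspace("acme", "sensor-ingest-42")
print(ks)  # → wf_acme_sensor_ingest_42
print(create_keyspace_cql(ks))
print(create_markers_table_cql(ks))
```

Scoping a keyspace per pipeline keeps tenants' metadata apart without separate clusters, and dropping the keyspace when the pipeline is torn down reclaims its storage footprint.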