While training machine learning (ML) models is the subject of countless MOOCs and web tutorials, productionizing and operating ML models is usually left to the big commercial players or expert users. By leveraging the Jupyter environment and ecosystem, we describe a method to democratize the productionization of ML models while making the process transparent for the casual user.
ML models have become an essential tool for organizations across almost all industries, providing valuable insights and predictions based on data. However, deploying and maintaining ML models can be challenging due to the complex and often dynamic nature of the compute environments they require.
This talk will discuss the benefits of using containerization to manage the compute environments of ML models in JupyterLab and MLflow, and how it can help organizations make their ML operations more democratic, efficient, and reliable over the long term.
Containerization offers a powerful solution for managing the compute environments of ML models and, combined with tools such as Cookiecutter and MLflow, makes it easier for organizations to deploy and maintain their ML operations over time. By adopting containerization across model development, training, and deployment, and by integrating open-source tools and services, organizations can better manage and trust the ML models used in their business.
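To make the idea concrete, a model logged with MLflow can be packaged into a container image that pins its compute environment. The Dockerfile below is a minimal hypothetical sketch, not the talk's actual setup; the file names (`requirements.txt`, `model/`, `serve.py`) are assumptions for illustration:

```dockerfile
# Hypothetical sketch: containerize a trained model and its environment.
# Pin a base image so the runtime is reproducible.
FROM python:3.11-slim

WORKDIR /app

# Install the exact dependency versions the model was trained against.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the logged model artifacts and a small serving script (assumed names).
COPY model/ ./model/
COPY serve.py .

# Start the model server when the container runs.
CMD ["python", "serve.py"]
```

MLflow can also generate a comparable serving image directly from a logged model via its `mlflow models build-docker` command, which is often the simpler path when the model is already tracked in MLflow.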