
    Dockerizing Flask ML Applications

    Guide On Deploying ML Models With Flask and Containerizing Your Work

    Image from Unsplash by Jonas Smith

    Deploying ML models is an essential step in the ML Lifecycle that’s often overlooked by Data Scientists. Without model deployment/hosting, Machine Learning models cannot be used in real-world applications. There’s a variety of ways to host your ML models, and one of the simplest, yet most effective, is Flask.

    Flask is a micro-web framework that’s implemented in Python. Using Flask, we’ll walk through an example of how you can locally serve your ML model for inference. If you’re completely new to Flask I would recommend taking a look at this guide, which will get you up to speed.

    Flask is only the first part of the overarching problem we’re going to solve here. Yes, we can find a way to host our models using Flask, but how do we maintain that environment in a production setting? How do we track any changes that we might have to make to the software in a real-time application?

    Docker containers provide a lightweight wrapper that makes it extremely easy to reproduce your environment and dependencies. For example, if there’s a software update we need to install in our application, Docker makes it simple to implement and track this update. With a service such as a Docker Registry, you can also track different image versions for deployment. The software lifecycle becomes much simpler and more efficient when we use Docker containers to persist our environment and any changes we make. For a full breakdown of Docker please take a look at the following resource.

    In this article we’ll walk through these two steps with an ML model that we have trained.

    NOTE: This article assumes basic knowledge of Python, Docker, Flask, and the CLI. We will also be using some basic HTML/CSS, but you can copy the template as is. This article also uses an ML model for serving, but we will not cover any theory behind model building; for model-specific information, please reference this article.

    Dataset

    For our example we’ll be working with the Petrol Consumption regression dataset from Kaggle. The original data source is licensed here.

    Model Building

    For this article we’re not going to spend too much time on model building or selecting the most accurate model from a Data Science performance perspective. For our regression problem, I’m going to utilize a RandomForest Regression model via the Sklearn framework. Generally when I need quick access to working ML model examples, I try to maintain a model toolbox that will give me boilerplate code to speed up the process.

    Prepare the Data
    Train Random Forest Model
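    The data-prep and training steps might look like the sketch below. The CSV filename and column names are assumptions based on the Kaggle petrol consumption dataset, so adjust them to match your copy; a few placeholder rows are included so the snippet runs even without the CSV on disk.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Column names assumed from the Kaggle petrol consumption dataset
COLUMNS = ["Petrol_tax", "Average_income", "Paved_Highways",
           "Population_Driver_licence(%)", "Petrol_Consumption"]

try:
    # Adjust the path to wherever you downloaded the CSV
    df = pd.read_csv("petrol_consumption.csv")
except FileNotFoundError:
    # Tiny placeholder sample so the snippet runs without the CSV
    df = pd.DataFrame(
        [[9.0, 3571, 1976, 0.525, 541],
         [9.0, 4092, 1250, 0.572, 524],
         [9.0, 3865, 1586, 0.580, 561],
         [7.5, 4870, 2351, 0.529, 414],
         [8.0, 4399, 431, 0.544, 410],
         [10.0, 5342, 1333, 0.571, 457]],
        columns=COLUMNS,
    )

# Split into features and target
X = df.drop("Petrol_Consumption", axis=1)
y = df["Petrol_Consumption"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Train the Random Forest regressor
model = RandomForestRegressor(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
```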

    I serialize the model using the Joblib module, and to verify that it works I’ll quickly predict on a sample data point.

    Sample Inference
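    Serialization and the sanity-check prediction can be sketched as follows; the sample values are placeholders, and a tiny stand-in model is fitted here so the snippet runs on its own (in practice you would dump the Random Forest trained above).

```python
import joblib
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Stand-in for the Random Forest trained above, fitted on a few
# placeholder rows so this snippet is self-contained
X = np.array([[9.0, 3571, 1976, 0.525],
              [7.5, 4870, 2351, 0.529],
              [8.0, 4399, 431, 0.544]])
y = np.array([541, 414, 410])
model = RandomForestRegressor(n_estimators=10, random_state=42).fit(X, y)

# Serialize the model artifact (joblib also accepts a .joblib extension)
joblib.dump(model, "model.pkl")

# Verify the artifact by reloading it and predicting on one sample point
loaded_model = joblib.load("model.pkl")
sample = np.array([[9.0, 3571, 1976, 0.525]])
print(loaded_model.predict(sample))
```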

    Make sure to keep the “model.pkl” file that is generated (can also create model.joblib). This is the model artifact that we will load up for inference on our Flask server.

    Hosting Models On Flask

    There are going to be two parts to our Flask application. Firstly, we want to build a template for our front-end using some basic HTML/CSS. In this template all we’re doing is creating a simple form that takes in four inputs: Petrol Tax, Average Income, Paved Highways, and Population Driver License Percentage. These are the four parameters our model expects on the back-end, and we will capture them via this form.

    Form
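    A minimal sketch of the template is below. The field names (petrol_tax, average_income, paved_highways, license_pct) are assumptions; whatever you choose, they must match the keys the Flask route later reads from request.form.

```html
<!-- index.html: minimal form sketch; field names are assumptions -->
<!DOCTYPE html>
<html>
<head><title>Petrol Consumption Prediction</title></head>
<body>
  <h2>Petrol Consumption Prediction</h2>
  <form action="/" method="POST">
    <label>Petrol Tax</label>
    <input type="text" name="petrol_tax" required>
    <label>Average Income</label>
    <input type="text" name="average_income" required>
    <label>Paved Highways</label>
    <input type="text" name="paved_highways" required>
    <label>Population Driver License Percentage</label>
    <input type="text" name="license_pct" required>
    <button type="submit">Predict</button>
  </form>
</body>
</html>
```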

    Secondly, we can configure our Flask application to work with these form inputs. Let’s take a look at the necessary imports and load up the model; this is where our model.pkl file comes in handy.

    Load model
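    The imports and model loading might look like this sketch; it assumes the model.pkl artifact sits in the same directory as the app file.

```python
from flask import Flask, render_template, request
import joblib
import numpy as np

app = Flask(__name__)

# model.pkl is the artifact serialized during training; this assumes
# it sits in the same directory as this file
model = joblib.load("model.pkl")
```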

    Let’s test out a GET request with our Flask web server before we configure form handling. We render the index.html page that we’ve created (excuse my ugly HTML/CSS skills).

    Flask App Setup
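    A minimal GET-only version of the app might be sketched like this; it assumes index.html lives in a templates/ folder next to the app file, which is where Flask looks by default.

```python
from flask import Flask, render_template

app = Flask(__name__)

@app.route("/", methods=["GET"])
def index():
    # Render the form template created earlier (templates/index.html)
    return render_template("index.html")
```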

    If you run the following command in the same directory as your Flask app file, you can start up the Flask server.

    Start Server
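    Assuming the app lives in a file called app.py (a hypothetical name for this example), the server can be started like so:

```shell
# Tell Flask which file to serve, then start the development server
export FLASK_APP=app.py
flask run
```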
    Form Template (Screenshot by Author)

    Now we can process the form’s inputs and feed them to our model.

    Process Form Inputs
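    Putting the form handling together with the pieces above, a full app.py might be sketched as follows. The form field names are assumptions that must match the template’s inputs, and “res” is the variable the template will display.

```python
from flask import Flask, render_template, request
import joblib
import numpy as np

app = Flask(__name__)

# Artifact serialized during training, assumed to sit next to this file
model = joblib.load("model.pkl")

@app.route("/", methods=["GET", "POST"])
def index():
    res = None
    if request.method == "POST":
        # Field names are assumptions; they must match the template
        features = [
            float(request.form["petrol_tax"]),
            float(request.form["average_income"]),
            float(request.form["paved_highways"]),
            float(request.form["license_pct"]),
        ]
        # The model expects a 2D array: one row, four features
        res = round(float(model.predict(np.array([features]))[0]), 2)
    return render_template("index.html", res=res)

if __name__ == "__main__":
    # Bind to 0.0.0.0 so the server is reachable from outside a container
    app.run(host="0.0.0.0", port=5000)
```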

    On the HTML side, we want to reflect this output variable “res” that our model is returning, so we add that portion in our index.html code.

    Reflect Result
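    In index.html, a small Jinja block below the form can display the result; “res” here matches the variable name passed from the route above.

```html
<!-- Show the prediction only once the model has returned a result -->
{% if res %}
  <h3>Predicted Petrol Consumption: {{ res }}</h3>
{% endif %}
```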

    If we fill out our form now with some dummy values, we should be able to see the inference output on the page.

    Submit Form (Screenshot by Author)
    Inference Displayed (Screenshot by Author)

    Great! We have a working Flask application, now let’s see how we can properly containerize this.

    Dockerizing The Flask Application

    Before getting started, make sure to have Docker installed and up and running.

    There are two main Docker entities you need to know: Docker Image and Docker Container. A Docker Image is what contains our source code, dependencies, and environment. A Docker Container is an instance of the Docker Image. Using the Docker Image as a template we can run containers that will start up our application. To further understand the difference please reference this article.

    To build our Docker Image we need to provide a Dockerfile, which essentially contains setup instructions. With Dockerfiles we can use Base Images that come with pre-installed dependencies and code. We’ll grab the base Python image to get started.

    Base Image
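    The first Dockerfile line pulls the base image; the exact tag is an assumption here, so use whichever Python version you built the model with.

```Dockerfile
# Python base image; 3.9-slim assumed, match your local Python version
FROM python:3.9-slim
```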

    Next we use the WORKDIR command to set the working directory within the container. It’s not best practice to work out of the container’s root directory, as it can lead to filename collisions and similar minor issues.

    Working Directory
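    The directory name is arbitrary; /app is a common convention.

```Dockerfile
# Work out of /app instead of the container's root directory
WORKDIR /app
```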

    Now, to set up our environment, we need to provide the dependencies to install; we can do this via a requirements.txt file.

    Requirements for Application
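    A minimal requirements.txt for this app might look like the following. Versions are left unpinned in this sketch, but pinning exact versions is good practice for reproducible images.

```text
flask
scikit-learn
joblib
numpy
pandas
```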

    We first copy over this requirements file, then install the dependencies in our container environment.

    Install Requirements
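    Copying the requirements file on its own before the rest of the code lets Docker cache the install layer, so dependency installation only reruns when requirements.txt changes.

```Dockerfile
# Copy only the requirements first so Docker caches the install layer
COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt
```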

    Next we can copy over our remaining files to our working directory.

    Copy Rest Of files To Working Directory
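    This copies the app file, the model.pkl artifact, and the templates folder into /app.

```Dockerfile
# Copy the app file, model artifact, and templates into /app
COPY . .
```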

    Lastly, utilizing CMD we can provide a default command upon starting up our container.

    Start Flask Web Server In Container
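    A sketch of the default command is below, assuming the app file is named app.py and calls app.run(host="0.0.0.0", port=5000) so the server is reachable from outside the container.

```Dockerfile
# Default command when the container starts: launch the Flask server
CMD ["python", "app.py"]
```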

    Now that we have our Dockerfile, we can build our Docker Image by running the following command in the same path as your Dockerfile.

    Build Docker Image
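    The build command, run from the directory containing the Dockerfile:

```shell
# -t tags the image; the trailing dot is the build context (current dir)
docker build -t ramveg/ml-app .
```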

    This will create a Docker Image tagged as “ramveg/ml-app”; feel free to rename it as you wish. By default this is tagged as the “latest” version; if you would like to change that, you could use “ramveg/ml-app:dev” or something of that sort.

    Build Image

    We can see this image by running the following command.
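    Listing local images shows the one we just built:

```shell
# Lists local images with repository, tag, image ID, and size
docker images
```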

    Docker Image (Screenshot by Author)

    Optionally, you can also check the contents of your container with the following command.

    Check Docker containers
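    One way to inspect the image contents is to start a throwaway container with an interactive shell:

```shell
# --rm removes the container on exit; sh opens a shell inside it
docker run -it --rm ramveg/ml-app sh
```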

    This should start a shell within your container, where you can run normal Linux commands to get an idea of the directory structure. Here we see all the files that we copied over.

    Container Structure (Screenshot by Author)

    Now we can start up the container. The key portion here is that we need to publish port 5000, as that is the default port that Flask runs on.

    Start Flask Server On Container
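    The run command maps the container’s port 5000 to the same port on the host:

```shell
# -p host_port:container_port publishes the Flask port to localhost
docker run -p 5000:5000 ramveg/ml-app
```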

    If we go to “localhost:5000”, we should see the same Flask app that we had locally. Just to verify, we can also perform inference again.

    Flask Server (Screenshot by Author)
    Inference (Screenshot by Author)

    Additional Resources & Conclusion

    You can access the entire code for the example in the link above. I hope this article has given you a deeper understanding of deploying models in a production-type setting. For further content on ML model hosting, check out my MLFlow and SageMaker series. If you’re interested in learning more about containers, there’s a great course on Udemy by Stephen Grider that will help you start from the ground up.
