Working with MLflow models
This section describes how to upload a model packaged with MLflow to Fiddler, step by step.
You can follow the steps to install MLflow from their official documentation.
Train a model
The next step is to train a model with MLflow and make sure the model artifact is saved to the local mlruns directory. You save the model by calling log_model at the end of your training code; for example, if you're training a sklearn model, call mlflow.sklearn.log_model. The model artifact will be saved under the run's artifact directory.
Build a docker image
You can now use the saved artifact to build a docker image:

mlflow models build-docker -m runs:/<run-id>/model -n mlflow-pyfunc-servable

This creates a docker image named 'mlflow-pyfunc-servable'.
Tag this image so that it can be pushed to a Fiddler-accessible docker registry:

docker tag mlflow-pyfunc-servable:latest manojcheenath/fiddler_examples
docker push manojcheenath/fiddler_examples
Upload model to Fiddler
The model can now be uploaded to Fiddler by calling register_model:

deployment_options = DeploymentOptions(
    deployment_type='predictor',
    image_uri='manojcheenath/fiddler_examples:latest',
    port=8080,
)

result = fiddler_client.register_model(
    project_id,
    model_id,
    dataset_id,
    model_info,
    deployment_options,
)
You can find the complete tutorials here.