Airflow Integration

Apache Airflow is an open-source ETL and workflow management platform used to manage a company’s complex workflows. Companies are increasingly integrating their ML model pipelines into Airflow DAGs to manage and monitor all the components of their ML systems.

By integrating Fiddler into an existing Airflow DAG, you will be able to train, manage, and onboard your models while actively monitoring performance and data quality and troubleshooting degradations across your models.

Fiddler can be easily integrated into your existing Airflow DAG for your ML model pipeline. A notebook used for publishing events can be orchestrated to run as part of your Airflow DAG using a PapermillOperator.

Steps for the walkthrough

  1. Set up Airflow on your local machine or in Docker by following the standard Airflow setup steps.

  2. Add your Jupyter notebook containing the code for publishing to your Airflow home directory. In this example we will use two different notebooks:

    a. A notebook to onboard the ML model to the Fiddler platform

    b. A notebook to push production events to the Fiddler platform

  3. Add the orchestration code to your Airflow directory; Airflow will pick it up and construct the DAG as defined. The orchestration code contains the PapermillOperator, which orchestrates the Jupyter notebooks used to onboard models and publish events to Fiddler. Please refer to our orchestration code.

  4. The run interval can be set in the orchestration code via the schedule_interval argument of the DAG class (see the sketch after this list). This interval can be based on how frequently your ML model is trained and run for inference.

  5. Once the DAGs are set up, they can be monitored in the Airflow UI. Below, dummy DAGs have been set up with placeholder nodes for ‘data preparation ETL’ and ‘model training/inference’. We have two DAGs:

    a. To set up Fiddler model registration after preparing baseline data (training pipeline)

    b. To publish events to Fiddler after data preparation and ML model inference (inference pipeline)
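
As a reference for steps 3 and 4, here is a minimal sketch of what such an orchestration file could look like. It assumes the Airflow Papermill provider package is installed; the DAG id, notebook paths, schedule, and parameter names below are illustrative placeholders, not Fiddler’s published sample code.

from datetime import datetime

from airflow import DAG
from airflow.providers.papermill.operators.papermill import PapermillOperator

# Airflow scans the DAGs folder, picks up this file, and builds the DAG from it.
with DAG(
    dag_id="fiddler_inference_pipeline",   # placeholder name
    schedule_interval="@daily",            # align with your training/inference cadence
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Placeholder data preparation ETL / model inference tasks would precede this step.
    publish_events = PapermillOperator(
        task_id="publish_events_to_fiddler",
        input_nb="/opt/airflow/notebooks/publish_events_to_fiddler.ipynb",
        output_nb="/opt/airflow/notebooks/output/publish_events_{{ ds }}.ipynb",
        parameters={"run_date": "{{ ds }}"},   # rendered by Airflow templating
    )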

Label Update

An important business use case is integrating Fiddler’s ‘Label Update’ as a part of your ML workflow using Airflow. Label update can be used to update the ground truth feature in your data. This can be done using the publish_event API, passing the event and event_id parameters and setting the update_event parameter to True. The code to update labels can be found in the notebook. This notebook can be integrated to run as a part of your Airflow DAG using the sample code.
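
A minimal sketch of such a label update is shown below. It assumes the publish_event signature described above, as exposed by the Fiddler Python client; the URL, credentials, project, model, target column, and event id values are placeholders.

import fiddler as fdl

# Connect to your Fiddler environment (URL, org, and token are placeholders).
client = fdl.FiddlerApi(
    url="https://your-company.fiddler.ai",
    org_id="your_org",
    auth_token="YOUR_AUTH_TOKEN",
)

# Send only the ground-truth column for the event whose label is being updated.
client.publish_event(
    project_id="airflow_demo",
    model_id="churn_classifier",
    event={"target": 1},                       # updated ground-truth value
    event_id="previously_published_event_id",  # id used when the event was first published
    update_event=True,                         # update the existing event instead of creating a new one
)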

Papermill Operator

from airflow.providers.papermill.operators.papermill import PapermillOperator

# Executes the input notebook with Papermill and saves the executed copy to output_nb.
operator_var = PapermillOperator(
    task_id="task_name",
    input_nb="input_jupyter_notebook",
    output_nb="output_jupyter_notebook",
    parameters={"variable_1": "{{ value }}"},
)
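
Papermill injects the parameters dictionary into the notebook’s parameters cell at run time, which is how values such as a run date or data window can be passed from the DAG into the Fiddler onboarding and publishing notebooks. The executed copy of the notebook saved to output_nb can be kept for auditing each run.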

Airflow DAG

Below is an example of the Model Registration Airflow DAG run history:

Model Registration Airflow DAG flow
