Kafka Integration
The Fiddler Kafka connector is a service that connects to a Kafka topic containing production events for a model and publishes those events to Fiddler.
Prerequisites
We assume that you have a Fiddler account, have created a project, uploaded a dataset, and onboarded a model. You will need the url_id, org_id, project_id, and model_id to configure the Kafka connector.
Installation
The Kafka connector runs on Kubernetes within the customer’s environment. It is packaged as a Helm chart. To install:
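An installation might look like the following. The repository URL, chart name, release name, and value keys here are illustrative assumptions, not the chart's actual schema; substitute the values provided for your account:

```shell
# Hypothetical repository URL and chart name -- replace with the ones
# supplied by Fiddler for your deployment.
helm repo add fiddler https://example.com/fiddler-helm-charts
helm install kafka-connector fiddler/fiddler-kafka-connector \
  --namespace fiddler \
  --set fiddler.url=https://your-org.fiddler.ai \
  --set fiddler.org_id=your_org \
  --set fiddler.project_id=your_project \
  --set fiddler.model_id=your_model \
  --set kafka.bootstrapServers=kafka:9092 \
  --set kafka.topic=model-events
```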
This creates a deployment that reads events from the Kafka topic and publishes them to the configured model. The deployment can be scaled as needed. However, because consumer parallelism is bounded by the partition count, if the Kafka topic has only a single partition, adding replicas will not increase throughput.
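For example, a deployment reading from a topic with three partitions could be scaled to three replicas. The deployment name and namespace below are assumptions; use whatever names the chart created in your cluster:

```shell
# Scale the connector to match the topic's partition count.
# "kafka-connector" and "fiddler" are hypothetical names.
kubectl scale deployment kafka-connector --namespace fiddler --replicas=3
```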
Limitations
The connector assumes that there is a single dedicated topic containing production events for a given model. Multiple deployments can be created, one for each model, and scaled independently.
The connector assumes that events are published as JSON serialized dictionaries of key-value pairs. Support for other formats can be added on request. As an example, a Kafka message should look like the following:
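As a sketch of the expected format, the event below is a JSON-serialized dictionary of key-value pairs. The field names are hypothetical; in practice the keys should match the feature and output names of the model as onboarded in Fiddler:

```python
import json

# Hypothetical production event: keys should correspond to the model's
# features and outputs as registered with Fiddler.
event = {
    "age": 35,
    "income": 72000.0,
    "predicted_churn": 0.12,
    "timestamp": "2023-01-15T10:30:00Z",
}

# The Kafka message value is the JSON-serialized dictionary,
# encoded as UTF-8 bytes.
message_value = json.dumps(event).encode("utf-8")
print(message_value)
```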