Kafka Integration
The Fiddler Kafka connector is an optional Fiddler service that consumes production events for a model from a Kafka topic and publishes them to Fiddler.
Kafka Integration Prerequisites
We assume you have a Fiddler account and have already created a project and onboarded a model. You will need your Fiddler url_id, project_id, and model_id to configure the Kafka connector.
Installation
For Fiddler on-premise installations, the Kafka connector runs on Kubernetes within your own environment. It is packaged as a Helm chart for quick installation:
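A minimal installation might look like the sketch below. The repository URL, chart name, release name, and value keys are illustrative assumptions; consult the chart shipped with your Fiddler distribution for the exact names and settings.

```shell
# Add the chart repository (hypothetical URL) and refresh the index.
helm repo add fiddler https://charts.example.com/fiddler
helm repo update

# Install one release per model, passing the Fiddler and Kafka settings
# as values (key names here are illustrative).
helm install kafka-connector-my-model fiddler/kafka-connector \
  --set fiddler.url=https://your-org.fiddler.ai \
  --set fiddler.project_id=my_project \
  --set fiddler.model_id=my_model \
  --set kafka.bootstrapServers=kafka:9092 \
  --set kafka.topic=my-model-events
```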
This creates a deployment that reads event data from the Kafka topic and publishes it to the configured Fiddler model. The deployment can be scaled as needed; however, scaling only helps if the topic has multiple partitions, because Kafka assigns each partition to at most one consumer in a consumer group, so replicas beyond the partition count sit idle.
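For example, assuming the deployment is named kafka-connector-my-model and the topic my-model-events (both hypothetical names from an installation like the one above), scaling could be done with kubectl; check the partition count first so the replica count does not exceed it.

```shell
# Inspect the topic's partition count (hypothetical topic name).
kafka-topics.sh --describe --topic my-model-events \
  --bootstrap-server kafka:9092

# Scale the connector; replicas beyond the partition count remain idle.
kubectl scale deployment kafka-connector-my-model --replicas=3
```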
Limitations
The connector assumes a single dedicated topic containing production events for a given model. To publish events for multiple models, create one deployment per model; each can be scaled independently.
The connector assumes that events are published as JSON-serialized dictionaries of key-value pairs; support for other formats can be added on request. For example, a Kafka message should look like the following:
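A sketch of such a message is shown below; the field names and values are hypothetical, and in practice the keys should match the input and output columns defined in your model's schema on Fiddler.

```json
{
  "feature_1": 7.95,
  "feature_2": "approved",
  "output": 0.62,
  "timestamp": "2023-01-01T00:00:00Z"
}
```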