Naming conventions : In the client 2.x API, method parameters for unique identifiers used semantic names, such as project_id='my_project' and model_id='my_model'. Client 3.x exposes automatically generated unique identifiers for objects like projects and models, and this identifier is what every "id" or "_id" parameter expects. Your semantic names are carried in an object's "name" parameter, so you can still retrieve objects by name in addition to fetching them by "id".
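For example, a minimal sketch of the two retrieval paths (assuming a project named 'my_project' already exists and fdl.init() has been called; Project.get follows the same id_ pattern used by the examples below):
# Retrieve by the semantic name you chose
project = fdl.Project.from_name(name='my_project')
# Retrieve by the server-generated unique identifier
same_project = fdl.Project.get(id_=project.id)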
Flow Changes
Import
2.x : import fiddler as fdl
3.x : import fiddler as fdl
Initialization
2.x :
client = fdl.FiddlerApi(
    url=URL,
    org_id=ORG_ID,
    auth_token=AUTH_TOKEN
)
3.x :
fdl.init(
    url=URL,
    token=AUTH_TOKEN
)
org_id is no longer required and is inferred from the token.
Projects
Get Projects
2.x : projects = client.get_projects()
3.x : projects = fdl.Project.list()
Add Project
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
project = client.create_project(project_id=PROJECT_ID)
3.x :
PROJECT_NAME = 'YOUR_PROJECT_NAME'
project = fdl.Project(name=PROJECT_NAME)
project.create()
Delete Project
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
client.delete_project(project_id=PROJECT_ID)
3.x :
PROJECT_NAME = 'YOUR_PROJECT_NAME'
project = fdl.Project.from_name(name=PROJECT_NAME)
project.delete()
Models
Get Model
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
model = client.get_model(
    project_id=PROJECT_ID,
    model_id=MODEL_ID
)
3.x :
PROJECT_NAME = 'YOUR_PROJECT_NAME'
MODEL_NAME = 'YOUR_MODEL_NAME'
project = fdl.Project.from_name(name=PROJECT_NAME)
model = fdl.Model.from_name(name=MODEL_NAME, project_id=project.id)
List Models
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
models = client.list_models(project_id=PROJECT_ID)
3.x :
PROJECT_ID = '6dbf8656-1b6b-4f80-ba2b-b75739526dc2'
models = fdl.Model.list(project_id=PROJECT_ID)
Add Model
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
DATASET_ID = 'YOUR_DATASET_NAME'
client.add_model(
    project_id=PROJECT_ID,
    dataset_id=DATASET_ID,
    model_id=MODEL_ID,
    model_info=model_info
)
3.x :
DATASET_FILE_PATH = <path_to_file>
MODEL_NAME = 'YOUR_MODEL_NAME'
PROJECT_ID = '6dbf8656-1b6b-4f80-ba2b-b75739526dc2'
MODEL_SPEC = fdl.ModelSpec(
    inputs=['CreditScore', 'Geography', 'Gender', 'Age', 'Tenure', 'Balance'],
    outputs=['probability_churned'],
    targets=['Churned'],
    decisions=[],
    metadata=[],
    custom_features=[],
)
model = fdl.Model.from_data(
    source=DATASET_FILE_PATH,
    name=MODEL_NAME,
    project_id=PROJECT_ID,
    spec=MODEL_SPEC,
)
model.create()
Add Model Artifact
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
MODEL_DIR = <path_to_dir_containing_artifact>
DEPLOYMENT_PARAMS = {'deployment_type': 'MANUAL', 'cpu': 1000}
job_id = client.add_model_artifact(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    model_dir=MODEL_DIR,
    deployment_params=fdl.DeploymentParams(**DEPLOYMENT_PARAMS)
)
3.x :
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
DEPLOYMENT_PARAMS = {'deployment_type': 'MANUAL', 'cpu': 1000}
MODEL_DIR = <path_to_dir_containing_artifact>
model = fdl.Model.get(id_=MODEL_ID)
job = model.add_artifact(
    model_dir=MODEL_DIR,
    deployment_params=fdl.DeploymentParams(**DEPLOYMENT_PARAMS)
)
job.wait()
📘 Computation of Feature Importance
Feature importance and feature impact must be precomputed manually in 3.x. The blocks below show how to do this with client 3.x.
3.x : New steps
DATASET_ID = '5e1e67d2-5170-45ce-a851-68bdde1ac1ad'
importance_job = model.precompute_feature_importance(
    dataset_id=DATASET_ID
)
importance_job.wait()
impact_job = model.precompute_feature_impact(
    dataset_id=DATASET_ID
)
impact_job.wait()
Update Model
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
model = client.update_model(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
)
3.x :
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
model = fdl.Model.get(id_=MODEL_ID)
model.xai_params.default_explain_method = 'SHAP'
model.update()
Update Model Artifact
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
MODEL_DIR = <path_to_dir_containing_artifact>
DEPLOYMENT_PARAMS = {'deployment_type': 'MANUAL', 'cpu': 1000}
job_id = client.update_model_artifact(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    model_dir=MODEL_DIR,
    deployment_params=fdl.DeploymentParams(**DEPLOYMENT_PARAMS)
)
3.x :
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
DEPLOYMENT_PARAMS = {'deployment_type': 'MANUAL', 'cpu': 1000}
MODEL_DIR = <path_to_dir_containing_artifact>
model = fdl.Model.get(id_=MODEL_ID)
job = model.update_artifact(
    model_dir=MODEL_DIR,
    deployment_params=fdl.DeploymentParams(**DEPLOYMENT_PARAMS)
)
job.wait()
📘 Computation of Feature Importance
Feature importance and feature impact must be precomputed manually in 3.x. The blocks below show how to do this with client 3.x.
3.x : New steps
DATASET_ID = '5e1e67d2-5170-45ce-a851-68bdde1ac1ad'
importance_job = model.precompute_feature_importance(
    dataset_id=DATASET_ID
)
importance_job.wait()
impact_job = model.precompute_feature_impact(
    dataset_id=DATASET_ID
)
impact_job.wait()
Download Artifacts
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
OUTPUT_DIR = <path_to_dir_to_download_artifact>
client.download_artifacts(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    output_dir=OUTPUT_DIR
)
3.x :
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
OUTPUT_DIR = <path_to_dir_to_download_artifact>
model = fdl.Model.get(id_=MODEL_ID)
model.download_artifact(
    output_dir=OUTPUT_DIR
)
Get Model Deployment
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
model_deployment = client.get_model_deployment(
    project_id=PROJECT_ID,
    model_id=MODEL_ID
)
3.x :
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
# Using Model
model = fdl.Model.get(id_=MODEL_ID)
model_deployment = model.deployment
# Using ModelDeployment
model_deployment = fdl.ModelDeployment.of(model_id=MODEL_ID)
Add Model Surrogate
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
DEPLOYMENT_PARAMS = {'deployment_type': 'MANUAL', 'cpu': 1000}
job_id = client.add_model_surrogate(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    deployment_params=fdl.DeploymentParams(**DEPLOYMENT_PARAMS)
)
3.x :
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
DATASET_ID = '5e1e67d2-5170-45ce-a851-68bdde1ac1ad'
DEPLOYMENT_PARAMS = {'deployment_type': 'MANUAL', 'cpu': 1000}
model = fdl.Model.get(id_=MODEL_ID)
job = model.add_surrogate(
    dataset_id=DATASET_ID,
    deployment_params=fdl.DeploymentParams(**DEPLOYMENT_PARAMS)
)
job.wait()
Update Model Surrogate
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
DEPLOYMENT_PARAMS = {'deployment_type': 'MANUAL', 'cpu': 1000}
job_id = client.update_model_surrogate(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    deployment_params=fdl.DeploymentParams(**DEPLOYMENT_PARAMS)
)
3.x :
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
DATASET_ID = '5e1e67d2-5170-45ce-a851-68bdde1ac1ad'
DEPLOYMENT_PARAMS = {'deployment_type': 'MANUAL', 'cpu': 1000}
model = fdl.Model.get(id_=MODEL_ID)
job = model.update_surrogate(
    dataset_id=DATASET_ID,
    deployment_params=fdl.DeploymentParams(**DEPLOYMENT_PARAMS)
)
job.wait()
Update Model Deployment
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
model_deployment = client.update_model_deployment(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    cpu=1000
)
3.x :
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
model_deployment = fdl.ModelDeployment.of(
    model_id=MODEL_ID
)
model_deployment.cpu = 1000
model_deployment.update()
Delete Model
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
client.delete_model(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
)
3.x :
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
model = fdl.Model.get(id_=MODEL_ID)
job = model.delete()
job.wait()
Datasets
Get Dataset
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
DATASET_ID = 'YOUR_DATASET_NAME'
dataset = client.get_dataset(
    project_id=PROJECT_ID,
    dataset_id=DATASET_ID
)
3.x :
# From id
DATASET_ID = '5e1e67d2-5170-45ce-a851-68bdde1ac1ad'
dataset = fdl.Dataset.get(id_=DATASET_ID)
# From name
DATASET_NAME = 'test_dataset'
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
dataset = fdl.Dataset.from_name(
    name=DATASET_NAME,
    model_id=MODEL_ID
)
List Datasets
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
datasets = client.list_datasets(project_id=PROJECT_ID)
3.x :
# In 3.x, datasets are stored at the model level rather than the project level.
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
datasets = fdl.Dataset.list(model_id=MODEL_ID)
Upload Dataset
2.x :
baseline_df = pd.read_csv(PATH_TO_BASELINE_CSV)
dataset_info = fdl.DatasetInfo.from_dataframe(baseline_df)
PROJECT_ID = 'YOUR_PROJECT_NAME'
DATASET_ID = 'YOUR_DATASET_NAME'
client.upload_dataset(
    project_id=PROJECT_ID,
    dataset_id=DATASET_ID,
    dataset={
        'baseline': baseline_df
    },
    info=dataset_info
)
3.x :
# Client 3.x does not require a dataset to be added before model creation.
# However, you can optionally add a dataset to a model using the publish method as below.
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
DATASET_NAME = 'YOUR_DATASET_NAME'
DATASET_FILE_PATH = <path_to_dataset>
model = fdl.Model.get(id_=MODEL_ID)
job = model.publish(
    source=DATASET_FILE_PATH,
    environment=fdl.EnvType.PRE_PRODUCTION,
    dataset_name=DATASET_NAME,
)
job.wait()
Baselines
Get Baseline
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
BASELINE_ID = 'YOUR_BASELINE_NAME'
baseline = client.get_baseline(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    baseline_id=BASELINE_ID
)
3.x :
# From UUID
BASELINE_ID = '5e1e67d2-5170-45ce-a851-68bdde1ac1ad'
baseline = fdl.Baseline.get(id_=BASELINE_ID)
# From name
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
BASELINE_NAME = 'YOUR_BASELINE_NAME'
baseline = fdl.Baseline.from_name(
    name=BASELINE_NAME,
    model_id=MODEL_ID
)
Add Baseline
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
BASELINE_ID = 'YOUR_BASELINE_NAME'
DATASET_ID = 'YOUR_DATASET_NAME'
# Static baseline
baseline = client.add_baseline(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    baseline_id=BASELINE_ID,
    type=fdl.BaselineType.STATIC,
    dataset_id=DATASET_ID
)
# Rolling baseline
baseline = client.add_baseline(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    baseline_id=BASELINE_ID,
    type=fdl.BaselineType.ROLLING_PRODUCTION,
    offset=fdl.WindowSize.ONE_MONTH,  # How far back to set our window
    window_size=fdl.WindowSize.ONE_WEEK,  # Size of the sliding window
)
3.x :
BASELINE_NAME = 'YOUR_BASELINE_NAME'
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
DATASET_ID = '5e1e67d2-5170-45ce-a851-68bdde1ac1ad'
# Static baseline
baseline = fdl.Baseline(
    name=BASELINE_NAME,
    model_id=MODEL_ID,
    environment=fdl.EnvType.PRE_PRODUCTION,
    type_=fdl.BaselineType.STATIC,
    dataset_id=DATASET_ID,
)
baseline.create()
# Rolling baseline
baseline = fdl.Baseline(
    name=BASELINE_NAME,
    model_id=MODEL_ID,
    environment=fdl.EnvType.PRODUCTION,
    type_=fdl.BaselineType.ROLLING,
    window_bin_size=fdl.WindowBinSize.HOUR,
    offset_delta=fdl.WindowBinSize.HOUR,
)
baseline.create()
List Baselines
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
baselines = client.list_baselines(project_id=PROJECT_ID)
3.x :
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
baselines = fdl.Baseline.list(model_id=MODEL_ID)
Delete Baselines
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
BASELINE_ID = 'YOUR_BASELINE_NAME'
client.delete_baseline(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    baseline_id=BASELINE_ID
)
3.x :
BASELINE_ID = '5e1e67d2-5170-45ce-a851-68bdde1ac1ad'
baseline = fdl.Baseline.get(id_=BASELINE_ID)
baseline.delete()
Event Publishing
📘 Source
In 3.x, the publish source can be either a batch upload (CSV file path, Parquet file path, or pandas DataFrame) or a streaming upload (a Python list of event dictionaries).
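As a sketch, any of the following is a valid source in 3.x (the file names and the event dictionary are illustrative assumptions; model is an existing fdl.Model):
import pandas as pd
# Batch upload: a CSV path, a Parquet path, or a pandas DataFrame
model.publish(source='events.csv')
model.publish(source='events.parquet')
model.publish(source=pd.read_csv('events.csv'))
# Streaming upload: a Python list of event dictionaries
model.publish(source=[{'event_id': 'e1', 'timestamp': '2024-01-01 00:00:00'}])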
Publish batch production events
Pre-requisite :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
production_df = pd.read_csv(PATH_TO_EVENTS_CSV)
2.x :
job = client.publish_events_batch(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    id_field='event_id',
    batch_source=production_df,
    timestamp_field='timestamp',
    update_event=False,
)
3.x :
# Set up project and model objects in 3.x
project = fdl.Project.from_name(name=PROJECT_ID)
model = fdl.Model.from_name(project_id=project.id, name=MODEL_ID)
# Before publishing events, make sure you update the necessary fields of the
# model (if any). This is needed only once after you switch to the 3.x client.
model.event_ts_col = 'timestamp'
model.event_id_col = 'event_id'
model.update()
job = model.publish(source=production_df, update=False)
job.wait()
For more examples of update=True in 3.x, refer to Publish.
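As an illustrative sketch (the DataFrame columns here are assumptions, not the full Publish reference), updating previously published events looks like:
import pandas as pd
# Republish rows keyed by the model's event id column to update earlier events
updates_df = pd.DataFrame({'event_id': ['e1', 'e2'], 'Churned': [1, 0]})
job = model.publish(source=updates_df, update=True)
job.wait()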
Publish production events streaming
Pre-requisite :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
event_dict = {...}
2.x :
client.publish_event(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    event=event_dict,
    event_timestamp=event_time,
    event_id=event_id_tmp,
    update_event=False
)
3.x :
# Set up project and model objects in 3.x
project = fdl.Project.from_name(name=PROJECT_ID)
model = fdl.Model.from_name(project_id=project.id, name=MODEL_ID)
# Before publishing events, make sure you update the necessary fields of the
# model (if any). This is needed only once after switching to the 3.x client.
model.event_ts_col = 'timestamp'
model.event_id_col = 'event_id'
model.update()
# Add the corresponding fields to every event you want to publish
event_dict['event_id'] = event_id_tmp
event_dict['timestamp'] = event_time
# Wrap the dictionary in a list for streaming publish
model.publish(source=[event_dict])
Custom Metrics
Get Custom Metric
2.x :
CUSTOM_METRIC_ID = 'YOUR_METRIC_NAME'
client.get_custom_metric(metric_id=CUSTOM_METRIC_ID)
3.x :
# From UUID
CUSTOM_METRIC_ID = '7057867c-6dd8-4915-89f2-a5f253dd4a3a'
custom_metric = fdl.CustomMetric.get(id_=CUSTOM_METRIC_ID)
# From name
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
CUSTOM_METRIC_NAME = 'YOUR_METRIC_NAME'
custom_metric = fdl.CustomMetric.from_name(
    name=CUSTOM_METRIC_NAME,
    model_id=MODEL_ID,
)
List Custom Metrics
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
client.get_custom_metrics(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
)
3.x :
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
custom_metrics = fdl.CustomMetric.list(model_id=MODEL_ID)
Add Custom Metric
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
CUSTOM_METRIC_NAME = 'YOUR_METRIC_NAME'
DEFINITION = 'average("Age")'
DESCRIPTION = 'Testing custom metrics'
client.add_custom_metric(
    name=CUSTOM_METRIC_NAME,
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    definition=DEFINITION,
    description=DESCRIPTION,
)
3.x :
CUSTOM_METRIC_NAME = 'YOUR_METRIC_NAME'
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
DEFINITION = 'average("Age")'
DESCRIPTION = 'Testing custom metrics'
custom_metric = fdl.CustomMetric(
    name=CUSTOM_METRIC_NAME,
    model_id=MODEL_ID,
    definition=DEFINITION,
    description=DESCRIPTION,
)
custom_metric.create()
Delete Custom Metric
2.x :
CUSTOM_METRIC_ID = 'YOUR_METRIC_NAME'
client.delete_custom_metric(metric_id=CUSTOM_METRIC_ID)
3.x :
CUSTOM_METRIC_ID = '7057867c-6dd8-4915-89f2-a5f253dd4a3a'
custom_metric = fdl.CustomMetric.get(id_=CUSTOM_METRIC_ID)
custom_metric.delete()
Segments
Get Segment
2.x :
SEGMENT_ID = 'YOUR_SEGMENT_NAME'
client.get_segment(segment_id=SEGMENT_ID)
3.x :
# From UUID
SEGMENT_ID = '2c22a28b-08b8-4dd6-9238-7d7f1b5b4cb7'
segment = fdl.Segment.get(id_=SEGMENT_ID)
# From name
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
SEGMENT_NAME = 'YOUR_SEGMENT_NAME'
segment = fdl.Segment.from_name(
    name=SEGMENT_NAME,
    model_id=MODEL_ID,
)
List Segments
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
client.get_segments(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
)
3.x :
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
segments = fdl.Segment.list(model_id=MODEL_ID)
Add Segment
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
SEGMENT_NAME = 'YOUR_SEGMENT_NAME'
DEFINITION = 'Age < 60'
DESCRIPTION = 'Testing segment'
client.add_segment(
    name=SEGMENT_NAME,
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    definition=DEFINITION,
    description=DESCRIPTION,
)
3.x :
SEGMENT_NAME = 'YOUR_SEGMENT_NAME'
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
DEFINITION = 'Age < 60'
DESCRIPTION = 'Testing segment'
segment = fdl.Segment(
    name=SEGMENT_NAME,
    model_id=MODEL_ID,
    definition=DEFINITION,
    description=DESCRIPTION
)
segment.create()
Delete Segment
2.x :
SEGMENT_ID = 'YOUR_SEGMENT_NAME'
client.delete_segment(
    segment_id=SEGMENT_ID
)
3.x :
SEGMENT_ID = '2c22a28b-08b8-4dd6-9238-7d7f1b5b4cb7'
segment = fdl.Segment.get(id_=SEGMENT_ID)
segment.delete()
Alerts
List Alert Rules
2.x :
MODEL_ID = 'YOUR_MODEL_NAME'
rules = client.get_alert_rules(model_id=MODEL_ID)
3.x :
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
rules = fdl.AlertRule.list(model_id=MODEL_ID)
Add Alert Rule
2.x :
notifications_config = client.build_notifications_config(
    emails="name@google.com",
)
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
rule = client.add_alert_rule(
    name="Bank Churn Range Violation Alert1",
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    alert_type=fdl.AlertType.DATA_INTEGRITY,
    metric=fdl.Metric.RANGE_VIOLATION,
    bin_size=fdl.BinSize.ONE_DAY,
    compare_to=fdl.CompareTo.RAW_VALUE,
    compare_period=None,
    priority=fdl.Priority.HIGH,
    warning_threshold=2,
    critical_threshold=3,
    condition=fdl.AlertCondition.GREATER,
    column="numofproducts",
    notifications_config=notifications_config
)
3.x :
ALERT_NAME = 'YOUR_ALERT_NAME'
METRIC_NAME = 'null_violation_count'
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
rule = fdl.AlertRule(
    name=ALERT_NAME,
    model_id=MODEL_ID,
    metric_id=METRIC_NAME,
    priority=fdl.Priority.MEDIUM,
    compare_to=fdl.CompareTo.RAW_VALUE,
    condition=fdl.AlertCondition.GREATER,
    bin_size=fdl.BinSize.HOUR,
    critical_threshold=1,
    warning_threshold=0.32,
)
rule.create()
Delete Alert Rule
2.x :
ALERT_RULE_ID = '31109d19-b8aa-4db0-a4d5-aa0706987b1b'
client.delete_alert_rule(alert_rule_uuid=ALERT_RULE_ID)
3.x :
ALERT_RULE_ID = '31109d19-b8aa-4db0-a4d5-aa0706987b1b'
rule = fdl.AlertRule.get(id_=ALERT_RULE_ID)
rule.delete()
Get Triggered Alerts
2.x :
ALERT_RULE_ID = '31109d19-b8aa-4db0-a4d5-aa0706987b1b'
rules = client.get_triggered_alerts(alert_rule_uuid=ALERT_RULE_ID)
3.x :
from datetime import datetime
ALERT_RULE_ID = '31109d19-b8aa-4db0-a4d5-aa0706987b1b'
rules = fdl.AlertRecord.list(
    alert_rule_id=ALERT_RULE_ID,
    start_time=datetime(2024, 1, 1),  # example window start
    end_time=datetime(2024, 2, 1),    # example window end
)
Enable Notifications
2.x :
ALERT_RULE_ID = '31109d19-b8aa-4db0-a4d5-aa0706987b1b'
notifications = client.update_alert_rule(
    alert_config_uuid=ALERT_RULE_ID,
    enable_notification=True
)
3.x :
ALERT_RULE_ID = '31109d19-b8aa-4db0-a4d5-aa0706987b1b'
rule = fdl.AlertRule.get(id_=ALERT_RULE_ID)
rule.enable_notifications()
Disable Notifications
2.x :
ALERT_RULE_ID = '31109d19-b8aa-4db0-a4d5-aa0706987b1b'
notifications = client.update_alert_rule(
    alert_config_uuid=ALERT_RULE_ID,
    enable_notification=False
)
3.x :
ALERT_RULE_ID = '31109d19-b8aa-4db0-a4d5-aa0706987b1b'
rule = fdl.AlertRule.get(id_=ALERT_RULE_ID)
rule.disable_notifications()
Webhooks
Get Webhook
2.x :
WEBHOOK_UUID = '00cb3169-7983-497c-8f3c-d25df26543b0'
webhook = client.get_webhook(uuid=WEBHOOK_UUID)
3.x :
WEBHOOK_ID = '00cb3169-7983-497c-8f3c-d25df26543b0'
webhook = fdl.Webhook.get(id_=WEBHOOK_ID)
List Webhooks
2.x :
webhooks = client.get_webhooks()
3.x :
webhooks = fdl.Webhook.list()
Add Webhook
2.x :
WEBHOOK_NAME = 'YOUR_WEBHOOK_NAME'
WEBHOOK_URL = 'https://hooks.slack.com/services/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'
WEBHOOK_PROVIDER = 'SLACK'
webhook = client.add_webhook(
    name=WEBHOOK_NAME,
    url=WEBHOOK_URL,
    provider=WEBHOOK_PROVIDER
)
3.x :
WEBHOOK_NAME = 'YOUR_WEBHOOK_NAME'
WEBHOOK_URL = 'https://hooks.slack.com/services/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'
WEBHOOK_PROVIDER = 'SLACK'
webhook = fdl.Webhook(
    name=WEBHOOK_NAME,
    url=WEBHOOK_URL,
    provider=WEBHOOK_PROVIDER
)
webhook.create()
Update Webhook
2.x :
WEBHOOK_UUID = '00cb3169-7983-497c-8f3c-d25df26543b0'
WEBHOOK_NAME = 'YOUR_WEBHOOK_NAME'
WEBHOOK_URL = 'https://hooks.slack.com/services/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'
WEBHOOK_PROVIDER = 'SLACK'
webhook = client.update_webhook(
    uuid=WEBHOOK_UUID,
    name=WEBHOOK_NAME,
    url=WEBHOOK_URL,
    provider=WEBHOOK_PROVIDER
)
3.x :
WEBHOOK_ID = '00cb3169-7983-497c-8f3c-d25df26543b0'
webhook = fdl.Webhook.get(id_=WEBHOOK_ID)
webhook.name = 'YOUR_WEBHOOK_NAME_CHANGE'
webhook.update()
Delete Webhook
2.x :
WEBHOOK_UUID = '00cb3169-7983-497c-8f3c-d25df26543b0'
webhook = client.delete_webhook(uuid=WEBHOOK_UUID)
3.x :
WEBHOOK_ID = '00cb3169-7983-497c-8f3c-d25df26543b0'
webhook = fdl.Webhook.get(id_=WEBHOOK_ID)
webhook.delete()
XAI
Get Explanation
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
# RowDataSource
explain = client.get_explanation(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    input_data_source=fdl.RowDataSource(row={})
)
# EventIdDataSource
explain = client.get_explanation(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    input_data_source=fdl.EventIdDataSource()
)
3.x :
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
# RowDataSource
model = fdl.Model.get(id_=MODEL_ID)
explain = model.explain(
    input_data_source=fdl.RowDataSource(row={})
)
# EventIdDataSource
model = fdl.Model.get(id_=MODEL_ID)
explain = model.explain(
    input_data_source=fdl.EventIdDataSource()
)
Get Fairness
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
# DatasetDataSource
fairness = client.get_fairness(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    data_source=fdl.DatasetDataSource(),
    protected_features=[],
    positive_outcome='',
)
# SqlSliceQueryDataSource
fairness = client.get_fairness(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    data_source=fdl.SqlSliceQueryDataSource(),
    protected_features=[],
    positive_outcome='',
)
3.x :
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
# DatasetDataSource
model = fdl.Model.get(id_=MODEL_ID)
fairness = model.get_fairness(
    data_source=fdl.DatasetDataSource(),
    protected_features=[],
    positive_outcome='',
)
# SqlSliceQueryDataSource
model = fdl.Model.get(id_=MODEL_ID)
fairness = model.get_fairness(
    data_source=fdl.SqlSliceQueryDataSource(),
    protected_features=[],
    positive_outcome='',
)
Get Feature Impact
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
# DatasetDataSource
impact = client.get_feature_impact(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    data_source=fdl.DatasetDataSource()
)
# SqlSliceQueryDataSource
impact = client.get_feature_impact(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    data_source=fdl.SqlSliceQueryDataSource()
)
3.x :
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
# DatasetDataSource
model = fdl.Model.get(id_=MODEL_ID)
impact = model.get_feature_impact(
    data_source=fdl.DatasetDataSource()
)
# SqlSliceQueryDataSource
model = fdl.Model.get(id_=MODEL_ID)
impact = model.get_feature_impact(
    data_source=fdl.SqlSliceQueryDataSource()
)
Get Feature Importance
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
MODEL_ID = 'YOUR_MODEL_NAME'
# DatasetDataSource
importance = client.get_feature_importance(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    data_source=fdl.DatasetDataSource()
)
# SqlSliceQueryDataSource
importance = client.get_feature_importance(
    project_id=PROJECT_ID,
    model_id=MODEL_ID,
    data_source=fdl.SqlSliceQueryDataSource()
)
3.x :
MODEL_ID = 'e38ab1ad-1a50-40b8-8bee-ab33cd8b9b93'
# DatasetDataSource
model = fdl.Model.get(id_=MODEL_ID)
importance = model.get_feature_importance(
    data_source=fdl.DatasetDataSource()
)
# SqlSliceQueryDataSource
model = fdl.Model.get(id_=MODEL_ID)
importance = model.get_feature_importance(
    data_source=fdl.SqlSliceQueryDataSource()
)
Get Mutual Information
2.x :
PROJECT_ID = 'YOUR_PROJECT_NAME'
DATASET_ID = 'YOUR_DATASET_NAME'
MODEL_NAME = 'YOUR_MODEL_NAME'
QUERY = f'select * from production.{MODEL_NAME} limit 10'
mutual_info = client.get_mutual_information(
    project_id=PROJECT_ID,
    dataset_id=DATASET_ID,
    query=QUERY,
    column_name=''
)