OpenTelemetry Quick Start

Monitor custom AI agents and multi-framework agentic applications with Fiddler using OpenTelemetry's native instrumentation.

What You'll Learn

In this guide, you'll learn how to:

  • Set up OpenTelemetry tracing for custom agent frameworks

  • Configure Fiddler as your OTLP endpoint with proper authentication

  • Map agent attributes to Fiddler's semantic conventions

  • Create instrumented LLM and tool spans with required attributes

  • Verify traces in the Fiddler dashboard

Time to complete: ~10-15 minutes

When to Use OpenTelemetry Integration

This guide is for advanced users and specific scenarios:

  • Multi-framework environments requiring unified observability across different agent frameworks

  • Custom agentic frameworks without dedicated Fiddler SDK support

  • Advanced control over instrumentation and attribute mapping

When to Use Fiddler SDKs Instead:

SDKs provide automatic instrumentation and require significantly less code. Use OpenTelemetry when SDKs don't fit your use case.

Prerequisites

Before you begin, ensure you have:

  • Fiddler Account: An active account with a GenAI application created

  • Python 3.10+

  • OpenTelemetry Packages:

    • pip install opentelemetry-api opentelemetry-sdk opentelemetry-exporter-otlp-proto-http

  • LLM Provider (for examples): OpenAI API key or similar

  • Fiddler Access Token: Get your token from Settings > Credentials

For a complete working example with advanced patterns, download the Advanced OpenTelemetry Notebook from GitHub or open it in Google Colab.

Step 1: Create Fiddler Application

  1. Log in to your Fiddler instance and navigate to GenAI Apps

  2. Select "Add Application" to create a new application

  3. Copy your Application ID - This must be a valid UUID4 format (e.g., 550e8400-e29b-41d4-a716-446655440000)

  4. Get your Access Token from Settings > Credentials

Step 2: Configure Environment Variables

Set up your environment to connect to Fiddler's OTLP endpoint:

export OTEL_EXPORTER_OTLP_ENDPOINT="https://your-instance.fiddler.ai"
export OTEL_EXPORTER_OTLP_HEADERS="authorization=Bearer <YOUR_ACCESS_TOKEN>,fiddler-application-id=<YOUR_APPLICATION_UUID>"
export OTEL_RESOURCE_ATTRIBUTES="application.id=<YOUR_APPLICATION_UUID>"

Environment Variable Breakdown:

Variable | Description | Example
OTEL_EXPORTER_OTLP_ENDPOINT | Your Fiddler instance URL | https://org.fiddler.ai
OTEL_EXPORTER_OTLP_HEADERS | Authentication and app ID headers | authorization=Bearer sk-...,fiddler-application-id=550e8400...
OTEL_RESOURCE_ATTRIBUTES | Resource-level application identifier | application.id=550e8400-e29b-41d4-a716-446655440000

Python Configuration (alternative to environment variables):

import os

os.environ['OTEL_EXPORTER_OTLP_ENDPOINT'] = 'https://your-instance.fiddler.ai'
os.environ['OTEL_EXPORTER_OTLP_HEADERS'] = 'authorization=Bearer <TOKEN>,fiddler-application-id=<UUID>'
os.environ['OTEL_RESOURCE_ATTRIBUTES'] = 'application.id=<UUID>'

Step 3: Initialize OpenTelemetry

Set up OpenTelemetry with Fiddler's OTLP exporter:

import os
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Initialize tracer provider
trace.set_tracer_provider(TracerProvider())
tracer = trace.get_tracer(__name__)

# Configure OTLP exporter for Fiddler
otlp_endpoint = os.getenv('OTEL_EXPORTER_OTLP_ENDPOINT') + '/v1/traces'
otlp_exporter = OTLPSpanExporter(endpoint=otlp_endpoint)

# Add batch span processor
otlp_processor = BatchSpanProcessor(otlp_exporter)
trace.get_tracer_provider().add_span_processor(otlp_processor)

print(f"✅ OpenTelemetry configured with endpoint: {otlp_endpoint}")

What This Does:

  • TracerProvider: Manages trace generation

  • OTLPSpanExporter: Exports spans to Fiddler via OTLP protocol

  • BatchSpanProcessor: Batches spans for efficient network transmission

Local Debugging: Add a console exporter to see traces locally while developing:

from opentelemetry.sdk.trace.export import ConsoleSpanExporter

console_exporter = ConsoleSpanExporter()
console_processor = BatchSpanProcessor(console_exporter)
trace.get_tracer_provider().add_span_processor(console_processor)

Step 4: Instrument Your Agent

Create instrumented spans for your agent's operations. Fiddler requires specific attributes to properly categorize and visualize your agent traces.

Required Fiddler Attributes

Resource Level (set via environment variable):

  • application.id - UUID4 of your Fiddler application

Trace Level (required in all spans):

  • gen_ai.agent.name - Name of your AI agent

  • gen_ai.agent.id - Unique identifier for the agent

Span Level (required for each span):

  • fiddler.span.type - Type of operation: "chain", "tool", "llm", or "other" (a helper that applies all required attributes is sketched after this list)
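
To avoid repeating these three attributes by hand, you can wrap span creation in a small helper. The sketch below is not part of any Fiddler SDK; it assumes the tracer from Step 3 and reuses the AGENT_NAME and AGENT_ID constants from the example that follows.

from contextlib import contextmanager
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

AGENT_NAME = "travel_agent"   # assumption: same agent constants as the example below
AGENT_ID = "travel_agent_v1"

@contextmanager
def fiddler_span(name: str, span_type: str):
    """Start a span with Fiddler's required attributes pre-set.

    span_type must be one of: "chain", "tool", "llm", "other".
    """
    with tracer.start_as_current_span(name) as span:
        span.set_attribute("fiddler.span.type", span_type)
        span.set_attribute("gen_ai.agent.name", AGENT_NAME)
        span.set_attribute("gen_ai.agent.id", AGENT_ID)
        yield span

# Usage: any span created this way satisfies the span-level requirements.
with fiddler_span("plan_trip", "chain") as span:
    span.set_attribute("gen_ai.conversation.id", "conv_123")  # optional attributes still apply normally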

Example: Simplified Travel Agent

import json
from openai import OpenAI

client = OpenAI()

AGENT_NAME = "travel_agent"
AGENT_ID = "travel_agent_v1"

# Define tools
def book_hotel_tool(city: str, date: str):
    """Book a hotel in the specified city."""
    with tracer.start_as_current_span("book_hotel") as span:
        # Required attributes
        span.set_attribute("fiddler.span.type", "tool")
        span.set_attribute("gen_ai.agent.name", AGENT_NAME)
        span.set_attribute("gen_ai.agent.id", AGENT_ID)

        # Tool-specific attributes
        span.set_attribute("gen_ai.tool.name", "book_hotel")
        tool_input = {"city": city, "date": date}
        span.set_attribute("gen_ai.tool.input", json.dumps(tool_input))

        # Execute tool
        result = {"status": "confirmed", "hotel": f"Grand Hotel {city}", "confirmation": "HTL123"}
        span.set_attribute("gen_ai.tool.output", json.dumps(result))

        return result

def book_flight_tool(source: str, destination: str, date: str):
    """Book a flight between two cities."""
    with tracer.start_as_current_span("book_flight") as span:
        # Required attributes
        span.set_attribute("fiddler.span.type", "tool")
        span.set_attribute("gen_ai.agent.name", AGENT_NAME)
        span.set_attribute("gen_ai.agent.id", AGENT_ID)

        # Tool-specific attributes
        span.set_attribute("gen_ai.tool.name", "book_flight")
        tool_input = {"source": source, "destination": destination, "date": date}
        span.set_attribute("gen_ai.tool.input", json.dumps(tool_input))

        # Execute tool
        result = {"status": "confirmed", "flight": "FL456", "departure": "10:00 AM"}
        span.set_attribute("gen_ai.tool.output", json.dumps(result))

        return result

# Agent implementation
def travel_agent(user_request: str):
    """Main travel agent function."""
    with tracer.start_as_current_span("travel_agent_chain") as root_span:
        # Root span type
        root_span.set_attribute("fiddler.span.type", "chain")
        root_span.set_attribute("gen_ai.agent.name", AGENT_NAME)
        root_span.set_attribute("gen_ai.agent.id", AGENT_ID)

        # Call LLM to understand request
        with tracer.start_as_current_span("llm_call") as llm_span:
            # Required attributes
            llm_span.set_attribute("fiddler.span.type", "llm")
            llm_span.set_attribute("gen_ai.agent.name", AGENT_NAME)
            llm_span.set_attribute("gen_ai.agent.id", AGENT_ID)

            # LLM-specific attributes
            llm_span.set_attribute("gen_ai.request.model", "gpt-4o-mini")
            llm_span.set_attribute("gen_ai.system", "openai")
            llm_span.set_attribute("gen_ai.llm.input.user", user_request)
            llm_span.set_attribute(
                "gen_ai.llm.input.system",
                "You are a travel agent. Parse user requests and call appropriate tools."
            )

            # Call OpenAI
            response = client.chat.completions.create(
                model="gpt-4o-mini",
                messages=[
                    {"role": "system", "content": "You are a travel agent. Parse user requests and call appropriate tools."},
                    {"role": "user", "content": user_request}
                ],
                tools=[
                    {
                        "type": "function",
                        "function": {
                            "name": "book_hotel",
                            "description": "Book a hotel in a city for a specific date",
                            "parameters": {
                                "type": "object",
                                "properties": {
                                    "city": {"type": "string"},
                                    "date": {"type": "string"}
                                },
                                "required": ["city", "date"]
                            }
                        }
                    },
                    {
                        "type": "function",
                        "function": {
                            "name": "book_flight",
                            "description": "Book a flight between two cities",
                            "parameters": {
                                "type": "object",
                                "properties": {
                                    "source": {"type": "string"},
                                    "destination": {"type": "string"},
                                    "date": {"type": "string"}
                                },
                                "required": ["source", "destination", "date"]
                            }
                        }
                    }
                ]
            )

            # Set token usage
            llm_span.set_attribute("gen_ai.usage.input_tokens", response.usage.prompt_tokens)
            llm_span.set_attribute("gen_ai.usage.output_tokens", response.usage.completion_tokens)
            llm_span.set_attribute("gen_ai.usage.total_tokens", response.usage.total_tokens)

            # Process tool calls
            tool_results = []
            if response.choices[0].message.tool_calls:
                for tool_call in response.choices[0].message.tool_calls:
                    tool_name = tool_call.function.name
                    tool_args = json.loads(tool_call.function.arguments)

                    if tool_name == "book_hotel":
                        result = book_hotel_tool(**tool_args)
                        tool_results.append(result)
                    elif tool_name == "book_flight":
                        result = book_flight_tool(**tool_args)
                        tool_results.append(result)

            llm_span.set_attribute("gen_ai.llm.output",
                                 f"Called tools and received: {tool_results}")

        return {"status": "success", "bookings": tool_results}

# Run the agent
result = travel_agent("Book a hotel in Paris for tomorrow and a flight from London to Paris")
print(f"Agent result: {result}")

Key Implementation Details:

  • Chain Spans: Use fiddler.span.type = "chain" for high-level workflows

  • LLM Spans: Include model, system prompt, user input, output, and token usage

  • Tool Spans: Include tool name, input JSON, and output JSON

  • Nested Spans: Create parent-child relationships to show execution flow

Step 5: Verify Monitoring

  1. Run your instrumented code using the example above

  2. Wait 1-2 minutes for traces to appear in Fiddler (for short-lived scripts, flush pending spans before exit; see the sketch after this list)

  3. Navigate to GenAI Apps in your Fiddler instance

  4. Verify application status changes to Active

  5. View traces to see your agent spans, hierarchy, and attributes
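
Because BatchSpanProcessor exports spans asynchronously, short-lived scripts can exit before the final batch is sent. Flushing (or shutting down) the tracer provider before exit avoids this; the sketch below uses standard OpenTelemetry SDK calls and assumes the provider configured in Step 3.

from opentelemetry import trace

provider = trace.get_tracer_provider()

# Block until queued spans have been exported (timeout in milliseconds).
provider.force_flush(timeout_millis=10000)

# At the very end of the program, shutdown() also flushes any remaining spans.
provider.shutdown()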

Success Criteria:

✅ Application shows as Active in GenAI Apps
✅ Traces appear with correct agent name
✅ Span hierarchy shows chain → LLM → tools relationship
✅ All required attributes are present (agent name, agent ID, span type)
✅ LLM token usage is tracked
✅ Tool inputs and outputs are captured

Attribute Reference

Required Attributes

Resource Level:

Attribute | Type | Description | Example
application.id | string | UUID4 of your Fiddler application | "550e8400-e29b-41d4-a716-446655440000"

Trace Level (all spans):

Attribute | Type | Description | Example
gen_ai.agent.name | string | Name of the AI agent | "travel_agent"
gen_ai.agent.id | string | Unique identifier for the agent | "travel_agent_v1"

Span Level:

Attribute | Type | Description | Valid Values
fiddler.span.type | string | Type of operation | "chain", "tool", "llm", "other"

Optional Attributes

Conversation Tracking:

Attribute | Type | Description | Example
gen_ai.conversation.id | string | Session/conversation identifier | "conv_123"

LLM Span Attributes:

Attribute | Type | Description | Example
gen_ai.request.model | string | Model name | "gpt-4o-mini", "claude-3-opus"
gen_ai.system | string | LLM provider | "openai", "anthropic"
gen_ai.llm.input.system | string | System prompt | "You are a helpful assistant"
gen_ai.llm.input.user | string | User input | "What's the weather?"
gen_ai.llm.output | string | LLM response | "The weather is sunny"
gen_ai.usage.input_tokens | int | Input tokens used | 42
gen_ai.usage.output_tokens | int | Output tokens used | 28
gen_ai.usage.total_tokens | int | Total tokens used | 70

Tool Span Attributes:

Attribute | Type | Description | Example
gen_ai.tool.name | string | Tool/function name | "search_database"
gen_ai.tool.input | string | Tool input (JSON) | "{\"query\": \"hotels\"}"
gen_ai.tool.output | string | Tool output (JSON) | "{\"results\": [...]}"

Custom User-Defined Attributes:

Pattern | Level | Example
fiddler.session.user.{key} | Trace (all spans) | fiddler.session.user.user_id = "usr_123"
fiddler.span.user.{key} | Span (individual) | fiddler.span.user.department = "sales"
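
A minimal sketch of both patterns, assuming the tracer and agent constants from the earlier steps; the key names (user_id, department) are just examples:

with tracer.start_as_current_span("handle_request") as span:
    span.set_attribute("fiddler.span.type", "chain")
    span.set_attribute("gen_ai.agent.name", "travel_agent")
    span.set_attribute("gen_ai.agent.id", "travel_agent_v1")

    # Session-level custom attribute (trace scope, per the table above)
    span.set_attribute("fiddler.session.user.user_id", "usr_123")

    # Span-level custom attribute (applies to this span only)
    span.set_attribute("fiddler.span.user.department", "sales")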

Troubleshooting

Common Issues

Problem: Application not showing as "Active"

Solutions:

  1. Verify environment variables are set correctly

  2. Check that OTEL_EXPORTER_OTLP_ENDPOINT includes your Fiddler instance URL

  3. Ensure OTEL_EXPORTER_OTLP_HEADERS contains valid authorization token and application ID

  4. Add console exporter to verify spans are being generated locally

  5. Check network connectivity: curl -I https://your-instance.fiddler.ai

Problem: ModuleNotFoundError for OpenTelemetry packages

Solutions:

# Install all required packages
pip install opentelemetry-api opentelemetry-sdk opentelemetry-exporter-otlp-proto-http

# Verify installation
pip list | grep opentelemetry

Problem: Spans not appearing in Fiddler

Solutions:

  1. Verify required attributes are set:

    # Every span MUST have these
    span.set_attribute("fiddler.span.type", "llm")  # or "tool", "chain", "other"
    span.set_attribute("gen_ai.agent.name", "your_agent")
    span.set_attribute("gen_ai.agent.id", "agent_id")
  2. Check resource attributes:

    # Verify application.id is set
    print(os.getenv('OTEL_RESOURCE_ATTRIBUTES'))
  3. Enable console exporter for debugging:

    from opentelemetry.sdk.trace.export import ConsoleSpanExporter
    console_exporter = ConsoleSpanExporter()
    console_processor = BatchSpanProcessor(console_exporter)
    trace.get_tracer_provider().add_span_processor(console_processor)

Problem: Authentication errors (401 Unauthorized)

Solutions:

  1. Regenerate your access token from Fiddler Settings > Credentials

  2. Verify header format: authorization=Bearer <token>,fiddler-application-id=<uuid>

  3. Ensure no extra spaces in header values

  4. Check token hasn't expired
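
If you suspect a malformed header string, a quick local sanity check of OTEL_EXPORTER_OTLP_HEADERS can rule out formatting mistakes. This sketch only checks the format described above; it does not call Fiddler:

import os

headers = os.getenv("OTEL_EXPORTER_OTLP_HEADERS", "")
pairs = dict(item.split("=", 1) for item in headers.split(",") if "=" in item)

assert "authorization" in pairs, "missing authorization key"
assert pairs["authorization"].startswith("Bearer "), "token must be prefixed with 'Bearer '"
assert "fiddler-application-id" in pairs, "missing fiddler-application-id key"
assert ", " not in headers, "remove spaces after commas"
print("Header format looks OK")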

Problem: Invalid Application ID error

Solutions:

  1. Copy Application ID directly from Fiddler UI

  2. Verify UUID4 format: 550e8400-e29b-41d4-a716-446655440000

  3. Ensure no extra quotes or whitespace
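
You can also confirm the Application ID is a well-formed UUID4 with the standard library before sending any spans:

import uuid

app_id = "550e8400-e29b-41d4-a716-446655440000"  # paste your Application ID here

try:
    # UUID(..., version=4) forces the version bits, so comparing the
    # round-tripped string catches values that are not canonical UUID4.
    is_valid = str(uuid.UUID(app_id.strip(), version=4)) == app_id.strip().lower()
except ValueError:
    is_valid = False

print("Valid UUID4" if is_valid else "Invalid Application ID format")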

Configuration Options

Basic Configuration

import os
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Set environment variables
os.environ['OTEL_EXPORTER_OTLP_ENDPOINT'] = 'https://your-instance.fiddler.ai'
os.environ['OTEL_EXPORTER_OTLP_HEADERS'] = 'authorization=Bearer <TOKEN>,fiddler-application-id=<UUID>'
os.environ['OTEL_RESOURCE_ATTRIBUTES'] = 'application.id=<UUID>'

# Initialize tracing
trace.set_tracer_provider(TracerProvider())
tracer = trace.get_tracer(__name__)

# Configure OTLP exporter
otlp_endpoint = os.getenv('OTEL_EXPORTER_OTLP_ENDPOINT') + '/v1/traces'
otlp_exporter = OTLPSpanExporter(endpoint=otlp_endpoint)
otlp_processor = BatchSpanProcessor(otlp_exporter)
trace.get_tracer_provider().add_span_processor(otlp_processor)

Advanced Configuration

High-Volume Applications (Batch Processing Tuning):

from opentelemetry.sdk.trace.export import BatchSpanProcessor

# Customize batch processor settings
custom_processor = BatchSpanProcessor(
    otlp_exporter,
    max_queue_size=500,           # Default: 2048
    schedule_delay_millis=500,    # Default: 5000
    max_export_batch_size=50,     # Default: 512
    export_timeout_millis=10000   # Default: 30000
)

trace.get_tracer_provider().add_span_processor(custom_processor)

Environment Variable Configuration:

# Batch processor environment variables
export OTEL_BSP_MAX_QUEUE_SIZE=500
export OTEL_BSP_SCHEDULE_DELAY=500
export OTEL_BSP_MAX_EXPORT_BATCH_SIZE=50
export OTEL_BSP_EXPORT_TIMEOUT=10000

Sampling for Production (Reduce Volume):

from opentelemetry.sdk.trace import sampling

# Sample 10% of traces
sampler = sampling.TraceIdRatioBased(0.1)

# Create provider with sampler
provider = TracerProvider(sampler=sampler)
trace.set_tracer_provider(provider)

Compression (Reduce Network Usage):

from opentelemetry.exporter.otlp.proto.http.trace_exporter import Compression

# Enable gzip compression
otlp_exporter = OTLPSpanExporter(
    endpoint=otlp_endpoint,
    compression=Compression.Gzip
)

Using FiddlerClient Alternative (Simplified Setup):

If you have fiddler-langgraph installed, you can use FiddlerClient.get_tracer() for simplified setup:

from fiddler_langgraph import FiddlerClient

client = FiddlerClient(
    api_key='<FIDDLER_API_TOKEN>',
    application_id='<FIDDLER_APPLICATION_ID>',
    url='<FIDDLER_URL>'
)

# Get pre-configured tracer
tracer = client.get_tracer()

# Use tracer normally with manual span creation
with tracer.start_as_current_span("my_operation") as span:
    span.set_attribute("fiddler.span.type", "chain")
    # ... rest of your code

This approach handles OTLP configuration automatically.

Next Steps

Now that you have OpenTelemetry integration working: