FiddlerGeneration

Wrapper for LLM generation spans with semantic convention helpers.

__init__()

Initialize the LLM generation wrapper.

__enter__()

Enter the context manager and mark the span as an LLM generation.

Return type: FiddlerGeneration

set_model()

Set the LLM model name (gen_ai.request.model).

Return type: None

set_system()

Set the LLM system/provider (gen_ai.system).

Return type: None

set_system_prompt()

Set the system prompt (gen_ai.llm.input.system).

Return type: None

set_user_prompt()

Set the user prompt (gen_ai.llm.input.user).

Return type: None

set_completion()

Set the LLM completion/output (gen_ai.llm.output).

Return type: None

set_usage()

Set token usage information (gen_ai.usage.*).

Return type: None
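The `gen_ai.usage.*` attributes follow the OpenTelemetry GenAI semantic conventions. As a minimal sketch of the attribute shape, assuming the conventional `input_tokens`/`output_tokens` key names (the exact keys Fiddler writes, and the parameters `set_usage()` accepts, are not shown on this page):

```python
def usage_attributes(prompt_tokens: int, completion_tokens: int) -> dict:
    """Build a gen_ai.usage.* attribute dict.

    Key names are assumed from the OTel GenAI semantic conventions;
    this is an illustration, not the SDK's implementation.
    """
    return {
        "gen_ai.usage.input_tokens": prompt_tokens,
        "gen_ai.usage.output_tokens": completion_tokens,
    }
```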

set_context()

Set additional context (gen_ai.llm.context).

Return type: None

set_messages()

Set input messages in OpenAI chat format (gen_ai.input.messages).

Accepts simple format: [{'role': 'user', 'content': '...'}] Auto-converts to OTel format: [{'role': 'user', 'parts': [{'type': 'text', 'content': '...'}]}]

Return type: None
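The auto-conversion described above can be sketched as follows. This is an illustration of the documented transformation, not the SDK's actual code, and `to_otel_messages` is a hypothetical helper name; the same shape applies to assistant messages passed to `set_output_messages()`:

```python
def to_otel_messages(messages: list[dict]) -> list[dict]:
    """Convert simple chat-format messages to the OTel parts format.

    [{'role': 'user', 'content': '...'}]
      -> [{'role': 'user', 'parts': [{'type': 'text', 'content': '...'}]}]
    """
    return [
        {"role": m["role"], "parts": [{"type": "text", "content": m["content"]}]}
        for m in messages
    ]
```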

set_output_messages()

Set output messages in OpenAI chat format (gen_ai.output.messages).

Accepts simple format: [{'role': 'assistant', 'content': '...'}] Auto-converts to OTel format: [{'role': 'assistant', 'parts': [{'type': 'text', 'content': '...'}]}]

Return type: None

set_tool_definitions()

Set available tool definitions for this LLM call (gen_ai.tool.definitions).

Return type: None
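Taken together, the methods above suggest a usage pattern like the following. This is a sketch against a minimal stand-in class, since this page names the methods and attribute keys but not their signatures: the parameter names and the context-manager behavior are assumptions, and the real `FiddlerGeneration` comes from the Fiddler SDK and records these values on a span rather than in a plain dict.

```python
class FiddlerGenerationStub:
    """Stand-in mirroring the documented interface, for illustration only.

    Each setter stores its value under the semantic-convention attribute
    key documented for the corresponding FiddlerGeneration method.
    """

    def __init__(self):
        self.attributes = {}

    def __enter__(self):
        # The real __enter__() returns the wrapper itself and sets the
        # span's LLM type.
        return self

    def __exit__(self, exc_type, exc, tb):
        return False

    def set_model(self, model):
        self.attributes["gen_ai.request.model"] = model

    def set_system(self, system):
        self.attributes["gen_ai.system"] = system

    def set_user_prompt(self, prompt):
        self.attributes["gen_ai.llm.input.user"] = prompt

    def set_completion(self, output):
        self.attributes["gen_ai.llm.output"] = output


# Hypothetical call sequence; model and provider values are examples.
with FiddlerGenerationStub() as gen:
    gen.set_model("gpt-4o")
    gen.set_system("openai")
    gen.set_user_prompt("What is observability?")
    gen.set_completion("Observability is ...")
```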
