Quickstart: For a simple end-to-end example, check the Tutorial.

Installation

Install the tracely package from PyPI:

pip install tracely

Initialize tracing

You must first connect to Evidently Cloud and create a Project.

To start sending traces, use init_tracing:

from tracely import init_tracing

init_tracing(
    address="https://app.evidently.cloud/",
    api_key="YOUR_EVIDENTLY_TOKEN",
    project_id="YOUR_PROJECT_ID",
    export_name="YOUR_TRACING_DATASET_NAME",
)

You can also set these parameters using the environment variables listed in the table below.

init_tracing() Function Arguments

Parameter | Description | Environment variable
address | Trace collector address. Defaults to https://app.evidently.cloud/. | EVIDENTLY_TRACE_COLLECTOR
api_key | Evidently Cloud API key. | EVIDENTLY_TRACE_COLLECTOR_API_KEY or EVIDENTLY_API_KEY
export_name | Tracing dataset name. Traces with the same name are grouped into a single dataset. | EVIDENTLY_TRACE_COLLECTOR_EXPORT_NAME
project_id | Destination Project ID in Evidently Cloud. | EVIDENTLY_TRACE_COLLECTOR_PROJECT_ID
exporter_type | Trace export protocol: grpc or http. | -
as_global | Registers the tracing provider (opentelemetry.trace.TracerProvider) globally (True) or keeps it local (False). Default: True. | -
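
For example, here is a minimal sketch of configuring Tracely through environment variables instead of arguments. The values are placeholders, and setting them via os.environ is only for illustration; in practice you would export them in your shell or deployment config.

import os

from tracely import init_tracing

# placeholder values; normally these are exported in the environment
os.environ["EVIDENTLY_TRACE_COLLECTOR"] = "https://app.evidently.cloud/"
os.environ["EVIDENTLY_API_KEY"] = "YOUR_EVIDENTLY_TOKEN"
os.environ["EVIDENTLY_TRACE_COLLECTOR_PROJECT_ID"] = "YOUR_PROJECT_ID"
os.environ["EVIDENTLY_TRACE_COLLECTOR_EXPORT_NAME"] = "YOUR_TRACING_DATASET_NAME"

# with the variables set, init_tracing can be called without arguments
init_tracing()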

Decorator

Once Tracely is initialized, you can decorate your functions with trace_event to start collecting traces for a specific function:

from tracely import init_tracing
from tracely import trace_event

@trace_event()
def process_request(question: str, session_id: str):
    # do work
    return "work done"

You can also specify which function arguments should be included in the trace.

Example 1. To log all arguments of the function:

@trace_event()

Example 2. To log none of the function arguments:

@trace_event(track_args=[])

Example 3. To log only “arg1” and “arg2”:

@trace_event(track_args=["arg1", "arg2"])

trace_event Decorator Arguments

Parameter | Description | Default
span_name: Optional[str] | The name of the span to send in the event. | Function name
track_args: Optional[List[str]] | A list of function arguments to include in the event. | None (all arguments included)
ignore_args: Optional[List[str]] | A list of function arguments to exclude, e.g., arguments that contain sensitive data. | None (no arguments ignored)
track_output: Optional[bool] | Indicates whether to track the function’s return value. | True
parse_output: Optional[bool] | Indicates whether the result should be parsed, e.g., dict, list, and tuple types will be split into separate fields. | True
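
For illustration, a hypothetical example combining several of these arguments (the function and argument names are made up):

@trace_event(span_name="answer_question", ignore_args=["api_key"], track_output=True)
def answer_question(question: str, api_key: str):
    # "api_key" is excluded from the trace; "question" and the return value are logged
    return "the answer"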

Nested events (Spans)

Many LLM workflows involve multiple steps — such as retrieval followed by generation, or extraction followed by summarization. In these cases, it’s useful to trace all steps as part of a single parent trace, with each step recorded as a nested child span.

You can trace multi-step workflows with the @trace_event decorator by nesting the decorated function calls. If a traced function is called inside another traced function, it automatically appears as a nested child span, as long as it is executed in the same call context (same thread).

For example:

@trace_event(span_name="extraction")
def extract_info(document):
    # extract key information from the document
    ...

@trace_event(span_name="summarization")
def summarize_info(document):
    # summarize the document
    ...

@trace_event(span_name="document_processing")
def process_document(document):
    extract_output = extract_info(document)
    summary_output = summarize_info(document)
    return {
        "document": document,
        "extraction_output": extract_output,
        "summary_output": summary_output
    }

This results in the following trace structure:

document_processing
├── extraction
└── summarization

Context manager

To create a trace event without using a decorator (e.g., for a specific piece of code), you can use the context manager:

import uuid

from tracely import init_tracing
from tracely import create_trace_event

init_tracing()

session_id = str(uuid.uuid4())

with create_trace_event("external_span", session_id=session_id) as event:
    event.set_attribute("my-attribute", "value")
    # do work
    event.set_result({"data": "data"})

You can also trace multi-step workflows using context blocks. This gives you fine-grained control — useful when tracing inline code or scripts. For example, you can nest multiple create_trace_event() calls inline inside the same function, using with blocks.

def process_document(document):
    with create_trace_event("document_processing", document=document):
        with create_trace_event("extraction"):
            ...
        with create_trace_event("summarization"):
            ...

create_trace_event Function Arguments

Parameter | Description | Default
name | Span name. | -
parse_output | Whether to parse the result into separate fields for dict, list, or tuple types. | True
params | Key-value parameters to set as attributes. | -
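
As in the example above, key-value parameters are passed as keyword arguments and attached to the span as attributes. A hedged sketch, assuming parse_output can be passed the same way (the attribute name and result values are illustrative):

from tracely import create_trace_event

with create_trace_event("generation", parse_output=False, model_name="my-model") as event:
    # do work
    event.set_result({"text": "...", "tokens": 42})  # kept as a single field since parse_output=False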

event Object Methods

Method | Description
set_attribute | Sets a custom attribute for the event.
set_result | Sets a result for the event. Only one result can be set per event.

Sessions

If your trace events are created in separate functions or threads, you can also pass a shared session_id. The traces will remain separate, but you can view the session in the UI to join them together, for example, to read a full chat conversation.

See the example in the “Context manager” section above.
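
For illustration, a sketch of two separately decorated functions sharing one session_id (the function names are hypothetical):

import uuid

from tracely import trace_event

@trace_event()
def answer(question: str, session_id: str):
    # do work
    return "answer"

@trace_event()
def follow_up(question: str, session_id: str):
    # do work
    return "follow-up answer"

session_id = str(uuid.uuid4())

# The two calls produce separate traces, but because they share the same
# session_id, they can be viewed together as one session in the UI.
answer("What is tracing?", session_id=session_id)
follow_up("Can you give an example?", session_id=session_id)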

Add event attributes

If you want to add a new attribute to an active event span, you can use get_current_span() to get access to the current span:

import uuid

from tracely import init_tracing
from tracely import create_trace_event
from tracely import get_current_span

init_tracing()

session_id = str(uuid.uuid4())

with create_trace_event("external_span", session_id=session_id):
    span = get_current_span()
    span.set_attribute("my-attribute", "value")
    # do work
    span.set_result({"data": "data"})

get_current_span() Object Methods

Method | Description
set_attribute | Adds a new attribute to the active span.
set_result | Sets a result field for the active span. (Has no effect in decorated functions that define return values.)
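
This also works inside functions decorated with trace_event, for example to attach an attribute computed mid-execution. A minimal sketch, assuming the decorator's span is the active span while the function body runs (the attribute name is illustrative):

from tracely import trace_event
from tracely import get_current_span

@trace_event()
def process_request(question: str, session_id: str):
    span = get_current_span()
    # attach a custom attribute to the span created by the decorator
    span.set_attribute("question_length", len(question))
    # do work
    return "work done"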

Connecting events into a trace

Sometimes events happen across different systems, but it’s helpful to link them all into a single trace. You can do this using tracely.bind_to_trace:

import tracely

@tracely.trace_event()
def process_request(question: str, session_id: str):
    # do work
    return "work done"

# a trace ID is a unique 128-bit integer representing a single trace
trace_id = 1234

with tracely.bind_to_trace(trace_id):
    process_request(...)

In this example, instead of creating a new trace ID for each event, all events will be attached to the existing trace with the given trace_id.

In this case, you manage the trace_id yourself, so you need to make sure it’s truly unique. If you reuse the same trace_id, all events will be joined, even if they don’t belong together.
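
Since the trace ID is a 128-bit integer, one straightforward way to generate a unique value is to derive it from a random UUID. A minimal sketch, reusing the process_request function from the example above:

import uuid

import tracely

# uuid4().int is a random 128-bit integer, so it is suitable as a trace ID
trace_id = uuid.uuid4().int

with tracely.bind_to_trace(trace_id):
    process_request("first question", session_id="session-1")
    process_request("second question", session_id="session-1")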