Module spoon_ai.callbacks.statistics
StreamingStatisticsCallback Objects
class StreamingStatisticsCallback(BaseCallbackHandler, LLMManagerMixin)
Collect simple throughput statistics during streaming runs.
By default, the callback prints summary metrics when the LLM finishes.
Consumers can provide a custom print_fn to redirect output, or disable
printing entirely and read the public attributes after execution.
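The behaviour described above can be sketched with a minimal, self-contained stand-in. The attribute and parameter names here (`print_fn`, `token_count`, `elapsed`, and the `on_llm_*` hooks) are assumptions for illustration, patterned on common callback-handler APIs; consult the spoon_ai source for the actual interface.

```python
import time

class StreamingStatisticsCallback:
    """Illustrative stand-in, not the spoon_ai implementation.

    Collects simple throughput statistics during a streaming run and,
    by default, prints a summary when the LLM finishes.
    """

    def __init__(self, print_fn=print, enable_print=True):
        self.print_fn = print_fn          # consumers may redirect output here
        self.enable_print = enable_print  # or disable printing entirely
        self.token_count = 0              # public attribute, readable after the run
        self.elapsed = 0.0
        self._start = None

    def on_llm_start(self, *args, **kwargs):
        self.token_count = 0
        self._start = time.monotonic()

    def on_llm_new_token(self, token, *args, **kwargs):
        self.token_count += 1

    def on_llm_end(self, *args, **kwargs):
        self.elapsed = time.monotonic() - self._start
        if self.enable_print:
            tps = self.token_count / self.elapsed if self.elapsed else 0.0
            self.print_fn(
                f"{self.token_count} tokens in {self.elapsed:.2f}s ({tps:.1f} tok/s)"
            )

# Redirect the summary into a list instead of stdout.
lines = []
cb = StreamingStatisticsCallback(print_fn=lines.append)
cb.on_llm_start()
for tok in ["Hello", ",", " world"]:
    cb.on_llm_new_token(tok)
cb.on_llm_end()
```

After the run, `cb.token_count` holds 3 and `lines` holds the single summary string; passing `enable_print=False` would suppress the summary while still leaving the public attributes readable.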