Module spoon_ai.runnables
Runnable interface and utilities for composable AI components.
This module provides the foundational Runnable interface that all Spoon AI components implement, enabling streaming, composition, and standardized execution.
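The full interface lives in spoon_ai.runnables and is documented below; as a rough, self-contained sketch of the pattern (the `ainvoke`/`astream` method names and the single-chunk default stream here are illustrative assumptions, not the library's actual API):

```python
import asyncio
from abc import ABC, abstractmethod
from typing import AsyncIterator, Generic, TypeVar

Input = TypeVar("Input")
Output = TypeVar("Output")

class Runnable(ABC, Generic[Input, Output]):
    """Minimal stand-in for the Runnable pattern: one required async
    entry point plus a default streaming implementation built on it."""

    @abstractmethod
    async def ainvoke(self, input: Input) -> Output: ...

    async def astream(self, input: Input) -> AsyncIterator[Output]:
        # Default: degrade gracefully to a single-chunk "stream".
        yield await self.ainvoke(input)

class Upper(Runnable[str, str]):
    async def ainvoke(self, input: str) -> str:
        return input.upper()

async def main() -> list:
    return [chunk async for chunk in Upper().astream("spoon")]

result = asyncio.run(main())  # ["SPOON"]
```

Because streaming has a default in terms of the single entry point, a concrete component only has to implement one method to participate in composition.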
Module spoon_ai.runnables.events
StreamEventBuilder Objects
class StreamEventBuilder()
chain_start
@staticmethod
def chain_start(run_id: UUID, name: str, inputs: Any,
                **kwargs: Any) -> StreamEvent
Build a chain start event.
chain_stream
@staticmethod
def chain_stream(run_id: UUID, name: str, chunk: Any,
                 **kwargs: Any) -> StreamEvent
Build a chain stream event.
chain_end
@staticmethod
def chain_end(run_id: UUID, name: str, output: Any,
              **kwargs: Any) -> StreamEvent
Build a chain end event.
chain_error
@staticmethod
def chain_error(run_id: UUID, name: str, error: Exception,
                **kwargs: Any) -> StreamEvent
Build a chain error event.
llm_stream
@staticmethod
def llm_stream(run_id: UUID,
               name: str,
               token: str,
               chunk: Optional[Any] = None,
               **kwargs: Any) -> StreamEvent
Build an LLM stream event.
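The concrete `StreamEvent` structure is not shown on this page. As an illustration only, a builder of this shape might produce a tagged mapping carrying the run metadata plus event-specific data; every field name below (`event`, `data`, the `on_chain_start` tag) is an assumption, not the library's actual schema:

```python
import uuid
from typing import Any, Dict

# Assumed shape; the real StreamEvent type lives in spoon_ai.runnables.
StreamEvent = Dict[str, Any]

def chain_start(run_id: uuid.UUID, name: str, inputs: Any,
                **kwargs: Any) -> StreamEvent:
    # Sketch of the builder pattern: a tagged event with run metadata,
    # the event payload under "data", and pass-through extras.
    return {
        "event": "on_chain_start",
        "run_id": str(run_id),
        "name": name,
        "data": {"input": inputs},
        **kwargs,
    }

rid = uuid.uuid4()
ev = chain_start(rid, "my_chain", {"query": "hi"}, tags=["demo"])
```

Centralizing event construction in static builders keeps the envelope (run id, name, event tag) consistent across every component that emits events.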
Module spoon_ai.runnables.base
log_patches_from_events
async def log_patches_from_events(
        event_iter: AsyncIterator[Dict[str, Any]],
        *,
        diff: bool = True) -> AsyncIterator[RunLogPatch]
Convert a stream of events into run log patches.
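A rough sketch of what such a conversion can look like, assuming JSON-Patch-style ops and LangChain-style event names (`on_chain_stream`, `streamed_output`, and the op format are all assumptions about shapes this page does not specify):

```python
import asyncio
from typing import Any, AsyncIterator, Dict

async def fake_events() -> AsyncIterator[Dict[str, Any]]:
    # Tiny stand-in event stream; shapes are illustrative assumptions.
    yield {"event": "on_chain_start", "name": "c"}
    yield {"event": "on_chain_stream", "name": "c", "chunk": "hel"}
    yield {"event": "on_chain_stream", "name": "c", "chunk": "lo"}
    yield {"event": "on_chain_end", "name": "c", "output": "hello"}

async def patches_from_events(events: AsyncIterator[Dict[str, Any]],
                              *, diff: bool = True):
    # Sketch: stream chunks become "append" ops on the running log,
    # the final output becomes a "replace" op.
    state: Dict[str, Any] = {"streamed_output": [], "final_output": None}
    async for ev in events:
        if ev["event"] == "on_chain_stream":
            state["streamed_output"].append(ev["chunk"])
            op = {"op": "add", "path": "/streamed_output/-",
                  "value": ev["chunk"]}
            yield [op] if diff else dict(state)
        elif ev["event"] == "on_chain_end":
            state["final_output"] = ev["output"]
            op = {"op": "replace", "path": "/final_output",
                  "value": ev["output"]}
            yield [op] if diff else dict(state)

async def main():
    return [p async for p in patches_from_events(fake_events())]

patches = asyncio.run(main())
```

With `diff=True` each yielded item is an incremental patch; with `diff=False` a consumer would instead receive the accumulated state at each step.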
Runnable Objects
class Runnable(ABC, Generic[Input, Output])
astream_log
async def astream_log(input: Input,
                      config: Optional[RunnableConfig] = None,
                      *,
                      diff: bool = True) -> AsyncIterator[RunLogPatch]
Asynchronously stream structured log patches derived from execution events.
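If, as assumed above, each `RunLogPatch` carries JSON-Patch-style ops when `diff=True`, a consumer rebuilds the full run log by applying the patches in order. The helper below is a hypothetical sketch of that application step, not a spoon_ai API:

```python
from typing import Any, Dict, List

def apply_patch(log: Dict[str, Any], ops: List[Dict[str, Any]]) -> Dict[str, Any]:
    # Minimal patch application: "/key/-" appends to a list, anything
    # else replaces a top-level field. Real RunLogPatch semantics may differ.
    for op in ops:
        key = op["path"].lstrip("/").split("/")[0]
        if op["path"].endswith("/-"):
            log.setdefault(key, []).append(op["value"])
        else:
            log[key] = op["value"]
    return log

log: Dict[str, Any] = {}
for ops in ([{"op": "add", "path": "/streamed_output/-", "value": "hi"}],
            [{"op": "replace", "path": "/final_output", "value": "hi"}]):
    apply_patch(log, ops)
# log == {"streamed_output": ["hi"], "final_output": "hi"}
```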
astream_events
async def astream_events(
        input: Input,
        config: Optional[RunnableConfig] = None
) -> AsyncIterator[Dict[str, Any]]
Asynchronously stream structured execution events.
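A typical consumer iterates the event stream and filters for the event kinds it cares about. The sketch below simulates the stream with a local generator (`demo_events` stands in for `runnable.astream_events(input)`, and the `on_llm_stream`/`token` field names are assumptions consistent with the `llm_stream` builder above):

```python
import asyncio
from typing import Any, AsyncIterator, Dict

async def demo_events() -> AsyncIterator[Dict[str, Any]]:
    # Stand-in for `runnable.astream_events(input)`.
    for ev in ({"event": "on_chain_start", "name": "agent"},
               {"event": "on_llm_stream", "name": "llm", "token": "Hi"},
               {"event": "on_llm_stream", "name": "llm", "token": "!"},
               {"event": "on_chain_end", "name": "agent", "output": "Hi!"}):
        yield ev

async def collect_tokens(events: AsyncIterator[Dict[str, Any]]) -> str:
    # Filter for LLM token events and concatenate them.
    parts = []
    async for ev in events:
        if ev["event"] == "on_llm_stream":
            parts.append(ev["token"])
    return "".join(parts)

text = asyncio.run(collect_tokens(demo_events()))  # "Hi!"
```

The same loop pattern works for progress bars, token-by-token UIs, or tracing: ignore the event kinds you do not need and react to the ones you do.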