Table of Contents
- spoon_ai.graph
- spoon_ai.graph.agent
- spoon_ai.graph.types
- spoon_ai.graph.checkpointer
- spoon_ai.graph.builder
- spoon_ai.graph.mcp_integration
- spoon_ai.graph.exceptions
- spoon_ai.graph.reducers
- spoon_ai.graph.decorators
- spoon_ai.graph.config
- spoon_ai.graph.engine
Module spoon_ai.graph
spoon_ai.graph package
Public facade for the graph engine. Import from here.
Module spoon_ai.graph.agent
GraphAgent implementation for the graph package.
Memory Objects​
class Memory()
Memory implementation with persistent storage
clear​
def clear()
Clear all messages and reset memory
add_message​
def add_message(msg)
Add a message to memory
get_messages​
def get_messages(limit: Optional[int] = None) -> List[Dict[str, Any]]
Get messages from memory
get_recent_messages​
def get_recent_messages(hours: int = 24) -> List[Dict[str, Any]]
Get messages from the last N hours
search_messages​
def search_messages(query: str, limit: int = 10) -> List[Dict[str, Any]]
Search messages containing the query
get_statistics​
def get_statistics() -> Dict[str, Any]
Get memory statistics
set_metadata​
def set_metadata(key: str, value: Any)
Set metadata
get_metadata​
def get_metadata(key: str, default: Any = None) -> Any
Get metadata
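A minimal usage sketch for the Memory API above. The method calls follow the signatures listed here; the no-argument constructor and the role/content dict passed to add_message are assumptions, not a documented schema.

```python
from spoon_ai.graph.agent import Memory

memory = Memory()  # constructor arguments, if any, are not covered in this reference

# Store a message; the dict shape here is an assumption.
memory.add_message({"role": "user", "content": "What is the BTC price today?"})

# Retrieve and inspect messages using the documented accessors.
recent = memory.get_recent_messages(hours=6)      # messages from the last 6 hours
matches = memory.search_messages("BTC", limit=5)  # messages containing "BTC"
stats = memory.get_statistics()                   # memory statistics

# Attach arbitrary metadata to the memory store.
memory.set_metadata("session_owner", "alice")
assert memory.get_metadata("session_owner") == "alice"

# Wipe everything when the conversation is over.
memory.clear()
```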
MockMemory Objects​
class MockMemory(Memory)
Backward-compatibility alias; now backed by the persistent Memory implementation
GraphAgent Objects​
class GraphAgent()
search_memory​
def search_memory(query: str, limit: int = 10) -> List[Dict[str, Any]]
Search memory for messages containing the query
get_recent_memory​
def get_recent_memory(hours: int = 24) -> List[Dict[str, Any]]
Get recent messages from memory
get_memory_statistics​
def get_memory_statistics() -> Dict[str, Any]
Get memory statistics
set_memory_metadata​
def set_memory_metadata(key: str, value: Any)
Set memory metadata
get_memory_metadata​
def get_memory_metadata(key: str, default: Any = None) -> Any
Get memory metadata
save_session​
def save_session()
Manually save current session
load_session​
def load_session(session_id: str)
Load a specific session
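A sketch of the memory and session helpers above, written as a function over an already-constructed GraphAgent, since the agent's constructor is not covered in this reference. The session id string is illustrative.

```python
from typing import Any, Dict, List
from spoon_ai.graph.agent import GraphAgent

def inspect_agent_memory(agent: GraphAgent) -> Dict[str, Any]:
    """Exercise the documented memory helpers on an existing GraphAgent."""
    hits: List[Dict[str, Any]] = agent.search_memory("portfolio rebalance", limit=5)
    recent = agent.get_recent_memory(hours=12)

    agent.set_memory_metadata("strategy", "dca")
    strategy = agent.get_memory_metadata("strategy", default="none")

    # Persist the current session; a previous one can be restored by id later,
    # e.g. agent.load_session("session-2024-06-01").
    agent.save_session()

    stats = agent.get_memory_statistics()
    return {"hits": hits, "recent": recent, "strategy": strategy, "stats": stats}
```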
Module spoon_ai.graph.types
Typed structures for the graph package.
Module spoon_ai.graph.checkpointer
In-memory checkpointer for the graph package.
InMemoryCheckpointer Objects​
class InMemoryCheckpointer()
iter_checkpoint_history​
def iter_checkpoint_history(
config: Dict[str, Any]) -> Iterable[CheckpointTuple]
Return checkpoint tuples for the specified thread, newest last.
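A sketch of walking checkpoint history. The no-argument constructor and the shape of the config dict (a thread_id under a configurable key) are assumptions borrowed from common checkpointer conventions; only iter_checkpoint_history and its ordering come from this reference.

```python
from spoon_ai.graph.checkpointer import InMemoryCheckpointer

checkpointer = InMemoryCheckpointer()

# Assumed config shape: the thread to inspect is addressed via a thread_id.
config = {"configurable": {"thread_id": "thread-1"}}

# Checkpoint tuples are yielded oldest first ("newest last" per the docstring).
for checkpoint_tuple in checkpointer.iter_checkpoint_history(config):
    print(checkpoint_tuple)
```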
Module spoon_ai.graph.builder
Declarative builders and helpers for SpoonAI graphs.
Intent Objects​
@dataclass
class Intent()
Result of intent analysis.
IntentAnalyzer Objects​
class IntentAnalyzer()
LLM-powered intent analyzer.
Core stays generic; concrete prompts/parsers are supplied by callers.
AdaptiveStateBuilder Objects​
class AdaptiveStateBuilder()
Construct initial graph state using query intent and optional parameters.
ParameterInferenceEngine Objects​
class ParameterInferenceEngine()
LLM delegator for parameter extraction.
Core keeps this generic; applications provide formatting/parsing via options.
NodeSpec Objects​
@dataclass
class NodeSpec()
Declarative node specification.
EdgeSpec Objects​
@dataclass
class EdgeSpec()
Declarative edge specification.
end​
Target node name, or a callable router.
ParallelGroupSpec Objects​
@dataclass
class ParallelGroupSpec()
Parallel group specification.
GraphTemplate Objects​
@dataclass
class GraphTemplate()
Complete declarative template for a graph.
DeclarativeGraphBuilder Objects​
class DeclarativeGraphBuilder()
Build StateGraph instances from declarative templates.
NodePlugin Objects​
class NodePlugin()
Pluggable node provider.
NodePluginSystem Objects​
class NodePluginSystem()
Registry and discovery for node plugins.
HighLevelGraphAPI Objects​
class HighLevelGraphAPI()
Convenience facade for building graphs per query.
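An illustrative sketch of the declarative flow: define node and edge specs, wrap them in a template, and hand the template to DeclarativeGraphBuilder. Apart from the class names and the documented end field on EdgeSpec, every field and method name below (name, handler, start, nodes, edges, entry_point, build) is a hypothetical placeholder; consult the dataclass definitions for the real ones.

```python
from spoon_ai.graph.builder import (
    NodeSpec, EdgeSpec, GraphTemplate, DeclarativeGraphBuilder,
)

async def fetch_data(state):
    return {"data": "..."}

async def analyze(state):
    return {"analysis": "..."}

# Field names below are assumptions for illustration only.
template = GraphTemplate(
    nodes=[
        NodeSpec(name="fetch_data", handler=fetch_data),
        NodeSpec(name="analyze", handler=analyze),
    ],
    edges=[
        # `end` may be a target node name or a callable router (documented above).
        EdgeSpec(start="fetch_data", end="analyze"),
    ],
    entry_point="fetch_data",
)

builder = DeclarativeGraphBuilder()
graph = builder.build(template)  # build() returning a StateGraph is an assumption
```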
Module spoon_ai.graph.mcp_integration
Utility classes for intelligent MCP tool discovery and configuration.
Core graph components no longer hard-code external tools; instead, user code registers tool specifications and optional transport/configuration details via these helpers.
MCPToolSpec Objects​
@dataclass
class MCPToolSpec()
Specification describing a desired MCP tool.
MCPConfigManager Objects​
class MCPConfigManager()
Centralised configuration loader for MCP tools.
MCPToolDiscoveryEngine Objects​
class MCPToolDiscoveryEngine()
Discover MCP tools based on registered intent mappings.
MCPIntegrationManager Objects​
class MCPIntegrationManager()
High level coordinator for MCP tool usage within graphs.
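A sketch of the intended flow: user code declares which MCP tools it wants and lets the managers discover and configure them. Apart from the class names, every field and method below is a hypothetical placeholder, not the actual API surface.

```python
from spoon_ai.graph.mcp_integration import (
    MCPToolSpec, MCPConfigManager, MCPIntegrationManager,
)

# Hypothetical fields; the real MCPToolSpec dataclass may differ.
spec = MCPToolSpec(
    name="web_search",
    description="Search the web for recent market news",
)

config_manager = MCPConfigManager()
integration = MCPIntegrationManager(config_manager)  # constructor args assumed

# Registration method name is illustrative only.
integration.register_tool_spec(spec)
```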
Module spoon_ai.graph.exceptions
Graph engine exception definitions (public within graph package).
Module spoon_ai.graph.reducers
Reducers and validators for the graph package.
Module spoon_ai.graph.decorators
Decorators and executor for the graph package.
Module spoon_ai.graph.config
Configuration primitives for the SpoonAI graph engine.
RouterConfig Objects​
@dataclass
class RouterConfig()
Controls how the graph chooses the next node after each execution step.
ParallelRetryPolicy Objects​
@dataclass
class ParallelRetryPolicy()
Retry policy for individual nodes inside a parallel group.
ParallelGroupConfig Objects​
@dataclass
class ParallelGroupConfig()
Controls how a parallel group executes and aggregates results.
quorum​
Floats in (0, 1] are treated as a ratio of the group's nodes; integers as an absolute count.
error_strategy​
One of fail_fast, collect_errors, or ignore_errors.
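A sketch of configuring a parallel group using the two fields documented above: quorum=0.5 requires half of the group's nodes to succeed, and collect_errors keeps going while recording failures. Other constructor fields are not shown; see add_parallel_group in the engine module for attaching this to a graph.

```python
from spoon_ai.graph.config import ParallelGroupConfig

# Half of the group (a float in (0, 1] is a ratio) must succeed; node failures are
# collected rather than aborting the group immediately.
group_config = ParallelGroupConfig(
    quorum=0.5,
    error_strategy="collect_errors",
)
```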
GraphConfig Objects​
@dataclass
class GraphConfig()
Top-level configuration applied to an entire graph instance.
Module spoon_ai.graph.engine
Graph engine: StateGraph, CompiledGraph, and interrupt API implementation.
BaseNode Objects​
class BaseNode(ABC, Generic[State])
Base class for all graph nodes
__call__​
@abstractmethod
async def __call__(state: State,
config: Optional[Dict[str, Any]] = None) -> Dict[str, Any]
Execute the node logic
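A minimal custom node following the abstract signature above: an async __call__ that receives the state (plus an optional config dict) and returns a dict of state updates.

```python
from typing import Any, Dict, Optional
from spoon_ai.graph.engine import BaseNode

class GreetingNode(BaseNode[Dict[str, Any]]):
    """Appends a greeting based on the user name found in state."""

    async def __call__(self,
                       state: Dict[str, Any],
                       config: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
        name = state.get("user_name", "there")
        # Return only the keys that should be merged back into the graph state.
        return {"greeting": f"Hello, {name}!"}
```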
RunnableNode Objects​
class RunnableNode(BaseNode[State])
Runnable node that wraps a function
__call__​
async def __call__(state: State,
config: Optional[Dict[str, Any]] = None) -> Dict[str, Any]
Execute the wrapped function
ToolNode Objects​
class ToolNode(BaseNode[State])
Tool node for executing tools
__call__​
async def __call__(state: State,
config: Optional[Dict[str, Any]] = None) -> Dict[str, Any]
Execute tools based on state
ConditionNode Objects​
class ConditionNode(BaseNode[State])
Conditional node for routing decisions
__call__​
async def __call__(state: State,
config: Optional[Dict[str, Any]] = None) -> Dict[str, Any]
Execute condition and return routing decision
interrupt​
def interrupt(data: Dict[str, Any]) -> Any
Interrupt execution and wait for human input.
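A sketch of pausing a node for human input via interrupt. The only contract used here is the documented one: interrupt(data) suspends execution and eventually returns the supplied human input; how the graph is resumed with that input is not covered in this reference.

```python
from spoon_ai.graph.engine import interrupt

async def approval_node(state):
    # Suspend the run and surface a payload describing what we need from the human.
    decision = interrupt({
        "question": "Approve the proposed trade?",
        "proposed_trade": state.get("proposed_trade"),
    })
    # When execution resumes, `decision` holds the human-provided value.
    return {"approved": bool(decision)}
```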
RouteRule Objects​
class RouteRule()
Advanced routing rule for automatic path selection
matches​
def matches(state: Dict[str, Any], query: str = "") -> bool
Check if this rule matches the current state/query
RunningSummary Objects​
@dataclass
class RunningSummary()
Rolling conversation summary used by the summarisation node.
SummarizationNode Objects​
class SummarizationNode(BaseNode[Dict[str, Any]])
Node that summarises conversation history before model invocation.
StateGraph Objects​
class StateGraph(Generic[State])
add_node​
def add_node(
node_name: str, node: Union[BaseNode[State],
Callable[[State], Any]]) -> "StateGraph"
Add a node to the graph
add_edge​
def add_edge(
start_node: str,
end_node: str,
condition: Optional[Callable[[State], bool]] = None) -> "StateGraph"
Add an edge. When a condition is provided, the edge becomes conditional.
add_conditional_edges​
def add_conditional_edges(start_node: str, condition: Callable[[State], str],
path_map: Dict[str, str]) -> "StateGraph"
Add conditional edges
set_entry_point​
def set_entry_point(node_name: str) -> "StateGraph"
Set the entry point
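A small end-to-end sketch of the builder methods above: register plain async functions as nodes, wire an unconditional edge and a conditional branch, and set the entry point. The StateGraph constructor argument (a state schema) is an assumption; everything else follows the signatures listed here.

```python
from spoon_ai.graph.engine import StateGraph

async def classify(state):
    return {"route": "analysis" if state.get("query") else "skip"}

async def analysis(state):
    return {"findings": f"analysed: {state.get('query')}"}

async def summarize(state):
    return {"result": state.get("findings", "nothing to analyse")}

graph = StateGraph(dict)  # constructor argument is an assumption
graph.add_node("classify", classify)
graph.add_node("analysis", analysis)
graph.add_node("summarize", summarize)

graph.set_entry_point("classify")
graph.add_edge("analysis", "summarize")
graph.add_conditional_edges(
    "classify",
    lambda state: state["route"],  # must return a key of the path map
    {"analysis": "analysis", "skip": "summarize"},
)
```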
add_tool_node​
def add_tool_node(tools: List[Any], name: str = "tools") -> "StateGraph"
Add a tool node
add_conditional_node​
def add_conditional_node(condition_func: Callable[[State], str],
name: str = "condition") -> "StateGraph"
Add a conditional node
add_parallel_group​
def add_parallel_group(
group_name: str,
nodes: List[str],
config: Optional[Union[Dict[str, Any], ParallelGroupConfig]] = None
) -> "StateGraph"
Add a parallel execution group
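Attaching a parallel group, continuing the graph sketch above and assuming the three fetch nodes have already been added. Either a ParallelGroupConfig instance or a plain dict is accepted for config; the dict keys mirror the fields documented in the config module.

```python
# fetch_prices, fetch_news and fetch_social are assumed to be registered nodes.
graph.add_parallel_group(
    "market_data",
    ["fetch_prices", "fetch_news", "fetch_social"],
    config={"quorum": 2, "error_strategy": "ignore_errors"},  # int quorum = absolute count
)
```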
add_routing_rule​
def add_routing_rule(source_node: str,
condition: Union[str, Callable[[State, str], bool]],
target_node: str,
priority: int = 0) -> "StateGraph"
Add an intelligent routing rule
get_state​
def get_state(
config: Optional[Dict[str, Any]] = None) -> Optional[StateSnapshot]
Fetch the latest (or specified) checkpoint snapshot for a thread.
get_state_history​
def get_state_history(
config: Optional[Dict[str, Any]] = None) -> Iterable[StateSnapshot]
Return all checkpoints for the given thread, ordered by creation time.
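A sketch of inspecting checkpoints through the graph itself, continuing the earlier sketch; the config shape mirrors the checkpointer example above and is an assumption.

```python
config = {"configurable": {"thread_id": "thread-1"}}

snapshot = graph.get_state(config)             # latest checkpoint for the thread, or None
for older in graph.get_state_history(config):  # all checkpoints, ordered by creation time
    print(older)
```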
add_pattern_routing​
def add_pattern_routing(source_node: str,
pattern: str,
target_node: str,
priority: int = 0) -> "StateGraph"
Add pattern-based routing rule
set_intelligent_router​
def set_intelligent_router(
router_func: Callable[[Dict[str, Any], str], str]) -> "StateGraph"
Set the intelligent router function
set_llm_router​
def set_llm_router(router_func: Optional[Callable[[Dict[str, Any], str],
str]] = None,
config: Optional[Dict[str, Any]] = None) -> "StateGraph"
Set the LLM-powered router function
Arguments:
- router_func - Custom LLM router function. If None, the default LLM router is used.
- config - Configuration for the LLM router (model, temperature, max_tokens, etc.).
enable_llm_routing​
def enable_llm_routing(
config: Optional[Dict[str, Any]] = None) -> "StateGraph"
Enable LLM-powered natural language routing
This automatically sets up LLM routing for the graph entry point.
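A sketch of the routing helpers, continuing the graph built above: a pattern-based rule for price queries plus LLM routing enabled from the entry point. The configuration keys shown (model, temperature, max_tokens) are the ones named in set_llm_router; the model name itself is illustrative.

```python
# Route price-related queries straight to the analysis node.
graph.add_pattern_routing("classify", r"(price|quote)", "analysis", priority=10)

# Fall back to LLM-powered natural-language routing from the entry point.
graph.enable_llm_routing(config={
    "model": "gpt-4o-mini",   # illustrative model name
    "temperature": 0.0,
    "max_tokens": 64,
})
```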
compile​
def compile(checkpointer: Optional[Any] = None) -> "CompiledGraph"
Compile the graph
get_graph​
def get_graph() -> Dict[str, Any]
Get graph structure for visualization/debugging
CompiledGraph Objects​
class CompiledGraph(Generic[State])
Compiled graph for execution
get_execution_metrics​
def get_execution_metrics() -> Dict[str, Any]
Get aggregated execution metrics
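Compiling and inspecting the graph built above, using only the calls documented in this reference; actually running the compiled graph is omitted because its invocation method is not listed here.

```python
from spoon_ai.graph.checkpointer import InMemoryCheckpointer

compiled = graph.compile(checkpointer=InMemoryCheckpointer())

structure = graph.get_graph()                # nodes/edges for visualisation or debugging
# ... after some executions ...
metrics = compiled.get_execution_metrics()   # aggregated execution metrics
print(metrics)
```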