Module spoon_ai.llm.providers.anthropic_provider

Anthropic Provider implementation for the unified LLM interface.

AnthropicProvider Objects

@register_provider("anthropic", [
    ProviderCapability.CHAT,
    ProviderCapability.COMPLETION,
    ProviderCapability.TOOLS,
    ProviderCapability.STREAMING
])
class AnthropicProvider(LLMProviderInterface)

Anthropic provider implementation.

initialize

async def initialize(config: Dict[str, Any]) -> None

Initialize the Anthropic provider with configuration.
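A plausible configuration dict for initialize; the key names below (api_key, model, max_tokens, temperature) are assumptions for illustration, not confirmed by this reference.

```python
# Hypothetical config for AnthropicProvider.initialize().
# Key names are assumptions, not confirmed by this reference.
config = {
    "api_key": "sk-ant-...",              # Anthropic API key (placeholder)
    "model": "claude-3-5-sonnet-latest",  # assumed model identifier
    "max_tokens": 1024,
    "temperature": 0.3,
}
# With a real provider instance:
# await provider.initialize(config)
```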

get_cache_metrics

def get_cache_metrics() -> Dict[str, int]

Get current cache performance metrics.
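The Dict[str, int] return value can drive a simple hit-rate calculation; the key names below are assumptions standing in for whatever the provider actually reports.

```python
# Hypothetical metrics as returned by get_cache_metrics();
# the key names "cache_hits"/"cache_misses" are assumptions.
metrics = {"cache_hits": 42, "cache_misses": 8}

total = metrics["cache_hits"] + metrics["cache_misses"]
hit_rate = metrics["cache_hits"] / total if total else 0.0  # 0.84 here
```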

chat

async def chat(messages: List[Message], **kwargs) -> LLMResponse

Send chat request to Anthropic.
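A minimal sketch of building the messages argument; the Message fields (role, content) are assumptions based on common chat-message shapes, and the stand-in dataclass below is not the real spoon_ai type.

```python
from dataclasses import dataclass

# Stand-in for spoon_ai's Message type; field names are assumptions.
@dataclass
class Message:
    role: str
    content: str

messages = [
    Message(role="system", content="You are a concise assistant."),
    Message(role="user", content="Summarize this provider in one line."),
]
# With a real provider instance:
# response = await provider.chat(messages, temperature=0.2)
```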

chat_stream

async def chat_stream(messages: List[Message],
                      callbacks: Optional[List] = None,
                      **kwargs) -> AsyncIterator[LLMResponseChunk]

Send streaming chat request to Anthropic with callback support.

Yields:

  • LLMResponseChunk - Structured streaming response chunks
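Consuming the AsyncIterator looks like the sketch below; the stub generator and the content field on LLMResponseChunk are assumptions standing in for the real provider call.

```python
import asyncio
from dataclasses import dataclass
from typing import AsyncIterator

# Stand-in for LLMResponseChunk; the `content` field is an assumption.
@dataclass
class LLMResponseChunk:
    content: str

# Stub simulating provider.chat_stream(); the real method streams
# from the Anthropic API and also fires any registered callbacks.
async def fake_chat_stream() -> AsyncIterator[LLMResponseChunk]:
    for piece in ["Hello", ", ", "world"]:
        yield LLMResponseChunk(content=piece)

async def collect() -> str:
    parts = []
    async for chunk in fake_chat_stream():
        parts.append(chunk.content)
    return "".join(parts)

text = asyncio.run(collect())  # "Hello, world"
```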

completion

async def completion(prompt: str, **kwargs) -> LLMResponse

Send completion request to Anthropic.

chat_with_tools

async def chat_with_tools(messages: List[Message], tools: List[Dict],
                          **kwargs) -> LLMResponse

Send chat request with tools to Anthropic.
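Each entry of tools is a plain dict; the Anthropic-native shape below (name, description, input_schema) is an assumption about what chat_with_tools forwards to the API.

```python
# Tool definition in Anthropic's native format; whether chat_with_tools
# expects exactly this shape is an assumption.
get_weather_tool = {
    "name": "get_weather",
    "description": "Get the current weather for a city.",
    "input_schema": {  # JSON Schema describing the tool's arguments
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name"},
        },
        "required": ["city"],
    },
}
tools = [get_weather_tool]
# With a real provider instance:
# response = await provider.chat_with_tools(messages, tools)
```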

get_metadata

def get_metadata() -> ProviderMetadata

Get Anthropic provider metadata.

health_check

async def health_check() -> bool

Check if Anthropic provider is healthy.
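One way to use the boolean result is simple failover between providers; StubProvider below is a stand-in for AnthropicProvider, whose real health_check() presumably probes the Anthropic API.

```python
import asyncio

# Stub standing in for AnthropicProvider.
class StubProvider:
    def __init__(self, healthy: bool):
        self.healthy = healthy

    async def health_check(self) -> bool:
        return self.healthy

async def pick(primary, fallback):
    # Route traffic to the primary provider only while it reports healthy.
    return primary if await primary.health_check() else fallback

primary, fallback = StubProvider(False), StubProvider(True)
chosen = asyncio.run(pick(primary, fallback))  # falls back here
```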

cleanup

async def cleanup() -> None

Clean up Anthropic provider resources.
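cleanup pairs with initialize; a try/finally lifecycle guarantees resources are released even when a request fails. StubProvider is a stand-in for the real class.

```python
import asyncio

# Stand-in for AnthropicProvider; the real cleanup() would close
# HTTP clients/connections the provider holds.
class StubProvider:
    def __init__(self):
        self.closed = False

    async def initialize(self, config):
        self.config = config

    async def cleanup(self):
        self.closed = True

async def run_session() -> "StubProvider":
    provider = StubProvider()
    await provider.initialize({"api_key": "sk-ant-..."})  # placeholder key
    try:
        pass  # issue chat()/completion() calls here
    finally:
        await provider.cleanup()
    return provider

provider = asyncio.run(run_session())
```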