Tools
Tools are callable capabilities that let agents interact with the outside world: APIs, databases, blockchains, file systems, and any other external service. Without tools, an LLM can only generate text. With tools, it can take action.
Why Tools?
An LLM doesn't know today's Bitcoin price, can't send emails, and has no way to query your database. Tools bridge this gap:
SpoonOS tools are:
- Typed: JSON-schema parameters prevent the LLM from hallucinating invalid inputs
- Validated: runtime checks ensure data integrity before execution
- Async: non-blocking I/O for high-performance agent loops
- Composable: bundle tools into toolkits and share them via the MCP protocol
Tool Anatomy
Every SpoonOS tool has three parts:
| Part | Purpose | Example |
|---|---|---|
| name | Unique identifier the LLM uses to call the tool | "get_crypto_price" |
| description | Natural language explanation of what the tool does | "Get real-time price for a cryptocurrency" |
| parameters | JSON-schema defining expected inputs | {"symbol": {"type": "string"}} |
The LLM reads the description to decide when to use a tool and the parameters to know how to call it.
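These three parts map directly onto the OpenAI-style function-calling definition that gets sent to the model. A minimal sketch of that mapping, using the example values from the table (the exact export shape produced by spoon_ai may differ in detail):

```python
# Sketch: how a tool's name/description/parameters become an
# OpenAI-compatible function-calling definition.
tool = {
    "name": "get_crypto_price",
    "description": "Get real-time price for a cryptocurrency",
    "parameters": {
        "type": "object",
        "properties": {"symbol": {"type": "string"}},
        "required": ["symbol"],
    },
}

# OpenAI-style wrapper around the three parts.
spec = {"type": "function", "function": tool}

# The LLM reads `description` to decide *when* to call the tool,
# and `parameters` to know *how* to build valid arguments.
print(spec["function"]["name"])                    # get_crypto_price
print(spec["function"]["parameters"]["required"])  # ['symbol']
```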
What Can You Build?
| Tool Type | Examples |
|---|---|
| Data retrieval | Web search, database queries, API calls |
| Crypto/Web3 | CEX trading, DEX swaps, on-chain reads, wallet operations |
| File operations | Read/write files, parse documents, generate reports |
| Communication | Send emails, post to Slack, create tickets |
| Computation | Run code, execute SQL, perform calculations |
SpoonOS vs Other Tool Systems
| Feature | SpoonOS | LangChain | OpenAI Functions |
|---|---|---|---|
| Definition | BaseTool class | @tool decorator | JSON in API call |
| Validation | JSON-schema + runtime | Optional Pydantic | Server-side only |
| Remote tools | MCP protocol (stdio/SSE/WS) | API wrappers | N/A |
| Discovery | ToolManager + semantic search | load_tools() | Manual |
| Crypto native | Built-in CEX/DEX/on-chain | Third-party | N/A |
Quick Start
```bash
pip install spoon-ai-sdk
```
```python
import asyncio

from spoon_ai.tools.base import BaseTool
from spoon_ai.tools import ToolManager

# Define a tool with JSON-schema parameters
class GreetTool(BaseTool):
    name: str = "greet"
    description: str = "Greet someone by name"
    parameters: dict = {
        "type": "object",
        "properties": {"name": {"type": "string"}},
        "required": ["name"]
    }

    async def execute(self, name: str) -> str:
        return f"Hello, {name}!"

# Register and execute
manager = ToolManager([GreetTool()])

async def main():
    result = await manager.execute(name="greet", tool_input={"name": "World"})
    print(result)  # Hello, World!

asyncio.run(main())
```
Tool Types
Local Tools (BaseTool)
All tools inherit from BaseTool with three required attributes and one method:
```python
from spoon_ai.tools.base import BaseTool

class MyTool(BaseTool):
    name: str = "my_tool"  # Unique identifier
    description: str = "What this tool does"  # LLM reads this to decide when to use it
    parameters: dict = {  # JSON-schema for input validation
        "type": "object",
        "properties": {
            "arg1": {"type": "string", "description": "First argument"},
            "arg2": {"type": "integer", "default": 10}
        },
        "required": ["arg1"]
    }

    async def execute(self, arg1: str, arg2: int = 10) -> str:
        return f"Result: {arg1}, {arg2}"
```
The __call__ method forwards to execute(), so await tool(arg1="value") works.
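The forwarding behavior can be sketched with a minimal stand-in class (the real BaseTool lives in spoon_ai.tools.base and also performs schema validation):

```python
import asyncio

class MiniTool:
    """Minimal stand-in showing how __call__ can forward to execute()."""
    name = "greet"

    async def execute(self, name: str) -> str:
        return f"Hello, {name}!"

    def __call__(self, **kwargs):
        # Forward to execute(), so `await tool(name=...)` works.
        return self.execute(**kwargs)

async def main():
    tool = MiniTool()
    # Calling the instance directly is equivalent to calling execute().
    return await tool(name="World")

result = asyncio.run(main())
print(result)  # Hello, World!
```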
ToolManager
Orchestrates tool registration, lookup, and execution:
```python
from spoon_ai.tools import ToolManager

manager = ToolManager([MyTool(), AnotherTool()])

# Execute by name
result = await manager.execute(name="my_tool", tool_input={"arg1": "hello"})

# Get tool specs for LLM function calling
specs = manager.to_params()  # List of OpenAI-compatible tool definitions
```
Key methods:
- add_tool(tool) / add_tools([...]): register tools
- remove_tool(name): unregister by name
- get_tool(name): retrieve a tool instance
- to_params(): export OpenAI-compatible tool definitions
- index_tools() / query_tools(query): semantic search (requires Pinecone + OpenAI)
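The core dispatch pattern behind these methods is easy to see in a self-contained stand-in (the real ToolManager additionally handles semantic indexing and error wrapping):

```python
import asyncio

class EchoTool:
    """Illustrative tool used to exercise the manager below."""
    name = "echo"
    description = "Echo the input back"
    parameters = {
        "type": "object",
        "properties": {"text": {"type": "string"}},
        "required": ["text"],
    }

    async def execute(self, text: str) -> str:
        return text

class MiniToolManager:
    """Stand-in showing name-based registration, lookup, and dispatch."""
    def __init__(self, tools):
        self._tools = {t.name: t for t in tools}

    def get_tool(self, name):
        return self._tools[name]

    def to_params(self):
        # Export OpenAI-compatible tool definitions.
        return [
            {"type": "function",
             "function": {"name": t.name,
                          "description": t.description,
                          "parameters": t.parameters}}
            for t in self._tools.values()
        ]

    async def execute(self, name, tool_input):
        # Look up the tool by name and forward validated input.
        return await self._tools[name].execute(**tool_input)

manager = MiniToolManager([EchoTool()])
result = asyncio.run(manager.execute(name="echo", tool_input={"text": "hi"}))
print(result)  # hi
```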
Crypto toolkit (optional)
If you install spoon-toolkits, import the concrete tools you need:
```python
import asyncio

from spoon_toolkits import CryptoPowerDataPriceTool, CryptoPowerDataCEXTool
from spoon_ai.tools import ToolManager

crypto_tools = [
    CryptoPowerDataPriceTool(),
    CryptoPowerDataCEXTool(),
]
manager = ToolManager(crypto_tools)

async def main():
    result = await manager.execute(
        name="crypto_powerdata_price",
        tool_input={"source": "cex", "exchange": "binance", "symbol": "BTC/USDT"},
    )
    print(result)

asyncio.run(main())
```
Environment variables for these tools depend on the specific provider (e.g., OKX_API_KEY, BITQUERY_API_KEY, RPC_URL, etc.).
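Since the required keys vary by provider, it helps to check for them up front and fail with a clear message before constructing any tools. A sketch (the helper name and the demo value are illustrative, not part of spoon_ai):

```python
import os

def require_env(*names: str) -> dict:
    """Return the requested environment variables, failing loudly if any is missing."""
    missing = [n for n in names if not os.getenv(n)]
    if missing:
        raise EnvironmentError(
            f"Missing required environment variables: {', '.join(missing)}"
        )
    return {n: os.environ[n] for n in names}

# Example: an OKX-backed tool would need its API key before construction.
os.environ.setdefault("OKX_API_KEY", "demo-key")  # for illustration only
creds = require_env("OKX_API_KEY")
print(sorted(creds))  # ['OKX_API_KEY']
```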
MCP client tools (MCPTool)
MCPTool lets an agent call tools hosted on an MCP server.
```python
from spoon_ai.tools.mcp_tool import MCPTool

mcp_tool = MCPTool(
    mcp_config={
        "url": "http://localhost:8765",  # or ws://..., or command/args for stdio
        "transport": "sse",  # optional: "sse" (default) | "http"
        "timeout": 30,
        "max_retries": 3,
    }
)

# The tool's schema/description is fetched dynamically from the MCP server.
```
MCPTool.execute(...) will fetch the server's tool list, align the name/parameters, and perform retries and health checks.
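The retry side of that behavior can be approximated with a generic async retry helper. This is an illustrative pattern only, not SpoonOS's actual implementation:

```python
import asyncio

async def call_with_retries(fn, max_retries=3, base_delay=0.01):
    """Retry an async callable with exponential backoff, re-raising the last error."""
    last_exc = None
    for attempt in range(max_retries):
        try:
            return await fn()
        except ConnectionError as exc:
            last_exc = exc
            # Back off before the next attempt: base_delay, 2x, 4x, ...
            await asyncio.sleep(base_delay * 2 ** attempt)
    raise last_exc

# Simulated flaky MCP call: fails twice, then succeeds.
attempts = {"n": 0}

async def flaky_call():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("server unavailable")
    return "tool result"

result = asyncio.run(call_with_retries(flaky_call))
print(result, attempts["n"])  # tool result 3
```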
MCP clients
SpoonOS agents primarily use MCPTool (MCP client) to talk to remote MCP servers:
```python
import asyncio

from spoon_ai.tools.mcp_tool import MCPTool

# Example: connect to the DeepWiki SSE MCP server
deepwiki = MCPTool(
    name="read_wiki_structure",  # Use the actual tool name from the server
    description="DeepWiki MCP tool for repository analysis",
    mcp_config={
        "url": "https://mcp.deepwiki.com/sse",
        "transport": "sse",
        "timeout": 30,
    },
)

async def main():
    # Pre-load parameters to get the correct schema
    print("Loading MCP tool parameters...")
    await deepwiki.ensure_parameters_loaded()

    # Use the correct parameter name: repoName (not repo)
    result = await deepwiki.execute(repoName="XSpoonAi/spoon-core")
    print(f"\nResult:\n{result}")

asyncio.run(main())
```
If you need to self-host an MCP server, follow that server's own documentation; the cookbook focuses on the spoon_ai MCP client (MCPTool) rather than FastMCP server setup.
Configuration
- Core: none required for basic tools.
- Embedding index (optional): OPENAI_API_KEY, PINECONE_API_KEY.
- Crypto/toolkit tools: provider-specific keys (e.g., OKX_API_KEY, BITQUERY_API_KEY, RPC_URL, GOPLUSLABS_API_KEY).
- MCP: set the transport target via mcp_config (url, or command + args/env).
Best Practices
- Keep tools single-purpose with a clear parameters JSON schema.
- Validate inputs inside execute; raise rich errors for better agent feedback.
- Prefer async I/O in execute to avoid blocking the event loop.
- Reuse ToolManager for name-based dispatch and tool metadata generation.
- When using toolkit or MCP tools, fail gracefully if optional dependencies or servers are missing.
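The validation advice in practice: check inputs at the top of execute and raise errors descriptive enough for the agent to self-correct. A sketch with a hypothetical tool (the class and message format are illustrative):

```python
import asyncio

class PriceTool:
    """Illustrative tool that validates its input before doing any work."""
    name = "get_price"

    async def execute(self, symbol: str) -> str:
        # Validate early; a rich error gives the agent a chance to retry correctly.
        if not symbol or "/" not in symbol:
            raise ValueError(
                f"Invalid symbol {symbol!r}: expected a trading pair like 'BTC/USDT'"
            )
        return f"price for {symbol}"

async def main():
    tool = PriceTool()
    try:
        # Missing the quote currency: triggers the validation error.
        await tool.execute(symbol="BTC")
    except ValueError as exc:
        return str(exc)

error_message = asyncio.run(main())
print(error_message)
```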