# Vercel AI SDK Integration
Zod-validated tool definitions for the Vercel AI SDK. Works with `generateText`, `streamText`, and any AI provider.
## Quick Start
Import `getTools` from `deadsimple/ai`, pass your API key, and hand the resulting tools to any Vercel AI SDK function. The tools are Zod-validated and work with every supported AI provider -- OpenAI, Anthropic, Google, Mistral, and more.
```typescript
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { getTools } from "deadsimple/ai";

const tools = getTools("dse_your_api_key");

const result = await generateText({
  model: openai("gpt-4o"),
  tools,
  prompt: "Create an inbox and send a test email to hello@example.com",
});
```
That's it. The model will automatically call `createInbox`, then `sendEmail`, using the tool definitions to understand which parameters are required.
## Available Tools
The `getTools()` export returns a tools object containing the following Zod-validated tool definitions:
| Tool | Description |
|---|---|
| `createInbox` | Create a new email inbox with an optional display name |
| `listInboxes` | List all inboxes on the account |
| `sendEmail` | Send an email from an inbox (to, subject, text/HTML body) |
| `readEmails` | List messages in an inbox with optional filters |
| `readEmail` | Read the full content of a single message |
| `replyToEmail` | Reply to an existing message, preserving the thread |
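To sanity-check your API key before involving a model, you can invoke a tool directly. This is a minimal sketch under the assumption that `getTools` returns standard Vercel AI SDK tool objects, each exposing an `execute(args, options)` function; the `name` argument is illustrative, so check the tool's Zod schema for the actual shape.

```typescript
import { getTools } from "deadsimple/ai";

const tools = getTools("dse_your_api_key");

// Invoke a tool directly, bypassing the model -- handy for smoke tests.
// The argument shape here is assumed, not documented behavior.
const inbox = await tools.createInbox.execute(
  { name: "Support" },
  { toolCallId: "manual-test", messages: [] },
);
console.log(inbox);
```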
## Streaming Example
Tools work identically with `streamText`. The model streams its response while tool calls execute in the background:
```typescript
import { streamText } from "ai";
import { anthropic } from "@ai-sdk/anthropic";
import { getTools } from "deadsimple/ai";

const tools = getTools("dse_your_api_key");

const stream = streamText({
  model: anthropic("claude-sonnet-4-20250514"),
  tools,
  prompt: "Check my inbox inb_abc123 for new messages and summarize them",
});

for await (const chunk of stream.textStream) {
  process.stdout.write(chunk);
}
```
## Raw OpenAI Function Definitions
If you're using the OpenAI SDK directly (without the `ai` package), the `getFunctionDefinitions()` export gives you raw function definitions compatible with OpenAI's function-calling format:
```typescript
import { getFunctionDefinitions } from "deadsimple/ai";
import OpenAI from "openai";

const openai = new OpenAI();
const { tools, execute } = getFunctionDefinitions("dse_your_api_key");

const response = await openai.chat.completions.create({
  model: "gpt-4o",
  tools,
  messages: [{ role: "user", content: "Create an inbox called Support" }],
});

// Execute the tool call returned by the model
const toolCall = response.choices[0].message.tool_calls[0];
const result = await execute(toolCall.function.name, toolCall.function.arguments);
```
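Under the hood, an `execute` helper of this kind is essentially JSON parsing plus a name-to-handler lookup. Here is a self-contained sketch of that dispatch logic with a stubbed handler standing in for real API calls; the handler name and return shape are illustrative, not the actual deadsimple implementation.

```typescript
// Route a model-issued tool call to a handler. Models return tool
// arguments as a JSON string, so we parse before dispatching.
type ToolHandler = (args: Record<string, unknown>) => Promise<unknown>;

function makeExecute(handlers: Record<string, ToolHandler>) {
  return async (name: string, rawArgs: string): Promise<unknown> => {
    const handler = handlers[name];
    if (!handler) {
      throw new Error(`Unknown tool: ${name}`);
    }
    return handler(JSON.parse(rawArgs));
  };
}

// Stubbed handler in place of a real createInbox API call.
const execute = makeExecute({
  createInbox: async (args) => ({ id: "inb_stub", name: args.name }),
});
```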
This approach works with any OpenAI-compatible API, including local models served via Ollama, vLLM, or LiteLLM.
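For example, pointing the OpenAI SDK at a local Ollama server only requires changing the base URL. A minimal sketch, assuming Ollama's default port and a locally pulled model that supports tool calling (the model name is illustrative):

```typescript
import { getFunctionDefinitions } from "deadsimple/ai";
import OpenAI from "openai";

// Ollama exposes an OpenAI-compatible endpoint on port 11434 by default.
// An apiKey is required by the SDK but ignored by Ollama.
const ollama = new OpenAI({
  baseURL: "http://localhost:11434/v1",
  apiKey: "ollama",
});

const { tools } = getFunctionDefinitions("dse_your_api_key");

const response = await ollama.chat.completions.create({
  model: "llama3.1", // any locally pulled model with tool-calling support
  tools,
  messages: [{ role: "user", content: "List my inboxes" }],
});
```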
## Multi-Step Agent Loops
For agents that need to take multiple actions, set `maxSteps` to allow the model to chain tool calls automatically:
```typescript
const result = await generateText({
  model: openai("gpt-4o"),
  tools,
  maxSteps: 5,
  prompt:
    "Create an inbox, send a welcome email to test@example.com, then list all messages in that inbox",
});

// The model will: createInbox -> sendEmail -> readEmails
console.log(result.text);
```
## Related Integrations
- TypeScript SDK -- the underlying client that powers these tools
- MCP Server -- connect via Model Context Protocol for Claude Desktop and Cursor
- API Reference -- full endpoint documentation