LOOMAL
Mastra · TypeScript

Email and credentials for Mastra agents.

Mastra is TypeScript's most opinionated agent framework — workflows, memory, and tools as first-class primitives. Loomal plugs in as an MCP server, so your Mastra agent picks up email, vault, and TOTP with one registration.

MCP tool registration · Agent + workflow support · Email send / receive · Vault & TOTP · Memory-aware threads

Prerequisites

  • Loomal API key (free at console.loomal.ai)
  • Node.js 20+
  • Mastra 0.10+ (with MCPClient support)
  • An LLM provider key

Mastra's model is close to the Vercel AI SDK's — TypeScript-first, tool-based, but with heavier opinions around workflows and memory. It added MCP client support in mid-2025, which is the right integration point for Loomal.

Register Loomal as an MCP client at agent construction, and every mail and vault primitive becomes available to the agent's tool-calling loop. Threads survive across turns thanks to Mastra's memory, so you get stateful email conversations for free.

1. Create an identity and set environment

Provision a Loomal identity at console.loomal.ai. Put the API key and your LLM provider key in .env.

.env
LOOMAL_API_KEY=loid-your-api-key
OPENAI_API_KEY=sk-...

2. Install Mastra

Mastra ships as a handful of packages. You need the core package plus the MCP client and your chosen model provider.

shell
npm install @mastra/core @mastra/mcp @ai-sdk/openai zod

3. Register Loomal as an MCP client

MCPClient connects to the Loomal server as a subprocess over stdio. The client.getTools() call returns a typed tool object you pass directly to the Agent constructor.

The await on getTools is important — Mastra needs the resolved tool list at construction time, so use top-level await (or build the agent inside an async factory).

mastra/agents/support.ts
import { Agent } from "@mastra/core/agent";
import { MCPClient } from "@mastra/mcp";
import { openai } from "@ai-sdk/openai";

export const loomal = new MCPClient({
  servers: {
    loomal: {
      command: "npx",
      args: ["-y", "@loomal/mcp"],
      env: { LOOMAL_API_KEY: process.env.LOOMAL_API_KEY! },
    },
  },
});

export const supportAgent = new Agent({
  name: "support-agent",
  instructions: "Triage incoming support email. Reply to billing and password questions; label others 'needs-human'.",
  model: openai("gpt-4o-mini"),
  tools: await loomal.getTools(),
});

4. Run the agent with a workflow

Mastra's generate method is the simple entry point; for scheduled or recurring execution, wrap the call in a workflow step. Workflows also give you retries, timeouts, and error handling without bespoke code.

mastra/workflows/inbox-loop.ts
import { Workflow, Step } from "@mastra/core/workflow";
import { z } from "zod";
import { supportAgent } from "../agents/support";

export const inboxLoop = new Workflow({
  name: "inbox-loop",
  triggerSchema: z.object({}),
});

inboxLoop.step(new Step({
  id: "process-unread",
  execute: async () => {
    const result = await supportAgent.generate("Process unread support mail now.");
    return { processed: result.text };
  },
}));

inboxLoop.commit();

5. Use vault + memory for multi-turn flows

Mastra's memory lets the agent remember conversation context across turns. Combined with the vault, you can build multi-step flows that log into a service, fetch data, and email a summary — without the agent re-authenticating on every turn.

Pre-load credentials into the vault via the Loomal REST API, then refer to them by label in the agent's instructions.
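A one-off seeding script might look like the sketch below. The `/v1/vault` endpoint path and the payload shape are assumptions for illustration, not the confirmed Loomal API — check the API reference for the exact contract.

```typescript
// seed-vault.ts — run once before starting the agent.
// NOTE: endpoint path and body shape are illustrative, not confirmed API.

type VaultEntry = { label: string; value: string; kind: "secret" | "totp" };

function buildVaultEntry(
  label: string,
  value: string,
  kind: VaultEntry["kind"] = "secret",
): VaultEntry {
  return { label, value, kind };
}

async function seedVault(entries: VaultEntry[]): Promise<void> {
  for (const entry of entries) {
    const res = await fetch("https://api.loomal.ai/v1/vault", {
      method: "POST",
      headers: {
        Authorization: `Bearer ${process.env.LOOMAL_API_KEY}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify(entry),
    });
    if (!res.ok) {
      throw new Error(`vault write failed for ${entry.label}: ${res.status}`);
    }
  }
}

// await seedVault([
//   buildVaultEntry("stripe-key", process.env.STRIPE_KEY!),
//   buildVaultEntry("stripe-totp", process.env.STRIPE_TOTP_SEED!, "totp"),
// ]);
```

The labels you seed here ("stripe-key", "stripe-totp") are the same ones the agent's instructions reference below.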

mastra/agents/billing.ts
import { Agent } from "@mastra/core/agent";
import { Memory } from "@mastra/memory";
import { openai } from "@ai-sdk/openai";
import { loomal } from "./support"; // reuse the MCPClient from support.ts (make sure it's exported)

export const billingAgent = new Agent({
  name: "billing-agent",
  instructions:
    "To pull invoice data, use vault.get label 'stripe-key'. " +
    "If a 2FA code is requested, use vault.totp label 'stripe-totp'. " +
    "Never echo secret values back.",
  model: openai("gpt-4o-mini"),
  memory: new Memory(),
  tools: await loomal.getTools(),
});

Things to watch out for

Keep the MCPClient instance alive

Each MCPClient launches one subprocess. If you instantiate a new client on every request, you leak processes. Create it once at module scope and reuse.
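If module scope is awkward (for example in serverless-style codebases where init order is unclear), a lazy singleton makes the intent explicit. This is a generic sketch; `lazySingleton` is a helper introduced here, not part of Mastra or Loomal, and it assumes the factory returns a non-null value:

```typescript
// Generic lazy singleton: the factory runs at most once, so only one
// subprocess-backed client is ever created, however often it's requested.
// Assumes the factory returns a non-null, non-undefined value.
function lazySingleton<T>(create: () => T): () => T {
  let instance: T | undefined;
  return () => (instance ??= create());
}

// Usage with the MCPClient config from step 3 (spawns npx once, reuses thereafter):
// const getLoomalClient = lazySingleton(() => new MCPClient({ servers: { /* ... */ } }));
```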

Workflows need explicit commit

Mastra won't expose a workflow to the runtime until you call commit(). This is easy to miss when prototyping; the symptom is a runtime error saying the workflow does not exist.

FAQ

Can I limit which Loomal tools the Mastra agent sees?

Yes. MCPClient.getTools() returns a record keyed by tool name. Filter it to the subset you want before passing it to the Agent:

typescript
const tools = Object.fromEntries(
  Object.entries(await loomal.getTools())
    .filter(([name]) => !name.includes("delete"))
);

Does this work with Mastra's eval harness?

Yes. The agent is a regular Mastra Agent; evals call generate the same way production code does. Your evals will actually exercise the MCP tools — useful for catching schema drift.

Can I run this on Cloudflare Workers?

Not via stdio — Workers can't spawn subprocesses. Deploy the Loomal MCP server as a separate HTTP service and connect from the worker over SSE. Mastra's MCPClient supports both transports.
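A remote-transport config might look like the sketch below. The URL is a placeholder, and the exact server option shape for SSE can vary by Mastra version, so treat this as an assumption to verify against the MCPClient docs:

```typescript
// Remote MCP server over SSE instead of a stdio subprocess.
const remoteServers = {
  loomal: {
    // Placeholder URL — point this at your deployed Loomal MCP service.
    url: new URL("https://loomal-mcp.example.com/sse"),
  },
};

// const loomal = new MCPClient({ servers: remoteServers });
```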

Loomal primitives used

mail.send · mail.list_messages · mail.reply · vault.get · vault.totp

Ship it.

Free tier, no card. 30 seconds to first email.

Last updated: 2026-04-14 · See also: AutoGen, Claude Desktop, CrewAI