Generate an MCP Server
MCP Server generation is currently experimental, so we may not handle all types of OpenAPI spec yet. Contact support if you have any questions or feedback.
The Model Context Protocol (MCP) standardizes how applications provide context and actions to LLMs.
You can use Stainless to automatically generate an MCP Server that wraps your API so that AIs like Claude can retrieve data and take actions.
Enabling MCP Server generation
If you don't have a Stainless project yet, create one. MCP Servers are generated from TypeScript, so choose TypeScript as your first language.
Once you have a project, click "Add SDKs" and choose "MCP Server".
targets:
  typescript:
    package_name: my-org-name
    production_repo: null
    publish:
      npm: false
    options:
      mcp_server:
        package_name: my-org-name-mcp # this is the default
        enable_all_resources: true
The project will generate a subpackage within your TypeScript SDK at packages/mcp-server that can be independently published and imported by users.
Deploying
To deploy the MCP Server, publish your SDK by setting up a production repo and making a release.
The MCP server is published at the same time as your main TypeScript SDK and with the same version, but in a separate NPM package.
By default, the package name is <my-npm-package>-mcp, but you can customize it in the target options.
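For example, to publish the MCP server under a different NPM package name, set it in the same options block shown above (the name here is just an illustrative placeholder):

targets:
  typescript:
    options:
      mcp_server:
        package_name: my-custom-mcp-package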
Installation via Claude Desktop
Consult the README.md file within the packages/mcp-server directory of your TypeScript SDK for more specific instructions for your project.
See the Claude Desktop user guide for setup.
Once it's set up, find your claude_desktop_config.json file:
- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json
Add the following to your mcpServers section. Make sure to provide any necessary environment variables (like API keys) as well.
{
  "mcpServers": {
    "my_org_api": {
      "command": "npx",
      "args": ["-y", "my-org-mcp"],
      "env": {
        "MY_API_KEY": "123e4567-e89b-12d3-a456-426614174000"
      }
    }
  }
}
Customization
Choosing endpoints to expose
By default, all endpoints will be generated, but if some endpoints don't make sense for MCP, you can disable specific resources or endpoints.
Additionally, you can set enable_all_resources to false and opt in only specific resources or endpoints. Note: end-users can also filter which endpoints they import into their MCP client.
Example: opting in specific resources and endpoints
resources:
  my_resource:
    mcp: true # enable MCP generation for all endpoints in this resource
    methods: ...
  another_resource:
    methods:
      create:
        mcp: true # enable this endpoint for MCP generation
        endpoint: post /v1/create
      update: ...
Fine-tuning tool names and descriptions
By default, tool names and descriptions are generated from your OpenAPI spec and Stainless config. However, if you want to provide additional context to LLMs for certain tools, you can override these values:
resources:
  my_resource:
    methods:
      create:
        endpoint: post /v1/create
        mcp:
          tool_name: my_custom_tool_name
          description: |
            This is an LLM-specific tool description
Letting end-users filter which tools are imported
Sometimes with a large API, it doesn't make sense to import every tool into an MCP client at once (it can be confusing to have too many for an LLM to choose from, and context windows are limited). Therefore, end-users can flexibly filter which tools they want to import.
They can do this by providing additional arguments to your MCP server:
- --tool to include a specific tool by name
- --resource to include all tools under a specific resource; supports wildcards, e.g. my.resource*
- --operation to include just read (get/list) or just write operations
- --tag to include endpoints that have a custom tag associated with them
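For example, an end-user could import only the read-style tools from a single resource. This is a sketch reusing the my-org-mcp package name from the Claude Desktop example above; the resource name is illustrative, and it assumes --operation takes read or write as its value:

npx -y my-org-mcp --resource my_resource --operation read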
Configuring custom tags
There are no tags on tools by default, but if you want to provide a custom grouping of tools for end users to filter on, you can tag resources or endpoints:
resources:
  my_resource:
    mcp:
      tags:
        - my_custom_resource_tag
    methods:
      create:
        mcp:
          tags:
            - my_custom_endpoint_tag # also inherits my_custom_resource_tag
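End-users can then combine these tags with the --tag flag described above to import only the tagged tools, for example (tag name taken from the config above):

npx -y my-org-mcp --tag my_custom_resource_tag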
Deploy a Remote MCP Server
In the examples above, your MCP Server is a local program that accepts environment variables to handle authorization. This works great for developers using local clients like Claude Desktop or Cursor. However, this doesn't work well for end-users who might want to use MCP from a web-app (like Claude web).
"Remote" MCP Servers are a newer part of the Model Context Protocol spec, and solve this issue by specifying OAuth as the way an MCP client authorizes with a server. Client and server support for this is still nascent.
However, you can support this new functionality today by integrating your Stainless-generated tools into a Cloudflare Worker. The worker can integrate with existing OAuth providers, or even let you implement a custom flow in the worker itself.
Setup
Create a Cloudflare Worker using their remote MCP Server template.
You should have a worker class that looks something like this:
export class MyMCP extends McpAgent {
  server = new McpServer({
    name: "Demo",
    version: "1.0.0",
  });

  async init() {
    this.server.tool("add", { a: z.number(), b: z.number() }, async ({ a, b }) => ({
      content: [{ type: "text", text: String(a + b) }],
    }));
  }
}
To integrate Stainless-generated tools into your worker, import your MCP server and client library:
import { init, server } from "[my-package-mcp]/server";
import MyStainlessClient from "[my-package]";

export class MyMCP extends McpAgent {
  server = server; // set the server on the class instance

  async init() {
    // Instantiate your client with the values from the worker environment
    const client = new MyStainlessClient({
      authToken: this.env.MY_API_TOKEN,
    });

    // Initialize all the generated endpoints with the server
    init({ server: this.server, client });
  }
}
Next, for local testing, add your environment variables to your local .dev.vars file:
MY_API_TOKEN="123e4567-e89b-12d3-a456-426614174000"
Testing with the MCP Inspector
- Start your worker:
  $ yarn start
  You should see output like:
  [wrangler:inf] Ready on http://localhost:8787
- Run the MCP Inspector in a separate terminal:
  npx @modelcontextprotocol/inspector@latest
  You'll see output like:
  MCP Inspector is up and running at http://127.0.0.1:6274 🚀
- Visit http://127.0.0.1:6274 in a browser and select "SSE" as the Transport Type.
- Enter http://localhost:8787/sse, click "Connect", and then "List Tools" to see your generated tools.
Customizing
You can customize which endpoints get served from the Cloudflare Worker by importing Stainless's generated tools directly and filtering or modifying them.
import { init, server, endpoints } from "[my-package-mcp]/server";
import MyStainlessClient from "[my-package]";

export class MyMCP extends McpAgent {
  server = server;

  async init() {
    const client = new MyStainlessClient({
      authToken: this.env.MY_API_TOKEN,
    });

    // Filter generated endpoints by resource, tool, tag, etc.
    const filteredEndpoints = endpoints.filter((endpoint) =>
      endpoint.metadata.tags.includes("mytag"),
    );

    // Make custom tools
    const customTool = {
      tool: {
        name: "my-custom-tool",
        description: "A custom tool that does something",
        inputSchema: {
          type: "object" as const,
          properties: { a_property: { type: "string" } },
        },
      },
      handler: async (client: MyStainlessClient, args: any) => {
        return { a_value: "a_value" };
      },
    };

    // Provide them in the init function
    init({
      server: this.server,
      endpoints: [...filteredEndpoints, customTool],
      client,
    });
  }
}
Next Steps
MCP Server support and features are still quite new, so please reach out to us if you have any feedback or ideas for how we can make it work better for you!