MCP -> agent skills -> Cloudflare's discovery RFC. All in ~1 year. Agent experience (AX) is evolving fast, and here’s a practical starting point for API providers. https://lnkd.in/eQhV7f_z
Stainless
Software Development
New York, NY 5,300 followers
Quality fittings for your REST API.
About us
Stainless helps you deliver world-class developer interfaces for your API. Generate robust, idiomatic SDKs in popular programming languages, token-efficient MCP servers, and SDK-native documentation that evolves with your OpenAPI spec. Stainless is trusted by some of the most forward-thinking API teams in the world, including Anthropic, Cloudflare, Nvidia, OpenAI, and many more.
- Website
- https://stainless.com
- Industry
- Software Development
- Company size
- 51-200 employees
- Headquarters
- New York, NY
- Type
- Privately Held
- Founded
- 2022
Locations
- Primary
180 Varick St
1502
New York, NY 10014, US
Updates
Stainless reposted this
This was a fun little 🚢 I use Claude Code extensively in my day-to-day development workflow. Last week I was working on a feature that uses the Stainless CLI, and Claude was having a really hard time figuring out how to work with it, so I built this skill. After installing it, Claude stopped fumbling with the CLI (no more invalid commands, no more combing through the `--help` text for every command) and just did what I needed it to do. The skill made Claude Code work so well with the Stainless CLI that we decided to release it to everyone. `stl skills` will install the skill to your `.claude/skills` folder if you use Claude Code, or `.agents/skills` if you use another agent. You can also run `npx skills add stainless-api/stainless-api-cli` if you typically use the skills npm package for managing your skills. If you use Stainless to generate your SDKs, try it out with the Stainless CLI in your agent and let me know what you think! Docs on the CLI quickstart below: https://lnkd.in/eE6iVaGu
The Stainless CLI now supports agent skills 🤖 Run `stl skills` and agents like Claude Code and Codex can generate SDKs, trigger builds, manage branches, lint configs, and more. https://lnkd.in/guaeG5xt
On April 2, we're teaming up with Keycard, OpenAI, Vercel, and Google for AX Night. Join us in SF for live demos and Q&A on everything your API needs for agents to integrate successfully. MCP (code mode), agent skills, CLIs, and more. 🎟️ RSVP: https://luma.com/lu4qcfd4
Big news… Stainless is expanding into rally racing! 🏎️ Just kidding. We recently worked with an old friend from the Stripe days, richö, on a security project. Instead of a typical payment, he asked if we’d sponsor his rally car (hard to say no to that). If you happen to see the Stainless logo flying down a gravel stage somewhere, now you know why.
Stainless reposted this
The best way to think about AI agents right now: they're only as useful as the tools they can reach. An agent with shell access but no way to talk to your infrastructure is just a fancy autocomplete. We just shipped a rebuilt Courier CLI that gives developers and AI agents full access to Courier from the terminal. Send messages across any channel, manage users and routing, check delivery status, the entire API. Whether you're adding notifications to a new app or managing an existing integration, everything is faster now. Your agent can handle Courier the same way it runs any other terminal command. Works with Cursor, Claude Code, or anywhere you have a shell. `npm install -g @trycourier/cli`, set your API key, go. Big shoutout to the folks at Stainless for making the launch of this new and improved CLI a piece of cake. Full walkthrough on the blog: https://lnkd.in/ejKne8fA
The Stainless Docs Platform is now in public beta! Generate API reference, SDK examples, and prose guides from the same OpenAPI spec that powers your SDKs. It’s built for the next era of API platforms where DX and AX matter equally. At its core, Stainless Docs is built around a few key ideas: - SDK-native examples with real method signatures and types - Docs-as-code workflow with Git editing and preview builds - Fast, reliable docs via static rendering and edge hosting Since early access, we’ve also added a number of new features, including: 🖼️ Automatic Open Graph images for every page 🤖 Conversational docs search powered by your API reference and guides 🎨 A redesigned theming system so docs match your product’s brand 🔎 Preview deployments for every docs pull request The Stainless Docs Platform is already used by teams like Anthropic, OpenAI, Beeper, DigitalOcean, Letta, and Profound. If you already generate SDKs with Stainless, enabling docs takes a single configuration change. Simply connect your OpenAPI spec and deploy your docs site in seconds. https://lnkd.in/eXjqkaKx
You can now attach additional files to your GitHub releases when publishing SDKs 📦 Configure `release_assets` to upload bundled artifacts, compiled outputs, or extra documentation. https://lnkd.in/eEsymBWu
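For illustration, a `release_assets` entry might look like the sketch below; the exact schema is an assumption here, so check the linked docs for the real field names.

```yaml
# Hypothetical sketch only; the actual release_assets schema may differ.
release_assets:
  - dist/my-sdk-bundle.tgz   # bundled artifact
  - docs/extra-guide.pdf     # extra documentation
```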
99% reduction in context tokens (100k → ~1k) 🤯 Dodo Payments just shipped a new MCP server with Stainless using Code Mode. Instead of exposing dozens of API endpoints as tools, agents now use just two: - docs_search - execute_code (runs the SDK in a sandboxed TypeScript runtime) Fewer round trips, higher accuracy, and agents can integrate using the typed SDK instead of raw endpoints.
We rebuilt how AI agents talk to Dodo Payments. Here's why. When we first shipped our MCP server, we did what everyone does. One tool per endpoint. It worked fine for simple stuff. Then agents started trying to do real workflows. Create a subscription, apply a discount, check credit balance, issue an invoice. Four tools. Four round trips. Four model inferences. Tens of thousands of tokens burned on tool definitions alone. Code Mode changes this completely. Instead of exposing hundreds of tools, we expose two: search and execute. The agent searches our docs to understand the API, then writes TypeScript that runs against our SDK in a secure sandbox. One script. One execution. The entire workflow happens server-side. What this means in practice: -> 99% reduction in tokens used for tool definitions -> Complex multi-step operations complete in a single invocation -> API keys never appear in prompts (server-side injection) -> Sandboxed V8 execution with no filesystem or network access -> New API endpoints work automatically, zero maintenance Big shoutout to the team at Stainless for making this possible. The MCP ecosystem is growing fast but most servers still follow the one-tool-per-endpoint pattern. That approach doesn't scale. If you're building for agents that need to do real work, Code Mode is the architecture to adopt. Detailed blog post link in the first comment.
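The two-tool pattern described above can be sketched as follows. Everything here is illustrative: the tool names (`docsSearch`, `executeCode`), the stub SDK, and the toy doc index are assumptions, not the actual Dodo Payments or Stainless implementation.

```typescript
// Stub "SDK" standing in for a generated, typed client.
// In the real setup this would be the actual SDK, with the API key
// injected server-side so it never appears in the prompt.
const sdk = {
  subscriptions: {
    create: async (params: { customerId: string; plan: string }) => ({
      id: "sub_123",
      ...params,
    }),
  },
  invoices: {
    issue: async (subscriptionId: string) => ({ id: "inv_456", subscriptionId }),
  },
};

// Tool 1: docs search (here, a toy keyword match over method signatures).
const docs = [
  "subscriptions.create({ customerId, plan }) -> Subscription",
  "invoices.issue(subscriptionId) -> Invoice",
];
function docsSearch(query: string): string[] {
  return docs.filter((d) => d.includes(query));
}

// Tool 2: run agent-written code against the SDK in one invocation.
// A real server would execute this inside a sandboxed V8 runtime with
// no filesystem or network access.
async function executeCode<T>(script: (api: typeof sdk) => Promise<T>): Promise<T> {
  return script(sdk);
}

// An agent's multi-step workflow: one script, one round trip.
async function main() {
  console.log(docsSearch("subscriptions"));
  const invoice = await executeCode(async (api) => {
    const sub = await api.subscriptions.create({ customerId: "cus_1", plan: "pro" });
    return api.invoices.issue(sub.id);
  });
  console.log(invoice.subscriptionId); // "sub_123"
}
main();
```

The point of the sketch is the shape, not the details: the model sees only two tool definitions, and the chained calls (create, then issue) happen server-side in a single execution instead of as separate tool round trips.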
Generated CLI tools now support binary endpoint downloads ✨ That means file exports, media downloads, and raw data endpoints work out of the box, with -o support, pipe-friendly output, and smart filename handling to prevent overwrites. https://lnkd.in/eQXyrBeR