Comment by tobihrbr

Great questions!

If you want to run your own remote servers (for your product/company), Railway or Render work great (Vercel is a bit more difficult, since Lambdas get very expensive if you run them over long periods of time). Metorial targets developers who build their own AI agents and want to connect them to integrations. Put plainly, we do a lot more than just run MCP servers: we give you monitoring and observability, handle consumer-facing OAuth, and give you super nice SDKs for integrating MCP servers with your agent.

Regarding the second question, Metorial has three execution modes, depending on what the server supports:

1) Docker - the most basic one, which any MCP server should support. We did some heavy optimization to get these containers to start as fast as possible, and our hibernation system can stop and resume them while restoring their state.

2) Remote MCP - we connect to remote MCP servers for you, while still giving you the same features and ease of integration you get with any Metorial server (I could go into more detail on how our remote servers improve on standard ones).

3) Servers on our own Lambda-based runtime - not every MCP server supports this execution mode, but it's what really sets us apart. The Lambdas only run for short intervals, while the connection is managed by our gateway. We already have about 100 Lambda-based servers and are working on moving more onto that execution model.
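
From an agent's perspective, modes 2) and 3) look the same on the client side: one streamable-HTTP MCP connection. Here is a minimal sketch using the standard MCP TypeScript SDK, assuming a hypothetical gateway URL; this is not Metorial's actual SDK or endpoint.

```ts
// Minimal sketch: an agent talking to a gateway-fronted MCP server over
// streamable HTTP with the standard MCP TypeScript SDK. The URL is a
// hypothetical placeholder, not a real Metorial endpoint.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

async function main() {
  const client = new Client({ name: "my-agent", version: "1.0.0" });

  // From here it's one long-lived MCP session; whether the backend is a
  // Docker container, a remote server, or a short-lived Lambda behind a
  // gateway is invisible to the client.
  const transport = new StreamableHTTPClientTransport(
    new URL("https://gateway.example.com/mcp/my-server") // hypothetical
  );
  await client.connect(transport);

  // Discover the server's tools and hand them to your agent loop.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  await client.close();
}

main().catch(console.error);
```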

There's a lot about our platform that I haven't covered here, like our stateful MCP proxy, our security model, our scalable SOA, and how we turn OAuth flows into a single REST API call for our users.

Let me know if you have any additional questions; I'm always happy to talk about MCP and software architecture.

Thanks for explaining, especially the runtimes part!

I am currently running Docker MCP Containers + MCP Gateway, mixed with Remote MCPs in microVMs (a.k.a. sandboxes).

It seems to be the most portable setup, since you don't have to worry about dealing with different execution tools like uvx, poetry, bun, or npx, or the whole stdio/streamable HTTP conversion.
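
To make that portability point concrete: with stdio you have to know how each server is packaged and launched, while streamable HTTP is always just a URL. A rough sketch with the MCP TypeScript SDK follows; the launch command and gateway URL are illustrative only, not tied to any specific product.

```ts
// The portability gap in a nutshell: stdio servers need the right launcher
// for their ecosystem, while a containerized/remote setup is just one URL.
// The command and URL below are illustrative examples.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

// Option A: stdio. You must know the server's packaging (npx here, but it
// could just as well be uvx, bun, or poetry run).
const stdioTransport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
});

// Option B: streamable HTTP. One endpoint, regardless of whether the server
// runs in a container, a microVM, or somewhere remote.
const httpTransport = new StreamableHTTPClientTransport(
  new URL("https://mcp-gateway.example.com/servers/filesystem") // illustrative
);

// Either transport plugs into identical client code.
const client = new Client({ name: "my-agent", version: "1.0.0" });
await client.connect(stdioTransport); // or httpTransport
console.log((await client.listTools()).tools.map((t) => t.name));
await client.close();
```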

Lambdas sound interesting, especially if you have figured out a way to make stateful work stateless, but it comes with the downside that you have to maintain all the integrations yourself, and the environment itself might have compatibility issues. I've also seen someone using Cloudflare dynamic workers for a similar use case (disco.dev), but they're maintaining all the integrations by hand (or with Claude Code, rather). A more extreme version of this would be writing custom integrations specific to the user by following a very strict prompt.

Anyway, I'll look into Metorial, as I'm curious about how the portable runtimes work.

I am also maintaining a list of MCP gateways and just added you there as well: https://github.com/e2b-dev/awesome-mcp-gateways

Thanks for building this, looking forward to checking it out!

  • Thanks for sharing and adding us to your list. The point about the Lambdas is fair, though we do support other execution modes to mitigate it. Please let me know if you have any feedback or encounter hiccups :)