Smithery offers specialized hosting for MCP servers. We deploy MCPs on our edge network for low-latency connections and optimize specifically for the MCP protocol.
Best suited for: API wrappers and lightweight servers that connect to external services.
Not designed for: Heavy compute workloads, ML inference, large file processing, or long-running background jobs.

Resource Limits

Hosted MCP servers run on Smithery’s edge runtime — a high-performance JavaScript environment without access to the Node.js filesystem or native OS APIs.
Resource    Limit
Memory      128 MB
CPU time    30 seconds per request
CPU time is the time your code is actively executing—not time spent waiting on external API calls, database queries, or other I/O. Most MCP servers that wrap APIs will use minimal CPU time since they’re primarily waiting on network responses.
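For example, in a handler like the sketch below, nearly all of the wall-clock time is spent awaiting the upstream response, while only the milliseconds of actual JavaScript execution count against the CPU budget (the endpoint URL is a placeholder):

```typescript
// Illustrative sketch; the endpoint is a placeholder, not a real API.
async function fetchReport(): Promise<unknown> {
  // Waiting on the network response is I/O time, not CPU time.
  const res = await fetch("https://api.example.com/report");
  // Parsing the JSON body is actual execution and counts toward CPU time,
  // but for typical payloads it takes only milliseconds.
  return res.json();
}
```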

Runtime Limitations

The edge runtime does not support:
  • Native Node.js modules - No fs, child_process, net, or other system-level APIs
  • Dynamic requires of Node built-ins - Packages using require("util"), require("stream"), etc. at runtime
  • Native bindings - Packages with C/C++ addons won’t work (e.g., sharp, bcrypt)
  • Filesystem access - No reading/writing local files
  • Spawning processes - Cannot execute shell commands or subprocesses
Most npm packages that are pure JavaScript/TypeScript will work. Packages that rely on native bindings or Node.js-specific APIs will not.
Common error: Dynamic require of "util" is not supported
This error typically means a dependency (like axios) uses Node.js built-in modules. Solutions:
  • Replace axios with native fetch, which the runtime supports (see the example after this list)
  • Use edge-compatible alternatives like ky or ofetch
  • Check your dependencies for Node.js-specific code
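As a sketch of the first option, a dependency-free request using the runtime's native fetch might look like this (the GitHub endpoint is only an illustration; substitute whatever API your server wraps):

```typescript
// Sketch: calling an external API with native fetch instead of axios.
// No Node.js built-ins are needed, so this runs in the edge runtime.
async function getIssue(repo: string, id: number): Promise<unknown> {
  const res = await fetch(`https://api.github.com/repos/${repo}/issues/${id}`, {
    headers: { Accept: "application/vnd.github+json" },
  });
  if (!res.ok) {
    throw new Error(`GitHub API request failed: ${res.status} ${res.statusText}`);
  }
  return res.json();
}
```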

What This Means for Your Server

Your MCP server should:
  • Be stateless - Don’t rely on in-memory state between requests; use session state, a cache, or an external database instead (see the sketch after this list)
  • Wrap APIs, not compute - Ideal for calling external services, not for processing data locally
  • Avoid loading large files or datasets into memory - Process data in chunks or stream from external sources
  • Use streaming for large responses - Return data incrementally rather than buffering entire responses
  • Keep dependencies minimal - Smaller dependency footprint means faster cold starts and less memory overhead
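Putting these guidelines together, a minimal stateless API wrapper might look like the sketch below. It assumes the @modelcontextprotocol/sdk TypeScript API (McpServer with a zod parameter schema) and a placeholder upstream endpoint; adapt both to your own server.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

// Stateless by design: no module-level caches or per-user state are kept
// between requests; each tool call stands on its own.
const server = new McpServer({ name: "weather-wrapper", version: "1.0.0" });

server.tool(
  "get_forecast",
  { city: z.string().describe("City name to look up") },
  async ({ city }) => {
    // I/O-bound work: the request mostly waits on the upstream API
    // (placeholder URL), so it uses very little of the CPU-time budget.
    const res = await fetch(
      `https://api.example.com/forecast?city=${encodeURIComponent(city)}`
    );
    if (!res.ok) {
      throw new Error(`Upstream API error: ${res.status}`);
    }
    const data = await res.json();
    return { content: [{ type: "text", text: JSON.stringify(data) }] };
  }
);

// Transport wiring (connecting this server to the hosting entrypoint) is
// omitted here and depends on how your project is deployed.
```

Because all state lives upstream or in an external store, any instance of this server can handle any request, which is what the edge deployment model assumes.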

When to Publish Your Own MCP

Consider publishing your MCP (bring your own infrastructure) if you need:
  • Resources beyond the limits above
  • Native Node.js APIs (fs, child_process, etc.)
  • Native bindings or C/C++ addons
  • GPU access or heavy compute
  • Persistent local storage
  • Long-running background processes
The Smithery Gateway provides observability and distribution for your MCP.