Ink vs Vercel
How Ink compares to Vercel — full-stack agent infrastructure versus a serverless frontend platform with per-seat pricing
Vercel is the default deployment platform for Next.js. It does frontend and serverless functions extremely well. But Vercel's architecture has hard limits — no WebSockets, no long-running processes, no Docker — and its per-seat pricing model doesn't fit a world where agents are the operators.
Feature comparison
| Feature | Ink | Vercel |
|---|---|---|
| Agent integration | Skill (prompt-guided), MCP (Streamable HTTP), CLI | Skill (vercel-labs/agent-skills), MCP (Streamable HTTP with OAuth at mcp.vercel.com), CLI |
| MCP capabilities | Full read/write: deploy, delete, scale, databases, DNS, logs, metrics | 14 tools: logs, project info, deployments, docs search, domain purchase, deploy. Mostly read-heavy |
| Pricing model | Per-minute compute, no seat fees | Per-seat: $20/developer/month on Pro + usage. Free viewer seats |
| Infrastructure | Bare metal (persistent processes) | AWS Lambda (serverless functions) + Edge Network |
| WebSockets | Native support | Not supported — Vercel recommends third-party providers |
| Function timeout | No limit (persistent processes) | 300s default, 800s max on Pro (serverless) |
| Long-running streaming | Yes — SSE, long-poll, streaming without limits | Edge: 25s to first byte, 300s total. Node: 800s max |
| Game servers | Yes — persistent processes, custom ports | Not possible — no stateful connections, no custom ports |
| Docker support | Dockerfile build pack | Not supported |
| Backend support | Any framework, persistent processes | Serverless functions only (Express, FastAPI, etc. as Lambda) |
| Databases | SQLite via Turso (edge-replicated, managed) | No first-party databases — Neon/Upstash via Marketplace (Vercel sunset its own Postgres/KV in 2024) |
| DNS management | Full programmatic DNS via MCP | Domain purchase via MCP, DNS records via CLI/REST API |
| Build system | Railpack (30+ frameworks) | Framework-specific (50+ frameworks, Next.js-optimized) |
| GraphQL API | Yes, with introspection | No |
Capabilities checklist
| Capability | Ink | Vercel |
|---|---|---|
| MCP server | ✅ | ✅ |
| Agent Skill (prompt-guided) | ✅ | ✅ |
| CLI | ✅ | ✅ |
| Multi-agent collaboration | ✅ | ❌ |
| Deploy via MCP (server-side) | ✅ | ❌ |
| Delete services via MCP | ✅ | ❌ |
| Provision databases via MCP | ✅ | ❌ |
| DNS management via MCP | ✅ | ❌ |
| Metrics via MCP | ✅ | ❌ |
| Logs via MCP | ✅ | ✅ |
| GraphQL API | ✅ | ❌ |
| WebSockets | ✅ | ❌ |
| Long-running processes (>800s) | ✅ | ❌ |
| Persistent server processes | ✅ | ❌ |
| Docker support | ✅ | ❌ |
| Bare metal infrastructure | ✅ | ❌ |
| Free seats (no per-seat pricing) | ✅ | ❌ |
| Per-minute billing | ✅ | ❌ |
| Managed databases (first-party) | 🔜 | ❌ |
| Preview deployments per PR | ❌ | ✅ |
| Edge CDN / image optimization | ❌ | ✅ |
| Next.js-specific optimizations (ISR, etc.) | ❌ | ✅ |
Where the gap is real
Per-seat pricing in the agentic world
Vercel charges $20 per developer seat per month on Pro. Viewer seats are free, but anyone who deploys or configures anything needs a paid seat.
This model doesn't survive contact with agents. If one agent deploys your app and a different agent debugs it next week, per-seat pricing tries to charge for entities with zero marginal cost. The actual platform cost is compute and bandwidth consumed, not how many operators have access.
Ink charges for compute used. Add as many agents, developers, or collaborators as you want. Your bill reflects what you run, not who can see the dashboard.
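The difference between the two models is easy to see with a little arithmetic. In the sketch below, Vercel's $20/seat Pro price comes from the text above; every other number (agent count, compute minutes, per-minute rate) is an illustrative assumption, not a real price from either platform.

```python
# Hypothetical comparison of seat-based vs. usage-based billing.
# Only the $20/seat figure is real; all other numbers are assumptions.

def per_seat_monthly(seats: int, seat_price: float = 20.0, usage: float = 0.0) -> float:
    """Seat model: every operator who deploys or configures needs a paid seat."""
    return seats * seat_price + usage

def usage_only_monthly(compute_minutes: int, rate_per_minute: float) -> float:
    """Usage model: the bill tracks compute consumed, not operator count."""
    return compute_minutes * rate_per_minute

# One small app operated by 5 different agents over a month.
seat_bill = per_seat_monthly(seats=5, usage=10.0)
usage_bill = usage_only_monthly(compute_minutes=21_600, rate_per_minute=0.0005)
# Adding a sixth agent raises seat_bill by another $20/month;
# usage_bill does not change, because the workload didn't.
```

Under the seat model the bill scales with how many entities can operate the platform; under the usage model it scales only with what actually runs.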
No WebSockets, no persistent connections
Vercel's serverless architecture fundamentally cannot support WebSocket connections. Their official documentation states this explicitly and recommends third-party providers (Ably, Pusher, Supabase Realtime).
On Vercel, this rules out building:
- Real-time chat applications with server-managed state
- Multiplayer game servers
- Live collaboration tools with bidirectional streaming
- Long-running AI inference with streaming responses beyond 800 seconds
- Any service requiring persistent server-to-client connections
Ink runs persistent processes. WebSockets, SSE, long-polling — all work natively with no timeout constraints.
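To make the constraint concrete, here is what a long-lived stream looks like on the wire. The sketch below formats Server-Sent Events per the standard EventSource wire format; it is a generic illustration, not tied to either platform's API.

```python
# Sketch: SSE framing — the kind of open-ended streaming that needs a
# persistent process rather than a function with a hard timeout.

from collections.abc import Iterator

def sse_frame(event: str, data: str) -> str:
    """Format one Server-Sent Events message (event + data fields, blank-line terminator)."""
    return f"event: {event}\ndata: {data}\n\n"

def event_stream(messages: list[str]) -> Iterator[str]:
    # In a persistent server this generator could yield indefinitely
    # (heartbeats, live updates); a 300s/800s execution ceiling cuts it off.
    for msg in messages:
        yield sse_frame("message", msg)

frames = list(event_stream(["hello", "world"]))
```

The same shape applies to WebSockets and long-polling: the server side of the connection must outlive any single request/response cycle, which is exactly what a function timeout forbids.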
Serverless ceiling
Every backend on Vercel runs as an AWS Lambda function. Maximum execution time is 800 seconds on Pro with Fluid Compute. Request and response bodies are limited to 4.5 MB. No in-memory state persists between requests. No background workers, no queue consumers, no persistent processes.
Ink deploys containers that run continuously. Your backend can hold state in memory, process background jobs, maintain database connection pools, and handle any workload pattern — not just request/response.
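The in-memory-state difference is the crux. The sketch below is a generic illustration of the two execution models, not code specific to either platform.

```python
# Sketch: state in a long-lived process vs. a serverless function.

class PersistentApp:
    """One long-lived process: state accumulates across requests."""
    def __init__(self):
        self.cache: dict[str, str] = {}   # survives between requests
        self.request_count = 0

    def handle(self, key: str) -> str:
        self.request_count += 1
        if key not in self.cache:         # warm the cache on first hit
            self.cache[key] = f"computed:{key}"
        return self.cache[key]

def serverless_handle(key: str) -> str:
    """Serverless model: nothing outlives the invocation."""
    cache: dict[str, str] = {}            # rebuilt on every cold start
    return cache.setdefault(key, f"computed:{key}")

app = PersistentApp()
app.handle("a")
app.handle("a")  # second call is a cache hit in the persistent process
```

The same logic applies to database connection pools, background job queues, and any workload where the process, not the request, is the unit of execution.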
Agent integration depth
Vercel now offers three integration paths — an MCP server at mcp.vercel.com, Agent Skills via vercel-labs/agent-skills, and a CLI. The MCP server has 14 tools that are mostly read-heavy (logs, project info, deployments, docs search), though it now includes deploy_to_vercel and buy_domain write tools. The Agent Skills ecosystem includes a vercel-deploy-claimable skill for deployments.
Ink also offers three paths. The key difference is MCP capability depth: Ink's MCP server exposes 30+ tools with full read-write control — create_service deploys, create_resource provisions databases, add_custom_domain configures DNS, plus delete, scale, and full observability. Ink's Skill and CLI provide the same full platform access.
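For reference, an MCP tool invocation is a JSON-RPC 2.0 request sent to the server's Streamable HTTP endpoint. The `create_service` tool name comes from the text above; the argument names in this sketch are illustrative assumptions, not Ink's actual schema.

```python
# Sketch: what an MCP tools/call request looks like on the wire.

import json

def mcp_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 request for the MCP tools/call method."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

payload = mcp_tool_call("create_service", {
    "name": "my-app",                  # hypothetical argument names —
    "repo": "github.com/me/my-app",    # not Ink's real tool schema
})
```

The breadth of a platform's MCP surface is just the set of tool names an agent can put in that `params.name` field, which is why a 30+ tool read-write surface and a 14-tool read-heavy one give agents very different operational reach.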
No Docker
Vercel explicitly does not support Docker deployments. Everything must fit their serverless build pipeline. If your stack needs a custom runtime, system dependencies, or anything outside their supported frameworks, you're stuck.
Ink supports Dockerfile builds alongside Railpack auto-detection and static site serving.
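A minimal Dockerfile is all a containerized deploy needs. The base image and commands below are generic examples, not Ink-specific requirements.

```dockerfile
# Illustrative Dockerfile: any stack Docker can build, a container platform can run.
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# A persistent process: this server runs until stopped, not per-request.
CMD ["python", "server.py"]
```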
What Vercel does well
Vercel's Next.js integration is unmatched. Preview deployments on every PR, edge caching, image optimization, and ISR work seamlessly. The Edge Network delivers static assets fast. For teams building Next.js frontends deployed by humans through git push, Vercel is excellent at that specific job.