Sprouts.ai is a Generative Demand Platform for B2B revenue teams who are done guessing. Instead of adding another dashboard or enrichment layer on top of a fragile data stack, Sprouts rebuilds the operating layer underneath it. The result is a single connected system for real-time account and contact intelligence, signal-based orchestration, AI-driven next best action, and deep ICP qualification.
The platform serves CROs, RevOps teams, SDR and BDR leaders, and CMOs who need their data to tell the truth. Customers have seen a 3x increase in qualified leads, a 50% improvement in SDR and BDR efficiency, and over a 30% reduction in martech tool costs. Sprouts connects to Redis, Elasticsearch, and tenant-level Postgres to serve intelligence across 400 million data points in under two seconds.
What MCP unlocked for Sprouts
The MCP server was a natural extension of Sprouts’ mission: bring the intelligence layer directly into the AI interfaces revenue teams already use, so they stop switching tabs and start getting answers. Kartikay Dhar, who leads product at Sprouts, built the initial MCP server himself as a pilot. It worked, and it quickly clarified the next challenge: making it secure enough to hand to enterprise customers.
The challenge: production-grade OAuth for MCP
- The MCP server worked and was already proving value. Users could ask natural-language questions and get fast, relevant answers across large multi-tenant datasets, validated through demos and early partner workflows.
- Security was pilot-grade, not enterprise-grade. Access relied on header-style authentication and service credentials, which was acceptable for known parties but became a blocker for enterprise rollouts and external AI ecosystems.
- Without OAuth, trust and control were hard to guarantee. There was no standardized consent flow, no clear scoped access model, and no straightforward way to demonstrate tenant isolation and permission boundaries to customers.
- Building OAuth correctly meant real engineering cost. Supporting modern flows, mapping scopes to tools, creating a usable consent screen, and validating tokens on every request would have taken significant upfront work.
- Maintaining it would be an ongoing burden. MCP standards and ecosystem expectations continue to evolve, and owning auth infrastructure long-term would pull focus away from the core data intelligence roadmap.
"The pilot proved the workflow. The hard part was getting it enterprise-ready without rebuilding half the system."
Kartikay Dhar,
Director of Product, AI, Sprouts.ai
The solution
Sprouts integrated Scalekit's MCP auth layer into their existing homegrown Python server without migrating to a new framework or rebuilding any core logic. The existing server stayed exactly as it was. Scalekit handled everything that sits in front of it.
What Scalekit handled:
- OAuth 2.1 + PKCE for human users connecting through Claude Desktop or any MCP-compatible AI host, with a full browser-based consent flow
- Scope-based tool authorization so that individual MCP tools (data queries, account discovery, ICP analysis, signal routing) each carry specific permissions that users explicitly grant at consent time, not blanket access
- MCP consent widget surfacing meaningful, readable permission descriptions to end users before any data is accessed, configured directly from the Scalekit dashboard
- Token validation on every inbound call, with Scalekit issuing JWTs and the MCP server validating them without managing credentials itself
- Bring-your-own-auth compatibility: Scalekit dropped in as the auth layer directly, eliminating any requirement to first build or migrate a separate identity system
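The scope-per-tool model above can be sketched in a few lines of Python, the language of Sprouts' existing server. This is an illustrative sketch only: the tool names, scope strings, and claim layout are hypothetical, and it assumes the bearer token has already been validated (e.g. a Scalekit-issued JWT verified upstream) with its claims decoded into a dict.

```python
# Hypothetical mapping of MCP tools to the OAuth scopes they require.
# Each tool carries a specific permission rather than blanket access.
TOOL_SCOPES = {
    "query_accounts": "accounts:read",
    "discover_accounts": "accounts:read",
    "analyze_icp": "icp:read",
    "route_signal": "signals:write",
}

def authorize_tool_call(claims: dict, tool_name: str) -> bool:
    """Allow a tool call only if the token's granted scopes cover it."""
    required = TOOL_SCOPES.get(tool_name)
    if required is None:
        return False  # unknown tools are denied by default
    # OAuth convention: granted scopes arrive as a space-delimited string.
    granted = set(claims.get("scope", "").split())
    return required in granted

# Example: a token granted read access at consent time, but not signal routing.
claims = {"sub": "user_123", "scope": "accounts:read icp:read"}
assert authorize_tool_call(claims, "query_accounts")
assert not authorize_tool_call(claims, "route_signal")
```

Because the check runs per tool on every call, a user who consented only to read-style scopes can never trigger a write-style tool, regardless of what the AI host requests.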
The results
With Scalekit handling the auth layer, Sprouts' MCP server is now open to enterprise customers and external AI platforms without exposing the kind of auth gaps that would stop a procurement or security review. The integration required no changes to existing server infrastructure and no migration to a new Python framework.
"I wanted to get the OAuth layer done without burning a full sprint rebuilding infrastructure we'd already built. Scalekit let us add auth to the existing server, configure the right permissions per tool from a dashboard, and move on. We didn't have to migrate to a new framework or touch the core logic."
Kartikay Dhar,
Director of Product, AI, Sprouts.ai
- Enterprise customers connect through a structured OAuth consent flow that surfaces tool-level permissions in plain language
- Tenant-level data isolation, already enforced in the data layer, is now reinforced at the auth layer before any query runs
- The MCP server is distribution-ready for ChatGPT, OpenAI's ecosystem, and any MCP-compatible AI host
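The auth-layer tenant check mentioned above can be sketched as a guard that runs before any query reaches the data layer. This is an assumption-laden sketch, not Sprouts' implementation: the claim name `org_id` and the tenant identifiers are illustrative, and the token is assumed to be already validated.

```python
def tenant_for_request(claims: dict, requested_tenant: str) -> str:
    """Reject cross-tenant requests before any query runs.

    The data layer still enforces isolation; this duplicates the
    boundary at the auth layer so a mis-scoped token fails fast.
    """
    token_tenant = claims.get("org_id")  # hypothetical tenant claim
    if token_tenant is None or token_tenant != requested_tenant:
        raise PermissionError("token is not scoped to this tenant")
    return token_tenant

# A token issued for one tenant cannot query another.
claims = {"sub": "user_123", "org_id": "acme-corp"}
assert tenant_for_request(claims, "acme-corp") == "acme-corp"
try:
    tenant_for_request(claims, "other-co")
    rejected = False
except PermissionError:
    rejected = True
assert rejected
```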
The AI interfaces their customers already use can now connect to Sprouts data directly, with the right permissions for the right users, without Sprouts having to build the auth infrastructure themselves.