
Authorization: MCP’s less-talked-about fourth pillar

Hrishikesh Premkumar
Founding Architect

Most introductions to the MCP architecture paint a familiar picture with three core components working together:

  • Hosts: LLM applications such as Claude Desktop or IDEs, which initiate connections
  • MCP clients: Reside within the host, establishing and maintaining 1:1 connections with MCP servers
  • MCP servers: Provide resources, prompts, tools, and other capabilities to MCP clients

Often missing from this list is the fourth and arguably most security-sensitive player: the authorization server.

This server is responsible for managing authentication, issuing access tokens, handling consent flows, enforcing token lifetimes, and validating permissions. In short, it is the core of the OAuth flow that MCP builds on. Yet, it is routinely left out of the conversation.

Why auth gets overlooked

Early MCP implementations were mostly local and single-user, operating over stdio or UNIX sockets with no auth required. Plus, the specification has been evolving rapidly, leaving many decisions to individual developers.

As a result, developers are building MCP servers that double as both resource servers and authorization servers. That means they are minting, signing, rotating, and validating tokens on their own.

Consider what this means. Every developer who builds a tool on top of MCP also needs to:

  • Implement OAuth 2.1 securely
  • Handle token signing and cryptographic key rotation
  • Design consent and access delegation flows
  • Store, refresh, and revoke tokens reliably

As more MCP servers go remote and integrate with sensitive services, this ambiguity becomes a liability. Teams are left to make decisions about authorization infrastructure without clear guidance. Increasingly, they choose to fold auth into the MCP server itself, creating risk.

Unless your team specializes in identity and security infrastructure, this is a recipe for trouble.

"Do you trust every developer writing an MCP server to know how to mint, sign, rotate, and validate tokens securely? The risk of getting it wrong is very high if this isn’t what you do day in and day out." — Den Delimarsky, Product Manager, Microsoft

Types of authorization servers: Embedded and external

1. Build it into the MCP server (embedded auth)

In this approach, the MCP server does everything:

  • Handles login and consent
  • Issues its own signed tokens
  • Enforces scopes and lifetimes
  • Validates incoming requests

This might sound efficient, but it tightly couples your tool logic to complex auth behavior. Unless you are an identity expert, you are likely to miss critical edge cases or introduce vulnerabilities.

When authorization responsibilities are embedded directly into the MCP server, developers effectively create a system that handles both resource management and token issuance internally.

On the surface, this might seem beneficial. It provides complete control over authentication flows, simplifies initial integration, and can tightly couple consent mechanisms directly with the server’s internal processes.

However, embedding authorization within your MCP server also introduces several challenges:

  • Complexity and workload: Managing OAuth 2.1 authorization internally requires significant knowledge of cryptographic best practices, token lifecycle management, and secure data handling. Your team must continuously maintain and audit this custom-built authorization infrastructure, which can divert resources away from core product innovation.
  • Security vulnerabilities: Even minor oversights in token issuance, refresh logic, or validation mechanisms can introduce serious vulnerabilities that lead to unauthorized access or data breaches. Common mistakes include:
    • Token leakage
    • Weak cryptographic implementations
    • Improper key rotation
    • Scope mismanagement
  • Scalability and maintenance issues: Scaling a combined server and authorization infrastructure becomes increasingly complex. Every new feature or scope addition demands careful integration and validation across the entire stack, slowing down agility and responsiveness to new requirements.

An embedded authorization server can be viable for highly specialized or internal tools with dedicated security expertise. However, the inherent complexity and risk usually outweigh the perceived benefits for general use cases.

Example: Internal audit tool at a financial services firm

A fictional financial services company built an internal auditing tool using MCP with embedded authorization. Here’s precisely what the MCP server does:

  • Authentication: Presents an internal login form directly integrated with the company's user directory.
  • Consent management: Shows a simple internal consent screen, explicitly listing permissions such as audit:account:view and audit:report:create.
  • Token management: Generates and digitally signs JWT tokens internally, embedding specific user roles, permitted scopes, and expiration details.
  • Authorization validation: For every request (e.g., viewing audit logs or creating audit reports), the MCP server internally verifies tokens, matches permissions with requested actions, and authorizes or denies requests.
  • Auditing: Logs every authorization decision internally for compliance tracking.

Outcome: This approach gives the company direct, secure internal control but requires continuous maintenance and dedicated security expertise to manage the token lifecycle securely.
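
To make that maintenance burden concrete, here is a minimal sketch of the token work an embedded authorization server takes on, using Python and the PyJWT library. The issuer URL, key handling, claim layout, and scope strings are illustrative assumptions rather than the firm's actual implementation, and a production server would additionally need key rotation, revocation, and refresh handling.

```python
# Minimal sketch of embedded token issuance and validation (illustrative only).
# Assumes PyJWT with the cryptography extra; key rotation, revocation, and
# refresh flows are all additional work the embedded server owns.
import time
import jwt  # PyJWT
from cryptography.hazmat.primitives.asymmetric import rsa

# In practice this key would live in an HSM or secrets manager and be rotated.
PRIVATE_KEY = rsa.generate_private_key(public_exponent=65537, key_size=2048)
PUBLIC_KEY = PRIVATE_KEY.public_key()

def issue_token(user_id: str, scopes: list[str]) -> str:
    """Mint a short-lived, signed JWT after login and consent have succeeded."""
    now = int(time.time())
    claims = {
        "iss": "https://audit-tool.internal",   # hypothetical internal issuer
        "sub": user_id,
        "scope": " ".join(scopes),              # e.g. "audit:account:view"
        "iat": now,
        "exp": now + 900,                       # 15-minute lifetime
    }
    return jwt.encode(claims, PRIVATE_KEY, algorithm="RS256")

def authorize(token: str, required_scope: str) -> bool:
    """Validate the token and check that it grants the requested action."""
    try:
        claims = jwt.decode(token, PUBLIC_KEY, algorithms=["RS256"])
    except jwt.InvalidTokenError:
        return False
    return required_scope in claims.get("scope", "").split()
```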

2. Use an external authorization server

Here, your MCP server acts only as a resource server. It validates tokens issued by a trusted external provider, and maps scopes to permissions internally.

This mirrors the OAuth best practice of separating concerns: identity and authorization handled by specialists, application logic handled by your code.

As OAuth veteran Aaron Parecki put it:

"Just treat every MCP client as an OAuth client. Treat the MCP server as a resource server that uses an existing authorization server to mint trusted tokens. That’s it. Problem solved."

This approach positions the MCP server as a straightforward resource server, delegating token management to a dedicated, trusted identity provider.

Advantages of this approach include:

  • Reduced complexity and lower operational overhead: By offloading authorization to a specialized service, your team can concentrate on core functionalities without needing extensive security or OAuth domain knowledge. This significantly simplifies your implementation and operational processes.
  • Enhanced security posture: External authorization providers invest considerable resources into continuously improving their security infrastructure, including comprehensive audits, penetration testing, and regular compliance updates. This significantly reduces the likelihood of token mismanagement or cryptographic failures, leveraging the expertise of industry leaders.
  • Improved scalability and agility: External auth providers offer robust and scalable infrastructure. Adding new features, modifying scopes, or supporting new types of clients can be accomplished quickly and reliably, facilitating agile development and rapid iteration.
  • Better interoperability and integration: An external auth solution provides standardized APIs and industry-standard integration patterns. This helps ensure compatibility across different MCP clients and tools, simplifying the integration of new applications or services into your existing infrastructure.

Leveraging external authorization providers thus allows MCP servers to focus on their primary role: efficiently serving authenticated and authorized requests, while securely delegating identity management tasks to dedicated specialists.

Example: Developer analytics platform integrating Scalekit for MCP auth

A fictional developer analytics SaaS platform needed to let third-party developer tools query usage metrics, debug user behavior, and modify dashboards on behalf of individual users.

To simplify secure delegation, they exposed a standardized MCP server as the interface for all tool interactions. Rather than building auth logic in-house, let’s assume they use Scalekit as the external authorization layer.

Scalekit provides a fully OAuth 2.1-compliant authorization server, letting the platform focus entirely on its MCP server’s resource responsibilities.

Here’s what the flow looks like:

  • Dynamic client registration: MCP clients can dynamically register with Scalekit at runtime, so tools do not need to be manually pre-provisioned. The authorization server handles metadata such as redirect URIs, supported grant types, and requested scopes automatically.
  • OAuth 2.1 flows with PKCE: When a developer tool initiates a flow, it uses the Authorization Code flow with PKCE, so an intercepted authorization code cannot be redeemed for tokens, even when the client is a public CLI (a short sketch of the PKCE mechanics follows this list).
  • User-backed tokens with consent: During auth, Scalekit presents a consent screen to the end user—typically a developer or analyst—clearly stating which permissions the tool is requesting (e.g., analytics:data:read, dashboard:modify). Users must explicitly approve this before any token is issued.
  • Token issuance and expiry: Once authorized, Scalekit issues short-lived access tokens (JWTs) that encode identity, scopes, and expiry. These are signed and auditable. The MCP server simply verifies the token against Scalekit’s published public keys.
  • Authorization inside the MCP server: For each incoming request, the MCP server validates the token, maps scopes to internal permissions, and allows or denies actions accordingly. For example, a token with dashboard:modify allows editing UI views, but cannot access usage logs (analytics:data:read).
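
As a rough illustration of the PKCE step above, a public client such as a CLI generates a one-time code verifier and sends only the derived S256 challenge with its authorization request. The endpoint URL, client ID, and redirect URI below are hypothetical placeholders, not Scalekit's actual values; this only sketches the standard RFC 7636 mechanics.

```python
# Client-side PKCE sketch (RFC 7636); endpoint, client_id, and redirect_uri
# are hypothetical placeholders.
import base64
import hashlib
import os
import secrets
from urllib.parse import urlencode

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

# 1. Generate a high-entropy code verifier and its S256 challenge.
code_verifier = b64url(os.urandom(32))
code_challenge = b64url(hashlib.sha256(code_verifier.encode()).digest())

# 2. Send the challenge with the authorization request; keep the verifier local.
auth_url = "https://auth.example.com/oauth/authorize?" + urlencode({
    "response_type": "code",
    "client_id": "example-client-id",          # assumed; obtained via dynamic registration
    "redirect_uri": "http://localhost:8765/callback",
    "scope": "analytics:data:read dashboard:modify",
    "state": secrets.token_urlsafe(16),
    "code_challenge": code_challenge,
    "code_challenge_method": "S256",
})

# 3. After the user consents, the client exchanges the returned authorization
#    code plus the original code_verifier for tokens at the token endpoint.
```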

Outcome: With Scalekit handling registration, consent, token minting, and validation infrastructure, the platform accelerated development, reduced security burden, and enabled user-delegated access at scale. The MCP server remained cleanly scoped: just a resource layer enforcing trusted, externally-issued authorization decisions.
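
On the MCP server side, the resource-server role described above reduces to verifying each incoming token against the provider's published signing keys and mapping its scopes to internal permissions. A minimal sketch with PyJWT might look like the following; the JWKS URL, issuer, audience, and scope-to-action mapping are assumed placeholders rather than real Scalekit endpoints.

```python
# Minimal resource-server sketch: verify externally issued JWTs and map scopes
# to internal actions. JWKS URL, issuer, and audience are assumed placeholders.
import jwt  # PyJWT
from jwt import PyJWKClient

JWKS_URL = "https://auth.example.com/.well-known/jwks.json"   # hypothetical
ISSUER = "https://auth.example.com"                           # hypothetical
AUDIENCE = "https://mcp.analytics.example.com"                # hypothetical

# Map OAuth scopes to the internal actions the MCP server exposes.
SCOPE_TO_ACTIONS = {
    "analytics:data:read": {"query_metrics", "read_usage_logs"},
    "dashboard:modify": {"edit_dashboard"},
}

_jwks = PyJWKClient(JWKS_URL)

def authorize_request(token: str, action: str) -> bool:
    """Return True only if the token is valid and one of its scopes allows the action."""
    try:
        signing_key = _jwks.get_signing_key_from_jwt(token)
        claims = jwt.decode(
            token,
            signing_key.key,
            algorithms=["RS256"],
            issuer=ISSUER,
            audience=AUDIENCE,
        )
    except jwt.exceptions.PyJWTError:
        return False
    granted = set(claims.get("scope", "").split())
    return any(action in SCOPE_TO_ACTIONS.get(scope, set()) for scope in granted)

# Example: a token carrying only dashboard:modify can trigger "edit_dashboard",
# but authorize_request(token, "read_usage_logs") returns False.
```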

Comparing embedded and external authorization servers

When deciding between embedded and external authorization, key considerations include complexity, security risk, and maintainability.

Embedded authorization can offer fine-tuned control but introduces significant operational complexity and requires specialized security expertise.

On the other hand, external authorization leverages established identity infrastructure, significantly reducing implementation overhead, minimizing security risks associated with token management, and streamlining ongoing maintenance.

Here’s a comparison table:

Embedded vs. external authz servers

| Aspect                    | Embedded authorization                               | External authorization                              |
| ------------------------- | ---------------------------------------------------- | --------------------------------------------------- |
| Implementation complexity | High; requires in-depth security expertise           | Low; leverages existing infrastructure              |
| Operational overhead      | High; responsible for token lifecycle management     | Low; external provider manages token lifecycle      |
| Security risk             | Elevated; potential for misconfiguration             | Reduced; managed by specialized identity providers  |
| Flexibility               | High customization; tailored for specific use cases  | Good flexibility; configurable via external systems |
| Maintainability           | Challenging; ongoing security updates required       | Easier; outsourced security and regular updates     |
| Scalability               | Moderate; requires internal scaling                  | High; benefits from external provider scalability   |

Implementing fine-grained authorization

A critical aspect often overlooked in MCP authorization discussions is the implementation of fine-grained, tool-specific authorization policies. While standard OAuth scopes generally map to broad API-level permissions, MCP interactions often occur at a more granular tool or function level.

This means authorization checks must go beyond simply validating tokens at an API boundary. Instead, MCP servers should enforce permissions directly tied to the specific tool actions or agent behaviors requested by MCP clients.

For example, rather than using a broad scope like read:data, a tool-specific permission like tool:codegen:execute or tool:debug:inspect allows servers to precisely control what an MCP client is allowed to do. This level of detail ensures users grant only the exact permissions needed, significantly reducing the risk of over-permissioning and unintended access.
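
As a rough sketch of what tool-level enforcement can look like inside an MCP server, the helper below gates each tool invocation on a tool-specific scope. The scope names, tool registry, and handler functions are illustrative assumptions; the point is that the check happens per tool call, not once at the API boundary.

```python
# Illustrative per-tool authorization check; scope names and tools are assumptions.
class ToolAuthorizationError(Exception):
    pass

# Hypothetical registry of tool handlers; in a real server these would be the
# functions exposed over MCP.
TOOL_REGISTRY = {
    "codegen": lambda prompt: f"generated code for: {prompt}",
    "debug_inspect": lambda session_id: f"debug info for session {session_id}",
}

REQUIRED_SCOPE = {
    "codegen": "tool:codegen:execute",
    "debug_inspect": "tool:debug:inspect",
}

def call_tool(tool_name: str, args: dict, granted_scopes: set[str]):
    """Enforce a tool-specific scope before dispatching the tool call."""
    required = REQUIRED_SCOPE.get(tool_name)
    if required is None or required not in granted_scopes:
        # Deny by default: unknown tools and missing scopes are both rejected.
        raise ToolAuthorizationError(f"missing scope {required!r} for tool {tool_name!r}")
    return TOOL_REGISTRY[tool_name](**args)

# Example: a client whose token carries only tool:debug:inspect can inspect
# sessions, but a codegen call raises ToolAuthorizationError.
```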

Implementing fine-grained authorization policies aligned with tool-level interactions is essential for both security and usability, particularly in agent-driven experiences.

The spec is still evolving, but act now

The MCP spec is still evolving, with draft updates landing every few weeks. There is active discussion around formalizing how authorization should work, and contributors are zeroing in on best practices.

However, if you are building agent-driven tools today, don’t wait for the spec to settle.

Treat your MCP server as an OAuth resource server.

Let token minting, signing, and validation be handled by trusted, battle-tested infrastructure. Keep your auth flows decoupled, your roles well-defined, and your users (and their data) secure.
