Stitchflow

Databricks User Management Guide

Manual workflow

How to add, remove, and manage users with operational caveats that matter in production.

Updated Mar 9, 2026

Summary and recommendation

Databricks user management can be run manually, but complexity usually increases with role models, licensing gates, and offboarding dependencies. This guide gives the exact mechanics and where automation has the biggest impact.

Databricks user management operates across two distinct layers: the account console and individual workspaces. Adding a user at the account level does not automatically grant them access to any workspace - that assignment is always a separate step.

Every app that connects to Databricks inherits this two-layer model, so access gaps are common when teams manage only one layer.

Three admin roles carry meaningfully different scopes: Account Admin controls workspace creation and SSO/SCIM configuration; Workspace Admin manages users and ACLs within a single workspace; Metastore Admin governs Unity Catalog data objects. None of these roles automatically confer the others.

SCIM provisioning requires both SSO to be configured first and a Premium plan. Standard tier is being phased out - AWS and GCP workspaces must upgrade by October 2025, Azure Standard by October 2026.

Quick facts

  • Admin console path: Account Console → User Management → Users (account-level) or Workspace Settings → Identity and Access → Users (workspace-level)
  • Admin console URL: see official docs
  • SCIM available: Yes
  • SCIM tier required: Premium
  • SSO prerequisite: Yes

User types and roles

Account Admin

  • Permissions: Full control over the Databricks account: create/delete workspaces, manage account-level users and groups, configure SSO/SCIM, assign workspace admins, view billing and usage.
  • Cannot do: Directly manage workspace-level object permissions unless also assigned as a workspace admin in that workspace.
  • Plan required: All plans (Standard, Premium)
  • Seat cost: No separate seat cost; consumption-based DBU pricing applies when running workloads.
  • Watch out for: The role is assigned at the account level only; it does not automatically grant workspace admin in every workspace.

Workspace Admin

  • Permissions: Manage users, groups, and service principals within a specific workspace; configure workspace settings, clusters, and access control lists (ACLs); assign workspace-level entitlements.
  • Cannot do: Create new workspaces or manage account-level SSO/SCIM configuration; manage the Unity Catalog metastore unless also granted metastore admin.
  • Plan required: All plans
  • Seat cost: No separate seat cost; DBU consumption applies.
  • Watch out for: Workspace admins can grant themselves any workspace permission, including cluster creation, so scope their access carefully.

Metastore Admin (Unity Catalog)

  • Permissions: Manage the Unity Catalog metastore: create/drop catalogs, schemas, and tables; grant/revoke privileges on all securable objects; manage external locations and storage credentials.
  • Cannot do: Manage workspace settings or account-level user provisioning unless also holding those roles.
  • Plan required: Premium (Unity Catalog requires Premium)
  • Seat cost: No separate seat cost.
  • Watch out for: Only one metastore can be attached to a workspace at a time; the metastore admin is a single user or group assignment per metastore.

Regular User

  • Permissions: Access assigned workspaces; run notebooks, jobs, and queries subject to ACLs and entitlements granted by admins.
  • Cannot do: Create clusters unless granted the 'Allow cluster creation' entitlement; access data objects without explicit Unity Catalog or ACL grants.
  • Plan required: All plans
  • Seat cost: No per-seat license fee; costs are DBU-based on compute used.
  • Watch out for: Users must be added to the account first, then assigned to individual workspaces; account-level addition alone does not grant workspace access.

Service Principal

  • Permissions: Programmatic identity for automated jobs, CI/CD pipelines, and API access; can be granted any role or object permission a human user can receive.
  • Cannot do: Log in via the Databricks UI; run interactive notebook sessions.
  • Plan required: All plans; OAuth M2M authentication requires Premium.
  • Seat cost: No per-seat cost; DBU consumption applies when running workloads.
  • Watch out for: Secrets and OAuth tokens must be rotated manually unless managed via an external secrets manager; leaked credentials are a common security risk.

Permission model

  • Model type: hybrid
  • Description: Databricks uses a hybrid model combining fixed account/workspace roles (Account Admin, Workspace Admin, Metastore Admin) with object-level ACLs for workspace resources (clusters, notebooks, jobs, SQL warehouses) and a privilege-based GRANT/REVOKE model in Unity Catalog for data objects (catalogs, schemas, tables, volumes). There are no fully custom named roles; permissions are composed by assigning built-in roles plus explicit object-level grants.
  • Custom roles: No
  • Custom roles plan: Not documented
  • Granularity: Object-level: individual notebooks, folders, clusters, jobs, SQL warehouses, dashboards, and Unity Catalog securables (catalog, schema, table, view, volume, external location, storage credential). Entitlements (e.g., Allow cluster creation, Databricks SQL access) are assigned per user or group at workspace level.
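Because there are no custom named roles, a user's effective access is always a composition: a built-in role's capabilities plus explicit entitlements and object-level grants. A minimal sketch of that composition in Python (the role names and capability sets here are simplified illustrations, not Databricks internals):

```python
# Illustrative model: effective access = built-in role capabilities
# + entitlements + object grants. Names are examples, not product internals.
BUILTIN_ROLES = {
    "workspace_admin": {"manage_users", "manage_clusters", "manage_acls"},
    "regular_user": set(),  # capabilities come only from entitlements and grants
}

def effective_access(role, entitlements, object_grants):
    """Compose a user's effective access from role, entitlements, and object ACLs."""
    return {
        "capabilities": BUILTIN_ROLES.get(role, set()) | set(entitlements),
        "objects": object_grants,  # e.g. {"catalog.sales.orders": {"SELECT"}}
    }

access = effective_access(
    "regular_user",
    {"allow_cluster_create"},
    {"catalog.sales.orders": {"SELECT"}},
)
print(access["capabilities"])  # {'allow_cluster_create'}
```

The takeaway: auditing a Databricks user means checking all three inputs, because no single role assignment tells the full story.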

How to add users

  1. Log in to the Databricks Account Console (accounts.cloud.databricks.com or accounts.azuredatabricks.net).
  2. In the left sidebar, click 'User management'.
  3. On the 'Users' tab, click 'Add user'.
  4. Enter the user's email address and optionally their first and last name.
  5. Click 'Send invite' to email the user an invitation; if SSO is enforced, the user is added without an invitation email.
  6. To grant workspace access, navigate to the target workspace row in the Account Console → Workspaces, open the workspace, go to 'Permissions', and assign the user (or a group containing the user) with the desired entitlement.
  7. Alternatively, workspace admins can add users directly in Workspace Settings → Identity and Access → Users → Add user, but account-level management is recommended.

Required fields: Email address

Watch out for:

  • Adding a user at the account level does not automatically grant them access to any workspace; workspace assignment is a separate step.
  • If SSO is configured, users authenticate via the IdP and may be provisioned automatically via SCIM; manually added users may conflict with SCIM-managed records.
  • Email addresses must match the IdP identity exactly when SSO is enabled; mismatches cause login failures.
  • Guest or external users (outside the organization's domain) can be added but may face IdP restrictions depending on SSO configuration.
  • On Azure Databricks, users can also be added via Microsoft Entra ID (Azure AD) groups synced through SCIM; manual additions outside SCIM may be overwritten on the next sync.
Bulk options:

  • CSV import: No (not documented)
  • Domain whitelisting: No (no automatic domain-based user add)
  • IdP provisioning: Yes; account-level SCIM requires Premium (workspace-level SCIM is deprecated but available on Standard)
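The two-step add flow (account-level creation, then workspace assignment) can also be scripted against Databricks' SCIM support. The sketch below only builds the request payloads; the user schema follows SCIM 2.0 core, but the workspace-assignment path shape is an assumption to verify against the current Databricks API reference before use:

```python
# Builds payloads for the two-step flow: account-level SCIM user creation,
# then the separate workspace assignment. Endpoint paths are assumptions.
SCIM_USER_SCHEMA = "urn:ietf:params:scim:schemas:core:2.0:User"

def account_user_payload(email, given="", family=""):
    """SCIM 2.0 body for adding a user at the account level (step 1)."""
    payload = {"schemas": [SCIM_USER_SCHEMA], "userName": email}
    if given or family:
        payload["name"] = {"givenName": given, "familyName": family}
    return payload

def workspace_assignment(account_id, workspace_id, principal_id):
    """Request sketch for the workspace-assignment step (step 2).
    The path below is a hypothetical shape -- confirm the real route."""
    return {
        "path": (f"/api/2.0/accounts/{account_id}/workspaces/{workspace_id}"
                 f"/permissionassignments/principals/{principal_id}"),
        "body": {"permissions": ["USER"]},
    }

print(account_user_payload("ada@example.com", "Ada", "Lovelace")["userName"])
# ada@example.com
```

Keeping both steps in one script is the point: it is the manual gap between them that produces users who exist in the account but can reach no workspace.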

How to remove or deactivate users

  • Can delete users: Yes
  • Delete/deactivate behavior: Databricks supports both deactivation and deletion. Deactivating a user prevents login and API access but preserves the user record and their owned objects. Deleting a user permanently removes the account record. Deletion is available in the Account Console but is irreversible; Databricks recommends deactivation for most offboarding scenarios to preserve audit trails and object ownership. SCIM-managed users are deactivated (not deleted) when removed from the IdP.
  1. Log in to the Databricks Account Console.
  2. Navigate to 'User management' → 'Users'.
  3. Locate the user by name or email.
  4. Click the three-dot menu (⋮) next to the user.
  5. Select 'Deactivate user'.
  6. Confirm the deactivation in the dialog.
  7. The user's status changes to 'Deactivated'; they can no longer log in or make API calls.
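Deactivation (as opposed to deletion) is also what a SCIM-driven IdP sends when a user is removed from the directory: a standard SCIM 2.0 PatchOp flipping the `active` attribute. A sketch of that request body, assuming Databricks' SCIM endpoint accepts the core `active` attribute:

```python
# SCIM 2.0 PatchOp body that marks a user inactive -- the same operation an
# IdP issues when a SCIM-managed user is removed from the directory.
PATCH_SCHEMA = "urn:ietf:params:scim:api:messages:2.0:PatchOp"

def deactivate_payload():
    """Body for a PATCH against the user's SCIM resource to deactivate them."""
    return {
        "schemas": [PATCH_SCHEMA],
        "Operations": [{"op": "replace", "path": "active", "value": False}],
    }
```

Note that this preserves the user record and object ownership, consistent with the audit-trail rationale above; a DELETE of the SCIM resource would not.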
Data impact:

  • Owned records: Notebooks, jobs, clusters, and other objects owned by the deactivated/deleted user remain in the workspace and are still accessible to users with appropriate permissions. Ownership is not automatically transferred; admins must manually reassign ownership of critical objects.
  • Shared content: Shared notebooks and folders remain accessible to other users who have been granted permissions. The deactivated user's personal folder contents remain, but the user cannot access them.
  • Integrations: Personal access tokens (PATs) issued to the user are invalidated upon deactivation. OAuth tokens expire per their TTL. Service principals are unaffected unless the user was the sole admin managing them.
  • License freed: Because Databricks is consumption-based (DBU pricing), deactivating a user stops their ability to consume compute, which reduces future DBU charges. There is no fixed per-seat license to release.

Watch out for:

  • Deleting a user is irreversible; the user record cannot be restored. Deactivation is preferred for offboarding.
  • SCIM-provisioned users deactivated via the IdP are marked inactive in Databricks but not deleted; their objects and history are preserved.
  • Owned jobs continue to run after user deactivation if they were already scheduled; admins should reassign job ownership before deactivating.
  • Unity Catalog object ownership (tables, schemas, catalogs) held by a deleted user must be manually transferred; orphaned ownership can block certain admin operations.
  • Deactivating a user at the account level also removes their access to all workspaces; there is no workspace-by-workspace deactivation toggle.

License and seat management

  • Jobs Compute (DBU): Compute for automated jobs, pipelines, and batch workloads. $0.07–$0.20+/DBU depending on cloud, instance type, and plan tier.
  • All-Purpose Compute (DBU): Interactive notebooks, ad-hoc analysis, collaborative development. $0.40–$0.65+/DBU depending on cloud and instance type.
  • SQL Warehouse, Serverless (DBU): Databricks SQL queries via serverless SQL warehouses. Varies by cloud; typically billed per second of DBU consumption.
  • Databricks Committed Use (DBCU) prepurchase: Pre-purchased DBU commitment redeemable across workload types, with discounts up to 37% vs on-demand. Negotiated; 1-year or 3-year terms.
  • Where to check usage: Account Console → Billing → Usage (provides DBU consumption by workspace, cluster, user, and workload type with date range filters)
  • How to identify unused seats: In Account Console → Billing → Usage, filter by user or principal to identify accounts with zero DBU consumption over a selected period. Workspace admins can also review cluster event logs and job run history to find inactive users. No built-in 'inactive user' report exists; admins must query system tables (system.billing.usage, system.access.audit) in Unity Catalog for user-level activity analysis.
  • Billing notes: Databricks charges are consumption-based (DBU), not per-seat. Cloud infrastructure costs (EC2, Azure VMs, GCP VMs) are billed separately by the cloud provider and typically add 50–200% on top of DBU charges. Standard plan is being phased out: AWS and GCP Standard workspaces must upgrade to Premium by October 2025; Azure Standard retires October 2026. Unity Catalog and account-level SCIM require Premium.
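Since there is no built-in inactive-user report, dormancy checks end up as ad-hoc analysis over usage data. A minimal sketch over exported usage rows (the record shape here is illustrative; in practice the rows would come from a query against `system.billing.usage`):

```python
from datetime import date

# Illustrative usage rows; real data would be exported from the
# system.billing.usage table via a SQL warehouse query.
usage = [
    {"user": "ada@example.com",   "dbus": 12.5, "day": date(2026, 3, 1)},
    {"user": "grace@example.com", "dbus": 0.0,  "day": date(2026, 3, 1)},
]
all_users = {"ada@example.com", "grace@example.com", "alan@example.com"}

def inactive_users(usage_rows, users, since):
    """Users with zero DBU consumption since the cutoff date."""
    active = {r["user"] for r in usage_rows if r["day"] >= since and r["dbus"] > 0}
    return sorted(users - active)

print(inactive_users(usage, all_users, date(2026, 2, 1)))
# ['alan@example.com', 'grace@example.com']
```

Note the comparison set must be the full account user list, not just users appearing in usage rows; fully dormant accounts generate no rows at all.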

The cost of manual management

Databricks billing is consumption-based (DBU), not per-seat, so unused accounts do not generate direct license waste. The real cost of manual access management is operational: there is no built-in inactive user report, and identifying dormant accounts requires querying system tables (system.access.audit) in Unity Catalog - itself a Premium-only feature.

When a user is deleted without prior cleanup, owned jobs continue running under orphaned principals and Unity Catalog object ownership must be manually transferred. These cleanup tasks scale poorly across large workspaces.

Cloud infrastructure costs (EC2, Azure VMs, GCP VMs) are billed separately by the cloud provider and typically add 50–200% on top of DBU charges, making runaway compute from unreviewed service principals a meaningful cost risk.

What IT admins are saying

Nested groups are not supported when syncing via Azure Entra ID SCIM - only flat group membership is provisioned. Organizations with deep group hierarchies in their IdP must flatten structures before sync or accept incomplete provisioning.
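Because only flat membership is provisioned, nested IdP groups have to be flattened into transitive member lists before sync. A minimal recursive sketch (the group hierarchy is a hypothetical example):

```python
# Flatten nested group membership into the flat lists SCIM will provision.
# The group structure is a hypothetical IdP hierarchy for illustration.
groups = {
    "data-platform": {"members": ["ada@example.com"], "nested": ["data-eng"]},
    "data-eng": {"members": ["grace@example.com"], "nested": []},
}

def flatten(group, seen=None):
    """Collect all transitive members of a group, guarding against cycles."""
    seen = seen if seen is not None else set()
    if group in seen:
        return set()
    seen.add(group)
    entry = groups[group]
    members = set(entry["members"])
    for child in entry["nested"]:
        members |= flatten(child, seen)
    return members

print(sorted(flatten("data-platform")))
# ['ada@example.com', 'grace@example.com']
```

The cycle guard matters: circular group references are legal in some directories and would otherwise recurse forever.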

Migrating from legacy workspace-level SCIM to account-level SCIM requires reconfiguring IdP endpoints and can introduce temporary provisioning gaps during the transition. Large organizations have also reported hitting workspace-level user and group count limits, requiring architectural changes to split users across multiple workspaces.

Personal access token (PAT) management is entirely manual - there is no native PAT expiry enforcement or bulk rotation tool in the UI, which is a recurring security concern in community discussions.
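With no native expiry enforcement, PAT hygiene typically becomes a scheduled check against token metadata. A sketch that flags tokens past a maximum age (the field names on the token dicts are illustrative, and the 90-day ceiling is a policy choice, not a Databricks default):

```python
from datetime import datetime, timedelta, timezone

MAX_AGE = timedelta(days=90)  # example policy, not a Databricks default

def stale_tokens(tokens, now=None):
    """Return comments of tokens older than MAX_AGE.
    Token dicts use illustrative field names, not a real API response shape."""
    now = now or datetime.now(timezone.utc)
    return [t["comment"] for t in tokens if now - t["created"] > MAX_AGE]

tokens = [
    {"comment": "ci-pipeline", "created": datetime(2025, 1, 1, tzinfo=timezone.utc)},
    {"comment": "adhoc",       "created": datetime(2026, 3, 1, tzinfo=timezone.utc)},
]
print(stale_tokens(tokens, now=datetime(2026, 3, 9, tzinfo=timezone.utc)))
# ['ci-pipeline']
```

Flagged tokens still need manual revocation and reissue; the check only surfaces them.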

Common complaints:

  • Nested groups are not supported when syncing via Azure Entra ID (Azure AD) SCIM; only flat group membership is provisioned.
  • Legacy workspace-level SCIM is deprecated and Databricks recommends migrating to account-level SCIM, but migration requires reconfiguring IdP SCIM endpoints and can cause temporary provisioning gaps.
  • Large organizations report hitting user and group count limits at the workspace level, requiring architectural changes to split users across multiple workspaces.
  • There is no built-in inactive user report; identifying unused accounts requires querying system tables (system.access.audit) which requires Unity Catalog Premium setup.
  • Deleting a user does not automatically reassign owned jobs or Unity Catalog object ownership, leaving orphaned resources that require manual cleanup.
  • Personal access token (PAT) management is manual; there is no native PAT expiry enforcement or bulk rotation tool in the UI.
  • Users added manually in the Account Console can be overwritten or deactivated unexpectedly if SCIM is also configured and the IdP does not include that user.
  • Workspace assignment after account-level user creation is a separate step that is easy to overlook, resulting in users who exist in the account but cannot access any workspace.

The decision

Manual management is viable for small teams with a single workspace and infrequent user changes. The two-step add flow (account level, then workspace assignment) is straightforward at low volume but becomes error-prone as workspace count grows.

For organizations running multiple workspaces or requiring consistent offboarding across every app in their data stack, account-level SCIM with an IdP is the recommended path. SCIM requires Premium and SSO as prerequisites - confirm both are in place before planning a migration.

Teams on Standard tier should factor the forced upgrade timeline (AWS/GCP by October 2025, Azure by October 2026) into any access management roadmap, since Unity Catalog and account-level SCIM are both Premium-only capabilities.

Bottom line

Databricks access management is functional but layered: account-level and workspace-level controls are independent, and every app or user added at one layer must be explicitly wired to the other.

Manual processes work at small scale but break down quickly when workspaces multiply, group structures grow, or offboarding requires audit-safe deactivation rather than deletion.

The absence of a native inactive user report and the manual PAT lifecycle are the two friction points most likely to create compliance exposure over time. Teams planning to scale should treat the Premium upgrade and account-level SCIM configuration as foundational prerequisites, not optional improvements.

Automate Databricks workflows without one-off scripts

Stitchflow builds and maintains end-to-end IT automation across your SaaS stack, including apps without APIs. Built for exactly how your company works, with human approvals where they matter.

  • Coverage for every app: 60+ integrations plus browser automation for apps without APIs
  • IT graph reconciliation across apps and your IdP
  • Less than a week to launch; maintained as APIs and admin consoles change
  • SOC 2 Type II certified; ~2 hours of your team's time


* Details sourced from official product documentation and admin references.
