
BigQuery User Management Guide

Manual workflow

How to add, remove, and manage users with operational caveats that matter in production.

Updated Mar 4, 2026

Summary and recommendation

BigQuery user management can be run manually, but complexity usually increases with role models, licensing gates, and offboarding dependencies. This guide gives the exact mechanics and where automation has the biggest impact.

BigQuery does not manage user accounts or issue its own credentials. Access control runs entirely through Google Cloud IAM, with permissions assignable at the organization, folder, project, dataset, table, or column level.

Because of this layered model, every app that touches BigQuery data inherits whatever IAM bindings are in place at the time - making role hygiene a continuous operational concern, not a one-time setup task.

Six predefined roles cover most use cases: Admin, Data Owner, Data Editor, Data Viewer, Job User, and User; two narrower roles, Metadata Viewer and Read Session User, round out the predefined set. A critical non-obvious dependency: Data Viewer alone does not allow running queries. Users also need roles/bigquery.jobUser at the project level, or they will hit permission errors despite having data access.
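A minimal sketch of that pairing via the gcloud CLI (the project ID and email are placeholders, not values from this guide):

```shell
# Grant read access to data AND the ability to run query jobs.
# roles/bigquery.dataViewer alone is not enough to execute queries.
PROJECT_ID="my-analytics-project"    # placeholder
MEMBER="user:analyst@example.com"    # placeholder

gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="$MEMBER" \
  --role="roles/bigquery.dataViewer"

gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="$MEMBER" \
  --role="roles/bigquery.jobUser"
```

Granting only the first binding produces the permission errors described above the first time the user runs a query.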

Column-level security requires a separate Data Catalog policy tag configuration. Row-level security is managed via BigQuery row access policies through DDL or the API - neither is visible or configurable from the standard IAM console.
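As an illustration of the DDL path for row-level security, a minimal sketch (the dataset, table, filter column, and principal are hypothetical):

```sql
-- Restrict a hypothetical sales table so the named analyst only sees
-- US rows. This lives in BigQuery DDL, not the IAM console.
CREATE ROW ACCESS POLICY us_only_filter
ON mydataset.sales
GRANT TO ('user:analyst@example.com')
FILTER USING (region = 'US');
```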

Quick facts

  • Admin console path: Google Cloud Console > IAM & Admin > IAM (for project-level) or BigQuery Studio > dataset > Share
  • Admin console URL: Official docs
  • SCIM available: No
  • SCIM tier required: Google Cloud (usage-based)
  • SSO prerequisite: No

User types and roles

BigQuery Admin (roles/bigquery.admin)
  • Permissions: Full control over all BigQuery resources in the project: create/delete datasets and tables, run jobs, manage IAM policies, view billing data.
  • Cannot do: Manage Google Cloud project billing or create new GCP projects without additional project-level IAM roles.
  • Plan required: Any Google Cloud project (pay-per-use or flat-rate)
  • Seat cost: No per-seat cost; IAM role assignment is free. BigQuery charges are usage-based.
  • Watch out for: Granting bigquery.admin at project level gives full access to all datasets in that project, including sensitive ones. Prefer dataset-level grants where possible.

BigQuery Data Owner (roles/bigquery.dataOwner)
  • Permissions: Read, write, and delete access to datasets and their contents. Can update dataset metadata and grant access to others on datasets they own.
  • Cannot do: Run jobs (queries) without also having the bigquery.jobUser or bigquery.user role. Cannot manage project-level IAM.
  • Plan required: Any Google Cloud project
  • Seat cost: No per-seat cost.
  • Watch out for: Data Owner alone does not allow running queries; it must be combined with a job-running role.

BigQuery Data Editor (roles/bigquery.dataEditor)
  • Permissions: Read and write data within datasets (create/update/delete tables, rows). Can list datasets.
  • Cannot do: Delete datasets. Cannot run jobs without bigquery.jobUser. Cannot manage IAM on datasets.
  • Plan required: Any Google Cloud project
  • Seat cost: No per-seat cost.
  • Watch out for: Cannot delete datasets; only Data Owner or Admin can delete datasets.

BigQuery Data Viewer (roles/bigquery.dataViewer)
  • Permissions: Read-only access to dataset metadata and table data. Can list tables and read table data.
  • Cannot do: Write data, run jobs, or modify any resources. Cannot run queries without bigquery.jobUser.
  • Plan required: Any Google Cloud project
  • Seat cost: No per-seat cost.
  • Watch out for: Read-only on data does not mean read-only on query costs; the job runner's project is billed for queries.

BigQuery Job User (roles/bigquery.jobUser)
  • Permissions: Can run jobs (queries, load jobs, export jobs) within the project. Billed to the project.
  • Cannot do: Access any dataset data unless also granted a data role on the relevant dataset.
  • Plan required: Any Google Cloud project
  • Seat cost: No per-seat cost; query costs are billed to the project.
  • Watch out for: Commonly paired with bigquery.dataViewer for analyst access. Without a data role, the user can run jobs but will receive permission errors on datasets.

BigQuery User (roles/bigquery.user)
  • Permissions: Can run jobs, list datasets in the project, and create new datasets. Includes bigquery.jobUser permissions plus dataset listing.
  • Cannot do: Read table data in datasets unless granted a data role on those datasets.
  • Plan required: Any Google Cloud project
  • Seat cost: No per-seat cost.
  • Watch out for: bigquery.user allows creating datasets, which may be undesirable in tightly controlled environments.

BigQuery Metadata Viewer (roles/bigquery.metadataViewer)
  • Permissions: Can list and view metadata for all datasets, tables, and views in the project.
  • Cannot do: Read row data, run jobs, or modify resources.
  • Plan required: Any Google Cloud project
  • Seat cost: No per-seat cost.
  • Watch out for: Useful for data catalog and discovery use cases without exposing actual data.

BigQuery Read Session User (roles/bigquery.readSessionUser)
  • Permissions: Can create read sessions via the BigQuery Storage Read API.
  • Cannot do: Run standard SQL jobs or access datasets without additional roles.
  • Plan required: Any Google Cloud project
  • Seat cost: No per-seat cost; the Storage Read API has separate usage pricing.
  • Watch out for: Required specifically for BigQuery Storage API consumers (e.g., Apache Spark, Arrow integrations).

Permission model

  • Model type: hybrid
  • Description: BigQuery uses Google Cloud IAM for access control. Permissions can be granted at the organization, folder, project, dataset, table, or column level. Predefined roles bundle common permission sets. Custom roles can be created from individual IAM permissions. Dataset-level access can also be managed via the BigQuery API or Console Share dialog using legacy roles (OWNER, WRITER, READER) or IAM conditions. Column-level security is enforced via BigQuery policy tags (requires Data Catalog).
  • Custom roles: Yes
  • Custom roles plan: Available on all Google Cloud plans at no additional IAM cost. Requires resourcemanager.projects.setIamPolicy or equivalent permission to create and assign.
  • Granularity: Organization > Folder > Project > Dataset > Table > Column (via policy tags). Row-level security is also available via row access policies.
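A sketch of creating a custom role from individual permissions (the role ID, project, and permission list are illustrative, not a recommendation from this guide):

```shell
# Custom role bundling only what a read-only analyst needs:
# run jobs, read table data, list tables, and view dataset metadata.
gcloud iam roles create analystReadOnly \
  --project="my-analytics-project" \
  --title="Analyst Read Only" \
  --permissions="bigquery.jobs.create,bigquery.tables.getData,bigquery.tables.list,bigquery.datasets.get"
```

The caller needs resourcemanager.projects.setIamPolicy (or equivalent) to assign the resulting role, as noted above.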

How to add users

  1. Navigate to https://console.cloud.google.com/iam-admin/iam and select the target project.
  2. Click 'Grant Access' (previously 'Add').
  3. In the 'New principals' field, enter the user's Google account email, Google Group email, service account, or Cloud Identity domain.
  4. Select one or more BigQuery IAM roles from the 'Select a role' dropdown (e.g., roles/bigquery.dataViewer + roles/bigquery.jobUser).
  5. Optionally add IAM conditions to restrict access by resource, time, or other attributes.
  6. Click 'Save'.
  7. For dataset-level access: open BigQuery Studio at https://console.cloud.google.com/bigquery, select the dataset, click 'Sharing' > 'Permissions', then 'Add Principal', enter the principal and role, and save.
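Steps 1-6 can also be sketched as a single CLI call, including the optional IAM condition from step 5 (the project, principal, and expiry timestamp are placeholders):

```shell
# Grant dataset read access that automatically expires at end of 2026.
# The --condition flag takes a CEL expression plus a title.
gcloud projects add-iam-policy-binding "my-analytics-project" \
  --member="user:contractor@example.com" \
  --role="roles/bigquery.dataViewer" \
  --condition='expression=request.time < timestamp("2026-12-31T00:00:00Z"),title=expires-eoy-2026'
```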

Required fields: Principal identifier (Google account email, Google Group, service account email, or Cloud Identity domain), At least one IAM role

Watch out for:

  • The principal must already have a Google account or be a Google Workspace / Cloud Identity managed account. BigQuery cannot create user accounts itself.
  • Project-level IAM grants apply to all datasets in the project unless overridden at the dataset level.
  • Dataset-level grants are additive with project-level grants; there is no deny at dataset level to override a project-level grant (use IAM Deny policies for explicit denials, available in preview).
  • Granting access to an external (non-org) Google account is possible but may be restricted by org policy constraints/iam.allowedPolicyMemberDomains.
  • Column-level security requires creating policy tags in Data Catalog and assigning the Fine-Grained Reader role separately.
  • Row-level security requires creating row access policies via DDL or the API; it is not managed through the IAM console.
Bulk options:

  • CSV import: No. Not documented.
  • Domain whitelisting: Yes. Automatic domain-based user add.
  • IdP provisioning: Yes. Google Workspace or Cloud Identity (Free or Premium). SCIM provisioning to Cloud Identity is supported via Okta, Azure AD/Entra ID, and other IdPs, which then makes those identities available for BigQuery IAM grants. BigQuery itself has no direct SCIM endpoint.

How to remove or deactivate users

  • Can delete users: No
  • Delete/deactivate behavior: BigQuery does not manage user accounts. Removing access means revoking IAM role bindings for the principal on the project, folder, or dataset. The Google account itself is managed in Google Workspace or Cloud Identity. Disabling or deleting the Google account in Cloud Identity/Workspace prevents all GCP access including BigQuery. Revoking IAM bindings removes BigQuery access without affecting the account.
  1. To revoke project-level access: navigate to https://console.cloud.google.com/iam-admin/iam, locate the principal, click the edit (pencil) icon, remove the relevant BigQuery roles, and click 'Save'. Alternatively, click the delete (trash) icon to remove all project-level roles for that principal.
  2. To revoke dataset-level access: open BigQuery Studio, select the dataset, click 'Sharing' > 'Permissions', find the principal, and click the remove icon next to their role.
  3. To disable the underlying Google account (full access revocation across GCP): go to Google Workspace Admin Console (admin.google.com) > Directory > Users, select the user, and click 'Suspend user' or 'Delete user'.
  4. To revoke access via gcloud CLI: run 'gcloud projects remove-iam-policy-binding PROJECT_ID --member=user:EMAIL --role=ROLE_ID'.
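Because a principal may hold several roles, a common offboarding pattern is to enumerate every project-level binding for the user and revoke each one. A sketch, with placeholder project ID and email:

```shell
PROJECT_ID="my-analytics-project"     # placeholder
MEMBER="user:departing@example.com"   # placeholder

# List every role bound to the departing user at the project level,
# then remove each binding.
for role in $(gcloud projects get-iam-policy "$PROJECT_ID" \
    --flatten="bindings[].members" \
    --filter="bindings.members:$MEMBER" \
    --format="value(bindings.role)"); do
  gcloud projects remove-iam-policy-binding "$PROJECT_ID" \
    --member="$MEMBER" --role="$role"
done
```

Note this covers project-level IAM only; dataset-level grants and legacy dataset ACLs must still be audited separately.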
Data impact:

  • Owned records: BigQuery tables, datasets, views, and saved queries are owned by the project, not by individual users. Revoking a user's IAM access does not delete or transfer any data. All resources remain intact in the project.
  • Shared content: Shared datasets, authorized views, and shared queries remain accessible to other principals who have been granted access. Removing one user's access does not affect other users' access.
  • Integrations: Scheduled queries and data transfer jobs created by the user are associated with the user's identity (or a service account). If the user's account is deleted or suspended, scheduled queries owned by that user may fail. Transfer ownership of scheduled queries before removing the account.
  • License freed: BigQuery is usage-based, not seat-licensed. Revoking access does not free a paid seat. Future query and storage costs attributed to that principal will cease, but there is no seat license to reclaim.

Watch out for:

  • Scheduled queries (Data Transfer Service jobs) run as the creating user's credentials. Deleting or suspending that user will cause those jobs to fail. Re-own or recreate them under a service account before removing the user.
  • If the user created datasets with themselves as OWNER via legacy dataset ACLs, those ACL entries remain but become orphaned. Audit dataset ACLs after removing a user.
  • Removing a user from IAM does not immediately invalidate active OAuth tokens or short-lived credentials; token expiry (typically 1 hour) is required for full revocation unless using token revocation APIs.
  • Org policy constraints/iam.allowedPolicyMemberDomains can prevent re-adding external accounts after removal.
  • BigQuery audit logs (Cloud Audit Logs) retain records of the removed user's past actions regardless of account status.

License and seat management

  • On-demand (pay-per-query): Query pricing charged per TiB of data processed; storage charged per GB/month. No per-user or per-seat fee; any number of IAM principals can be granted access. Cost: approximately $6.25 per TiB queried (first 1 TiB/month free). Storage: $0.02/GB/month active, $0.01/GB/month long-term. Prices vary by region.
  • BigQuery editions (Standard, Enterprise, Enterprise Plus), capacity-based: Compute capacity purchased as slots (virtual CPUs); autoscale editions available. Per-slot pricing replaces per-query pricing; storage billed separately. No per-user seat cost. Cost: Standard $0.04/slot-hour (autoscale), Enterprise $0.06/slot-hour, Enterprise Plus $0.10/slot-hour. Committed use discounts available for 1-year and 3-year terms.
  • BigQuery ML, BI Engine, Omni: Additional features with separate pricing. BI Engine: reservation-based in-memory analysis. Omni: cross-cloud queries. ML: model training costs. Cost varies by feature and usage; see https://cloud.google.com/bigquery/pricing for current rates.
  • Where to check usage: Google Cloud Console > Billing > Reports (filter by BigQuery service) or Cloud Console > BigQuery > Admin > Capacity Management for slot usage. IAM principal-level query costs visible via INFORMATION_SCHEMA.JOBS_BY_PROJECT or Cloud Audit Logs.
  • How to identify unused seats: Query INFORMATION_SCHEMA.JOBS_BY_USER or INFORMATION_SCHEMA.JOBS_BY_PROJECT to identify principals who have not run jobs in a given period. Cross-reference with IAM policy to find principals with roles but no recent job activity. No built-in 'inactive user' report in the console.
  • Billing notes: BigQuery has no per-seat licensing. All costs are usage-based (compute and storage). Granting or revoking IAM access has no direct billing impact. Costs are attributed to the project where jobs run, not to the individual user's account. Use project-level budgets and alerts in Cloud Billing to monitor spend.
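A sketch of the unused-access query described above (the region qualifier and 90-day lookback are illustrative choices, not values from the source):

```sql
-- Last job per principal over the past 90 days. Principals holding IAM
-- roles but absent from this result set are candidates for access review.
SELECT
  user_email,
  MAX(creation_time) AS last_job_time,
  COUNT(*) AS jobs_run
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 90 DAY)
GROUP BY user_email
ORDER BY last_job_time;
```

Cross-referencing user_email values against the IAM policy export closes the loop; there is no console feature that does this for you.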

The cost of manual management

BigQuery has no per-seat licensing. All costs are usage-based: compute (per TiB queried on-demand, or per slot-hour on capacity editions) and storage (per GB/month). Granting or revoking IAM access has no direct billing impact.

The operational cost of manual access management is query-side: identifying who has access but isn't using it requires custom INFORMATION_SCHEMA.JOBS_BY_USER queries. There is no built-in inactive-user report in the console. Without that discipline, over-provisioned roles accumulate silently and every app querying shared datasets may be running under broader permissions than intended.

Scheduled queries (Data Transfer Service jobs) run as the creating user's credentials by default. When that user is offboarded, those jobs fail. Re-owning them to a service account before removal is a manual step with no automated prompt.
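One hedged sketch of the re-owning step using the bq CLI (the location, service account, and transfer config resource name are placeholders; verify flag support against current bq documentation):

```shell
# Find the scheduled query's transfer config ID in the relevant location.
bq ls --transfer_config --transfer_location=us

# Re-point the config at a service account instead of the departing
# user's credentials, so offboarding does not break the job.
bq update --transfer_config \
  --update_credentials \
  --service_account_name=etl-runner@my-project.iam.gserviceaccount.com \
  projects/12345/locations/us/transferConfigs/abcd-1234
```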

What IT admins are saying

The most consistent friction point reported by practitioners is the dual-layer admin surface: Google Workspace or Cloud Identity for account lifecycle, and the GCP/BigQuery console for IAM role assignment.

These are separate consoles with separate admin roles, and neither is aware of the other's state by default.

Dataset-level legacy ACLs (OWNER/WRITER/READER) and project-level IAM roles coexist in the same environment. Effective permissions are the union of both, which makes auditing non-trivial - especially when a user has been removed from IAM but retains a legacy dataset ACL entry.

OAuth token expiry adds a latency gap to offboarding: revoking IAM bindings does not immediately invalidate active tokens. Full revocation requires waiting for token expiry (typically up to one hour) unless token revocation APIs are called explicitly.

Common complaints:

  • No direct BigQuery SCIM endpoint; user provisioning must go through Google Cloud Identity or Google Workspace, adding an extra layer of management.
  • Must manage via Google Cloud Identity layer, which requires a separate admin console (admin.google.com) from the BigQuery/GCP console.
  • Scheduled queries are tied to the creating user's identity rather than a service account by default, causing failures when users are offboarded.
  • No built-in inactive user report; identifying unused access requires custom INFORMATION_SCHEMA queries.
  • Dataset-level ACLs (legacy OWNER/WRITER/READER) and project-level IAM roles coexist, creating confusion about effective permissions.
  • Column-level security requires a separate Data Catalog policy tag setup, which is not obvious from the BigQuery console alone.
  • IAM Deny policies (needed to override inherited project-level grants at dataset level) remain in preview and are not generally available for all customers.
  • External (non-org) Google accounts can be granted BigQuery access unless explicitly blocked by org policy, which is a common misconfiguration risk.

The decision

Manual management is workable for small, stable teams where dataset access is coarse-grained and changes infrequently. The IAM console is functional, role assignments are auditable via Cloud Audit Logs, and the predefined roles cover most analyst and engineering access patterns without custom configuration.

The model breaks down at scale in three specific ways: offboarding requires coordinated action across Cloud Identity, project-level IAM, and dataset-level ACLs with no single-pane view; scheduled query ownership creates hidden dependencies on individual user accounts; and column- or row-level security requires tooling (Data Catalog, DDL) that sits outside the standard IAM workflow.

Teams managing more than a handful of BigQuery users, or operating in environments with frequent role changes, will find the lack of a native inactive-user report and the absence of a unified offboarding checklist to be the sharpest edges.

Bottom line

BigQuery's access model is powerful and granular, but it is not self-contained. Every app and analyst workflow that depends on BigQuery data is only as well-governed as the IAM bindings and Cloud Identity accounts behind it.

The predefined roles are well-designed, but the operational burden - dual consoles, no inactive-user reporting, legacy ACL coexistence, and scheduled query ownership risk - means that manual management requires deliberate process discipline to stay clean.

Teams that treat IAM hygiene as a periodic audit task rather than a continuous workflow will accumulate access debt that is difficult to unwind without custom INFORMATION_SCHEMA queries and cross-system reconciliation.

Automate BigQuery workflows without one-off scripts

Stitchflow builds and maintains end-to-end IT automation across your SaaS stack, including apps without APIs. Built for exactly how your company works, with human approvals where they matter.

  • Every app coverage, including apps without APIs
  • 60+ app integrations plus browser automation for apps without APIs
  • IT graph reconciliation across apps and your IdP
  • Less than a week to launch, maintained as APIs and admin consoles change
  • SOC 2 Type II. ~2 hours of your team's time


* Details sourced from official product documentation and admin references.
