Changes to User Management

This section covers changes between releases on the following topics:

  • Authorization to the platform

    • User roles

    • Permissions of roles

  • Required permissions

  • Authentication methods

  • User management

Release 8.10

Introducing project users and roles management

Note

This feature may not be available in all product editions. For more information on available features, see Compare Editions.

Beginning in this release, administrators can manage the access and roles of users of Dataprep by Trifacta, which includes basic access to the product and role-based access controls to types of objects created in the project.

A Dataprep admin can create and assign roles to project users.

  • A role is a set of privileges that you can assign to project users.

    Note

    Each current user or newly created user is automatically assigned the default role, which grants a set of privileges for all governed object types in the project.

    • Dataprep by Trifacta users may have one or more roles within the project.

    • Roles are created and assigned through the Roles page in the Admin console.

  • A privilege is a level of access to a type of user-defined Alteryx object, such as flows.

For more information on these distinctions, see Overview of Authorization.

For more information, see Roles Page.

For more information, see Privileges and Roles Reference.

Dataprep admin is a super user

The Dataprep admin role is the super-user role for the product. It is automatically assigned to anyone who is designated as the project owner in Google Cloud Platform.

Note

This role has owner access to user-created objects, such as flows and connections, within the project.

Note

The project owner is automatically granted the Dataprep admin role. This role can also be assigned to non-project owners; it grants a project user all of the privileges of the project owner within Dataprep by Trifacta. If the Dataprep admin role is unassigned from a project owner, it is automatically granted back on the project owner's next login.

Note

When roles are modified, some menu items may no longer be displayed to specific users because of their role assignments. Dataprep admins may receive inquiries about menu option availability; a user's assigned roles are a likely reason why a menu option is not available to that user.
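
Because the Dataprep admin role follows Google Cloud project ownership, one way to confirm which accounts receive it automatically is to list the owners in the project's IAM policy. This is a sketch only; the project ID is a placeholder:

    # List the members that hold the Owner role on the project.
    gcloud projects get-iam-policy my-project-id \
      --flatten="bindings[].members" \
      --filter="bindings.role:roles/owner" \
      --format="value(bindings.members)"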

Dataprep admin can edit any global connection

After an administrator has made a connection global (available to all users):

  • Any administrator can edit the connection.

  • All users can use the connection (existing functionality).

  • The connection cannot be made private again (existing functionality). The connection must be deleted and recreated instead.

Fine-grained sharing permissions on individual objects

Role-based access controls (RBAC) determine access to individual Alteryx objects, such as flows, connections, and plans, at a finer-grained level. This capability has been present in the product for a few releases. With this release, RBAC is elevated to project-level privileges within a role definition.

Tip

You can still change the permissions to a shared object for individual users. These fine-grained permissions can be assigned at the time of sharing by the object's owner or a Dataprep admin. They can also be changed at a later time.

Note

Project-level permissions that are defined through a user's assigned roles determine the maximum and default level of permissions that can be assigned when an object is shared. When you share an object, you can choose to set its access level to a lower level.

For more information, see Overview of Authorization.

For more information, see Overview of Sharing.

Release 8.1

Fine-grained sharing permissions on individual objects

Beginning in this release, you can change the permissions to a shared object for individual users. These fine-grained permissions can be assigned at the time of sharing by the object's Owner or a workspace admin. They can also be changed at a later time.

Note

In this release, fine-grained sharing permissions apply to flows and connections only.

For more information, see Overview of Sharing.

Release 8.0

Scheduled outputs now inherit service account settings

Note

This feature may not be available in all product editions. For more information on available features, see Compare Editions.

Previously, when companion service accounts were enabled for use, outputs for scheduled jobs that had already been created did not inherit the use of the companion service accounts. Without a companion service account, these scheduled outputs could not be executed, and their jobs failed with a No Dataflow service account provided error message. The issue and prior workaround are described below.

Beginning in Release 8.0, scheduled outputs that do not have a companion service account inherit the companion service account defined in the user's preferences.

Note

The workaround described below to fix scheduled outputs is no longer required.

See User Profile Page.

Release 7.10

Changes to IAM roles for service accounts

Note

This feature may not be available in all product editions. For more information on available features, see Compare Editions.

Recently, Google announced changes to the required permissions for IAM roles used by service accounts to access platform resources.

Note

The following changes will be deployed by Google on January 27, 2021. These changes are made by Google and are outside of the platform. All administrators of Dataprep by Trifacta should review the following changes to requirements and verify the impacts on their deployments.

Dataprep by Trifacta Premium customers

Steps:

  1. Log in as an administrator.

  2. From the left nav bar, select User menu > Admin console > Settings.

  3. Locate the following setting: Manage access to data using user IAM permissions. Check the current setting:

    Setting     Description
    Enabled     Users in your deployment are using IAM permissions. Please review and complete the following steps.
    Disabled    Users in your deployment are not affected. No further action is required. See "Other product editions" below for details.
    Default     The default setting is Disabled. See previous.

    For more information, see Dataprep Project Settings Page.

New requirements:

Note

With this change, users who connect to platform resources using IAM roles must meet one of the following requirements to run jobs on Dataflow.

  • The user must have the iam.serviceAccounts.actAs permission on a compute service account, which must be specified during job execution.

  • The user must have the iam.serviceAccounts.actAs permission specified at the project level or on the default compute service account.

  • Project owners are not affected.

Recommendations:

For uninterrupted service, please do one of the following:

  1. Project administrators can grant their users the iam.serviceAccounts.actAs permission on the default compute service account (see the example commands after this list):

    <project-number>-compute@developer.gserviceaccount.com
  2. (Preferred) Project administrators should provision compute service accounts of narrower scope for their users and educate users on how to use them.

    Warning

    If users are now using companion service accounts, the outputs of any scheduled jobs must be updated. See "Fixing schedules" below.
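
As a sketch of the first recommendation, the iam.serviceAccounts.actAs permission is contained in the predefined roles/iam.serviceAccountUser role, so it can be granted either on the default compute service account or at the project level with gcloud. The project number, project ID, and user email below are placeholders:

    # Grant the user permission to act as the default compute service account.
    # roles/iam.serviceAccountUser includes iam.serviceAccounts.actAs.
    gcloud iam service-accounts add-iam-policy-binding \
      <project-number>-compute@developer.gserviceaccount.com \
      --member="user:someone@example.com" \
      --role="roles/iam.serviceAccountUser"

    # Alternatively, grant the same role at the project level.
    gcloud projects add-iam-policy-binding my-project-id \
      --member="user:someone@example.com" \
      --role="roles/iam.serviceAccountUser"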

Fixing schedules:

If individual users are now using companion service accounts to run jobs, all affected schedules must be updated.

Steps:

Note

Each user should complete the following steps for each flow that they own.

  1. Identify if the flow contains a schedule:

    1. Open the flow in Flow View.

    2. At the top of the page, you may see one of the following icons.

    3. This icon indicates that there is an enabled schedule. Its outputs must be updated. Please see Step 2.

      ChangesToUserManagement-ScheduleIcon.png (enabled schedule icon)
    4. This icon indicates that there is a schedule, but it is disabled. Its outputs must still be updated. Please see Step 2.

      ChangesToUserManagement-ScheduleIcon-Disabled.png (disabled schedule icon)
    5. If neither icon is present, the flow does not have a schedule. Please go to the next flow.

  2. For flows that do contain enabled or disabled schedules, please do the following:

    1. Locate the scheduled outputs. In the Flow View canvas, these outputs are labeled Scheduled Output.

    2. Select a scheduled output.

    3. In the right panel, under Scheduled Destinations, click Edit.

    4. In the Scheduled publishing settings, click the Advanced Settings caret to open it.

    5. In the Service Account textbox, enter the new companion service account to use for the scheduled job.

  3. Repeat Step 2 for any other scheduled outputs in the flow.

  4. Repeat these steps for other scheduled flows that you own.

Other product editions

No action is required. Actions on Dataflow are already executed with the proper permissions.

For more information on Dataprep by Trifacta permissions, see Required Dataprep User Permissions.

Release 7.9

Manage access to data using IAM permissions

Note

This feature may not be available in all product editions. For more information on available features, see Compare Editions.

You can optionally configure the workspace to manage access to BigQuery and Cloud Storage data based on a fine-grained set of permissions in the user's IAM role.
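
As an illustration only, when this setting is enabled, a user's access to data follows their own IAM grants; for example, a user could be limited to read-only access on a single staging bucket with a bucket-level binding. The bucket name and user email below are placeholders:

    # Grant read-only access to objects in a single Cloud Storage bucket.
    gsutil iam ch user:someone@example.com:roles/storage.objectViewer gs://example-staging-bucket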

Reduced scope of minimum required permissions

Note

This feature may not be available in all product editions. For more information on available features, see Compare Editions.

Prior to Release 7.9, the required IAM permissions for Premium Edition were those shown in the following list. These permissions had to be included in any IAM role assigned to a product user.

However, many of these previously required permissions do not directly apply to use of the product. Beginning in Release 7.9, the list of required permissions has been reduced to only those required to access the Trifacta Application and other elements of the product.

Note

Permissions that are needed for access to other services are now considered optional for use of the product itself. In the list below, these optional permissions are primarily tied to use of BigQuery.

Note

The storage.buckets.list permission must be enabled at the project level. All other storage.* permissions only need to be enabled on the staging bucket.

Note

The bigquery.jobs.create permission is required if you wish to use BigQuery at all. The other permissions are optional and can be applied at the project or dataset level.

Area            Permission                        Release 7.8 and earlier    Release 7.9 and later
General         resourcemanager.projects.get      required                   required
                dataprep.projects.use             required                   required
BigQuery        bigquery.datasets.get             required                   optional
                bigquery.jobs.create              required                   optional (see Note 1)
                bigquery.tables.create            required                   optional
                bigquery.tables.get               required                   optional
                bigquery.tables.getData           required                   optional
                bigquery.tables.list              required                   optional
                compute.machineTypes.get          required                   required
                dataflow.jobs.create              required                   required
                dataflow.jobs.get                 required                   required
                dataflow.messages.list            required                   required
                dataflow.metrics.get              required                   required
base storage    storage.buckets.get               required                   required
                storage.buckets.list              required                   required (see Note 2)
                storage.objects.create            required                   required
                storage.objects.delete            required                   required
                storage.objects.get               required                   required
                storage.objects.list              required                   required
                storage.objects.update            required                   optional

Note 1: The bigquery.jobs.create permission is required if you wish to use BigQuery at all.

Note 2: The storage.buckets.list permission must be enabled at the project level.
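
As an illustration only (the role ID, project ID, and title are assumptions, and this assumes all of the listed permissions are supported in custom roles), a custom IAM role containing just the reduced required set could be defined with gcloud:

    # Create a custom role containing only the permissions marked as required for Release 7.9 and later.
    gcloud iam roles create dataprepMinimal \
      --project=my-project-id \
      --title="Dataprep minimal access" \
      --permissions=resourcemanager.projects.get,dataprep.projects.use,compute.machineTypes.get,dataflow.jobs.create,dataflow.jobs.get,dataflow.messages.list,dataflow.metrics.get,storage.buckets.get,storage.buckets.list,storage.objects.create,storage.objects.delete,storage.objects.get,storage.objects.list \
      --stage=GA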

New permissions

The following permissions are newly tracked for use with Premium Edition:

Note

Please verify that any newly required permissions are added to user roles.

Area            Permission                        Release 7.8 and earlier    Release 7.9 and later
BigQuery        bigquery.tables.delete            n/a                        optional (see Note)
                bigquery.datasets.create          n/a                        optional
                bigquery.datasets.update          n/a                        optional
                dataflow.jobs.cancel              n/a                        optional

Note: The bigquery.tables.delete permission is required on a table if you wish to publish to it. Otherwise, the table is reported as being read-only. For more information, see Required Dataprep User Permissions.
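
To check which of these permissions a given account effectively holds on the project, one option is the Resource Manager testIamPermissions method, which returns the subset of the listed permissions that the caller has. The project ID below is a placeholder, and the call tests the currently authenticated gcloud account:

    # Test which of the listed permissions the caller holds on the project.
    curl -X POST \
      -H "Authorization: Bearer $(gcloud auth print-access-token)" \
      -H "Content-Type: application/json" \
      -d '{"permissions": ["bigquery.tables.delete", "bigquery.datasets.create", "bigquery.datasets.update", "dataflow.jobs.cancel"]}' \
      "https://cloudresourcemanager.googleapis.com/v1/projects/my-project-id:testIamPermissions"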

For more information, see Required Dataprep User Permissions.