Workspaces
Workspace provisioning allows you to create, update, or delete child workspaces.
Multitenancy in GoodData
See Multitenancy: One Platform, Many Customers to learn more about how to leverage child workspaces in GoodData.
You can provision child workspaces using either the full load or the incremental load method. Each method requires a specific input type.
Usage
Start by importing and initializing the WorkspaceProvisioner.
from gooddata_pipelines import WorkspaceProvisioner
host = "http://localhost:3000"
token = "some_user_token"
# Initialize the provisioner with GoodData credentials
provisioner = WorkspaceProvisioner.create(host=host, token=token)
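Hardcoded credentials are fine for local experimentation, but in a real pipeline you will typically read them from the environment. A minimal sketch, assuming the variable names GOODDATA_HOST and GOODDATA_TOKEN (both hypothetical; use whatever your deployment defines):
import os
from gooddata_pipelines import WorkspaceProvisioner
# Hypothetical environment variable names - adjust to your setup
host = os.environ["GOODDATA_HOST"]
token = os.environ["GOODDATA_TOKEN"]
provisioner = WorkspaceProvisioner.create(host=host, token=token)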
Then validate your data using the input model that corresponds to the provisioned resource and the selected workflow type: WorkspaceFullLoad if you intend to run the provisioning in full load mode, or WorkspaceIncrementalLoad if you want to provision incrementally.
The models expect the following fields:
- parent_id: ID of the parent workspace.
- workspace_id: ID of the child workspace.
- workspace_name: Name of the child workspace.
- workspace_data_filter_id: ID of the workspace data filter to apply (must already exist on the parent workspace).
- workspace_data_filter_values: List of filter values that determine which data this workspace can access.
- is_active: Deletion flag; set it to False to mark the workspace for deletion. Present only in the IncrementalLoad models.
Note on IDs
Each ID may contain only a restricted set of characters. See Workspace Object Identification to learn more about object identifiers.
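If you derive workspace IDs from external data such as customer names, it can help to normalize them before validation. A rough sketch of such a helper; the character set below (letters, digits, dot, underscore, dash) is an assumption, so verify it against Workspace Object Identification:
import re
def sanitize_id(raw: str) -> str:
    """Hypothetical helper: lowercase and replace disallowed characters.

    Assumes IDs may contain letters, digits, ".", "_", and "-";
    check the allowed characters in the Workspace Object Identification docs.
    """
    return re.sub(r"[^a-zA-Z0-9._-]", "_", raw.strip()).lower()
print(sanitize_id("Acme Corp / EU"))  # acme_corp___eu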
Use the appropriate model to validate your data:
# Add the model to the imports
from gooddata_pipelines import WorkspaceFullLoad, WorkspaceProvisioner
host = "http://localhost:3000"
token = "some_user_token"
# Initialize the provisioner with GoodData credentials
provisioner = WorkspaceProvisioner.create(host=host, token=token)
# Load your data
raw_data = [
    {
        "parent_id": "parent_workspace_id",
        "workspace_id": "workspace_id_1",
        "workspace_name": "Workspace 1",
        "workspace_data_filter_id": "data_filter_id",
        "workspace_data_filter_values": ["workspace_data_filter_value_1"],
    },
]

# Validate the data
validated_data = [
    WorkspaceFullLoad(
        parent_id=item["parent_id"],
        workspace_id=item["workspace_id"],
        workspace_name=item["workspace_name"],
        workspace_data_filter_id=item["workspace_data_filter_id"],
        workspace_data_filter_values=item["workspace_data_filter_values"],
    )
    for item in raw_data
]
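In the snippet above, raw_data is hardcoded. In practice you might load it from a file or a database query instead. A sketch of one option, reading a hypothetical workspaces.csv whose columns match the model fields, with the filter values stored as a comma-separated string:
import csv
from gooddata_pipelines import WorkspaceFullLoad
# Hypothetical input file with columns:
# parent_id,workspace_id,workspace_name,workspace_data_filter_id,workspace_data_filter_values
with open("workspaces.csv", newline="", encoding="utf-8") as file:
    validated_data = [
        WorkspaceFullLoad(
            parent_id=row["parent_id"],
            workspace_id=row["workspace_id"],
            workspace_name=row["workspace_name"],
            workspace_data_filter_id=row["workspace_data_filter_id"],
            # Split "value_1,value_2" into a list of values
            workspace_data_filter_values=row["workspace_data_filter_values"].split(","),
        )
        for row in csv.DictReader(file)
    ]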
With the provisioner initialized and your data validated, you can run the provisioning:
# Import, initialize, validate...
...
# Run the provisioning method
provisioner.full_load(validated_data)
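The incremental counterpart is provisioner.incremental_load, which takes a list of WorkspaceIncrementalLoad items; see the Examples section below.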
Workspace Data Filters
If you want to apply Workspace Data Filters to a child workspace, the filter must be set up on the parent workspace before you run the provisioning.
Workspace Data Filters
See Set Up Data Filters in Workspaces to learn how workspace data filters work in GoodData.
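If the filter does not exist yet, you can create it on the parent workspace first, for example with gooddata-sdk. The sketch below is an assumption about the SDK's declarative workspace data filter API (GoodDataSdk, CatalogDeclarativeWorkspaceDataFilter, CatalogWorkspaceIdentifier, and the get/put methods), and the column_name value is hypothetical; verify all of these against the gooddata-sdk documentation for your version:
from gooddata_sdk import (
    CatalogDeclarativeWorkspaceDataFilter,
    CatalogWorkspaceIdentifier,
    GoodDataSdk,
)
sdk = GoodDataSdk.create(host="http://localhost:3000", token="some_user_token")
# Declare the filter on the parent workspace; provisioning later fills in
# the per-child settings from workspace_data_filter_values.
data_filters = sdk.catalog_workspace.get_declarative_workspace_data_filters()
data_filters.workspace_data_filters.append(
    CatalogDeclarativeWorkspaceDataFilter(
        id="data_filter_id",
        title="Data filter",
        column_name="customer_id",  # hypothetical column in your data source
        workspace_data_filter_settings=[],
        workspace=CatalogWorkspaceIdentifier(id="parent_workspace_id"),
    )
)
sdk.catalog_workspace.put_declarative_workspace_data_filters(data_filters)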
Examples
Here are complete examples of the full load and incremental load workspace provisioning workflows:
Full Load
import logging
from gooddata_pipelines import WorkspaceFullLoad, WorkspaceProvisioner
host = "http://localhost:3000"
token = "some_user_token"
# Initialize the provisioner
provisioner = WorkspaceProvisioner.create(host=host, token=token)
# Optional: set up logging and subscribe to logs emitted by the provisioner
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
provisioner.logger.subscribe(logger)
# Prepare your data
raw_data: list[dict] = [
    {
        "parent_id": "parent_workspace_id",
        "workspace_id": "workspace_id_1",
        "workspace_name": "Workspace 1",
        "workspace_data_filter_id": "data_filter_id",
        "workspace_data_filter_values": ["workspace_data_filter_value_1"],
    },
    {
        "parent_id": "parent_workspace_id",
        "workspace_id": "workspace_id_2",
        "workspace_name": "Workspace 2",
        "workspace_data_filter_id": "data_filter_id",
        "workspace_data_filter_values": ["workspace_data_filter_value_2"],
    },
    {
        "parent_id": "parent_workspace_id",
        "workspace_id": "child_workspace_id_1",
        "workspace_name": "Workspace 3",
        "workspace_data_filter_id": "data_filter_id",
        "workspace_data_filter_values": ["workspace_data_filter_value_3"],
    },
]

# Validate the data
validated_data = [
    WorkspaceFullLoad(
        parent_id=item["parent_id"],
        workspace_id=item["workspace_id"],
        workspace_name=item["workspace_name"],
        workspace_data_filter_id=item["workspace_data_filter_id"],
        workspace_data_filter_values=item["workspace_data_filter_values"],
    )
    for item in raw_data
]
# Run the provisioning with the validated data
provisioner.full_load(validated_data)
Incremental Load
import logging
from gooddata_pipelines import WorkspaceIncrementalLoad, WorkspaceProvisioner
host = "http://localhost:3000"
token = "some_user_token"
# Initialize the provisioner
provisioner = WorkspaceProvisioner.create(host=host, token=token)
# Optional: set up logging and subscribe to logs emitted by the provisioner
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
provisioner.logger.subscribe(logger)
# Prepare your data
raw_data: list[dict] = [
    {
        "parent_id": "parent_workspace_id",
        "workspace_id": "workspace_id_1",
        "workspace_name": "Workspace 1",
        "workspace_data_filter_id": "data_filter_id",
        "workspace_data_filter_values": ["workspace_data_filter_value_1"],
        "is_active": True,
    },
    {
        "parent_id": "parent_workspace_id",
        "workspace_id": "workspace_id_2",
        "workspace_name": "Workspace 2",
        "workspace_data_filter_id": "data_filter_id",
        "workspace_data_filter_values": ["workspace_data_filter_value_2"],
        "is_active": True,
    },
    {
        "parent_id": "parent_workspace_id",
        "workspace_id": "child_workspace_id_1",
        "workspace_name": "Workspace 3",
        "workspace_data_filter_id": "data_filter_id",
        "workspace_data_filter_values": ["workspace_data_filter_value_3"],
        "is_active": False,  # This will mark the workspace for deletion
    },
]

# Validate the data
validated_data = [
    WorkspaceIncrementalLoad(
        parent_id=item["parent_id"],
        workspace_id=item["workspace_id"],
        workspace_name=item["workspace_name"],
        workspace_data_filter_id=item["workspace_data_filter_id"],
        workspace_data_filter_values=item["workspace_data_filter_values"],
        is_active=item["is_active"],
    )
    for item in raw_data
]
# Run the provisioning with the validated data
provisioner.incremental_load(validated_data)
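Unlike a full load, an incremental load only touches the workspaces present in the input: rows with is_active set to True are created or updated, and rows with is_active set to False are deleted.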