GoodData MCP Server
Experimental Feature
This is an experimental feature that is still under active development. Its behavior may change significantly in future releases, or the feature may be removed entirely.
We are launching an experimental Model Context Protocol (MCP) server that exposes GoodData features to external MCP clients. For now, you can connect from your own MCP client (for example, a custom chatbot) and interact with GoodData using natural language.
AI is becoming part of everyday work, but connecting it safely to company data can be a challenge. Teams want their chatbots, IDE assistants, and custom agents to work with real data, while keeping access secure and consistent.
The GoodData MCP Server is our next step in that direction. It lets AI clients connect directly to governed analytics in GoodData, so they always reason with trusted metrics instead of raw or ad-hoc queries.
At the moment, this feature is available only through MCP clients (such as Cursor, ChatGPT with MCP support, or MCP Inspector).
How It Works
- The MCP server exposes platform capabilities to AI agents and developer tools.
- It supports the MCP protocol with structured error handling, multi-workspace isolation, and authentication.
- You connect with your own MCP client using an API token.
- Once connected, you can use natural language to trigger GoodData tools (list metrics, create alerts, etc.).
Available Tools
The MCP Server provides 24 tools organized into the following categories:
Workspace & Analytics Model
- get_workspace_info - Get workspace name, description, and organization information
- get_workspace_analytics - Retrieve complete analytics model (metrics, dashboards, visualizations, filter contexts, attribute hierarchies, export definitions, dashboard plugins)
- deploy_workspace_analytics - Deploy full analytics model to workspace (⚠️ WARNING: Replaces existing analytics)
Metadata Browsing (Workspace-scoped)
- list_workspace_metrics - List multiple metrics with optional RSQL filtering and pagination
- list_workspace_attributes - List multiple attributes with optional RSQL filtering and pagination
- list_workspace_visualizations - List multiple visualizations with optional RSQL filtering and pagination
- list_workspace_dashboards - List multiple dashboards with optional RSQL filtering and pagination
List operations:
- include - Request related entities in one call (e.g., facts, labels, datasets, visualizations)
- limit - Optional pagination (recommended: 10-50 for quick browsing, 100+ for comprehensive lists)
- rsql_filter - Optional RSQL filtering (e.g., title=like=*revenue*,tags=in=('finance','kpi'))
All list_* tools support the include parameter to fetch related entities in a single request.
Examples of valid includes:
- Metrics: ["facts", "attributes"]
- Dashboards: ["visualizations"]
- Attributes: ["labels", "datasets"]
Example usage:
list_workspace_metrics(rsql_filter="id==revenue_total")
list_workspace_dashboards(rsql_filter="id==sales_dashboard", include=["visualizations"])
list_workspace_attributes(include=["labels", "datasets"])
list_datasources(rsql_filter="id==my_db")
AI-Powered Search & Chat
- ai_search - Natural language search across workspace data with AI-generated insights
- ai_chat - Conversational assistant with intelligent routing:
  - Semantic Search - Find existing dashboards, metrics, and visualizations
  - Visualization Creation - Generate new visualizations from natural language
  - General Questions - Get answers about analytics concepts and best practices
  - Supports thread continuity for multi-turn conversations
AFM Execution (Label Elements)
- compute_label_elements - Retrieve distinct label (attribute) values with:
  - Pattern filtering (SQL LIKE syntax: "Premium%")
  - Exact filtering (["USA", "Canada"])
  - Exclusion mode (complement_filter=True)
  - Pagination (default: 100, max: 10000)
  - Dependent filters and validation by metrics/attributes
Automations (Alerts)
- list_notification_channels - Discover available delivery channels (email, Slack, webhooks)
- list_automations - List all automations in the workspace
- create_metric_alert - Create unified metric-based alerts supporting:
  - Comparison alerts: GREATER_THAN, LESS_THAN, EQUAL_TO, GREATER_THAN_OR_EQUAL_TO, LESS_THAN_OR_EQUAL_TO, NOT_EQUAL_TO
  - Range alerts: BETWEEN, NOT_BETWEEN
  - Relative alerts: INCREASES_BY, DECREASES_BY, CHANGES_BY (with DIFFERENCE or CHANGE arithmetic)
  - Schedule configuration (cron format, timezone)
  - Recipients (internal user IDs and/or external email addresses)
  - Metric formatting (titles and number formats for readable notifications)
- update_metric_alert - Update alert thresholds, operators, recipients, schedules, or convert alert type
- pause_alerts - Temporarily disable all alert notifications
- unpause_alerts - Resume previously paused alerts
Example of Use
- You ask the assistant: “Alert me if Total Revenue drops below 90K.”
- The assistant analyzes your request and sets parameters:
  - metric = Total Revenue
  - operator = LESS_THAN
  - threshold = 90000
  - schedule = every morning at 8am
  - delivery = your default method
- You get confirmation: “Alert created: will check daily and notify you when Total Revenue < 90K.”
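Under the hood, that request maps onto the create_metric_alert tool. A sketch of the call the assistant would issue is shown below; it reuses only parameter names documented later in this guide, and the alert ID and channel ID are placeholders.

create_metric_alert(
    automation_id="revenue-drop-alert",   # placeholder alert identifier
    metric_id="revenue_total",            # the Total Revenue metric
    metric_title="Total Revenue",
    operator="LESS_THAN",
    threshold=90000,
    cron="0 0 8 * * *",                   # daily at 8 AM
    notification_channel_id="<your-default-channel-id>"
)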
Important Recipient Rules:
Recipient types must match the notification channel’s allowedRecipients setting:
- CREATOR: Only the current user can be an internal recipient
- INTERNAL: Only internal_recipients (user IDs) allowed
- EXTERNAL: Both internal (user IDs) and external (email addresses) recipients allowed
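For example, on a channel whose allowedRecipients is EXTERNAL, a single alert may combine both kinds of recipients. The sketch below uses the internal_recipients parameter documented later in this guide; external_recipients is a hypothetical name shown only for illustration.

create_metric_alert(
    ...,  # metric, operator, threshold, and schedule as in the other examples
    notification_channel_id="email-channel-id",
    internal_recipients=["user-id-123"],
    external_recipients=["partner@example.com"]  # hypothetical parameter name
)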
Datasources & Data Modeling
- list_datasources - List multiple datasources with optional RSQL filtering and pagination
- test_datasource - Test datasource connectivity before scanning
- scan_datasource - Discover Physical Data Model (PDM) from database schemas
- generate_ldm - Generate Logical Data Model (LDM) from PDM
- register_upload_notification - Invalidate cache after new data uploads
Note: Datasource operations are organization-scoped and may require elevated permissions.
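As a quick sanity check before scanning, you can confirm that the datasource is registered and reachable. The sketch below combines two of the tools above; the datasource_id argument name mirrors the scan example later in this guide.

list_datasources(rsql_filter="id==my_db")
test_datasource(datasource_id="my_db")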
Knowledge Base Tools
The GoodData MCP Server provides tools for accessing platform documentation and analytics knowledge. These allow MCP clients to retrieve structured guidance on MAQL, dashboards, datasets, visualizations, and the Logical Data Model.
Knowledge Tool Endpoints
- get_maql_guide - Returns MAQL syntax rules, function explanations, and examples
- list_knowledge_topics - Lists all available knowledge topics, such as dashboards, MAQL, LDM schema, or datasets
- get_knowledge_topic - Retrieves the full content of a specific knowledge topic
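For example, an agent can first discover which topics exist and then pull one of them. The sketch below uses the same call style as the rest of this guide; the exact argument name accepted by get_knowledge_topic is an assumption.

list_knowledge_topics()
get_knowledge_topic(topic="dashboards")  # argument name is assumed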
Knowledge Resources
All knowledge topics exposed through the Knowledge Tools are also available as MCP Resources for clients that support resource querying. This allows AI agents to load documentation directly as structured content and use it to understand GoodData concepts.
Available Knowledge Resource URIs:
| MCP Resource URI | Description |
|---|---|
| gdc-analytics-rules://gooddata | Overview of the GoodData analytics platform, key concepts, and system structure. Useful for general platform understanding. |
| gdc-analytics-rules://maql | MAQL reference including syntax, operators, functions, and examples. Essential for generating or validating metrics. |
| gdc-analytics-rules://dashboards | Documentation for dashboards, widgets, layout, filters, and interactions. Helps AI agents reference or construct dashboards correctly. |
| gdc-analytics-rules://visualizations | Details on visualization types, configuration options, bucket structure, sorting, and limitations. Useful for generating valid visualization definitions. |
| gdc-analytics-rules://datasets | Information about dataset structure, grain, roles, and relationships. Useful for metadata reasoning or dataset design. |
| gdc-analytics-rules://ldm-schema | Logical Data Model structure and modeling rules, including dataset relationships and keys. Important for AI-driven LDM creation and validation. |
Each resource contains structured documentation that AI assistants can load, summarize, and reason about, enabling more accurate results when working with analytics or metadata definitions.
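For clients built on the MCP Python SDK, loading one of these resources could look roughly like the sketch below. It assumes an initialized ClientSession named session (see the Python example in the Connect section) and that the SDK's read_resource method is available in your version.

from pydantic import AnyUrl

# Assumes `session` is an initialized ClientSession connected to the GoodData MCP endpoint.
maql_docs = await session.read_resource(AnyUrl("gdc-analytics-rules://maql"))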
Connect
Endpoint
Use this endpoint to connect:
http(s)://<your-gooddata-host>/api/v1/actions/workspaces/{workspaceId}/ai/mcp
Replace {workspaceId} with the actual workspace ID you want to work in.
Authentication
Pass your GoodData API token as a Bearer token:
Authorization: Bearer <your-token>
Example Configurations
Cursor IDE
(~/.cursor/mcp.json)
{
"mcpServers": {
"gooddata-mcp-server": {
"type": "streamable-http",
"url": "https://your-gooddata-host/api/v1/actions/workspaces/{workspaceId}/ai/mcp",
"headers": {
"Authorization": "Bearer <your-token>"
}
}
}
}
MCP Inspector CLI
mcp-inspector https://your-gooddata-host/api/v1/actions/workspaces/{workspaceId}/ai/mcp \
--header "Authorization: Bearer <your-token>"
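Custom Python Client (MCP SDK)
If you are building your own client in Python, a minimal connection sketch with the official MCP Python SDK could look like the following. Import paths and method names reflect the SDK at the time of writing and may change; replace the host, workspace ID, and token placeholders as in the examples above.

import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

ENDPOINT = "https://your-gooddata-host/api/v1/actions/workspaces/{workspaceId}/ai/mcp"
HEADERS = {"Authorization": "Bearer <your-token>"}

async def main() -> None:
    # Open a streamable-HTTP transport to the GoodData MCP endpoint.
    async with streamablehttp_client(ENDPOINT, headers=HEADERS) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # List the available GoodData tools as a quick connectivity check.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())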
Usage Examples
Browse Metrics
Tool: list_workspace_metrics
Optional filters:
- title=like=*revenue* - Find metrics with “revenue” in the title
- tags=in=('finance','kpi') - Filter by tags
- limit=10 - Limit results to 10 metrics
Get specific metric:
get_workspace_metric(metric_id="revenue_total") - Retrieve full metric details including MAQL definition
Create Alerts
Tool: create_metric_alert
Comparison Alert Example:
automation_id: "revenue-milestone"
metric_id: "revenue_total"
operator: "LESS_THAN"
threshold: 90000
metric_format: "$#,##0"
metric_title: "Total Revenue"
notification_channel_id: "email-channel-id"
internal_recipients: ["user-id-123"]
cron: "0 0 8 * * *"
Range Alert Example:
operator: "BETWEEN"
from_value: 100000
to_value: 1000000
Relative Alert Example:
operator: "INCREASES_BY"
compare_metric_id: "revenue_last_month"
threshold: 0.10
arithmetic_operator: "CHANGE"
compare_metric_format: "$#,##0"
compare_metric_title: "Last Month Revenue"
Update Alerts
Tool: update_metric_alert
Examples:
- Update threshold: threshold=2000000
- Convert to range alert: operator="BETWEEN", from_value=100000, to_value=500000
- Change to relative alert: operator="INCREASES_BY", compare_metric_id="revenue_last_month", threshold=0.15
- Update schedule: cron="0 30 14 * * 1-5", timezone="UTC"
AI Search & Chat
AI Search:
ai_search(question="What were the total sales last quarter?")
AI Chat:
ai_chat(question="Give me bar chart slicing revenue by country")
Multi-turn conversation:
# First request
result1 = ai_chat(question="Give me revenue by country")
thread_id = result1.thread_id_suffix
# Follow-up maintains context
result2 = ai_chat(
question="And now filter to USA only",
thread_id_suffix=thread_id
)
Label Elements (AFM)
Tool: compute_label_elements
Examples:
- Basic lookup: label_id="attr.customer.name"
- Pattern search: pattern_filter="Premium%"
- Exact filter: exact_filter=["North America", "Europe"]
- Exclusion: exact_filter=["Archived", "Deleted"], complement_filter=True
- Pagination: offset=200, limit=100
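Several of these options can be combined in one call, as in the sketch below, which uses only parameters listed above.

compute_label_elements(
    label_id="attr.customer.name",
    pattern_filter="Premium%",
    limit=100
)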
Scan Datasource and Generate LDM
Step 1: Scan datasource
scan_datasource(
datasource_id="my_db",
schemas=["public"],
scan_tables=True,
scan_views=True
)
Step 2: Generate LDM
generate_ldm(
datasource_id="my_db",
pdm=scan_result # Pass entire scan result or scan_result["pdm"]
)
Security and Permissions
- Authentication: Bearer token required for all operations
- Workspace Isolation: All operations are scoped to the authenticated user’s workspace context
- Permission Enforcement: Tools respect user permissions and only return objects you can access
- Organization-scoped Operations: Datasource operations may require elevated permissions
- Alert Recipients: Must match the notification channel’s allowedRecipients configuration
Limits and Behavior
- Stateless HTTP Streaming: Optimized for MCP clients with stateless operation
- Pagination: List tools support the limit parameter (recommended: 10-50 for quick browsing)
- RSQL Filtering: Advanced filtering using RSQL syntax on list operations
- Cron Format: "second minute hour day month weekday" (e.g., "0 0 8 * * *" = daily at 8 AM)
- Label Elements: Default limit 100, maximum 10000
- Error Handling: Structured error responses with field-level validation messages
Troubleshooting
401/403 Errors:
- Verify your API token is valid and has access to the workspace
- Check that you have the required permissions for organization-scoped operations (datasources)
404 Errors:
- Object may not exist or you may lack permission to view it
- Verify workspace ID is correct in the endpoint URL
Invalid RSQL Filter:
- Check RSQL syntax (e.g., title=like=*revenue*)
- Verify field names match available metadata fields
Alert Creation Failures:
- Comparison alerts require the threshold parameter
- Range alerts require both from_value and to_value
- Relative alerts require compare_metric_id and threshold
- Recipients must match the notification channel’s allowedRecipients policy
Label Elements Errors:
- limit must be > 0 and ≤ 10000
- sort_order must be "ASC" or "DESC"
LDM Generation:
- Ensure the PDM comes from a successful scan_datasource operation
- You can pass the entire scan result or just the pdm field
Compatibility
The GoodData MCP Server works with any MCP-compatible client:
- Cursor IDE - Full support via ~/.cursor/mcp.json
- MCP Inspector - CLI tool for testing and debugging
- Custom MCP Clients - Any client implementing the MCP protocol