Written by Harry Dix
In our final technical webinar of 2020, our team of experts gave a behind-the-scenes look at developing a data product. Split into two parts, the webinar showed how to embed analytics directly into your data product. If you missed it, or want to dig a little deeper into what we covered, you can watch a full recording of Part 1 and Part 2 of the webinar.
As with all our webinars, we allocated time for a Q&A session, which always brings great insights and helps explore specific topics in more depth. With that in mind, we’ve compiled the top questions you asked about data product development, along with our answers.
What types of tools are available to customize each workspace?
There are a number of tools for this. Everything you see on the GoodData platform has a robust API associated with it, so you can manage customizations on your own.
What recommendations do you have for starting out with the logical data model?
In general, when building your LDM, star schema dimensional modelling is the most effective way of working with the GoodData semantic model. Be sure to keep the data model as flat as possible to minimise costly joins on the front end.
Another top tip is to refrain from having duplicate names in the LDM. Duplicate names are especially confusing for end users working in the analytical designer, as they may not be familiar with the data model or the underlying data sources. If an attribute is referenced in more than one fact table, separate it out into a shared (conformed) dimension.
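The shared-dimension advice above can be sketched in plain Python. This is a language-neutral illustration, not GoodData's actual LDM format; all table and attribute names are made up for the example.

```python
from collections import Counter

# Illustrative star schema: two fact tables and the dimensions they
# reference. The dict layout is an assumption for this sketch only --
# it is not how GoodData stores an LDM.
shared_dimension = {
    "name": "dim_customer",
    "attributes": ["customer_id", "customer_name", "region"],
}

fact_orders = {
    "name": "fact_orders",
    "facts": ["order_amount", "quantity"],
    "references": ["dim_customer", "dim_date"],
}

fact_returns = {
    "name": "fact_returns",
    "facts": ["return_amount"],
    "references": ["dim_customer", "dim_date"],
}

def shared_dimensions(fact_tables):
    """Return dimension names referenced by more than one fact table.

    These are the candidates to model once, as a shared (conformed)
    dimension, instead of duplicating the attribute in each fact table.
    """
    counts = Counter(ref for f in fact_tables for ref in f["references"])
    return sorted(d for d, n in counts.items() if n > 1)

print(shared_dimensions([fact_orders, fact_returns]))
# → ['dim_customer', 'dim_date']
```

Here both fact tables reference `dim_customer` and `dim_date`, so each should be modelled as a single shared dimension rather than duplicated per fact table.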
How can I learn which logical data model design works best for my use case?
The LDM modeler is a really cool way to plug and play on the GoodData platform. If you want to learn more about data modelling techniques, check out our GoodData University courses.
Is it better to load things like total sales as part of the data load or create them as measures?
It's worth experimenting to find what's most efficient. For something simple like total sales, GoodData can render the result very quickly as a measure. For something more complex, with more intensive definitions behind the measure, it is best to compute the value in the ETL and load it directly as a fact in the fact table, rather than rendering it on the front end.
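The trade-off above can be illustrated with a minimal sketch: the same total computed once during ETL (loaded as a fact) versus derived on demand (like a measure). All names here are illustrative, not GoodData objects.

```python
# Toy line items, standing in for rows arriving from a source system.
line_items = [
    {"order_id": 1, "price": 10.0, "qty": 3},
    {"order_id": 1, "price": 5.0, "qty": 2},
    {"order_id": 2, "price": 7.5, "qty": 4},
]

def etl_precompute_totals(items):
    """Option A: precompute totals in the ETL and load the result
    as a fact (e.g. an 'order_total' column in the fact table)."""
    totals = {}
    for item in items:
        totals[item["order_id"]] = (
            totals.get(item["order_id"], 0.0) + item["price"] * item["qty"]
        )
    return totals

def measure_total(items, order_id):
    """Option B: leave the raw facts in place and derive the total
    at query time, the way a measure is evaluated on render."""
    return sum(i["price"] * i["qty"] for i in items if i["order_id"] == order_id)

# Both approaches agree on the numbers; the difference is where the
# work happens -- once at load time vs. on every render.
assert etl_precompute_totals(line_items)[1] == measure_total(line_items, 1)
```

For a simple sum either approach is fine; the heavier the definition, the stronger the case for Option A.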
Permissions - If the source of the data is being edited by other coworkers, do the charts and analytics made in GoodData refresh by themselves, or do you have to update them manually?
When a data load from the new data source is kicked off, you’ll automatically see updated data in GoodData. Whatever is loaded into your workspace is what is rendered. No manual changes needed.
What is a KPI alert and how can I set them for my end users?
KPI alerts can be set on the key performance indicators on a KPI dashboard. Note that they’re not available in edit mode; you set them on the published dashboard. Click the bell icon on a KPI to define an alert threshold (e.g. a benchmark of 100k minimum sales). Alerts are user-defined, so educating users on the feature enables them to set alerts themselves.
Are you able to create new insights or measures without using the interface?
Definitely. With GoodData, everything is configurable through the APIs. If you want to create a new measure or different insights, you can set them up programmatically. Visit our API page for more info.
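As a rough illustration of the programmatic route, here is a hedged Python sketch of posting a new measure definition over a REST API. The endpoint path, payload shape, and workspace identifier are all assumptions made for this example; consult the GoodData API documentation for the real schema before using anything like this.

```python
import requests  # pip install requests

# Placeholders -- substitute your own domain and workspace id.
API_BASE = "https://example.gooddata.com"   # assumed host
WORKSPACE = "my_workspace_id"               # assumed workspace id

def build_measure_payload(title: str, expression: str) -> dict:
    """Assemble an illustrative JSON body for a new measure.

    The structure here is a guess for demonstration purposes, not the
    documented GoodData request schema.
    """
    return {
        "measure": {
            "title": title,
            "expression": expression,  # e.g. a MAQL-style definition
        }
    }

def create_measure(session: requests.Session, title: str, expression: str) -> dict:
    """POST the measure definition and return the parsed response."""
    url = f"{API_BASE}/api/workspaces/{WORKSPACE}/measures"  # assumed path
    resp = session.post(url, json=build_measure_payload(title, expression), timeout=30)
    resp.raise_for_status()
    return resp.json()
```

The same pattern (build a JSON body, POST it with an authenticated session) applies to creating insights programmatically.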
How do you correctly configure geo charts in GoodData?
For steps on configuring geo charts, read this helpful blog article.
Our data currently lives on on-premise SQL servers. Can you explain how GoodData can connect to them and pull the data into ADS?
The best way to do this is by exporting CSV files and then storing them in an AWS S3 bucket. Our blog article explains the benefits of working with CSVs.
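The export-and-upload step can be sketched as follows. The CSV serialisation is standard-library Python and `upload_fileobj` is a real boto3 call; the query rows, bucket, and key names are placeholders, and in practice the rows would come from a database cursor (e.g. pyodbc against the on-prem SQL Server).

```python
import csv
import io

def rows_to_csv(header, rows) -> bytes:
    """Serialise a header plus an iterable of row tuples into CSV bytes,
    ready to be written to a file or streamed to S3."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(header)
    writer.writerows(rows)
    return buf.getvalue().encode("utf-8")

def upload_to_s3(data: bytes, bucket: str, key: str) -> None:
    """Push the CSV bytes to an S3 object using the AWS SDK."""
    import boto3  # pip install boto3; requires AWS credentials configured
    s3 = boto3.client("s3")
    s3.upload_fileobj(io.BytesIO(data), bucket, key)

# Example rows, standing in for the result of a SQL Server query.
data = rows_to_csv(["id", "total_sales"], [(1, 100), (2, 250)])
# upload_to_s3(data, "my-export-bucket", "exports/sales.csv")  # placeholder names
```

From there, a scheduled load can pick the file up from the bucket into ADS.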
Can I schedule different data loads from different data sources in the same workspace?
Yes. You can configure different processes for the different sources and different schedules for the different data sets.
Am I allowed to modify different pieces of the LDM to run data loads at different times?
Yes. It is possible to set up different schedules for different datasets within the data integration console.
If a domain can have only a single domain admin, is it easy to change the admin email if that person is no longer active?
This is done via a GoodData support ticket. Note that if you change the credentials, you need to update them in the data pipeline. Best practice is to have a single login and then manage who has access to it, so the customer retains control over access.
Can the provisioning metadata be stored in a different data source to the actual data?
You can have different components mapped to different data sources. An example would be to have a different data source for AD and for the provisioning bricks.
The manual configuration of the bricks is great, but is there an automated way by which we can set up bricks programmatically?
Is the GD.UI tool available in the GoodData Free version?
Yes. It is.
Can we use GD.UI in a non-React environment?
Do you support TypeScript?
Yes. We do. There are optional TypeScript plugins available.
If someone wants to refresh the LDM catalogue in GD.UI how do you do it?
If you change something in the LDM, you can manually run a command to refresh the catalogue and continue working.
Do I need to allow the GoodData domain as a CORS origin to make the app work when deployed?
If a customer creates a custom report on their end will the roll out process wipe out their changes?
This will only happen if there was a change in the LDM. If the report points to an entity that’s wiped from the LDM then this entity would no longer be there for the report to point to. We recommend an impact analysis before changing the LDM.
Want to learn more about developing a data product with GoodData? Request a demo and let our experts take you on a guided tour of the GoodData platform. They’ll help you discover its rich feature set, ease of implementation and unparalleled performance, as well as answer your questions.