Create a Logical Data Model Manually
This is an alternative to the Auto generate LDM approach. We recommend it in the following scenarios:
- You want to create the LDM first to define your reporting use cases, and map the LDM to the PDM later.
- You want to customize and evolve your LDM:
  - Once you apply custom changes to the LDM, auto-generation would overwrite them.
  - You could keep updating the relational model so that auto-generation still works, but this can be very costly and, in some cases, not possible at all.
The LDM is focused on your reporting needs and may not match the relational model of your database. Before mapping the LDM to the PDM, you may have to alter the relational model (e.g., introduce an ETL process).
To start creating your logical data model, drag and drop a new empty Dataset or Date from the left panel onto the canvas. Click More… in the bottom-right corner of the dataset and pick View details. You can add attributes, additional labels, and facts to the dataset.
Here is an example of a manually created dataset, “Order lines”:
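Such a dataset can be thought of as a small declarative structure. The sketch below models “Order lines” as a plain Python dictionary; the field names and ids are illustrative assumptions, not the exact schema of any GoodData API.

```python
# Illustrative sketch only: field names and ids are assumptions,
# not the exact declarative schema used by the modeling tool.
order_lines = {
    "id": "order_lines",
    "title": "Order lines",
    "attributes": [
        # each attribute can carry additional labels
        {"id": "order_line_id", "title": "Order line id", "labels": []},
        {"id": "order_status", "title": "Order status", "labels": []},
    ],
    "facts": [
        {"id": "price", "title": "Price"},
        {"id": "quantity", "title": "Quantity"},
    ],
}

print(order_lines["title"])  # Order lines
```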
Map Your Logical Data Model to the Physical Model
Mapping your logical data model to the physical model is not required for publishing - you can publish a model without any mapping. However, the mapping is mandatory if you want to utilize the model in an analytical application such as Analytical Designer.
There is a second tab in the dataset detail - Data Mapping.
As you can see, no table can be mapped to the dataset yet.
Use the following procedure:
1. Click the Scan button in the top-left corner and uncheck the “Generate datasets” checkbox.
2. Return to the dataset details and select a table to be mapped to the dataset.
3. Map labels and facts to the physical columns.
4. Map all entities that you want to utilize in analytics applications.
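The mapping steps above boil down to a lookup from logical entities to physical columns. The sketch below is hypothetical (table and column names are made up for illustration):

```python
# Hypothetical sketch of a data mapping: each label/fact of the logical
# dataset points at a column of the physical table selected after Scan.
data_mapping = {
    "dataset": "order_lines",
    "table": "order_lines",  # physical table discovered by Scan
    "columns": {
        "order_line_id": "order_line_id",  # label -> physical column
        "price": "price",                  # fact  -> physical column
        "quantity": "quantity",
    },
}

# Entities left unmapped cannot be used in analytics applications:
unmapped = [entity for entity, column in data_mapping["columns"].items()
            if column is None]
print(unmapped)  # []
```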
Go to Physical Data Model to learn more about how to manage it.
Set the Primary Key
Before you can publish the model, you have to define the primary key (grain) of each dataset. You can find the Set primary key option in the bottom-right corner (More…) of each dataset.
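This publish-time rule can be expressed as a tiny validation sketch; the dataset structure below is hypothetical, used only to illustrate the rule:

```python
def ready_to_publish(datasets):
    """Return True only if every dataset has its grain (primary key) set."""
    return all(ds.get("grain") for ds in datasets)

# A grain may consist of one or more attributes; ids are illustrative.
datasets = [
    {"id": "order_lines", "grain": ["order_line_id"]},
    {"id": "customers", "grain": ["customer_id"]},
]
print(ready_to_publish(datasets))        # True - every grain is set
print(ready_to_publish([{"id": "x"}]))   # False - missing grain blocks publishing
```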
References (Relations) between Datasets
1. Create a new dataset. In the following example, it is the “Customers” dataset.
2. Set the primary key on the second dataset.
3. Click the dataset, locate the blue bullet on the right edge of the dataset object, and drag and drop it onto the target “Order lines” dataset to create a new reference.
You can see that the “Order lines” dataset has just been extended with a “Customer id” foreign key.
Open the detail of the “Order lines” dataset, go to the Data Mapping tab, and find the new foreign-key entity. You can map it to a physical column (the foreign key).
The reference opens up new analytics scenarios in Analytical Designer; for example, you can slice the price from Order lines by the state from Customers.
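Conceptually, drawing the reference extends the target dataset with a foreign key. A sketch of the resulting structure (field names are again illustrative, not a real schema):

```python
# Illustrative sketch: "Customers" is referenced from "Order lines".
customers = {"id": "customers", "grain": ["customer_id"]}

order_lines = {
    "id": "order_lines",
    "grain": ["order_line_id"],
    # Dragging the blue bullet onto "Order lines" adds this reference,
    # i.e. a "Customer id" foreign key that still needs a data mapping:
    "references": [
        {"dataset": "customers", "sourceColumn": "customer_id"},
    ],
}

referenced = [r["dataset"] for r in order_lines["references"]]
print(referenced)  # ['customers']
```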
1. Drag and drop Date from the left panel.
2. Create a new reference between the new date dataset and a dataset that contains a date/timestamp column. In our case, it is the “Order lines” dataset.
3. Drag the blue bullet on the right edge of the date dataset object and drop it onto the target “Order lines” dataset.
4. Map the new date “foreign key” to a physical column (just like in the case of a standard reference).
5. Configure the date dataset details (optional):
- Press the “Details” button within the date dataset box.
- In the configuration dialog, you can:
- Add description to date dataset.
- Configure how the name of included date/time granularity will be formatted.
- Select date/time granularity.
The Title pattern defines the overall format of the title for all included granularities. Using placeholders such as %granularityTitle, you can position where the Title base and the default granularity title will be placed. If the Title base is not specified, the default name of the date dataset is used.

Examples of titles generated using the default format: Date - Year, Date - Hour, etc.

Several date granularities are mandatory and cannot be avoided.
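To illustrate, here is a hedged sketch of how such a title pattern could be expanded for one granularity. The %titleBase placeholder name and the expansion logic are assumptions for illustration, not the tool's actual implementation:

```python
def expand_title(pattern, granularity, title_base=None, default_base="Date"):
    """Expand a date-dataset title pattern for one granularity.

    Assumed placeholders: %titleBase and %granularityTitle. When no
    Title base is specified, the date dataset's default name is used.
    """
    base = title_base if title_base else default_base
    return (pattern
            .replace("%titleBase", base)
            .replace("%granularityTitle", granularity))

# The default format yields titles such as "Date - Year", "Date - Hour":
print(expand_title("%titleBase - %granularityTitle", "Year"))  # Date - Year
print(expand_title("%titleBase - %granularityTitle", "Hour",
                   title_base="Order date"))  # Order date - Hour
```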
Publish the Model
Go to Publish Logical Data Model to learn how to publish the model.
Once you have the LDM published, you can start building insights and dashboards.