Load the Demo Data to Your Own Database

How do you integrate the demo data into a different database?

The GoodData.CN Community edition image contains the sample data in a PostgreSQL database. If you want to use a different (supported) database, you can load the sample data into that database and continue with the tutorials as usual.

  • The demo data is provided as CSV files.
  • The DDL is PostgreSQL-specific, but you can update it as you see fit.

To load the sample data to a different database, do the following:

  1. Download the demo data.
  2. Create a database model.
  3. Upload the sample data to your database.

Download the Demo Data

Download and review the sample CSV files to help you understand how GoodData.CN processes the rows and columns.


  1. Download the archived file with the CSV files from https://gooddata-cloud-native-demo-workspace.s3.amazonaws.com/1.1.1/demo/demo.zip.
  2. Unzip the downloaded file.
  3. Review the sample data for the following datasets:
    • campaigns
    • customers
    • products
    • campaign_channels
    • order_lines

Create a Database Model

To create a database model, execute the SQL script demo_ddl.sql in your database. If you are using a database other than PostgreSQL, update the script to make it work with your database.
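With PostgreSQL, the script can be executed using the psql client, for example as sketched below. Connection details are assumed to come from the standard `PGHOST`, `PGDATABASE`, `PGUSER`, and `PGPASSWORD` environment variables:

```shell
# Run the DDL script against the database described by the PG* environment variables.
# ON_ERROR_STOP makes psql exit on the first failed statement instead of continuing.
DDL_FILE="demo_ddl.sql"
psql -v ON_ERROR_STOP=1 -f "$DDL_FILE"
```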

Upload the Sample Data to Your Database

To upload the demo data to your database, run the following command for each dataset (campaigns, customers, products, campaign_channels, and order_lines):

Bash

psql -c "\COPY $dataset FROM 'demo_${dataset}.csv' (FORMAT csv, DELIMITER ',', QUOTE '\"', ESCAPE '\\', HEADER true);"

PowerShell 7

# Consider setting the $env:PGUSER and $env:PGPASSWORD variables
# Set your own PostgreSQL version
& "c:\Program Files\PostgreSQL\$PGVERSION\bin\psql.exe" -c "\COPY $dataset FROM 'demo_$dataset.csv' (FORMAT csv, DELIMITER ',', QUOTE '\`"', ESCAPE '`\', HEADER true);"
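Rather than running the command once per dataset by hand, the uploads can be wrapped in a small loop. This is a Bash sketch that assumes the CSV files sit in the current directory and that psql can reach your database:

```shell
# Upload every demo dataset with psql's client-side \COPY command.
for dataset in campaigns customers products campaign_channels order_lines; do
  echo "Loading $dataset..."
  psql -c "\COPY $dataset FROM 'demo_${dataset}.csv' (FORMAT csv, DELIMITER ',', QUOTE '\"', ESCAPE '\\', HEADER true);"
done
```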

(Optional) Update the Sample Data

Update the sample data using the following SQL script if you want to see some data when applying filters like “Last week”.

-- Shift the dates so they fall in the current year; filters like "This month" then return data.
-- To avoid maintaining separate versions for different databases, we shift by multiples of 365 days.
-- The equijoin is required by Redshift
update "order_lines" set "date" = "date" + cast(dateshift as int)
from (
  -- Calculate the difference in years between max(date) in the data and the current year,
  -- then convert it to a shift in days (years * 365)
  select (date_part('year', current_date) - date_part('year', max("date") over ())) * 365  as dateshift,
    "order_line_id" as "update_order_line"
  from "order_lines"
) d
where "update_order_line" = "order_line_id"
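After running the update, you can sanity-check the shift with a quick query; the latest order date should now fall within the current year. A sketch, using the same table and column names as above:

```sql
-- After the shift, the latest date in order_lines should land in the current year.
select min("date") as earliest_order, max("date") as latest_order
from "order_lines";
```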