In order to see the available datasources within your organization:
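As a sketch, assuming the Core Engine ships a `cengine` CLI with a `datasource list` subcommand (the command name here is illustrative, not confirmed syntax):

```bash
# List all datasources registered in your organization.
cengine datasource list
```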
In order to create a new datasource, supply the following parameters (see the example command after the table):
| Parameter | Type | Description |
| --- | --- | --- |
| name | str | the name of the datasource |
| project | str | project name of the BQ table |
| dataset | str | dataset name of the BQ table |
| table | str | table name of the BQ table |
| table_type | str | choose from `public` or `private` |
| service_account | str | path to the service account JSON file for access to a private BQ table |
All of the above are required, except service_account, which is only needed if the table is private.
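A sketch of the create call, assuming the CLI exposes the parameters above as flags on a `datasource create` subcommand (the `cengine` command and the flag spellings are assumptions, not confirmed syntax):

```bash
# Register a private BigQuery table as a new datasource.
# Flag names mirror the parameter table above and are illustrative.
cengine datasource create \
    --name="my_datasource" \
    --project="my_gcp_project" \
    --dataset="my_dataset" \
    --table="my_table" \
    --table_type="private" \
    --service_account="~/key.json"
```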
In order to set a datasource active, pass its id (see the example after the table):
| Parameter | Type | Description |
| --- | --- | --- |
| `<datasource_id>` | int | the id of the selected datasource |
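For example, again assuming a hypothetical `cengine` CLI:

```bash
# Set the datasource with id 3 as the active datasource.
cengine datasource set 3
```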
In order to peek at a datasource:

The peek command randomly samples a small number of rows, so you can get a basic look at the data.
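For example (hypothetical `cengine` CLI, as above):

```bash
# Randomly sample rows from the datasource with id 3 for a quick look.
cengine datasource peek 3
```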
If you want to add a private BigQuery table as a datasource, you need to provide the Core Engine with a service account that has the BigQuery Data Viewer role. We will use this service account to create a copy of the BigQuery table in our own cloud, and use that copy as the datasource from then on. You can think of it as creating a snapshot of your BigQuery table.
In order to create such a service account, it's easiest to use the gcloud CLI.
First create a service account as follows:
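For example, with the gcloud CLI (the service account name `core-engine-sa` and project `MY_PROJECT` are placeholders; substitute your own):

```bash
# Create a new service account in your GCP project.
gcloud iam service-accounts create core-engine-sa \
    --display-name="core-engine-sa"
```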
Then add the BigQuery Data Viewer role to the service account:
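Continuing with the same placeholder names:

```bash
# Grant the BigQuery Data Viewer role to the new service account.
gcloud projects add-iam-policy-binding MY_PROJECT \
    --member="serviceAccount:core-engine-sa@MY_PROJECT.iam.gserviceaccount.com" \
    --role="roles/bigquery.dataViewer"
```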
Finally, create the service account JSON key file:
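Again with the placeholder names from above:

```bash
# Create a JSON key for the service account and write it to ~/key.json.
gcloud iam service-accounts keys create ~/key.json \
    --iam-account=core-engine-sa@MY_PROJECT.iam.gserviceaccount.com
```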
The JSON key file will be stored at ~/key.json if you're on a Linux-based system.
You can now pass this file as the service_account parameter when you create a datasource from your private BigQuery table.
Supported data types
| BigQuery Data Type | Supported |
| --- | --- |
We are working hard to bring more supported data types to the Core Engine. Please give us feedback at firstname.lastname@example.org so that we can prioritize the most important ones!
For now, only BigQuery is supported as a datasource. We are actively looking for feedback on different formats. Please let us know at email@example.com which format you would like to see supported next!
The BigQuery table must also conform to the following restrictions:
- Max number of rows: 20,000,000
- Max number of columns: 1000
- Location: Private datasets must be in location EU
If your dataset is located outside of EU, please follow this handy Google guide to copy it from any location to EU. We apologize for this workaround and are working hard to bring all locations to the Core Engine!