Datasets API Reference
Geckoboard's Datasets API is a powerful and flexible way to compile data from in-house systems, third-party tools, and databases on your dashboard.
To do this you'll need to:
- Write a script that connects to your data source and requests the required data.
- Create and push a dataset to Geckoboard that includes all the metrics you want to display.
Authentication
Find your API key
Log into your Geckoboard account and follow these steps:
- Click your initials in the top right corner and select Account.
- On the Account Details screen, scroll down and look for API Key towards the bottom of the page.
Install a client library
You can make calls to the Datasets API with whichever method you usually use to make HTTP requests, but Geckoboard offers client libraries that make interacting with the API even simpler.
By default, the Datasets API docs demonstrate using cURL to interact with the API over HTTP.
If you're on a Unix-based OS (macOS, Linux), you likely have cURL installed already (run curl -V in your terminal to confirm). Windows users can open the Command Prompt by searching for Command within Cortana.
Make your first API call
curl https://api.geckoboard.com/ -u "your-api-key:"
You should receive a 200 response containing {}.
To authenticate and test your account when using the Datasets API, include your personal API key as the username in the request, with an empty password (note the trailing colon :).
If you miss off the colon, or are still asked for a password, just hit Enter in your terminal.
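If your HTTP tooling doesn't have an equivalent of cURL's -u flag, the same request can be made with an explicit basic auth header – the API key is the username and the password is left empty. A minimal sketch, assuming the base64 utility is available:
curl https://api.geckoboard.com/ \
  -H "Authorization: Basic $(printf 'your-api-key:' | base64)"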
Plan your schema
When you’re adding a dataset widget to your dashboard, we’ll look at your schema and present the visualization options that make sense for the types of data you’re sending us. For example, to plot a line chart the dataset must contain the date or datetime types.
Visualizations are powered by individual datasets, which means you can't combine data from two or more datasets to build a visualization.
Geckoboard can handle data aggregation and grouping, so there’s no need to pre-aggregate your data. And when an update is received via the API, all the widgets powered by that dataset are then updated automatically.
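For example, a minimal schema for a dataset intended to power a line chart could pair a number field with a datetime field – the field keys amount and timestamp below are purely illustrative:
"fields": {
  "amount": {
    "type": "number",
    "name": "Amount",
    "optional": false
  },
  "timestamp": {
    "type": "datetime",
    "name": "Date"
  }
}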
The Datasets API currently supports the following types:
Date format
Example creation:
"fields":{
"date":{
"type": "date",
"name": "Date"
"optional": false
}
}
Example adding data:
"data":[
{
"date": "2018-01-01"
}
]
All date types must be formatted as YYYY-MM-DD (e.g. 2018-01-01).
For hours, minutes and seconds, use the Datetime format.
A date field can be NULL if set as optional.
Element | Description | Notes |
---|---|---|
YYYY | Four-digit year | |
MM | Two-digit month | Use leading 0 for 1-9 |
DD | Two-digit day of month | 01 through 31 |
Datetime format
Example creation:
"fields":{
"datetime":{
"type": "datetime",
"name": "Datetime"
"optional": false
}
}
Example adding data:
"data":[
{
"datetime": "2018-01-01T12:00:30Z"
}
]
datetime fields must be formatted as ISO 8601 strings, the International Standard for the representation of dates and times.
We recommend you use the YYYY-MM-DDThh:mm:ssTZD variation, which will produce values that look like 2018-01-01T12:00:30Z (1st January, 2018, 12:00:30 pm, UTC).
A datetime field can be NULL if set as optional.
Element | Description | Notes |
---|---|---|
YYYY | Four-digit year | |
MM | Two-digit month | Use leading 0 for 1-9 |
DD | Two-digit day of month | 01 through 31 |
hh | Two digits of hour | 00 through 23. 24-hour clock only. |
mm | Two digits of minute | 00 through 59 |
ss | Two digits of second | 00 through 59 |
TZD | Time zone designator | Use Z for UTC, or +hh:mm / -hh:mm for a local time zone that is hh hours and mm minutes ahead of or behind UTC. |
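To illustrate the time zone designator, the two records below are equivalent ways of writing the same instant – one in UTC, one as a local time two hours ahead of UTC:
"data":[
  {
    "datetime": "2018-01-01T12:00:30Z"
  },
  {
    "datetime": "2018-01-01T14:00:30+02:00"
  }
]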
Duration format
Example creation:
"fields":{
"duration":{
"type": "duration",
"name": "Duration",
"time_unit": "minutes",
"optional": false
}
}
Example adding data:
"data":[
{
"duration": 83
}
]
duration fields can be set to milliseconds, seconds, minutes or hours using the time_unit option.
Decimal or integer values can be used. For example, if your field is set to minutes and you send 1.5, that will be displayed as 1m 30s in app.
A duration field can be NULL if set as optional.
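For instance, with the minutes field defined above, the following record would be displayed as 1m 30s:
"data":[
  {
    "duration": 1.5
  }
]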
Money format
Example creation:
"fields":{
"dollars":{
"type": "money",
"name": "Dollars",
"currency_code": "USD",
"optional": false
}
}
Example adding data:
"data":[
{
"dollars": 14000
}
]
money fields represent a certain amount of money in a single currency. You can specify the currency when defining the field using the currency_code option. This option accepts three-character currency codes defined by the ISO 4217 standard. Currency codes should always be in uppercase.
Records should specify the amount of money in the currency’s smallest denomination, as an integer. For example, the USD’s smallest denomination is the cent, so a USD field would specify $10.00 as 1000.
A money field can be NULL if set as optional.
Currency | ISO 4217 code | Symbol |
---|---|---|
Australian dollar | AUD | A$ |
British pound sterling | GBP | £ |
Canadian dollar | CAD | C$ |
Chinese renminbi | CNY | 元 |
Euro | EUR | € |
Japanese yen | JPY | ¥ |
Mexican peso | MXN | $ |
Swedish krona | SEK | kr |
Swiss franc | CHF | Fr |
United States dollar | USD | $ |
If your currency isn't listed, check out the full list of active codes of official ISO 4217 currency names.
Number format
Example creation:
"fields":{
"amount":{
"type": "number",
"name": "Amount",
"optional": false
}
}
Example adding data:
"data":[
{
"amount": 42
}
]
number fields can be NULL if set as optional.
Regular decimal values (e.g. 10.24) can be used in number fields.
For some types of decimal values (like software versions, e.g. 5.1234), as well as other characters like dashes - and brackets () (used for telephone numbers such as (555) 555-1234), you may need to use the String format instead.
Percentage format
Example creation:
"fields":{
"percentage":{
"type": "percentage",
"name": "Percentage",
"optional": false
}
}
Example adding data:
"data":[
{
"percentage": 0.35
}
]
When using a percentage field, a number in the 0 to 1 range will be displayed in the 0 to 100% range.
For example, a percentage field with value 0.35 will be interpreted by Geckoboard as the percentage 35%.
Values above 1 will correspond to percentages higher than 100%. For example, 1.5 will be interpreted as 150%.
A percentage field can be NULL if set as optional.
String format
Example creation:
"fields":{
"string":{
"type": "string",
"name": "String"
"optional": false
}
}
Example adding data:
"data":[
{
"string": "This is a string field"
}
]
string fields must not contain more than 256 characters.
A string field can be NULL if set as optional.
API requests
Find or create a new dataset
PUT https://api.geckoboard.com/datasets/:id
Example:
curl https://api.geckoboard.com/datasets/sales.by_day \
-X PUT \
-u '222efc82e7933138077b1c2554439e15:' \
-H 'Content-Type: application/json' \
-d '{
"fields": {
"amount": {
"type": "number",
"name": "Amount",
"optional": false
},
"timestamp": {
"type": "datetime",
"name": "Date"
}
},
"unique_by": ["timestamp"]
}'
Response:
{
"id": "sales.by_day",
"fields": {
"amount": { "type": "number", "name": "Amount", "optional": false },
"timestamp": { "type": "datetime", "name": "Date" }
},
"unique_by": ["timestamp"]
}
PUT is the HTTP method used to create a new dataset (or find the existing dataset with the same id).
:id is a string that identifies your dataset in Geckoboard.
Request parameters
Attribute | Type | Required? | Description |
---|---|---|---|
fields | Object | Yes | An object with keys for each column in your dataset. The value describes the type for that column. |
unique_by | Array | No | An array of one or more field names whose values will be unique across all your records. |
Append data to a dataset
POST https://api.geckoboard.com/datasets/:id/data
Example:
curl https://api.geckoboard.com/datasets/sales.by_day/data \
-X POST \
-u '222efc82e7933138077b1c2554439e15:' \
-H 'Content-Type: application/json' \
-d '{
"data": [
{
"timestamp": "2018-01-01T12:00:00Z",
"amount": 819
},
{
"timestamp": "2018-01-02T12:00:00Z",
"amount": 409
},
{
"timestamp": "2018-01-03T12:00:00Z",
"amount": 164
}
]
}'
Response:
{}
Append will add new records to, or modify the already existing records within, your dataset. It uses the POST method.
If you haven’t included a unique_by array with your dataset definition, all new records will be appended to the existing contents of your dataset.
If you have included a unique_by array of fields, any conflict between your new and existing records will be resolved by merging your updates into the contents of your dataset. This can be used to modify existing records whose values have changed since your last update, or to fix an incorrect record.
Should the number of records in your dataset exceed the 5000-record limit following an append, old records will be discarded.
Attribute | Description |
---|---|
data | An array of objects with key + values representing a record in your dataset. |
delete_by | An optional string specifying the name of a date or datetime field used to order records for truncation. |
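As a sketch of how unique_by affects appends: the sales.by_day dataset above was created with unique_by set to timestamp, so re-posting a record with an existing timestamp updates that record in place rather than appending a duplicate (the corrected amount below is purely illustrative):
curl https://api.geckoboard.com/datasets/sales.by_day/data \
  -X POST \
  -u '222efc82e7933138077b1c2554439e15:' \
  -H 'Content-Type: application/json' \
  -d '{
    "data": [
      {
        "timestamp": "2018-01-03T12:00:00Z",
        "amount": 200
      }
    ]
  }'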
Replace all data in a dataset
PUT https://api.geckoboard.com/datasets/:id/data
Example:
curl https://api.geckoboard.com/datasets/sales.by_day/data \
-X PUT \
-u '222efc82e7933138077b1c2554439e15:' \
-H 'Content-Type: application/json' \
-d '{
"data": [
{
"timestamp": "2018-01-01T12:00:00Z",
"amount": 819
},
{
"timestamp": "2018-01-02T12:00:00Z",
"amount": 409
},
{
"timestamp": "2018-01-03T12:00:00Z",
"amount": 164
}
]
}'
Response:
{}
Replace will delete all the existing data within the dataset and then write the new data. In effect, your dataset will contain only the new records that you just pushed (you can think of it as similar to an overwrite action). It calls the PUT method.
Attribute | Description |
---|---|
data | An array of objects with key + values representing a record in your dataset. |
Append vs replace
Your particular setup and use case will largely determine which method you use, but using append (in combination with unique_by and delete_by) is nearly always preferable, as it's quicker, provides better performance, and lets you send more data to Geckoboard.
Consider append when:
- You’re always getting data for right now from your data source and want to build up a historical record in your dataset.
- You’re planning to push data more than once every few minutes – even if you’re updating a dataset that contains only a single record.
- You want to push more than 500 records to your dataset. For this you’ll need to send multiple appends (see the sketch after these lists).
Consider replace when:
- You’re not interested in maintaining historical data or displaying a comparison.
- You’re able to pull the complete data for the entire time period you’re interested in.
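To stay within the 500-records-per-request limit, a large push can be split into several appends. A minimal shell sketch, assuming a file records.json containing a JSON array of records, and the jq and split utilities:
# Split the array into one record per line, then into files of 500 lines each
jq -c '.[]' records.json | split -l 500 - chunk_

# Wrap each chunk back into a {"data": [...]} payload and append it
for f in chunk_*; do
  jq -s '{data: .}' "$f" > payload.json
  curl https://api.geckoboard.com/datasets/sales.by_day/data \
    -X POST \
    -u 'your-api-key:' \
    -H 'Content-Type: application/json' \
    -d @payload.json
done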
Clear all data in a dataset
Example:
curl https://api.geckoboard.com/datasets/sales.by_day/data \
-X PUT \
-u '222efc82e7933138077b1c2554439e15:' \
-H 'Content-Type: application/json' \
-d '{ "data": [] }'
Response:
{}
Wipes clean all the existing data in a dataset by passing an empty data array via the PUT method (i.e. Replace), leaving behind an empty dataset. The dataset itself and its schema will be preserved.
The example shows the empty array syntax.
Delete a dataset
DELETE https://api.geckoboard.com/datasets/:id
Example:
curl -X DELETE \
-u '222efc82e7933138077b1c2554439e15:' \
https://api.geckoboard.com/datasets/sales.by_day
Response:
{}
Deletes the dataset and all data with the given id.
Specify multiple field names
You can specify multiple field names as part of unique_by, as long as they are string, date or datetime fields and have unique identifiers.
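For example, a dataset tracking sales per store per day might combine a datetime field and a string field in unique_by, so that each timestamp/store pair identifies exactly one record (the store field below is purely illustrative):
"fields": {
  "amount": {
    "type": "number",
    "name": "Amount"
  },
  "timestamp": {
    "type": "datetime",
    "name": "Date"
  },
  "store": {
    "type": "string",
    "name": "Store"
  }
},
"unique_by": ["timestamp", "store"]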
Limits and quotas
API rate limit
"error": {
"message": "You have exceeded the API rate limit of 60 requests per minute. Try sending data less frequently"
}
}
There is basic rate limiting on the API. This restricts you to 60 requests per minute for your API key.
If you exceed your limit, the API will return a 429 TOO MANY REQUESTS status and error message.
Records per dataset
Each dataset can contain up to 5000 records.
When a dataset exceeds the record count limit, the oldest records (by insertion time) will be removed. This behaviour can be overridden by using the delete_by option when appending new records.
When set to the name of a date or datetime field, the delete_by option will be used to order your records (from newest to oldest) before records are truncated from the dataset.
If you specify a date field for delete_by, the Datasets API will try to avoid leaving your dataset with a partially complete day’s worth of data: when it deletes a record, it will also delete any records that have the same date value for that field.
If the delete_by field is a datetime field, only records with that exact same timestamp (i.e. same year, month, day, hour, minute, second and millisecond) will be deleted.
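A sketch of an append request that sets delete_by alongside the data, so that truncation past the 5000-record limit is ordered by the timestamp field rather than by insertion time (the record values below are purely illustrative):
curl https://api.geckoboard.com/datasets/sales.by_day/data \
  -X POST \
  -u '222efc82e7933138077b1c2554439e15:' \
  -H 'Content-Type: application/json' \
  -d '{
    "delete_by": "timestamp",
    "data": [
      {
        "timestamp": "2018-01-04T12:00:00Z",
        "amount": 312
      }
    ]
  }'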
Columns per dataset
Each dataset can contain up to 80 columns.
Records per request
Each PUT or POST request will accept up to 500 records, which includes both new records and updates to existing records.
Datasets per account
Each Geckoboard account is limited to 200 datasets. Once this limit is reached, no more datasets can be added, but you can delete datasets via the API.
Update frequency
Widgets powered by datasets update in real-time when new data is received.
There's no limitation on the frequency that you can send an update to a dataset, as long as it falls within the rate limit, but the visualizations on your dashboards will only show changes up to every 10 seconds. We'd recommend not updating a dataset more frequently than this.
Visualization requirements
Your schema determines the visualizations that can be built with your dataset on a Geckoboard dashboard.
Make sure to include these types of data in your schema if you're building a particular visualization.
Number visualization
The number visualization is focused on the display of a metric that can be represented by a single number, along with optional associated secondary metrics, such as a change or trend indication.
Visualization type | Required types |
---|---|
Number | duration or money or number or percentage |
Number with sparkline comparison | money or number or percentage |
Number with specific time-based sparkline comparison | money or number or percentage and date or datetime |
Number with percentage comparison | money or number or percentage |
Number with number comparison | money or number or percentage |
Number with goal comparison | money or number or percentage |
Gauge visualization
Gauges are a great way of representing a single data point that fluctuates over time, like a speedometer in a car. The gauge is most useful to quickly see a metric in comparison to defined minimum and maximum values.
Visualization type | Required types |
---|---|
Gauge | duration or money or number or percentage |
Gauge with needle on specific time-based value | duration or money or number or percentage and date or datetime |
Line Chart
Line charts are best used to track changes over time, using equal intervals of time between each data point.
There are two ways to create a multi-series line chart using the Datasets API. When creating a multi-series line chart, you’ll need to pick one:
- Having line chart series with identical data types (i.e. all of them money)
- Having a string value in the dataset to “split by” (which automatically generates a series for each different string value)
Visualization type | Required types |
---|---|
Line chart multi-series | duration or money or number or percentage |
Line chart X-axis | duration or date or datetime |
Column Chart
Column chart data is represented by rectangular bars with lengths proportional to the values that they represent. The column chart's discrete data is categorical data and answers the question of "how many?" in each category.
Visualization type | Required types |
---|---|
Column chart metric | duration or money or number or percentage |
Column chart X-axis | duration or date or datetime or string |
Multi-series column chart metric | duration or money or number or percentage |
Bar Chart
Bar charts display data using horizontal rectangular bars, where the length of the bar is proportional to the data value. The bar chart's discrete data is categorical data and answers the question of "how many?" in each category.
Visualization type | Required types |
---|---|
Bar chart metric | duration or money or number or percentage |
Bar chart X-axis | duration or date or datetime or string |
Leaderboard visualization
Leaderboards are a visualization of achievement. Their goal is to make comparisons between people's (or items') ranks.
Visualization type | Required types |
---|---|
Leaderboard label | string |
Leaderboard value | money or number or percentage |
Table visualization
Tables are used to display data from up to 10 columns from a dataset. There are two types of tables:
- Raw data: A row for each record showing the raw data.
- Summary: Aggregated data, grouped by a string or date.
Visualization type | Required types |
---|---|
Table raw data | Any data type, as long as there are at least two of them |
Table summary | duration or … |