Below you will find information about how to configure and send data to Azure Data Explorer (Kusto) databases. This service uses the streaming ingestion method to send data to your ADX cluster.
You need to create a service principal with access to your ADX cluster. The resource ID can be found in the Properties section of your ADX cluster, in the Resource ID field.
az ad sp create-for-rbac --name <mysp> --scopes /subscriptions/<subid>/resourcegroups/<rg>/providers/Microsoft.Kusto/Clusters/<cluster>
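The command prints the service principal's credentials as JSON; note the appId and tenant values, which are used below when granting ingestor permissions, and the password, which is the client secret. The output below is representative only; the values are placeholders and the exact fields can vary by Azure CLI version.
{
  "appId": "00000000-0000-0000-0000-000000000000",
  "displayName": "mysp",
  "password": "<client secret>",
  "tenant": "00000000-0000-0000-0000-000000000000"
}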
Below is a sample configuration that creates a table for temperature and humidity data, defines a JSON ingestion mapping, and adds the service principal to the table's ingestors; enabling the streaming ingestion policy is covered after the sample.
// create the table
.create table environmental (Ts: datetime, Sensor: string, Temperature: real, Humidity: real)
// create the JSON mapping named `Telemetry`
.create table environmental ingestion json mapping "Telemetry"
'['
' { "column": "Ts", "datatype": "datetime", "Properties":{"Path":"$.ts"}},'
' { "column": "Sensor", "datatype": "string", "Properties":{"Path":"$.sensor"}},'
' { "column": "Temperature", "datatype": "real", "Properties":{"Path":"$.values.temperature"}},'
' { "column": "Humidity", "datatype": "real", "Properties":{"Path":"$.values.humidity"}},'
']'
// add the service principal created above to the table's ingestors
.add table environmental ingestors ('aadapp=<sp appid>;<sp tenant>') 'IoT Bridge'
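The sample above does not show the step that enables ingestion into the table. When streaming ingestion is used, it must be enabled on the cluster itself (in the Azure portal) and the streaming ingestion policy must be enabled on the database or table. A minimal sketch, assuming the environmental table created above:
// enable the streaming ingestion policy on the table
.alter table environmental policy streamingingestion enable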
The Azure Data Explorer (Kusto) service contains the information required to establish a connection to your database, such as the cluster URI (shown on the Overview page of your cluster in the Azure portal), for example:
https://tartabit.eastus2.kusto.windows.net
Ingest a record to Kusto. Under the covers, your requests are batched for efficiency.
// The following example uses the schema created in the sample above.
// Ingest a record to the 'kusto' service using the 'Telemetry' JSON mapping.
azure_kusto.ingest('kusto', {ts: event.ts, sensor: event.endpoint.key, values: {temperature: -23.2, humidity: 48}}, {mapping: 'Telemetry'});
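Because individual calls are batched, it is fine to call ingest() several times in one script execution. The sketch below is illustrative only; it assumes the same azure_kusto.ingest signature shown above, and the readings array stands in for values decoded from your device payload.
// Hypothetical example: ingest multiple readings; the bridge batches them before sending to Kusto.
var readings = [
    { temperature: -23.2, humidity: 48 },
    { temperature: -22.9, humidity: 47 }
];
for (var i = 0; i < readings.length; i++) {
    azure_kusto.ingest('kusto', { ts: event.ts, sensor: event.endpoint.key, values: readings[i] }, { mapping: 'Telemetry' });
}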
The following events are considered billable: