Introduction to Data Connectors
A quick introduction to what Data Connectors are and how they work.
Data Connectors are designed to reliably and securely forward live sensor events from DT Cloud to your cloud in real time. This allows you to store sensor data within your own systems and perform processing, analysis, or visualization as soon as new data becomes available. Data Connectors have a built-in retry mechanism in case your cloud is down for maintenance or otherwise unavailable, and they provide a mechanism to verify the origin and integrity of the sensor events.
Data Connectors are the recommended way to build an integration between DT Cloud and your system whenever possible.
The flow of a Data Connector can be summarized in a few steps:
1. A sensor sends a data event to DT Cloud via a Cloud Connector.
2. DT Cloud uses the Data Connector to forward the event to your cloud.
3. Your cloud acknowledges that the event was received.
The only Data Connector type supported today is the webhook. A webhook forwards each sensor event to your server as an HTTPS POST request to the endpoint URL configured on the Data Connector. See the Receiving Events section for more information about how to implement your cloud service.
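To illustrate, here is a minimal sketch of a receiving endpoint, assuming a Python Flask application and an endpoint path of /dt-events. Both are illustrative choices, not requirements of the Data Connector itself.

```python
# Minimal sketch of a webhook receiver. Flask and the /dt-events path
# are assumptions made for this example only.
from flask import Flask, request

app = Flask(__name__)

@app.route("/dt-events", methods=["POST"])
def receive_event():
    event = request.get_json()        # the forwarded sensor event
    print("received event:", event)   # replace with your own processing
    return "", 200                    # a 2xx response acknowledges the event

if __name__ == "__main__":
    # HTTPS termination is assumed to be handled by a reverse proxy or
    # load balancer in front of this application.
    app.run(port=8080)
```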
Data Connectors have properties that make them particularly well suited for forwarding your sensor data from DT Cloud to an external service with minimal effort.
Data Connectors have a retry mechanism that guarantees at-least-once delivery of every event for up to 12 hours. All device events received by DT Cloud are put in a dedicated queue per Data Connector. Events are only removed from this queue once they are acknowledged by your cloud, or once they are older than 12 hours. See the retry policy for more details.
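Because delivery is at least once, the same event can be delivered more than once after a retry. Below is a minimal sketch of idempotent handling that deduplicates on an event identifier; the field path used here is an assumption, so check the schema of the events you actually receive.

```python
# Sketch of idempotent handling under at-least-once delivery.
# The "event.eventId" field path is an assumption for illustration.
seen_event_ids = set()   # use a persistent store in production


def process(event: dict) -> None:
    """Placeholder for your own processing logic."""
    print("processing:", event)


def handle_event(event: dict) -> None:
    event_id = event.get("event", {}).get("eventId")
    if event_id in seen_event_ids:
        return               # duplicate delivery after a retry; safe to ignore
    seen_event_ids.add(event_id)
    process(event)
```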
Data Connectors give you the lowest possible end-to-end latency from the sensor to your cloud. DT Cloud will forward each event individually to your cloud as soon as it is received from the sensor. We do not introduce wait states to aggregate events or process events sequentially, keeping latency as low as possible.
All events sent by a Data Connector to an external service are encrypted using TLS. Optionally, a signature secret can be configured and used to verify both the origin and integrity of each event. We recommend using a signature secret with Data Connectors in a production environment. See Configuring a Data Connector for more information.
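As an illustration only, the sketch below assumes the signature arrives as a JWT signed with the Data Connector's secret using HS256 and carried in an X-Dt-Signature request header. The header name and token format are assumptions here; follow Configuring a Data Connector for the exact scheme.

```python
# Sketch of verifying a signed event. The X-Dt-Signature header name and
# the JWT/HS256 scheme are assumptions made for this example.
import jwt  # pip install PyJWT

SIGNATURE_SECRET = "my-signature-secret"   # the secret set on the Data Connector


def verify_signature(headers: dict) -> bool:
    token = headers.get("X-Dt-Signature", "")
    try:
        # A token that validates against the shared secret indicates the
        # request originated from DT Cloud and was not tampered with.
        jwt.decode(token, SIGNATURE_SECRET, algorithms=["HS256"])
    except jwt.InvalidTokenError:
        return False
    return True
```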
DT Cloud will never wait for an event delivery to complete before sending the next event. A new event will always be sent instantly to your cloud, running as many requests in parallel as there are events at any given moment.
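Since deliveries can arrive in parallel, a common pattern is to acknowledge each request quickly and hand the event off to a background worker. This is a suggestion, not a requirement of Data Connectors; the queue-and-worker sketch below is one way to keep request handling fast under bursts.

```python
# Sketch of acknowledging quickly and processing events off the request
# thread, so parallel deliveries are not slowed down by heavy processing.
import queue
import threading

events = queue.Queue()


def worker() -> None:
    while True:
        event = events.get()
        print("processing:", event)   # heavy work happens here
        events.task_done()


threading.Thread(target=worker, daemon=True).start()


def receive_event(event: dict) -> tuple:
    events.put(event)    # hand off immediately...
    return "", 200       # ...and acknowledge with a 2xx right away
```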