DPH Computation

Data Processing operations per Hour (DPH) is the number of operations performed by the Servitly back end each hour to process incoming data.

A specific number of Data Processing operations is consumed each time data needs to be processed, which includes:

  • processing messages published by the connected products to save the raw metric values in the database;

  • processing of computed metrics with CONTINUOUS evaluation;

  • processing of work sessions;

  • processing of events (Failures, Anomalies, Operations) with CONTINUOUS evaluation.

Note that only CONTINUOUS evaluation consumes DPH; SAMPLED evaluation is free of charge.

  • CONTINUOUS: periodically, all the data points received since the last evaluation are considered in the computation, and multiple data points may be generated.

  • SAMPLED: periodically, only the last data point is considered in the computation, and at most one data point is generated.

The frequency of data publishing is up to the connected product, whereas periodic computations run at a fixed evaluation interval.

Day by day, Servitly's billing engine calculates the average DPH consumed by data point processing operations and adds the DPH consumed by periodic computations. Below you can find more details about how DPH is computed.

Data Points Processing

Each time the IoT Connector receives a message, the values of the fields mapped to metrics are extracted, and for each value saved in the database, one Data Processing operation is consumed. The data point count takes into account all types of metrics except the default ones (Connection Status, Cloud Status).

The number of data points is calculated by summing:

  • Incoming Data Points: the data points saved as a result of receiving an IoT message sent by the connected product.

  • Computed Data Points: the data points saved as a result of computed metric evaluations.

The number of DPH consumed by data points is obtained by dividing the number of data points saved in the month by the number of hours in the month.

// SCENARIO
A thing ONLINE from 08:00 to 18:00 (10 hours) every day
While ONLINE, the thing sends 1 message with 2 values every 10 seconds

// CALCULATIONS
Messages per hour: 3600 sec / 10 sec = 360
Total number of data points saved daily: 360 messages x 2 values x 10 hours = 7200 data points
DPH consumed to save 1 data point: 1 DPH
Daily average DPH: 7200 / 24 hours x 1 DPH = 300 DPH

The DPH consumption is strictly related to the message publishing rate and the number of metrics.
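As a cross-check of the example above, the daily average can be reproduced with a short script. The following Python snippet is an illustrative sketch, not part of Servitly; the function and parameter names are ours, and it assumes 1 DPH per saved data point, averaged over the 24 hours of a day.

# Illustrative sketch (not a Servitly API): daily average DPH for incoming data points.
def incoming_data_point_dph(message_period_s: float,
                            values_per_message: int,
                            online_hours_per_day: float) -> float:
    messages_per_hour = 3600 / message_period_s
    daily_data_points = messages_per_hour * values_per_message * online_hours_per_day
    return daily_data_points / 24  # 1 DPH per data point, averaged over the whole day

# Scenario above: 1 message with 2 values every 10 seconds, ONLINE 10 hours a day.
print(incoming_data_point_dph(10, 2, 10))  # -> 300.0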

Note that the rate of incoming data points is limited, and when this limit is reached, the exceeding data points are discarded.

For more details, refer to the Publishing Rate Limit article.

Computed Metrics Processing

For each computed metric with CONTINUOUS evaluation, the Servitly data computation engine periodically checks for new data points in the metric-based input variables.
The retrieved set of data points is then processed by the computation engine, and a data point is generated for each distinct timestamp and stored in the database as a new metric value.

Besides the number of metric-based input variables, the DPH consumption of a computed metric is strictly related to the evaluation interval.

// SCENARIO
A computed metric with 3 input metrics (expression: M1 x M2 x M3)
Evaluation interval = 120 sec

// FORMULA
DPH = (8 + (COUNT(Input_Metrics) - 1) x 4) x (60 / Computed_Metrics_Evaluation_Interval)

// CALCULATIONS
DPH at 60 sec evaluation interval: (8 DPH + (3 inputs - 1) x 4 DPH) x (60 / 60) = 16 DPH
DPH at 120 sec evaluation interval: 16 x (60 / 120) = 8 DPH
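The formula can also be expressed as a small helper; again, a sketch under the same assumptions, with illustrative names:

# Illustrative sketch (not a Servitly API): DPH of a CONTINUOUS computed metric.
def computed_metric_dph(input_metrics: int, evaluation_interval_s: float) -> float:
    # (base cost + cost per additional input) scaled by the evaluation frequency
    return (8 + (input_metrics - 1) * 4) * (60 / evaluation_interval_s)

print(computed_metric_dph(3, 60))   # -> 16.0
print(computed_metric_dph(3, 120))  # -> 8.0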

Note that, regardless of the evaluation type (SAMPLED or CONTINUOUS), during the computation a Computed Metric generates a data point whenever one of its inputs changes; therefore, in addition to the DPH consumed by the computation, a Computed Metric also consumes DPH for data point processing.

In the worst-case scenario, where inputs are continuously updated, a Computed Metric with SAMPLED evaluation can consume a maximum number of DPH, calculated as follows.

DPH = 3600 / Computed_Metrics_Evaluation_Interval

Depending on the input data points frequency, a SAMPLED Computed Metric may consume less DPH for data point processing than a CONTINUOUS Computed Metric with the same inputs.
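A sketch of this worst-case bound, with the same caveats as above:

# Illustrative sketch: worst-case data processing DPH of a SAMPLED computed metric.
def sampled_metric_max_data_processing_dph(evaluation_interval_s: float) -> float:
    # At most one data point can be generated per evaluation: 3600/interval per hour.
    return 3600 / evaluation_interval_s

print(sampled_metric_max_data_processing_dph(120))  # -> 30.0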

Work Sessions Processing

For each work-session definition, the Servitly work-session computation engine checks, according to the start and stop conditions, whether work-sessions must be created, or stopped and historicized. Moreover, while a work-session is active, the monitored metrics are periodically computed and updated on the work-session itself.

Besides the number of monitored metrics to be computed, the DPH consumption is strictly related to the evaluation interval.

// SCENARIO
A work-session definition with 2 metrics used in the Start/Stop conditions and 4 monitored metrics
Evaluation interval = 120 sec

// FORMULA
DPH = (30 + (COUNT(Conditions_Metrics) + COUNT(Monitored_Metrics) - 1) x 8) x (60 / WS_Evaluation_Interval)

// CALCULATIONS
DPH at 60 sec evaluation interval: (30 DPH + (2 condition metrics + 4 monitored metrics - 1) x 8 DPH) x (60 / 60) = 70 DPH
DPH at 120 sec evaluation interval: 70 x (60 / 120) = 35 DPH

For each monitored metric, the initial, minimum, maximum and final values are calculated and stored in the work-session.
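The same kind of helper, sketched for work-sessions under the assumptions already stated:

# Illustrative sketch (not a Servitly API): DPH of a work-session definition.
def work_session_dph(condition_metrics: int, monitored_metrics: int,
                     evaluation_interval_s: float) -> float:
    # (base cost + per-metric cost beyond the first) scaled by the evaluation frequency
    return (30 + (condition_metrics + monitored_metrics - 1) * 8) * (60 / evaluation_interval_s)

print(work_session_dph(2, 4, 60))   # -> 70.0
print(work_session_dph(2, 4, 120))  # -> 35.0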

Events Processing

For each event definition with CONTINUOUS evaluation, the Servitly event computation engine checks, according to the active and clear condition variables, whether events must be activated, or cleared and historicized.

Besides the number of metrics involved in the Active and Clear conditions, the DPH consumption is strictly related to the evaluation interval.

// SCENARIO
A FAILURE event definition with 3 metrics used in the Active/Clear conditions
Evaluation interval = 120 sec

// FORMULA
DPH = (20 + (COUNT(Conditions_Metrics) - 1) x 8) x (60 / Events_Evaluation_Interval)

// CALCULATIONS
DPH at 60 sec evaluation interval: (20 DPH + (3 condition metrics - 1) x 8 DPH) x (60 / 60) = 36 DPH
DPH at 120 sec evaluation interval: 36 x (60 / 120) = 18 DPH
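And the corresponding sketch for events, with the same caveats:

# Illustrative sketch (not a Servitly API): DPH of a CONTINUOUS event definition.
def event_dph(condition_metrics: int, evaluation_interval_s: float) -> float:
    # (base cost + per-metric cost beyond the first) scaled by the evaluation frequency
    return (20 + (condition_metrics - 1) * 8) * (60 / evaluation_interval_s)

print(event_dph(3, 60))   # -> 36.0
print(event_dph(3, 120))  # -> 18.0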