Output Selector: Standard Deviation
The Standard Deviation output selector calculates and outputs the standard deviation of the Number values input to it from all the records being merged together.
The Standard Deviation selector is used when performing statistical analysis on data where records are grouped by a common attribute value (or values), or matched using complex rules.
The following table describes the configuration options:
Configuration | Description
---|---
Inputs | Any Number attributes from any input data sets. If the specified attribute is null for a given record, that record is ignored for the purposes of calculating the standard deviation.
Options | Specify the following options:
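The null-handling rule above can be sketched in Python. This is an illustrative sketch only (the selector itself is not Python-based): null attribute values are modeled as `None` and dropped before the population standard deviation is taken.

```python
import statistics

# Hypothetical merge group: the Number attribute is null (None) for one record.
attribute_values = [45, None, 66, 76]

# Records whose attribute is null are ignored for the calculation.
present = [v for v in attribute_values if v is not None]

# Population standard deviation of the remaining values.
result = round(statistics.pstdev(present), 2)
print(result)  # 12.92
```

The record with the null value contributes nothing to either the mean or the spread; only the three present values are used.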
Example
In this example, the Standard Deviation output selector is used to select the standard deviation of a Number attribute from each record. The processor has been configured to treat the input as the whole population of values.
Example output
The following table shows examples of output selection using the Standard Deviation selector:
Table 1-109 Example Output Using Standard Deviation Selector
Input values | Output value (Standard Deviation)
---|---
45, 66, 76, 78, 87, 94, 98, 99, 103 | 17.72
43, 45, 54, 76, 87, 89, 94, 99, 103 | 22.12
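The outputs in Table 1-109 can be reproduced with a population standard deviation, matching the whole-population configuration described above. A minimal check in Python (illustrative only; the product does not expose a Python API):

```python
import statistics

# Input rows from Table 1-109. Because the processor treats the input
# as the whole population, pstdev (population standard deviation) applies,
# not the sample standard deviation (stdev).
rows = [
    [45, 66, 76, 78, 87, 94, 98, 99, 103],
    [43, 45, 54, 76, 87, 89, 94, 99, 103],
]

for values in rows:
    print(round(statistics.pstdev(values), 2))
# Prints 17.72, then 22.12 - the values shown in the table.
```

Had the processor been configured to treat the input as a sample rather than the whole population, the divisor would be n - 1 instead of n and the outputs would be slightly larger.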