C.10.9.1.4.2 Channel Sensitivity and Channel Sensitivity Units

Channel Sensitivity is the nominal value of one unit (i.e., the least significant bit) of each waveform sample in the Waveform Data attribute (5400,1010). It includes both the amplifier gain and the analog-to-digital converter resolution. It does not relate to the vertical scaling of a waveform on a particular display.
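
To illustrate how amplifier gain and converter resolution combine into a single Channel Sensitivity value, here is a minimal sketch; the front-end parameters (ADC range, bit depth, gain) are assumptions for illustration, not values drawn from the Standard:

```python
# Hypothetical acquisition front end: how amplifier gain and ADC
# resolution combine into one Channel Sensitivity value.
adc_full_scale_v = 5.0    # ADC input range in volts (assumed)
adc_bits = 16             # ADC resolution in bits (assumed)
amplifier_gain = 1000.0   # analog gain ahead of the ADC (assumed)

# Value of one least significant bit, referred to the amplifier input.
lsb_volts = adc_full_scale_v / (2 ** adc_bits) / amplifier_gain

# Expressed in microvolts, this would be conveyed as Channel
# Sensitivity (003A,0210) with Channel Sensitivity Units (003A,0211)
# coded as microvolts.
channel_sensitivity_uv = lsb_volts * 1e6
print(channel_sensitivity_uv)  # ~0.0763 microvolts per LSB
```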

Note: The Defined (default) Context Group for Channel Sensitivity Units Sequence is CID 3082 Waveform Units of Measurement, which includes the commonly used units of measurement. Units of measurement not included in the default list can be specified using the more general CID 82 Units of Measurement, or a local Coding Scheme. The Defined Context ID may be replaced in a specialization of the IOD.

Channel Sensitivity Correction Factor (003A,0212) is the ratio of the actual (calibrated) value to the nominal Channel Sensitivity specified in Data Element (003A,0210). Thus, multiplying a waveform sample value by the Channel Sensitivity yields the nominal measured value in Channel Sensitivity Units, and multiplying that nominal value by the Channel Sensitivity Correction Factor yields the calibrated measured value.
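
A minimal sketch of this full chain, reading the relevant attributes with pydicom; the file name and the raw sample value are hypothetical, the attribute keywords follow the standard DICOM data dictionary, and a single multiplex group with at least one channel is assumed:

```python
from pydicom import dcmread

# Hypothetical waveform object; the path is illustrative only.
ds = dcmread("ecg_waveform.dcm")

# First multiplex group (5400,0100), first channel definition (003A,0200).
chan = ds.WaveformSequence[0].ChannelDefinitionSequence[0]

sensitivity = float(chan.ChannelSensitivity)  # (003A,0210)
# Correction factor (003A,0212); default to 1.0 if absent (assumed fallback).
correction = float(chan.get("ChannelSensitivityCorrectionFactor", 1.0))
units = chan.ChannelSensitivityUnitsSequence[0].CodeMeaning  # (003A,0211)

# One sample from Waveform Data (5400,1010); the value is illustrative.
raw_sample = 100

nominal = raw_sample * sensitivity  # nominal value in Channel Sensitivity Units
calibrated = nominal * correction   # calibrated value per (003A,0212)
print(f"{calibrated} {units}")
```

Note that this sketch covers only the scaling described in this section; it deliberately omits other channel attributes (such as baseline offsets) that a complete interpretation of the waveform would also take into account.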