Absolute Pressure (a)
Pressure measured relative to a zero pressure (perfect vacuum) reference.
Absolute Pressure Sensor
Product whose output is proportional to the difference between applied pressure and a built-in fixed reference to vacuum (zero pressure). Typically, the minimum operating pressure (Pmin) is set to absolute zero pressure (perfect vacuum).
Ambient Conditions
The pressure, temperature or other environmental conditions of the medium surrounding the case of the transducer.
Accuracy
The maximum deviation in output from a best fit straight line (BFSL) fitted to output measured over the compensated pressure range at reference temperature. Includes all errors due to pressure non-linearity, pressure hysteresis and pressure non-repeatability.
Autozero
A compensation technique based on sampling output at a known reference condition, within the compensated temperature and compensated pressure range of the product. Typically, a zero pressure reference such as atmospheric pressure (or equal pressures on both pressure ports for a differential product) is employed to allow the external correction of offset error.
Best Fit Straight Line (BFSL)
The straight line fitted through a set of points which minimizes the sum of the square of the deviations of each of the points from the straight line (least squares method). See also: Pressure Non-Linearity.
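The least squares fit described above can be sketched as follows. This is a minimal illustration, not a vendor algorithm; the calibration points are hypothetical.

```python
# Illustrative sketch: fit a best fit straight line (BFSL) to calibration
# points by the least squares method, then report the maximum deviation
# of any point from that line. Data values are made up for the example.

def bfsl(points):
    """Least squares straight line y = m*x + b through (x, y) points."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b

# Hypothetical calibration record: pressure (bar) vs. output (mV)
cal = [(0.0, 0.1), (0.25, 7.6), (0.5, 15.2), (0.75, 22.6), (1.0, 30.0)]
m, b = bfsl(cal)
max_dev = max(abs(y - (m * x + b)) for x, y in cal)
```

The maximum deviation `max_dev` is the quantity typically quoted (as a percentage of full scale span) for BFSL non-linearity.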
Bridge Resistance
The input impedance of an uncompensated, unamplified analog output product.
Burst Pressure
The maximum pressure that may be applied to any port of the product without causing escape of pressure media. The product should not be expected to function after exposure to any pressure beyond the burst pressure. See also: Overpressure.
Calibration
A test during which known values of a measurand are applied to the transducer and the corresponding output reading is recorded under specified conditions.
Calibration Curve
A graphical representation of the calibration record.
Calibration Cycle
The application of known values of measurand, and recording of corresponding output readings, over the full (or specified portion of the) range of a transducer in an ascending and descending direction.
Calibration Record
A record (e.g., table or graph) of the measured relationship of the transducer output to the applied measurand over the transducer range. Note: Calibration records may contain additional calculated points so identified.
Calibration Traceability
The relation of a transducer calibration, through a specified step-by-step process, to an instrument or group of instruments calibrated by a national standards body such as NIST (formerly the National Bureau of Standards). Note: The estimated error incurred in each step must be known.
Common Mode Pressure
The applied “line” pressure which is common to both ports of a differential pressure sensor. See also: Maximum Common Mode Pressure.
Common Mode Voltage
The voltage between each of the output terminals of a differential output product and electrical ground.
Compensated Temperature Range
Temperature range over which the transducer has been corrected by a circuit that adjusts output for errors introduced by thermally-induced changes in bridge resistance.
Compound Range Pressure Sensor
Product for measuring gage pressures both above and below atmospheric pressure. Typically, the minimum operating pressure (Pmin) is set to 1 bar below atmospheric pressure (−1 bar gage).
Dead Volume
The open volume inside the product which is occupied by fluids being sensed. Does not include the flow channel for flow-through pressure products.
Differential Pressure (d)
Pressure difference measured between two pressure sources.
Differential Pressure Sensor
Product whose output is proportional to the difference between pressure applied to each of the pressure ports.
Drift
An undesired change in output over a period of time, where the change is not a function of the measurand.
Fluid Temperature Range
A measured fluid’s temperature range within which the transducer is intended to operate. When a fluid temperature range is not separately specified, it is the same as the operating temperature range.
Frequency Response
A quantitative measure of the response time of a transducer when an input is applied. Often confused with natural frequency.
Full Scale Output (FSO)
The actual output reading a transducer provides when the upper limit of the specified measurand range is applied. Not to be confused with full scale span (FSS), which is the difference between the readings at the range limits. See: Full Scale Span.
Full Scale Span (FSS)
The algebraic difference between output signal measured at the upper and lower limits of the operating pressure range. Also known as “span” or ambiguously as “full scale output.”
Pressure Hysteresis
The maximum difference between output readings when the same pressure is applied consecutively, under the same operating conditions, with pressure approaching from opposite directions within the specified operating pressure range.
Calibration Uncertainty
The maximum calculated error in the output values shown in a calibration record due to causes not attributable to the transducer.
Natural Frequency (eigenfrequency)
The frequency at which a system oscillates in the absence of any driving or damping force. A driving force applied at this frequency produces resonance.
Pressure Non-Linearity
The maximum deviation of product output from a straight line fitted to the output measured over the specified operating pressure range. Standard methods of straight line fit specified for this calculation are either best fit straight line (BFSL) or terminal straight line (TSL).
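As a sketch of the TSL variant of this calculation: fit the line through the measured end points and express the worst deviation as a percentage of full scale span. The calibration values are hypothetical.

```python
# Illustrative sketch: pressure non-linearity against a terminal
# straight line (TSL), i.e. the line through the end points of the
# measured output. Values are made up for the example.

def nonlinearity_tsl(points):
    """Max deviation from the end-point line, in % of full scale span."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    span = y1 - y0                      # full scale span (FSS)
    slope = span / (x1 - x0)
    dev = max(abs(y - (y0 + slope * (x - x0))) for x, y in points)
    return 100.0 * dev / span

cal = [(0.0, 0.0), (0.5, 15.3), (1.0, 30.0)]  # pressure (bar), output (mV)
print(nonlinearity_tsl(cal))  # mid point deviates 0.3 mV -> 1.0% FSS
```

A BFSL fit of the same data would generally report a smaller number, which is why datasheets state which line-fit method is used.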
Operating Temperature Range
The range of ambient temperatures within which the transducer is intended to operate.
Proof Pressure
The maximum pressure that may be applied to a device without changing its specified performance. It is also known as “overpressure.”
Ratiometric
A transducer whose output is directly proportional to the voltage potential supplied to the sensor, typically measured in mV/V or V/V. Example: A 3 mV/V sensor will provide a 30 mV full scale output signal at a 10 Vdc supply voltage.
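The 3 mV/V example above reduces to a single multiplication, sketched here for clarity:

```python
# Illustrative sketch of ratiometric output: the full scale output
# scales with the supply voltage.

def full_scale_output_mv(sensitivity_mv_per_v, supply_v):
    """Full scale output in mV for a given supply voltage."""
    return sensitivity_mv_per_v * supply_v

print(full_scale_output_mv(3.0, 10.0))  # 3 mV/V at 10 Vdc -> 30.0 mV
```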
Reference Pressure
The pressure used as a reference (zero) in measuring product performance. Unless otherwise specified, this is vacuum (0 psi a) for an absolute pressure sensor and local ambient atmospheric pressure (0 psi g) for gage, compound and differential pressure sensors.
Reference Temperature
The temperature used as a reference in measuring product performance, typically 25 ±3°C.
Pressure Non-Repeatability
The maximum difference between output readings when the same pressure is applied consecutively, under the same operating conditions, with pressure approaching from the same direction within the specified operating pressure range. See also: Pressure Hysteresis and Thermal Hysteresis.
Resolution
See: Output Resolution
Response Time
Time elapsed for output of the product to change from 10 to 90% of full scale span in response to a step change in input pressure from the specified minimum to maximum operating pressure.
Sensitivity
The ratio of output signal change to the corresponding input pressure change. Sensitivity is determined by computing the ratio of full scale span to the specified operating pressure range. Also known as “slope.”
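The ratio above, and the ideal transfer function it implies, can be sketched as follows; the range, span, and offset values are hypothetical.

```python
# Illustrative sketch: sensitivity ("slope") as full scale span over
# the operating pressure range, and the resulting ideal transfer
# function. Numbers are made up for the example.

P_MIN, P_MAX = 0.0, 1.0   # operating pressure range, bar
FSS = 30.0                # full scale span, mV
OFFSET = 0.0              # output at P_MIN, mV

sens = FSS / (P_MAX - P_MIN)   # sensitivity: 30.0 mV/bar

def ideal_output(p_bar):
    """Ideal transfer function: offset plus sensitivity times pressure."""
    return OFFSET + sens * (p_bar - P_MIN)

print(ideal_output(0.5))  # 15.0 mV at mid pressure
```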
Shift
An ambiguous term sometimes used to describe a permanent change in output of a sensor. The terms “offset shift” and “span shift” are also sometimes used to describe output changes due to temperature. To avoid confusion, these should be replaced by thermal effect on offset and thermal effect on span. See also: Drift.
Span Error
The maximum deviation in measured full scale span at reference temperature relative to the ideal (or target) full scale span as determined from the ideal transfer function. See also: Thermal Effect on Span
Stability
A transducer’s ability to reproduce output readings obtained during its original calibration at room conditions for a specified time period. It is typically expressed as “within percent of full scale output for a period of months.”
The voltage excitation used as a reference in measuring product performance, typically 5.00 ±0.01 Vdc.
Supply Voltage Operating Limits
The range of voltage excitation which can be supplied to the product to produce an output proportional to pressure; due to supply voltage ratiometricity errors, the output may not remain within the specified performance limits.
Temperature Error
The maximum change in output, at any measurand value within the specified range, when the transducer temperature is changed from room temperature to specified temperature extremes.
Temperature Error Band
The error band applicable over stated environmental temperature limits.
Temperature Gradient Error
The transient deviation in output of a transducer at a given measurand value, when the ambient temperature or the measured fluid temperature changes at a specified rate between specified magnitudes.
Terminal Line
A theoretical slope for which the theoretical end points are 0 and 100% of both measurand and output.
Theoretical Curve
The specified relationship (table, graph or equation) of the transducer output to the applied measurand over the range.
Theoretical End Points
The specified points between which the theoretical curve is established and to which no end point tolerances apply. The points can be other than 0 and 100% of both measurand and output.
Theoretical Slope
The straight line between the theoretical end points.
Thermal Coefficient of Resistance
The relative change in resistance of a conductor or semiconductor per unit change in temperature over a stated range of temperature. This is expressed in ohms per ohm per degree F or C.
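The definition above corresponds to the usual first-order resistance model, sketched here with a hypothetical coefficient and bridge resistance:

```python
# Illustrative sketch of thermal coefficient of resistance (TCR):
# first-order model R(T) = R_ref * (1 + alpha * (T - T_ref)).
# The alpha and resistance values are made up for the example.

def resistance_at(r_ref, alpha_per_degc, t_c, t_ref_c=25.0):
    """Resistance at temperature t_c given TCR alpha (ohm/ohm/degC)."""
    return r_ref * (1.0 + alpha_per_degc * (t_c - t_ref_c))

# A 5 kohm bridge with alpha = 0.002 ohm/ohm/degC, warmed 25 -> 85 degC:
print(resistance_at(5000.0, 0.002, 85.0))  # about 5600 ohm
```

This thermally induced change in bridge resistance is what the compensation circuit described under Compensated Temperature Range corrects for.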
Thermal Sensitivity Shift
The sensitivity shift due to changes of the ambient temperature from room temperature to the specified limits of the operating temperature range.
Thermal Zero Shift
The zero shift due to changes in ambient temperature from room temperature to the specified limits of the operating temperature range.
Threshold
The smallest change in the measurand that will result in a measurable change in transducer output. When the threshold is influenced by the measurand values, these values must be specified.
Time Constant
The time in seconds for a pressure sensor signal to change from 0 to 63.2% of the full scale when the sensor is exposed to an instantaneous full scale pressure change.
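The 63.2% figure comes from the first-order step response, sketched below with a hypothetical time constant; the same model relates the time constant to the 10-90% response time.

```python
# Illustrative sketch: first-order step response. At t equal to one
# time constant tau the signal reaches 1 - e^-1, i.e. 63.2% of full
# scale; the 10-90% response time of such a system is tau * ln(9).
import math

def step_response(t, tau, full_scale=1.0):
    """Signal level t seconds after an instantaneous full scale step."""
    return full_scale * (1.0 - math.exp(-t / tau))

tau = 0.01  # hypothetical 10 ms time constant
print(round(step_response(tau, tau), 3))  # 0.632 of full scale at t = tau
print(round(tau * math.log(9), 4))        # 10-90% response time, ~0.022 s
```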
Transducer
A device which provides a usable output in response to a specified measurand.
Transduction Element
The electrical portion of a transducer in which the output originates.
Transmitter
A transducer whose output is designed for transmission over long cable distances. Sensors with non-voltage outputs, providing either a current output or a digital output signal, fall into this category.
Transverse Acceleration
An acceleration perpendicular to the sensitive axis of the transducer.
Transverse Response
See: Transverse Sensitivity
Transverse Sensitivity
The sensitivity of a transducer to transverse acceleration or other transverse measurand. It is specified as maximum transverse sensitivity when a specified value of measurand is applied along the transverse plane in any direction, and is usually expressed in percent of the sensitivity of the transducer in its sensitive axis.
Vibration Error
The maximum change in output, at any measurand value within the specified range, when vibration levels of specified amplitude and range of frequencies are applied to the transducer along specified axes.
Vibration Sensitivity
See: Vibration Error
Voltage Ratio
For potentiometric transducers, the ratio of output voltage to excitation voltage, usually expressed in percent.
Warm-Up Period
The period of time, starting with the application of excitation to the transducer, required to assure that the transducer will perform within all specified tolerances.
Zero-Measurand Output
The output of a transducer, under room conditions unless otherwise specified, with nominal excitation and zero measurand applied.
Zero Shift
A change in the zero-measurand output over a specified period of time and at room conditions. This error is characterized by a parallel displacement of the entire calibration curve.