
Determination of the error of pressure sensors from temperature. Sensor error due to supply voltage fluctuations

When choosing a pressure sensor, every consumer aims to measure pressure with the accuracy stated in the technical documentation; this is one of the main selection criteria. GOST standards require the sensor data sheet to state the permissible basic measurement error (± from the true pressure). According to GOST 22520, these values are selected from the series 0.075; 0.1; 0.15; 0.2; 0.25; 0.4; 0.5%, etc., depending on the technical capabilities of the product. The basic error is normalized for normal (i.e. ideal) measurement conditions, which are defined in GOST 12997 and are also specified in the verification procedure for the measuring instrument. For example, according to MI 1997, the following ambient conditions must be established to determine the basic error:
- temperature 23 ± 2 °C;
- humidity from 30 to 80%;
- atmospheric pressure from 84 to 106.7 kPa;
- supply voltage 36 ± 0.72 V;
- absence of external magnetic fields, etc.
As you can see, the conditions under which the basic error is determined are almost ideal. Therefore, every calibration laboratory must be able to control them; for example, room temperature is regulated with microclimate equipment (heaters, air conditioners, etc.). But what readings the sensor will give under real operating conditions at a facility, for example at +80 °C or -30 °C, is an open question. The answer is given by the additional error, which is also normalized in the technical specifications and GOST standards.
Additional error is the deviation of the conversion function caused by a single influencing quantity (temperature, pressure, vibration, radio interference, supply voltage, etc.). It is calculated as the difference (ignoring the sign) between the error under operating (actual) measurement conditions and the error under normal conditions.
Of course, all operating-condition factors affect the output signal, but for pressure sensors (transmitters) the most significant one is deviation of the ambient air temperature. In GOST 22520 the additional error is normalized per 10 °C of deviation from normal conditions (i.e. from 23 °C). The GOST tolerances are as follows:

If the sensor meets these tolerances during temperature testing, then it “complies with GOST 22520,” which in most cases is written in the documentation for the sensor.
Let us analyze the accuracy of a sensor that complies with GOST 22520 when exposed to temperature. Take a sensor with a basic error of 0.5%, an operating temperature range of -30…+80 °C and an additional error of 0.45% per 10 °C. At 30 °C it may err by 0.5 + 0.45 = 0.95%; at 40 °C (a deviation of two 10 °C steps) by 1.4%; and finally at 80 °C we get 0.5 + 6 × 0.45 = 3.2%, the sum of the basic and additional errors. Remember that we are dealing with a 0.5% sensor, yet when operating at 80 °C its accuracy is 3.2% (about 6 times worse), and such a sensor still meets the requirements of GOST 22520.
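The arithmetic above can be sketched as a short Python helper (the function name and the 0.45%/10 °C coefficient are taken from this example, not from the GOST table itself):

```python
import math

def total_error_pct(t_c, basic_pct=0.5, add_per_10c_pct=0.45, t_normal_c=23.0):
    """Worst-case total error, %: the basic error plus the additional
    temperature error for every started 10 degC of deviation from the
    normal temperature (23 degC), as in the example above."""
    steps = math.ceil(abs(t_c - t_normal_c) / 10.0)
    return basic_pct + steps * add_per_10c_pct

for t in (30, 40, 80):
    print(f"{t} degC -> {total_error_pct(t):.2f} %")  # 0.95, 1.40, 3.20
```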
These results do not look good and will certainly not please the buyer of a sensor with a stated accuracy of 0.5%. Therefore most manufacturers apply thermal compensation to the output signal and tighten the requirements for the additional temperature error in the specifications for the specific sensor. For example, for SENSOR-M sensors, our technical specifications require less than 0.1% per 10 °C.
The purpose of temperature compensation is to reduce the additional temperature error to zero. The nature of additional temperature errors and the methods of temperature compensation will be considered in detail in the next article. Here I would like to summarize.
Both the basic error and the additional error must be taken into account, depending on the required measurement accuracy within the sensor's operating temperature range. The additional error of each sensor can be found in its data sheet, operating manual or product specifications. If the additional error is not specified in the sensor documentation, the sensor simply meets the GOST requirements analyzed above.
One should also distinguish the temperature compensation range from the operating temperature range. Within the compensation range the additional error is minimal; outside it, the general requirements apply again.

The basic qualitative characteristic of any instrumentation sensor is the measurement error of the controlled parameter. The measurement error of a device is the discrepancy between what the sensor showed (measured) and what actually exists. The measurement error for each specific type of sensor is indicated in the accompanying documentation (data sheet, operating manual, verification procedure) supplied with the sensor.

By the form of presentation, errors are divided into absolute, relative and reduced errors.

Absolute error is the difference between the value Xmeas measured by the sensor and the actual value Xd of that quantity: Δ = Xmeas - Xd.

The actual value Xd of the measured quantity is an experimentally found value that is as close as possible to its true value. In plain language, Xd is the value measured by a reference instrument, or generated by a calibrator or setpoint source of a high accuracy class. The absolute error is expressed in the same units as the measured quantity (for example m³/h, mA, MPa, etc.). Since the measured value can be either greater or less than the actual one, the measurement error can carry either a plus sign (the device reads high) or a minus sign (the device reads low).

Relative error is the ratio of the absolute measurement error Δ to the actual value Xd of the measured quantity: δ = (Δ / Xd) × 100%.

The relative error is expressed as a percentage, or is a dimensionless quantity, and can also take on both positive and negative values.

Reduced error is the ratio of the absolute measurement error Δ to the normalizing value Xn, which is constant over the entire measurement range or part of it: γ = (Δ / Xn) × 100%.


The normalizing value Xn depends on the type of instrumentation sensor scale:

  1. If the sensor scale is one-sided and the lower measurement limit is zero (for example, a scale of 0 to 150 m³/h), then Xn is taken equal to the upper measurement limit (here Xn = 150 m³/h).
  2. If the sensor scale is one-sided but the lower measurement limit is not zero (for example, a scale of 30 to 150 m³/h), then Xn is taken equal to the difference between the upper and lower measurement limits (here Xn = 150 - 30 = 120 m³/h).
  3. If the sensor scale is two-sided (for example, -50 to +150 °C), then Xn equals the width of the measurement range (here Xn = 150 - (-50) = 200 °C).

The reduced error is expressed as a percentage or as a dimensionless quantity and can also take both positive and negative values.
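The three error forms can be expressed as a minimal Python sketch (the helper names are illustrative, not from any standard); note that a single span formula, Xn = upper limit - lower limit, covers all three scale cases above:

```python
def absolute_error(x_meas, x_actual):
    """Absolute error: measured value minus actual value (same units)."""
    return x_meas - x_actual

def relative_error_pct(x_meas, x_actual):
    """Relative error: absolute error referred to the actual value, %."""
    return (x_meas - x_actual) / x_actual * 100.0

def reduced_error_pct(x_meas, x_actual, scale_min, scale_max):
    """Reduced error: absolute error referred to the normalizing value Xn, %.
    Xn = upper - lower covers all three scale cases:
    0..150 -> 150, 30..150 -> 120, -50..150 -> 200."""
    x_n = scale_max - scale_min
    return (x_meas - x_actual) / x_n * 100.0

# Example: a 0...150 m3/h flow sensor reads 101.5 m3/h against an
# actual (reference) value of 100.0 m3/h.
print(absolute_error(101.5, 100.0))              # 1.5 m3/h
print(relative_error_pct(101.5, 100.0))          # 1.5 %
print(reduced_error_pct(101.5, 100.0, 0, 150))   # 1.0 %
```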

Quite often the description of a sensor specifies not only the measurement range, for example 0 to 50 mg/m³, but also the indication range, for example 0 to 100 mg/m³. The reduced error in this case is normalized to the end of the measurement range, that is, to 50 mg/m³, while in the indication range from 50 to 100 mg/m³ the measurement error is not defined at all: in effect the sensor may show anything there with any error. The measurement range may be divided into several subranges, each with its own error, both in magnitude and in form of presentation. When verifying such sensors, each subrange may require its own reference measuring instruments, listed in the verification procedure for the device.

For some devices, the passports indicate the accuracy class instead of the measurement error. Such instruments include mechanical pressure gauges, indicating bimetallic thermometers, thermostats, flow indicators, pointer ammeters and voltmeters for panel mounting, etc. An accuracy class is a generalized characteristic of measuring instruments, determined by the limits of permissible basic and additional errors, as well as a number of other properties that affect the accuracy of measurements made with their help. Moreover, the accuracy class is not a direct characteristic of the accuracy of measurements performed by this device; it only indicates the possible instrumental component of the measurement error. The accuracy class of the device is applied to its scale or body in accordance with GOST 8.401-80.

When assigning an accuracy class to a device, it is selected from the series 1·10ⁿ; 1.5·10ⁿ; (1.6·10ⁿ); 2·10ⁿ; 2.5·10ⁿ; (3·10ⁿ); 4·10ⁿ; 5·10ⁿ; 6·10ⁿ (where n = 1, 0, -1, -2, etc.). The accuracy-class values shown in brackets are not assigned to newly developed measuring instruments.

The measurement error of sensors is determined, for example, during their periodic verification and calibration. Using various setpoint sources and calibrators, specific values of a physical quantity are generated with high accuracy, and the readings of the sensor under verification are compared with those of a reference instrument fed the same value of that quantity. The error is checked both on the upward stroke (the measured quantity increasing from the minimum to the maximum of the scale) and on the downward stroke (decreasing from maximum to minimum). This is because, due to the elastic properties of the sensing element (the membrane of a pressure sensor), differing rates of chemical reactions (an electrochemical sensor), thermal inertia, etc., the sensor readings differ depending on whether the influencing physical quantity is increasing or decreasing.

Quite often, in accordance with the verification procedure, the readings of the sensor during verification should be taken not from its display or scale but from the value of its output signal, for example the output current of a 4...20 mA current loop.

Example: a pressure sensor under verification has a measurement scale of 0 to 250 mbar and a basic relative measurement error of ±5% over the entire measurement range. The sensor has a 4...20 mA current output. A calibrator applies a pressure of 125 mbar to the sensor, and its output signal is 12.62 mA. It is necessary to determine whether the sensor readings are within acceptable limits.
First, calculate the output current Iout that the sensor should produce at the pressure P = 125 mbar:
Iout = Imin + ((Imax - Imin)/(Pmax - Pmin)) × (P - Pmin)
where Iout is the sensor output current at the given pressure of 125 mbar, mA;
Imin is the minimum output current of the sensor, mA (for a 4…20 mA output Imin = 4 mA; for a 0…5 or 0…20 mA output Imin = 0);
Imax is the maximum output current of the sensor, mA (for a 0…20 or 4…20 mA output Imax = 20 mA; for a 0…5 mA output Imax = 5 mA);
Pmax is the maximum of the pressure sensor scale: Pmax = 250 mbar;
Pmin is the minimum of the pressure sensor scale: Pmin = 0 mbar;
P is the pressure applied by the calibrator: P = 125 mbar.
Substituting the known values, we get:
Iout = 4 + ((20 - 4)/(250 - 0)) × (125 - 0) = 12 mA
That is, at an applied pressure of 125 mbar the current output should read 12 mA. Next, find the limits within which this calculated output current may vary, given the basic relative measurement error of ±5%:
ΔIout = 12 ± (12 × 5%)/100% = (12 ± 0.6) mA
That is, at 125 mbar the output signal should lie between 11.40 and 12.60 mA. By the conditions of the problem the output signal is 12.62 mA, which means the sensor does not meet the measurement error declared by the manufacturer and requires adjustment.
The actual basic relative measurement error of this sensor is:
δ = ((12.62 - 12.00)/12.00) × 100% ≈ 5.17%
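This pass/fail check can be sketched in a few lines of Python (the helper names are illustrative; the defaults match the 0...250 mbar, 4...20 mA example):

```python
def expected_current_ma(p, p_min=0.0, p_max=250.0, i_min=4.0, i_max=20.0):
    """Ideal output current, mA, for pressure p on a linear 4...20 mA scale."""
    return i_min + (i_max - i_min) * (p - p_min) / (p_max - p_min)

def passes_verification(i_measured, p, rel_err_pct=5.0):
    """True if the measured current lies within +-rel_err_pct of the ideal value."""
    i_ideal = expected_current_ma(p)
    delta = i_ideal * rel_err_pct / 100.0
    return i_ideal - delta <= i_measured <= i_ideal + delta

print(expected_current_ma(125.0))         # 12.0 mA
print(passes_verification(12.62, 125.0))  # False: outside 11.40...12.60 mA
```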

Verification and calibration of instrumentation must be carried out under normal environmental conditions of atmospheric pressure, humidity and temperature, and at the rated supply voltage of the sensor, since elevated or reduced temperature and supply voltage can introduce additional measurement errors. The verification conditions are specified in the verification procedure. Devices whose measurement error does not fall within the limits established by the verification procedure are either adjusted and calibrated again, after which they are re-verified, or, if the adjustment does not help, for example because of aging or excessive deformation of the sensing element, they are repaired. If repair is impossible, the devices are rejected and withdrawn from service.

If the devices could nevertheless be repaired, they are subject not to periodic but to primary verification, with all the steps set out in the verification procedure for that type of verification. In some cases a device is deliberately given minor repairs, since under the verification procedure a primary verification turns out to be much easier and cheaper than a periodic one, owing to differences in the sets of reference instruments used for periodic and primary verification.

To consolidate and test the knowledge gained, I recommend working through this example yourself.

Temperature error of the sensor

This error is not indicated in the sensor data sheet, since the sensor itself does not have it. It can be eliminated by changing the sensor's connection circuit (replacing the voltage stabilizer supplying the sensor with a current stabilizer and switching from a three-wire to a four-wire line). But if this is not done, the resulting error should be taken into account, at least approximately, when calculating the resulting channel error.

Changes in readings caused by deviation of the operating conditions from normal, i.e. additional errors, are normalized by stating influence coefficients ψ that relate a change in each individual influencing quantity to the change in readings. Although these influence functions are, as a rule, nonlinear, for ease of calculation they are approximately treated as linear, and the resulting additional error is determined as

γ_add = ψ · Δξ,

where Δξ is the deviation of the influencing quantity from normal conditions.

The maximum value of the temperature error at Δξ = 3 K:

To pass from the calculated maximum value of this error, which occurs at the maximum temperature deviations (down to 5 °C or up to 35 °C), to the standard deviation, we need to know the temperature distribution in the workshop. We have no data on it, so let us make a purely heuristic assumption: the temperature is normally distributed and reaches the critical values on 8 days a year, while on the remaining 365 - 8 = 357 days, i.e. in 357/365 ≈ 0.98 of cases, it stays within the limits. From the normal distribution table, the probability P = 0.98 corresponds to limits of ±2.3σ. From here:
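The ±2.3σ figure can be checked numerically: we look for the z such that P(|X| ≤ z) = 0.98 for a standard normal X, using only the standard library (a sketch; bisection over the normal CDF, helper names illustrative):

```python
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def two_sided_quantile(p, lo=0.0, hi=10.0, tol=1e-9):
    """Find z such that P(|X| <= z) = p, i.e. 2*Phi(z) - 1 = p, by bisection."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if 2.0 * normal_cdf(mid) - 1.0 < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(round(two_sided_quantile(0.98), 2))  # 2.33, which the text rounds to 2.3
```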

Normal distribution parameters: k = 2.066, h = 0.577, ε = 3.

The temperature error is multiplicative, i.e. it scales the result (a sensitivity error): the width of the error band grows in proportion to the input value x, and at x = 0 it is also zero.

Sensor error due to supply voltage fluctuations

This error is purely multiplicative and is distributed according to the same law as the deviation of the mains voltage from its nominal 220 V. The mains voltage distribution is close to triangular, with the ±15% limits accepted above. The stabilizer reduces the swing of the voltage fluctuations by K = 25 times, so at the stabilizer output the distribution is also triangular, but with limits of 15%/25 = ±0.6%. The maximum value of this error is therefore γ_U,max = 0.6%. The standard deviation of a triangular distribution with limits ±a is σ = a/√6.
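For a symmetric triangular distribution on [-a, +a] the standard deviation is σ = a/√6; a quick numeric check for the ±0.6% limits at the stabilizer output (variable names are illustrative):

```python
import math

def triangular_sigma(half_width):
    """Standard deviation of a symmetric triangular distribution on [-a, +a]."""
    return half_width / math.sqrt(6.0)

a = 15.0 / 25.0                       # +-0.6 % limits at the stabilizer output
print(round(triangular_sigma(a), 3))  # ~0.245 %
```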

1. Features of the use of pressure sensors

The areas of application of pressure sensors (pressure transducers) are quite wide, but, as a rule, each specific application has its own specifics that must be taken into account in the design of the sensors.

In general, all applications of pressure transducers can be divided into two main groups:

  • Measuring the actual pressure (or vacuum) of any medium in a pipeline or technological installation;
  • Measuring the level of liquids in containers (tanks) by measuring the pressure of the liquid column (hydrostatic level sensor).

When selecting pressure sensors of both groups, it is necessary to clarify the following application features:

  • Hygiene requirements: the food and pharmaceutical industries place high hygiene demands on pressure sensors both at the point of contact with the product and outside it (as a rule, such sensors are made entirely of stainless steel). The range offered by KIP-Service LLC includes KLAY-INSTRUMENTS pressure sensors, which are specially designed for use in the dairy, brewing and food industries.
  • Availability of certificates: often, in addition to the usual GOST R certificate (or declaration) of conformity, various applications require additional certificates. For example, commercial metering systems require a measuring instrument type approval certificate; using pressure sensors in the food industry requires an SES conclusion; applications in hazardous industries require a Rostechnadzor permit, etc.
  • Explosion protection requirements: In explosive industries (for example, oil and gas, chemical, alcohol industries), explosion-proof pressure sensors are used. The most widely used types of explosion protection for sensors are intrinsically safe Ex ia circuits and explosion-proof enclosure Ex d, the choice of which is determined by the specific application.
  • Type of measured medium: if the measured medium is viscous, aggressive, poorly flowing, or has other specific properties (for example, dirt particles), these features must also be taken into account. Such applications may require diaphragm-seal pressure sensors (equipped with a separating membrane), which protect the sensing element from aggressive media.
  • Presence of external influences: the presence of vibration, electromagnetic fields or other mechanical or electrical influences.

When selecting pressure sensors for Group I applications when measuring pressures greater than 1 bar, you also need to consider:

  • Presence of water hammer in the system: if there may be water hammer in the system, the pressure sensor must be selected with a sufficient margin for overload (peak pressure) or measures must be taken to compensate for water hammer (silencers, special sensors, etc.) on site;
  • Optional equipment: as a rule, pressure sensors are mounted via 3-way valves; in addition, when measuring steam pressure it is recommended to connect them through a special device, a siphon (Perkins tube), which reduces the temperature of the medium acting on the sensor.

When selecting pressure sensors for use as hydrostatic level sensors, it is necessary to take into account the fact that the pressure value at the same height of the liquid column can change with changes in the density of the measured medium.

2. Measuring range

The measurement range of a pressure sensor is the range of pressure values within which the sensor performs measurement and linear conversion of the measured value into a unified output signal.

The measurement range is defined by the lower and upper measurement limits, which correspond to the minimum and maximum values of the measured pressure. Examples of measuring ranges: 0…1 bar, 0…2.5 MPa, -100…100 kPa.

When selecting pressure sensors, keep in mind that sensors come both with a fixed measurement range (for example, PD100 pressure transducers) and with an adjustable measurement range (for example, KLAY-INSTRUMENTS pressure sensors). For sensors with a fixed range, the output signal values are rigidly tied to the measurement limits: a PTE5000 sensor configured for 0…0.6 MPa will output 4 mA at a pressure of 0 MPa and 20 mA at 0.6 MPa. In turn, the KLAY 8000-E-S sensor has an adjustable range of 0…1 to 0…4 bar: at 0 bar it will likewise output 4 mA, while 20 mA will correspond to any upper limit from 1 to 4 bar, set by the user with the special "SPAN" potentiometer.

3. Process temperature

The temperature of the measured medium is a very important parameter when choosing a pressure sensor. The sensor must be selected so that the process temperature does not go beyond its permissible operating temperature range.

In the food industry there are short-term (20 to 40 minutes) CIP and SIP cleaning (sanitization) processes, during which the temperature of the medium can reach 145 °C. For such applications, sensors should be used that withstand such temporary exposure to high temperatures, for example the KLAY-INSTRUMENTS SAN pressure sensors 8000-SAN and 2000-SAN.

The readings of all pressure sensors based on the strain-gauge (piezoresistive) conversion principle depend strongly on the temperature of the measured medium, since the resistance of the resistors that make up the sensor's measuring circuit also changes with temperature.

For pressure sensors, the concept of “temperature error” is introduced, which is an additional measurement error for every 10 °C change in the temperature of the measured medium relative to the base temperature (usually 20 °C). Thus, the process temperature must be known to determine the total measurement error of the pressure sensor.

To reduce the influence of temperature, pressure transmitters use various temperature compensation schemes.

Based on the use of temperature compensation, all pressure sensors can be divided into three groups:

  • Budget pressure sensors that do not use thermal compensation circuits;
  • Mid-price sensors using passive thermal compensation circuits;
  • High-level pressure sensors for systems requiring measurement accuracy that use active temperature compensation circuits.

To measure the pressure of media with a constant temperature of more than 100 °C, special high-temperature pressure sensors are used, which make it possible to measure the pressure of media with temperatures up to 250 °C. As a rule, such sensors are equipped with a cooling radiator and/or have a special design that allows the part of the sensor with electronics to be placed in an area with an acceptable operating temperature.

4. Type of connection between the sensor and the process

Type of connection of the sensor to the process - the type of mechanical inclusion of the pressure sensor in the process to carry out measurements.

The most popular connections for pressure transmitters of general industrial design are threaded connections G1/2″ DIN 16288 and M20x1.5.

When selecting a sensor, the type of connection must be specified to ensure easy installation into the existing system without additional work (welding, cutting other types of threads, etc.).

The food, pulp and paper, and chemical industries use the most diverse types of process connections. For example, KLAY-INSTRUMENTS pressure sensors, which are specially designed for these industries, can be manufactured with more than 50 different process connection options.

The choice of connection type is most critical in the food industry, because along with convenience the connection must above all be hygienic and free of "dead zones" for the sanitization process. For pressure sensors intended to work in contact with food, there are special certificates confirming their sanitary properties: the European EHEDG (European Hygienic Equipment Design Group) certificate and the American 3-A Sanitary Standards certificate. In Russia, sensors in contact with food media must have a sanitary and epidemiological report. In the range offered by KIP-Service LLC, the requirements of these certificates are met by the 8000-SAN and 2000-SAN series sensors from KLAY-INSTRUMENTS.

5. Environmental parameters

When selecting pressure transmitters, the following environmental parameters must be taken into account:

  • Ambient temperature;
  • Ambient humidity;
  • Presence of aggressive substances.

All environmental parameters must be within acceptable limits for the selected pressure transducer.

If aggressive substances are present in the environment, many pressure sensor manufacturers (including KLAY-INSTRUMENTS BV) offer special versions that are resistant to chemical attack.

When operating in high humidity with frequent temperature changes, pressure sensors from many manufacturers suffer from corrosion of the sensing element. The main cause of this corrosion is the formation of condensate.

Gauge pressure transmitters, in order to measure relative pressure, require a connection between the sensing element and the atmosphere. In low-cost sensors this connection is made through the non-tight housing (an IP65 connector); with this design, moist air gets inside the sensor and condenses when the temperature drops, gradually corroding the measuring element.

For applications where conventional pressure transmitters fail due to sensor corrosion, KLAY-INSTRUMENTS industrial pressure transmitters are ideal. For KLAY pressure transmitters, the sensor is connected to the atmosphere through a special "breathing" membrane made of Gore-Tex material, which prevents moisture from penetrating into the sensor.

In addition, the sensor contacts of all KLAY sensors are filled by default with a special synthetic compound for additional protection sensor against corrosion.

6. Pressure sensor output type

The most common analog output signal for pressure sensors is a unified 4...20 mA current signal.

Almost always 4 mA corresponds to the lower value of the measurement range, and 20 mA to the upper value, but sometimes a reverse signal occurs (usually on vacuum ranges). Also in industry there are pressure sensors with other types of analog output signals, for example: 0...1 V, 0...10 V, 0...20 mA, 0...5 mA, 0...5 V.

The range of pressure sensors stocked by KIP-Service LLC includes only sensors with a 4...20 mA output signal. To obtain another type of output signal from 4...20 mA, you can use the Seneca Z109 REG2 universal signal converter, which converts between almost all types of unified current and voltage signals while providing galvanic isolation.

Intelligent pressure sensors, in addition to the main 4...20 mA signal, can be supplied with support for the HART protocol, through which the sensor can be configured and its status and additional information read out.

In addition to the analog output signal, smart pressure transmitters also come with a digital output signal. These are sensors with Profibus PA protocol output, which SIEMENS uses in its devices.

7. Required measurement accuracy

When calculating the measurement error of pressure sensors, it must be taken into account that in addition to the main error, there is an additional error.

Basic error is the error of the pressure sensor relative to the measurement range, declared by the manufacturer for normal operating conditions. As a rule, normal operating conditions mean the following:

  • Ambient and working temperature - 20 °C;
  • The pressure of the working medium is within the measuring range of the sensor;
  • Normal atmospheric pressure;
  • Absence of flow turbulence or other phenomena at the location of the sensor that could affect the readings.

Additional error is the error caused by deviation of the operating conditions from normal, owing to the characteristics of the specific application. One of its main components is the temperature error, which is indicated in the technical documentation for pressure sensors and can be calculated for a specific temperature of the working medium.

Also, additional error can be caused by turbulence of the flow of the measured medium, changes in the density of the medium during hydrostatic level measurement, dynamic loads on equipment while moving in space (vessels, vehicles, etc.) and other possible factors.

When calculating the error of the measuring system as a whole, the accuracy class of the indicating instrument must also be taken into account.

As an example, let's calculate the total measurement error for the following system:

Given:

  • The KLAY-Instruments 8000-SAN-F-M(25) pressure sensor is installed on the product pipeline;
  • The maximum product pressure is 4 bar, so the sensor is set to a range of 0…4 bar;
  • The maximum product temperature is 60 °C;
  • Flow turbulence and other factors do not affect accuracy.

Solution:

  • From the data sheet, the basic error of the 8000-SAN-F-M(25) sensor is 0.2%;
  • The temperature error, per the data sheet, is 0.015%/°C, so the temperature error at 60 °C is 0.015%/°C × (60 °C - 20 °C) = 0.6%;
  • Adding the accuracy class of the indicator (0.25), the total relative error is 0.2% + 0.6% + 0.25% = 1.05%;
  • The absolute measurement error of this system is 1.05% × 4 bar = 0.042 bar.
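The same calculation as a small Python sketch (the function name and argument order are illustrative):

```python
def total_system_error(basic_pct, temp_coeff_pct_per_c, t_process_c,
                       indicator_class_pct, span_bar, t_base_c=20.0):
    """Total relative error, %, and absolute error, bar, of the measuring
    system: basic error + temperature error + indicator accuracy class."""
    temp_err_pct = temp_coeff_pct_per_c * (t_process_c - t_base_c)
    total_rel_pct = basic_pct + temp_err_pct + indicator_class_pct
    return total_rel_pct, total_rel_pct / 100.0 * span_bar

rel_pct, abs_bar = total_system_error(0.2, 0.015, 60.0, 0.25, 4.0)
print(rel_pct, abs_bar)  # ~1.05 % and ~0.042 bar
```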