Range
Definition: In industrial and instrumentation contexts, range is the span between the minimum and maximum values a sensor, instrument, or system can measure or operate within while maintaining acceptable performance. It sets the boundaries of valid data collection and safe equipment operation.
Key Takeaways
- Range defines the minimum-to-maximum span a sensor can measure or a system can safely operate within.
- Sensor range and operating range are distinct: one governs what can be measured, the other governs environmental limits.
- Choosing a range that is too wide reduces resolution; choosing one that is too narrow causes signal saturation.
- Range, accuracy, and resolution are related but separate specifications that must all be evaluated together.
- Matching range to expected process conditions is a prerequisite for reliable condition monitoring and fault detection.
What Is Range?
Range is one of the most fundamental specifications in industrial measurement. When a maintenance or reliability engineer selects a sensor for a new monitoring point, range is often the first parameter to verify because it determines whether the sensor can physically detect the signals of interest.
For a sensor, range is expressed as a lower bound and an upper bound: for example, a pressure sensor rated for 0 to 10 bar or a temperature sensor covering -40 to 150 degrees Celsius. Any signal that falls within this span can be measured. Any signal outside it will be clipped, ignored, or reported inaccurately. Operating range follows the same logic but applies to environmental conditions rather than measured variables.
In practice, "range" is used loosely across industries and sometimes conflated with related concepts such as span, scale, or full-scale output. Understanding the precise definition prevents costly specification errors during sensor selection and installation planning.
Sensor Range vs. Operating Range
Two distinct range specifications appear on every industrial sensor datasheet. They govern different aspects of performance and must be evaluated independently.
| Specification | What It Defines | Example | What Happens If Exceeded |
|---|---|---|---|
| Measurement range (input range) | The span of values the sensor can detect and report | Vibration sensor: 0 to 80 g | Signal clipping, saturation, or data loss |
| Operating range (environmental range) | The environmental conditions under which the sensor functions correctly | Operating temperature: -20 to 85 degrees Celsius | Sensor damage, drift, or complete failure |
| Span | The arithmetic difference between upper and lower range limits | 0 to 10 bar = span of 10 bar | N/A (a derived value, not a limit) |
| Overrange limit | The maximum input beyond the rated range the sensor can survive without permanent damage | Pressure sensor: 150% of full scale for 10 seconds | Permanent calibration shift or physical damage |
Both specifications must be matched to the installation environment. A sensor with an adequate measurement range but an insufficient operating temperature range will produce drifted readings in a hot furnace room, even if vibration levels are well within the input specification.
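The distinctions in the table above can be expressed as a small validation helper. The sketch below is illustrative, not any vendor's API: the class, field names, and the 150 percent overrange factor are assumptions chosen to mirror the example rows.

```python
from dataclasses import dataclass

@dataclass
class SensorSpec:
    """Illustrative range limits for a generic 0-10 bar pressure sensor."""
    meas_min: float          # lower measurement limit
    meas_max: float          # upper measurement limit
    overrange_factor: float  # survivable overload as a multiple of full scale
    op_temp_min: float       # operating temperature limits, deg C
    op_temp_max: float

    @property
    def span(self) -> float:
        """Span is the arithmetic difference between the range limits."""
        return self.meas_max - self.meas_min

    def classify_reading(self, value: float) -> str:
        """Classify a raw input against measurement and overrange limits."""
        if self.meas_min <= value <= self.meas_max:
            return "in range"
        if value < self.meas_min:
            return "below range"
        if value <= self.meas_max * self.overrange_factor:
            return "overrange (survivable, but expect clipped output)"
        return "beyond overrange (risk of permanent damage)"

    def environment_ok(self, ambient_temp_c: float) -> bool:
        """Check the installation environment against the operating range."""
        return self.op_temp_min <= ambient_temp_c <= self.op_temp_max

# Example: 0-10 bar gauge with a 150% overrange limit, rated -20 to 85 deg C
spec = SensorSpec(0.0, 10.0, 1.5, -20.0, 85.0)
print(spec.span)                    # 10.0
print(spec.classify_reading(12.0))  # overrange (survivable, but expect clipped output)
print(spec.environment_ok(95.0))    # False: ambient exceeds operating range
```

Note that `environment_ok` can fail even when every reading is "in range", which is exactly the furnace-room scenario described above.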
Range in Common Industrial Sensors
Typical ranges vary widely by sensor type and application. The table below shows standard measurement ranges for the four sensor categories most commonly used in industrial condition monitoring programs.
| Sensor Type | Typical Measurement Range | Common Industrial Application | Key Range Consideration |
|---|---|---|---|
| Vibration sensor (accelerometer) | 0.5 g to 500 g peak, depending on class | Rotating machinery: motors, pumps, gearboxes | Low-g sensors offer better resolution for slow machines; high-g sensors protect against shock from impacts |
| Temperature sensor | -200 to 1750 degrees Celsius (thermocouple); -50 to 300 degrees Celsius (RTD) | Bearings, motor windings, heat exchangers, furnaces | RTDs offer higher accuracy in narrow ranges; thermocouples cover wider ranges at lower precision |
| Pressure sensor | 0 to 1 bar up to 0 to 700 bar for industrial gauges | Hydraulic systems, compressed air, process pipelines | Selecting a range too far above normal working pressure reduces sensitivity to small drops that indicate leaks or blockages |
| Current sensor (CT clamp) | 1 A to 5000 A, depending on conductor size | Motor load monitoring, energy metering, fault detection | Range must cover both normal running current and potential startup surges without saturating the core |
These ranges represent general industry standards. Specific sensor models may offer narrower sub-ranges optimized for particular machines or process conditions. Always verify the datasheet against the actual expected values at the measurement point before installation.
Why Range Selection Matters for Maintenance
Range selection has a direct impact on the quality of data captured by a condition monitoring program. Poor range choices create blind spots or noise that make fault detection unreliable.
Undersized Range: Saturation and Signal Clipping
When a sensor's range is too narrow for the measured variable, the output signal clips at the upper or lower limit. A vibration sensor rated to 20 g installed on a machine that produces 35 g shock pulses will never report the true peak. Fault signatures that depend on amplitude, such as bearing impact events, will be systematically underreported. Alarms calibrated to those clipped values will be slower to trigger or may never trigger at all.
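The effect described above can be shown numerically. This is a minimal sketch assuming an idealized sensor that simply saturates at its rated full scale; real saturation behavior varies by device.

```python
def clip_signal(samples, full_scale_g):
    """Model an ideal accelerometer that saturates at its rated range."""
    return [max(-full_scale_g, min(full_scale_g, s)) for s in samples]

# A 35 g shock pulse as seen by a sensor rated to only 20 g
true_pulse = [2.0, 8.0, 35.0, 6.0, 1.5]
reported = clip_signal(true_pulse, full_scale_g=20.0)

print(max(true_pulse))  # 35.0 - the real peak amplitude
print(max(reported))    # 20.0 - the peak the monitoring system actually sees
```

An amplitude-based alarm set at, say, 30 g would never fire on the reported signal, even though the machine exceeded that level.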
Oversized Range: Resolution Loss
An oversized range spreads the full-scale output across a wider span, compressing small changes into fewer digital steps. A 16-bit sensor covering 0 to 1000 bar resolves roughly 0.015 bar per step. The same sensor covering 0 to 100 bar resolves roughly 0.0015 bar per step, giving ten times finer discrimination for low-pressure applications. In machines where early fault indicators appear as subtle shifts of a few tenths of a bar, the wider range sensor will miss the signal entirely.
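The resolution figures quoted above follow directly from dividing the span by the number of digital steps an ideal ADC provides. A quick check:

```python
def resolution_per_step(span, bits):
    """Smallest representable change for an ideal ADC: span / (2**bits - 1)."""
    return span / (2**bits - 1)

wide = resolution_per_step(1000.0, 16)   # 0-1000 bar sensor
narrow = resolution_per_step(100.0, 16)  # 0-100 bar sensor

print(round(wide, 4))        # 0.0153 bar per step
print(round(narrow, 5))      # 0.00153 bar per step
print(round(wide / narrow))  # 10: the narrower range resolves 10x finer
```

Real converters lose some effective bits to noise, so the practical resolution is somewhat coarser than this ideal figure.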
Range and Alarm Setpoints
Alarm thresholds are typically set as a percentage of the expected operating value. If the sensor range does not match the process range, alarm setpoints will be misaligned. Maintenance teams may receive nuisance alarms at low levels or, more dangerously, no alarm as a parameter climbs to a damaging level.
Range and Calibration Validity
Sensors are calibrated across their rated range. Using a sensor at the extreme edges of its range, or pushing it past its overrange limit, degrades the calibration and introduces systematic error. Periodic calibration checks are only meaningful if the sensor is being used within the span for which it was calibrated.
Range vs. Accuracy vs. Resolution
These three specifications are frequently confused because they are closely related. Each describes a different aspect of sensor performance and must be considered independently during specification review. Sensitivity, a fourth closely related specification, is included in the table for comparison.
| Specification | Definition | Example | Impact on Maintenance |
|---|---|---|---|
| Range | Total measurable span from minimum to maximum | 0 to 200 degrees Celsius | Determines whether the sensor can physically capture the signal of interest |
| Accuracy | Closeness of a measured value to the true value, expressed as error percentage or absolute units | Plus or minus 0.5 degrees Celsius across the range | Determines whether the reported value can be trusted for alarm setpoints and trend analysis |
| Resolution | Smallest change in the measured variable the sensor can detect and report | 0.01 degrees Celsius per digital step | Determines whether gradual degradation trends are visible before they become critical failures |
| Sensitivity | Output signal change per unit of input change (often expressed as mV/g or mA/bar) | 100 mV/g | Higher sensitivity enables detection of small amplitude changes without amplifying noise |
A sensor can have an excellent accuracy specification yet still be unsuitable for the application if its range forces poor resolution. Conversely, a sensor with fine resolution is only useful if its range encompasses the full expected process span. All three must be evaluated together.
How to Choose the Right Range
A structured selection process prevents the most common range specification errors.
Step 1: Establish the Expected Process Range
Review historical data, process documentation, or equipment manufacturer specs to identify the normal minimum and maximum values at the measurement point. Include startup, shutdown, and transient conditions, not just steady-state operation. For new installations without historical data, consult the equipment OEM for expected output ranges under normal and fault conditions.
Step 2: Add a Safety Margin
Select a sensor range that extends beyond the expected process range by 10 to 20 percent on each end. This buffer accommodates transient spikes, process upsets, and future changes in operating conditions without saturating the sensor. Do not select a range so wide that the safety margin destroys resolution.
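The buffer calculation can be written out directly from the expected process extremes. A minimal sketch; the 15 percent default margin is a rule of thumb within the 10 to 20 percent band above, not a standard.

```python
def recommended_range(process_min, process_max, margin=0.15):
    """Extend the expected process span by a fractional margin on each end."""
    span = process_max - process_min
    return process_min - margin * span, process_max + margin * span

# Expected pressure swings between 2 and 8 bar, including transients
lo, hi = recommended_range(2.0, 8.0)
print(round(lo, 2), round(hi, 2))  # 1.1 8.9
```

A 0 to 10 bar catalog range would comfortably cover this recommendation without the gross oversizing that destroys resolution.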
Step 3: Check the Overrange Limit
Some processes produce brief excursions beyond the normal range during startup, shutdown, or fault events. Confirm the sensor's overrange limit and survival pressure or g-rating to ensure the device survives without permanent damage or calibration shift during these events.
Step 4: Verify the Operating Range
Confirm that the installation environment falls within the sensor's operating temperature, humidity, and vibration limits. A sensor mounted near a furnace or in an outdoor enclosure subject to freezing temperatures must have an operating range that covers those extremes.
Step 5: Validate Against Resolution Requirements
Calculate the effective resolution at the selected range. If the smallest fault signature you need to detect is smaller than the resolution, either select a narrower range or a higher-bit-depth converter. For industrial IoT sensors with onboard analog-to-digital conversion, higher bit depth directly increases resolution at a given range.
Step 6: Cross-Check Alarm Setpoint Feasibility
Before finalizing the range, confirm that the intended alarm and alert thresholds fall well within the measurable span and above the noise floor. Alarm setpoints set at more than 90 percent of full scale are a warning sign that the selected range is undersized.
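The steps above can be collapsed into a single pre-deployment check. This is a sketch under stated assumptions: the warning strings, the ideal-ADC resolution formula, and the 90 percent setpoint rule are taken from this article's guidance, not from any standard or vendor tool.

```python
def check_range_selection(sensor_min, sensor_max, process_min, process_max,
                          alarm_setpoint, adc_bits, smallest_fault_shift):
    """Return a list of warnings for a proposed sensor range; empty means pass."""
    warnings = []
    span = sensor_max - sensor_min

    # Steps 1-2: the sensor range must enclose the expected process range
    if process_min < sensor_min or process_max > sensor_max:
        warnings.append("process range exceeds sensor range (saturation risk)")

    # Step 5: effective resolution must resolve the smallest fault signature
    resolution = span / (2**adc_bits - 1)
    if resolution > smallest_fault_shift:
        warnings.append("resolution too coarse for smallest fault signature")

    # Step 6: a setpoint above 90% of full scale suggests an undersized range
    if (alarm_setpoint - sensor_min) / span > 0.9:
        warnings.append("alarm setpoint above 90% of full scale")

    return warnings

# Example: 0-10 bar sensor, process runs 2-8 bar, alarm proposed at 9.5 bar
issues = check_range_selection(0.0, 10.0, 2.0, 8.0,
                               alarm_setpoint=9.5, adc_bits=16,
                               smallest_fault_shift=0.05)
print(issues)  # ['alarm setpoint above 90% of full scale']
```

Checks for overrange survival and operating environment (Steps 3 and 4) need datasheet values that vary by device, so they are left out of this sketch.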
The Bottom Line
Range is not a background detail on a sensor datasheet. It is the foundational specification that determines what a sensor can see, how finely it can see it, and whether it will survive the environment it operates in. An incorrectly specified range produces corrupted data that undermines alarm reliability, trend analysis, and fault detection across the entire maintenance program.
Reliable condition monitoring begins with sensors matched precisely to their measurement points. Selecting the right measurement range, confirming the operating range, and validating resolution against fault detection requirements are non-negotiable steps in any sensor deployment. Teams that treat range selection as a living engineering decision and revisit it when processes change will consistently outperform teams that rely on default catalog specifications.
See How Tractian Monitors Assets in Real Time
Tractian sensors are matched to exact measurement ranges for each asset class, delivering precise vibration, temperature, and current data to detect faults before they cause downtime.
Frequently Asked Questions
What is range in industrial sensors?
Range is the span between the minimum and maximum values a sensor or instrument can measure and report accurately. For example, a vibration sensor with a range of 0 to 80 g can detect acceleration signals anywhere within that band. Readings outside the range are clipped or distorted, making range selection critical for reliable data.
What is the difference between sensor range and operating range?
Sensor range (also called measurement range or input range) describes the span of values the sensor can detect. Operating range describes the environmental conditions, such as temperature, humidity, or supply voltage, within which the sensor functions correctly. A sensor can have an accurate measurement range yet fail if its operating environment exceeds specification.
Why does choosing the wrong sensor range cause problems?
An oversized range reduces resolution, meaning small changes in the measured value become indistinguishable from noise. An undersized range causes the sensor to saturate, clipping real signal peaks and generating false or missing alarms. Both cases lead to poor fault detection and unreliable condition monitoring data.
How is range different from accuracy and resolution?
Range is the total span of measurable values. Accuracy is how close a reading is to the true value within that span. Resolution is the smallest change the sensor can detect. A wide range does not guarantee accuracy, and a highly accurate sensor may have coarse resolution if its range is too broad for the application.