Precision is one of the most frequently misused terms in laboratory weighing. It appears on balance specification sheets, in GMP audit reports, in USP Chapter 41 compliance documentation, and in daily conversation between laboratory analysts — often meaning different things to different people in different contexts. An analyst who says their balance has “good precision” may mean it reads consistently, reads accurately, or simply reads to four decimal places. These are three different things. Only one of them is precision.
This article defines precision exactly, explains how it differs from accuracy and readability, covers the four components that together determine a balance’s overall precision performance, explains how to verify precision through a repeatability test, and explains how precision directly determines the minimum weight a balance can reliably measure under USP Chapter 41.
What Precision Actually Means on an Analytical Balance
As Mettler Toledo, a leading laboratory balance manufacturer, explains, precision refers to how consistent or reproducible a measurement is. In other words, precision measures how well a measurement can be replicated or repeated. A measurement can be precise even if it is not accurate.
The dartboard analogy is the clearest illustration. If you throw five darts and they all land in the same corner of the board — away from the bullseye — your throwing is precise but not accurate. The results are consistent, but consistently wrong. In balance terms: a balance that reads 10.0023 g, 10.0022 g, 10.0023 g, 10.0022 g, and 10.0023 g for the same 10.0000 g reference weight is extremely precise — the readings are tightly clustered — but it is not accurate. The consistent offset of 0.0022–0.0023 g is a systematic error that calibration must correct.
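The offset-versus-scatter distinction can be made concrete in a few lines of Python, using the illustrative readings from the example above: the standard deviation (precision) is tiny, while the bias (accuracy error) is large.

```python
import statistics

# Five repeated readings of a 10.0000 g reference weight (example values above)
readings = [10.0023, 10.0022, 10.0023, 10.0022, 10.0023]
true_mass = 10.0000

mean = statistics.mean(readings)  # closeness of mean to true_mass reflects accuracy
sd = statistics.stdev(readings)   # spread of the readings reflects precision
bias = mean - true_mass           # systematic error that calibration must correct

print(f"mean = {mean:.5f} g, bias = {bias:+.5f} g, SD = {sd:.5f} g")
# SD is about 0.00005 g (very precise); bias is about +0.00226 g (not accurate)
```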
The three distinct concepts that precision is confused with:
Readability is the smallest increment the balance display can show — 0.1 mg for a standard analytical balance. Readability is a display characteristic. It says nothing about whether the displayed value is correct or reproducible.
Accuracy is how close the displayed value is to the true mass of the sample. As Stuccler confirms, accuracy measures how close your reading is to the true value — if a 10 g standard weight reads 10.0001 g on your balance, it is highly accurate. Accuracy is the combination of trueness and precision.
Precision is the consistency of repeated measurements under identical conditions. It is expressed as a standard deviation — the smaller the standard deviation, the more precise the balance.
A balance can be precise without being accurate (consistent but systematically wrong). A balance can be accurate on average without being precise (correct average but highly variable individual readings). A balance that is both accurate and precise — consistently close to the true value — is the instrument every regulated laboratory requires.
The Four Components of Analytical Balance Precision
Precision in the broad sense is determined by four measurable performance characteristics. As confirmed by GMP Insiders, the two key weighing parameters with the most notable impact on balance performance are repeatability and accuracy, and accuracy itself is impacted by sensitivity, eccentricity, and linearity.
1. Repeatability
Repeatability is the primary component of precision — the one most directly measured and reported in balance specifications. It is the ability of the balance to produce the same result when the same sample is weighed multiple times under identical conditions by the same operator on the same balance in the same location within a short period.
As Drawell confirms, repeatability is quantified by calculating the standard deviation (SD) or relative standard deviation (RSD) of a series of repeated weighings. A lower standard deviation indicates better repeatability and precision.
How repeatability is measured: Ten replicate weighings of a stable reference weight are performed. The standard deviation of the ten results is calculated. This standard deviation is the balance’s repeatability for that weight at those conditions.
Repeatability in the balance specification sheet: Most manufacturers express repeatability as a standard deviation value in the same unit as the balance’s readability — for a 0.1 mg analytical balance, repeatability might be specified as 0.1 mg standard deviation (equivalent to the readability) or 0.05 mg (better than the readability on high-specification models).
Why repeatability matters more than readability for regulated applications: Laboratory Supply Network confirms that repeatability may differ from readability in some cases — and it is the repeatability value, not the readability, that is used to calculate whether a balance can weigh a given quantity within USP Chapter 41 requirements.
2. Linearity
Linearity describes whether the balance’s accuracy is consistent across its full capacity range — not just at the calibration point. A balance calibrated with a 200 g weight may read correctly at 200 g but deviate systematically at 50 g or 350 g if its linearity is poor.
Linearity is tested by weighing certified reference weights at multiple points across the full capacity range — typically at 10%, 25%, 50%, 75%, and 100% of capacity — and recording the deviation from the known value at each point. The maximum deviation across all test points is the linearity specification.
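The linearity calculation described above can be sketched as follows. The test points and readings are hypothetical values for a 400 g capacity balance, not taken from any manufacturer's specification.

```python
# Hypothetical linearity check on a 400 g capacity balance.
# Each tuple is (certified reference value in g, balance reading in g).
test_points = [
    (40.0000, 40.0002),    # 10% of capacity
    (100.0000, 100.0004),  # 25%
    (200.0000, 200.0001),  # 50% (near the calibration point)
    (300.0000, 299.9994),  # 75%
    (400.0000, 400.0007),  # 100%
]

deviations = [reading - certified for certified, reading in test_points]
linearity = max(abs(d) for d in deviations)  # maximum deviation = linearity spec
print(f"linearity = {linearity * 1000:.1f} mg")
```

For these values the maximum deviation, 0.7 mg, occurs at full capacity even though the balance reads almost perfectly at the 50% calibration point.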
As GMP Insiders confirms, linearity is one of the factors that affect accuracy independently of repeatability. A balance with excellent repeatability but poor linearity produces consistent readings — but consistently wrong ones at weights away from the calibration point.
Practical importance: In a pharmaceutical QC laboratory that weighs samples across a wide range on the same balance, from 10 mg API quantities to 500 g bulk excipients, linearity across the full range is essential. Single-point calibration that only verifies accuracy at one weight does not satisfy the linearity requirement.
3. Eccentricity (Corner Load Error)
Eccentricity — also called corner load error — is the variation in the balance’s reading depending on where on the weighing pan the load is placed. An ideal balance reads identically whether the sample is centered on the pan or positioned at any of the four corners. In practice, load cell design, platform geometry, and mechanical coupling all introduce some position-dependent variation.
Eccentricity is measured by placing a reference weight at the center of the pan and at each of the four corners, recording the reading at each position, and calculating the maximum deviation between positions. The result is the balance’s eccentricity specification.
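A minimal sketch of that calculation, using hypothetical readings of a 200 g weight at the center and four corner positions:

```python
# Hypothetical readings (g) of a 200 g weight at the five standard pan positions
positions = {
    "center":      200.0001,
    "front-left":  200.0003,
    "front-right": 199.9999,
    "back-left":   200.0004,
    "back-right":  200.0000,
}

# Maximum deviation between any two positions, as described above
readings = list(positions.values())
eccentricity = max(readings) - min(readings)
print(f"eccentricity = {eccentricity * 1000:.1f} mg")
```

For these values the result is 0.5 mg, the spread between the back-left and front-right readings.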
Practical importance: Eccentricity matters most when weighing samples in large or irregularly positioned vessels — a tall flask, a large beaker, or a non-symmetric sample container that does not center the load’s center of gravity over the platform center. Poor eccentricity introduces a position-dependent error that varies between measurements if the vessel is not positioned identically each time.
4. Sensitivity
Sensitivity is the balance’s ability to detect a small change in mass — the smallest addition to the platform that produces a measurable change in the displayed reading. It is related to but distinct from readability. As GMP Insiders notes, when conducting an accuracy performance check, evaluating sensitivity at 0.05% is the standard for regulated laboratory applications.
Sensitivity drift over time, caused by mechanical wear, temperature cycling, and accumulated environmental exposure, is the primary reason calibration is required on a regular schedule. A balance that has lost sensitivity no longer reads meaningfully to its specified readability: the last decimal place on the display fluctuates without settling on a stable value.
How to Verify Precision Through a Repeatability Test
Verifying precision is a straightforward process that any laboratory can perform with a stable reference weight and a data record. It is the primary performance check specified for analytical balances under GLP, GMP, and USP Chapter 41.
Equipment required:
- The analytical balance being tested
- A stable, non-volatile reference weight — OIML E2 or F1 class, or equivalent ASTM Class 1
- A data record sheet or electronic record
Step-by-step procedure:
Step 1 — Prepare the balance. Confirm the balance has been powered on for the required warm-up period — typically 30–60 minutes. Check and adjust the bubble level. Zero the balance with the pan empty.
Step 2 — Select the test weight. Choose a reference weight close to the quantity you routinely weigh in this balance’s primary application. If you routinely weigh 50–200 mg quantities of API, test at 100 mg. Testing only at mid-range weights and assuming performance at other weights does not verify linearity.
Step 3 — Perform ten replicate weighings. Place the reference weight on the center of the pan. Record the stable reading. Remove the weight. Wait for the balance to return to zero. Repeat for a total of ten independent placements. Each placement must be a complete removal and replacement — not simply leaving the weight on the pan and re-reading.
Step 4 — Calculate the standard deviation. Calculate the standard deviation of the ten readings using the formula:
SD = √[ Σ(xi − x̄)² ÷ (n − 1) ]
Where xi is each individual reading, x̄ is the mean of all ten readings, and n is the number of readings (10).
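Steps 3 and 4 can be sketched in a few lines of Python; `statistics.stdev` computes the sample standard deviation with the same n − 1 denominator as the formula above. The ten readings are hypothetical values for a 100 mg reference weight.

```python
import statistics

# Step 3: hypothetical stable readings (g) from ten independent placements
# of a 100 mg reference weight, with complete removal between each
readings = [
    0.1001, 0.0999, 0.1000, 0.1002, 0.1000,
    0.0998, 0.1001, 0.1000, 0.0999, 0.1001,
]

# Step 4: sample standard deviation (n - 1 in the denominator)
sd = statistics.stdev(readings)
print(f"SD = {sd * 1000:.3f} mg")  # SD = 0.120 mg for these readings
```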
Step 5 — Compare against the acceptance criterion. Under USP Chapter 41, the acceptance criterion for repeatability is:
2 × SD ÷ Test weight ≤ 0.001 (0.1%)
If twice the standard deviation divided by the weight of the test quantity is less than or equal to 0.1%, the balance meets USP Chapter 41 repeatability requirements for that weight. If it exceeds 0.1%, the balance cannot reliably weigh that quantity to within USP 41 tolerance.
Step 6 — Determine the minimum weight. Rearranging the USP 41 formula:
Minimum Weight = 2 × SD ÷ 0.001 = 2000 × SD
Worked example: A balance produces a standard deviation of 0.12 mg on ten replicate weighings.
Minimum Weight = 2000 × 0.12 mg = 240 mg
This balance cannot reliably weigh below 240 mg to within USP 41 tolerance. An analyst who weighs 15 mg of an API on this balance is operating below the minimum weight — the relative error exceeds USP 41 requirements regardless of how carefully the weighing is performed.
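Steps 5 and 6 and the worked example can be expressed as a short helper. The 0.10% criterion and the 2000 × SD factor come directly from the USP Chapter 41 formulas above; the function names are illustrative.

```python
def usp41_min_weight(sd_mg: float) -> float:
    """Minimum weight (mg) from repeatability SD, per USP Chapter 41:
    minimum weight = 2 * SD / 0.001 = 2000 * SD."""
    return 2000.0 * sd_mg

def meets_usp41_repeatability(sd_mg: float, test_weight_mg: float) -> bool:
    """True if 2 * SD / test weight <= 0.001 (0.10%)."""
    return 2.0 * sd_mg / test_weight_mg <= 0.001

# Worked example: SD = 0.12 mg
print(f"minimum weight = {usp41_min_weight(0.12):.0f} mg")  # minimum weight = 240 mg
print(meets_usp41_repeatability(0.12, 240.0))  # True: 240 mg is the limit
print(meets_usp41_repeatability(0.12, 15.0))   # False: 15 mg is below the minimum
```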

What Affects Precision in Practice
Precision is not solely a property of the balance — it is a property of the balance in its actual operating environment. The same balance can produce different precision results in different locations.
Vibration is the most consistent precision degrader in laboratory environments. Transmitted through bench surfaces from centrifuges, refrigerators, and foot traffic, vibration adds random variation to each measurement — increasing standard deviation and reducing repeatability. For the full guide to environmental factors that affect analytical balance performance, see our article on what affects lab balance accuracy.
Temperature instability causes the balance’s load cell output to drift between measurements — adding a time-dependent component to the measurement variability. Modern balances with internal calibration compensate for temperature drift automatically, but rapid temperature changes above the balance’s compensation threshold reduce precision temporarily.
Static electricity on weighing vessels — particularly plastic vessels and powder samples in dry conditions — adds a continuously changing electrostatic force to the reading that appears as poor repeatability. An analyst who sees readings that drift in one direction without stabilizing is almost always observing a static electricity effect rather than a precision failure of the balance itself.
Operator technique affects precision significantly at the 0.1 mg level. Inconsistent pan loading position, handling the weighing vessel with bare hands rather than tweezers, and not closing the draft shield doors before reading all introduce operator-dependent variability that appears as poor repeatability but is not a balance fault.
Precision in GLP, GMP, and ISO/IEC 17025 Frameworks
Each major regulatory framework has specific requirements for how precision must be verified, documented, and reported.
GLP (FDA 21 CFR Part 58): Requires that laboratory equipment be demonstrated as suitable for its intended purpose. A repeatability verification using the USP 41 approach — ten replicates, standard deviation calculated, minimum weight determined — satisfies this requirement when documented in the balance’s qualification record.
GMP (FDA 21 CFR Part 211): Requires calibration at defined intervals with adequate records. GMP calibration includes a repeatability performance check as a standard element. As confirmed by GMP Insiders, performance checks must be carried out between calibrations — balances are considered suitable if none of the precision or accuracy errors exceed 0.10%.
USP Chapter 41 (February 2026 revision): The most specific precision requirement in any regulatory framework for analytical balances. The minimum weight calculation must be performed using demonstrated repeatability — not the manufacturer’s specification — and must be verified using the actual tare vessels used in routine work. For the complete guide to USP Chapter 41 compliance, see our article on pharmaceutical lab weighing and USP standards.
ISO/IEC 17025: Requires that measurement uncertainty be calculated and reported for every calibration result. Repeatability is one of the primary contributors to the measurement uncertainty budget for an analytical balance calibration. For the full ISO/IEC 17025 requirements for laboratory balance calibration, see our article on ISO/IEC 17025 and lab balance calibration.

When to Re-verify Precision
Precision verification at commissioning is not a one-time event. Re-verification is required whenever conditions change that could have affected the balance’s performance.
Re-verify precision after:
- Balance relocation — even within the same room
- Any service, repair, or electronic component replacement
- An overload event or physical impact
- A significant change in the operating environment — new HVAC system, new nearby equipment, or room renovation
- A daily or weekly in-house check that produces a result outside the laboratory’s acceptance criteria
- Each formal external calibration event — to document as-left performance
For the complete calibration program that keeps analytical balance precision verified and documented throughout its service life, see our article on how to calibrate a lab balance.
FAQs
What is the precision on an analytical balance?
Precision is the consistency of repeated measurements under identical conditions. It is quantified as the standard deviation of a series of replicate weighings of the same sample. A more precise balance produces readings that are more tightly clustered — a lower standard deviation. Precision is distinct from accuracy (closeness to the true value) and readability (the smallest increment the display shows).
What is the difference between precision and accuracy on a laboratory balance?
Accuracy is how close the displayed value is to the true mass. Precision is how consistently the balance produces the same result under identical conditions. A balance can be precise but not accurate — producing consistent readings that are systematically wrong — or accurate on average but not precise — producing correct averages with high individual variability. Both are required for reliable laboratory measurement.
How do you measure the precision of an analytical balance?
Perform ten replicate weighings of a stable reference weight under identical conditions, with complete removal and replacement between each weighing. Calculate the standard deviation of the ten results. The standard deviation is the balance’s repeatability — the primary measure of precision — for that weight at those conditions.
What is the minimum weight on an analytical balance?
The minimum weight is the smallest quantity the balance can weigh with acceptable relative uncertainty under USP Chapter 41. It is calculated as 2000 times the standard deviation from a ten-replicate weighing test. A balance with a standard deviation of 0.12 mg has a minimum weight of 240 mg: it cannot reliably weigh quantities below 240 mg to within USP 41 tolerance.
Does readability determine precision on an analytical balance?
Readability and precision are related but distinct specifications. Readability is the smallest increment the display shows. Precision (repeatability) is the standard deviation of repeated measurements. The two are often numerically close — a 0.1 mg readability balance often has 0.1 mg repeatability — but precision can be better or worse than readability depending on the balance’s design, condition, and operating environment.
How do environmental conditions affect analytical balance precision?
Vibration, temperature instability, static electricity, air currents, and operator technique all affect precision by adding variability to individual measurements. A balance in a controlled environment with vibration isolation, stable temperature, and correct operator technique will achieve its specified precision. The same balance in an uncontrolled environment may perform significantly worse. For the full guide to environmental factors, see our article on what affects lab balance accuracy.
Conclusion
Precision on an analytical balance is not the same as readability, and confusing the two leads to incorrect balance selection, invalid minimum weight calculations, and regulatory findings in GLP and GMP audits.
Precision is repeatability — the standard deviation of ten replicate weighings under identical conditions. It determines the minimum weight the balance can measure to within USP Chapter 41 tolerance, contributes directly to the measurement uncertainty budget required by ISO/IEC 17025, and is the performance characteristic that GMP performance checks verify between formal calibration events.
Verifying precision requires a ten-replicate test with a stable reference weight — not a review of the manufacturer’s specification sheet. The specification tells you what the balance achieved in the manufacturer’s controlled test environment. The verification test tells you what it achieves in your laboratory, with your operating conditions, today.