ASTM D6956-17
Standard Guide for Demonstrating and Assessing Whether a Chemical Analytical Measurement System Provides Analytical Results Consistent with Their Intended Use
SIGNIFICANCE AND USE
4.1 This guide is intended for use by both generators and users of analytical results. It is intended to promote consistent demonstration and documentation of the quality of the measurement results and facilitate determination of the validity of measurements for their intended use.
4.2 This guide specifies documentation that a laboratory should supply with the analytical results to establish that the resulting measurements: (1) meet measurement quality requirements; (2) are suitable for their intended use; and (3) are technically defensible.
4.3 While the guide describes information that the measurement results provider needs to give the user/decision maker, in order for measurement providers to supply data users with appropriate data, information is needed from the data user. Examples of information that the user should provide to the laboratory, in addition to the analytes of concern (including the form of the analyte that is to be determined, for example, total lead, dissolved lead, organic lead, inorganic lead), include but are not limited to:
4.3.1 Type of material (that is, matrix—fresh or salt water, coal fly ash, sandy loam soil, wastewater treatment sludge),
4.3.2 Maximum sample holding time,
4.3.3 Projected sampling date and delivery date to the laboratory,
4.3.4 Method of chemical preservation (for example, not preserved, chemical used),
4.3.5 Chain-of-custody requirements, if any,
4.3.6 Analytical methods that must be used, if any,
4.3.7 Measurement quality requirements expressed as DQOs or MQOs and action limits,
4.3.8 Allowable interferences as described in 10.4,
4.3.9 Documentation requirement, and
4.3.10 Subcontracting restrictions/requirements.
4.4 Users/decision makers should consult with the laboratory about these issues during the analytical design stage. This will allow the design of sample collection process and project schedule to accommodate the laboratory activities necessary to determine the desired level of measurement quality. The number of samples, budgets, and schedules should also be discussed.
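The 4.3 checklist lends itself to a structured record that a data user could fill out before submitting samples. The sketch below is a hypothetical illustration only; the guide prescribes no data format, and every field and function name here is invented for this example.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

# Hypothetical record mirroring the items a data user supplies to the
# laboratory under 4.3.1-4.3.10. ASTM D6956 defines no such structure;
# this is purely an illustrative sketch.
@dataclass
class AnalyticalRequest:
    analytes: list[str]                  # analytes of concern, incl. form (e.g. "dissolved lead")
    matrix: str                          # 4.3.1 type of material
    max_holding_time_days: int           # 4.3.2 maximum sample holding time
    sampling_date: date                  # 4.3.3 projected sampling date
    delivery_date: date                  # 4.3.3 delivery date to the laboratory
    preservation: str                    # 4.3.4 chemical preservation method
    chain_of_custody: bool               # 4.3.5 chain-of-custody required?
    required_methods: list[str] = field(default_factory=list)         # 4.3.6
    mqos: dict[str, str] = field(default_factory=dict)                # 4.3.7 MQOs / action limits
    allowable_interferences: list[str] = field(default_factory=list)  # 4.3.8
    documentation: Optional[str] = None  # 4.3.9 documentation requirement
    subcontracting_allowed: bool = True  # 4.3.10 subcontracting restrictions

    def holding_time_ok(self) -> bool:
        """Check that delivery to the lab falls within the holding-time window."""
        return (self.delivery_date - self.sampling_date).days <= self.max_holding_time_days

req = AnalyticalRequest(
    analytes=["dissolved lead"], matrix="fresh water",
    max_holding_time_days=28, sampling_date=date(2024, 3, 1),
    delivery_date=date(2024, 3, 4), preservation="HNO3 to pH < 2",
    chain_of_custody=True,
)
print(req.holding_time_ok())  # delivery 3 days after sampling, within the 28-day limit
```

A record like this makes the consultation in 4.4 concrete: the laboratory can validate schedule and holding-time constraints before any samples are collected.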
SCOPE
1.1 This guide describes an approach for demonstrating the quality of analytical chemical measurement results from the application of a measurement system (that is, method or sequence of methods) to the analysis of environmental samples of soil, water, air, or waste. The purpose of such measurements can include demonstrating compliance with a regulatory limit, determining whether a site is contaminated above some specified level, or determining treatment process efficacy.
1.2 This guide describes a procedure that can be used to assess a measurement system used to generate analytical results for a specific purpose. Users and reviewers of the analytical results can determine, with a known level of confidence, if they meet the quality requirements and are suitable for the intended use.
1.3 This protocol does not address the general components of laboratory quality systems necessary to ensure the overall quality of laboratory operations. For such systems, the user is referred to International Standards Organization (ISO) Standard 17025 or the National Environmental Laboratory Accreditation Conference (NELAC) laboratory accreditation standards.
1.4 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard.
1.5 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety, health, and environmental practices and determine the applicability of regulatory limitations prior to use.
1.6 This international standard was developed in accordance with internationally recognized principles on standardization established in the Decision on Principles for the Development of International Standards, Guides and Recommendations issued by the World Trade Organization Technical Barriers to Trade (TBT) Committee.
General Information
- Status
- Published
- Publication Date
- 31-Aug-2017
- Technical Committee
- D34 - Waste Management
- Drafting Committee
- D34.01.01 - Planning for Sampling
Relations
- Effective Date
- 01-Sep-2017
Overview
ASTM D6956-17 is a comprehensive standard guide developed by ASTM International to assist both laboratories and data users in verifying and documenting the quality of chemical analytical measurement systems. The primary focus of this guide is to ensure that analytical results from environmental samples (soil, water, air, or waste) are consistent with their intended use, meet defined measurement quality requirements, and are technically defensible. The guide establishes key documentation requirements and supports the traceability and reliability of chemical data, which are essential for compliance, environmental assessment, and process control.
Key Topics
- Measurement System Quality: The guide outlines the process for evaluating whether a laboratory's analytical measurement system delivers results that are accurate, precise, sensitive, and selective enough for its intended purpose.
- Documentation Requirements: Laboratories must provide documentation with analytical results to demonstrate that:
- Quality requirements are met (e.g., data quality objectives (DQOs), measurement quality objectives (MQOs))
- Results are suitable for intended use
- Results and methods are technically defensible
- User Input: For valid results, data users must supply relevant details to the laboratory, such as:
- Analytes of concern (and specific chemical forms)
- Sample matrix type (e.g., fresh water, soil, sludge)
- Maximum sample holding times
- Sampling and delivery schedules
- Preservation methods and chain-of-custody specifics
- Approaches to Performance Demonstration: The standard recommends several levels of demonstration, from the most reliable (multi-method comparison) to the use of historical data. These prioritize the demonstration of bias, precision, selectivity, and sensitivity in analytical results.
- Collaborative Planning: Early consultation between laboratories and decision makers ensures sampling and analysis planning aligns with project requirements and timelines.
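As an illustration of how an MQO check on bias and precision might look in practice, the sketch below evaluates replicate matrix-spike recoveries against acceptance limits. The percent-recovery formula is standard laboratory QC practice; the 75-125 % recovery window, the 20 % RSD cap, and the function names are hypothetical values invented for this example, not taken from the guide.

```python
import statistics

def percent_recovery(spiked_result: float, unspiked_result: float, amount_added: float) -> float:
    """Matrix-spike percent recovery: (spiked - unspiked) / amount added * 100."""
    return (spiked_result - unspiked_result) / amount_added * 100.0

def meets_mqos(recoveries: list[float], bias_limits=(75.0, 125.0), max_rsd=20.0) -> bool:
    """Judge bias (mean recovery inside limits) and precision (%RSD below cap).

    The 75-125 % window and 20 % RSD cap are hypothetical project MQOs,
    not values specified by ASTM D6956.
    """
    mean_r = statistics.mean(recoveries)
    rsd = statistics.stdev(recoveries) / mean_r * 100.0
    return bias_limits[0] <= mean_r <= bias_limits[1] and rsd <= max_rsd

# Three spiked/unspiked result pairs, each spiked with 10 units of analyte:
pairs = [(19.5, 10.0), (18.8, 9.9), (20.2, 10.1)]
recs = [percent_recovery(s, u, 10.0) for s, u in pairs]
print([round(r, 1) for r in recs], meets_mqos(recs))
```

The same pass/fail structure extends to sensitivity (detection limit below the action level) and selectivity (interference checks), which project MQOs would state alongside bias and precision.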
Applications
ASTM D6956-17 is widely applicable to environmental monitoring, regulatory compliance, and quality assurance in various sectors, including:
- Environmental Compliance: Demonstrating that soil, water, air, or waste samples meet regulatory limits for contaminants.
- Remediation and Site Assessment: Evaluating whether a site exceeds contamination thresholds, supporting decisions about site closure or required remediation.
- Process Control and Efficacy: Assessing the effectiveness of treatment technologies by documenting consistent and valid analytical results over time.
- Project Planning: Ensuring project stakeholders have reliable chemical data for risk assessments or compliance reporting.
This guide benefits laboratories, environmental consultants, regulators, and industries seeking to standardize analytical data quality and facilitate decision making based on credible results.
Related Standards
Adherence to ASTM D6956-17 is often complemented by other standards and best practices to ensure holistic quality management, including:
- ISO/IEC 17025: General requirements for the competence of testing and calibration laboratories.
- ASTM D4687: Guide for General Planning of Waste Sampling.
- ASTM D5792: Practice for Generation of Environmental Data Related to Waste Management Activities: Development of Data Quality Objectives.
- ASTM D5283, D5681, D6311, D5956, D6044, D6250: Standards covering terminology, sampling strategies, and quality control in environmental analysis.
- EPA Method Detection Limit (40 CFR 136, Appendix B): Procedures for establishing detection and quantitation limits.
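For the MDL procedure cited above, 40 CFR 136 Appendix B computes the detection limit as the standard deviation of replicate low-level spike analyses multiplied by the one-sided 99th-percentile Student's t value for n-1 degrees of freedom. A minimal sketch (the replicate values are invented for illustration; for n = 7 replicates, t is about 3.143):

```python
import statistics

# One-sided 99th-percentile Student's t values indexed by degrees of
# freedom, as tabulated for common replicate counts in 40 CFR 136 App. B.
T_99 = {6: 3.143, 7: 2.998, 8: 2.896, 9: 2.821}

def method_detection_limit(replicates: list[float]) -> float:
    """MDL = t(n-1, 0.99) * s, from n replicate spiked-blank measurements."""
    s = statistics.stdev(replicates)
    return T_99[len(replicates) - 1] * s

# Seven replicate analyses of a low-level spike (illustrative values):
reps = [1.10, 0.95, 1.05, 1.02, 0.98, 1.08, 1.00]
print(round(method_detection_limit(reps), 3))
```

Because the MDL scales directly with replicate scatter, a laboratory demonstrating sensitivity under this guide would compare the computed MDL against the project's action level or reporting requirement.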
Practical Value
By following ASTM D6956-17, organizations can:
- Enhance confidence in analytical results for environmental samples.
- Streamline regulatory compliance through transparent, standardized documentation.
- Enable more effective communication between laboratories and data users regarding data quality needs.
- Reduce the risk of data rejection or regulatory noncompliance by ensuring that analytical results are fit for purpose.
Implementing this standard supports reliable, decision-ready chemical data in environmental projects and fosters robust quality assurance frameworks across industries.
Frequently Asked Questions
ASTM D6956-17 is a guide published by ASTM International. Its full title is "Standard Guide for Demonstrating and Assessing Whether a Chemical Analytical Measurement System Provides Analytical Results Consistent with Their Intended Use". The standard's Significance and Use and Scope sections are reproduced in full above.
ASTM D6956-17 is classified under the following ICS (International Classification for Standards) categories: 71.040.40 - Chemical analysis. The ICS classification helps identify the subject area and facilitates finding related standards.
ASTM D6956-17 has the following relationships with other standards: it links to ASTM D6956-11, ASTM D5681-23, ASTM D5792-10(2023), ASTM D5681-18, ASTM D5283-18, ASTM D5681-17, ASTM D5681-16a, ASTM D5681-16, ASTM D5792-10(2015), ASTM D5681-13, ASTM D5792-10, ASTM D6597-10, ASTM D6311-98(2009), ASTM D5681-09, and ASTM D5283-92(2009). Understanding these relationships helps ensure you are using the most current and applicable version of the standard.
Standards Content (Sample)
This international standard was developed in accordance with internationally recognized principles on standardization established in the Decision on Principles for the Development of International Standards, Guides and Recommendations issued by the World Trade Organization Technical Barriers to Trade (TBT) Committee.

Designation: D6956 − 17

Standard Guide for Demonstrating and Assessing Whether a Chemical Analytical Measurement System Provides Analytical Results Consistent with Their Intended Use

This standard is issued under the fixed designation D6956; the number immediately following the designation indicates the year of original adoption or, in the case of revision, the year of last revision. A number in parentheses indicates the year of last reapproval. A superscript epsilon (´) indicates an editorial change since the last revision or reapproval.

1. Scope

1.1 This guide describes an approach for demonstrating the quality of analytical chemical measurement results from the application of a measurement system (that is, method or sequence of methods) to the analysis of environmental samples of soil, water, air, or waste. The purpose of such measurements can include demonstrating compliance with a regulatory limit, determining whether a site is contaminated above some specified level, or determining treatment process efficacy.

1.2 This guide describes a procedure that can be used to assess a measurement system used to generate analytical results for a specific purpose. Users and reviewers of the analytical results can determine, with a known level of confidence, if they meet the quality requirements and are suitable for the intended use.

1.3 This protocol does not address the general components of laboratory quality systems necessary to ensure the overall quality of laboratory operations. For such systems, the user is referred to International Standards Organization (ISO) Standard 17025 or the National Environmental Laboratory Accreditation Conference (NELAC) laboratory accreditation standards.

1.4 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard.

1.5 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety, health, and environmental practices and determine the applicability of regulatory limitations prior to use.

1.6 This international standard was developed in accordance with internationally recognized principles on standardization established in the Decision on Principles for the Development of International Standards, Guides and Recommendations issued by the World Trade Organization Technical Barriers to Trade (TBT) Committee.

2. Referenced Documents

2.1 ASTM Standards:
D4687 Guide for General Planning of Waste Sampling
D5283 Practice for Generation of Environmental Data Related to Waste Management Activities: Quality Assurance and Quality Control Planning and Implementation
D5681 Terminology for Waste and Waste Management
D5792 Practice for Generation of Environmental Data Related to Waste Management Activities: Development of Data Quality Objectives
D5956 Guide for Sampling Strategies for Heterogeneous Wastes
D6044 Guide for Representative Sampling for Management of Waste and Contaminated Media
D6233 Guide for Data Assessment for Environmental Waste Management Activities (Withdrawn 2016)
D6250 Practice for Derivation of Decision Point and Confidence Limit for Statistical Testing of Mean Concentration in Waste Management Decisions
D6311 Guide for Generation of Environmental Data Related to Waste Management Activities: Selection and Optimization of Sampling Design
D6582 Guide for Ranked Set Sampling: Efficient Estimation of a Mean Concentration in Environmental Sampling (Withdrawn 2012)
D6597 Practice for Assessment of Attaining Clean Up Level for Site Closure (Withdrawn 2016)

2.2 Other Documents:
Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results, National Institute of Standards and Technology Technical Note 1297, 1994
ISO/IEC 17025:1999 General Requirements for the Competence of Testing and Calibration Laboratories
Quantifying Uncertainty in Analytical Measurement, EURACHEM/CITAC Guide, Second Edition, 2000

3. Terminology

3.1 For definitions of terms used in this guide, refer to Terminology D5681.

3.2 Definitions:

3.2.1 action level (AL)—the level above or below which will lead to the adoption of one of two alternative actions.

3.2.2 measurement quality objectives (MQOs)—quantitative statements of the acceptable level of selectivity, sensitivity, bias, and precision for measurements of the analyte of interest in the matrix of concern.

3.2.3 measurement system—all elements of the analytical process including laboratory subsampling, sample preparation and cleanup, and analyte detection and quantitation, including the analysts.

3.2.4 method of standard additions—the addition of a series of known amounts of the analytes of interest to more than one aliquot of the sample as a means of correcting for interferences.

3.2.5 selectivity—the ability to accurately measure the analyte in the presence of other sample matrix components or analytical process contaminants.

3.2.6 surrogate—a substance with properties that mimic the performance of the analyte of interest in the measurement system, but which is not normally found in the sample of concern and is added for quality control purposes.

4. Significance and Use

4.1 This guide is intended for use by both generators and users of analytical results. It is intended to promote consistent demonstration and documentation of the quality of the measurement results and facilitate determination of the validity of measurements for their intended use.

4.2 This guide specifies documentation that a laboratory should supply with the analytical results to establish that the resulting measurements: (1) meet measurement quality requirements; (2) are suitable for their intended use; and (3) are technically defensible.

4.3 While the guide describes information that the measurement results provider needs to give the user/decision maker, in order for measurement providers to supply data users with appropriate data, information is needed from the data user. Examples of information that the user should provide to the laboratory, in addition to the analytes of concern (including the form of the analyte that is to be determined, for example, total lead, dissolved lead, organic lead, inorganic lead), include but are not limited to:

4.3.1 Type of material (that is, matrix—fresh or salt water, coal fly ash, sandy loam soil, wastewater treatment sludge),
4.3.2 Maximum sample holding time,
4.3.3 Projected sampling date and delivery date to the laboratory,
4.3.4 Method of chemical preservation (for example, not preserved, chemical used),
4.3.5 Chain-of-custody requirements, if any,
4.3.6 Analytical methods that must be used, if any,
4.3.7 Measurement quality requirements expressed as DQOs or MQOs and action limits,
4.3.8 Allowable interferences as described in 10.4,
4.3.9 Documentation requirement, and
4.3.10 Subcontracting restrictions/requirements.

4.4 Users/decision makers should consult with the laboratory about these issues during the analytical design stage. This will allow the design of sample collection process and project schedule to accommodate the laboratory activities necessary to determine the desired level of measurement quality. The number of samples, budgets, and schedules should also be discussed.

5. Limitations and Assumptions

5.1 This guide deals only with samples from the time the laboratory receives the samples until the time the analytical results are provided to the user including necessary documentation.

5.2 Aspects of environmental measurements that are within the control of the laboratory are normally specified by the project stakeholders in the form of MQOs. MQOs are a subset of the data quality objectives (DQOs). The DQOs describe the overall measurement quality and tolerable error of the decision for the project while the MQOs describe the uncertainty of the analytical process only. The DQO overall level of uncertainty includes uncertainty from both sampling and environmental laboratory measurement operations. Additional information on the DQO process and establishing the level of analytical uncertainty can be found in the references provided in Section 2.

5.3 This guide applies whether the measurements are performed in a fixed location or in the field (on-site).

5.4 This guide assumes that the laboratory is operating with all administrative and analytical systems functioning within the quality assurance and quality control protocols and procedures described in their quality system documents (quality assurance plan and standard operating procedures).

5.5 This guide does not address multi-laboratory approaches to demonstrating acceptable laboratory performance such as collaborative testing, inter-laboratory studies, or round-robin types of studies.

6. Outline of Approach

6.1 The approach set forth in this guide employs two fundamental properties of measurement systems: bias and precision to determine the quality of the analytical results. The guide singles out selectivity, a component of bias, for special emphasis. Sensitivity is also discussed since, unless a measurement system is sensitive enough to measure the analytes of interest at the level of interest, it is not capable of being used for the purpose at hand. Both areas are frequently highlighted for demonstration in acceptable environmental measurement collection efforts.

6.2 This guide provides examples of approaches that determine bias, precision, selectivity, and sensitivity of a measurement system used to analyze a set of samples. It also provides examples of factors laboratories should consider in designing the demonstration.

6.3 This guide describes, in general terms, the rigor of the demonstration of bias, precision, selectivity, and sensitivity that should be conducted for a set of samples. It describes the appropriate use of public literature and historical laboratory performance information to minimize the need to collect additional experimental measurements.

6.4 When analytical performance results are already available on the measurement system's response to the type of sample to be analyzed (for example, historical results from the laboratory conducting the demonstration, method developer information), such information may be used to determine one or more of the measurement properties (that is, bias, precision, selectivity, sensitivity). Only very limited amounts of new measurements would then be necessary to support the conclusions drawn from the existing information.

6.5 This guide is intended to offer users a technically defensible strategy to determine the applicability of an analytical technique to a set of environmental samples. The complexity of the problem, the available resources (trained staff, equipment, and time), and the intended use of the analytical results require the application of professional judgment in selecting the best available option to meet the project-specific needs. The following sections present the user with a variety of options to determine bias, precision, selectivity, and sensitivity. The discussion of these options does not recommend one over another. However, there are general principles that can assist the user in selecting an appropriate option.

6.6 The laboratory should select the available option that will provide the information needed to determine if the measurements meet the required level of quality (as defined by …

6.6.1 Option 1—… when the measurement system is shown to yield the same results as another system that employs a fundamentally different measurement principle. The likelihood is small that two analytical techniques will experience the same systematic errors and will be subject to the same types of chemical and physical interferences. If two such analytical techniques agree, the possibility of unknown systematic errors is substantially decreased. Therefore, showing that a different measurement technique yields the same results as the subject technique serves to validate the ability of the subject system to yield valid measurements. If the two techniques disagree, there is a possibility of systematic or random error in one or both techniques.

6.6.2 Option 2—The next lower level of certainty is obtained by determining the bias, precision, sensitivity, and selectivity of the candidate measurement system using reference materials provided by NIST, or some other appropriate national certifying authority (for example, Standards Canada, DIN). Such reference materials would have been confirmed by the use of multiple methods, each using a different analytical principle. Comparison of the test results from new methods with published reference values on such materials can be used to determine measurement system bias. Commercially produced reference materials may also be used, but the true values are usually developed using only one (sometimes two) analytical technique(s). The reliable use of reference standards is extremely sensitive to the degree that the reference materials have the same matrix/analyte physical properties and chemistry as the project samples. If the match of the properties between the project samples and the reference materials is poor, the study results can be misleading.

6.6.3 Option 3—The lack of availability of more than one analytical method (no alternative technology or resources) or of appropriate reference materials will prevent use of the techniques mentioned above. When this is the case, the use of matrix spikes and surrogates becomes the "best available technology" and can be a reliable option. As in all analytical studies, the analyst must support conclusions with scientific rationale, including the statistical basis of the number of samples analyzed, the evaluation of experimental measurements, and the limitations of the study.

6.6.3.1 Inorganic Matrix Spikes—While matrix spikes can …

Footnotes: This guide is under the jurisdiction of ASTM Committee D34 on Waste Management and is the direct responsibility of Subcommittee D34.01.01 on Planning for Sampling. Current edition approved Sept. 1, 2017. Published October 2017. Originally approved in 2003. Last previous edition approved in 2011 as D6956–11. DOI: 10.1520/D6956-17. For referenced ASTM standards, visit the ASTM website, www.astm.org, or contact ASTM Customer Service at service@astm.org. For Annual Book of ASTM Standards volume information, refer to the standard's Document Summary page on the ASTM website. The last approved version of this historical standard is referenced on www.astm.org. Available from National Institute of Standards and Technology (NIST), 100 Bureau Dr., Stop 1070, Gaithersburg, MD 20899-1070, http://www.nist.gov. Available from American National Standards Institute (ANSI), 25 W. 43rd St., 4th Floor, New York, NY 10036, http://www.ansi.org. Available from http://www.citac.cc/QUAM2000–1.pdf. Copyright © ASTM International, 100 Barr Harbor Drive, PO Box C700, West Conshohocken, PA 19428-2959, United States.
theuser/decisionmaker).Thenecessarylevelofqualityshould
be a valuable tool in demonstrating the validity of the
be available from the project data quality requirements, DQOs
measurement, the uncertainty associated with the chemical
or MQOs. This guide assumes that the laboratory and users
form of metals in the sample and the mechanism by which it is
have sufficient familiarity (or access to qualified individuals)
incorporated into the sample matrix diminishes the value of
thatcanbalancethetrade-offsassociatedwiththeMQOs,such
this technique compared to the previous two mentioned above.
that rigid standards are not applied but rather the pooled effect
In general, matrix spikes are made from known amounts of the
(overall analytical uncertainty) of all items affecting measure-
compounds or elements (most often in solution) added to the
ment usability (bias, precision, selectivity, sensitivity) are
project sample. The form of the target metal in the sample
considered. The following options are ranked from the most
matrixisunlikelytobethesameastheformofthetargetmetal
reliable(Option1)totheleastreliable(Option4)andshouldbe
consideredinlightoftheoverallprojectgoals.Thisguidedoes inthespikingmaterial.Thismayleadtoahighrecoveryofthe
spiked material (because it’s in a readily soluble form) com-
notproposeaspecificsetofproceduralstepsbecauseeachcase
is different and must be addressed by a consensus process pared to the recovery of the target metal originally present in
involving appropriate representatives from the stakeholders. thematrix.Thiscouldleadtotheerroneousconclusionthatthe
proposed method is efficient in recovering and quantitating the
6.6.1 Option 1—The most certainty in showing that a
measurement system is free of unacceptable bias is obtained target analytes in the sample.
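The reference-material comparison in 6.6.2 amounts to computing the difference between the mean of replicate results and the published certified value. A minimal sketch of that arithmetic (the replicate concentrations, certified value, and units below are hypothetical, not taken from this guide):

```python
from statistics import mean

def rm_bias(results, certified):
    """Bias of a measurement system against a reference material:
    mean of replicate results minus the certified value."""
    b = mean(results) - certified
    return b, 100.0 * b / certified  # absolute bias, percent bias

# Hypothetical: five replicate analyses of an RM certified at 50.0 mg/L
abs_bias, pct_bias = rm_bias([51.2, 49.8, 50.9, 51.5, 50.6], 50.0)
# abs_bias is about +0.8 mg/L; pct_bias is about +1.6%
```

Whether a bias of that size is acceptable is not decided by the calculation; it is judged against the project MQOs.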
6.6.3.2 Organic Matrix Spikes—Matrix spikes of organic compounds suffer from similar limitations based on the degree and type of association between the target organic analyte and the sample matrix. In addition, the spiking vehicle (for example, solvent) must be compatible with the matrix to get the spike distributed properly into the matrix. Most field samples are "aged," and the analytes may become much more intimately associated with the matrix than the spiking compounds, which are in contact with the matrix for only very short periods of time prior to extraction and isolation for analysis.

6.6.3.3 Surrogates—The use of surrogates (used as a measure of analyte recovery of an analytical process) is a reliable means of demonstrating that the analytical technique is being performed correctly when their recoveries are high and within the statistically defined variance normally associated with their use. Calculation of surrogate recovery can be performed using either the reported concentration of the surrogate or the total response (peak area or height) of the analytical signal. This technique suffers from the same limitations as discussed above for matrix spikes. Additionally, more uncertainty is introduced if materials selected as surrogates do not perform in the same manner as the target analyte in the sample matrix. The use of compounds outside the list of those normally used in the determination of the target analytes must be preceded by studies demonstrating that the chosen compounds have a clearly defined correlation with the target analytes. The use of surrogates determines method performance compared to historical levels (developed from statistically derived acceptance criteria). This option does not determine the ability of a method to return the true value of analyte in the matrix since it does not involve the target analytes.

6.7 Option 4, Use of Historical Analytical Results—Performing additional studies may not be necessary to show that the proposed analytical protocol is appropriate unless required by the user/decision maker. In some instances, historical analytical results alone or in combination with abbreviated studies will suffice. The user should be informed of the laboratory's plan to use historical data to support the project. The user may elect to have the actual bias, precision, or sensitivity evaluated experimentally. Proprietary or confidential information should not be used because review and evaluation may not be possible. Examples of the use of prior studies include but are not limited to:

6.7.1 Use of an extensive database on the performance (that is, bias, precision, sensitivity) of the candidate measurement system on project samples.

6.7.2 Validation that the measurement system was rugged/robust and that the bias, precision, sensitivity, and selectivity of the measurement system are well documented in available literature or reports for the analyte/matrix combination of interest.

6.7.3 The sample matrix of concern (for example, clay soils) is similar to other samples that the laboratory is familiar with and has historical analytical results for, requiring only abbreviated tests to verify applicability, such as performing a limited number of spike additions to splits of field samples.

6.7.4 The sample matrix and analytes are relatively simple (for example, drinking water, water from a clean surface stream), and bias, precision, and sensitivity analytical results on the application of the measurement system to the analyte/matrix exist in the literature.

6.8 Many inorganic and organic analyses rely on a sample preparation method prior to the determinative method to isolate the analytes of concern from the matrix. The use of new or modified preparative techniques is a viable way to achieve project objectives. The use of any preparative steps must be fully evaluated using the above options.

6.9 The subsequent sections discuss the application of these techniques to the demonstration of bias, precision, selectivity, and sensitivity in more detail. In many cases, the strengths and weaknesses of the techniques are explained for the individual application.

7. Bias

7.1 Definition of Bias—Bias is the difference between the value determined using the measurement system in question and the true value; operationally, the difference between the sample mean and an accepted true value. Bias can be negative or positive (that is, the average of the measured values can be less than or more than the true value, respectively). Bias can be expressed in two ways: absolute bias (for example, the bias is −2 mg/L) or percent bias (for example, the bias is +20%). Method selectivity is an important element of analytical bias. Because of its importance, it is discussed separately in Section 10.

7.2 Demonstration of Bias—Ideally, the user will define the question to be answered by the information-gathering study and the level of uncertainty that is acceptable (the DQO). Alternatively, the user may specify an acceptable level or range of bias (for example, a range of 20% of the true concentration) for the laboratory to achieve. Through the use of the techniques described below, the laboratory determines the bias (if any) of the measurement system (including both the analytical technique and the operator) in the matrix representative of those encountered in the project. This performance is then compared to the project MQOs.

7.3 Guidance on Demonstration of Bias—Demonstration of bias may be made through the conduct of new bias studies, the use of historical analytical results, or some combination.

7.3.1 Conduct of New Bias Studies—There are four generally accepted techniques available for determining the bias of a measurement system. In order of technical defensibility, these are:

7.3.1.1 Analysis of split samples using both the method to be verified and a second method that employs a fundamentally different measurement principle,

7.3.1.2 Analysis of a reference material (RM) whose matrix is analytically representative of the samples and contains the analyte at a concentration appropriate to the study,

7.3.1.3 Analysis of split samples using the method to be verified and a different but similar method, of known variability, that has been validated for the application by a recognized methods certification organization (for example, U.S. Environmental Protection Agency (EPA), ASTM, ISO, American Public Health Association) for the analytes of concern in the matrix of concern, and

7.3.1.4 Analysis of matrix spike samples.
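The split-sample comparison in 7.3.1.1 is typically evaluated with a paired significance test on the two methods' results, as 7.3.3.3 describes. A sketch using only the standard library; the data and the hard-coded critical value (two-sided 95%, 4 degrees of freedom) are illustrative assumptions, and a t-table or statistics package would normally supply the critical value for the actual design:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(method_a, method_b):
    """Paired t statistic for split samples analyzed by two methods."""
    d = [a - b for a, b in zip(method_a, method_b)]
    return mean(d) / (stdev(d) / sqrt(len(d)))

# Hypothetical split-sample results (mg/L) from the subject and comparison methods
subject = [10.2, 11.0, 9.8, 10.5, 10.9]
comparison = [10.0, 10.8, 9.9, 10.3, 10.7]
t = paired_t(subject, comparison)
T_CRIT = 2.776  # assumed two-sided 95% critical value for df = 4
methods_differ = abs(t) > T_CRIT  # here: no significant difference
```

A non-significant difference supports, but does not by itself prove, a lack of bias; per 7.3.3.3, a significant difference calls for additional testing.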
7.3.2 The user is cautioned that the design of the experiments and the number of replicates necessary to determine bias may not be a trivial exercise. Careful consideration must be given to the estimated level of target analytes, method sensitivity, and the presence of interferences. The design of the experiments must make appropriate use of statistical techniques to ensure that project objectives are met.

7.3.3 The choice among options depends on the available RMs, the number of viable analytical techniques, the available spiking materials, and the complexity of the sample matrix and its constituents. Each of these options is discussed in more detail in the following sections.

7.3.3.1 Option 1, Reference Materials—Under this approach, samples of an RM are analyzed and the results compared to the known amount of the analyte (that is, the certified amount). The difference between the average analysis results and the known analyte concentration is the bias. Performing bias studies with RMs is useful if the field samples being tested are in relatively well-defined matrices (for example, tap water, coal fly ash). When matrices become complex (for example, soils, which can be combinations of clays, silts, sands, and organic matter), RMs may have limited value because they may not closely resemble the field samples. Similarly, when the contaminant mix is complex (for example, numerous compounds with chromatographic behavior similar to the compound being sought), RMs may be of limited value because of interferences (see the discussion on selectivity in Section 10). Performing RM and spike tests may not accurately characterize and measure the analytes in the field sample because RMs are unlikely to contain the same number and concentration of individual compounds present in the original sample. Finally, RMs are not available for many types of analyte/matrix combinations.

7.3.3.2 Option 2, Comparison to an Alternative Measurement Technique Using a Fundamentally Different Principle—Another approach to determining bias is the comparison of the analytical results from a candidate measurement system with those of an alternative measurement system that uses a fundamentally different science. The second technique should be recognized in the available literature as being applicable to the problem. Multiple measurement systems based on different scientific principles are unlikely to be subject to the same types of interferences and other problems. Therefore, when the same results are obtained using different methods, a high degree of confidence can be attached to the results. For the alternative technique approach to be scientifically valid, it is important that not only the determinative step be changed but also any preparative steps, to ensure that the preparative step is not the accuracy-limiting step.

7.3.3.3 Option 3, Comparison to a Recognized Reference Method—Another approach to determining bias is to compare the analytical results from the candidate measurement system to those of an alternative measurement system that has been validated for the application by a recognized methods certification organization (for example, EPA, ASTM, ISO, and American Public Health Association). To use this approach, the field sample is split and the splits are analyzed using both measurement systems. Similar results using both methods can be used to determine a lack of bias on the part of the subject method. Statistical analysis should be conducted on the two sets of results to determine whether the two methods yield significantly different results. If the two methods do not give the same results (that is, the difference is statistically significant), then additional testing will be necessary to determine the lack of bias or to determine the level of bias.

7.3.3.4 Option 4, Matrix Spikes—In this approach, known quantities of the analyte of concern are added to one or more aliquots of the field samples, the samples are analyzed, and the results are compared to the amount of added spike. The level of the spike should be close to the concentration of analyte anticipated to be in the field sample (for example, if the field sample is analyzed at 10 mg/L of the analyte of concern, then the spike should ideally also be near 10 mg/L). If too little of the analyte is used for the spiking, its presence may be masked. Masking occurs when the difference between the amount of added spike and its measured response is within the normal analytical variance of the amount present in the original sample. If too much is used, the spike can mask the effect of interfering compounds originally present because the analytical variance of the measured response of the spiked sample exceeds the signal of the analyte in the original sample. For these reasons, it is important that the amount of added spike be based on the estimated value of the target analyte after the field sample has been diluted to fall within the calibration range of the analytical method. When dilution of the field sample is required, the correct amount of spike should be added after the sample has been diluted to the correct range. Each of the spiked samples is then analyzed using the candidate measurement system. The average of the results of such analysis (for example, 22 mg/L) is compared with the result of measurement of an unspiked sample (that is, 10 mg/L). The arithmetic difference between the unspiked result and the spiked sample average (22 − 10, or 12 mg/L) is compared to the known amount of the spike (10 mg/L). The amount of the spike that is recovered (12/10, or 120%) indicates the bias is a positive 20%. Where spiking is done properly and the physical and chemical properties of the sample are simple, the matrix spiking technique can produce an accurate measure of bias. For spiking to be valid, it should be performed using the actual sample matrix and mix of target analytes.

8. Precision

8.1 Definition of Precision—A measure of the scatter of measurement system test results obtained from samples that are ostensibly the same (for example, taken at the same time and location or from the same container).

8.2 Demonstration of Precision—Precision is determined by measuring the scatter or variability of the measurements resulting from replicate measurements of the same material. The desired level of precision should be specified by the user. It usually takes the form of an acceptable measurement system variability, for example, 10% relative standard deviation (RSD), or the range about the average that equates to a specified degree of confidence (for example, the true value lies within the range X̄ ± 3σ, where σ is the standard deviation and the desired
level of confidence is 99%). It is important that the demonstration of precision be determined at the project action level (AL). The precision of most analytical techniques decreases when the concentration of the analyte decreases in the samples. Failure to match the demonstration to the action level will lead to an incorrect estimate of precision where it is most important, the action level.

8.3 Guidance on Demonstration of Precision—Precision may be determined by new precision studies, the use of historical analytical results from prior studies, the measured variability of the project samples, or analysis of laboratory control samples that are representative of the analyte concentration and matrix of concern. The following are examples of approaches that may be used to determine and document precision.

8.3.1 Project Samples—Analysis of multiple samples of project material (for example, a series of effluent or waste samples taken over a period of time, a collection of soil samples taken from various points at a site, a series of hourly air samples) containing the analyte of interest will determine overall project-specific precision. Additionally, when the analytical results are obtained under a statistical design, the data can be analyzed using analysis-of-variance techniques to decompose the total variance into components due to sample variability and the variability (precision) of the measurement system. Sample variability may be composed of variance between field samples, subsampling variance, and differences in sample preparation. Note that this approach cannot be used to determine the precision of the measurement system alone (see 8.3.4) since it measures the total variability, which consists of both sample variability and measurement system variability.

8.3.3 Surrogates—Surrogates mimic the behavior of the analytes of interest in the analytical procedure but are not naturally present in the samples analyzed. Surrogates are added to each sample prior to sample preparation (or when specified in the method). The percentage recovery monitors the extraction efficiency and any unusual matrix effects. The variability of surrogate recovery from multiple samples measures the precision of the measurement system at the surrogate concentration being used. This approach can be used if the analyte of interest is not commercially available or is too dangerous, toxic, or unstable (that is, has a poor shelf life).

8.3.4 Reference Materials or Laboratory Control Samples (LCS)—These materials, normally used to ensure that the laboratory is operating in control, can also be used to assess measurement system precision. Such materials should be selected to provide a sample with analytically similar properties to those of the actual samples to be analyzed (matrix and concentrations similar to project samples). Reference materials and LCSs evaluate the precision of the entire measurement system, including the sample preparation, cleanup, and determinative steps. If needed, this approach can be used to determine the precision of the determinative step alone as long as no preparation or other steps are required before the determinative step. When using reference materials whose certified analyte values and precision were obtained using a method that is different from the subject method, the precision obtained from the candidate method may be different from the certified precision. This difference indicates bias (see Section 7) between the two methods. The use of laboratory control samples as an indicator of laboratory precision is inappropriate if the sample matrix is much more complex than the matrix of the LCS.
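The analysis-of-variance decomposition mentioned in 8.3.1 can be sketched for the balanced case in which each field sample is analyzed the same number of times: the within-sample mean square estimates measurement precision, and the excess between-sample spread estimates sample variability. The duplicate results below are hypothetical:

```python
from statistics import mean, variance

def variance_components(groups):
    """One-way ANOVA split of total variability for replicate analyses
    of several field samples (equal replicate counts assumed).
    Returns (measurement variance, between-sample variance component)."""
    n = len(groups[0])                                    # replicates per sample
    ms_within = mean(variance(g) for g in groups)         # measurement precision
    ms_between = n * variance([mean(g) for g in groups])  # between-sample mean square
    return ms_within, max((ms_between - ms_within) / n, 0.0)

# Hypothetical duplicate results (mg/kg) for three field samples
meas_var, sample_var = variance_components([[9.9, 10.1], [12.0, 12.4], [8.1, 7.9]])
```

In this example most of the spread comes from the samples themselves, which is exactly why 8.3.1 cautions that replicate project samples measure total variability rather than measurement system precision alone.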
This document is not an ASTM standard and is intended only to provide the user of an ASTM standard an indication of what changes have been made to the previous version. Because
it may not be technically possible to adequately depict all changes accurately, ASTM recommends that users consult prior editions as appropriate. In all cases only the current version
of the standard as published by ASTM is to be considered the official document.
Designation: D6956 − 11 D6956 − 17
Standard Guide for
Demonstrating and Assessing Whether a Chemical
Analytical Measurement System Provides Analytical Results
Consistent with Their Intended Use
This standard is issued under the fixed designation D6956; the number immediately following the designation indicates the year of
original adoption or, in the case of revision, the year of last revision. A number in parentheses indicates the year of last reapproval. A
superscript epsilon (´) indicates an editorial change since the last revision or reapproval.
1. Scope
1.1 This guide describes an approach for demonstrating the quality of analytical chemical measurement results from the
application of a measurement system (that is, method or sequence of methods) to the analysis of environmental samples of soil,
water, air, or waste. The purpose of such measurements can include demonstrating compliance with a regulatory limit, determining
whether a site is contaminated above some specified level, or determining treatment process efficacy.
1.2 This guide describes a procedure that can be used to assess a measurement system used to generate analytical results for
a specific purpose. Users and reviewers of the analytical results can determine, with a known level of confidence, if they meet the
quality requirements and are suitable for the intended use.
1.3 This protocol does not address the general components of laboratory quality systems necessary to ensure the overall quality
of laboratory operations. For such systems, the user is referred to International Standards Organization (ISO) Standard 17025 or
the National Environmental Laboratory Accreditation Conference (NELAC) laboratory accreditation standards.
1.4 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard.
1.5 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility
of the user of this standard to establish appropriate safety safety, health, and healthenvironmental practices and determine the
applicability of regulatory requirementslimitations prior to use.
1.6 This international standard was developed in accordance with internationally recognized principles on standardization
established in the Decision on Principles for the Development of International Standards, Guides and Recommendations issued
by the World Trade Organization Technical Barriers to Trade (TBT) Committee.
2. Referenced Documents
2.1 ASTM Standards:
D4687 Guide for General Planning of Waste Sampling
D5283 Practice for Generation of Environmental Data Related to Waste Management Activities: Quality Assurance and Quality
Control Planning and Implementation
D5681 Terminology for Waste and Waste Management
D5792 Practice for Generation of Environmental Data Related to Waste Management Activities: Development of Data Quality
Objectives
D5956 Guide for Sampling Strategies for Heterogeneous Wastes
D6044 Guide for Representative Sampling for Management of Waste and Contaminated Media
D6233 Guide for Data Assessment for Environmental Waste Management Activities (Withdrawn 2016)
D6250 Practice for Derivation of Decision Point and Confidence Limit for Statistical Testing of Mean Concentration in Waste
Management Decisions
This guide is under the jurisdiction of ASTM Committee D34 on Waste Management and is the direct responsibility of Subcommittee D34.01.01 on Planning for
Sampling.
Current edition approved July 1, 2011Sept. 1, 2017. Published August 2011October 2017. Originally approved in 2003. Last previous edition approved in 20032011 as
D6956 – 03.D6956 – 11. DOI: 10.1520/D6956-11.10.1520/D6956-17.
For referenced ASTM standards, visit the ASTM website, www.astm.org, or contact ASTM Customer Service at service@astm.org. For Annual Book of ASTM Standards
volume information, refer to the standard’s Document Summary page on the ASTM website.
The last approved version of this historical standard is referenced on www.astm.org.
Copyright © ASTM International, 100 Barr Harbor Drive, PO Box C700, West Conshohocken, PA 19428-2959. United States
D6956 − 17
D6311 Guide for Generation of Environmental Data Related to Waste Management Activities: Selection and Optimization of
Sampling Design
D6582 Guide for Ranked Set Sampling: Efficient Estimation of a Mean Concentration in Environmental Sampling (Withdrawn
2012)
D6597 Practice for Assessment of Attaining Clean Up Level for Site Closure (Withdrawn 2016)
2.2 Other Documents:
Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results, National Institute of Standard
Technology Technical Note 1297, 1994
ISO/IEC 17025:1999 General Requirements for the Competence of Testing and Calibration Laboratories
Quantifying Uncertainty in Analytical Measurement, EURACHEM/ CITAC Guide, second edition,Second Edition, 2000
3. Terminology
3.1 For definitions of terms used in this guide, refer to Terminology D5681.
3.2 Definitions:
3.2.1 action level (AL)—the level above or below which will lead to the adoption of one of two alternative actions.
3.2.2 measurement quality objectives (MQOs)—quantitative statements of the acceptable level of selectivity, sensitivity, bias,
and precision for measurements of the analyte of interest in the matrix of concern.
3.2.3 measurement system—all elements of the analytical process including laboratory subsampling, sample preparation and
cleanup, and analyte detection and quantitation, including the analysts.
3.2.4 method of standard additions—the addition of a series of known amounts of the analytes of interest to more than one
aliquot of the sample as a means of correcting for interferences.
3.2.5 selectivity—the ability to accurately measure the analyte in the presence of other sample matrix components or analytical
process contaminants.
3.2.6 surrogate—a substance with properties that mimic the performance of the analyte of interest in the measurement system,
but which is not normally found in the sample of concern and is added for quality control purposes.
4. Significance and Use
4.1 This guide is intended for use by both generators and users of analytical results. It is intended to promote consistent
demonstration and documentation of the quality of the measurement results and facilitate determination of the validity of
measurements for their intended use.
4.2 This guide specifies documentation that a laboratory should supply with the analytical results to establish that the resulting
measurements: (1) meet measurement quality requirements; (2) are suitable for their intended use; and (3) are technically
defensible.
4.3 While the guide describes information that the measurement results provider needs to give the user/decision maker, in order
for measurement providers to supply data users with appropriate data, information is needed from the data user. Examples of
information that the user should provide to the laboratory, in addition to the analytes of concern (including the form of the analyte
that is to be determined, for example, total lead, dissolved lead, organic lead, inorganic lead), include but are not limited to:
4.3.1 Type of material (that is, matrix—fresh or salt water, coal fly ash, sandy loam soil, wastewater treatment sludge),
4.3.2 Maximum sample holding time,
4.3.3 Projected sampling date and delivery date to the laboratory,
4.3.4 Method of chemical preservation (for example, not preserved, chemical used),
4.3.5 Chain-of-custody requirements, if any,
4.3.6 Analytical methods that must be used, if any,
4.3.7 Measurement quality requirements expressed as DQOs or MQOs and action limits,
4.3.8 Allowable interferences as described in 10.4,
4.3.9 Documentation requirement, and
4.3.10 Subcontracting restrictions/requirements.
4.4 Users/decision makers should consult with the laboratory about these issues during the analytical design stage. This will
allow the design of sample collection process and project schedule to accommodate the laboratory activities necessary to determine
the desired level of measurement quality. The number of samples, budgets, and schedules should also be discussed.
Available from National Institute of Standards and Technology (NIST), 100 Bureau Dr., Stop 1070, Gaithersburg, MD 20899-1070, http://www.nist.gov.
Available from American National Standards Institute (ANSI), 25 W. 43rd St., 4th Floor, New York, NY 10036, http://www.ansi.org.
Available from http://www.citac.cc/QUAM2000–1.pdf.
D6956 − 17
5. Limitations and Assumptions
5.1 This guide deals only with samples from the time the laboratory receives the samples until the time the analytical results
are provided to the user including necessary documentation.
5.2 Aspects of environmental measurements that are within the control of the laboratory are normally specified by the project
stakeholders in the form of MQOs. MQOs are a subset of the data quality objectives (DQOs). The DQOs describe the overall
measurement quality and tolerable error of the decision for the project while the MQOs describe the uncertainty of the analytical
process only. The DQO overall level of uncertainty includes uncertainty from both sampling and environmental laboratory
measurement operations. Additional information on the DQO process and establishing the level of analytical uncertainty can be
found in the references provided in Section 2.
5.3 This guide applies whether the measurements are performed in a fixed location or in the field (on-site).
5.4 This guide assumes that the laboratory is operating with all administrative and analytical systems functioning within the
quality assurance and quality control protocols and procedures described in their quality system documents (quality assurance plan
and standard operating procedures).
5.5 This guide does not address multi-laboratory approaches to demonstrating acceptable laboratory performance such as
collaborative testing, inter-laboratory studies, or round-robin types of studies.
6. Outline of Approach
6.1 This guide uses the concepts of bias and precision to describe uncertainty in a measurement system. The approach set forth
in this guide employs two fundamental properties of measurement systems: bias and precision to determine the quality of the
analytical results. The guide singles out selectivity, a component of bias, for special emphasis. Sensitivity is also discussed since,
unless a measurement system is sensitive enough to measure the analytes of interest at the level of interest, it is not capable of being
used for the purpose at hand. Both areas are frequently highlighted for demonstration in acceptable environmental measurement
collection efforts.
6.2 This guide provides examples of approaches that determine bias, precision, selectivity, and sensitivity of a measurement
system used to analyze a set of samples. It also provides examples of factors laboratories should consider in designing the
demonstration.
6.3 This guide describes, in general terms, the rigor of the demonstration of bias, precision, selectivity, and sensitivity that
should be conducted for a set of samples. It describes the appropriate use of public literature and historical laboratory performance
information to minimize the need to collect additional experimental measurements.
6.4 When analytical performance results are already available on the measurement system’s response to the type of sample to
be analyzed (for example, historical results from the laboratory conducting the demonstration, method developer information),
such information may be used to determine one or more of the measurement properties (that is, bias, precision, selectivity,
sensitivity). Only very limited amounts of new measurements would then be necessary to support the conclusions drawn from the
existing information.
6.5 This guide is intended to offer users a technically defensible strategy to determine the applicability of an analytical technique
to a set of environmental samples. The complexity of the problem, the available resources (trained staff, equipment, and time), and
the intended use of the analytical results require the application of professional judgment in selecting the best available option to
meet the project-specific needs. The following sections present the user with a variety of options to determine bias, precision,
selectivity, and sensitivity. The discussion of these options does not recommend one over another. However, there are general
principles that can assist the user in selecting an appropriate option.
6.6 The laboratory should select the available option that will provide the information needed to determine if the measurements
meet the required level of quality (as defined by the user/decision maker). The necessary level of quality should be available from
the project data quality requirements, DQOs or MQOs. This guide assumes that the laboratory and users have sufficient familiarity
(or access to qualified individuals) to balance the trade-offs associated with the MQOs, such that rigid standards are not
applied but rather the pooled effect (overall analytical uncertainty) of all items affecting measurement usability (bias, precision,
selectivity, sensitivity) is considered. The following options are ranked from the most reliable (Option 1) to the least reliable
(Option 4) and should be considered in light of the overall project goals. This guide does not propose a specific set of
procedural steps because each case is different and must be addressed by a consensus process involving appropriate representatives
from the stakeholders.
6.6.1 Option 1—The most certainty in showing that a measurement system is free of unacceptable bias is obtained when the
measurement system is shown to yield the same results as another system that employs a fundamentally different measurement
principle. The likelihood is small that two analytical techniques will experience the same systematic errors and will be subject to
the same types of chemical and physical interferences. If two such analytical techniques agree, the possibility of unknown
systematic errors is substantially decreased. Therefore, showing that a different measurement technique yields the same results as
D6956 − 17
the subject technique serves to validate the ability of the subject system to yield valid measurements. If the two techniques
disagree, there is a possibility of systematic or random error in one or both techniques.
6.6.2 Option 2—The next lower level of certainty is obtained by determining the bias, precision, sensitivity, and selectivity of
the candidate measurement system using reference materials provided by NIST, or some other appropriate national certifying
authority (for example, Standards Canada, DIN). Such reference materials would have been confirmed by the use of multiple
methods, each using a different analytical principle. Comparison of the test results from new methods with published reference
values on such materials can be used to determine measurement system bias. Commercially produced reference materials may also
be used, but the true values are usually developed using only one (sometimes two) analytical technique(s). The reliable use of
reference standards is extremely sensitive to the degree that the reference materials have the same matrix/analyte physical
properties and chemistry as the project samples. If the match of the properties between the project samples and the reference
materials is poor, the study results can be misleading.
6.6.3 Option 3—The lack of availability of more than one analytical method (no alternative technology or resources) or of
appropriate reference materials will prevent use of the techniques mentioned above. When this is the case, the use of matrix spikes
and surrogates becomes the “best available technology” and can be a reliable option. As in all analytical studies, the analyst must
support conclusions with scientific rationale, including the statistical basis of the number of samples analyzed, the evaluation of
experimental measurements, and the limitations of the study.
6.6.3.1 Inorganic Matrix Spikes—While matrix spikes can be a valuable tool in demonstrating the validity of the measurement,
the uncertainty associated with the chemical form of metals in the sample and the mechanism by which it is incorporated into the
sample matrix diminishes the value of this technique compared to the previous two mentioned above. In general, matrix spikes
are made from known amounts of the compounds or elements (most often in solution) added to the project sample. The form of
the target metal in the sample matrix is unlikely to be the same as the form of the target metal in the spiking material. This may
lead to a high recovery of the spiked material (because it’s in a readily soluble form) compared to the recovery of the target metal
originally present in the matrix. This could lead to the erroneous conclusion that the proposed method is efficient in recovering and
quantitating the target analytes in the sample.
6.6.3.2 Organic Matrix Spikes—Matrix spikes of organic compounds suffer from similar limitations based on the degree and
type of association between the target organic analyte and the sample matrix. In addition, the spiking vehicle (for example, solvent)
must be compatible with the matrix to get the spike distributed properly into the matrix. Most field samples are “aged” and the
analytes may become much more intimately associated with the matrix than the spiking compounds which are only in contact with
the matrix for very short periods of time prior to extraction and isolation for analysis.
6.6.3.3 Surrogates—The use of surrogates (used as a measure of analyte recovery of an analytical process) is a reliable means
of demonstrating that the analytical technique is being performed correctly when their recoveries are high and within the
statistically defined variance normally associated with their use. Calculation of surrogate recovery can be performed using either
the reported concentration of the surrogate or the total response (peak area or height) of the analytical signal. This technique suffers
from the same limitations as discussed above with matrix spikes. Additionally, more uncertainty is introduced if materials selected
as surrogates do not perform in the same manner as the target analyte in the sample matrix. The use of compounds outside the list
of those normally used in the determination of the target analytes must be preceded by studies demonstrating that the chosen
compounds have a clearly defined correlation with the target analytes. The use of surrogates determines method performance
compared to historical levels (developed from statistically derived acceptance criteria). This option does not determine the ability
of a method to return the true value of analyte in the matrix since it does not involve the target analytes.
6.7 Option 4, Use of Historical Analytical Results—Performing additional studies may not be necessary to show that the
proposed analytical protocol is appropriate unless required by the user/decision maker. In some instances, historical analytical
results alone or in combination with abbreviated studies will suffice. The user should be informed of the laboratory’s plan to use
historical data to support the project. The user may elect to have the actual bias, precision, or sensitivity evaluated experimentally.
Proprietary information or confidential information should not be used because review and evaluation may not be possible.
Examples of the use of prior studies include but are not limited to:
6.7.1 Use of an extensive database on the performance (that is, bias, precision, sensitivity) of the candidate measurement system
on project samples.
6.7.2 Validation that the measurement system was rugged/robust and the bias, precision, sensitivity, and selectivity of the
measurement system are well documented in available literature or reports for the analyte/matrix combination of interest.
6.7.3 The sample matrix of concern (for example, clay soils) is similar to other samples that the laboratory is familiar with and
has historical analytical results, requiring only abbreviated tests to verify applicability such as performing a limited number of
spike additions to splits of field samples.
6.7.4 The sample matrix and analytes are relatively simple (for example, drinking water, water from a clean surface stream) and
bias, precision, and sensitivity analytical results on the application of the measurement system to the analyte/matrix exist in the
literature.
6.8 Many inorganic and organic analyses rely on a sample preparation method prior to the determinative method to isolate the
analytes of concern from the matrix. The use of new or modified preparative techniques is a viable way to achieve project
objectives. The use of any preparative steps must be fully evaluated using the above options.
6.9 The subsequent sections discuss the application of these techniques to the demonstration of the bias, precision, selectivity,
and sensitivity in more detail. In many cases, the strengths and weaknesses of the techniques are explained for the individual
application.
7. Bias
7.1 Definition of Bias—Bias is the difference between the value determined using the measurement system in question and the
true value; operationally, the difference between the sample mean and an accepted true value. Bias can be negative or positive (that
is, the average of the measured values can be less than or more than the true value, respectively). Bias can be expressed in two
ways: absolute bias (for example, the bias is −2 mg/L), or percent bias (for example, bias is +20 %). Method selectivity is an
important element of analytical bias. Because of its importance, it is discussed separately in Section 10.
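The two expressions of bias above can be restated as a short calculation. The sketch below uses hypothetical replicate values and an assumed accepted true value; it is an illustration of the definitions, not a procedure prescribed by this guide.

```python
import statistics

# Hypothetical replicate results (mg/L) and an assumed accepted true value.
measurements = [9.8, 10.4, 9.6, 10.0, 9.7]
true_value = 10.5

sample_mean = statistics.mean(measurements)
absolute_bias = sample_mean - true_value            # expressed in mg/L
percent_bias = 100.0 * absolute_bias / true_value   # expressed in %

print(f"mean = {sample_mean:.2f} mg/L")
print(f"absolute bias = {absolute_bias:+.2f} mg/L")
print(f"percent bias = {percent_bias:+.1f} %")
```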
7.2 Demonstration of Bias—Ideally, the user will define the question to be answered by the information gathering study and the
level of uncertainty that is acceptable (the DQO). Alternatively, the user may specify an acceptable level or range of bias (for
example, a range of 20 % of the true concentration) for the laboratory to achieve. Through the use of the techniques described
below, the laboratory determines the bias (if any) of the measurement system (including both the analytical technique and the
operator in the matrix representative of those encountered in the project). This performance is then compared to the project MQOs.
7.3 Guidance on Demonstration of Bias—Demonstration of bias may be made through the conduct of new bias studies, the use
of historical analytical results, or by some combination.
7.3.1 Conduct of New Bias Studies—There are four generally accepted techniques available for determining the bias of a
measurement system. In the order of technical defensibility, these are:
7.3.1.1 Analysis of split samples using both the method to be verified and a second method that employs a fundamentally
different measurement principle,
7.3.1.2 Analysis of a reference material (RM) whose matrix is analytically representative of the samples and contains the analyte
at a concentration appropriate to the study,
7.3.1.3 Analysis of split samples using the method to be verified and a different but similar method, of known variability, that
has been validated for the application by a recognized methods certification organization (for example, U.S. Environmental
Protection Agency (EPA), ASTM, ISO, American Public Health Association) for the analytes of concern in the matrix of
concern, and
7.3.1.4 Analysis of matrix spike samples.
7.3.2 The user is cautioned that the design of the experiments and number of replicates necessary to determine bias may not
be a trivial exercise. Careful consideration must be given to the estimated level of target analytes, method sensitivity, and the
presence of interferences. The design of the experiments must make appropriate use of statistical techniques to ensure that project
objectives are met.
7.3.3 The choice among options depends on the available RMs, the number of viable analytical techniques, the available spiking
materials, and the complexity of the sample matrix and its constituents. Each of these options is discussed in more detail in the
following sections.
7.3.3.1 Option 1, Reference Materials—Under this approach, samples of a RM are analyzed and the results compared to the
known amount of the analyte (that is, the certified amount). The difference between the average analysis results and the known
analyte concentration is the bias. Performing bias studies with RMs is useful if the field samples being tested are in relatively
well-defined matrices (for example, tap water, coal fly ash). When matrices become complex (for example, soils which can be
combinations of clays, silts, sands, organic matter) RMs may have limited value because they may not closely resemble the field
samples. Similarly, when the contaminant mix is complex (for example, numerous compounds with similar chromatographic
behavior to the compound being sought) RMs may be of limited value because of interferences (see discussion on selectivity in
Section 10). Performing RM and spike tests may not accurately characterize and measure the analytes in the field sample because
RMs are unlikely to contain the same number and concentration of individual compounds present in the original sample. Finally,
RMs are not available for many types of analyte/matrix combinations.
7.3.3.2 Option 2, Comparison to Alternative Measurement Technique Using a Fundamentally Different Technique—Another
approach to determine bias is the comparison of the analytical results from a candidate measurement system with those of an
alternative measurement system that uses a fundamentally different science. The second technique should be recognized in the
available literature as being applicable to the problem. Multiple measurement systems based on different scientific principles are
unlikely to be subject to the same types of interferences and other problems. Therefore, when the same results are obtained using
different methods, a high degree of confidence can be attached to the results. For the alternative
technique approach to be scientifically valid, it is important that not only the determinative step but also any preparative
steps be changed, to ensure that a shared preparative step is not the accuracy-limiting step.
7.3.3.3 Option 3, Comparison to a Recognized Reference Method—Another approach to determine bias is to compare the
analytic results from the candidate measurement system to those of an alternative measurement system that has been validated for
the application by a recognized methods certification organization (for example, EPA, ASTM, ISO, and American Public Health
Association). To use this approach, the field sample is split and the splits are analyzed using both measurement systems. Similar
results using both methods can be used to determine a lack of bias on the part of the subject method. Statistical analysis should
be conducted on the two sets of results to determine whether the two methods yield significantly different results. If the two
methods do not give the same results (that is, the difference is statistically significant), then additional testing will be necessary
to determine the lack of bias or to determine the level of bias.
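One common way to carry out the statistical comparison described above is a paired t test on the split-sample results. The sketch below uses hypothetical data and a hard-coded critical value from standard tables; it is an illustration under those assumptions, not a procedure prescribed by this guide.

```python
import statistics

# Hypothetical split-sample results (mg/L) from the candidate and reference methods.
candidate = [10.2, 11.5,  9.8, 10.9, 10.4, 11.1]
reference = [10.0, 11.2, 10.1, 10.7, 10.3, 11.0]

# Paired differences between the two methods on the same splits.
diffs = [c - r for c, r in zip(candidate, reference)]
mean_d = statistics.mean(diffs)
sd_d = statistics.stdev(diffs)
n = len(diffs)

# Paired t statistic, compared to the two-sided 95 % critical value.
t_stat = mean_d / (sd_d / n ** 0.5)
t_crit = 2.571  # t(0.975, df = 5), from standard tables

significant = abs(t_stat) > t_crit
print(f"t = {t_stat:.2f}, significant difference: {significant}")
```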
7.3.3.4 Option 4, Matrix Spikes—In this approach, known quantities of the analyte of concern are added to one or more aliquots
of the field samples, the samples are analyzed, and the results are compared to the amount of added spike. The level of the spike
should be close to the concentration of analyte anticipated to be in the field sample (for example, if the field sample is analyzed
at 10 mg/L of the analyte of concern, then the spike should ideally also be near 10 mg/L). If too little of the analyte is
used for the spiking, its presence may be masked. Masking occurs when the difference between the amount of added spike and
its measured response is within normal analytical variance of the amount present in the original sample. If too much is used, the
spike can mask the effect of interfering compounds originally present because the analytical variance of the measured response of
the spiked sample exceeds the signal of the analyte in the original sample. For these reasons, it is important that the amount of
added spike should be based on the estimated value of the target analyte after the field sample has been diluted to fall within the
calibration range of the analytical method. When dilution of the field sample is required, the correct amount of spike should be
added after the sample has been diluted to the correct range. Each of the spiked samples is then analyzed using the candidate
measurement system. The average of the results of such analysis (for example, 22 mg/L) is compared with the result of
measurement of an unspiked sample (that is, 10 mg/L). The arithmetic difference between the unspiked result and the spiked
sample average (22 − 10, or 12 mg/L) is compared to the known amount of the spike (10 mg/L). The amount of the spike that is
recovered (12⁄10, or 120 %) indicates the bias is a positive 20 %. Where spiking is done properly and the physical and chemical
properties of the sample are simple, the matrix spiking technique can produce an accurate measure of bias. For spiking to be valid,
it should be performed using the actual sample matrix and mix of target analytes.
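The spike-recovery arithmetic in the worked example above (10 mg/L unspiked result, 10 mg/L spike, 22 mg/L spiked-sample average) can be restated as a short calculation:

```python
# Spike-recovery example using the values from 7.3.3.4.
unspiked_result = 10.0        # mg/L measured in the unspiked sample
spike_added = 10.0            # mg/L of analyte added to the aliquot
spiked_average = 22.0         # mg/L average of the spiked-sample analyses

recovered = spiked_average - unspiked_result          # 12 mg/L
percent_recovery = 100.0 * recovered / spike_added    # 120 %
percent_bias = percent_recovery - 100.0               # +20 %

print(f"recovery = {percent_recovery:.0f} %, bias = {percent_bias:+.0f} %")
```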
8. Precision
8.1 Definition of Precision—A measure of the scatter of measurement system test results obtained from samples that are
ostensibly the same (for example, taken at the same time and location or from the same container).
8.2 Demonstration of Precision—Precision is determined by measuring the scatter or variability of the measurements resulting
from replicate measurements of the same material. The desired level of precision should be specified by the user. It usually takes
the form of an acceptable measurement system variability, for example, 10 % relative standard deviation (RSD) or the range of
the average that equates to a specified degree of confidence (for example, the true value lies within the range X̄ ± 3σ, where σ is the
standard deviation and the desired level of confidence is 99 %). It is important that the demonstration of precision be determined
at the project action level (AL). The precision of most analytical techniques decreases when the concentration of the analyte
decreases in the samples. Failure to match the demonstration to the action level will lead to an incorrect estimate of precision where
it is most important, the action level.
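As a minimal sketch of the precision measures mentioned above (percent RSD and the X̄ ± 3σ range), using hypothetical replicate data:

```python
import statistics

# Hypothetical replicate measurements of the same material, mg/L.
replicates = [9.9, 10.1, 10.0, 10.2, 9.8]

mean = statistics.mean(replicates)
sd = statistics.stdev(replicates)        # sample standard deviation
rsd = 100.0 * sd / mean                  # relative standard deviation, %

# Range corresponding to roughly 99 % confidence (X-bar ± 3 sigma).
low, high = mean - 3 * sd, mean + 3 * sd
print(f"mean = {mean:.2f}, RSD = {rsd:.1f} %, range = [{low:.2f}, {high:.2f}]")
```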
8.3 Guidance on Demonstration of Precision—Precision may be determined by new precision studies, the use of historical
analytical results from prior studies, the measured variability of the project samples, or analysis of laboratory control samples that
are representative of the analyte concentration and matrix of concern. The following are examples of approaches that may be used
to determine and document precision.
8.3.1 Project Samples—Analysis of multiple samples of project material (for example, a series of effluent or waste samples
taken over a period of time, a collection of soil samples taken from various points at a site, a series of hourly air samples)
containing the analyte of interest will determine overall project-specific precision. Additionally, when the analytical results are
obtained under a statistical design, the data can be analyzed using analysis of variance techniques to decompose the total variance
into components due to sample variability and the variability (precision) of the measurement system. Sample variability may be
composed of variance between field samples, subsampling variance, and differences in sample preparation. Note that this approach
cannot be used to determine the precision of the measurement system alone (see 8.3.4) since it measures the total variability, which
consists of the variance of the field sampling procedure (if one was necessary), and the variance of the measurement system. A
major benefit of this approach is that it may eliminate the need to determine measurement system precision if the overall variability
(sample preparation + measurement system + sample) is low enough to meet the study MQO/DQO. The use of this technique
assumes that the samples submitted for evaluation adequately represent the variability of the actual materials being evaluated.
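The variance decomposition described above can be sketched for the simple balanced case of duplicate analyses. The data below are hypothetical, and a full analysis-of-variance treatment would normally be applied; this sketch only shows how within-sample scatter isolates measurement-system variance from field-sample variability.

```python
import statistics

# Hypothetical duplicate analyses of several field samples, mg/L.
# Rows = field samples, columns = replicate measurements.
samples = [
    [12.1, 12.3],
    [15.6, 15.2],
    [ 9.8, 10.1],
    [13.4, 13.1],
]

# Within-sample variance estimates measurement-system variance alone;
# it is pooled across samples (equal replicate counts assumed).
within_vars = [statistics.variance(reps) for reps in samples]
measurement_var = statistics.mean(within_vars)

# Variance of the sample means reflects field-sample variability plus
# a share (1/n) of the measurement variance.
n_rep = len(samples[0])
means = [statistics.mean(reps) for reps in samples]
between_var = statistics.variance(means)
sample_var = max(between_var - measurement_var / n_rep, 0.0)

print(f"measurement variance ~ {measurement_var:.4f}")
print(f"field-sample variance ~ {sample_var:.4f}")
```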
8.3.2 Matrix Spikes—Measurement system precision can be determined by the analysis of replicate matrix spike samples. The
matrix spike is composed of analytes added to samples in known quantities and analyzed to assess the variability in recovery of
the analyte due to the sample preparation and analytical steps. The matrix spike is added as early in the process as possible to
ensure that as many sources of variability as possible can be evaluated. This means that matrix spikes should be added prior to
any sample preparation and cleanup steps. While this approach accounts for matrix specific effects, problems associated with
spiking can lead to the measured precision being better (that is, lower RSD) than it actually is. See discussion on problems
associated with spiking in 6.6.3 for further information. One benefit of this approach is that precision can often be assessed without
having to conduct additional analyses when spiked samples are also being used to determine analytical bias (see Section 7).
8.3.3 Surrogates—Surrogates are compounds that perform in a similar manner to the analytes of interest in the analytical
procedure but are not naturally present in the samples analyzed. Surrogates are added to each sample prior to sample preparation
(or when specified in the method). The percentage recovery monitors the extraction efficiency and any unusual matrix effects. The
variability of surrogate recovery from multiple samples measures the precision of the measurement system at the surrogate
concentration being used. This approach can be used if the analyte of interest is not commercially available or is too dangerous,
toxic, or is unstable (that is, has a poor shelf life).
8.3.4 Reference Materials or Laboratory Control Samples (LCS)—These materials, normally used to ensure that the laboratory
is operating in control, can also be used to assess measurement system precision. Such materials should be selected to provide a
sample with analytically similar properties to that of the actual samples to be analyzed (matrix and concentrations similar to project
samples). Reference materials and LCSs evaluate the precision of the entire measurement system including the sample preparation,
cleanup, and determinative steps. If needed, this approach can be used to determine the precision of the determinative step alone
as long as no preparation or other steps are required before the determinative step. When using reference materials whose certified
analyte values and precision were obtained using a method that is different from the subject method, the precision obtained from
the candidate method may be different from the certified precision. This difference indicates bias (see Section 7) between the two
methods. The use of laboratory control samples as an indicator of laboratory precision is inappropriate if the sample matrix is much
more complex than the matrix of the LCS.
8.4 Use of Prior Studies—Performing actual precision studies may not always be necessary, unless required by the user.
Analytical results from historical files can be used if they cover the matrix of concern and are available for review and evaluation.
In many instances, historical precision demonstrations alone or in combination with abbreviated studies will suffice.
9. Sensitivity
9.1 Definition of Sensitivity—Ability of the measurement system to yield valid measurements at the level of interest in the
samples of concern.
9.2 Demonstration of Sensitivity—Sensitivity is determined by showing that the measurement system can measure the substance
of interest at the level of interest in the matrix of concern.
9.3 Guidance on Demonstration of Sensitivity—Demonstration of adequate measurement system sensitivity may be made
through the conduct of new sensitivity determinations or through the evaluation of historical analytical results combined with a
verification of the test. Adequate sensitivity can be determined in a number of ways. The most valid approach is to analyze a matrix
spike or reference material that contains the analyte of interest at a level 0.2 to 0.5 times the level of interest and confirm the
measurement sensitivity. If the objective of the analysis is to determine if the samples contain 10 ppm or more of a particular
analyte, then demonstrating acceptable measurement system accuracy at a concentration of 2 to 5 ppm in the matrix of concern
determines that the measurement system has adequate sensitivity for the purpose at hand.
9.3.1 Alternative approaches to demonstrating sensitivity by either identifying the presence or determining the concentration of
the analyte of concern (whichever is germane to the situation at hand) include the following, if they are conducted in the matrix
of interest.
9.3.1.1 Level of detection determination:
(1) Critical Level Approach (Currie, 1968),
(2) Detection Limit Approach (Hubaux and Vos, 1970),
(3) Decision Limit Approach using Noncentral t-Distribution (Clayton et al., 1987),
(4) U.S. EPA Method Detection Limit Approach (40 CFR 136, Appendix B), and
(5) Weighted Least Squares Approach (Gibbons et al., 1997).
9.3.1.2 Level of quantitation determination.
9.3.2 Results from bias studies conducted at the concentration of interest can also be used to determine adequate system
sensitivity if a response significantly above the baseline is noted.
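As an illustration of one of the listed approaches, the U.S. EPA method detection limit (40 CFR 136, Appendix B) is computed from the standard deviation of replicate low-level spike results; the replicate values below are hypothetical and the Student's t value is taken from standard tables.

```python
import statistics

# Hypothetical seven replicate low-level spike results, mg/L.
replicates = [0.48, 0.52, 0.49, 0.55, 0.51, 0.47, 0.53]

# 40 CFR 136 Appendix B style MDL: s times the one-sided 99 %
# Student's t value for n - 1 degrees of freedom.
s = statistics.stdev(replicates)
t_99 = 3.143  # t(0.99, df = 6), from standard tables
mdl = t_99 * s

print(f"s = {s:.4f} mg/L, MDL = {mdl:.3f} mg/L")
```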
9.4 The level of effort that should be used to determine adequate sensitivity is dependent upon several factors, including but not
limited to:
9.4.1 The closeness that the estimated project sample concentration is to the known system sensitivity,
9.4.2 The variability of the sample matrix (presence of interference), and
9.4.3 Stability of the measurement system over the period of analysis.
9.4.4 For example, if the question being addressed is whether the discharge concentration is below the permit level of 1 mg/L,
then the demonstration only has to show that the system can yield accurate results at a level of 1 mg/L (this usually means showing
that reliable quantitation is achieved at 0.2 to 0.5 mg/L).
10. Selectivity
10.1 Definition of Selectivity—The ability to accurately measure the analyte in the presence of other sample matrix components
or analytical process reagents. Selectivity, or the lack thereof, is one aspect of bias in analytical measurements. In this guide,
selectivity is discussed as a separate subject because of its importance in developing accurate analytical information to support
project decisions. See Section 7 for a complete discussion of bias.
10.2 Demonstration of Selectivity:
10.2.1 Selectivity can usually be adequately determined by various approaches. These include the following:
10.2.1.1 Method of standard additions as found in 10.5.1,
10.2.1.2 Comparison of test results, in the concentration range of interest, for samples with and without potential
interferences,
10.2.1.3 Comparison to the results of a test that uses a different measurement principle, or
10.2.1.4 Demonstrating that potential interferences do not adversely affect the decision resulting from the use of the
measurements.
10.2.2 For example, a successful demonstration of selectivity might use samples of the matrix that contain the same potential
interferences that are present in actual samples to determine that the measurement system is free from:
10.2.2.1 Overlapping chromatographic peaks in the region where the compound of interest elutes,
10.2.2.2 Interfering ions in mass spectrum,
10.2.2.3 Overlapping spectral peaks in emission spectrometry,
10.2.2.4 Interfering absorbances in absorption spectrometry, and
10.2.2.5 Cross sensitivity in immunoassay.
10.3 Selectivity is not an absolute. This means that interfering substances can affect the reported results while the measurement
system is still considered suitable for its intended purpose. The interferences do not affect the usability of the measurements if the
sum of the target analyte and the absolute value of the interfering substance is below the action level for the project (this
can happen in chromatographic methods when the interferents co-elute with the target analyte). Spectroscopic methods can have
spectral interferences (emission, absorbance) in the region where the target analyte is measured. The method is suitable for its
intended use if the emission lines or spectral absorbance do not cause the measured value of the target analyte to be outside the
project-specific acceptability levels.
10.4 When demonstrating the effect of potential interferences, the direction of the likely interference needs to be taken into
account. Negative interferences decrease the measured response of the target analyte and lead to a result that is biased low. Positive
interferences increase the measured response and lead to a result that is biased high. Depending on the use of the measurements,
one type of interference may be allowable while the other may not. The project stakeholders should provide the laboratory with
any limitations on the allowable level of both positive and negative interferences. In the absence of such guidance, the laboratory
must document the level of interference found.
10.5 Guidance on Demonstration of Selectivity—Demonstration of freedom from interferences may be determined in a number
of ways. These include but are not limited to:
10.5.1 Use of Method of Standard Addition—The method of standard additions can determine selectivity for the target analytes
in samples whose matrices are unknown or have not been characterized previously. After the sample matrix (believed to contain
no or low levels of target analytes) has been through the sample preparation step, a series of spikes is added to the prepared
matrix solution. The plot of the concentration of added analyte versus measured concentration determines the presence of bias if
the slope of the linear regression best fit line deviates from the theoretical value of 1.0 for no interference. Values less than one
indicate a negative bias; values greater than one indicate a positive bias. The absence of bias is indicated if the slope is 1.0. Straight
line linear regression solutions occur when there is a one-to-one correlation between the biasing agent and the target analyte.
Curvilinear plots occur when there are more than two interferents with different levels of influence or when the effect of a single
interferent is nonlinear.
10.5.2 Demonstrating Absence of Known Interferents—The absence of known interferences can be determined by the direct
analysis of the sample for the interferents or by spiking the sample with known amounts of the potential interferents and measuring
their effect on the target analytes. The use of the second approach is restricted to cases where the quantitative correlation of bias
versus interferent concentration is known. This is necessary for the back calculation of the interferent concentration based on the
amount of bias determined by the addition of the interferent spikes to sample.
10.5.3 Verification of Results Using Fundamentally Different Measurement System—Analytical techniques that are fundamen-
tally different from the proposed measurement system can be used to determine that the proposed system is free from interferences.
This is because fundamentally different analytical techniques have differing responses to potential interferences. If the two
techniques give the same quantitative results, it is likely that the target analytes are free from interferences by both techniques. It
is important that comparisons be conducted on aliquots of the same sample material as the project
...