Standard Practice for Calibration/Verification of Linear Displacement Transducers for Geotechnical Purposes

SIGNIFICANCE AND USE
5.1 The displacement transducer plays an important role in geotechnical applications to measure change in dimensions of specimens.  
5.2 The displacement transducer must be calibrated/verified for use in the laboratory to ensure reliable conversions of the sensor's electrical output to engineering units.  
5.3 The displacement transducer should be calibrated/verified before initial use, at least annually thereafter, after any change in the electronic configuration that employs the sensor, after any significant change in test conditions using the transducer that differ from conditions during the last calibration/verification, and after any physical action on the transducer that might affect its response.  
5.4 A displacement transducer generally has a working range within which voltage output is linearly proportional to displacement of the transducer. This procedure is applicable to the linear range of the transducer. Recommended practice is to use the displacement transducer only within its linear working range.
Note 1: Verification as in Practices E2309/E2309M should not be confused with calibration.
SCOPE
1.1 This practice outlines the procedure for calibration/verification of displacement transducers and their readout systems for geotechnical purposes. It covers any transducer used to measure displacement, which gives an electrical output that is linearly proportional to displacement. This includes linear variable displacement transducers (LVDTs), linear displacement transducers (LDTs) and linear strain transducers (LSTs).  
1.2 This calibration/verification procedure is used to determine the relationship between output of the transducer and its readout system and change in length. This relationship is used to convert readings from the transducer readout system into engineering units.  
1.3 This calibration/verification procedure also is used to determine the accuracy of the transducer and its readout system over the range of its use to compare with the manufacturer’s specifications for the instrument and the suitability of the instrument for a specific application.  
1.4 Units—The values stated in either SI units or inch-pound units given in brackets are to be regarded separately as the standard. The values stated in each system may not be exact equivalents; therefore, each system shall be used independently of the other. Combining values from the two systems may result in non-conformance with the standard.
1.5 All observed and calculated values shall conform to the guidelines for significant digits and rounding established in Practice D6026 unless superseded by this standard.  
1.5.1 The procedures used to specify how data are collected, recorded or calculated in this standard are regarded as the industry standard. In addition they are representative of the significant digits that generally should be retained. The procedures used do not consider material variation, purpose for obtaining the data, special purpose studies, or any consideration for the user’s objectives; it is common practice to increase or reduce significant digits of reported data to be commensurate with these considerations. It is beyond the scope of this standard to consider significant digits used in analytical methods for engineering design.  
1.6 This practice offers a set of instructions for performing one or more specific operations. This standard cannot replace education or experience and should be used in conjunction with professional judgment. Not all aspects of this practice may be applicable in all circumstances. This ASTM standard is not intended to represent or replace the standard of care by which the adequacy of a given professional service must be judged, nor should this document be applied without consideration of a project’s many unique aspects. The word “standard” in the title of this document means only that the document has been approved through the ASTM consensus process.  
1.7 This standa...

General Information

Status
Published
Publication Date
31-Dec-2023
Technical Committee
D18 - Soil and Rock

Relations

Effective Date
01-Jan-2024

Overview

ASTM D6027/D6027M-24 is the internationally recognized standard practice for calibration and verification of linear displacement transducers used in geotechnical applications. Published by ASTM International, this standard outlines procedures to ensure the accuracy and reliability of transducers that convert linear displacement into electrical signals. Such calibration and verification are essential for converting sensor output into precise engineering units, supporting reliable testing and data collection in the analysis of soil and rock specimens. The standard applies to devices such as Linear Variable Displacement Transducers (LVDTs), Linear Displacement Transducers (LDTs), and Linear Strain Transducers (LSTs).
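
As a concrete illustration of that conversion (a sketch only; the calibration factor, offset, and range limits below are hypothetical, not values from the standard), a readout voltage can be mapped to displacement once a linear calibration has been established:

```python
# Hypothetical conversion of a transducer readout (volts) to displacement (mm)
# using a previously established linear calibration. All numeric values are
# illustrative, not taken from ASTM D6027/D6027M-24.

CAL_SLOPE_MM_PER_V = 2.540    # calibration factor from a prior calibration (mm/V)
CAL_OFFSET_MM = -0.012        # zero offset from the same calibration (mm)
LINEAR_RANGE_V = (-5.0, 5.0)  # example linear working range of the sensor (V)

def to_engineering_units(voltage_v: float) -> float:
    """Convert a readout voltage to displacement, refusing out-of-range input."""
    lo, hi = LINEAR_RANGE_V
    if not lo <= voltage_v <= hi:
        raise ValueError(f"{voltage_v} V is outside the linear range {LINEAR_RANGE_V}")
    return CAL_SLOPE_MM_PER_V * voltage_v + CAL_OFFSET_MM

displacements_mm = [to_engineering_units(v) for v in (0.000, 0.394, 1.181)]
```

Readings outside the assumed linear working range are rejected rather than converted, in line with the recommendation to use the transducer only within that range.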

Key Topics

  • Purpose of Calibration/Verification
    Calibration and verification establish the precise relationship between a transducer's electrical output and the physical displacement measured. This ensures trustworthy conversion to engineering units and is critical before initial use, after significant system modifications, or at prescribed intervals.

  • Range and Linearity
    Displacement transducers typically operate within a linear range where output is directly proportional to displacement. The standard recommends using transducers only within this linear working range.

  • Procedures
    The document details two principal calibration methods:

    • Using precision gauge blocks traceable to national or international standards
    • Utilizing a calibrated micrometer fixture

    Both methods require controlled environments to minimize errors due to temperature or electronic instability.

  • Frequency of Calibration/Verification
    Transducers should be calibrated or verified:

    • Before first use
    • At least annually
    • After any system changes
    • When test conditions deviate significantly from the previous calibration
    • After potential physical impacts affecting performance

  • Calculation of Error and Accuracy
    Procedures are given for determining calibration factors, linearity, percent error, and ensuring results meet specified tolerances for geotechnical testing.

  • Documentation and Reporting
    Clear records must be maintained, including calibration methods, equipment identifiers, environmental conditions, and detailed results, supporting traceability and compliance.
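
The calibration-factor and percent-error calculations listed under Key Topics can be made concrete with a short sketch; the paired data and the regression fit below are invented for illustration (and `statistics.linear_regression` requires Python 3.10+):

```python
# Sketch of the error calculation: fit sensor output vs. applied displacement
# by least-squares linear regression, then express each residual as a percent
# of the applied displacement. Data values are illustrative only.
from statistics import linear_regression

applied_mm = [0.0, 2.0, 4.0, 6.0, 8.0]       # from gauge blocks or micrometer
output_v   = [0.01, 0.99, 2.02, 3.00, 3.99]  # readout-system values

slope, intercept = linear_regression(applied_mm, output_v)  # slope in V/mm

def percent_error(applied_mm_i: float, measured_v: float) -> float:
    """Percent error: |predicted - applied| / applied * 100."""
    predicted_mm = (measured_v - intercept) / slope
    return abs(predicted_mm - applied_mm_i) / applied_mm_i * 100.0

# Skip the zero point to avoid dividing by zero applied displacement.
errors = [percent_error(d, v) for d, v in zip(applied_mm[1:], output_v[1:])]
worst = max(errors)
```

The worst-case percent error is then compared against the tolerance required for the specific use of the sensor.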
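
The documentation items above can likewise be captured in a minimal machine-readable record; the field names and values here are our own invention, not a format prescribed by the standard:

```python
# Sketch of a calibration record capturing the documentation items listed
# above. Field names and values are illustrative, not mandated by the standard.
import json
from datetime import date

record = {
    "transducer_id": "LVDT-001",           # hypothetical equipment identifier
    "method": "A (precision gauge blocks)",
    "date": date(2024, 3, 1).isoformat(),
    "temperature_c": 21.5,                 # environmental conditions
    "calibration_factor_mm_per_v": 2.540,  # example result
    "max_percent_error": 0.4,
    "operator": "J. Doe",
}

report = json.dumps(record, indent=2)  # serialized record, ready to archive
```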

Applications

  • Geotechnical Testing Laboratories
    Used extensively in laboratories measuring deformations in soil and rock, ensuring accurate data for engineering analysis, construction quality control, and research.

  • Instrumentation Verification
    Calibration per ASTM D6027/D6027M-24 is critical for agencies involved in geotechnical instrumentation, supporting the ongoing accuracy of sensors in various test setups.

  • Project Compliance
    Agencies fulfilling project-specific requirements benefit from regular calibration, as many engineering standards demand traceable, reliable instrumentation data for certification and reporting.

  • Maintenance of Equipment
    The standard guides routine and event-based calibration, thereby extending the service life of transducers and preventing costly measurement errors.

Related Standards

  • ASTM D6026
    Practice for Using Significant Digits and Data Records in Geotechnical Data

  • ASTM D653
    Terminology Relating to Soil, Rock, and Contained Fluids

  • ASTM D3740
    Practice for Minimum Requirements for Agencies Engaged in Testing and/or Inspection of Soil and Rock

  • ASTM E2309/E2309M
    Practices for Verification of Displacement Measuring Systems and Devices Used in Material Testing Machines

Practical Value

Implementing ASTM D6027/D6027M-24 ensures that linear displacement transducers used in geotechnical analysis provide reliable, accurate, and traceable measurements. These practices minimize measurement uncertainty and risk, directly impacting the safety, quality, and compliance of geotechnical engineering projects. Regular calibration and verification as prescribed in this standard are fundamental for any laboratory or organization involved in the testing, analysis, or certification of soil and rock, supporting sound engineering decisions and regulatory requirements.

Keywords: calibration, displacement transducer, geotechnical, verification, LVDT, instrumentation accuracy, ASTM D6027/D6027M-24, laboratory practice.

Buy Documents

Standard

ASTM D6027/D6027M-24 - Standard Practice for Calibration/Verification of Linear Displacement Transducers for Geotechnical Purposes

English language (10 pages)
Standard

REDLINE ASTM D6027/D6027M-24 - Standard Practice for Calibration/Verification of Linear Displacement Transducers for Geotechnical Purposes

English language (10 pages)


Frequently Asked Questions

ASTM D6027/D6027M-24 is a standard published by ASTM International. Its full title is “Standard Practice for Calibration/Verification of Linear Displacement Transducers for Geotechnical Purposes”. The standard covers the significance and use, scope, and related requirements summarized in the sections above.

ASTM D6027/D6027M-24 is classified under the following ICS (International Classification for Standards) categories: 17.020 - Metrology and measurement in general. The ICS classification helps identify the subject area and facilitates finding related standards.

ASTM D6027/D6027M-24 has the following relationships with other standards: it has inter-standard links to ASTM D6027/D6027M-15, ASTM D8259/D8259M-21, ASTM D4546-21, ASTM D8292-20, ASTM D2435/D2435M-11(2020), ASTM D3080/D3080M-23, and ASTM D4729-19. Understanding these relationships helps ensure you are using the most current and applicable version of the standard.

ASTM D6027/D6027M-24 is available in PDF format for immediate download after purchase. The document can be added to your cart and obtained through the secure checkout process. Digital delivery ensures instant access to the complete standard document.

Standards Content (Sample)


This international standard was developed in accordance with internationally recognized principles on standardization established in the Decision on Principles for the Development of International Standards, Guides and Recommendations issued by the World Trade Organization Technical Barriers to Trade (TBT) Committee.

Designation: D6027/D6027M − 24

Standard Practice for Calibration/Verification of Linear Displacement Transducers for Geotechnical Purposes

This standard is issued under the fixed designation D6027/D6027M; the number immediately following the designation indicates the year of original adoption or, in the case of revision, the year of last revision. A number in parentheses indicates the year of last reapproval. A superscript epsilon (´) indicates an editorial change since the last revision or reapproval.
1. Scope*

1.1 This practice outlines the procedure for calibration/verification of displacement transducers and their readout systems for geotechnical purposes. It covers any transducer used to measure displacement, which gives an electrical output that is linearly proportional to displacement. This includes linear variable displacement transducers (LVDTs), linear displacement transducers (LDTs) and linear strain transducers (LSTs).

1.2 This calibration/verification procedure is used to determine the relationship between output of the transducer and its readout system and change in length. This relationship is used to convert readings from the transducer readout system into engineering units.

1.3 This calibration/verification procedure also is used to determine the accuracy of the transducer and its readout system over the range of its use to compare with the manufacturer’s specifications for the instrument and the suitability of the instrument for a specific application.

1.4 Units—The values stated in either SI units or inch-pound units given in brackets are to be regarded separately as the standard. The values stated in each system may not be exact equivalents; therefore, each system shall be used independently of the other. Combining values from the two systems may result in non-conformance with the standard.

1.5 All observed and calculated values shall conform to the guidelines for significant digits and rounding established in Practice D6026 unless superseded by this standard.
1.5.1 The procedures used to specify how data are collected, recorded or calculated in this standard are regarded as the industry standard. In addition they are representative of the significant digits that generally should be retained. The procedures used do not consider material variation, purpose for obtaining the data, special purpose studies, or any consideration for the user’s objectives; it is common practice to increase or reduce significant digits of reported data to be commensurate with these considerations. It is beyond the scope of this standard to consider significant digits used in analytical methods for engineering design.

1.6 This practice offers a set of instructions for performing one or more specific operations. This standard cannot replace education or experience and should be used in conjunction with professional judgment. Not all aspects of this practice may be applicable in all circumstances. This ASTM standard is not intended to represent or replace the standard of care by which the adequacy of a given professional service must be judged, nor should this document be applied without consideration of a project’s many unique aspects. The word “standard” in the title of this document means only that the document has been approved through the ASTM consensus process.

1.7 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety, health, and environmental practices and determine the applicability of regulatory limitations prior to use.

1.8 This international standard was developed in accordance with internationally recognized principles on standardization established in the Decision on Principles for the Development of International Standards, Guides and Recommendations issued by the World Trade Organization Technical Barriers to Trade (TBT) Committee.

2. Referenced Documents

2.1 ASTM Standards:
D653 Terminology Relating to Soil, Rock, and Contained Fluids
D3740 Practice for Minimum Requirements for Agencies Engaged in Testing and/or Inspection of Soil and Rock as Used in Engineering Design and Construction
D6026 Practice for Using Significant Digits and Data Records in Geotechnical Data

This practice is under the jurisdiction of ASTM Committee D18 on Soil and Rock and is the direct responsibility of Subcommittee D18.95 on Information Retrieval and Data Automation.
Current edition approved Jan. 1, 2024. Published January 2024. Originally approved in 1996. Last previous edition approved in 2015 as D6027-15. DOI: 10.1520/D6027_D6027M-24.
For referenced ASTM standards, visit the ASTM website, www.astm.org, or contact ASTM Customer Service at service@astm.org. For Annual Book of ASTM Standards volume information, refer to the standard’s Document Summary page on the ASTM website.
*A Summary of Changes section appears at the end of this standard.
Copyright © ASTM International, 100 Barr Harbor Drive, PO Box C700, West Conshohocken, PA 19428-2959, United States
E2309/E2309M Practices for Verification of Displacement Measuring Systems and Devices Used in Material Testing Machines

3. Terminology

3.1 Definitions:
3.1.1 Definitions of common technical terms used in this practice refer to Terminology D653. For definitions of common metrology terms used in this standard, refer to the International Vocabulary of Metrology.
3.1.2 calibrated range, n—distance over which the linear displacement sensor system is calibrated.
3.1.3 calibration, n—operation that, under specified conditions, in a first step, establishes a relation between the quantity values with measurement uncertainties provided by measurement standards and corresponding indications with associated measurement uncertainties and, in a second step, uses this information to establish a relation for obtaining a measurement result from an indication.
3.1.4 displacement measurement standard, n—system consisting of micrometer or Precision Gauge Blocks combined with an appropriate device for indicating the magnitude (or a quantity proportional to the magnitude) of deformation of the member under an applied displacement.
3.1.5 displacement transducer, n—an electrical transducer which converts linear displacement to electrical output.
3.1.6 metrological traceability, n—property of a measurement result whereby the result can be related to a reference through a documented unbroken chain of calibrations, each contributing to the measurement uncertainty.
3.1.7 power supply, n—a voltage source with output equal to that required by the sensor.
3.1.8 readout system, n—electronic equipment that accepts output from the signal conditioner for the transducer and provides a visual display or digital record of the transducer output.
3.1.9 signal conditioner, n—electronic equipment that makes the output of the transducer compatible with the readout system. The signal conditioner may also filter the transducer output to remove noise.
3.1.10 total linear range (TLR), n—total distance that the core may move from the position of maximum voltage output to the position of minimum voltage output with a linear relationship between displacement and voltage.
3.1.11 transducer, n—device, used in measurement, that provides an output quantity having a specified relation to the input quantity.
3.1.12 verification, n—provision of objective evidence that a given item fulfils specified requirements.

3.2 Definitions of Terms Specific to This Standard:
3.2.1 core, n—central rod that moves in and out of the transducer body.
3.2.2 null position, n—the core position within the sensor body at which the transducer voltage output is zero (some transducers may not have a null position).
3.2.3 percent error of displacement, E_r, n—the ratio of the displacement measurement error to the applied displacement as measured by the displacement measurement standard, expressed as a percent.

4. Summary of Practice

4.1 A displacement transducer is mounted in such manner to permit it to be subjected to a precise, known displacement.
4.2 Displacement is applied in steps over the full range of the transducer and readings taken from the readout device.
4.3 The slope of the best-fit straight line relating sensor readout data to displacement is determined by linear regression.
4.4 The percent error of the transducer readout system is calculated and compared with the requirements for the specific use of the sensor.
4.5 See Appendix X1 for a flowchart of the calibration/verification process.
4.6 See Appendix X2 for identifying and determining the measurement uncertainty components during a displacement transducer system calibration/verification.

5. Significance and Use

5.1 The displacement transducer plays an important role in geotechnical applications to measure change in dimensions of specimens.
5.2 The displacement transducer must be calibrated/verified for use in the laboratory to ensure reliable conversions of the sensor’s electrical output to engineering units.
5.3 The displacement transducer should be calibrated/verified before initial use, at least annually thereafter, after any change in the electronic configuration that employs the sensor, after any significant change in test conditions using the transducer that differ from conditions during the last calibration/verification, and after any physical action on the transducer that might affect its response.
5.4 A displacement transducer generally has a working range within which voltage output is linearly proportional to displacement of the transducer. This procedure is applicable to the linear range of the transducer. Recommended practice is to use the displacement transducer only within its linear working range.

NOTE 1—Verification as in Practices E2309/E2309M should not be confused with calibration.

6. Apparatus

6.1 Linear Displacement Transducer, to be calibrated.
6.2 Power Supply with Output, equal to that required by the sensor.

These practices are under the jurisdiction of ASTM Committee E28 on Mechanical Testing and are the direct responsibility of Subcommittee E28.01 on Calibration of Mechanical Testing Machines and Apparatus.
International vocabulary of metrology—Basic and general concepts and associated terms (VIM), 3rd Edition. Joint Committee for Guides in Metrology, 2012. https://www.bipm.org/utils/common/documents/jcgm/JCGM_200_2012.pdf
D6027/D6027M − 24
NOTE 2—Some LVDTs use ac voltage while others use dc. The LVDTs
and displacement transducer may be damaged if connected to the incorrect
power supply.
6.3 Signal Conditioning, Readout Equipment, and Related
Cables and Fittings.
6.4 Test Method A—Precision Gauge Block Calibration/
Verification:
6.4.1 Precision Gauge Blocks, a set of precision reference
blocks traceable to the National Institute for Standards and
Technology or another recognized standard agency. A gauge
block set shall contain sizes necessary to perform satisfactorily
the calibration procedures as outlined in Section 9 over the
total linear range of the transducer (otherwise the test cannot be
performed).
FIG. 2 Micrometer Fixture
6.4.2 Comparator Stand, consisting of a base of warp-free stability and ground to a guaranteed flatness, a support column, and an adjustable arm onto which the sensor mounting block can be securely attached. Alternatively, mount the sensor in the configuration in which it will be used, such that gauge blocks can be inserted to displace the core for calibration/verification purposes, as shown in Fig. 1.
6.4.3 Sensor Mounting Block, a device used to attach the sensor to the comparator stand. Alternatively, mount the sensor to the test equipment in which the transducer is to be used. The mounting holder shall be antiferromagnetic.
6.5 Test Method B—Micrometer Fixture Calibration/Verification:
6.5.1 Micrometer Fixture, a precision instrument for linear measurement capable of obtaining readings over the total linear range of the displacement transducer, as shown in Fig. 2. The spindle must be nonrotating. The micrometer fixture is to be calibrated annually by the manufacturer or other qualified personnel.
7. Hazards
7.1 Safety Hazards:
7.1.1 This practice involves electrical equipment. Verify that all electrical wiring is connected properly and that the power supply and signal conditioner are grounded properly to prevent electrical shock to the operator. Take necessary precautions to avoid exposure to power signals.
7.2 Safety Precautions:
7.2.1 Examine the sensor body for burrs or sharp edges, or both. Remove any protrusions that might cause harm.
7.2.2 The transducer can be permanently damaged if incorrectly connected to the power supply or if connected to a power supply with the wrong excitation level.
7.2.3 Follow the manufacturer's recommendations with regard to safety.
7.3 Technical Precautions:
7.3.1 The core and body of the displacement transducer are a matched set. For best performance, do not interchange cores with other displacement transducer bodies.
7.3.2 Replace the core and body if either shows any signs of dents, bending, or other defects that may affect performance of the device.
7.3.3 Store the body and core in a protective case when not in use.
7.3.4 Do not exceed the allowable input voltage of the sensor as specified by the manufacturer.
7.3.5 Do not connect a voltage source to the output leads of the sensor.
7.3.6 Do not over tighten the sensor within the mounting block.
7.3.7 The behavior of some transducers may be affected by metallic holders. If possible, the working holder should be used during calibration/verification.
8. Calibration/Verification and Standardization
8.1 Calibration of Linear Displacement Transducers for Geotechnical Purposes shall be performed in accordance with D6027/D6027M.
8.1.1 If using Test Method A, verify that the gauge blocks are of sufficient precision and bias and in a clean, unscratched condition.
8.1.2 If using Test Method B, verify that the micrometer fixture is in good working order and of sufficient precision and bias.
FIG. 1 Comparator Stand Fixture
D6027/D6027M − 24
8.2 Verification of Linear Displacement Transducers for Geotechnical Purposes shall be performed in accordance with Practices E2309/E2309M.
9. Procedure
9.1 Perform this calibration/verification in an environment as close as possible to that in which the sensor will be used. The displacement transducer, calibration/verification gauge blocks, micrometer fixture, and comparator stand shall be in the environment in which they are to be calibrated/verified for a sufficient length of time, or for at least 1 hour prior to calibration/verification, to stabilize temperature effects.
9.2 Verify that the power supply is adjusted to supply the recommended voltage to the sensor.
9.3 With equipment turned off, connect all power supply, signal conditioning, and recording equipment exactly as it will be used in service. Allow all electronics to warm up for a sufficient length of time, or for at least 30 min, before beginning any readings.
9.4 Record the type and serial number of the sensor to be calibrated/verified. If it has no serial number, record the model number and other identifying markings.
9.5 Record the maximum allowable input voltage specified by the manufacturer and the input voltage used for this calibration/verification.
9.6 Record the total linear range of the sensor and the range over which the transducer will be calibrated.
9.7 Record the type and serial number of the reference standard used.
9.8 Test Method A—Precision Gauge Block Calibration/Verification:
9.8.1 Attach the sensor mounting block to the adjustable arm of the comparator stand as shown in Fig. 1, or mount into test equipment as it will be used in service.
9.8.2 Slide the displacement transducer core and core extension rod assembly into the displacement transducer body.
NOTE 3—Some displacement transducers require the core to be in place before powering the displacement transducer.
NOTE 4—The displacement transducer should be checked for freedom of movement, particularly if the core rod and body of the transducer are manufactured as a single unit with a spring-loaded core, as part of the calibration/verification procedure.
9.8.3 Place the sensor body into the sensor mounting block and tighten the appropriate screw on the mounting block. Do not over tighten the screw on the mounting block. This can damage the sensor body.
9.8.4 Place a gauge block (or series of blocks) that has a height equal to the total linear range of the sensor under the core.
9.8.5 Adjust the sensor body up or down on the comparator stand support column as necessary to obtain a reading on the readout equipment that is approximately equal to the reading for the transducer with the core pushed into the transducer body to the end of the linear operating range as indicated by the manufacturer's calibration data.
9.8.6 Secure the adjustable arm on the support column of the comparator stand in this position by tightening the screw of the adjustable arm.
9.8.7 Remove the gauge block (or series of blocks) from beneath the core rod and allow the core rod to rest on the comparator base. Note the transducer reading with the core now extended from the transducer body. Logically, it shall equal approximately the voltage reading for the transducer with the core pulled from the transducer body to the end of the linear operating range as indicated by the manufacturer's calibration data (if it does not, something is wrong).
9.8.8 Record the value of sensor output as the sensor reading for zero displacement.
9.8.9 Select appropriate gauge blocks to displace the core through its total linear range in steps. It is recommended that a minimum of five readings equally spaced throughout the sensor total linear range be used.
9.8.10 Raise the core rod and place the appropriate gauge block(s) on the comparator stand base beneath the core rod in a manner to raise the core incrementally, step by step, into the transducer body.
9.8.11 Record the gauge block height in Column 1 and the corresponding output of the sensor readout equipment in Column 2, as shown in Table 1.
9.8.12 Continue to add gauge blocks at the selected displacement increments and record readings in Table 1 until the core has been displaced through its total linear range.
9.8.13 Remove the gauge blocks in reverse order and record readings in Table 1 until the core again rests on the comparator base.
9.8.14 Repeat these steps a minimum of two times to obtain data on repeatability.
9.8.15 Calculate the calibration factor, linearity, computed displacement, and maximum percent error for each reading, as described in Section 10.
9.9 Test Method B—Micrometer Fixture Calibration/Verification:
9.9.1 Secure the sensor body into the chuck of the micrometer fixture. Do not over tighten the chuck around the sensor
...


This document is not an ASTM standard and is intended only to provide the user of an ASTM standard an indication of what changes have been made to the previous version. Because
it may not be technically possible to adequately depict all changes accurately, ASTM recommends that users consult prior editions as appropriate. In all cases only the current version
of the standard as published by ASTM is to be considered the official document.
Designation: D6027/D6027M − 24
Standard Practice for
Calibration/Verification of Linear Displacement Transducers for Geotechnical Purposes
This standard is issued under the fixed designation D6027/D6027M; the number immediately following the designation indicates the
year of original adoption or, in the case of revision, the year of last revision. A number in parentheses indicates the year of last
reapproval. A superscript epsilon (´) indicates an editorial change since the last revision or reapproval.
1. Scope*
1.1 This practice outlines the procedure for calibration/verification of displacement transducers and their readout
systems for geotechnical purposes. It covers any transducer used to measure displacement, which gives an electrical output that
is linearly proportional to displacement. This includes linear variable displacement transducers (LVDTs), linear displacement
transducers (LDTs) and linear strain transducers (LSTs).
1.2 This calibration/verification procedure is used to determine the relationship between output of the transducer and
its readout system and change in length. This relationship is used to convert readings from the transducer readout system into
engineering units.
1.3 This calibration/verification procedure also is used to determine the accuracy of the transducer and its readout
system over the range of its use to compare with the manufacturer’s specifications for the instrument and the suitability of the
instrument for a specific application.
1.4 Units—The values stated in either SI units or inch-pound units given in brackets are to be regarded separately as the standard.
The values stated in each system may not be exact equivalents; therefore, each system shall be used independently of the other.
Combining values from the two systems may result in non-conformance with the standard.
1.5 All observed and calculated values shall conform to the guidelines for significant digits and rounding established in Practice
D6026 unless superseded by this standard.
1.5.1 The procedures used to specify how data are collected, recorded or calculated in this standard are regarded as the industry
standard. In addition they are representative of the significant digits that generally should be retained. The procedures used do not
consider material variation, purpose for obtaining the data, special purpose studies, or any consideration for the user’s objectives;
it is common practice to increase or reduce significant digits of reported data to be commensurate with these considerations. It is beyond the scope of this standard to consider significant digits used in analytical methods for engineering design.
1.6 This practice offers a set of instructions for performing one or more specific operations. This standard cannot replace education
or experience and should be used in conjunction with professional judgment. Not all aspects of this practice may be applicable in
all circumstances. This ASTM standard is not intended to represent or replace the standard of care by which the adequacy of a
given professional service must be judged, nor should this document be applied without consideration of a project's many unique
aspects. The word "standard" in the title of this document means only that the document has been approved through the ASTM
consensus process.
This practice is under the jurisdiction of ASTM Committee D18 on Soil and Rock and is the direct responsibility of Subcommittee D18.95 on Information Retrieval and
Data Automation.
Current edition approved Jan. 1, 2024. Published January 2024. Originally approved in 1996. Last previous edition approved in 2015 as D6027-15. DOI: 10.1520/D6027_D6027M-24.
*A Summary of Changes section appears at the end of this standard
Copyright © ASTM International, 100 Barr Harbor Drive, PO Box C700, West Conshohocken, PA 19428-2959, United States
1.7 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility
of the user of this standard to establish appropriate safety, health, and environmental practices and determine
the applicability of regulatory limitations prior to use.
1.8 This international standard was developed in accordance with internationally recognized principles on standardization
established in the Decision on Principles for the Development of International Standards, Guides and Recommendations issued
by the World Trade Organization Technical Barriers to Trade (TBT) Committee.
2. Referenced Documents
2.1 ASTM Standards:
D653 Terminology Relating to Soil, Rock, and Contained Fluids
D3740 Practice for Minimum Requirements for Agencies Engaged in Testing and/or Inspection of Soil and Rock as Used in
Engineering Design and Construction
D6026 Practice for Using Significant Digits and Data Records in Geotechnical Data
E2309/E2309M Practices for Verification of Displacement Measuring Systems and Devices Used in Material Testing Machines
3. Terminology
3.1 Definitions:
3.1.1 For definitions of common technical terms used in this practice, refer to Terminology D653. For definitions of common
metrology terms used in this standard, refer to the International Vocabulary of Metrology.
3.1.2 calibrated range, n—distance over which the linear displacement sensor system is calibrated.
3.1.3 calibration, n—operation that, under specified conditions, in a first step, establishes a relation between the quantity values
with measurement uncertainties provided by measurement standards and corresponding indications with associated measurement
uncertainties and, in a second step, uses this information to establish a relation for obtaining a measurement result from an
indication.
3.1.4 displacement measurement standard, n—system consisting of micrometer or Precision Gauge Blocks combined with an
appropriate device for indicating the magnitude (or a quantity proportional to the magnitude) of deformation of the member under
an applied displacement.
3.1.5 displacement transducer, n—an electrical transducer which converts linear displacement to electrical output.
3.1.6 metrological traceability, n—property of a measurement result whereby the result can be related to a reference through a
documented unbroken chain of calibrations, each contributing to the measurement uncertainty.
3.1.7 power supply, n—a voltage source with output equal to that required by the sensor.
3.1.8 readout system, n—electronic equipment that accepts output from the signal conditioner for the transducer and provides a
visual display or digital record of the transducer output.
For referenced ASTM standards, visit the ASTM website, www.astm.org, or contact ASTM Customer Service at service@astm.org. For Annual Book of ASTM Standards
volume information, refer to the standard’s Document Summary page on the ASTM website.
These practices are under the jurisdiction of ASTM Committee E28 on Mechanical Testing and is the direct responsibility of Subcommittee E28.01 on Calibration of
Mechanical Testing Machines and Apparatus.
International vocabulary of metrology—Basic and general concepts and associated terms (VIM), 3rd Edition. Joint Committee for Guides in Metrology, 2012.
https://www.bipm.org/utils/common/documents/jcgm/JCGM_200_2012.pdf
3.1.9 signal conditioner, n—electronic equipment that makes the output of the transducer compatible with the readout system. The
signal conditioner may also filter the transducer output to remove noise.
3.1.10 total linear range (TLR), n—total distance that the core may move from the position of maximum voltage output to the
position of minimum voltage output with a linear relationship between displacement and voltage.
3.1.11 transducer, n—device, used in measurement, that provides an output quantity having a specified relation to the input
quantity.
3.1.12 verification, n—provision of objective evidence that a given item fulfils specified requirements.
3.2 Definitions of Terms Specific to This Standard:
3.2.1 core, n—central rod that moves in and out of the transducer body.
3.2.2 null position, n—the core position within the sensor body at which the transducer voltage output is zero (some transducers
may not have a null position).
3.2.3 percent error of displacement, E_r, n—the ratio of the displacement measurement error to the applied displacement as
measured by the displacement measurement standard, expressed as a percent.
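For illustration only, the definition above translates directly into code. This is a minimal sketch with hypothetical values, not part of the standard; the function name and the 5.012 mm / 5.000 mm readings are invented for the example.

```python
def percent_error_of_displacement(computed_displacement, applied_displacement):
    """E_r: ratio of the displacement measurement error (computed minus
    applied) to the applied displacement, expressed as a percent.
    The applied displacement is the value given by the displacement
    measurement standard (gauge blocks or micrometer fixture)."""
    error = computed_displacement - applied_displacement
    return error / applied_displacement * 100.0

# Hypothetical reading: the system computes 5.012 mm for a 5.000 mm
# gauge block stack, giving E_r of 0.24 %.
Er = percent_error_of_displacement(5.012, 5.000)
```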
4. Summary of Practice
4.1 A displacement transducer is mounted in such a manner as to permit it to be subjected to a precise, known displacement.
4.2 Displacement is applied in steps over the full range of the transducer and readings taken from the readout device.
4.3 The slope of the best-fit straight line relating sensor readout data to displacement is determined by linear regression.
4.4 The percent error of the transducer readout system is calculated and compared with the requirements for the specific use of
the sensor.
4.5 See Appendix X1 for a flowchart of the calibration/verification process.
4.6 See Appendix X2 for identifying and determining the measurement uncertainty components during a displacement transducer
system calibration/verification.
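The regression in 4.3 and the error check in 4.4 can be sketched numerically. The data below are hypothetical (a perfectly linear 25 mm / ±5 V sensor), and the exact formulas of Section 10 are not reproduced here; this is only an illustration of the least-squares fit the practice describes.

```python
def best_fit_line(displacements, readings):
    """Ordinary least-squares fit of sensor readings (e.g. volts) against
    applied displacement. Returns (slope, intercept) of the line
    reading = slope * displacement + intercept."""
    n = len(displacements)
    sx, sy = sum(displacements), sum(readings)
    sxx = sum(x * x for x in displacements)
    sxy = sum(x * y for x, y in zip(displacements, readings))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# Hypothetical calibration data: gauge-block displacement (mm) vs output (V).
disp = [0.0, 5.0, 10.0, 15.0, 20.0, 25.0]
volts = [-5.0, -3.0, -1.0, 1.0, 3.0, 5.0]

slope, intercept = best_fit_line(disp, volts)   # V per mm of displacement
calibration_factor = 1.0 / slope                # mm per V, to convert readings

# Computed displacement for each reading, and its percent error relative
# to the applied displacement (the zero point is skipped in the ratio).
computed = [(v - intercept) * calibration_factor for v in volts]
errors = [abs(c - d) / d * 100.0 for c, d in zip(computed, disp) if d > 0]
```

With real data the readings are not perfectly linear, and the largest of these per-point errors is what gets compared against the requirements for the specific use of the sensor.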
5. Significance and Use
5.1 The displacement transducer plays an important role in geotechnical applications to measure change in dimensions of
specimens.
5.2 The displacement transducer must be calibrated/verified for use in the laboratory to ensure reliable conversions of
the sensor’s electrical output to engineering units.
5.3 The displacement transducer should be calibrated/verified before initial use, at least annually thereafter, after any
change in the electronic configuration that employs the sensor, after any significant change in test conditions using the transducer
that differ from conditions during the last calibration/verification, and after any physical action on the transducer that
might affect its response.
5.4 A displacement transducer generally has a working range within which voltage output is linearly proportional to displacement
of the transducer. This procedure is applicable to the linear range of the transducer. Recommended practice is to use the
displacement transducer only within its linear working range.
NOTE 1—Verification as in Practices E2309/E2309M should not be confused with calibration.
6. Apparatus
6.1 Linear Displacement Transducer, to be calibrated.
6.2 Power Supply with Output, equal to that required by the sensor.
NOTE 2—Some LVDTs use ac voltage while others use dc. LVDTs and displacement transducers may be damaged if connected to the incorrect power
supply.
6.3 Signal Conditioning, Readout Equipment, and Related Cables and Fittings.
6.4 Test Method A—Precision Gauge Block Calibration/Verification:
6.4.1 Precision Gauge Blocks, a set of precision reference blocks traceable to the National Institute of Standards and Technology
or another recognized standard agency. A gauge block set shall contain sizes necessary to perform satisfactorily the
calibration procedures as outlined in Section 9 over the total linear range of the transducer (otherwise the test cannot
be performed).
6.4.2 Comparator Stand, consisting of a base of warp-free stability and ground to a guaranteed flatness, a support column, and
an adjustable arm onto which the sensor mounting block can be securely attached. Alternatively, mount the sensor in the
configuration in which it will be used, such that gauge blocks can be inserted to displace the core for calibration/verification
purposes, as shown in Fig. 1.
6.4.3 Sensor Mounting Block, a device used to attach the sensor to the comparator stand. Alternatively, mount the sensor to the
test equipment in which the transducer is to be used. The mounting holder shall be antiferromagnetic.
6.5 Test Method B—Micrometer Fixture Calibration/Verification:
6.5.1 Micrometer Fixture, a precision instrument for linear measurement capable of obtaining readings over the total linear range
of the displacement transducer, as shown in Fig. 2. The spindle must be nonrotating. The micrometer fixture is to be calibrated
annually by the manufacturer or other qualified personnel.
FIG. 1 Comparator Stand Fixture
7. Hazards
7.1 Safety Hazards:
7.1.1 This practice involves electrical equipment. Verify that all electrical wiring is connected properly and that the power supply
and signal conditioner are grounded properly to prevent electrical shock to the operator. Take necessary precautions to avoid
exposure to power signals.
7.2 Safety Precautions:
7.2.1 Examine the sensor body for burrs or sharp edges, or both. Remove any protrusions that might cause harm.
7.2.2 The transducer can be permanently damaged if incorrectly connected to the power supply or if connected to a power supply
with the wrong excitation level.
7.2.3 Follow the manufacturer’s recommendations with regard to safety.
7.3 Technical Precautions:
7.3.1 The core and body of the displacement transducer are a matched set. For best performance, do not interchange cores with
other displacement transducer bodies.
7.3.2 Replace the core and body if either shows any signs of dents, bending, or other defects that may affect performance of the
device.
7.3.3 Store the body and core in a protective case when not in use.
7.3.4 Do not exceed the allowable input voltage of the sensor as specified by the manufacturer.
7.3.5 Do not connect a voltage source to the output leads of the sensor.
7.3.6 Do not over tighten the sensor within the mounting block.
7.3.7 The behavior of some transducers may be affected by metallic holders. If possible, the working holder should be used during
calibration/verification.
FIG. 2 Micrometer Fixture
8. Calibration/Verification and Standardization
8.1 Calibration of Linear Displacement Transducers for Geotechnical Purposes shall be performed in accordance with
D6027/D6027M.
8.1.1 If using Test Method A, verify that the gauge blocks are of sufficient precision and bias and in a clean, unscratched condition.
8.1.2 If using Test Method B, verify that the micrometer fixture is in good working order and of sufficient precision and bias.
8.2 Verification of Linear Displacement Transducers for Geotechnical Purposes shall be performed in accordance with Practices
E2309/E2309M.
9. Procedure
9.1 Perform this calibration/verification in an environment as close as possible to that in which the sensor will be used.
The displacement transducer, calibration/verification gauge blocks, micrometer fixture, and comparator stand shall be in the
environment in which they are to be calibrated/verified for a sufficient length of time, or for at least 1 hour prior to
calibration/verification, to stabilize temperature effects.
9.2 Verify that the power supply is adjusted to supply the recommended voltage to the sensor.
9.3 With equipment turned off, connect all power supply, signal conditioning, and recording equipment exactly as it will be used
in service. Allow all electronics to warm up for a sufficient length of time, or for at least 30 min before beginning any readings.
9.4 Record the type and serial number of the sensor to be calibrated/verified. If it has no serial number, record the model
number and other identifying markings.
9.5 Record the maximum allowable input voltage specified by the manufacturer and the input voltage used for this
calibration/verification.
9.6 Record the total linear range of the sensor and the range over which the transducer will be calibrated.
9.7 Record the type and serial number of the reference standard used.
9.8 Test Method A—Precision Gauge Block Calibration/Verification:
9.8.1 Attach the sensor mounting block to the adjustable arm of the comparator stand as shown in Fig. 1 or mount into test
equipment as it will be used in service.
9.8.2 Slide the displacement transducer core and core extension rod assembly into the displacement transducer body.
NOTE 3—Some displacement transducers require the core to be in place before powering the displacement transducer.
NOTE 4—The displacement transducer should be checked for freedom of movement, particularly if the core rod and body of the transducer are
manufactured as a single unit with a spring-loaded core, as part of the calibration/verification procedure.
9.8.3 Place the sensor body into the sensor mounting block and tighten the appropriate screw on the mounting block. Do not over
tighten the screw on the mounting block. This can damage the sensor body.
9.8.4 Place a gauge block (or series of blocks) that has a height equal to the total linear range of the sensor under the core.
9.8.5 Adjust the sensor body up or down on the comparator stand support column as necessary to obtain a reading on the readout
equipment that is approximately equal to the reading for the transducer with the core pushed into the transducer body to the end
of the linear operating range as indicated by the manufacturer's calibration data.
9.8.6 Secure the adjustable arm on the support column of the comparator stand in this position by tightening the screw of the
adjustable arm.
9.8.7 Remove the gauge block (or series of blocks) from beneath the core rod and allow the core rod to rest on the comparator
base. Note the transducer reading with the core now extended from the transducer body. Logically, it shall equal
approximately the voltage reading for the transducer with the core pulled from the transducer body to the end of the linear
operating range as indicated by the manufacturer's calibration data (if it does not, something is wrong).
9.8.8 Record the value of sensor output as sensor reading for zero displacement.
9.8.9 Select appropriate gauge blocks to displace the core through its total linear range in steps. It is recommended that a minimum
of five readings equally spaced throughout the sensor total linear range be used.
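The step selection in 9.8.9 can be sketched as follows. This is a minimal illustration, not part of the standard; the function name is invented, and the 25 mm total linear range is a hypothetical value chosen to match the minimum-five-readings recommendation.

```python
def gauge_block_steps(total_linear_range, readings=5):
    """Equally spaced gauge-block stack heights spanning the sensor's
    total linear range, honoring the recommended minimum of five
    readings. Returns the cumulative stack height for each step."""
    if readings < 5:
        raise ValueError("practice recommends a minimum of five readings")
    step = total_linear_range / readings
    return [step * (i + 1) for i in range(readings)]

# Hypothetical 25 mm total linear range -> stack heights of 5, 10, 15,
# 20, and 25 mm, applied in turn (9.8.10-9.8.12) and then removed in
# reverse order (9.8.13).
steps = gauge_block_steps(25.0)
```

In practice the available gauge block sizes constrain the stacks, so the nearest achievable heights to these targets would be used and the actual stack heights recorded in Table 1.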
9.8.10 Raise the core rod and place the appropriate gauge block(s) on the comparator stand base beneath the core rod in a manner
to raise the core incrementally, step by step, into the transducer body.
9.8.11 Record the gauge block height in Column 1 and the corresponding output of
...
