ASTM E2737-23
Standard Practice for Digital Detector Array Performance Evaluation and Long-Term Stability
SIGNIFICANCE AND USE
4.1 This practice is intended to be used by the DDA user to measure and record the baseline performance of an acquired DDA in order to monitor its performance throughout its service as an imaging system. This practice is not intended to be used as an “acceptance test” of a DDA.
4.2 This practice defines the tests to be performed and their required intervals. Also defined are the methods of tabulating results that DDA users will complete following initial baselining of the DDA system. These tests will also be performed periodically at the stated required intervals to evaluate the DDA system to determine if the system remains within acceptable operational limits as established in this practice and defined between the user and CEO.
4.3 There are several factors that affect the quality of a DDA image including the basic spatial resolution, geometric unsharpness, scatter, signal to noise ratio, contrast sensitivity, contrast/noise ratio, image lag, and for some types of DDAs, burn-in. There are several additional factors and settings which can affect these results (for example, integration time, detector parameters, imaging software, and even X-ray radiation quality). Additionally, detector correction techniques may have an impact on the quality of the image. This practice delineates tests for each of the properties listed herein and establishes standard techniques for assuring repeatability throughout the lifecycle testing of the DDA.
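Several of the image-quality metrics listed above, such as signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR), are simple region-of-interest (ROI) statistics. The sketch below illustrates the general idea on synthetic pixel data; it is not the normalized measurement procedure of the practice itself, and the ROI sizes and pixel values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ROIs from a 16-bit detector image: a "background" region and a
# slightly brighter "feature" region (for example, behind an IQI hole).
# All values are hypothetical, chosen only for illustration.
background = rng.normal(loc=20000.0, scale=150.0, size=(50, 50))
feature = rng.normal(loc=20600.0, scale=150.0, size=(50, 50))

# SNR of an ROI: mean pixel value divided by the standard deviation
# of the pixel values within that ROI.
snr = background.mean() / background.std()

# CNR between the two ROIs: difference of mean pixel values divided
# by the noise (standard deviation) of the background ROI.
cnr = (feature.mean() - background.mean()) / background.std()

print(f"SNR ~ {snr:.1f}")
print(f"CNR ~ {cnr:.1f}")
```

A real evaluation per the practice would compute these from calibrated, bad-pixel-corrected images of the specified phantoms rather than from synthetic arrays.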
SCOPE
1.1 This practice covers the baseline and periodic performance evaluation of Digital Detector Array (DDA) systems used for industrial radiography. It is intended to ensure that the evaluation of image quality, as far as this is influenced by the DDA system, meets the needs of users, and their customers, and enables process control to monitor long-term stability of the DDA system.
1.2 This practice specifies the fundamental parameters of DDA systems to be measured to determine baseline performance, and to track the long-term stability of the DDA system.
1.3 The DDA system tests specified in this practice shall be completed upon acceptance of the system from the manufacturer to baseline the performance of the DDA. Periodic performance testing shall then be used to monitor long-term stability of the system in order to identify when an action needs to be taken due to system degradation beyond a certain defined level.
1.4 Two types of phantoms, the duplex plate and the five-groove wedge, are used for testing as specified herein. The use of these two types of phantoms is not intended to exclude the use of other phantom configurations. In the event the tests or phantoms specified herein are not sufficient or appropriate, the user, in coordination with the cognizant engineering organization (CEO) may develop additional or modified tests, test objects, phantoms, or image quality indicators to evaluate the DDA system performance. Acceptance levels for these ALTERNATE test methods shall be determined by agreement between the user and CEO.
1.5 The user of this practice shall consider that higher energies than 450 keV may require different test methods or modifications to the test methods described here. This practice is not intended for usage with isotopes.
1.6 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety, health, and environmental practices and determine the applicability of regulatory limitations prior to use.
1.7 This international standard was developed in accordance with internationally recognized principles on standardization established in the Decision on Principles for the Development of International Standards, Guides and Recommendations issued by the World Trade Organization Technical Barriers to Trade (TBT) Committee.
General Information
- Status: Published
- Publication Date: 30-Jun-2023
- Technical Committee: E07 - Nondestructive Testing
- Drafting Committee: E07.01 - Radiography (X and Gamma) Method
Overview
ASTM E2737-23: Standard Practice for Digital Detector Array Performance Evaluation and Long-Term Stability sets the guidelines for establishing baseline and ongoing performance metrics for Digital Detector Array (DDA) systems used in industrial radiography. This standard, published by ASTM International, supports users in quantifying and documenting the performance of DDA imaging systems, ensuring quality assurance and enabling process control throughout the system's operational lifecycle. ASTM E2737-23 is not designed for acceptance testing at purchase but emphasizes in-service monitoring and the detection of performance changes such as system degradation.
Key Topics
Baseline Performance Evaluation:
Defines the fundamental test parameters and intervals for establishing a performance reference point for DDA systems upon initial deployment.

Long-Term Stability Monitoring:
Provides procedures for periodic evaluation to track and document potential changes in DDA performance, supporting timely intervention when degradations exceed agreed limits.

Test Methods and Intervals:
Outlines a series of mandatory tests (e.g., spatial resolution, contrast sensitivity, signal-to-noise ratio, offset level, and bad pixel distribution) using specified phantoms and test objects. Typical test intervals are recommended but may be adjusted by agreement between the user and customer.

Quality Metrics:
Details factors influencing image quality, such as spatial resolution, geometric unsharpness, signal-to-noise ratio, contrast sensitivity, lag, burn-in, and the effects of calibration, integration time, and detector/software settings.

Phantom Types:
Specifies the use of duplex plate and five-groove wedge phantoms for standardized evaluation, with provisions for alternate or additional test objects as agreed.

Reporting and Documentation:
Requires detailed record-keeping of technique parameters, test results, control limits, and the status of any anomalous pixels or clusters to ensure traceability and consistency.

Personnel Qualification:
Stipulates that personnel conducting evaluations must be qualified per relevant national and international standards (e.g., NAS 410, EN 4179, ISO 9712).
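The long-term stability monitoring described above amounts to a simple control check: each periodic measurement is compared against its baseline value and an agreed deviation limit. The sketch below illustrates this; the metric names, baseline values, and the 10 % limit are hypothetical, since actual control limits are agreed between the user and the CEO.

```python
# Hypothetical baseline values recorded at initial baselining, and an
# assumed +/-10 % relative control limit (illustrative only).
BASELINE = {"snr": 130.0, "basic_spatial_resolution_mm": 0.20, "offset_level": 480.0}
LIMIT = 0.10  # allowed relative deviation from baseline

def check_stability(measured: dict) -> dict:
    """Return, per metric, whether a periodic measurement stayed
    within the agreed control limits around the baseline."""
    status = {}
    for name, baseline in BASELINE.items():
        deviation = abs(measured[name] - baseline) / baseline
        status[name] = "ok" if deviation <= LIMIT else "out of limits"
    return status

# Example periodic test result in which SNR has degraded beyond the limit.
periodic = {"snr": 112.0, "basic_spatial_resolution_mm": 0.21, "offset_level": 495.0}
print(check_stability(periodic))
```

An out-of-limits result would trigger the actions the practice anticipates, such as re-acquiring detector corrections or investigating system degradation.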
Applications
Industrial Radiography Quality Assurance:
ASTM E2737-23 is critical for ensuring that DDA-based imaging systems maintain reliable performance for nondestructive testing (NDT) applications in sectors such as aerospace, automotive, energy, and manufacturing.

Process Control:
Enables organizations to implement process checks and quality-management procedures, helping meet customer specifications and regulatory requirements.

System Maintenance and Lifecycle Management:
Supports planning for equipment maintenance, repair, upgrades, or replacement by providing objective criteria for performance deviations.

Data Integrity:
Promotes consistency in image-quality data, supporting audits and compliance with internal and external standards.

Flexibility in Testing:
Allows for tailored test plans, intervals, and alternate methods when standard phantoms or procedures are inadequate, provided changes are agreed between the user and the responsible engineering authority.
Related Standards
For comprehensive DDA system performance evaluation and integration in industrial radiography, ASTM E2737-23 is often referenced alongside:
- ASTM E2597/E2597M: Manufacturing characterization of Digital Detector Arrays
- ASTM E2002: Measurement of image unsharpness and spatial resolution
- ASTM E1025 & E1742: Use of image quality indicators (IQIs) and radiographic examination procedures
- ASTM E1165 & E2903: Focal spot measurement of X-ray tubes
- ASTM E543: Agency competency for nondestructive testing
- ASTM E2698 & E2736: Guidance on radiographic procedures using DDAs
- Personnel Qualification Standards: NAS410, EN 4179, ANSI/ASNT CP-189, ISO 9712
Adhering to ASTM E2737-23, alongside these standards, provides a robust framework for maintaining and verifying the performance of digital radiography systems, contributing to high confidence in NDT results and equipment reliability.
Frequently Asked Questions
ASTM E2737-23 is a standard published by ASTM International. Its full title is "Standard Practice for Digital Detector Array Performance Evaluation and Long-Term Stability". It covers the baseline and periodic performance evaluation of Digital Detector Array (DDA) systems used for industrial radiography; the Significance and Use and Scope sections above describe its coverage in full.
ASTM E2737-23 is classified under the following ICS (International Classification for Standards) categories: 11.040.50 - Radiographic equipment. The ICS classification helps identify the subject area and facilitates finding related standards.
ASTM E2737-23 has the following relationships with other standards: it has inter-standard links to ASTM E1316-24, ASTM E1742/E1742M-23, ASTM E1316-19b, ASTM E1316-19, ASTM E1742/E1742M-18, ASTM E2698-18, ASTM E2903-18, ASTM E1025-18, ASTM E1316-18, ASTM E1316-17a, ASTM E1316-17, ASTM E1316-16a, ASTM E1316-16, ASTM E1316-15a, and ASTM E1316-15. Understanding these relationships helps ensure you are using the most current and applicable version of the standard.
ASTM E2737-23 is available in PDF format for immediate download after purchase.
Standards Content (Sample)
This international standard was developed in accordance with internationally recognized principles on standardization established in the Decision on Principles for the Development of International Standards, Guides and Recommendations issued by the World Trade Organization Technical Barriers to Trade (TBT) Committee.

Designation: E2737 − 23

Standard Practice for Digital Detector Array Performance Evaluation and Long-Term Stability

This standard is issued under the fixed designation E2737; the number immediately following the designation indicates the year of original adoption or, in the case of revision, the year of last revision. A number in parentheses indicates the year of last reapproval. A superscript epsilon (´) indicates an editorial change since the last revision or reapproval.
1. Scope

1.1 This practice covers the baseline and periodic performance evaluation of Digital Detector Array (DDA) systems used for industrial radiography. It is intended to ensure that the evaluation of image quality, as far as this is influenced by the DDA system, meets the needs of users, and their customers, and enables process control to monitor long-term stability of the DDA system.

1.2 This practice specifies the fundamental parameters of DDA systems to be measured to determine baseline performance, and to track the long-term stability of the DDA system.

1.3 The DDA system tests specified in this practice shall be completed upon acceptance of the system from the manufacturer to baseline the performance of the DDA. Periodic performance testing shall then be used to monitor long-term stability of the system in order to identify when an action needs to be taken due to system degradation beyond a certain defined level.

1.4 Two types of phantoms, the duplex plate and the five-groove wedge, are used for testing as specified herein. The use of these two types of phantoms is not intended to exclude the use of other phantom configurations. In the event the tests or phantoms specified herein are not sufficient or appropriate, the user, in coordination with the cognizant engineering organization (CEO) may develop additional or modified tests, test objects, phantoms, or image quality indicators to evaluate the DDA system performance. Acceptance levels for these ALTERNATE test methods shall be determined by agreement between the user and CEO.

1.5 The user of this practice shall consider that higher energies than 450 keV may require different test methods or modifications to the test methods described here. This practice is not intended for usage with isotopes.

1.6 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety, health, and environmental practices and determine the applicability of regulatory limitations prior to use.

1.7 This international standard was developed in accordance with internationally recognized principles on standardization established in the Decision on Principles for the Development of International Standards, Guides and Recommendations issued by the World Trade Organization Technical Barriers to Trade (TBT) Committee.

2. Referenced Documents

2.1 ASTM Standards:
E543 Specification for Agencies Performing Nondestructive Testing
E1025 Practice for Design, Manufacture, and Material Grouping Classification of Hole-Type Image Quality Indicators (IQI) Used for Radiography
E1165 Test Method for Measurement of Focal Spots of Industrial X-Ray Tubes by Pinhole Imaging
E1316 Terminology for Nondestructive Examinations
E1742/E1742M Practice for Radiographic Examination
E2002 Practice for Determining Image Unsharpness and Basic Spatial Resolution in Radiography and Radioscopy
E2446 Practice for Manufacturing Characterization of Computed Radiography Systems
E2597/E2597M Practice for Manufacturing Characterization of Digital Detector Arrays
E2698 Practice for Radiographic Examination Using Digital Detector Arrays
E2736 Guide for Digital Detector Array Radiography

This practice is under the jurisdiction of ASTM Committee E07 on Nondestructive Testing and is the direct responsibility of Subcommittee E07.01 on Radiology (X and Gamma) Method. Current edition approved July 1, 2023. Published August 2023. Originally approved in 2010. Last previous edition approved in 2018 as E2737 – 10 (2018). DOI: 10.1520/E2737-23.

For referenced ASTM standards, visit the ASTM website, www.astm.org, or contact ASTM Customer Service at service@astm.org. For Annual Book of ASTM Standards volume information, refer to the standard’s Document Summary page on the ASTM website.

Copyright © ASTM International, 100 Barr Harbor Drive, PO Box C700, West Conshohocken, PA 19428-2959, United States
E2903 Test Method for Measurement of the Effective Focal Spot Size of Mini and Micro Focus X-ray Tubes

2.2 Industry Standards:
ANSI/ASNT CP-189 Standard for Qualification & Certification of Nondestructive Testing Personnel
EN 4179 Qualification & Approval of Personnel for Non-Destructive Testing
ISO 9712 Non-Destructive Testing - Qualification & Certification of NDT Personnel
NAS 410 National Aerospace Standard: Certification & Qualification of Nondestructive Test Personnel
SNT-TC-1A Personnel Qualification & Certification in Nondestructive Testing

Available from American National Standards Institute (ANSI), 25 W. 43rd St., 4th Floor, New York, NY 10036, http://www.ansi.org.
Available from British Standards Institution (BSI), 389 Chiswick High Rd., London W4 4AL, U.K., http://www.bsigroup.com.
Available from International Organization for Standardization (ISO), ISO Central Secretariat, Chemin de Blandonnet 8, CP 401, 1214 Vernier, Geneva, Switzerland, https://www.iso.org.
Available from Aerospace Industries Association (AIA), 1000 Wilson Blvd., Suite 1700, Arlington, VA 22209, http://www.aia-aerospace.org.
Available from American Society for Nondestructive Testing (ASNT), P.O. Box 28518, 1711 Arlingate Ln., Columbus, OH 43228-0518, http://www.asnt.org.

3. Terminology

3.1 Definitions—The definition of terms relating to gamma and X-radiology, which appear in Terminology E1316, Practice E2002, Practice E2597/E2597M, Practice E2698, and Guide E2736, shall apply to the terms used in this practice.

3.2 Definitions of Terms Specific to This Standard:

3.2.1 active DDA area—the active pixelized region of the DDA, which is recommended by the manufacturer as usable.

3.2.2 burn-in—change in gain of the scintillator that persists well beyond the exposure.

3.2.3 duplex plate phantom Type 1—a phantom manufactured from a single material type and having two thicknesses made up of either two overlapping plates or a single plate machined to provide two thicknesses, with the two thicknesses typically aligning on 3 edges (Fig. 1).

3.2.3.1 Discussion—Duplex plate phantom Type 1 was first mentioned in E2737 – 10.

3.2.4 duplex plate phantom Type 2—a phantom manufactured from a single material type and having two thicknesses made up of either two overlapping plates or a single plate machined to provide two thicknesses (Fig. 2).

3.2.5 five-groove wedge—a continuous wedge with five long grooves on one side.

3.2.6 frame rate—number of frames acquired per second.

3.2.7 lag—residual signal in the DDA that occurs shortly after detector read-out and erasure.

3.2.8 manufacturer—DDA system manufacturer, supplier for the user of the DDA system.

3.2.9 material thickness range (MTR)—the material thickness range within a single DDA image, whereby a minimum specific image quality is achieved throughout the entire thickness range.

3.2.10 phantom—a part or item being used to quantify DDA characterization metrics.

3.2.11 saturation pixel value—the maximum possible usable pixel value of the DDA after offset correction.

NOTE 1—Saturation may occur because of a saturation of the pixel itself, the amplifier, or digitizer, where the DDA encounters saturation pixel values as a function of increasing exposure levels.

3.2.12 user—the user and operating organization of the DDA system.

3.3 Definitions: Abbreviations Specific to This Standard:

3.3.1 D_hole—diameter of the IQI hole (in pixels).

3.3.2 CNC—Computer Numerical Control.

3.3.3 GSL—Groove Sensitivity Level, the smallest long groove which is visible in the image at the first single dot marking.

3.3.4 MT—the penetrated material thickness in the ROI under consideration.

3.3.5 MT_step—material thickness of the plate(s) under the IQI.

3.3.6 MT_IQI—thickness of hole-type IQI.

3.3.7 MT_total—total material thickness of plate and hole-type IQI (= MT_step + MT_IQI).

3.3.8 PV_median[hole]—median pixel value of ROI within the IQI hole.

3.3.9 PV_mean—mean pixel value, for example, of a ROI.

3.3.10 PV_mean[beside squares]—mean pixel value measured inside the area between two boxes.

3.3.11 PV_thick—mean pixel value of the ROI on the thick area of the five-groove wedge.

3.3.12 PV_thin—mean pixel value at the thinnest area of the five-groove wedge.

3.3.13 PV_mean(Offset)—mean pixel value of the approximately central 90 % of the area in the offset image.

3.3.14 Sigma[beside squares]—standard deviation of the pixel values in the area between two boxes.

4. Significance and Use

4.1 This practice is intended to be used by the DDA user to measure and record the baseline performance of an acquired DDA in order to monitor its performance throughout its service as an imaging system. This practice is not intended to be used as an “acceptance test” of a DDA.

4.2 This practice defines the tests to be performed and their required intervals. Also defined are the methods of tabulating results that DDA users will complete following initial baselining of the DDA system. These tests will also be performed periodically at the stated required intervals to evaluate the DDA system to determine if the system remains within acceptable operational limits as established in this practice and defined between the user and CEO.

4.3 There are several factors that affect the quality of a DDA image including the basic spatial resolution, geometric unsharpness, scatter, signal to noise ratio, contrast sensitivity, contrast/noise ratio, image lag, and for some types of DDAs, burn-in. There are several additional factors and settings which can affect these results (for example, integration time, detector parameters, imaging software, and even X-ray radiation quality). Additionally, detector correction techniques may have an impact on the quality of the image. This practice delineates tests for each of the properties listed herein and establishes standard techniques for assuring repeatability throughout the lifecycle testing of the DDA.

5. Basis of Application

5.1 The following items are subject to contractual agreement between the parties using or referencing this standard.

5.1.1 Personnel Qualification—Personnel performing examinations to this practice shall be qualified in accordance with NAS410, EN 4179, ANSI/ASNT CP 189, ISO 9712, or SNT-TC-1A and certified by the employer or certifying agency as applicable. Other equivalent qualification documents may be used when specified on the contract or purchase order. The applicable revision shall be the latest unless otherwise specified in the contractual agreement between parties.

5.1.2 If specified in the contractual agreement, NDT agencies shall be qualified and evaluated as described in Specification E543. The applicable edition of Specification E543 shall be specified in the contract.

6. Apparatus

6.1 Phantom Types and Selection—The tests performed herein may be completed either by the use of a Type 1 or Type 2 Duplex Plate Phantom with separate IQIs (see Fig. 1 and Fig. 2), or with a Five-Groove Wedge Phantom (see Fig. 5 and Fig. 6). The phantoms are available for purchase or may be manufactured by the user. A Phantom Record shall be generated providing individual phantom identification, basic material type, basic dimensional data, and traceability to records of any IQIs used with the phantom. Certification of material alloy or dimensions is not required for duplex plate phantoms.

6.2 Phantom Materials—The phantoms may be manufactured from any material group; however, Aluminum is recommended for light metal applications (material group 02 of equal or lower atomic number and density as listed in Practice E1025) and Stainless Steel is recommended for more dense material applications (material group 1 of equal or higher atomic number and density as listed in Practice E1025). It is not necessary to make use of other materials that more closely represent a given product being evaluated. If a facility evaluates materials from more than one material group, a phantom …

7. General Procedures Applied to All Phantom Types

7.1 DDA Correction Method—As part of the baseline testing, the DDA offset, gain corrections shall be acquired in accordance with the manufacturer’s recommendation, using a typical process as applied during production product evaluations. These same correction procedures shall be used at normal production intervals throughout the periodic testing of the in-service DDA. Additionally, the DDA corrections shall be re-acquired when the periodic test results fall out of the established control limits. Reference Annex A1.

7.1.1 Bad Pixel Standardization for DDAs—Baseline images shall also be corrected for bad pixels as would be done in production using routine bad pixel correction procedures. A standardized nomenclature is presented in Practice E2597/E2597M. The identification and correction of bad pixels in a DDA shall be as agreed upon between the user and the CEO. The threshold levels used to identify bad pixels shall be recorded in the test report in full or in reference. The bad pixel data shall be presented as an image or as a report containing specific parameters for bad pixels, cluster kernel pixels, relevant clusters, non-relevant clusters, and lines.

7.2 Procedure for Measurement of the Offset Level—Before measurement of the Offset Level, the DDA should be powered-on and not exposed for approximately ten minutes. One image with 30 s acquisition time (for example, 1 s frames and averaging all 30 frames) shall be captured without radiation (Offset Image). Bad Pixel Correction is active; no gain or offset correction shall be done. The Offset Level is the mean pixel value of the approximate central 90 % of area in the offset image. An ROI of greater than 90 % may be used providing consideration is made for defective or underperforming pixels in the border of the detector.

7.3 Procedure for Evaluation of Bad Pixels—The baseline and performance monitoring evaluation for bad pixels shall be performed in accordance with Practice E2597/E2597M unless otherwise agreed upon by the user and CEO. The frequency of evaluation shall be agreed upon by the user and CEO. The documentation of bad pixels shall be performed by evaluating an acquired image for any individual nonconforming pixels, clusters, or lines that display a pixel intensity value that is outside of tolerance compared to the mean surrounding pixels. This can be completed by selecting one of several secondary evaluation methods: visual examination, ASTM procedure, or manufacturers recommended procedure. An example of a Secondary Evaluation for Bad Pixels would be a simple visual screening for bad pixels during normal viewing of a production image. Newly identified bad pixels shall be added to an …
from only one material group needs to be processed. Radio-
existing bad pixel map, or a completely new map may be
graphically homogeneous material alloys are preferred. 7022
utilized. Any relevant cluster or line shall be clearly noted and
Aluminum and 316L Stainless Steel are strongly recom-
added to the bad pixel map, as non-correctable pixels could
mended. Other materials may be used when approved by the
hide relevant indications. The location of correctable and
CEO. Materials displaying grain structure mottling or visible
non-correctable bad pixels shall be documented. In addition, a
scatter artifacts, reduce the ability to effectively measure DDA
report may contain the number of bad pixels, cluster kernel
system performance and variability. When required, the se-
pixels, total clusters, relevant clusters, non-relevant clusters,
lected material shall be agreed upon between the user and
and lines.
CEO. Previously established baseline test materials are not
required to be modified to align with the above material 7.4 Technique Parameters—The various tests shall be com-
recommendations. pleted using documented baseline technique parameters. It is
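The tolerance screening of 7.3 can be prototyped directly on an acquired frame. The sketch below is only an illustration (Python/NumPy): the 3 by 3 neighborhood, the use of the local median, and the 20 % tolerance are assumptions chosen for the example; the actual thresholds shall be those agreed between the user and CEO and recorded in the test report.

```python
import numpy as np

def flag_bad_pixels(image: np.ndarray, tolerance: float = 0.20) -> np.ndarray:
    """Flag pixels that deviate from the median of their 8 surrounding
    pixels by more than `tolerance` (fractional). Returns a boolean map."""
    padded = np.pad(image.astype(float), 1, mode="edge")
    # Gather the 8 surrounding pixels for every position (center excluded).
    shifts = [(dr, dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)]
    neighbors = np.stack(
        [padded[1 + dr : padded.shape[0] - 1 + dr, 1 + dc : padded.shape[1] - 1 + dc]
         for dr, dc in shifts]
    )
    local = np.median(neighbors, axis=0)
    # Relative deviation from the local median; guard against division by zero.
    deviation = np.abs(image - local) / np.maximum(local, 1.0)
    return deviation > tolerance

# Hypothetical 5 x 5 frame with one dead pixel at (2, 2).
frame = np.full((5, 5), 1000.0)
frame[2, 2] = 0.0
bad = flag_bad_pixels(frame)
```

Connected groups of flagged pixels would then be classified as clusters or lines per the Practice E2597/E2597M nomenclature.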
E2737 − 23
7.4 Technique Parameters—The various tests shall be completed using documented baseline technique parameters. It is not required that these technique parameters represent conditions used in production. Both the Detector and X-ray Source may degrade over time and impact image quality; therefore, at a minimum, the following parameters shall be recorded and used in acquiring the baseline images as well as the long-term stability data. These technique parameters shall be recorded as part of the baseline and ongoing test reports in full or in reference.

7.4.1 X-ray System Identification:
7.4.1.1 Detector Model Number and Serial Number.
7.4.1.2 X-ray Tube Model Number and Serial Number.
7.4.2 X-ray Tube Settings/Configuration:
7.4.2.1 X-ray tube voltage (kV).
7.4.2.2 Tube current (mA).
7.4.2.3 Focal spot size. (As measured according to Test Methods E1165 or E2903, or another standard. The recorded focal spot may be taken from the manufacturer's documentation.)
7.4.3 X-ray Tube/Detector – Beam Filtration:
7.4.3.1 Material Type.
7.4.3.2 Material Thickness.
7.4.4 Beam Collimation:
7.4.4.1 Collimation Location (Tube/Detector/Part).
7.4.4.2 Blade Positioning or Collimation Opening Values.
7.4.4.3 Collimation Material.
7.4.5 Geometry:
7.4.5.1 Source to Detector Distance (SDD).
7.4.5.2 Object to Detector Distance (ODD) or Source to Object Distance (SOD).
7.4.6 Detector Settings:
7.4.6.1 Detector Gain Setting.
7.4.6.2 Binning Mode.
7.4.6.3 Orientation (Landscape/Portrait/N/A).
7.4.7 Exposure Time Per Image:
7.4.7.1 Frame Rate or Integration Time.
7.4.7.2 Frame Averaging.
7.4.7.3 Total Exposure Time.
7.4.8 Detector Corrections (correction and bad pixel substitution). Detector correction technique parameters may be recorded on a separate technique:
7.4.8.1 Frame rate.
7.4.8.2 Number of frames averaged.
7.4.8.3 Number of Gain corrections (including each Gain's Approximate Mean Pixel Intensity Value or Percent of Saturation Pixel Value).
7.4.9 Image Acquisition Software and Image Processing:
7.4.9.1 Software Revision.

7.5 Technique Energy Selection—The energy used shall be appropriate for the Phantom material and thickness range to provide the required image quality in imaging the selected phantom and associated IQIs.

8. General Tests Required for all Phantom Types

8.1 User Tests for Baseline and Long-Term Stability—Quality assurance requires periodic tests of the DDA system to ensure the proper performance of the system. The time interval depends on the degree of usage of the system and shall be defined by the user with consideration of the DDA system manufacturer's information. If no time intervals are established by the contracting parties, the intervals for the performance checks shall be as defined within Table 1.

8.1.1 Offset Level Test—Degradation of the DDA may reduce the system sensitivity after extensive usage. For this reason, the DDA system shall be checked for increasing offset value. The Offset value is the mean DDA response with no DDA corrections and without radiation. Offset values can be influenced by temperature; therefore, where operational temperatures vary, it is important to understand the impact on offset measurement values.
TABLE 1 System Performance Tests and Process Check of the DDA System using the DUPLEX PLATE

| System Performance Test Parameter | Baseline(F) | Software Update | Tube Change | Detector Repair | Process Check Test Intervals(A) | Control Limits Method(C) |
|---|---|---|---|---|---|---|
| Basic Spatial Resolution iSR_b^detector (Detector)(E) | x | x | x | x | 6 Months | ±3 Sigma, or ±20 % |
| Basic Spatial Resolution iSR_b^image (Image) | x | x | x | x | 10 Business Days or Before Use | ±3 Sigma, or ±20 % |
| Contrast Sensitivity in 4T hole CS_4T | x | x | x | x | 10 Business Days or Before Use | ±3 Sigma, or ±20 % |
| Signal to Noise Ratio SNR | x | x | x | x | 10 Business Days or Before Use | ±3 Sigma, or ±20 % |
| Signal Level SL | x | x | x | x | 10 Business Days or Before Use | ±3 Sigma, or ±20 % |
| Offset Level OL(B) | x | x | | x | 10 Business Days or Before Use | +50 % |
| Bad Pixel Distribution in accordance with E2597/E2597M | x | x | x | x | 3 Months | As agreed upon by user and CEO |
| Bad Pixel Distribution Secondary Evaluation (7.3)(D) | x | | | | Daily or Before Use | |

(A) Test Intervals: Unless other intervals are defined and agreed upon by user and CEO.
(B) Offset Level: A recorded mean pixel value of a standard offset correction fulfills this requirement (7.2). The baseline for the offset level measurement can be a single measurement; it is not required to collect 30 days of test data.
(C) Control Limits Method: See 10.2.3, Control Limits Values. It is understood that one method of control limits being used is ±20 %. Some industries are now transitioning to ±3 Sigma. Either is acceptable unless otherwise specified by the CEO.
(D) Secondary Evaluation for Bad Pixels: Example—One method of performing this evaluation would be a simple visual screening for bad pixels during normal viewing of a production image.
(E) Basic Spatial Resolution (Detector): The baseline for this measurement can be a single measurement; it is not required to collect 30 days of test data.
(F) Baseline: See Section 10 for Application of Baseline Tests and Test Methods.
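The two control-limit methods cited in Note C of Table 1 and established from 30 days of baseline data in 10.2 can be sketched as follows. This is a minimal illustration (Python/NumPy); the SNR readings and random seed are hypothetical, and the choice of method remains as specified by the CEO.

```python
import numpy as np

def control_limits(baseline: np.ndarray, method: str = "sigma") -> tuple[float, float]:
    """Lower/upper control limits from baseline data: 'sigma' gives
    mean ± 3 sigma; 'percent' gives mean ± 20 % of the baseline mean."""
    mean = float(baseline.mean())
    if method == "sigma":
        half_width = 3.0 * float(baseline.std(ddof=1))
    elif method == "percent":
        half_width = 0.20 * mean
    else:
        raise ValueError(f"unknown method: {method}")
    return mean - half_width, mean + half_width

# Hypothetical 30 daily SNR readings clustered around 250.
rng = np.random.default_rng(0)
snr_baseline = 250.0 + rng.normal(0.0, 2.0, size=30)

lo, hi = control_limits(snr_baseline, method="percent")
in_control = lo <= 245.0 <= hi  # a later process-check reading
```

A process-check value falling outside [lo, hi] would trigger the out-of-control actions agreed between the user and CEO.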
8.1.2 Bad Pixel Distribution—Newly identified bad pixels shall be added to the Bad Pixel Map. Any relevant cluster shall be clearly noted and added to the Bad Pixel Map.

8.1.3 Image Lag and Burn-In (Nonmandatory)—The tests for Image Lag and Burn-In are tests typically performed by detector manufacturers for a given model number detector. These tests shall be performed only if required by the CEO. If required, the tests shall be performed in accordance with Practice E2597/E2597M at a frequency defined by the CEO.

9. Duplex Plate Phantom Requirements

9.1 Duplex Plate Phantom Configuration—

9.1.1 Phantom Material Thicknesses/Dimensions—Either Type 1 or Type 2 phantom shall be made of a single material type consisting of two thicknesses (see Fig. 1 and Fig. 2). The thickness of step 2 shall be a minimum of 2 times the thickness of step 1.

9.1.1.1 Other thicknesses may be used if agreed between user and CEO. It is not required that the phantom thicknesses represent the thickness range of product being evaluated. It is recommended that the outside dimensions of the phantom be sufficient in size that when imaged, the phantom is projected across the entire detector for the geometric magnification used. The phantom may be manufactured by overlapping two or more plates or by manufacturing from a single plate of material. An advantage of the Type 2 Phantom is that it can be manufactured with the thinner plate being much larger than the thick plate, thereby reducing the phantom's weight. This is especially true when dealing with phantoms for large DDAs and lower geometric magnification imaging techniques. The thicker plate shall be large enough in width and length to provide clear separation from the IQI and the plate edge to reduce influence from edge gradient within the image.

9.1.2 Image Quality Indicators—IQIs used in combination with the Duplex Phantom provide an initial evaluation of the quality of a DDA system as well as a method for monitoring system performance. Practice E2002 duplex wire IQIs along with Practice E1025 or Practice E1742/E1742M hole-type IQIs shall be used along with the Duplex Plate Phantom.

9.1.2.1 IQI Selection—Each hole-type IQI shall be 2 % of the specific plate thickness unless otherwise agreed upon between the user and the CEO. The Practice E2002 Duplex Gauge model shall be selected based on the spatial resolution range common to the majority of techniques being used.

9.1.2.2 IQI Placement—Either two Practice E1025 or two Practice E1742/E1742M hole-type IQIs shall be placed on either side of the phantom. Placing the IQIs on the flat side will provide the same geometric conditions for all IQIs. The hole-type IQIs shall be placed on an area of the plate corresponding to the IQI's thickness (Fig. 1 and Fig. 2). A minimum of one Practice E2002 Duplex Wire IQI shall be placed on the thinner area as shown in Fig. 1 and Fig. 2 with an angle of 2° to 5° to the DDA pixel matrix directions. If required by the CEO, the spatial system resolution shall be measured in two perpendicular directions. This requirement may be met by either use of two Practice E2002 IQIs or by use of a single IQI providing images from two separate positions are acquired during the test. Other IQI placement may be used if agreed upon between user and CEO.
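The 2 % selection rule of 9.1.2.1 is simple arithmetic. A sketch using the example Type 2 stainless steel step thicknesses from Fig. 2 (5 mm and 20 mm); the values are illustrative only, and any deviation from 2 % is subject to agreement between the user and CEO.

```python
def iqi_thickness_mm(plate_thickness_mm: float) -> float:
    """Hole-type IQI thickness per 9.1.2.1: 2 % of the specific plate
    thickness (absent other agreement between user and CEO)."""
    return 0.02 * plate_thickness_mm

# Example Type 2 stainless steel steps from Fig. 2: 5 mm and 20 mm.
thin_step_iqi = iqi_thickness_mm(5.0)    # 0.1 mm
thick_step_iqi = iqi_thickness_mm(20.0)  # 0.4 mm
```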
9.2 Duplex Phantom Tests:
9.2.1 The tests listed in 9.2.2 – 9.2.6 and Table 1 shall be
performed with the selected duplex phantom and correspond-
ing IQIs. The tests shall be performed initially to establish the
baseline and at specified intervals.
9.2.2 Basic Spatial Resolution – Detector (iSR_b^detector)—In accordance with the Practice E2002 method using the duplex-wire gage at the detector.
9.2.3 Basic Spatial Resolution – Image (iSR_b^image)—In accordance with the Practice E2002 method using the duplex-wire gage and duplex plate phantom positioned at a specific exposure geometry.
9.2.4 Contrast Sensitivity – In 4T Hole—Measured on the
two separate IQIs located on the thin and thick step of the
duplex plate phantom.
9.2.5 Signal to Noise Ratio—Measured on the thin and thick
step of the duplex plate phantom.
9.2.6 Signal Level—Measured on the thin and thick step of
the duplex plate phantom.
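The measurements in 9.2.5 and 9.2.6 reduce to first-order ROI statistics. A minimal sketch (Python/NumPy) follows; the image contents and ROI coordinates are hypothetical, and real ROI positions follow the phantom layouts of Fig. 1 and Fig. 2.

```python
import numpy as np

def roi_stats(image: np.ndarray, rows: slice, cols: slice) -> tuple[float, float]:
    """Signal Level (mean pixel value) and signal-to-noise ratio
    (mean / standard deviation) for one rectangular ROI."""
    roi = image[rows, cols].astype(float)
    signal_level = float(roi.mean())
    snr = signal_level / float(roi.std(ddof=1))
    return signal_level, snr

# Hypothetical 100 x 100 image: left half ~20000 counts (thin step),
# right half ~8000 counts (thick step), plus Gaussian noise.
rng = np.random.default_rng(1)
img = np.where(np.arange(100)[None, :] < 50, 20000.0, 8000.0)
img = img + rng.normal(0.0, 50.0, size=(100, 100))

sl_thin, snr_thin = roi_stats(img, slice(40, 60), slice(10, 30))
sl_thick, snr_thick = roi_stats(img, slice(40, 60), slice(70, 90))
```

The same ROI pair, fixed at baseline, would be reused for every subsequent long-term stability check.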
NOTE 1—Fig. 1 plate sizes and thicknesses are only examples and not meant to be restrictive.
NOTE 2—See 9.3.2 for placement of phantom for exposure.
NOTE 3—ROI locations are only examples and not meant to be restrictive.
FIG. 1 Type 1 Duplex Plate Phantom with IQIs and ROI Positions

9.3 Duplex Plate Phantom Exposure Procedures:

9.3.1 For tracking performance of a DDA, all of the same technique parameters which were used to establish the system baseline shall be used during long-term stability or process checking.

9.3.2 Exposure of the Duplex Plate Phantom Assembly—
NOTE 1—Fig. 2 plate sizes and thicknesses are only examples and not meant to be restrictive. Examples of standardized dimensions for manufacturing Type 2 duplex plate phantoms within the requirements of this Practice are listed below.
NOTE 2—See 9.3.2 for placement of phantom for exposure.
NOTE 3—When required to measure the spatial resolution in two perpendicular directions, it is optional to use two Practice E2002 duplex wire gages. See 9.1.2.2.
NOTE 4—Examples of ROI locations are included in Fig. 4.

Examples of Standardized Dimensions for Manufacturing Type 2 Duplex Plate Phantoms

Stainless Steel Duplex Plate (Min. Thickness: 5 mm (0.12 in.), Max. Thickness: 20 mm (0.79 in.)):
  Base Plate: 400 mm by 300 mm (15.75 in. by 11.81 in.), 5 mm (0.12 in.) thick
  Top Plate: 100 mm by 80 mm (3.94 in. by 3.15 in.), 15 mm (0.59 in.) thick

Aluminum Duplex Plate (Min. Thickness: 12 mm (0.47 in.), Max. Thickness: 50 mm (1.97 in.)):
  Base Plate: 400 mm by 300 mm (15.75 in. by 11.81 in.), 12 mm (0.47 in.) thick
  Top Plate: 100 mm by 80 mm (3.94 in. by 3.15 in.), 38 mm (1.5 in.) thick

FIG. 2 Type 2 Duplex Plate Phantoms with IQIs
The side with the IQIs shall face the radiation source, and an image shall be acquired with the technique parameters outlined in 7.4 recorded along with the energy setting as described in 7.5. When establishing the Phantom Technique, it is recommended that the projected image of the phantom cover the entire active area of the detector. This helps to avoid a substantial gradient in the image which can negatively influence the ability to provide quality contrast sensitivity measurements. In the event the projected image of the phantom does not cover the entire detector active area, it is acceptable to use masking around the phantom or to employ beam collimation. It is recommended that the masking be of a similar material density to the phantom and be permanently attached to the phantom to provide more repeatable results. Inconsistent place-

upon between the user and the CEO should address the specific tests to perform, the data presentation, and the frequency of testing and any required approvals.

10.2 Establishing Control Limits—Unless otherwise specified by the CEO, the baseline performance upper and lower control limits shall be established through the following process:

10.2.1 Acquisition Interval—System performance test data shall be acquired daily and recorded for the first 30 days the equipment is in use. Evaluation of product is allowed within these 30 days providing the established image quality requirements are consistently met and any required technique approvals ha
...
Designation: E2737 − 23
Standard Practice for
Digital Detector Array Performance Evaluation and Long-
Term Stability
This standard is issued under the fixed designation E2737; the number immediately following the designation indicates the year of
original adoption or, in the case of revision, the year of last revision. A number in parentheses indicates the year of last reapproval. A
superscript epsilon (´) indicates an editorial change since the last revision or reapproval.
1. Scope
1.1 This practice covers the baseline and periodic performance evaluation of Digital Detector Array (DDA) systems used for industrial radiography. It is intended to ensure that the evaluation of image quality, as far as this is influenced by the DDA system, meets the needs of users, and their customers, and enables process control to monitor long-term stability of the DDA system.
1.2 This practice specifies the fundamental parameters of DDA systems to be measured to determine baseline performance, and to track the long-term stability of the DDA system.
1.3 The DDA system performance tests specified in this practice shall be completed upon acceptance of the system from the manufacturer and at intervals specified in this practice to monitor long-term stability of the system. The intent of these tests is to baseline the performance of the DDA. Periodic performance testing shall then be used to monitor long-term stability of the system in order to identify when an action needs to be taken due to system degradation beyond a certain defined level.
1.4 Two types of phantoms, the duplex plate and the five-groove wedge, are used for testing as specified herein. The use of these two types of phantoms is not intended to exclude the use of other phantom configurations. In the event the tests or phantoms specified herein are not sufficient or appropriate, the user, in coordination with the cognizant engineering organization (CEO), may develop additional or modified tests, test objects, phantoms, or image quality indicators to evaluate the DDA system performance. Acceptance levels for these test methods shall be determined by agreement between the user and CEO.
1.5 The user of this practice shall consider that higher energies than 450 keV may require different test methods or modifications
to the test methods described here. This practice is not intended for usage with isotopes.
1.6 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility
of the user of this standard to establish appropriate safety, health, and environmental practices and determine the applicability of
regulatory limitations prior to use.
1.7 This international standard was developed in accordance with internationally recognized principles on standardization
established in the Decision on Principles for the Development of International Standards, Guides and Recommendations issued
by the World Trade Organization Technical Barriers to Trade (TBT) Committee.
This practice is under the jurisdiction of ASTM Committee E07 on Nondestructive Testing and is the direct responsibility of Subcommittee E07.01 on Radiology (X and
Gamma) Method.
Current edition approved July 1, 2023. Published August 2023. Originally approved in 2010. Last previous edition approved in 2018 as E2737 – 10 (2018). DOI: 10.1520/E2737-23.
Copyright © ASTM International, 100 Barr Harbor Drive, PO Box C700, West Conshohocken, PA 19428-2959. United States
2. Referenced Documents
2.1 ASTM Standards:
E543 Specification for Agencies Performing Nondestructive Testing
E1025 Practice for Design, Manufacture, and Material Grouping Classification of Hole-Type Image Quality Indicators (IQI)
Used for Radiography
E1165 Test Method for Measurement of Focal Spots of Industrial X-Ray Tubes by Pinhole Imaging
E1316 Terminology for Nondestructive Examinations
E1742/E1742M Practice for Radiographic Examination
E2002 Practice for Determining Image Unsharpness and Basic Spatial Resolution in Radiography and Radioscopy
E2446 Practice for Manufacturing Characterization of Computed Radiography Systems
E2597/E2597M Practice for Manufacturing Characterization of Digital Detector Arrays
E2698 Practice for Radiographic Examination Using Digital Detector Arrays
E2736 Guide for Digital Detector Array Radiography
E2903 Test Method for Measurement of the Effective Focal Spot Size of Mini and Micro Focus X-ray Tubes
2.2 Industry Standards:
ANSI/ASNT CP-189 Standard for Qualification & Certification of Nondestructive Testing Personnel
EN 4179 Qualification & Approval of Personnel for Non-Destructive Testing
ISO 9712 Non-Destructive Testing - Qualification & Certification of NDT Personnel
NAS 410 National Aerospace Standard: Certification & Qualification of Nondestructive Test Personnel
SNT-TC-1A Personnel Qualification & Certification in Nondestructive Testing
3. Terminology
3.1 Definitions—The definitions of terms relating to gamma and X-radiology, which appear in Terminology E1316, Practice E2002, Practice E2597/E2597M, Practice E2698, and Guide E2736, shall apply to the terms used in this practice.
3.2 Definitions of Terms Specific to This Standard:
3.2.1 active DDA area—the active pixelized region of the DDA, which is recommended by the manufacturer as usable.
3.2.2 burn-in—change in gain of the scintillator that persists well beyond the exposure.
3.2.3 duplex plate phantom Type 1—
For referenced ASTM standards, visit the ASTM website, www.astm.org, or contact ASTM Customer Service at service@astm.org. For Annual Book of ASTM Standards
volume information, refer to the standard’s Document Summary page on the ASTM website.
Available from American National Standards Institute (ANSI), 25 W. 43rd St., 4th Floor, New York, NY 10036, http://www.ansi.org.
Available from British Standards Institution (BSI), 389 Chiswick High Rd., London W4 4AL, U.K., http://www.bsigroup.com.
Available from International Organization for Standardization (ISO), ISO Central Secretariat, Chemin de Blandonnet 8, CP 401, 1214 Vernier, Geneva, Switzerland,
https://www.iso.org.
Available from Aerospace Industries Association (AIA), 1000 Wilson Blvd., Suite 1700, Arlington, VA 22209, http://www.aia-aerospace.org.
Available from American Society for Nondestructive Testing (ASNT), P.O. Box 28518, 1711 Arlingate Ln., Columbus, OH 43228-0518, http://www.asnt.org.
a phantom manufactured from a single material type and having two thicknesses made up of either two overlapping plates or a single plate machined to provide two thicknesses, with the two thicknesses typically aligning on 3 edges (Fig. 1).
3.2.3.1 Discussion—
Duplex plate phantom Type 1 was first mentioned in E2737 – 10.
3.2.4 duplex plate phantom Type 2—a phantom manufactured from a single material type and having two thicknesses made up of either two overlapping plates or a single plate machined to provide two thicknesses (Fig. 2).
3.2.5 five-groove wedge—a continuous wedge with five long grooves on one side.
3.2.6 frame rate—number of frames acquired per second.
3.2.7 lag—residual signal in the DDA that occurs shortly after detector read-out and erasure.
3.2.8 manufacturer—DDA system manufacturer, supplier for the user of the DDA system.
3.2.9 material thickness range (MTR)—the material thickness range within a single DDA image, whereby a minimum specific image quality is achieved throughout the entire thickness range.
3.2.10 phantom—a part or item being used to quantify DDA characterization metrics.
3.2.11 saturation pixel value—the maximum possible usable pixel value of the DDA after offset correction.
NOTE 1—Saturation may occur because of a saturation of the pixel itself, the amplifier, or digitizer, where the DDA encounters saturation pixel values as a function of increasing exposure levels.
3.2.12 user—the user and operating organization of the DDA system.
3.2.13 customer—the company, government agency, or other authority responsible for the design, or end user, of the system or component for which radiologic examination is required, also known as the CEO. In some industries, the customer is frequently referred to as the “Prime”.
3.3 Definitions of Abbreviations Specific to This Standard:
3.3.1 D_hole—diameter of the IQI hole (in pixels).
3.3.2 CNC—Computer Numerical Control.
3.3.3 GSL—Groove Sensitivity Level – the smallest long groove which is visible in the image at the first single dot marking.
3.3.4 MT—the penetrated material thickness in the ROI under consideration.
3.3.5 MT_step—material thickness of the plate(s) under the IQI.
3.3.6 MT_IQI—thickness of hole-type IQI.
3.3.7 MT_total—total material thickness of plate and hole-type IQI (= MT_step + MT_IQI).
3.3.8 PV_median[hole]—median pixel value of ROI within the IQI hole.
3.3.9 PV_mean—mean pixel value, for example, of a ROI.
3.3.10 PV_mean[beside squares]—mean pixel value measured inside the area between two boxes.
3.3.11 PV_thick—mean pixel value of the ROI on the thick area of the five-groove wedge.
3.3.12 PV_thin—mean pixel value at the thinnest area of the five-groove wedge.
3.3.13 PV_mean(Offset)—mean pixel value of the approximately central 90 % of the area in the offset image.
3.3.14 Sigma[beside squares]—standard deviation of the pixel values in the area between two boxes.
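Several of these abbreviations are plain ROI statistics. The sketch below (Python/NumPy, hypothetical arrays) illustrates PV_mean, PV_median, Sigma, and the central-90 %-area PV_mean(Offset) of 3.3.13; the 2.5 % trim per edge, which retains roughly 0.95 × 0.95 ≈ 90 % of the image area, is an implementation assumption.

```python
import numpy as np

def pv_mean(roi: np.ndarray) -> float:
    return float(np.mean(roi))             # 3.3.9 PV_mean

def pv_median(roi: np.ndarray) -> float:
    return float(np.median(roi))           # 3.3.8 PV_median[hole]

def sigma(roi: np.ndarray) -> float:
    return float(np.std(roi, ddof=1))      # 3.3.14 Sigma[beside squares]

def pv_mean_offset(offset_image: np.ndarray) -> float:
    """3.3.13 PV_mean(Offset): mean pixel value of the approximately
    central 90 % of the area of the offset image (2.5 % margin per
    edge, so about 90 % of the area is retained)."""
    h, w = offset_image.shape
    mr, mc = int(round(0.025 * h)), int(round(0.025 * w))
    return float(np.mean(offset_image[mr : h - mr, mc : w - mc]))

# Hypothetical 200 x 200 offset image: 100 counts everywhere, with a
# defective border row that the central ROI must exclude.
offset = np.full((200, 200), 100.0)
offset[0, :] = 5000.0
level = pv_mean_offset(offset)  # 100.0
```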
4. Significance and Use
4.1 This practice is intended to be used by the DDA user to measure and record the baseline performance of an acquired DDA in order to monitor its performance throughout its service as an imaging system. This practice is not intended to be used as an “acceptance test” of a DDA.
4.2 This practice defines the tests to be performed and their required intervals. Also defined are the methods of tabulating results
that DDA users will complete following initial baselining of the DDA system. These tests will also be performed periodically at
the stated required intervals to evaluate the DDA system to determine if the system remains within acceptable operational limits
as established in this practice orand defined between the user and customer (CEO).CEO.
E2737 − 23
4.3 There are several factors that affect the quality of a DDA image including the basic spatial resolution, geometricalgeometric
unsharpness, scatter, signal to noise ratio, contrast sensitivity (contrast/noise ratio),sensitivity, contrast/noise ratio, image lag, and
burn in. for some types of DDAs, burn-in. There are several additional factors and settings which can affect these results (for
example, integration time, detector parameters or imaging software), which affect these results. Additionally, calibration techniques
may also parameters, imaging software, and even X-ray radiation quality). Additionally, detector correction techniques may have
an impact on the quality of the image. This practice delineates tests for each of the properties listed herein and establishes standard
techniques for assuring repeatability throughout the lifecycle testing of the DDA.
5. Basis of Application
5.1 The following items are subject to contractual agreement between the parties using or referencing this standard.
5.1.1 Personnel Qualification—Personnel performing examinations to this practice shall be qualified in accordance with NAS410,
EN 4179, ANSI/ASNT CP 189, ISO 9712, or SNT-TC-1A and certified by the employer or certifying agency as applicable. Other
equivalent qualification documents may be used when specified on the contract or purchase order. The applicable revision shall
be the latest unless otherwise specified in the contractual agreement between parties.
5.1.2 If specified in the contractual agreement, NDT agencies shall be qualified and evaluated as described in Specification E543.
The applicable edition of Specification E543 shall be specified in the contract.
6. Apparatus
6.1 Phantom Types and Selection—The tests performed herein may be completed either by the use of a Type 1 or Type 2 Duplex
Plate Phantom with separate IQIs (See Fig. 1 and Fig. 2), or with a Five-Groove Wedge Phantom (See Fig. 5 and Fig. 6). The
phantoms are available for purchase or may be manufactured by the user. A Phantom Record shall be generated providing
individual phantom identification, basic material type, basic dimensional data, and traceability to records of any IQIs used with
the phantom. Certification of material alloy or dimensions is not required for duplex plate phantoms.
6.2 Phantom Materials—The phantoms may be manufactured from any material group; however, Aluminum is recommended for light metal applications (material group 02 of equal or lower atomic number and density as listed in Practice E1025) and Stainless Steel is recommended for more dense material applications (material group 1 of equal or higher atomic number and density as
listed in Practice E1025). It is not necessary to make use of other materials that more closely represent a given product being
evaluated. If a facility evaluates materials from more than one material group, a phantom from only one material group needs to
be processed. Radiographically homogeneous material alloys are preferred. 7022 Aluminum and 316L Stainless Steel are strongly
recommended. Other materials may be used when approved by the CEO. Materials displaying grain structure mottling or visible scatter artifacts reduce the ability to effectively measure DDA system performance and variability. When required, the selected
material shall be agreed upon between the user and CEO. Previously established baseline test materials are not required to be
modified to align with the above material recommendations.
7. General Procedures Applied to All Phantom Types
7.1 DDA Correction Method—As part of the baseline testing, the DDA offset and gain corrections shall be acquired in accordance with the manufacturer’s recommendation, using a typical process as applied during production product evaluations. These same correction procedures shall be used at normal production intervals throughout the periodic testing of the in-service DDA. Additionally, the DDA corrections shall be re-acquired when the periodic test results fall out of the established control limits. Reference Annex A1.
7.1.1 Bad Pixel Standardization for DDAs—Baseline images shall also be corrected for bad pixels as would be done in production using routine bad pixel correction procedures. A standardized nomenclature is presented in Practice E2597/E2597M. The identification and correction of bad pixels in a DDA shall be as agreed upon between the user and the CEO. The user and/or CEO may refer to Guide E2736 and Practice E2597/E2597M, as well as consult with the manufacturer on how the prevalence of these anomalous pixels might impact a specific application. The threshold levels used to identify bad pixels shall be recorded in the test report in full or in reference. The bad pixel data shall be presented as an image or as a report containing specific parameters for bad pixels, cluster kernel pixels, relevant clusters, non-relevant clusters, and lines.
TABLE 1 System Performance Tests and Process Check of the DDA System using the DUPLEX PLATE

| System Performance Test | Parameter | Baseline (F) | Software Update | Tube Change | Detector Repair | Test Intervals (A) | Control Limits Method (C) |
| Basic Spatial Resolution (Detector) (E) | iSR_b^detector | x | x | x | x | 6 Months | ±3 Sigma |
| Basic Spatial Resolution (Image) | iSR_b^image | x | x | x | x | 10 Business Days or Before Use | ±3 Sigma |
| Contrast Sensitivity in 4T hole | CS_4T | x | x | x | x | 10 Business Days or Before Use | ±3 Sigma |
| Signal to Noise Ratio | SNR | x | x | x | x | 10 Business Days or Before Use | ±3 Sigma |
| Signal Level | SL | x | x | x | x | 10 Business Days or Before Use | ±3 Sigma |
| Offset Level (B) | OL | x | x | | x | 10 Business Days or Before Use | |
| Bad Pixel Distribution in accordance with E2597/E2597M | | x | x | x | x | 3 Months | As agreed upon |
| Bad Pixel Distribution Secondary Evaluation (7.3) (D) | | x | | | | Daily or Before Use | |

(A) Test Intervals: Unless other intervals are defined and agreed upon by user and CEO.
(B) Offset Level: A recorded mean pixel value of a standard offset correction fulfills this requirement (7.2). The baseline for the offset level measurement can be a single measurement; it is not required to collect 30 days of test data.
(C) Control Limits Method: See 10.2.3, Control Limits Values. It is understood that one method of control limits being used is ±20 %. Some industries are now transitioning to ±3 Sigma. Either is acceptable unless otherwise specified by the CEO.
(D) Secondary Evaluation for Bad Pixels: Example—One method of performing this evaluation would be a simple visual screening for bad pixels during normal viewing of a production image.
(E) Basic Spatial Resolution (Detector): The baseline for this measurement can be a single measurement; it is not required to collect 30 days of test data.
(F) Baseline: See Section 10 for Application of Baseline Tests and Test Methods.
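As an illustrative aid only, the two control-limit conventions used with these tests (±3 Sigma around the baseline, or the older ±20 % around a reference value) can be sketched as follows; the 30 baseline SNR values are invented for the example and carry no normative meaning:

```python
import statistics

def control_limits_3sigma(baseline):
    """Lower/upper control limits as mean ± 3 sample standard deviations."""
    mean = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return mean - 3 * sigma, mean + 3 * sigma

def control_limits_percent(reference, pct=20.0):
    """Lower/upper control limits as reference ± pct percent."""
    delta = reference * pct / 100.0
    return reference - delta, reference + delta

# Hypothetical baseline: 30 daily SNR measurements near 250
snr_baseline = [250, 248, 252, 249, 251, 247, 253, 250, 249, 251,
                250, 252, 248, 250, 249, 251, 250, 252, 248, 250,
                249, 251, 250, 247, 253, 250, 249, 251, 250, 252]
lo, hi = control_limits_3sigma(snr_baseline)
print(f"3-sigma limits: {lo:.1f} .. {hi:.1f}")
print("20 % limits:", control_limits_percent(250))  # (200.0, 300.0)
```

A periodic test value falling outside the chosen limits would trigger the corrective actions agreed between the user and CEO.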
7.2 Procedure for Measurement of the Offset Level—Before measurement of the Offset Level, the DDA should be powered-on and not exposed for approximately ten minutes. One image with 30 s acquisition time (for example, 1 s frames and averaging all 30 frames) shall be captured without radiation (Offset Image). Bad Pixel Correction is active; no gain or offset correction shall be done. The Offset Level is the mean pixel value of the approximate central 90 % of area in the offset image. An ROI of greater than 90 % may be used providing consideration is made for defective or underperforming pixels in the border of the detector.
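For illustration only, the central-90 % mean described above might be computed as in the following NumPy sketch; the image shape and values are assumptions, and the rounding of the ROI border is one possible implementation choice:

```python
import numpy as np

def offset_level(offset_image: np.ndarray, fraction: float = 0.90) -> float:
    """Mean pixel value of a centered ROI covering `fraction` of the image area."""
    rows, cols = offset_image.shape
    # A centered ROI with `fraction` of the area has side lengths scaled
    # by sqrt(fraction) in each direction.
    scale = fraction ** 0.5
    r0 = int(round(rows * (1 - scale) / 2))
    c0 = int(round(cols * (1 - scale) / 2))
    roi = offset_image[r0:rows - r0, c0:cols - c0]
    return float(roi.mean())

# Hypothetical 1000 x 1000 offset image with a constant value of 100
image = np.full((1000, 1000), 100.0)
print(offset_level(image))  # 100.0
```

Tracking this value over time would reveal a drifting offset before it degrades production images.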
7.3 Procedure for Evaluation of Bad Pixels—The baseline and performance monitoring evaluation for bad pixels shall be performed in accordance with Practice E2597/E2597M. The time interval depends on the degree of usage of the system
TABLE 2 Content to be Included Within Test Report when using the DUPLEX PLATE

| Measurement Location | Tests | Control Limits Min | Control Limits Max | Test Value | Results Pass/Fail |
| DP-Thin - Position 1 | Basic Spatial Resolution (iSR_b^image) | | | | |
| DP-Thin - Position 2 | Basic Spatial Resolution (iSR_b^image) | | | | |
| DP-Thin | Contrast Sensitivity in 4T hole (CS_4T) [%] | | | | |
| DP-Thick | Contrast Sensitivity in 4T hole (CS_4T) [%] | | | | |
| DP-Thin | Signal to Noise Ratio | | | | |
| DP-Thick | Signal to Noise Ratio | | | | |
| DP-Thick | Signal Level | | | | |
| DP-Thin | Signal Level | | | | |
| N/A | Offset Level | | | | |
| Face of Detector | Basic Spatial Resolution (iSR_b^detector) | | | | |
| N/A | Bad Pixel Distribution | | | | |
and shall be defined unless otherwise agreed upon by the user and CEO. The frequency of evaluation shall be agreed upon by the user and CEO, with consideration of the DDA system manufacturer’s information. The documentation of bad pixels shall be performed by evaluating an acquired image for any individual nonconforming pixels, clusters, or lines that display a pixel intensity value that is outside of tolerance compared to the mean surrounding pixels. This can be completed by selecting one of several secondary evaluation methods: visual examination, ASTM procedure, or the manufacturer’s recommended procedure. An example of a Secondary Evaluation for Bad Pixels would be a simple visual screening for bad pixels during normal viewing of a production image. Newly identified bad pixels shall be added to an existing bad pixel map, or a completely new map may be utilized. Any relevant cluster or line shall be clearly noted and added to the bad pixel map, as non-correctable pixels could hide relevant indications. The location of correctable and non-correctable bad pixels shall be documented. In addition, a report may contain the number of bad pixels, cluster kernel pixels, total clusters, relevant clusters, non-relevant clusters, and lines.
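As a non-normative illustration of screening an image for pixels whose intensity is outside of tolerance compared to the mean of their surrounding pixels, a simple neighborhood comparison could look like the sketch below; the 20 % tolerance is an assumed value, since actual acceptance levels are set by agreement between the user and CEO:

```python
import numpy as np

def flag_bad_pixels(image: np.ndarray, tolerance: float = 0.20) -> np.ndarray:
    """Boolean map of pixels deviating from the mean of their 8 neighbors
    by more than `tolerance` (fractional deviation); tolerance is assumed."""
    padded = np.pad(image.astype(float), 1, mode="edge")
    h, w = image.shape
    # Sum of the 3x3 neighborhood minus the center gives the 8-neighbor sum.
    neighbor_sum = sum(
        padded[r:r + h, c:c + w] for r in range(3) for c in range(3)
    ) - image
    neighbor_mean = neighbor_sum / 8.0
    with np.errstate(divide="ignore", invalid="ignore"):
        deviation = np.abs(image - neighbor_mean) / neighbor_mean
    return deviation > tolerance

# Hypothetical flat-field frame with one dead pixel
frame = np.full((5, 5), 1000.0)
frame[2, 2] = 0.0
print(np.argwhere(flag_bad_pixels(frame)))  # [[2 2]]
```

Pixels flagged this way would then be added to the bad pixel map and classified (correctable, cluster kernel, line, and so on) per the agreed procedure.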
TABLE 3 System Performance Tests and Process Check of the DDA System using the FIVE-GROOVE WEDGE

| System Performance Test | Parameter | Baseline (G) | Software Update | Tube Change | Detector Repair | Test Intervals (A) | Control Limits Method (C) |
| Basic Spatial Resolution (Detector) (F) | iSR_b^detector | x | x | x | x | 6 Months | ±3 Sigma, or ±20 percent |
| Groove Sensitivity Level (E) | GSL | x | x | | x | 10 Business Days or Before Use | ±3 Sigma, or ±20 percent |
| Contrast Sensitivity (groove) | CS | x | x | x | x | 10 Business Days or Before Use | ±3 Sigma, or ±20 percent |
| Signal to Noise Ratio | SNR | x | x | x | x | 10 Business Days or Before Use | ±3 Sigma, or ±20 percent |
| Signal Level | SL | x | x | x | x | 10 Business Days or Before Use | ±3 Sigma, or ±20 percent |
| Material Thickness Range | MTR | x | x | x | x | 10 Business Days or Before Use | ±3 Sigma, or ±20 percent |
| Offset Level (B) | OL | x | x | | x | 10 Business Days or Before Use | +50 percent |
| Bad Pixel Distribution Per E2597/E2597M | | x | x | x | x | 3 Months | As agreed upon by user and CEO |
| Bad Pixel Distribution Secondary Evaluation (7.3) (D) | | x | | | | Daily or Before Use | |

(A) Test Intervals: Unless other intervals are defined and agreed upon by user and CEO.
(B) Offset Level: A recorded mean pixel value of a standard offset correction fulfills this requirement (7.2). The baseline for the offset level measurement can be a single measurement; it is not required to collect 30 days of test data.
(C) Control Limits Method: See 15.2.3, Control Limits Values. It is understood that one previously applied method of establishing control limits was ±20 %. Some industries are now transitioning to ±3 Sigma. Either is acceptable unless otherwise specified by the CEO.
(D) Secondary Evaluation for Bad Pixels: Example—One method of performing this evaluation would be a simple visual screening for bad pixels during normal viewing of a production image.
(E) Groove Sensitivity Level: The smallest visible groove shall be taken as the Groove Sensitivity Level.
(F) Basic Spatial Resolution (Detector): The baseline for this measurement can be a single measurement; it is not required to collect 30 days of test data.
(G) Baseline: See Section 15 for Application of Baseline Tests and Test Methods.
7.4 Technique Parameters—The various tests shall be completed using documented baseline technique parameters. It is not
required that these technique parameters represent conditions used in production. Both the Detector and X-ray Source may degrade
over time and impact image quality, therefore at a minimum, the following parameters shall be recorded and used in acquiring the
baseline images as well as the long-term stability data. These technique parameters shall be recorded as part of the baseline and
ongoing tests reports in full or in reference.
7.4.1 X-ray System Identification:
7.4.1.1 Detector Model Number and Serial Number.
7.4.1.2 X-ray Tube Model Number and Serial Number.
7.4.2 X-ray Tube Settings/Configuration:
7.4.2.1 X-ray tube voltage (kV).
7.4.2.2 Tube current (mA).
7.4.2.3 Focal spot size. (As measured according to Test Methods E1165 or E2903, or another standard. The recorded focal spot may be taken from the manufacturer’s documentation.)
7.4.3 X-ray Tube/Detector – Beam Filtration:
7.4.3.1 Material Type.
7.4.3.2 Material Thickness.
7.4.4 Beam Collimation:
7.4.4.1 Collimation Location (Tube/Detector/Part).
7.4.4.2 Blade Positioning or Collimation Opening Values.
7.4.4.3 Collimation Material.
7.4.5 Geometry:
7.4.5.1 Source to Detector Distance (SDD).
7.4.5.2 Object to Detector Distance (ODD) or Source to Object Distance (SOD).
7.4.6 Detector Settings:
7.4.6.1 Detector Gain Setting.
7.4.6.2 Binning Mode.
7.4.6.3 Orientation (Landscape/Portrait/N/A).
7.4.7 Exposure Time Per Image:
7.4.7.1 Frame Rate or Integration Time.
7.4.7.2 Frame Averaging.
7.4.7.3 Total Exposure Time.
7.4.8 Detector Corrections (correction and bad pixel substitution). Detector correction technique parameters may be recorded on
a separate technique:
7.4.8.1 Frame rate.
7.4.8.2 Number of frames averaged.
7.4.8.3 Number of Gain corrections (including each Gain’s Approximate Mean Pixel Intensity Value or Percent of Saturation Pixel
Value).
7.4.9 Image Acquisition Software and Image Processing:
7.4.9.1 Software Revision.
7.5 Technique Energy Selection—The energy used shall be appropriate for the Phantom material and thickness range to provide
the required image quality in imaging the selected phantom and associated IQIs.
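The parameter set of 7.4 maps naturally onto a structured record; the sketch below shows one hypothetical way a user might capture it for the baseline report (all field names and values are illustrative, not prescribed by this practice):

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class TechniqueParameters:
    """Baseline technique parameters per 7.4 (field names and values are illustrative)."""
    detector_model_serial: str
    tube_model_serial: str
    tube_voltage_kv: float
    tube_current_ma: float
    focal_spot_mm: float
    filtration_material: str
    filtration_thickness_mm: float
    sdd_mm: float
    odd_mm: float
    detector_gain: str
    binning_mode: str
    frame_rate_hz: float
    frames_averaged: int
    total_exposure_time_ms: float
    software_revision: str

baseline = TechniqueParameters(
    detector_model_serial="DDA-1234 / SN 0042",  # hypothetical
    tube_model_serial="XT-225 / SN 0099",        # hypothetical
    tube_voltage_kv=160.0,
    tube_current_ma=2.0,
    focal_spot_mm=0.4,
    filtration_material="Cu",
    filtration_thickness_mm=1.0,
    sdd_mm=1000.0,
    odd_mm=50.0,
    detector_gain="high",
    binning_mode="1x1",
    frame_rate_hz=1.0,
    frames_averaged=30,
    total_exposure_time_ms=30000.0,
    software_revision="2.1.0",
)
print(json.dumps(asdict(baseline), indent=2))
```

Storing the record with each periodic test makes it straightforward to verify that long-term stability data were acquired with the same technique as the baseline.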
8. General Tests Required for all Phantom Types
8.1 User Tests for Baseline and Long-Term Stability—Quality assurance requires periodic tests of the DDA system to ensure the
proper performance of the system. The time interval depends on the degree of usage of the system and shall be defined by the user
with consideration of the DDA system manufacturer’s information. If no time intervals are established by the contracting parties,
the intervals for the performance checks shall be as defined within Table 1.
8.1.1 Offset level Test—Degradation of the DDA may reduce the system sensitivity after extensive usage. For this reason, the DDA
system shall be checked for increasing offset value. The Offset value is the mean DDA response with no DDA corrections and
without radiation. Offset values can be influenced by temperature, therefore, where operational temperatures vary, it is important
to understand the impact on offset measurement values.
8.1.2 Bad Pixel Distribution—Newly identified bad pixels shall be added to the Bad Pixel Map. Any relevant cluster shall be
clearly noted and added to the Bad Pixel Map.
8.1.3 Image Lag and Burn-In (Nonmandatory)—The tests for Image Lag and Burn-In are typically performed by detector manufacturers for a given model number detector. These tests shall be performed only if required by the CEO. If required, the tests shall be performed in accordance with Practice E2597/E2597M at a frequency defined by the CEO.
9. Duplex Plate Phantom Requirements
9.1 Duplex Plate Phantom Configuration:
9.1.1 Phantom Material Thicknesses/Dimensions—Either Type 1 or Type 2 phantom shall be made of a single material type consisting of two thicknesses (see Fig. 1 and Fig. 2). The thickness of step 2 shall be a minimum of 2 times the thickness of step 1.
9.1.1.1 Other thicknesses may be used if agreed between user and CEO. It is not required that the phantom thicknesses represent
the thickness range of product being evaluated. It is recommended that the outside dimensions of the phantom be sufficient in size
that when imaged, the phantom is projected across the entire detector for the geometric magnification used. The phantom may be
manufactured by overlapping two or more plates or by manufacturing from a single plate of material. An advantage of Type 2
Phantom is that it can be manufactured with the thinner plate being much larger than the thick plate, thereby reducing the
phantom’s weight. This is especially true when dealing with phantoms for large DDAs and lower geometric magnification imaging
techniques. The thicker plate shall be large enough in width and length to provide clear separation from the IQI and the plate edge
to reduce influence from edge gradient within the image.
9.1.2 Image Quality Indicators—IQIs used in combination with the Duplex Phantom provide an initial evaluation of the quality of a DDA system as well as a method for monitoring system performance. Practice E2002 duplex wire IQIs along with Practice E1025 or Practice E1742/E1742M hole-type IQIs shall be used along with the Duplex Plate Phantom.
9.1.2.1 IQI Selection—Each hole type IQI shall be 2 % of the specific plate thickness unless otherwise agreed upon between the user and the CEO. The Practice E2002 Duplex Gauge model shall be selected based on the spatial resolution range common to the majority of techniques being used.
9.1.2.2 IQI Placement—The hole-type IQIs shall be placed on either side of the phantom. Placing the IQIs on the flat side will provide the same geometric conditions for all IQIs. The hole type IQIs shall be placed on an area of the plate corresponding to the IQI’s thickness (Fig. 1 and Fig. 2). A minimum of one Practice E2002 Duplex Wire IQI shall be placed on the thinner area as shown in Fig. 1 and Fig. 2 with an angle of 2° to 5° to the DDA pixel matrix directions. If required by the CEO, the spatial system resolution shall be measured in two perpendicular directions. This requirement may be met by either use of two Practice E2002 IQIs or by use of a single IQI providing images from two separate positions are acquired during the test. Other IQI placement may be used if agreed upon between user and CEO.
9.2 Duplex Phantom Tests:
9.2.1 The tests listed in 9.2.2 – 9.2.6 and Table 1 shall be performed with the selected duplex phantom and corresponding IQIs.
The tests shall be performed initially to establish the baseline and at specified intervals.
9.2.2 Basic Spatial Resolution – Detector (iSR_b^detector)—In accordance with Practice E2002 method using the duplex-wire gage at the detector.
9.2.3 Basic Spatial Resolution – Image (iSR_b^image)—In accordance with Practice E2002 method using the duplex-wire gage and duplex plate phantom positioned at a specific exposure geometry.
9.2.4 Contrast Sensitivity – In 4T Hole—Measured on the two separate IQIs located on the thin and thick step of the duplex plate
phantom.
9.2.5 Signal to Noise Ratio—Measured on the thin and thick step of the duplex plate phantom.
9.2.6 Signal Level—Measured on the thin and thick step of the duplex plate phantom.
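For orientation only, the kinds of quantities measured in 9.2.4 – 9.2.6 can be sketched using the terminology of 3.3; these are plausible illustrative forms (SNR as mean over standard deviation, contrast sensitivity as the relative signal difference between the area beside the IQI and the hole interior), and the normative equations given elsewhere in this practice shall be used for actual reporting:

```python
import statistics

def snr(roi_values):
    """Signal-to-noise ratio of an ROI: mean pixel value / standard deviation."""
    return statistics.mean(roi_values) / statistics.stdev(roi_values)

def contrast_sensitivity_4t(pv_mean_beside, pv_median_hole):
    """Contrast sensitivity [%] from the 4T hole, as the relative difference
    between PV_mean[beside squares] and PV_median[hole] (illustrative form)."""
    return abs(pv_mean_beside - pv_median_hole) / pv_mean_beside * 100.0

# Hypothetical ROI statistics from a duplex plate image
print(contrast_sensitivity_4t(pv_mean_beside=20000, pv_median_hole=19600))  # 2.0
```

Each quantity would be evaluated separately on the thin and thick steps, as required by 9.2.4 – 9.2.6.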
9.3 Duplex Plate Phantom Exposure Procedures:
9.3.1 For tracking performance of a DDA, all of the same technique parameters which were used to establish the system baseline
shall be used during long-term stability or process checking.
9.3.2 Exposure of the Duplex Plate Phantom Assembly—The side with the IQIs shall face the radiation source and an image shall
be acquired with the technique parameters outlined in 7.4 recorded along with the energy setting as described in 7.5. When
establishing the Phantom Technique, it is recommended that the projected image of the phantom cover the entire active area of the
detector. This helps to avoid a substantial gradient in the image which can negatively influence the ability to provide quality
contrast sensitivity measurements. In the event the projected image of the phantom does not cover the entire detector active area,
it is acceptable to use masking
...







