Human-centered Steel Bridge Inspection enabled by Augmented Reality and Artificial Intelligence

General Information
Solicitation Number: 1597
Status: Solicitation posted
Date Posted: Apr 10, 2023
Last Updated: Apr 02, 2024
Solicitation Expires: Jun 10, 2024
Partners: KS, NC, TX
Lead Organization: Kansas Department of Transportation
Financial Summary
Suggested Contribution:
Commitment Start Year: 2024
Commitment End Year: 2027
100% SP&R Approval: Not Requested
Commitments Required: $600,000.00
Commitments Received: $360,000.00
Contact Information
Lead Study Contact(s): David Behzadpour
David.Behzadpour@ks.gov
FHWA Technical Liaison(s): Hoda Azari
hoda.azari@dot.gov
Phone: 202-493-3064
Study Champion(s): Mark Hurt
Mark.Hurt@ks.gov
Phone: 785-296-8905
Commitments by Organizations
Organization | Year | Commitments | Technical Contact Name | Funding Contact Name | Contact Number | Email Address
Kansas Department of Transportation | 2024 | $40,000.00 | Mark Hurt | David Behzadpour | 785-291-3847 | David.Behzadpour@ks.gov
Kansas Department of Transportation | 2025 | $40,000.00 | Mark Hurt | David Behzadpour | 785-291-3847 | David.Behzadpour@ks.gov
Kansas Department of Transportation | 2026 | $40,000.00 | Mark Hurt | David Behzadpour | 785-291-3847 | David.Behzadpour@ks.gov
North Carolina Department of Transportation | 2024 | $40,000.00 | David Snoke | Curtis Bradley | 919-707-6664 | cbradley8@ncdot.gov
North Carolina Department of Transportation | 2025 | $40,000.00 | David Snoke | Curtis Bradley | 919-707-6664 | cbradley8@ncdot.gov
North Carolina Department of Transportation | 2026 | $40,000.00 | David Snoke | Curtis Bradley | 919-707-6664 | cbradley8@ncdot.gov
Texas Department of Transportation | 2024 | $40,000.00 | Justin Wilson | Ned Mattila | 512-416-4727 | ned.mattila@txdot.gov
Texas Department of Transportation | 2025 | $40,000.00 | Justin Wilson | Ned Mattila | 512-416-4727 | ned.mattila@txdot.gov
Texas Department of Transportation | 2026 | $40,000.00 | Justin Wilson | Ned Mattila | 512-416-4727 | ned.mattila@txdot.gov

Background

State DOTs currently rely on trained inspectors to visually inspect bridge components for structural deterioration and damage, an approach that can be limited in accuracy, speed, repeatability, and reliability. Computer vision (CV), on the other hand, can see what human eyes cannot, and artificial intelligence (AI) techniques such as deep learning have shown a tremendous ability to conceptualize and generalize. By integrating CV and Augmented Reality (AR), a recent NCHRP Highway IDEA project (Li et al., 2022) completed by this project team successfully demonstrated how a human-centered AR environment and automated CV algorithms can empower bridge inspectors to perform more accurate and efficient field inspections of steel bridges for fatigue cracks.

As illustrated in Figure 1, an inspector wearing an AR headset (Microsoft HoloLens 2) examines the steel bridge and records a short video of the target structural surface through the headset. The video is automatically uploaded to a server, where a computer vision algorithm analyzes it by detecting and tracking surface motion through feature points (pink dots in the upper-right panel of the figure). These feature points are then projected in near real time in front of the inspector's eyes as holograms through the AR headset, and the inspector can interact with the hologram through a virtual menu to examine the results under different crack detection threshold values, enabling human-in-the-loop decision-making.
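
As a concrete illustration of the video analysis step, the sketch below shows how feature-point motion could be tracked with Shi-Tomasi corner detection and pyramidal Lucas-Kanade optical flow in OpenCV, and how nearby point pairs with large relative motion could be flagged as candidate crack locations. This is a minimal, hypothetical prototype: the library choice, point-pairing rule, and threshold value are assumptions made for illustration and do not reproduce the IDEA project's actual algorithm.

```python
# Minimal sketch of video-based crack screening via feature-point motion,
# inspired by the workflow described above. Assumes OpenCV; the point
# selection, tracking, and thresholding choices here are illustrative only.
import cv2
import numpy as np

def track_surface_motion(video_path, max_points=400, motion_threshold=2.0):
    """Track feature points through a short video and flag point pairs whose
    relative motion exceeds a threshold (a possible crack opening/closing)."""
    cap = cv2.VideoCapture(video_path)
    ok, first = cap.read()
    if not ok:
        raise IOError(f"Cannot read {video_path}")
    prev_gray = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)

    # Detect feature points on the structural surface (the pink dots in Figure 1).
    pts0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=max_points,
                                   qualityLevel=0.01, minDistance=7)
    pts_prev = pts0.copy()
    total_disp = np.zeros((len(pts0), 2), dtype=np.float32)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Track points frame-to-frame with pyramidal Lucas-Kanade optical flow.
        pts_next, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts_prev, None)
        moved = (pts_next - pts_prev).reshape(-1, 2)
        moved[status.ravel() == 0] = 0.0   # ignore points that were lost
        total_disp += moved
        prev_gray, pts_prev = gray, pts_next
    cap.release()

    # Flag nearby point pairs with large relative displacement: differential
    # motion across a small gap is a candidate indicator of a breathing crack.
    coords = pts0.reshape(-1, 2)
    suspects = []
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            if np.linalg.norm(coords[i] - coords[j]) < 20:  # nearby points only
                rel = np.linalg.norm(total_disp[i] - total_disp[j])
                if rel > motion_threshold:
                    suspects.append((i, j, float(rel)))
    return coords, total_disp, suspects
```

In the AR workflow described above, the flagged point pairs would then be rendered as holograms for the inspector to accept, reject, or re-examine under a different threshold.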

The NCHRP Highway IDEA project successfully demonstrated the concept of human-centered bridge inspection by integrating CV and AR on an AR headset as the hardware platform. However, further development is needed before this tool can be adopted in practical bridge inspections. In addition, the idea of human-centered bridge inspection would have a broader impact if realized on a wider range of mobile platforms, such as tablet devices. The goal of this proposed pooled fund study is to develop a full-fledged AR-based bridge inspection tool that leverages CV and AI to support field detection, quantification, and documentation of various types of damage and deterioration in steel bridges.

Objectives

The main objective of this proposed research is to provide state DOTs with practical tools that support human-centered steel bridge inspection with real-time defect detection (e.g., fatigue cracks and corrosion), documentation, tracking, and decision-making. The proposed research will not only bridge the gaps identified in the IDEA project but also expand the existing capability by developing AI algorithms for crack and corrosion detection. In addition to AR headsets, the project will develop AR-based inspection capability for tablet devices. A tablet can be used to perform AR-based inspection directly, in a similar way to the AR headset. It can also be paired with Unmanned Aerial Vehicles (UAVs) for remote image and video acquisition during inspections, enabling bridge inspections from a distance in a human-centered manner, as illustrated in Figure 2.

Scope of Work

The scope of work includes three main tasks: development of CV and AI algorithms for steel fatigue crack and corrosion detection and quantification (Task 1); design and development of AR-based software to facilitate human-centered damage detection, visualization, documentation, tracking, and decision-making (Task 2); and laboratory and field implementation, testing, and evaluation (Task 3).

Task 1: CV and AI algorithms for crack and corrosion inspection

Two types of algorithms will be included in the AR inspection tool. The first is based on video analysis and will improve upon the NCHRP IDEA product in terms of accuracy and sensitivity. In addition, this research will include image-based deep learning algorithms for classification, detection, and segmentation of cracks and corrosion, as illustrated in Figure 3 for the case of crack identification, using images taken by the AR headset, tablet, or UAV. Emphasis will be placed on minimizing the complexity of the deep learning model to reduce computation, with the goal of enabling real-time image processing and damage inference during practical inspections. With the two methods available, the inspector can first use the image-based deep learning method to identify and segment regions where cracks and corrosion may exist, and then apply the video-based algorithm to further examine the crack region for a refined result.
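
For illustration, a lightweight image-based segmentation model of the kind described above could be prototyped as follows, assuming PyTorch and torchvision with a MobileNetV3 backbone and LR-ASPP segmentation head. The class labels, input size, and backbone are placeholder assumptions; the project's final architecture and training data will be determined in Task 1.

```python
# Illustrative sketch of a lightweight image-based segmentation model for
# crack/corrosion pixels, assuming PyTorch and torchvision. The backbone,
# class labels, and input size are placeholders, not the project's final design.
import torch
from torchvision.models.segmentation import lraspp_mobilenet_v3_large

NUM_CLASSES = 3  # hypothetical labels: 0 = background, 1 = crack, 2 = corrosion

# MobileNetV3 + LR-ASPP is a small segmentation head suited to near real-time use.
model = lraspp_mobilenet_v3_large(weights=None, num_classes=NUM_CLASSES)
model.eval()

@torch.no_grad()
def segment(image_tensor):
    """image_tensor: float tensor of shape (3, H, W), values in [0, 1].
    Returns an (H, W) label map with one class index per pixel."""
    logits = model(image_tensor.unsqueeze(0))["out"]   # (1, NUM_CLASSES, H, W)
    return logits.argmax(dim=1).squeeze(0)             # (H, W)

# Example: run on a dummy frame the size of a typical headset/tablet image.
label_map = segment(torch.rand(3, 480, 640))
crack_mask = (label_map == 1)       # candidate crack pixels for video-based refinement
corrosion_mask = (label_map == 2)   # candidate corrosion pixels
```

Keeping the backbone small is what makes near real-time inference on headset, tablet, or UAV imagery plausible; a heavier architecture could be substituted if accuracy requirements outweigh latency.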

Task 2: AR-based software for human-centered bridge inspection

This task will develop an AR-based software environment and user interface to enable human-in-the-loop decision-making during field inspections. A process will be developed to convert damage detection results into holograms and deploy them in the 3D real-world environment, accurately anchored onto the structural surface. A cloud database will be created to store inspection results. This capability is key to documentation, allowing bridge damage to be compared and tracked in space and time. Building upon the user interface developed in the NCHRP Highway IDEA project, a more comprehensive virtual menu will be created to provide a smooth and user-friendly interface for human-centered bridge inspection. In addition, the software for the AR headset will be adapted to enable AR-based inspection using a tablet device. When a UAV is used to facilitate bridge inspection from a distance, the tablet device will receive the damage detection results so the inspector can perform human-centered documentation and decision-making, as illustrated in Figure 2.
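
To make the documentation idea concrete, the following sketch shows one possible shape of a cloud database record that ties a confirmed detection to an AR spatial anchor, a bridge component, and a timestamp so that damage can be compared across inspection visits. The field names, identifiers, and serialization are illustrative assumptions only, not the project's final schema or cloud API.

```python
# Hypothetical sketch of the kind of record a cloud inspection database could
# store so that damage can be compared and tracked in space and time. Field
# names and the storage backend are illustrative assumptions, not a final schema.
import json
import uuid
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class DefectRecord:
    bridge_id: str                 # asset identifier from the state inventory
    component: str                 # e.g., "girder G3, web gap near diaphragm"
    defect_type: str               # "fatigue_crack" or "corrosion"
    anchor_id: str                 # AR spatial anchor tying the hologram to the surface
    polygon_px: list               # detected region outline in image coordinates
    detection_threshold: float     # threshold the inspector accepted in the AR menu
    inspector: str
    captured_at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())
    record_id: str = field(default_factory=lambda: str(uuid.uuid4()))

def to_cloud_payload(record: DefectRecord) -> str:
    """Serialize a record for upload; the actual cloud service is to be selected."""
    return json.dumps(asdict(record))

# Example: document one crack confirmed by the inspector in the AR interface.
rec = DefectRecord(
    bridge_id="KS-000123", component="girder G3 web gap",
    defect_type="fatigue_crack", anchor_id="anchor-7f2c",
    polygon_px=[[412, 288], [431, 290], [430, 305]], detection_threshold=0.35,
    inspector="inspector-01",
)
print(to_cloud_payload(rec))
```

Because each record carries both a spatial anchor and a timestamp, a later inspection can query the same anchor and overlay past and present detections, supporting the tracking of damage in space and time described above.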

Task 3: Laboratory and field testing

The developed AR software and AI algorithms will be tested extensively in both laboratory and field settings. A large-scale girder bridge subassemblage with realistic fatigue and corrosion damage will be established in the structural testing laboratory at the University of Kansas for testing the developed AR inspection tools. In addition, several bridges in the inventory of KDOT and other participating member states will be selected for field testing and validation. The team will work closely with the KDOT inspection crew to ensure the tools are relevant and address practical challenges.

This project will result in user-friendly AR software packages, empowered by AI algorithms for automated damage detection, that bridge inspectors in participating member states can readily adopt to perform AI- and AR-assisted bridge inspections using both AR headsets and tablet devices. In addition, quarterly reports and a final report will be delivered in MS Word format. The team will hold quarterly online progress meetings with participating parties during the project and plans to hold one in-person mid-project participant meeting in Year 3. The team will also disseminate the findings and results of this research through journal and conference publications.

Comments

     

·       Funding requested: $40,000 per year per participating state for 3 years.

·       5 states (total budget: $600,000)

Please see the figures in the complete proposal enclosed as an attachment:

Figure 1: Human-centered fatigue crack inspection tool developed under NCHRP IDEA 223

Figure 2: Human-centered bridge inspection enabled by integrating AI, AR, and UAV

Figure 3: Classification, detection, and segmentation of cracks using deep learning

Documents Attached
Title | File/Link | Type | Privacy
Laboratory demonstration using a large-scale steel bridge girder specimen | TPF 1597 Lab Demo.mp4 | Other | Public
Human-centered Steel Bridge Inspection enabled by Augmented Reality and Artificial Intelligence | Proposal.pdf | Work Plan | Public
