Texture Analysis to Enhance Drone-Based Multi-Modal Inspection of Structures

  1. Nooralishahi, Parham
  2. Ramos, Gabriel
  3. Pozzer, Sandra
  4. Ibarra-Castanedo, Clemente
  5. Lopez, Fernando
  6. Maldague, Xavier P. V.
  7. González Aguilera, Diego 1
  8. González Jorge, Higinio
  1 Universidad de Salamanca, Salamanca, Spain
    ROR https://ror.org/02f40zc51

Journal:
Drones

ISSN: 2504-446X

Year of publication: 2022

Volume: 6

Issue: 12

Pages: 407

Type: Article

DOI: 10.3390/DRONES6120407 (open access at the publisher)

Abstract

Drone-based multi-modal inspection of industrial structures is a relatively new field of research that is gaining interest among companies. Multi-modal inspection can significantly enhance data analysis and provide a more accurate assessment of a component's operability and structural integrity; it helps avoid data misinterpretation and delivers a more comprehensive evaluation, one of the objectives of NDT 4.0. This paper investigates the use of coupled thermal and visible images to improve abnormality detection accuracy in drone-based multi-modal inspections. Four use cases are presented, each introducing a novel processing pipeline for enhancing defect detection in a different scenario. The first use case presents a pipeline that uses thermal images to enhance feature visibility in visible images for pavement crack detection. The second use case proposes an abnormality classification method for surface and subsurface defects in piping inspections, combining both modalities with texture segmentation. The third use case introduces a pipeline for road inspection that uses both modalities: a texture segmentation method extracts the pavement regions in the thermal and visible images, and the two modalities are then combined to detect surface and subsurface defects. In the fourth use case, the texture segmentation approach is employed for bridge inspection to extract concrete surfaces in both modalities.
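The enhancement step described in the first use case can be illustrated with a minimal Python sketch. It assumes an already co-registered thermal/visible image pair and uses OpenCV's CLAHE and weighted blending; the file paths, the blending weight alpha, and the preprocessing choices are illustrative assumptions, not the authors' published pipeline.

    import cv2
    import numpy as np

    def enhance_visible_with_thermal(visible_path: str, thermal_path: str,
                                     alpha: float = 0.7) -> np.ndarray:
        """Blend a co-registered thermal image into a visible image to boost
        feature visibility (e.g., pavement cracks). Illustrative sketch only;
        the weights and preprocessing are assumptions, not the paper's method."""
        # Load both modalities as single-channel (grayscale) images.
        vis = cv2.imread(visible_path, cv2.IMREAD_GRAYSCALE)
        thm = cv2.imread(thermal_path, cv2.IMREAD_GRAYSCALE)

        # Bring the thermal image to the visible resolution
        # (geometric registration of the pair is assumed upstream).
        thm = cv2.resize(thm, (vis.shape[1], vis.shape[0]))

        # Contrast-limited adaptive histogram equalization (CLAHE)
        # normalizes local contrast in each modality before fusion.
        clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
        vis_eq = clahe.apply(vis)
        thm_eq = clahe.apply(thm)

        # Weighted fusion: visible texture dominates, while thermal contrast
        # reinforces low-visibility regions (shadowed or low-texture areas).
        return cv2.addWeighted(vis_eq, alpha, thm_eq, 1.0 - alpha, 0)

In the later use cases, a texture segmentation stage isolates the pavement or concrete regions in both modalities before defect detection; a learned segmentation model (for example, an encoder-decoder network) is one plausible way to implement that stage, though the specific method is not detailed in this record.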
