Improving Vehicle Assistance Systems: Evaluation of Augmented Capabilities through Infrared Thermal Camera Integration
DOI: https://doi.org/10.15282/ijame.22.1.2025.20.0937

Keywords: Object detection, Normal camera, Infrared camera, Deep learning, Collision avoidance

Abstract
Nighttime driving is difficult owing to low visibility and glare from oncoming lights. Nighttime accidents are also more severe because drivers struggle to detect obstacles, see clearly, and judge distances. Understanding the causes and dynamics of nighttime accidents is therefore essential for improving road safety and preventing collisions when natural light is limited. This study proposes using an infrared thermal camera to compensate for inadequate light at night, with the ultimate goal of preventing crashes under such conditions. The investigation compared an infrared thermal camera against a normal visual camera to evaluate nighttime detection performance, with on-road testing conducted in Pekan, Pahang. A YOLOv8 deep learning model was integrated with both cameras to detect objects such as cars, motorcycles, and traffic lights. The test results demonstrated how temperature variations can be used to precisely detect objects on different types of roadways. The infrared thermal camera proved effective at detecting traffic lights, motorcycles, and vehicles, with confusion-matrix true detection rates of 0.98 for traffic lights and 0.87 for both motorcycles and vehicles. It also operated in dark conditions at a higher frame rate (64.94 fps) than the regular camera (55.25 fps). These results demonstrate that infrared thermal imaging can enhance object detection capabilities and, hence, nighttime road safety.
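To make the evaluation pipeline concrete, the sketch below shows how YOLOv8 inference can be run on a camera stream while measuring throughput in fps, mirroring the abstract's camera comparison. It is a minimal sketch, not the authors' implementation: it assumes the Ultralytics YOLOv8 Python package and OpenCV, and the weights file "yolov8n.pt" and camera device index 0 are placeholders for the study's trained model and thermal (or normal) camera feed.

```python
# Hedged sketch: YOLOv8 detection on a camera stream with an fps measurement.
# Assumptions (not from the paper): Ultralytics + OpenCV installed;
# "yolov8n.pt" stands in for the study's weights; device 0 stands in
# for the thermal or normal camera.
import time

import cv2
from ultralytics import YOLO

model = YOLO("yolov8n.pt")   # placeholder weights, not the paper's model
cap = cv2.VideoCapture(0)    # 0 = default camera; swap in the thermal feed

frames, t0 = 0, time.perf_counter()
while cap.isOpened() and frames < 300:   # sample a fixed number of frames
    ok, frame = cap.read()
    if not ok:
        break
    # Run detection; results[0].boxes holds class ids, confidences, and boxes
    results = model(frame, verbose=False)
    for box in results[0].boxes:
        cls_name = model.names[int(box.cls)]
        conf = float(box.conf)
        # Filter for the classes the study targets: car, motorcycle, traffic light
        if cls_name in {"car", "motorcycle", "traffic light"} and conf > 0.5:
            print(f"{cls_name}: {conf:.2f}")
    frames += 1

fps = frames / (time.perf_counter() - t0)
print(f"average throughput: {fps:.2f} fps")
cap.release()
```

Running the same loop once per camera yields directly comparable fps figures, which is one plausible way the 64.94 fps versus 55.25 fps comparison could be reproduced.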
License
Copyright (c) 2025 The Author(s)

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.