Image Classification Using Thermal and Visible Sensor Fusion

Authors

R. Avilez Fiedler, F. Bernardo Lovato Eick

DOI:

https://doi.org/10.55972/spectrum.v24i1.396

Keywords:

Remote sensing, Infrared, Pattern recognition

Abstract

Using a camera with a dual sensor (visible and thermal), this work evaluates the change in overall accuracy for four classes of interest when different channel compositions are used in the analyzed images. Two compositions are tested: RGB and RGBI (the RGB composition plus an infrared channel). The results are compared using the k-nearest neighbors (k-NN) and support vector machine (SVM) algorithms. The experimental results indicate that the RGBI composition increases classification accuracy by 9.7% with k-NN and by 1.9% with SVM.
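
The comparison described in the abstract lends itself to a small illustration. The sketch below is not taken from the paper: it contrasts RGB and RGBI feature sets with k-NN and SVM classifiers using scikit-learn, with synthetic per-pixel features standing in for the dual-sensor imagery. The class signatures, noise level, k = 5, the RBF kernel, and the 70/30 split are illustrative assumptions rather than the authors' settings.

# Minimal sketch (not the authors' pipeline): compare overall accuracy of
# k-NN and SVM on per-pixel features built from an RGB composition versus
# an RGBI composition (RGB plus a co-registered thermal/IR channel).
# Synthetic data stands in for the dual-sensor imagery and the four classes.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_per_class, n_classes = 500, 4  # four classes of interest, as in the paper

# Hypothetical class signatures: mean R, G, B and thermal (I) value per class.
means = rng.uniform(0.2, 0.8, size=(n_classes, 4))
X = np.vstack([m + 0.08 * rng.standard_normal((n_per_class, 4)) for m in means])
y = np.repeat(np.arange(n_classes), n_per_class)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

for name, clf in [("k-NN", KNeighborsClassifier(n_neighbors=5)),
                  ("SVM", SVC(kernel="rbf"))]:
    for comp, cols in [("RGB", slice(0, 3)), ("RGBI", slice(0, 4))]:
        clf.fit(X_train[:, cols], y_train)
        acc = accuracy_score(y_test, clf.predict(X_test[:, cols]))
        print(f"{name} / {comp}: overall accuracy = {acc:.3f}")

In this toy setup the fourth (thermal) channel simply adds a discriminative dimension to each pixel's feature vector; the paper measures the analogous effect on real dual-sensor imagery.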

References

Mohd Noor, N., Abdullah, A. and Hashim, M., 2018. Remote sensing UAV/drones and its applications for urban areas: A review, IOP Conference Series: Earth and Environmental Science 2018.

Ludovisi, R., Tauro, F., Salvati, R., Khoury, S., Mugnozza, G.S. and Harfouche, A., 2017. UAV-based thermal imaging for high-throughput field phenotyping of black poplar response to drought. Frontiers in Plant Science, 8.

Henry, C., Poudel, S., Lee, S.-. and Jeong, H., 2020. Automatic detection system of deteriorated PV modules using drone with thermal camera. Applied Sciences (Switzerland), 10(11).

Sambolek, S. and Ivasic-Kos, M., 2021. Automatic person detection in search and rescue operations using deep CNN detectors. IEEE Access, 9, pp. 37905-37922.

Rodin, C.D., De Lima, L.N., De Alcantara Andrade, F.A., Haddad, D.B., Johansen, T.A. and Storvold, R., 2018. Object Classification in Thermal Images using Convolutional Neural Networks for Search and Rescue Missions with Unmanned Aerial Systems, Proceedings of the International Joint Conference on Neural Networks 2018.

Andrade, F. A. de A. et al. Autonomous unmanned aerial vehicles in search and rescue missions using real-time cooperative model predictive control. Sensors (Switzerland), v. 19, n. 19, 2019. DOI 10.3390/s19194067.

Rasmussen, N. D. et al., “Fused visible and infrared video for use in wilderness search and rescue,” Proceedings of the 2009 Workshop on Applications of Computer Vision (WACV), 2009. DOI 10.1109/WACV.2009.5403048.

St-Laurent, L., Maldague, X. and Prevost, D., “Combination of colour and thermal sensors for enhanced object detection.” In: 10th International Conference on Information Fusion (2007).

Klein, L.A., 2012. Sensor and Data Fusion: A Tool for Information Assessment and Decision Making, 2nd ed., pp. 1-474.

A. J. A. Rivera, A. D. C. Villalobos, J. C. N. Monje, J. A. G. Mariñas, and C. M. Oppus, “Post-disaster rescue facility: Human detection and geolocation using aerial drones,” IEEE Reg. 10 Annu. Int. Conf. Proceedings/TENCON, pp. 384–386, Feb. 2017, doi: 10.1109/TENCON.2016.7848026.

H. Kayan, R. Eslampanah, F. Yeganli, and M. Askar, “Heat leakage detection and surveillance using aerial thermography drone,” 26th IEEE Signal Process. Commun. Appl. Conf. SIU 2018, pp. 1–4, Jul. 2018, doi: 10.1109/SIU.2018.8404366.

P. M. Hell and P. J. Varga, “Assisting law enforcement tasks with thermal camera drones,” CANDO-EPE 2020 - Proceedings, IEEE 3rd Int. Conf. Work. Obuda Electr. Power Eng., pp. 97–102, Nov. 2020, doi: 10.1109/CANDO-EPE51100.2020.9337768.

H. Greidanus, “Assessment of the coastal maritime environment with airborne mid-wave infrared imagery,” Int. Geosci. Remote Sens. Symp., vol. 3, pp. 1542–1544, 2001, doi: 10.1109/IGARSS.2001.976905.

M. Sadi, Y. Zhang, W. F. Xie, and F. M. A. Hossain, “Forest Fire Detection and Localization Using Thermal and Visual Cameras,” 2021 Int. Conf. Unmanned Aircr. Syst. ICUAS 2021, pp. 744–749, Jun. 2021, doi: 10.1109/ICUAS51884.2021.9476865.

J. M. Suiter, M. B. Lapis, and R. Collins, “Multispectral image fusion for the small aircraft transportation system,” AIAA/IEEE Digit. Avion. Syst. Conf. - Proc., vol. 1, 2004, doi: 10.1109/DASC.2004.1391317.

R. Hartono, D. Ardianto, S. Salaswati, R. Yatim, and A. Hadi Syafrudin, “Design Requirement of LWIR Optical Filter for LAPAN-A4 Satellite,” Proc. 2019 IEEE Int. Conf. Aerosp. Electron. Remote Sens. Technol. ICARES 2019, Oct. 2019, doi: 10.1109/ICARES.2019.8914354.

B. Arifin, A. M. Tahir, and I. Priyanto, “LAPAN’S Mid Wavelength Infrared Camera Module,” Int. Geosci. Remote Sens. Symp., pp. 6401–6404, Sep. 2020, doi: 10.1109/IGARSS39084.2020.9323882.

Gade, R.; Moeslund, T. B. Thermal cameras and applications: A survey. Machine Vision and Applications, v. 25, n. 1, p. 245–262, 2014. DOI 10.1007/s00138-013-0570-5.

Serway, R.A., Jewett, J.W.: Physics for Scientists and Engineers with Modern Physics, 6th edn. Brooks/Cole–Thomson Learning (2004).

Duo-Datasheet-US. Available at: <https://flir.netx.net/file/asset/10904/original/attachment>. Accessed: 05/07/2021.

Irani, M.; Anandan, P. “Robust Multi-Sensor Image Alignment.” Sixth International Conference on Computer Vision, 1998.

Istenic, R. et al. “Thermal and visual image registration in Hough parameter space.” 2007. pp. 106–109. DOI 10.1109/IWSSIP.2007.4381164.

Zitová, B. & Flusser, J. 2003, "Image registration methods: A survey", Image and Vision Computing, vol. 21, no. 11, pp. 977-1000. DOI 10.1016/S0262-8856(03)00137-9.

Richards, J. A.; Jia, X. “Remote Sensing Digital Image Analysis: An Introduction.” 4th ed. New York: Springer, 2005. 464 p. ISBN 9783540251286.

Falqueto, L. E. F., 2019. “Reconhecimento de alvos artificiais em ambientes complexos por meio de imagens SAR polarimétricas obtidas por plataformas orbitais.” Master’s dissertation in Ciências e Tecnologias Espaciais – Instituto Tecnológico de Aeronáutica, São José dos Campos, 133 f.

Kotsiantis, S. B.; Zaharakis, I. D.; Pintelas, P. E., “Machine learning: a review of classification and combining techniques,” Artificial Intelligence Review, v. 26, n. 3, p. 159-190, 2006.

Published

22.09.2023

How to Cite

[1]
R. Avilez Fiedler and F. Bernardo Lovato Eick, “Classificação de Imagens Utilizando Fusão de Sensores Termal e Visível”, Spectrum, vol. 24, no. 1, pp. 34–39, Sep. 2023.