Band selection for hyperspectral image visualization and classification

Mery L. Picco

Departamento de Matemáticas, National University of Río Cuarto, Argentina

Marcelo S. Ruiz

Departamento de Matemáticas, National University of Río Cuarto, Argentina

Juliana R. Maldonado

Departamento de Matemáticas, National University of Río Cuarto, Argentina

Accepted: 2025-03-15 | Published: 2025-05-20

DOI: https://doi.org/10.4995/raet.2025.22291

Keywords:

hyperspectral images, band selection, classification, regularization methods

Supporting agencies:

This research was not funded

Abstract:

In hyperspectral remote sensing image processing, band selection is an essential task for many applications, including supervised classification. The objective of this work is to compare the performance of the classical strategy, in which variable selection is carried out as a preliminary step to classification, with more recent penalized algorithms that perform classification and variable selection simultaneously. For the comparison, a subset of an EO-1 Hyperion hyperspectral image covering an area in the province of Córdoba, Argentina, was used, and a simulation study was also conducted. The results show that the penalized algorithms are more effective at selecting relevant bands while retaining good predictive properties, particularly in high-dimensional settings, that is, when the training-sample size is small relative to the number of variables.
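As a rough illustration of the two strategies compared in the abstract, the sketch below contrasts a classical two-step pipeline (band selection as a preliminary filter, then classification) with an L1-penalized multinomial logistic regression, in which the penalty drives the coefficients of irrelevant bands to zero so that selection and classification happen in one step. This is a minimal sketch using scikit-learn, not the authors' pipeline: the arrays X and y are random placeholders standing in for Hyperion training pixels, and the number of retained bands (k=20) and penalty strength (C=0.1) are illustrative assumptions.

```python
# Illustrative sketch (not the authors' exact method): two-step band
# selection + classification vs. a penalized model that does both at once.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 150))    # placeholder: 200 training pixels, 150 bands
y = rng.integers(0, 4, size=200)   # placeholder: 4 land-cover classes

# Strategy 1: filter-based band selection as a preliminary step,
# followed by an ordinary (unpenalized) classifier.
two_step = make_pipeline(
    SelectKBest(f_classif, k=20),
    LogisticRegression(max_iter=5000),
)

# Strategy 2: L1-penalized (lasso-type) multinomial logistic regression,
# which shrinks the coefficients of irrelevant bands exactly to zero,
# so band selection and classification are performed simultaneously.
penalized = LogisticRegression(penalty="l1", solver="saga", C=0.1, max_iter=5000)

for name, model in [("two-step", two_step), ("penalized", penalized)]:
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")

# Bands retained by the penalized model: those with at least one
# non-zero coefficient across the classes.
penalized.fit(X, y)
selected = np.flatnonzero(np.any(penalized.coef_ != 0, axis=0))
print("bands kept by the penalized model:", selected)
```

With a small training sample relative to the number of bands, as in the high-dimensional setting discussed in the abstract, the penalized model's built-in shrinkage is what keeps it stable where an unpenalized classifier would overfit.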


