Open Access Te Herenga Waka-Victoria University of Wellington

Mapping Burned Areas with Multitemporal–Multispectral Data and Probabilistic Unsupervised Learning

journal contribution
posted on 2022-11-01, 10:44 authored by Rogério G. Negri, Andréa E. O. Luz, Alejandro Frery, Wallace Casaca
The occurrence of forest fires has increased significantly in recent years across the planet. Events of this nature have driven the development of new automated methodologies to identify and map burned areas. In this paper, we introduce a unified data-driven framework that maps fire-damaged areas by integrating time series of remotely sensed multispectral images, statistical modeling, and unsupervised classification. We collect and analyze multiple remote-sensing images acquired by the Landsat-8, Sentinel-2, and Terra satellites between August and October 2020, validating our proposal with three case studies in Brazil and Bolivia covering regions that have suffered recurrent forest fires. Besides producing less noisy mappings, our methodology outperforms the other evaluated methods, achieving average scores of 90%, 0.71, and 0.65 for overall accuracy, F1-score, and kappa coefficient, respectively. The proposed method yields spatially coherent mappings of the burned areas whose segments match the estimates reported by the MODIS Burned Area product.
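To make the general idea of probabilistic unsupervised burned-area mapping concrete, the following minimal sketch clusters the drop in the Normalized Burn Ratio (NBR) between a pre-fire and a post-fire image using a two-component Gaussian mixture. This is an illustrative assumption for demonstration only, not the authors' framework: the band layout, the choice of dNBR as the feature, and the use of scikit-learn's GaussianMixture are all hypothetical.

# Illustrative sketch only: probabilistic unsupervised labelling of burned pixels
# from a pre/post-fire NBR difference. Not the paper's method; all choices here
# (dNBR feature, 2-component Gaussian mixture, synthetic data) are assumptions.
import numpy as np
from sklearn.mixture import GaussianMixture

def nbr(nir: np.ndarray, swir: np.ndarray) -> np.ndarray:
    """Normalized Burn Ratio: (NIR - SWIR) / (NIR + SWIR)."""
    return (nir - swir) / (nir + swir + 1e-9)

def map_burned(nir_pre, swir_pre, nir_post, swir_post):
    """Cluster the NBR drop between two dates into burned vs. unburned pixels."""
    dnbr = nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)   # larger drop => more likely burned
    gmm = GaussianMixture(n_components=2, random_state=0).fit(dnbr.reshape(-1, 1))
    labels = gmm.predict(dnbr.reshape(-1, 1))
    burned_component = int(np.argmax(gmm.means_.ravel()))      # component with the larger NBR drop
    return (labels == burned_component).reshape(dnbr.shape)

if __name__ == "__main__":
    # Synthetic reflectance arrays stand in for Landsat-8/Sentinel-2 bands.
    rng = np.random.default_rng(42)
    shape = (100, 100)
    nir_pre, swir_pre = rng.uniform(0.3, 0.5, shape), rng.uniform(0.1, 0.2, shape)
    nir_post, swir_post = nir_pre.copy(), swir_pre.copy()
    nir_post[:30], swir_post[:30] = 0.12, 0.30                 # simulate a burn scar in the top rows
    mask = map_burned(nir_pre, swir_pre, nir_post, swir_post)
    print(f"Burned fraction: {mask.mean():.2%}")

In practice, the per-pixel posterior probabilities from the mixture model (rather than hard labels) could be thresholded or spatially regularized to reduce noise; the paper's evaluation against reference data uses overall accuracy, F1-score, and the kappa coefficient.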

History

Preferred citation

Negri, R. G., Luz, A. E. O., Frery, A. C., & Casaca, W. (2022). Mapping Burned Areas with Multitemporal–Multispectral Data and Probabilistic Unsupervised Learning. Remote Sensing, 14(21), 5413. https://doi.org/10.3390/rs14215413

Journal title

Remote Sensing

Volume

14

Issue

21

Pagination

5413 (article number)

Publisher

MDPI AG

Publication status

Published online

Online publication date

2022-10-28

eISSN

2072-4292

Language

en