Publication
Title
Classification of hyperspectral images using unsupervised support vector machine
Author
Abstract
In this paper, we introduce a new unsupervised classifier for hyperspectral images (HSI) based on image segmentation and spectral unmixing. In the proposed method, first, the number of classes is set equal to the number of endmembers. Second, the endmember matrix is defined. Third, the abundance fraction maps are extracted. Fourth, an initial ground truth is constructed by assigning each pixel the class corresponding to the maximum absolute value of its abundance fractions. Fifth, each pixel whose eight neighbors (vertical, horizontal, and diagonal) share its class is treated as a good candidate for training data; some of these candidate pixels are then randomly selected as the final training data, and the remaining pixels are used as testing data. Finally, a support vector machine is applied to the HSI, and the construction of the ground truth is iteratively repeated. To validate the efficiency of the proposed algorithm, two real HSI datasets are used. The obtained classification results are compared with several state-of-the-art algorithms, and the classification accuracy of the proposed method is close to that of supervised algorithms.
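The abstract outlines a concrete pipeline, so a minimal sketch of the labeling-and-training loop may help fix ideas. It assumes the abundance fraction maps are already available (the endmember extraction and unmixing steps are not reproduced here), uses scikit-learn's SVC as a stand-in for the paper's SVM, and all function and parameter names (initial_groundtruth, candidate_mask, n_train, n_iter) are illustrative rather than taken from the paper.

import numpy as np
from sklearn.svm import SVC

def initial_groundtruth(abundances):
    # Label each pixel with the endmember of maximum absolute abundance.
    # abundances: (H, W, K) array of abundance fractions for K endmembers.
    return np.argmax(np.abs(abundances), axis=2)

def candidate_mask(labels):
    # True where a pixel and all eight neighbors share the same class.
    H, W = labels.shape
    mask = np.zeros((H, W), dtype=bool)
    inner = labels[1:-1, 1:-1]
    same = np.ones_like(inner, dtype=bool)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            same &= labels[1 + dy:H - 1 + dy, 1 + dx:W - 1 + dx] == inner
    mask[1:-1, 1:-1] = same
    return mask

def classify(hsi, abundances, n_train=500, n_iter=3, seed=None):
    # Iterate: build pseudo-labels, pick spatially consistent training
    # pixels, fit an SVM, and relabel the whole image.
    rng = np.random.default_rng(seed)
    H, W, B = hsi.shape
    X = hsi.reshape(-1, B)
    labels = initial_groundtruth(abundances)
    for _ in range(n_iter):
        idx = np.flatnonzero(candidate_mask(labels))
        if idx.size == 0:  # no spatially consistent pixels left
            break
        idx = rng.choice(idx, size=min(n_train, idx.size), replace=False)
        svm = SVC(kernel="rbf", gamma="scale")
        svm.fit(X[idx], labels.ravel()[idx])
        labels = svm.predict(X).reshape(H, W)
    return labels

The eight-neighbor consistency test keeps only spatially homogeneous pixels as training samples, which reduces label noise in the self-generated ground truth before each SVM round.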
Language
English
Source (journal)
Proceedings of the Society of Photo-Optical Instrumentation Engineers / SPIE: International Society for Optical Engineering. - Bellingham, Wash.
Source (book)
Conference on Image and Signal Processing for Remote Sensing XXIII, September 11-13, 2017, Warsaw, Poland
Publication
Bellingham : SPIE - International Society for Optical Engineering, 2017
ISSN
0277-786X
ISBN
978-1-5106-1318-8
978-1-5106-1319-5
DOI
10.1117/12.2278058
Volume/pages
10427 (2017), 7 p.
Article Reference
104270H
ISI
000425842500013
Medium
E-only publication
Full text (Publisher's DOI)
UAntwerpen
Faculty/Department
Research group
Publication type
Subject
Affiliation
Publications with a UAntwerp address
External links
Web of Science
Record
Identifier
Creation 29.03.2018
Last edited 09.10.2023