Model transitions in descending FLVQ (Journal article)

Label
  • Model transitions in descending FLVQ (Journal article) (literal)
Year
  • 1998-01-01T00:00:00+01:00 (literal)
DOI
  • 10.1109/72.712148 (literal)
Alternative label
  • Baraldi, A and Blonda, P and Parmiggiani, F and Pasquariello, G and Satalino, G (1998)
    Model transitions in descending FLVQ
    in IEEE Transactions on Neural Networks; IEEE, Institute of Electrical and Electronics Engineers, New York (United States of America)
    (literal)
Authors
  • Baraldi, A and Blonda, P and Parmiggiani, F and Pasquariello, G and Satalino, G (literal)
Start page
  • 724 (literal)
End page
  • 738 (literal)
Volume number
  • 9 (literal)
Issue number
  • 5 (literal)
Note
  • ISI Web of Science (WOS) (literal)
Affiliations
  • IMGA-CNR, Bologna 40129, Italy; IESI-CNR, 70126 Bari, Italy (literal)
Title
  • Model transitions in descending FLVQ (literal)
Abstract
  • Fuzzy learning vector quantization (FLVQ), also known as the fuzzy Kohonen clustering network, was developed to improve the performance and usability of the on-line hard-competitive Kohonen vector quantization and soft-competitive self-organizing map (SOM) algorithms. The effectiveness of FLVQ seems to depend on the range of change of the weighting exponent m(t). In the first part of this work, extreme m(t) values (1 and ∞, respectively) are employed to investigate the asymptotic behaviors of FLVQ. This analysis shows that when m(t) tends to either one of its extremes, FLVQ is affected by trivial vector quantization, which causes the centroids to collapse onto the grand mean of the input data set. No analytical criterion has been found to improve the heuristic choice of the range of m(t) change. In the second part of this paper, two classification experiments on remotely sensed data, employing FLVQ and SOM, are presented. In these experiments the two nets are connected in cascade to a supervised second stage based on the delta rule. Experimental results confirm that FLVQ performance can be greatly affected by the user's definition of the range of change of the weighting exponent. Moreover, FLVQ shows instability when its traditional termination criterion is applied. Empirical recommendations are proposed to enhance FLVQ robustness. Both the analytical and the experimental data reported seem to indicate that the choice of the range of m(t) change is still open to discussion, and that alternative clustering neural-network approaches should be developed to pursue, during training: 1) monotone reduction of the neurons' learning rates; and 2) monotone reduction of the overlap among neuron receptive fields. (literal)
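The membership-flattening mechanism the abstract describes can be illustrated with a minimal batch FLVQ/fuzzy-c-means-style sketch. This is an illustrative reconstruction, not the paper's code: the function names, toy data, and the batch (rather than on-line) update are our assumptions; only the role of the weighting exponent m follows the abstract.

```python
import numpy as np

def flvq_memberships(X, V, m):
    """Fuzzy memberships u[i, k] of sample k in centroid i
    (fuzzy c-means style), with weighting exponent m > 1."""
    # Squared distances between each centroid and each sample, shape (c, n).
    d2 = ((V[:, None, :] - X[None, :, :]) ** 2).sum(-1) + 1e-12
    # u[i, k] = 1 / sum_j (d2[i, k] / d2[j, k]) ** (1 / (m - 1))
    ratio = d2[:, None, :] / d2[None, :, :]
    return 1.0 / (ratio ** (1.0 / (m - 1.0))).sum(axis=1)

def flvq_update(X, V, m):
    """One batch step: learning rates are u**m, so each centroid moves
    to the u**m-weighted mean of the data."""
    w = flvq_memberships(X, V, m) ** m
    return (w @ X) / w.sum(axis=1, keepdims=True)

# Toy data: two well-separated clusters (hypothetical example data).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0.0, 0.0], 0.1, (50, 2)),
               rng.normal([10.0, 10.0], 0.1, (50, 2))])
V = np.array([[1.0, 1.0], [9.0, 9.0]])

# Moderate m: the centroids settle near the two cluster means.
for _ in range(20):
    V = flvq_update(X, V, m=2.0)

# Very large m: memberships flatten toward 1/c for every sample --
# the mechanism behind the collapse onto the grand mean at that extreme.
u = flvq_memberships(X, V, m=1000.0)
```

At the other extreme, m → 1, the memberships harden toward winner-take-all assignment (hard vector quantization), which is the second limiting case the abstract discusses.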