Efficient kernel models for learning and approximate minimization problems (Journal article)

Type
Label
  • Efficient kernel models for learning and approximate minimization problems (Journal article) (literal)
Year
  • 2012-01-01T00:00:00+01:00 (literal)
Http://www.cnr.it/ontology/cnr/pubblicazioni.owl#doi
  • 10.1016/j.neucom.2012.04.023 (literal)
Alternative label
  • C. Cervellera; M. Gaggero; D. Maccio (2012)
    Efficient kernel models for learning and approximate minimization problems
    in Neurocomputing (Amst.); ELSEVIER SCIENCE B.V., AMSTERDAM (Netherlands)
    (literal)
Http://www.cnr.it/ontology/cnr/pubblicazioni.owl#autori
  • C. Cervellera; M. Gaggero; D. Maccio (literal)
Start page
  • 74 (literal)
End page
  • 85 (literal)
Http://www.cnr.it/ontology/cnr/pubblicazioni.owl#altreInformazioni
  • Journal Q1 in Computer Science Applications (literal)
Http://www.cnr.it/ontology/cnr/pubblicazioni.owl#numeroVolume
  • 97 (literal)
Journal
Notes
  • ISI Web of Science (WOS) (literal)
  • Scopus (literal)
Http://www.cnr.it/ontology/cnr/pubblicazioni.owl#affiliazioni
  • 1. Institute of Intelligent Systems for Automation, National Research Council, Via De Marini 6, 16149 Genoa, Italy
    2. Institute of Intelligent Systems for Automation, National Research Council, Via De Marini 6, 16149 Genoa, Italy
    3. Institute of Intelligent Systems for Automation, National Research Council, Via De Marini 6, 16149 Genoa, Italy (literal)
Title
  • Efficient kernel models for learning and approximate minimization problems (literal)
Abstract
  • This paper investigates techniques for reducing the computational burden of local learning methods relying on kernel functions in the framework of approximate minimization, i.e., when they are employed to find the minimum of a given cost functional. The considered approach is based on an optimal choice of the kernel width parameters through the minimization of an empirical cost, and it can provide a solution to important problems such as function approximation and multistage optimization. However, when the amount of stored data is too large, evaluating the kernel model output can take a long time, making local learning unsuited to contexts where a fast function evaluation is required; the training procedure used to obtain the kernel widths can become too demanding as well. Here it is shown that a large saving in computational effort can be achieved by considering subsets of the available data, suitably chosen according to different criteria. An analysis of the performance of the new approach is provided. Simulation results then show in practice the effectiveness of the proposed techniques when applied to learning and approximate minimization problems. (literal)
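The abstract above describes a kernel-based local learning model whose output evaluation cost grows with the number of stored samples, and a reduction of that cost obtained by restricting the evaluation to a suitably chosen subset of the data. The minimal Python sketch below only illustrates that idea and is not the authors' implementation: it assumes a Gaussian-kernel Nadaraya-Watson estimator with a fixed width sigma, and uses plain random subsampling as one possible subset-selection criterion; all function names and parameter values are hypothetical.

# Illustrative sketch (assumptions: Gaussian kernel, Nadaraya-Watson output,
# random subsampling as the data-reduction criterion).
import numpy as np

def kernel_model_output(x, data_x, data_y, sigma):
    """Kernel model output at x using all stored samples (cost O(N))."""
    sq_dist = np.sum((data_x - x) ** 2, axis=1)
    weights = np.exp(-sq_dist / (2.0 * sigma ** 2))
    return np.dot(weights, data_y) / (np.sum(weights) + 1e-12)

def subset_model_output(x, data_x, data_y, sigma, subset_size, rng):
    """Same output restricted to a randomly chosen subset of the stored data,
    reducing the evaluation cost from O(N) to O(subset_size)."""
    idx = rng.choice(len(data_x), size=subset_size, replace=False)
    return kernel_model_output(x, data_x[idx], data_y[idx], sigma)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(5000, 2))                    # stored inputs
    y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(5000)   # noisy targets
    x_query = np.array([0.2, -0.4])
    full = kernel_model_output(x_query, X, y, sigma=0.1)
    fast = subset_model_output(x_query, X, y, sigma=0.1, subset_size=500, rng=rng)
    print(f"full-data output: {full:.4f}, subset output: {fast:.4f}")

In this sketch the full evaluation touches all 5000 stored samples, while the reduced one touches only 500; the different subset-selection criteria mentioned in the abstract would replace the random choice of indices, and the width sigma would be tuned by minimizing an empirical cost rather than fixed by hand.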
Publisher
Product of
CNR author
