Optimizing one-shot learning with binary synapses (Journal article)

Type
Label
  • Optimizing one-shot learning with binary synapses (Journal article) (literal)
Year
  • 2008 (literal)
Alternative label
  • Romani, S; Amit, DJ; Amit, Y (2008)
    Optimizing one-shot learning with binary synapses
    in Neural Computation
    (literal)
Authors (http://www.cnr.it/ontology/cnr/pubblicazioni.owl#autori)
  • Romani, S; Amit, DJ; Amit, Y (literal)
Start page
  • 1928 (literal)
End page
  • 1950 (literal)
Volume number (http://www.cnr.it/ontology/cnr/pubblicazioni.owl#numeroVolume)
  • 20 (literal)
Journal
Notes
  • ISI Web of Science (WOS) (literal)
Affiliations (http://www.cnr.it/ontology/cnr/pubblicazioni.owl#affiliazioni)
  • \"[Romani, Sandro] Univ Roma La Sapienza, I-00185 Rome, Italy; [Amit, Yali] Dept Stat & Comp Sci, Chicago, IL 60637 USA; [Amit, Daniel J.] Hebrew Univ Jerusalem, Racah Inst Phys, IL-91904 Jerusalem, Israel; [Amit, Daniel J.] Univ Rome, INFM, I-00185 Rome, Italy (literal)
Title
  • Optimizing one-shot learning with binary synapses (literal)
Abstract
  • A network of excitatory synapses trained with a conservative version of Hebbian learning is used as a model for recognizing the familiarity of thousands of once-seen stimuli from those never seen before. Such networks were initially proposed for modeling memory retrieval (selective delay activity). We show that the same framework allows the incorporation of both familiarity recognition and memory retrieval, and estimate the network's capacity. In the case of binary neurons, we extend the analysis of Amit and Fusi (1994) to obtain capacity limits based on computations of signal-to-noise ratio of the field difference between selective and non-selective neurons of learned signals. We show that with fast learning (potentiation probability approximately 1), the most recently learned patterns can be retrieved in working memory (selective delay activity). A much higher number of once-seen learned patterns elicit a realistic familiarity signal in the presence of an external field. With potentiation probability much less than 1 (slow learning), memory retrieval disappears, whereas familiarity recognition capacity is maintained at a similarly high level. This analysis is corroborated in simulations. For analog neurons, where such analysis is more difficult, we simplify the capacity analysis by studying the excess number of potentiated synapses above the steady-state distribution. In this framework, we derive the optimal constraint between potentiation and depression probabilities that maximizes the capacity. (literal)
CNR author
Keyword set
