The hidden dimension: a paradigmatic view of data-driven NLP. (Journal article)

Type
Label
  • The hidden dimension: a paradigmatic view of data-driven NLP. (Journal article) (literal)
Year
  • 1999-01-01T00:00:00+01:00 (literal)
Alternative label
  • Vito Pirrelli, François Yvon (1999)
    The hidden dimension: a paradigmatic view of data-driven NLP.
    in Journal of Experimental and Theoretical Artificial Intelligence (Online); Taylor & Francis Ltd, London (United Kingdom)
    (literal)
http://www.cnr.it/ontology/cnr/pubblicazioni.owl#autori
  • Vito Pirrelli, François Yvon (literal)
Start page
  • 391 (literal)
End page
  • 408 (literal)
http://www.cnr.it/ontology/cnr/pubblicazioni.owl#numeroVolume
  • 11 (literal)
Journal
  • Journal of Experimental and Theoretical Artificial Intelligence (Online) (literal)
http://www.cnr.it/ontology/cnr/pubblicazioni.owl#affiliazioni
  • Istituto di Linguistica Computazionale "A. Zampolli", CNR, Pisa; ENST, Department of Computer Science and CNRS, Paris (literal)
Title
  • The hidden dimension: a paradigmatic view of data-driven NLP. (literal)
Abstract
  • Many tasks in language analysis are described as the maximally economic mapping of one level of linguistic representation onto another such level. Over the past decade, many different machine-learning strategies have been developed to automatically induce such mappings directly from data. In this paper, we contend that the way most learning algorithms have been applied to problems of language analysis reflects a strong bias towards a compositional (or biunique) model of inter-level mapping. Although this is justified in some cases, we argue that biunique inter-level mapping is not a jack of all trades. A model of analogical learning, based on a paradigmatic reanalysis of memorized data, is presented here. The methodological pros and cons of this approach are discussed in relation to a number of germane linguistic issues and illustrated in the context of three case studies: word pronunciation, word analysis, and word sense disambiguation. The evidence produced here seems to suggest that the brain is not designed to carry out the logically simplest and maximally economic way of relating form and function in language. Rather, we propose a radical shift of emphasis in language learning from syntagmatic inter-level mapping to paradigmatically constrained intra-level mapping. (literal)
Publisher
  • Taylor & Francis Ltd, London (United Kingdom) (literal)
Product of
CNR author
Keyword set
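Illustrative note: the abstract above describes a model of analogical learning that reanalyses memorized data along paradigmatic lines. As a purely illustrative, hedged sketch (the toy lexicon and function names below are hypothetical and are not the algorithm of the paper), a string-level analogical proportion of the kind used in analogy-based word pronunciation can be written as:

def solve_analogy(a, b, c):
    """Solve the formal proportion a : b :: c : ? for strings that differ
    only by a suffix substitution, e.g. walked : walk :: talked : talk."""
    # Longest common prefix of a and b isolates the stem shared by the first pair.
    i = 0
    while i < min(len(a), len(b)) and a[i] == b[i]:
        i += 1
    a_suffix, b_suffix = a[i:], b[i:]
    # The proportion only holds if c carries the same suffix as a.
    if a_suffix and not c.endswith(a_suffix):
        return None
    stem = c[:len(c) - len(a_suffix)] if a_suffix else c
    return stem + b_suffix

def pronounce_by_analogy(word, lexicon):
    """Predict a pronunciation for an unseen word by reusing a memorized
    (spelling, pronunciation) pair whose spelling supports a proportion."""
    for spelling, phones in lexicon.items():
        guess = solve_analogy(spelling, phones, word)
        if guess is not None:
            return guess
    return None

# Hypothetical toy lexicon of memorized spelling/pronunciation pairs.
lexicon = {"light": "layt", "fight": "fayt"}
print(pronounce_by_analogy("night", lexicon))  # -> nayt

Given the toy memorized pairs, the unseen word "night" is pronounced by solving light : layt :: night : ?, i.e. by reusing a paradigm of stored forms rather than by a rule that maps letters to sounds one by one.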
