Journal News | SSCI Journal Annual Review of Linguistics, Volume 8 (2022)
Annual Review of Linguistics
Volume 8, 2022
Annual Review of Linguistics (SSCI Q1; 2021 Impact Factor: 3.705) published 26 articles in Volume 8 (2022). The articles cover argument structure in sign languages, Construction Morphology and Relational Morphology, perspective shift, children's first-language acquisition, coherence establishment, music and language, corpus-based typology, attitude verbs, second-language sentence processing, nominalization, perfects, cochlear implants and language learning, Glue Semantics, prosody in creole languages, accent adaptation, stance, neurolinguistics and language processing, semantic structure in deep learning, and syntactic islands, among other topics.
Contents
■ Introduction, by Colin Phillips and Mark Liberman, Vol. 8, 2022, pp. i–iv
■ How I Got Here and Where I'm Going Next, by Sarah G. Thomason, Vol. 8, 2022, pp. 1–17
■ Argument Structure in Sign Languages, by Vadim Kimmelman, Vol. 8, 2022, pp. 19–38
■ Advances in Morphological Theory: Construction Morphology and Relational Morphology, by Jenny Audring, Vol. 8, 2022, pp. 39–58
■ Perspective Shift Across Modalities, by Emar Maier and Markus Steinbach, Vol. 8, 2022, pp. 59–76
■ Learning Through Processing: Toward an Integrated Approach to Early Word Learning, by Stephan C. Meylan and Elika Bergelson, Vol. 8, 2022, pp. 77–99
■ The Probabilistic Turn in Semantics and Pragmatics, by Katrin Erk, Vol. 8, 2022, pp. 101–121
■ Coherence Establishment as a Source of Explanation in Linguistic Theory, by Andrew Kehler, Vol. 8, 2022, pp. 123–142
■ When Do Children Lose the Language Instinct? A Critical Review of the Critical Periods Literature, by Joshua K. Hartshorne, Vol. 8, 2022, pp. 143–151
■ Music and Language, by David Temperley, Vol. 8, 2022, pp. 153–170
■ Crosslinguistic Corpus Studies in Linguistic Typology, by Stefan Schnell and Nils Norman Schiborr, Vol. 8, 2022, pp. 171–191
■ On the Acquisition of Attitude Verbs, by Valentine Hacquard and Jeffrey Lidz, Vol. 8, 2022, pp. 193–212
■ Meaning and Alternatives, by Nicole Gotzner and Jacopo Romoli, Vol. 8, 2022, pp. 213–234
■ Second Language Sentence Processing, by Holger Hopp, Vol. 8, 2022, pp. 235–256
■ Nominalization and Natural Language Ontology, by Scott Grimm and Louise McNally, Vol. 8, 2022, pp. 257–277
■ Perfects Across Languages, by Östen Dahl, Vol. 8, 2022, pp. 279–297
■ Speech and Language Outcomes in Adults and Children with Cochlear Implants, by Terrin N. Tamati, David B. Pisoni, and Aaron C. Moberly, Vol. 8, 2022, pp. 299–319
■ Glue Semantics, by Ash Asudeh, Vol. 8, 2022, pp. 321–341
■ Intonation and Prosody in Creole Languages: An Evolving Ecology, by Shelome Gooden, Vol. 8, 2022, pp. 343–364
■ Navigating Accent Variation: A Developmental Perspective, by Elizabeth K. Johnson, Marieke van Heugten, and Helen Buckler, Vol. 8, 2022, pp. 365–387
■ Reverse Engineering Language Acquisition with Child-Centered Long-Form Recordings, by Marvin Lavechin, Maureen de Seyssel, Lucas Gautheron, Emmanuel Dupoux, and Alejandrina Cristia, Vol. 8, 2022, pp. 389–407
■ Stance and Stancetaking, by Scott F. Kiesling, Vol. 8, 2022, pp. 409–426
■ Neurocomputational Models of Language Processing, by John T. Hale, Luca Campanelli, Jixing Li, Shohini Bhattasali, Christophe Pallier, and Jonathan R. Brennan, Vol. 8, 2022, pp. 427–446
■ Semantic Structure in Deep Learning, by Ellie Pavlick, Vol. 8, 2022, pp. 447–471
■ Deriving the Wug-Shaped Curve: A Criterion for Assessing Formal Theories of Linguistic Variation, by Bruce Hayes, Vol. 8, 2022, pp. 473–494
■ Structural, Functional, and Processing Perspectives on Linguistic Island Effects, by Yingtong Liu, Elodie Winckel, Anne Abeillé, Barbara Hemforth, and Edward Gibson, Vol. 8, 2022, pp. 495–525
Abstracts
How I Got Here and Where I'm Going Next
Sarah G. Thomason, Department of Linguistics, University of Michigan, Ann Arbor, Michigan, USA
Abstract My career falls into two distinct periods. The first two decades featured insecurity combined with the luck of wandering into situations that ultimately helped me become a better linguist and a better teacher. I had the insecurity mostly under control by the watershed year of 1988, when I published a favorably reviewed coauthored book on language contact and also became editor of Language. Language contact has occupied most of my research time since then, but my first encounter with Séliš-Ql'ispé (a.k.a. Montana Salish), in 1981, led to a 40-year dedication to finding out more about the language and its history.
Key words contact-induced language change, deliberate language change, doodling, Séliš-Ql'ispé, Serbo-Croatian, Spring Workshop on Theory and Method in Linguistic Reconstruction, autobiography
Argument Structure in Sign Languages
Vadim Kimmelman, Department of Linguistic, Literary and Aesthetic Studies, University of Bergen, Bergen, Norway
Abstract Although sign languages are crucial for research on the human linguistic capacity, argument structure in these languages is rarely addressed in theoretical and typological studies. This article provides an overview of existing research on argument structure and argument structure alternations in sign languages. It demonstrates that there are many fundamental similarities between the two modalities in the domain of argument structure, such as the basic valency patterns and the semantic basis of argument structure. At the same time, modality effects, such as iconicity, simultaneity, and the use of space, play an important role in argument structure realization. Finally, the article discusses how emerging sign languages present an opportunity to observe and study the emergence of argument structure.
Key words sign language linguistics, argument structure, syntax, modality effects, language emergence
Advances in Morphological Theory: Construction Morphology and Relational Morphology
Jenny Audring, Leiden University Centre for Linguistics, Leiden University, Leiden, The Netherlands
Abstract In recent years, construction-based approaches to morphology have gained ground in the research community. This framework is characterized by the assumption that the mental lexicon is extensive and richly structured, containing not only a large number of stored words but also a wide variety of generalizations in the form of schemas. This review explores two construction-based theories, Construction Morphology and Relational Morphology. After outlining the basic theoretical architecture, the article presents an array of recent applications of a construction-based approach to morphological phenomena in various languages. In addition, it offers reflections on challenges and opportunities for further research. The review highlights those aspects of the theory that have proved particularly helpful in accommodating both the regularities and the quirks that are typical of the grammar of words.
Key words constructions, Construction Morphology, Relational Morphology, morphology, lexicon, schemas
Perspective Shift Across Modalities
Emar Maier, Center for Language and Cognition, Faculty of Arts, and Department of Theoretical Philosophy, University of Groningen, Groningen, The Netherlands
Markus Steinbach, Department of German Philology, Georg-August-Universität Göttingen, Göttingen, Germany
Abstract Languages offer various ways to present what someone said, thought, imagined, felt, and so on from their perspective. The prototypical example of a perspective-shifting device is direct quotation. In this review we define perspective shift in terms of indexical shift: A direct quotation like “Selena said, ‘Oh, I don't know.’” involves perspective shift because the first-person indexical ‘I’ refers to Selena, not to the actual speaker. We then discuss a variety of noncanonical modality-specific perspective-shifting devices: role shift in sign language, quotatives in spoken language, free indirect discourse in written language, and point-of-view shift in visual language. We show that these devices permit complex mixed forms of perspective shift which may involve nonlinguistic gestural as well as visual components.
Key words perspective, role shift, visual language, free indirect discourse, gesture, indexicals
Learning Through Processing: Toward an Integrated Approach to Early Word Learning
Stephan C. Meylan, Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, Massachusetts, USA / Department of Psychology and Neuroscience, Duke University, Durham, North Carolina, USA
Elika Bergelson, Department of Psychology and Neuroscience, Duke University, Durham, North Carolina, USA
Abstract Children's linguistic knowledge and the learning mechanisms by which they acquire it grow substantially in infancy and toddlerhood, yet theories of word learning largely fail to incorporate these shifts. Moreover, researchers’ often-siloed focus on either familiar word recognition or novel word learning limits the critical consideration of how these two relate. As a step toward a mechanistic theory of language acquisition, we present a framework of “learning through processing” and relate it to the prevailing methods used to assess children's early knowledge of words. Incorporating recent empirical work, we posit a specific, testable timeline of qualitative changes in the learning process in this interval. We conclude with several challenges and avenues for building a comprehensive theory of early word learning: better characterization of the input, reconciling results across approaches, and treating lexical knowledge in the nascent grammar with sufficient sophistication to ensure generalizability across languages and development.
Key words word learning, language acquisition, lexical processing, linguistic input, infancy, vocabulary growth
The Probabilistic Turn in Semantics and Pragmatics
Katrin Erk, Department of Linguistics, University of Texas at Austin, Austin, Texas, USA
Abstract This article provides an overview of graded and probabilistic approaches in semantics and pragmatics. These approaches share a common set of core research goals: (a) a concern with phenomena that are best described as graded, including a vast lexicon of words whose meanings adapt flexibly to the contexts in which they are used, as well as reasoning under uncertainty about interlocutors, their goals, and their strategies; (b) the need to show that representations are learnable, i.e., that a listener can learn semantic representations and pragmatic reasoning from data; (c) an emphasis on empirical evaluation against experimental data or corpus data at scale; and (d) scaling up to the full size of the lexicon. The methods used are sometimes explicitly probabilistic and sometimes not. Previously, there were assumed to be clear boundaries among probabilistic frameworks, classifiers in machine learning, and distributional approaches, but these boundaries have been blurred. Frameworks in semantics and pragmatics use all three of these, sometimes in combination, to address the four core research questions above.
Key words semantics, pragmatics, lexical semantics, probabilities, Bayesian models, machine learning
Coherence Establishment as a Source of Explanation in Linguistic Theory
Andrew Kehler, Department of Linguistics, University of California, San Diego, La Jolla, California, USA
Abstract The primary goal of coherence theory is to provide an explanation for the coherence properties of discourse: what properties distinguish a discourse from a mere collection of utterances, and what drives comprehenders to draw inferences in service of establishing coherence. However, the importance of coherence theory goes well beyond that; it also plays a crucial role in theories of a variety of discourse-dependent linguistic phenomena. This article surveys some ways in which coherence theory has been leveraged in this way, appealing to both Relational analyses and Question-Under-Discussion models. Theories of coherence establishment should therefore have a place in the linguist's toolbox as a source of explanation in linguistic theory.
Key words coherence relations, discourse structure, anaphora, ellipsis, Coordinate Structure Constraint, tense, accent placement, QUDs
When Do Children Lose the Language Instinct? A Critical Review of the Critical Periods Literature
Joshua K. Hartshorne, Department of Psychology and Neuroscience, Boston College, Boston, Massachusetts, USA
Abstract While it is clear that children are more successful at learning language than adults are—whether first language or second—there is no agreement as to why. Is it due to greater neural plasticity, greater motivation, more ample opportunity for learning, superior cognitive function, lack of interference from a first language, or something else? A difficulty in teasing apart these theories is that while they make different empirical predictions, there are few unambiguous facts against which to test the theories. This is particularly true when it comes to the most basic questions about the phenomenon: When does the childhood advantage dissipate, and how rapidly does it do so? I argue that a major reason for the lack of consensus is limitations in the research methods used to date. I conclude by discussing a recently emerging methodology and by making suggestions about the path forward.
Key words critical periods, massive online experiments, statistical power, language acquisition
Music and Language
David Temperley, Eastman School of Music, University of Rochester, Rochester, New York, USA
Abstract This review presents a highly selective survey of connections between music and language. I begin by considering some fundamental differences between music and language and some nonspecific similarities that may arise out of more general characteristics of human cognition and communication. I then discuss an important, specific interaction between music and language: the connection between linguistic stress and musical meter. Next, I consider several possible connections that have been widely studied but remain controversial: cross-cultural correlations between linguistic and musical rhythm, effects of musical training on linguistic abilities, and connections in cognitive processing between music and linguistic syntax. Finally, I discuss some parallels regarding the use of repetition in music and language, which until now has been a little-explored topic.
Key words music–language connections, rhythm, stress, meter, syntax, repetition
Crosslinguistic Corpus Studies in Linguistic Typology
Stefan Schnell, Department of Comparative Language Science, University of Zurich, Zurich, Switzerland
Nils Norman Schiborr, Department of General Linguistics, University of Bamberg, Bamberg, Germany
Abstract Corpus-based studies have become increasingly common in linguistic typology over recent years, amounting to the emergence of a new field that we call corpus-based typology. The core idea of corpus-based typology is to take languages as populations of utterances and to systematically investigate text production across languages in this sense. From a usage-based perspective, investigations of variation and preferences of use are at the core of understanding the distribution of conventionalized structures and their diachronic development across languages. Specific findings of corpus-based typological studies pertain to universals of text production, for example, in prosodic partitioning; to cognitive biases constraining diverse patterns of use, for example, in constituent order; and to correlations of diverse patterns of use with language-specific structures and conventions. We also consider remaining challenges for corpus-based typology, in particular the development of crosslinguistically more representative corpora that include spoken (or signed) texts, and its vast potential in the future.
Key words corpus-based typology, distributional typology, universals in language use, typology of language use, multilingual corpora, corpus linguistics
On the Acquisition of Attitude Verbs
Valentine Hacquard and Jeffrey Lidz, Department of Linguistics, University of Maryland, College Park, Maryland, USA
Abstract Attitude verbs, such as think, want, and know, describe internal mental states that leave few cues as to their meanings in the physical world. Consequently, their acquisition requires learners to draw from indirect evidence stemming from the linguistic and conversational contexts in which they occur. This provides us a unique opportunity to probe the linguistic and cognitive abilities that children deploy in acquiring these words. Through a few case studies, we show how children make use of syntactic and pragmatic cues to figure out attitude verb meanings and how their successes, and even their mistakes, reveal remarkable conceptual, linguistic, and pragmatic sophistication.
Key words attitude verbs, syntactic bootstrapping, indirect speech acts, word learning, theory of mind
Meaning and Alternatives
Nicole Gotzner, Department of Cognitive Sciences, University of Potsdam, Potsdam, Germany
Jacopo Romoli, Department of Foreign Languages, University of Bergen, Bergen, Norway
Abstract Alternatives and competition in language are pervasive at all levels of linguistic analysis. More specifically, alternatives have been argued to play a prominent role in an ever-growing class of phenomena in the investigation of natural language meaning. In this article, we focus on scalar implicatures, as they are arguably the most paradigmatic case of an alternative-based phenomenon. We first review the main challenge for theories of alternatives, the so-called symmetry problem, and we briefly discuss how it has shaped the different approaches to alternatives. We then turn to two more recent challenges concerning scalar diversity and the inferences of sentences with multiple scalars. Finally, we describe several related alternative-based phenomena and recent conceptual approaches to alternatives. As we discuss, while important progress has been made, much more work is needed both on the theoretical side and on understanding the empirical landscape better.
Key words alternatives, scalar implicatures, symmetry problem, focus, polarity sensitivity, negative strengthening
Second Language Sentence Processing
Holger Hopp, Department of English and American Studies, University of Braunschweig, Braunschweig, Germany
Abstract Second language (L2) sentence processing research studies how adult L2 learners understand sentences in real time. I review how L2 sentence processing differs from monolingual first-language (L1) processing and outline major findings and approaches. Three interacting factors appear to mandate L1–L2 differences: (a) capacity restrictions in the ability to integrate information in an L2; (b) L1–L2 differences in the weighting of cues, the timing of their application, and the efficiency of their retrieval; and (c) variation in the utility functions of predictive processing. Against this backdrop, I outline a novel paradigm of interlanguage processing, which examines bilingual features of L2 processing, such as bilingual language systems, nonselective access to all grammars, and processing to learn an L2. Interlanguage processing goes beyond the traditional framing of L2 sentence processing as an incomplete form of monolingual processing and reconnects the field with current approaches to grammar acquisition and the bilingual mental lexicon.
Key words second language, bilingualism, sentence comprehension, parsing, grammar
Nominalization and Natural Language Ontology
Scott Grimm, Department of Linguistics, University of Rochester, Rochester, New York, USA
Louise McNally, Departament de Traducció i Ciències del Llenguatge, Universitat Pompeu Fabra, Barcelona, Spain
Abstract Nominalization (e.g., sleeping, that we slept, sleepiness) allows speakers to refer to and ascribe properties to whatever sorts of entities clauses, verbs, or adjectives typically denote. Characterizing these relatively abstract entities has challenged semanticists and philosophers of language for over 50 years, thanks especially to Zeno Vendler's early work. Vendler took different kinds of English nominalization constructions to support positing facts, propositions, and events as distinct ontological objects. However, his conclusions remain controversial. The research on nominalization and natural language ontology has never been brought together or put into perspective; in this article, we clarify the complex variety of subsequent ontological positions that build on Vendler's work, identifying points of consensus and disagreement. We also reflect on the consequences and challenges of focusing on the English data and offer a glimpse of how the landscape might change with greater attention to nominalization in other languages.
Key words nominalization, natural language ontology, events, facts, propositions, abstract objects
Perfects Across Languages
Östen Dahl, Department of Linguistics, Stockholm University, 106 91 Stockholm, Sweden
Abstract The theoretical study of perfects tends to be based on data from European languages, particularly English. To find the proper place for perfects, we have to go beyond English to be able to separate what is idiosyncratic from what is generalizable. A central function of perfects is to speak of how the present is different from the past, especially from the immediate past. A perfect typically relates how a past state changes into the present one. Crosslinguistically, we find two major types of perfect: constructions involving the auxiliary verbs ‘be’ and ‘have’, common in Indo-European and neighboring families, and iamitives, which are the result of the grammaticalization of words meaning ‘already’. The status of iamitives is controversial. In this review, I argue that they can be separated both from ‘already’ and from European-style perfects but that it makes sense to postulate a more inclusive crosslinguistic perfect category.
Key words perfects, iamitives, tense, aspect, grammaticalization
Speech and Language Outcomes in Adults and Children with Cochlear Implants
Terrin N. Tamati, Department of Otolaryngology—Head and Neck Surgery, The Ohio State University Wexner Medical Center, Columbus, Ohio, USA / Department of Otorhinolaryngology/Head and Neck Surgery, University Medical Center, University of Groningen, Groningen, The Netherlands
David B. Pisoni, DeVault Otologic Research Laboratory, Department of Otolaryngology—Head and Neck Surgery, Indiana University School of Medicine, Indianapolis, Indiana, USA / Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana, USA
Aaron C. Moberly, Department of Otolaryngology—Head and Neck Surgery, The Ohio State University Wexner Medical Center, Columbus, Ohio, USA
Abstract Cochlear implants (CIs) represent a significant engineering and medical milestone in the treatment of hearing loss for both adults and children. In this review, we provide a brief overview of CI technology, describe the benefits that CIs can provide to adults and children who receive them, and discuss the specific limitations and issues faced by CI users. We emphasize the relevance of CIs to the linguistics community by demonstrating how CIs successfully provide access to spoken language. Furthermore, CI research can inform our basic understanding of spoken word recognition in adults and spoken language development in children. Linguistics research can also help us address the major clinical issue of outcome variability and motivate the development of new clinical tools to assess the unique challenges of adults and children with CIs, as well as novel interventions for individuals with poor outcomes.
Key words cochlear implants, individual variability, speech perception, language development, adverse conditions
Glue Semantics
Ash Asudeh, Department of Linguistics, University of Rochester, Rochester, New York 14627, USA
Abstract Glue Semantics (Glue) is a general framework for semantic composition and the syntax–semantics interface. The framework grew out of an interdisciplinary collaboration at the intersection of formal linguistics, formal logic, and computer science. Glue assumes a separate level of syntax; this can be any syntactic framework in which syntactic structures have heads. Glue uses a fragment of linear logic for semantic composition. General linear logic terms in Glue meaning constructors are instantiated relative to a syntactic parse. The separation of the logic of composition from structural syntax distinguishes Glue from other theories of semantic composition and the syntax–semantics interface. It allows Glue to capture semantic ambiguity, such as quantifier scope ambiguity, without necessarily positing an ambiguity in the syntactic structure. Glue is introduced here in relation to four of its key properties, which are used as organizing themes: resource-sensitive composition, flexible composition, autonomy of syntax, and syntax/semantics non-isomorphism.
Key words syntax–semantics interface, semantic compositionality, resource sensitivity, flexible composition, autonomy of syntax, quantifier scope
Intonation and Prosody in Creole Languages: An Evolving Ecology
Shelome Gooden, Department of Linguistics, University of Pittsburgh, Pittsburgh, Pennsylvania, USA
Abstract Research on the prosody and intonation of creole languages has largely remained an untapped resource, yet it is important for enriching our understanding of how or if their phonological systems changed or developed under contact. Further, their hybrid histories and current linguistic ecologies present descriptive and analytical treasure troves. This has the potential to inform many areas of linguistic inquiry including contact effects on the typological classification of prosodic systems, socioprosodic variation (individual and community level), and the scope of diversity in prosodic systems among creole languages and across a variety of languages similarly influenced by language contact. Thus, this review highlights the importance of pushing beyond questions of creole language typology and genetic affiliation. I review the existing research on creole language prosody and intonation, provide some details on a few studies, and highlight some key challenges and opportunities for the subfield and for linguistics in general.
Key words creole language, prosody, contact, intonation, ecology, variation
Navigating Accent Variation: A Developmental Perspective
Elizabeth K. Johnson, Department of Psychology, University of Toronto, Toronto, Ontario, Canada / Department of Psychology, University of Toronto Mississauga, Mississauga, Ontario, Canada
Marieke van Heugten, The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia
Helen Buckler, School of English, University of Nottingham, Nottingham, United Kingdom
Abstract Adult processing of other-accented speech is fast, dependent on lexical access, and readily generalizable to new words. But what does children's processing of other-accented speech look like? Although many acquisition researchers have emphasized how other-accented speech presents a formidable challenge to young children, we argue that the field has perhaps underestimated children's early accent processing abilities. In support of this view, we present evidence that 2-year-olds’ accent processing abilities appear to be in many respects adult-like, and discuss the growing literature on children's ability to cope with multi-accent input in the natural world. We outline different theoretical outlooks on the transition children make from infancy to later childhood, and discuss how the growing sophistication of infants’ accent processing abilities feeds into their social perception of the world (and perhaps vice versa). We also argue that efficient processing and meaningful interpretation of accent variation are fundamental to human cognition, and that early proficiency with accent variation (along with all of the implied representational and learning capacities) is difficult to explain without assuming the early emergence of abstract speech representations.
Key words accent accommodation, language acquisition, speech development, perceptual adaptation, linguistic diversity, developmental sociolinguistics, social cognition
Reverse Engineering Language Acquisition with Child-Centered Long-Form Recordings
Marvin Lavechin, Laboratoire de Sciences Cognitives et Psycholinguistique, Département d’Etudes cognitives, ENS, EHESS, CNRS, PSL University, Paris, France / Cognitive Machine Learning Team, INRIA, Paris, France / Facebook AI Research, Paris, France
Maureen de Seyssel, Laboratoire de Sciences Cognitives et Psycholinguistique, Département d’Etudes cognitives, ENS, EHESS, CNRS, PSL University, Paris, France / Cognitive Machine Learning Team, INRIA, Paris, France / Laboratoire de linguistique formelle, Université de Paris, CNRS, Paris, France
Lucas Gautheron, Laboratoire de Sciences Cognitives et Psycholinguistique, Département d’Etudes cognitives, ENS, EHESS, CNRS, PSL University, Paris, France
Emmanuel Dupoux, Laboratoire de Sciences Cognitives et Psycholinguistique, Département d’Etudes cognitives, ENS, EHESS, CNRS, PSL University, Paris, France / Cognitive Machine Learning Team, INRIA, Paris, France / Facebook AI Research, Paris, France
Alejandrina Cristia, Laboratoire de Sciences Cognitives et Psycholinguistique, Département d’Etudes cognitives, ENS, EHESS, CNRS, PSL University, Paris, France
Abstract Language use in everyday life can be studied using lightweight, wearable recorders that collect long-form recordings—that is, audio (including speech) over whole days. The hardware and software underlying this technique are increasingly accessible and inexpensive, and these data are revolutionizing the language acquisition field. We first place this technique into the broader context of the current ways of studying both the input being received by children and children's own language production, laying out the main advantages and drawbacks of long-form recordings. We then go on to argue that a unique advantage of long-form recordings is that they can fuel realistic models of early language acquisition that use speech to represent children's input and/or to establish production benchmarks. To enable the field to make the most of this unique empirical and conceptual contribution, we outline what this reverse engineering approach from long-form recordings entails, why it is useful, and how to evaluate success.
Key words long-form recordings, LENA, language acquisition, computational studies, ecological validity, reverse engineering
Stance and Stancetaking
Scott F. Kiesling, Department of Linguistics, University of Pittsburgh, Pittsburgh, Pennsylvania, USA
Abstract Stance and stancetaking are considered here as related concepts that help to explain the patterning of language and the motivations for the use of lexical items, constructions, and discourse markers. I begin with a discussion of how stance can be used in variation analysis to help explain the patterning of variables and directions of change, and how stance is central in any understanding of the indexicality of sociolinguistic variables. I then provide a discussion of several approaches to theorizing stance and explicate a stance model that combines a number of these approaches, arguing that such a model should include three dimensions: evaluation, alignment, and investment. Finally, I outline several ways that stance has been operationalized in quantitative analyses, including analyses based on the model outlined.
Key words stance, stancetaking, sociolinguistics, pragmatics, discourse analysis, variation analysis
Neurocomputational Models of Language Processing
John T. Hale, Department of Linguistics, University of Georgia, Athens, Georgia, USA
Luca Campanelli, Department of Linguistics, University of Georgia, Athens, Georgia, USA / Haskins Laboratories, New Haven, Connecticut, USA / Department of Communicative Disorders, University of Alabama, Tuscaloosa, Alabama, USA
Jixing Li, Neuroscience of Language Lab, NYU Abu Dhabi, Abu Dhabi, United Arab Emirates
Shohini Bhattasali, Institute for Advanced Computer Studies, Department of Linguistics, and Program in Neuroscience and Cognitive Science, University of Maryland, College Park, Maryland, USA
Christophe Pallier, Cognitive Neuroimaging Unit, INSERM U992, Gif-sur-Yvette, France
Jonathan R. Brennan, Department of Linguistics, University of Michigan, Ann Arbor, Michigan, USA
Abstract Efforts to understand the brain bases of language face the Mapping Problem: At what level do linguistic computations and representations connect to human neurobiology? We review one approach to this problem that relies on rigorously defined computational models to specify the links between linguistic features and neural signals. Such tools can be used to estimate linguistic predictions, model linguistic features, and specify a sequence of processing steps that may be quantitatively fit to neural signals collected while participants use language. Progress has been helped by advances in machine learning, attention to linguistically interpretable models, and openly shared data sets that allow researchers to compare and contrast a variety of models. We describe one such data set in detail in the Supplemental Appendix.
Key words neurolinguistics, brain, computational model, deep learning, parsing, lexicon
Semantic Structure in Deep Learning
Ellie Pavlick, Department of Computer Science, Brown University, Providence, Rhode Island, USA
Abstract Deep learning has recently come to dominate computational linguistics, leading to claims of human-level performance in a range of language processing tasks. Like much previous computational work, deep learning–based linguistic representations adhere to the distributional meaning-in-use hypothesis, deriving semantic representations from word co-occurrence statistics. However, current deep learning methods entail fundamentally new models of lexical and compositional meaning that are ripe for theoretical analysis. Whereas traditional distributional semantics models take a bottom-up approach in which sentence meaning is characterized by explicit composition functions applied to word meanings, new approaches take a top-down approach in which sentence representations are treated as primary and representations of words and syntax are viewed as emergent. This article summarizes our current understanding of how well such representations capture lexical semantics, world knowledge, and composition. The goal is to foster increased collaboration on testing the implications of such representations as general-purpose models of semantics.
Key words deep learning, semantics, natural language processing, neural network interpretability, neural network analysis
Deriving the Wug-Shaped Curve: A Criterion for Assessing Formal Theories of Linguistic Variation
Bruce Hayes, Department of Linguistics, University of California, Los Angeles, California, USA
Abstract In this review, I assess a variety of constraint-based formal frameworks that can treat variable phenomena, such as well-formedness intuitions, outputs in free variation, and lexical frequency-matching. The idea behind this assessment is that data in gradient linguistics fall into natural mathematical patterns, which I call quantitative signatures. The key signatures treated here are the sigmoid curve, going from zero to one probability, and the wug-shaped curve, which combines two or more sigmoids. I argue that these signatures appear repeatedly in linguistics, and I adduce examples from phonology, syntax, semantics, sociolinguistics, phonetics, and language change. I suggest that the ability to generate these signatures is a trait that can help us choose between rival frameworks.
Key words maximum entropy, probabilistic grammar, variation
Structural, Functional, and Processing Perspectives on Linguistic Island Effects
Yingtong Liu, Department of Linguistics, Harvard University, Cambridge, Massachusetts, USA / Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, Massachusetts, USA
Elodie Winckel, Department of English and American Studies, Friedrich-Alexander Universität Erlangen-Nürnberg, Erlangen, Germany
Anne Abeillé, Laboratoire de Linguistique Formelle, Université de Paris, LLF, CNRS, Paris, France
Barbara Hemforth, Laboratoire de Linguistique Formelle, Université de Paris, LLF, CNRS, Paris, France
Edward Gibson, Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, Massachusetts, USA
Abstract Ross (1967) observed that “island” structures like “Who do you think [NP the gift from__] prompted the rumor?” or “Who did you hear [NP the statement [S that the CEO promoted__]]?” are not acceptable, despite having what seem to be plausible meanings in some contexts. Ross (1967) and Chomsky (1973) hypothesized that the source of the unacceptability is in the syntax. Here, we summarize how theories of discourse, frequency, and memory from the literature might account for such effects. We suggest that there is only one island structure—a class of coordination islands—that is best explained by a syntactic/semantic constraint. We speculate that all other island structures are likely to be explained in terms of discourse, frequency, and memory.
Key words syntactic islands, long-distance dependencies, filler-gap dependencies, discourse constraints, usage-based grammar, syntactic constructions, Focus–Background Conflict, linguistic interference, linguistic encoding
About the Journal
The Annual Review of Linguistics, in publication since 2015, covers significant developments in the field of linguistics, including phonetics, phonology, morphology, syntax, semantics, pragmatics, and their interfaces. Reviews synthesize advances in linguistic theory, sociolinguistics, psycholinguistics, neurolinguistics, language change, biology and evolution of language, typology, as well as applications of linguistics in many domains.
Official website:
https://www.annualreviews.org/journal/linguistics
Source: ANNUAL REVIEW OF LINGUISTICS