Graduate and Postdoc Research Projects

The Interaction of the Nonmanual Component with Semantics in Embedded Polar Interrogatives in TİD
Emre Hakgüder
This project focuses on instrumental utterances in sign languages. Previous research has shown that instrumental events can be expressed in sign languages using classifier predicates and that the iconic form of the classifier depends on several factors. This project builds on those findings to create a cross-linguistic topography of sign language instrumentals. The ultimate goal is to identify each factor and its relative importance in each language, and to implement the findings as a computer algorithm that generates instrumental utterances from non-linguistic input.
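A minimal sketch of what such an algorithm could look like, assuming each language assigns weights to a small set of event factors and the weighted score selects between a handling and an instrument classifier; the factor names, weights, and threshold below are invented for illustration and are not findings of the project:

```python
# Hypothetical sketch: choosing a classifier strategy for an instrumental
# event from non-linguistic input. Factor names, weights, and the 0.5
# threshold are illustrative assumptions only.

EVENT = {
    "tool_manipulated_by_hand": True,
    "agent_visible": False,
    "tool_prototypical": True,
}

# Per-language importance of each factor (invented values).
WEIGHTS = {
    "ASL": {"tool_manipulated_by_hand": 0.6, "agent_visible": 0.3, "tool_prototypical": 0.1},
    "TID": {"tool_manipulated_by_hand": 0.2, "agent_visible": 0.6, "tool_prototypical": 0.2},
}

def choose_classifier(event: dict, language: str) -> str:
    """Sum the weights of the factors present in the event; pick a handling
    classifier at or above threshold, an instrument classifier below."""
    score = sum(w for factor, w in WEIGHTS[language].items() if event.get(factor))
    return "handling classifier" if score >= 0.5 else "instrument classifier"

print(choose_classifier(EVENT, "ASL"))  # handling classifier (0.6 + 0.1 = 0.7)
print(choose_classifier(EVENT, "TID"))  # instrument classifier (0.2 + 0.2 = 0.4)
```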

Timing of speech and gesture: An information-theoretic approach
Yağmur Deniz Kısa, Kensy Cooperrider, Geoffrey Brookshire, Defu Yap
What determines the relative timing of speech and gesture? Why do gestures tend to precede speech, and under what conditions does their temporal order deviate from this canonical pattern? To find out, we will develop tools that automate the coding of speech and gesture timing, in order to test a new information-theoretic hypothesis: the relative timing of speech and gesture is determined by their relative informativity.
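As a toy illustration of how the hypothesis could be operationalized, informativity might be measured as surprisal (negative log probability); the corpus counts, timing values, and unigram model below are assumptions made for this sketch, not the project's actual measures:

```python
import math

# Toy operationalization: a word's informativity as its unigram surprisal,
# -log2 p(word). Counts and gesture/speech lags are invented for illustration.
COUNTS = {"the": 60000, "cat": 300, "pounced": 12}
TOTAL = sum(COUNTS.values())

def surprisal(word: str) -> float:
    """Information content of a word in bits under a unigram model."""
    return -math.log2(COUNTS[word] / TOTAL)

# (word, gesture onset minus word onset in ms); negative values = gesture leads.
observations = [("the", 40), ("cat", -120), ("pounced", -310)]

# The hypothesis predicts that more informative words show larger gesture
# leads, i.e., surprisal and lag should be negatively related.
for word, lag_ms in sorted(observations, key=lambda obs: surprisal(obs[0])):
    print(f"{word:8s} {surprisal(word):5.2f} bits  lag {lag_ms:+d} ms")
```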

Gesture, Interaction, and Structure Task (GIST)
Laura Horton, Ryan Lepic, Jenny Lu, James Waller

Grammaticalization of the body and space in Nicaraguan Sign Language
Kat Montemurro
We look at the role of spatial modulation in the development of person distinctions in Nicaraguan Sign Language (NSL), a language that emerged in the late 1970s in Managua. While space has long been studied in young and emerging sign languages (Senghas 2003, 2010; Padden et al. 2010; Kocab et al. 2015), neutral space is not the only resource for modulation (Meir et al. 2007). In mature sign languages, there is a grammatical first/non-first person distinction that places the body (first person) in opposition to neutral space (non-first person) (Meier 1990; Engberg-Pedersen 1995). Accordingly, we isolate phonological expressions of both the body and space: the use of points, and the use of an axis, i.e., front-back or left-right (figure 1), to establish R-loci.
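A minimal sketch of how such tokens might be coded, assuming each verb token is annotated for whether the referent locus is on the body or in neutral space and, for spatial forms, which axis establishes the R-locus; the fields and example tokens are invented for illustration:

```python
from collections import Counter

# Hypothetical annotation records, one per verb token, mirroring the
# distinctions described above (point vs. axis, body vs. neutral space).
tokens = [
    {"signer_cohort": 1, "form": "point", "locus": "body"},
    {"signer_cohort": 1, "form": "point", "locus": "neutral_space"},
    {"signer_cohort": 2, "form": "axis",  "locus": "neutral_space", "axis": "left-right"},
    {"signer_cohort": 2, "form": "axis",  "locus": "neutral_space", "axis": "front-back"},
]

# Tally how each cohort distributes forms across body vs. neutral space.
by_cohort = Counter((t["signer_cohort"], t["form"], t["locus"]) for t in tokens)
for key, n in sorted(by_cohort.items()):
    print(key, n)
```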

Are non-native speakers of English sensitive to gradient information in speech?
Jenny Lu, Shannon Heald, Howard Nusbaum

A Phonologically Annotated Crosslinguistic Database of Sign Language Lexicons
Aurora Martinez del Rio
This project seeks to develop a cross-linguistic dataset of the lexicons of two historically unrelated sign languages, American Sign Language (ASL) and British Sign Language (BSL), using an articulatory featural coding system. The dataset, comprising codes for handshape, location, and movement features, will be used to explore the relationship between frequency distribution, complexity, and linguistic structure. An additional methodological goal is to examine how sign-naïve and native-signing coders compare in their annotations of handshape configuration.
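A minimal sketch of the kind of analysis such a dataset supports, assuming each sign is stored as a record of feature codes; the feature labels and entries below are invented and do not reflect the project's actual coding system:

```python
from collections import Counter

# Invented sample entries: each sign is a record of articulatory feature codes.
lexicon = {
    "ASL": [
        {"handshape": "B", "location": "neutral_space", "movement": "straight"},
        {"handshape": "1", "location": "chin",          "movement": "arc"},
        {"handshape": "B", "location": "torso",         "movement": "straight"},
    ],
    "BSL": [
        {"handshape": "5", "location": "neutral_space", "movement": "circle"},
        {"handshape": "B", "location": "neutral_space", "movement": "straight"},
    ],
}

def feature_distribution(language: str, feature: str) -> Counter:
    """Frequency distribution of one feature's values across a lexicon."""
    return Counter(entry[feature] for entry in lexicon[language])

for lang in lexicon:
    print(lang, feature_distribution(lang, "handshape"))
# ASL Counter({'B': 2, '1': 1})
# BSL Counter({'B': 1, '5': 1})
```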
