Jon Rawski

Researcher

Stony Brook University

About

I am a PhD candidate in the Linguistics Department at Stony Brook University, advised by Jeffrey Heinz. I am also a Graduate Assistant at the Center for Neural Circuit Dynamics and a Junior Research Fellow at the Institute for Advanced Computational Science.

My work concerns the mathematics of human language and learning. These cognitive feats emerge from humans’ unique neuronal structure and computing power, so linguistic insight can inform the broader cognitive sciences and artificial intelligence. I frequently use tensor algebra, formal language theory, grammatical inference, differentiable machine learning, and finite model theory to clarify the nature of the human mental capacity for language.

Interests

  • Computational/Mathematical Linguistics
  • Cognitive Science
  • Artificial Intelligence

Education

  • PhD in Linguistics, 2021 (expected)

    Stony Brook University

  • MSci in Cognitive Science, 2016

    Higher School of Economics

  • BA in Linguistics, 2013

    University of Minnesota

Recent & Upcoming Presentations

See my CV for a full list of my talks.

Computational Restrictions on Iterative Prosodic Processes

We formalize various iterative prosodic processes, including stress, syllabification, and epenthesis, using logical graph transductions, showing that the necessary use of fixed-point operators without quantification restricts them to a structured subclass of the subsequential functions.
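
As a toy illustration of the subsequential idea (a hypothetical fragment, not the paper's formalism): a binary stress pattern that stresses every odd-numbered syllable from the left can be computed in a single deterministic left-to-right pass, remembering only one bit of state.

    # Toy sketch: iterative binary stress as a subsequential function.
    # One deterministic left-to-right pass suffices, tracking only the
    # parity of the current syllable (a single bit of state).
    def iterative_stress(syllables):
        out, parity = [], 0
        for syll in syllables:
            out.append("'" + syll if parity == 0 else syll)
            parity = 1 - parity          # flip stressed/unstressed
        return out

    print(iterative_stress(["ba", "na", "na"]))   # ["'ba", 'na', "'na"]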

Overcoming Poverty of Stimulus with Structure and Parameters

We show how phonological typology computationally emerges from combinations of relativized representations and the simplest of a suite of online learning algorithms.
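
For a sense of how simple the learning side can be, here is a minimal sketch of an online string-extension learner for a strictly 2-local (bigram) phonotactic grammar; the paper's algorithms and representations differ in detail.

    # Minimal sketch: an online bigram phonotactic learner. Each datum
    # adds its attested bigrams to the grammar; a word is licit iff all
    # of its bigrams are attested. Illustrative only.
    def bigrams(word):
        padded = "#" + word + "#"              # word boundaries
        return {padded[i:i+2] for i in range(len(padded) - 1)}

    grammar = set()
    for datum in ["aba", "ab", "ba"]:          # positive data stream
        grammar |= bigrams(datum)              # one online update per datum

    def licit(word):
        return bigrams(word) <= grammar

    print(licit("abab"))   # True: every bigram attested
    print(licit("aa"))     # False: 'aa' never observed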

Understanding Machine Learning with Language and Tensors

I analyze the functional expressivity of sequential neural machine learning using tensor algebra and simulations on formal languages.
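
A minimal sketch of the underlying observation, assuming a hand-built network rather than a trained one: with one-hot inputs and an identity activation, an RNN's state update is multilinear, so contracting a third-order transition tensor with the input exactly simulates a finite automaton, here for the formal language (ab)*.

    # Sketch: with identity activation and one-hot input x, the RNN
    # update is multilinear: contracting a 3rd-order tensor T with x
    # selects a transition matrix. This hand-built "network" simulates
    # the 2-state automaton for (ab)*; nothing is trained.
    import numpy as np

    T = np.zeros((2, 2, 2))              # T[symbol] = transition matrix
    T[0] = [[0, 0], [1, 0]]              # 'a': state 0 -> state 1
    T[1] = [[0, 1], [0, 0]]              # 'b': state 1 -> state 0
    one_hot = {"a": np.array([1.0, 0.0]), "b": np.array([0.0, 1.0])}

    def accepts(string):
        h = np.array([1.0, 0.0])         # start in state 0
        for sym in string:
            A = np.einsum("ijk,i->jk", T, one_hot[sym])  # select matrix
            h = A @ h                    # multilinear state update
        return h[0] == 1.0               # accept iff back in state 0

    print(accepts("abab"), accepts("aab"))   # True False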

Projects & Publications

The Computational Power of Harmony

We give a computational overview of vowel harmony, describing necessary and sufficient conditions on its phonotactics, processes, and learning.
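
As a toy illustration of why progressive harmony sits in the subsequential class (using an invented four-vowel inventory, nothing like the paper's empirical coverage): the first vowel fixes a backness value and every later vowel assimilates to it, so one memorized bit suffices.

    # Toy sketch: progressive backness harmony as a one-pass
    # (subsequential) function. The inventory {i,e,u,o} and the rule
    # are illustrative assumptions only.
    FRONT, BACK = set("ie"), set("uo")
    FLIP = {"i": "u", "e": "o", "u": "i", "o": "e"}   # front <-> back

    def harmonize(word):
        trigger, out = None, []
        for seg in word:
            if seg in FRONT or seg in BACK:
                if trigger is None:
                    trigger = seg in FRONT       # remember one bit
                elif trigger != (seg in FRONT):
                    seg = FLIP[seg]              # assimilate to trigger
            out.append(seg)
        return "".join(out)

    print(harmonize("kitub"))   # 'kitib': 'u' fronted to match 'i'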

Phonological Abstractness in the Mental Lexicon

We review the notion of phonological abstractness, the various types of evidence for it, and its consequences for linguistics and psychology.

Typology Emerges from Simplicity in Representations and Learning

We derive the well-studied subregular classes of formal languages, which computationally characterize natural language typology, purely from the perspective of algorithmic learning problems.
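
A hedged sketch of this perspective, with invented helper names: parameterizing a single string-extension learner by its factor function yields different subregular classes, e.g. contiguous factors give Strictly Local languages while precedence pairs give Strictly Piecewise.

    # Sketch: one learner, two representations, two subregular classes.
    # The helper names and data are illustrative assumptions.
    from itertools import combinations

    def sl_factors(w):                      # contiguous 2-factors
        return set(zip(w, w[1:]))

    def sp_factors(w):                      # 2-subsequences (precedence)
        return set(combinations(w, 2))

    def learn(data, factors):
        g = set()
        for w in data:
            g |= factors(w)                 # string-extension update
        return lambda w: factors(w) <= g    # the learned acceptor

    accept_sl = learn(["aab", "abb"], sl_factors)
    accept_sp = learn(["aab", "abb"], sp_factors)
    print(accept_sl("aabb"), accept_sp("ba"))   # True False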

The Logical Nature of Phonology Across Speech and Sign

This article examines whether the computational properties of phonology hold across spoken and signed languages, using model theory and logical transductions.
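
To fix ideas, here is a minimal sketch of the model-theoretic view, with an invented constraint ("no nasal immediately followed by a stop") checked by brute force over a word model of positions and successor; the paper works with far richer models for speech and sign.

    # Sketch: a word as a finite relational model (positions, successor,
    # unary label predicates), with a first-order constraint evaluated
    # by brute force. The constraint *[nasal][stop] is invented purely
    # for illustration.
    NASAL, STOP = set("mn"), set("ptk")

    def satisfies(word):
        # forall x: not (nasal(x) and stop(succ(x)))
        return all(not (word[i] in NASAL and word[i + 1] in STOP)
                   for i in range(len(word) - 1))

    print(satisfies("ama"), satisfies("ampa"))    # True False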

What can formal language theory do for animal cognition studies?

We comment on mathematical fallacies present in artificial grammar learning experiments and suggest how to integrate psycholinguistic and mathematical results.

Probing RNN Encoder-Decoder Generalization of Subregular Functions using Reduplication

We analyze the expressivity of a variety of recurrent encoder-decoder networks, showing they are limited to learning subsequential functions, and connecting RNNs with attention mechanisms to a class of deterministic 2-way transducers.
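
A sketch of the style of probing data, under assumed alphabet and length bounds: total reduplication (w -> ww) requires unbounded copying, which no one-way deterministic (subsequential) transducer can perform but a deterministic 2-way transducer can, making it a sharp diagnostic for seq2seq networks.

    # Sketch: generating total-reduplication pairs (w -> ww) of the sort
    # used to probe encoder-decoder networks. Alphabet and length bounds
    # are assumptions for illustration.
    import random

    def reduplication_pairs(n, alphabet="abc", max_len=6):
        pairs = []
        for _ in range(n):
            w = "".join(random.choices(alphabet,
                                       k=random.randint(1, max_len)))
            pairs.append((w, w + w))             # (input, target)
        return pairs

    for inp, tgt in reduplication_pairs(3):
        print(inp, "->", tgt)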

Multi-Input Strictly Local Functions for Templatic Morphology

We provide an automata-theoretic characterization of templatic morphology, extending strict locality to consider n-ary functions.
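
A toy sketch of the multi-input idea, using the common CV-template notation rather than the paper's formalism: a 2-ary function reads the root and the template in one synchronized left-to-right pass, filling consonant slots from the root tape.

    # Toy sketch of a 2-ary (multi-input) string function: interleave a
    # consonantal root with a CV template in a single synchronized pass.
    def fill_template(root, template, vowel="a"):
        consonants = iter(root)              # the root tape
        return "".join(next(consonants) if slot == "C" else vowel
                       for slot in template)

    print(fill_template("ktb", "CVCVC"))   # 'katab' (cf. Arabic kataba)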

Multi-Input Strictly Local Functions for Tonal Phonology

We provide an automata-theoretic characterization of tonal phonology, extending strict locality to consider n-ary functions.

History of Phonology: Learnability

This chapter examines the brief but vibrant history of learnability in phonology.

Tensor Product Representations of Subregular Formal Languages

I provide a vector space characterization of the Star-Free and Locally Threshold Testable classes of formal languages, over arbitrary data structures.
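
A minimal sketch of the flavor of such a characterization, using an assumed filler/role-style encoding rather than the paper's construction: a string's adjacent pairs are summed as outer products of one-hot symbol vectors, and a local constraint becomes a linear functional on the resulting tensor.

    # Sketch: encode a string's bigrams as a sum of outer products of
    # one-hot symbol vectors; a 2-local constraint (*bb here, invented
    # for illustration) is then a linear functional on that tensor.
    import numpy as np

    ALPHABET = "ab"
    E = {s: np.eye(len(ALPHABET))[i] for i, s in enumerate(ALPHABET)}

    def encode(word):
        M = np.zeros((len(ALPHABET), len(ALPHABET)))
        for x, y in zip(word, word[1:]):
            M += np.outer(E[x], E[y])          # filler (x) filler
        return M

    def violates_bb(word):
        return encode(word)[1, 1] > 0          # <M, e_b (x) e_b> != 0

    print(violates_bb("abab"), violates_bb("abba"))   # False True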

Learning with Partially Ordered Representations

We describe a partial order on the space of model-theoretic constraints and a learning algorithm for constraint inference.

Finite-State Locality in Semitic Root-and-Pattern Morphology

We describe the finite-state nature of root-and-pattern morphology using Semitic as a case study, and discuss issues of finiteness vs. infinity, and template emergence.

No Free Lunch in Linguistics or Machine Learning: Reply to Pater

We caution about confusing ignorance of biases with absence of biases in machine learning and linguistics, especially for neural networks.

Quantified Sentences as a Window into Prediction and Priming: An ERP Study

We used event-related potentials (ERPs) to examine the processing of quantified sentences in an auditory/visual truth-value judgment task, specifically probing how truth value and quantifier type influence the N400 and ERP markers of quantifier complexity.

Phonological Complexity is Subregular: Evidence from Sign Language

Using string representations, I show that the complexity of several phonological processes is subregular across both speech and sign.

Research Positions


Graduate Assistant

Stony Brook Center for Neural Circuit Dynamics

Jan 2019 – Present

Research Assistant

Stony Brook Computational Linguistics Lab

Jun 2018 – Present

Research Fellow

Institute for Advanced Computational Science

Sep 2017 – Present

Graduate Assistant

HSE Theoretical Neuroscience Group

Sep 2015 – Jul 2016

Research Assistant

HSE Neurolinguistics Laboratory

Jan 2015 – May 2015

Undergraduate Researcher

UMN Dept. of Speech-Language-Hearing Sciences

Jan 2013 – Aug 2013

Student Manager, Research Assistant

NuMI Off-Axis Electron Neutrino Appearance (NOvA) Laboratory, UMN Physics Dept.

Jun 2012 – Aug 2013