learning algorithms

The Computational Power of Harmony

We give a computational overview of vowel harmony, describing necessary and sufficient conditions on phonotactics, processes, and learning.

Typology Emerges from Simplicity in Representations and Learning

We derive the well-studied subregular classes of formal languages, which computationally characterize natural language typology, purely from the perspective of algorithmic learning problems.

What can formal language theory do for animal cognition studies?

We comment on mathematical fallacies present in artificial grammar learning experiments and suggest how to integrate psycholinguistic and mathematical results.

Probing RNN Encoder-Decoder Generalization of Subregular Functions using Reduplication

We analyze the expressivity of a variety of recurrent encoder-decoder networks, showing they are limited to learning subsequential functions, and connecting RNNs with attention mechanisms to a class of deterministic 2-way transducers.

History of Phonology: Learnability

This chapter examines the brief but vibrant history of learnability in phonology.

Tensor Product Representations of Subregular Formal Languages

I provide a vector-space characterization of the Star-Free and Locally Threshold Testable classes of formal languages, over arbitrary data structures.

Learning with Partially Ordered Representations

We describe a partial order on the space of model-theoretic constraints and a learning algorithm for constraint inference.

No Free Lunch in Linguistics or Machine Learning: Reply to Pater

We caution against confusing ignorance of biases with the absence of biases in machine learning and linguistics, especially for neural networks.