I am a Ph.D. student in the Department of Linguistics at Cornell University and a consultant at the National Institute of Standards and Technology. My main research interests are computational models of semantics and unsupervised learning of syntax and semantics. In particular, I am interested in realistic computational learning algorithms and finite-state representations of knowledge and meaning. My research concerns the interface between syntax and semantics, as well as the relationships among computer science, linguistics, and cognitive science.
I am currently working on a model of possible world semantics capable of efficiently representing regular sets of worlds and relations between them. The system is fully compositional, with well-defined lexical semantics. Though the system is somewhat limited in its expressive power, it can represent many semantic phenomena, including definiteness, modality, plurality, and presupposition. The system can be used efficiently for applications such as natural language interfaces for games or knowledge systems, and it provides insight into the cognitive science of language-based reasoning.
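To give a flavor of the finite-state approach (this is an illustrative sketch, not the actual system): if each possible world is encoded as a string over a small alphabet, then a proposition can be represented as the regular set of worlds a DFA accepts, and conjunction of propositions corresponds to the standard product (intersection) construction. The alphabet, the toy propositions, and all names below are invented for illustration.

```python
# Sketch: propositions as regular sets of "worlds", where each world is a
# string over a small alphabet and a proposition is the set of worlds a
# DFA accepts. Conjunction = intersection via the product construction.

from itertools import product

class DFA:
    def __init__(self, states, alphabet, delta, start, accept):
        self.states, self.alphabet = states, alphabet
        self.delta, self.start, self.accept = delta, start, accept

    def accepts(self, world):
        """Run the automaton over a world-string; accept iff we end in an accepting state."""
        q = self.start
        for sym in world:
            q = self.delta[(q, sym)]
        return q in self.accept

def intersect(a, b):
    """Product DFA: accepts exactly the worlds satisfying both propositions."""
    states = set(product(a.states, b.states))
    delta = {((qa, qb), s): (a.delta[(qa, s)], b.delta[(qb, s)])
             for (qa, qb) in states for s in a.alphabet}
    accept = {(qa, qb) for (qa, qb) in states
              if qa in a.accept and qb in b.accept}
    return DFA(states, a.alphabet, delta, (a.start, b.start), accept)

# Toy worlds: strings over {'r', 's'} recording per-day weather (rain/sun).
# Proposition "it rains on day 0": first symbol is 'r'.
rains_day0 = DFA(
    states={0, 1, 2}, alphabet={'r', 's'},
    delta={(0, 'r'): 1, (0, 's'): 2,
           (1, 'r'): 1, (1, 's'): 1,
           (2, 'r'): 2, (2, 's'): 2},
    start=0, accept={1})

# Proposition "it rains every day": every symbol is 'r'.
always_rains = DFA(
    states={0, 1}, alphabet={'r', 's'},
    delta={(0, 'r'): 0, (0, 's'): 1,
           (1, 'r'): 1, (1, 's'): 1},
    start=0, accept={0})

both = intersect(rains_day0, always_rains)
print(both.accepts('rrr'))  # True: rains on day 0 and every day
print(both.accepts('rsr'))  # False: sunny on day 1
```

Because regular languages are closed under union, intersection, and complement, the Boolean connectives come for free, which is one reason finite-state representations are attractive for this kind of model.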
I am also working on an algorithm for unsupervised learning of syntax in various formalisms, including Combinatory Categorial Grammars, Relational Grammars, and Minimalist Grammars. This algorithm learns purely from strings, using probabilistic heuristics to improve learnability.
I have worked on the editing team for Semantics and Linguistic Theory 26, and I am currently on the organizing committee for NELS 49. I have also helped run several workshops for the Cornell Linguistics Circle on topics including natural language processing, LaTeX for linguists, and basic web development.