Hello there! I'm a first-year MLT student at Carnegie Mellon's Language Technologies Institute, where I'm fortunate to be co-advised by Daniel Fried, Matt Gormley, and Vincent Hellendoorn.

I'm broadly interested in NLP for code, though I've recently been thinking about bringing interactivity into code generation. Code generation, like many NLP tasks, is usually posed as a problem of mapping an input to an output in a single turn of interaction. But in an era of LLMs and chatbots, is this still the best framing of the task? Outside of code, I'm also interested in natural language generation – specifically, in what methods from the past can (and cannot) teach us about how to decode with today's models.

Before this, I also did my undergrad at CMU, in computer science with a concentration in machine learning. In my free time, I enjoy playing and watching soccer. Please, dear god, let Tottenham win something this season :')


It's MBR All the Way Down: Modern Generation Techniques Through the Lens of Minimum Bayes Risk
Amanda Bertsch*, Alex Xie*, Graham Neubig, Matthew R. Gormley
In Proceedings of the First Big Picture Workshop. 2023.