Recent Posts

Models can be built incrementally by modifying their hyperparameters during training. This is most common in transfer learning settings, in which we seek to adapt the knowledge in an existing model to a new domain or task. The more general problem of continual learning is another obvious application. Even with a predefined data set, however, incrementally constraining the topology of the network can offer regularization benefits.

Dynamic Hyperparameters

The easiest incrementally modified models to train may be those in which hyperparameters are updated at each epoch.
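The excerpt above describes models whose hyperparameters are updated at each epoch. As a minimal sketch of what such a per-epoch update might look like (the function name and the choice of a linearly annealed dropout rate are illustrative assumptions, not taken from the post):

```python
def annealed_dropout(epoch, start=0.0, end=0.5, n_epochs=10):
    """Linearly anneal a dropout rate from `start` to `end` over training.

    Hypothetical schedule for illustration; any epoch-indexed function
    of a hyperparameter fits the same pattern.
    """
    frac = min(epoch / (n_epochs - 1), 1.0)
    return start + frac * (end - start)

# A training loop would reconfigure the model with the new rate at the
# top of each epoch, before running its usual parameter updates.
schedule = [round(annealed_dropout(e), 3) for e in range(10)]
```

The same pattern covers hyperparameters that constrain topology (e.g., gradually tightening a sparsity penalty), which is the regularization angle the excerpt mentions.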


[Note - This is a repost of a post I made on my old blog while I was in undergrad. I’m including it in case someone finds it useful, since my old blog is defunct. I haven’t significantly edited it, so I’m sorry if it doesn’t fit into my current style.] This post is directed to a lay CS audience. I am an undergraduate in CS, so I consider myself part of that audience.



Publications

  • Naomi Saphra and Adam Lopez. Evaluating Informal-Domain Word Representations With UrbanDictionary. The First Workshop on Evaluating Vector Space Representations for NLP, 2016. [paper] [code]

  • Naomi Saphra and Adam Lopez. AMRICA: an AMR Inspector for Cross-language Alignments. Proceedings of NAACL-HLT, 2016. [paper] [code]

  • Ryan Cotterell, Adithya Renduchintala, Naomi Saphra, and Chris Callison-Burch. An Algerian Arabic-French Code-Switched Corpus. LREC Workshop on Free/Open-Source Arabic Corpora and Corpora Processing Tools, 2014.

  • Nathan Schneider, Brendan O’Connor, Naomi Saphra, David Bamman, Manaal Faruqui, Noah A. Smith, Chris Dyer, and Jason Baldridge. A framework for (under)specifying dependency syntax without overloading annotators. Proceedings of the ACL Linguistic Annotation Workshop, 2013. [paper]

  • Andrea Vedaldi, Siddharth Mahendran, Stavros Tsogkas, Subhransu Maji, Ross Girshick, Juho Kannala, Esa Rahtu, Iasonas Kokkinos, Matthew B. Blaschko, David Weiss, Ben Taskar, Karen Simonyan, Naomi Saphra, and Sammy Mohamed. Understanding objects in detail with fine-grained attributes. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 3622-3629, 2014.



Education

PhD, Informatics. University of Edinburgh (in progress, expected 2020).

Master of Science in Engineering, Computer Science. Johns Hopkins University, 2015.

Bachelor of Science, Computer Science. Carnegie Mellon University, 2013.
Minor: Language Technologies

The Recurse Center (Sabbatical): Experimental, unstructured educational programming retreat.

Research Experience

Johns Hopkins University (2013-2015):

Research Assistant. Machine translation, informal domain adaptation, and semantics research in pursuit of a PhD.

Language Technologies Institute, Carnegie Mellon University (2010-2013):

Research Assistant. Helped design a fragmented dependency parsing framework and implemented Python libraries for it.

The Johns Hopkins University Center for Language and Speech Processing (Summer 2012):

Undergraduate Research Fellow at the CLSP Summer Workshop. On the computer vision team, implemented machine learning algorithms in MATLAB to verify and explore boundary annotations from Mechanical Turk.

Industry Experience

Koko, Inc. (2017):

Contractor. Developed neural classifiers for informal text at an NYC startup.

Google (Summer 2015):

SWE Intern, Machine Intelligence group. Worked on the semantics of complex noun phrases.

Google (Summer 2013):

SWE Intern, Machine Intelligence group. Used models of real-world knowledge as feedback for semantic annotation systems.

Facebook, Inc. (Summer 2011):

Engineering Intern. Engineered and tested new features for the People You May Know model.


Teaching Experience

Tutor, University of Edinburgh (2016-2018):

Machine Learning and Pattern Recognition; Probabilistic Modelling and Reasoning; Informatics Research Review.

Course Assistant, Johns Hopkins University (Fall 2013):

Natural Language Processing.

Teacher’s Assistant, Irvington High School (2008):

Ancient Greek. (I will never drop this item from my CV.)


Service

  • Student Social Co-Chair for ACL 2014.
  • Volunteer for North American Computational Linguistics Olympiad.
  • Carnegie Mellon TechNights volunteer (outreach program).
  • Carnegie Mellon Tartanhacks Mentor (hackathon).


Awards

  • Google Europe Scholarship for Students with Disabilities, 2017.
  • Carnegie Mellon School of Computer Science Dragon Award.
  • National Merit Semifinalist.
  • Nannina Rasulo Memorial Scholarship Award for Technology, 2009.