Yash Kumar Lal

PhD student in Computer Science at Stony Brook University; affiliated with the LUNR lab

Advisor: Niranjan Balasubramanian.

Email: ylal <at> cs.stonybrook <dot> edu

Hello!

I'm a second-year PhD student at Stony Brook University. I am currently exploring problems in natural language understanding, particularly reasoning in stories, in collaboration with Nate Chambers and Ray Mooney. I am also analysing the energy efficiency of NLP models in collaboration with Aruna Balasubramanian.

Previously, I have dabbled in problems in machine translation, clickbait detection, and word sense disambiguation.

I graduated with a Master's degree in Computer Science from Johns Hopkins University in May 2020. I was primarily advised by Philipp Koehn and worked on a variety of problems across natural language processing.

Over the years, I have been involved in several notable side projects. I worked on Ping, a chat platform that let users communicate with each other regardless of the languages they spoke. I also helped build hello friend, a service that allowed people without internet access to use essential online services. In a past life, I was an iOS developer for several small-scale applications.


Updates

August 2021

Demo paper accepted to EMNLP 2021. I'll be attending in person. See you in Punta Cana!

May 2021

Two papers accepted at ACL 2021: one in the main conference and one in Findings.

September 2020

Paper accepted to Findings of EMNLP 2020.

June 2019

Paper accepted at LoResMT, MT Summit 2019.

June 2019

Presenting at ACL 2019. See you in Italy!

July 2018

Presented a short paper and a workshop paper at SIGIR 2018.

November 2017

Won the Acceleprise and RocketSpace awards at the AngelHack Global Hackathon Series 2017, finishing in the top 6 worldwide.

June 2017

Awarded overall first place and the Amazon AWS prize at AngelHack Hyderabad 2017.

May 2017

Finished in the top 10 across India in Microsoft code.fun.do 2017.


Publications

  • IrEne-viz: Visualizing Energy Consumption of Transformer Models
    In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, EMNLP 2021 [Paper] [BibTex] [Poster] [Demo]
  • IrEne: Interpretable Energy Prediction for Transformers
    In Proceedings of the Annual Meeting of the Association for Computational Linguistics, ACL 2021 [Online] [Paper] [BibTex]
  • TellMeWhy: A Dataset for Answering Why-Questions in Narratives
    In Findings of the Association for Computational Linguistics: ACL 2021 [Online] [Paper] [BibTex] [Poster] [Slides]
  • Temporal Reasoning in Natural Language Inference
    In Findings of the Association for Computational Linguistics: EMNLP 2020 [Online] [Paper] [BibTex]
  • Sentence-Level Adaptation for Low-Resource Neural Machine Translation
    In Proceedings of the AMTA 2019 Workshop on Technologies for MT of Low Resource Languages (LoResMT) 2019 [Online] [Paper] [BibTex]
  • De-Mixing Sentiment from Code-Mixed Text
    In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics - Student Research Workshop (ACL-SRW) 2019 [Online] [Paper] [BibTex]
  • Johns Hopkins University Submission for WMT News Translation Task
    In Proceedings of the Fourth Conference on Machine Translation (WMT) 2019 [Online] [Paper] [BibTex]
  • Check It Out: Politics and Neural Networks
    In Proceedings of the CLEF 2018 Fact-Checking Shared Task
  • Identifying Clickbait: A Multi-Strategy Approach Using Neural Networks
    In Proceedings of the 41st International ACM SIGIR Conference on Research & Development in Information Retrieval (SIGIR 2018) [Online] [Paper] [BibTex]
  • SWDE: A Sub-Word And Document Embedding Based Engine for Clickbait Detection
    In Proceedings of the SIGIR 2018 Workshop on Computational Surprise in Information Retrieval (CompS Workshop) [Online] [Paper] [BibTex]

Service