Positions
Prospective Students

  • For all prospective students (undergrad/MSc/PhD), please read this before you contact me. Due to the large volume of emails I receive, I may not be able to reply to everyone; apologies!
  • To international undergrad students: Please email me only if you have your own funding for an internship, as I do not have any funding to support interns.

Research Areas
  • Neuro-Symbolic AI Agents, e.g. integrating LLMs and symbolic methods for multi-modal agents that operate/learn through perception, planning, acting, and feedback
  • Machine Learning, e.g. deep generative models, Bayesian learning and inference, deep causal models, explainable AI, deep reinforcement learning
  • Knowledge Graphs and LLMs, e.g. effective approaches for hybrid LLM-KG reasoning, certifying LLM knowledge/reasoning
  • Efficient and Effective LLMs, e.g. better architectures, better decoding/inference, better training algorithms (long-context LLMs, KV caching)
  • Continual Learning, e.g. LLM knowledge editing, retrieval-augmented generation (RAG)
  • Reliable and Safe LLMs, e.g. alignment algorithms (RLHF), socio-cultural norms, jailbreaking and defence mechanisms
  • Multilingual and Multimodal Foundation Models, e.g. covering low-resource languages, speech/video/text/code modalities
  • Real-life Applications, e.g. text/speech/document translation, question-answering, coding, dialogue systems
Highlights
Recent Grants and Projects
  • Trustworthy Generative AI: Towards Safe and Aligned Foundation Models, 840K, 2024-2026
  • Aligning Large Language Models with Human Intention, eBay Research, 130K, 2024-2027
  • Exploiting Context in Multilingual Understanding and Generation, ARC Future Fellowship, 850K, 2020-2025
  • Hierarchical Abstractions and Reasoning for Neuro-Symbolic Systems, 5.5M, 2023-2027
  • Large-scale multimodal knowledge management: From organization and user modeling to fast contextual presentation, 4M, 2022-2025
  • Dialogue Assistance for Negotiations in Cross-cultural Settings: A Neuro-Symbolic Computational Approach, 3.4M, 2022-2024
  • Learning to Learn and Adapt with Less Labels, 3.2M, 2019-2022
  • Continual Active Learning, eBay Research, 200K, 2022-2023
  • Document-Wide Neural Machine Translation, eBay Research, 130K, 2020-2021
  • Effective Multi-Task Learning for Neural Machine Translation, Amazon Faculty Research Award, 113K, 2019-2020
  • Effective Learning of Document Neural Machine Translation, Google Faculty Research Award, 100K, 2019-2020
  • Document Machine Translation as Deep Structured Prediction, Google Faculty Research Award, 120K, 2018-2019
  • Explaining the Outcomes of Complex Models, ARC DP, 430K, 2018-2022
  • Learning Deep Semantics for Automatic Translation between Human Languages, ARC DP, 450K, 2015-2018
Recent Professional Services
  • Reviewer, International Conference on Learning Representations (ICLR), 2025
  • Senior Area Chair, Conference on Empirical Methods in Natural Language Processing (EMNLP), 2024
  • Reviewer, Neural Information Processing Systems (NeurIPS), 2024
  • Reviewer, Association for Computational Linguistics (ACL), 2024