Tu Vu
I am a Research Scientist at Google DeepMind and an incoming Assistant Professor at Virginia Tech. Previously, I received my PhD in Computer Science at the University of Massachusetts Amherst, advised by Mohit Iyyer. I also spent three years (2020–2023) as a research intern/student researcher at Google DeepMind and Google Research. My research aims to develop effective and efficient methods for advancing and democratizing artificial intelligence in the era of large language models (LLMs).
My current research focuses on addressing limitations of LLMs and developing novel learning paradigms. Specific areas of focus include:
- In-context learning and tool-use LLMs: injecting knowledge into LLM prompts and augmenting LLMs with external tools
- Instruction tuning: enhancing LLMs’ instruction-following capabilities
- Parameter-efficient transfer learning: efficiently transferring knowledge across tasks, languages, and modalities
- Long-context modeling: designing efficient model architectures for long sequences
- Advanced planning and reasoning: improving LLMs’ ability to solve complex reasoning problems
- Few-shot learning: learning from limited human-labeled data
If you are interested in doing a PhD at Virginia Tech and joining my lab, please apply to the Virginia Tech Graduate School and list me as a potential advisor. Please also check out the application deadlines and information for prospective students.
Recent news
Feb. 2024 | I am now serving as an Area Chair for ACL Rolling Review (ARR)
Jan. 2024 | Flan-MoE was accepted to ICLR 2024!
Nov. 2023 | Talk at the Graph Neural Networks Reading Group, Google
Oct. 2023 | New preprint on LLM factuality
Aug. 2023 | I joined Google Research in Mountain View, CA as a Research Scientist
Jul. 2023 | I successfully defended my PhD thesis!
Advisees
Quyet Do (incoming PhD student @ Virginia Tech)
Thinh Pham (incoming PhD student @ Virginia Tech)
Rishab Balasubramanian (incoming PhD student @ Virginia Tech)
Linus Pin-Jie Lin (incoming PhD student @ Virginia Tech)
Prateek Yadav (Research Intern @ Google Gemini, Summer 2024)
Simeng (Shirley) Han (Student Researcher @ Google DeepMind, Summer 2024)
Dheeraj Mekala (PhD student @ UCSD, Spring & Summer 2022; one paper accepted to EMNLP 2022)
Selected publications
For an up-to-date list of my research papers, please see my Google Scholar profile or my Semantic Scholar profile.
- FreshLLMs: Refreshing large language models with search engine augmentation. arXiv preprint arXiv:2310.03214, 2023 (Preprint)