Associate Supervisor of Truong Giang Do
PhD candidature, Deakin University, 2024–2027
F975, PhD candidate Truong Giang Do. Improving Foundation Models by Addressing the Binding Problem. 2024–2027.
Publications
Rethinking Sparse Mixture of Experts from a Unified Perspective
Published/Accepted at ICML, 2026
Authors: Giang Do, Hung Le, Truyen Tran
Link
Eigenvectors of Experts are Training-free Non-collapsing Routers
Published/Accepted at ICML (Spotlight), 2026
Authors: Giang Do, Hung Le, Truyen Tran
Link
Do Domain-specific Experts exist in MoE-based LLMs?
Published/Accepted at ACL-Findings, 2026
Authors: Giang Do, Hung Le, Truyen Tran
Link
On the Role of Discrete Representation in Sparse Mixture of Experts
Published/Accepted at TMLR, 2025
Authors: Giang Do, Kha Pham, Hung Le, Truyen Tran
Link
SimSMoE: Toward Efficient Training Mixture of Experts via Solving Representational Collapse
Published/Accepted at NAACL-Findings, 2025
Authors: Giang Do, Hung Le, Truyen Tran
Link