A. Castelló, J. Bellavita, G. Dinh, Y. Ikarashi, H. Martínez. Tackling the Matrix Multiplication Micro-kernel Generation with Exo. CGO 2024. PDF download
C. Hong, Q. Huang, G. Dinh, M. Subedar, Y. S. Shao. DOSA: Differentiable Model-Based One-Loop Search for DNN Accelerators. MICRO 2023. PDF download
G. Dinh, I. Kannan, H. Luo, C. Hong, Y. Cho, J. Demmel, X. S. Li, Y. Liu. Sample-Efficient Mapspace Optimization for DNN Accelerators with Bayesian Learning. MLArchSys 2023. PDF download
C. Hong, Q. Huang, G. Dinh, Y. S. Shao. DOSA: One-Loop DSE for DNN Accelerators Using Differentiable Models. MLArchSys 2023. PDF download
S. Kim, C. Hooper, T. Wattanawong, M. Kang, R. Yan, H. Genc, G. Dinh, Q. Huang, K. Keutzer, M. Mahoney, Y. S. Shao, A. Gholami. Full Stack Optimization of Transformer Inference. ASSYST 2023. PDF download
Y. Cho, J. Demmel, G. Dinh, X. S. Li, Y. Liu, H. Luo, O. Marques, W. Sid-Lakhdar. GPTune User Guide. PDF download
A. Chen, J. Demmel, G. Dinh, M. Haberle, O. Holtz. Communication Bounds for Convolutional Neural Networks. PASC 2022. PDF download
Q. Huang, M. Kang, G. Dinh, T. Norell, A. Kalaiah, J. Demmel, J. Wawrzynek, Y. S. Shao. CoSA: Scheduling by Constrained Optimization for Spatial Accelerators. ISCA 2021. PDF download
G. Dinh and J. Demmel. Communication-Optimal Tilings for Projective Nested Loops with Arbitrary Bounds. SPAA 2020 (brief announcement). PDF download
J. Demmel and G. Dinh. Communication-Optimal Convolutional Neural Nets. MDS 2020. PDF download
I’ve had the privilege of working with some brilliant undergraduate students:
Julian Bellavita - now a PhD student in CS at Cornell
Iniyaal Kannan - MS in CS at UC Berkeley, now at Amazon
Anthony Chen - now a PhD student in applied math and scientific computing at the University of Michigan
Mason Haberle - now a PhD student in math at NYU Courant
In Spring 2022, I organized a reading group on the intersection of theory and hardware.