Hardware Accelerators for Machine Learning (Spring 2022)

ECE 618: Guest Lecture on Jan. 31

Jeff (Jun) Zhang


[Website]

[Related Papers] (request password via wjiang8@gmu.edu)

Title

Towards Energy-Efficient and Robust Domain-Specific Computing: from Circuits to Architectures and Systems

Bio

Jeff (Jun) Zhang is a Postdoctoral Fellow at Harvard University. He received his Ph.D. degree from the Electrical and Computer Engineering Department at New York University in 2020. His general research interests are in deep learning, computer architecture, and electronic design automation (EDA).

Talk Abstract

Deep neural networks (DNNs) have achieved state-of-the-art results in a wide range of machine learning applications, but deploying them on hardware in an energy-efficient and reliable manner remains challenging. In this talk, I will address these challenges and unlock cross-stack opportunities to enable energy-efficient and reliable deep learning acceleration with high accuracy and performance. I will also cover the design of high-quality Machine-Learning-as-a-Service (MLaaS), discuss open issues in deep learning at scale, and introduce novel hardware/software co-design techniques to improve quality of service. To conclude, I will present future research directions centered on AI/ML systems and hardware accelerators.