 | RESEARCH |
Our research focuses on system-level optimizations for emerging applications such as deep learning frameworks and big data analytics platforms. Our current interests include:
- Compiler Optimization for Deep Learning Frameworks
- Optimization for Big Data Frameworks
- System Software for Emerging Non-Volatile Memory and Storage
Current Grants
- QoS Compiler for Multi-Tenant Deep Learning Applications, National Research Foundation (NRF), 2025.09-2028.08
  - Explore compiler techniques to enhance Quality of Service (QoS) for deep learning applications in multi-tenant GPU environments.
- Co-design of Peta-Scale Host-Storage for Large AI Applications, Samsung Electronics, 2022.05-2026.05
  - Investigate the characteristics of big-data and deep learning applications running on specialized SSDs and explore the design space.
Past Grants