
PET: Optimizing Tensor Programs with Partially Equivalent Transformations and Automated Corrections

Authors: Haojie Wang; Jidong Zhai; Mingyu Gao; Zixuan Ma; Shizhi Tang; Liyan Zheng; Yuanzhi Li; Kaiyuan Rong; Yuanyong Chen; Zhihao Jia

High-performance tensor programs are critical for efficiently deploying deep neural network (DNN) models in real-world tasks. Existing frameworks optimize tensor programs by applying fully equivalent transformations, which maintain equivalence on every element of the output tensors. This approach misses optimization opportunities, because transformations that preserve equivalence only on subsets of the output tensors are excluded. We propose PET, the first DNN framework that optimizes tensor programs with partially equivalent transformations and automated corrections. PET discovers and applies program transformations that improve computation efficiency but maintain only partial functional equivalence; it then automatically corrects the results to restore full equivalence. We develop rigorous theoretical foundations that simplify equivalence examination and correction for partially equivalent transformations, and design an efficient search algorithm that quickly discovers highly optimized programs by combining fully and partially equivalent optimizations at the tensor, operator, and graph levels. Our evaluation shows that PET outperforms existing systems by up to 2.5× by unlocking previously missed opportunities from partially equivalent transformations.
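To make the idea concrete, here is a minimal NumPy sketch of the pattern the abstract describes. It is not PET's implementation: the chunked 1-D convolution, the chunk size, and the correction routine are illustrative assumptions. The transformed program is faster to parallelize per chunk but is correct only for interior outputs; an automated correction recomputes exactly the boundary elements to restore full equivalence.

```python
import numpy as np

def conv1d_same(x, w):
    """Reference program: 1-D 'same'-padded convolution (cross-correlation,
    as in DNN frameworks), kernel size k."""
    k = len(w)
    pad = k // 2
    xp = np.pad(x, pad)
    return np.array([np.dot(xp[i:i + k], w) for i in range(len(x))])

def conv1d_chunked(x, w, chunk=8):
    """Partially equivalent transformation: split x into chunks and convolve
    each chunk independently, as if each were a separate batch element.
    Interior outputs match the reference; outputs next to a chunk boundary
    do not, because neighbors across the boundary are treated as zeros."""
    n = len(x)
    out = np.empty(n)
    for s in range(0, n, chunk):
        out[s:s + chunk] = conv1d_same(x[s:s + chunk], w)
    return out

def correct_boundaries(x, w, out, chunk=8):
    """Automated correction: recompute only the non-equivalent elements
    (pad outputs on each side of every internal chunk boundary)."""
    k, pad, n = len(w), len(w) // 2, len(x)
    xp = np.pad(x, pad)
    for s in range(chunk, n, chunk):
        for i in range(s - pad, s + pad):  # boundary region around s
            out[i] = np.dot(xp[i:i + k], w)
    return out

x = np.random.rand(32)
w = np.array([0.25, 0.5, 0.25])
ref = conv1d_same(x, w)
fast = conv1d_chunked(x, w)
assert not np.allclose(fast, ref)                         # only partially equivalent
assert np.allclose(correct_boundaries(x, w, fast), ref)   # full equivalence restored
```

Note that the correction touches only a few elements per chunk boundary, which reflects the abstract's premise: a partially equivalent transformation can be profitable as long as restoring full equivalence is cheap relative to the savings of the transformed program.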