Explore an innovative approach to deep learning model sparsity in this 16-minute conference talk from OSDI '22. Learn about Tensor-with-Sparsity-Attribute (TeSA), a new abstraction that augments the default Tensor abstraction, which is designed for dense models. Discover how TeSA allows sparsity attributes and patterns to be specified, propagated, and exploited across an entire deep learning model, yielding highly efficient, specialized operators. Understand how the SparTA framework accommodates diverse sparsity patterns and optimization techniques, delivering significant speedups in inference latency over state-of-the-art solutions. Gain insights into the evolution of sparsity patterns, the obstacles to sparsity optimization, and the importance of end-to-end model sparsity. Examine the framework's architecture, execution transformation, and code specialization techniques, as well as its performance across various sparsity patterns and models.
SparTA: Deep-Learning Model Sparsity via Tensor-with-Sparsity-Attribute
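
To give a rough sense of the TeSA idea described above, the sketch below pairs a dense tensor with a per-element sparsity attribute and shows how that attribute could be propagated through a matrix multiply. This is only an illustrative approximation, not SparTA's actual API; the `TeSA` dataclass and `matmul_propagate` function here are hypothetical names invented for this example.

```python
import numpy as np
from dataclasses import dataclass


@dataclass
class TeSA:
    """Illustrative tensor-with-sparsity-attribute: values plus a 0/1 mask
    marking which elements may be nonzero (1 = kept, 0 = pruned)."""
    values: np.ndarray
    attr: np.ndarray  # same shape as values


def matmul_propagate(a: TeSA, b: TeSA) -> TeSA:
    """Multiply two attributed tensors and derive the output's attribute.

    Output element (i, j) can only be nonzero if some k exists with
    a.attr[i, k] and b.attr[k, j] both set; otherwise it is provably zero,
    which is the kind of information a specialized kernel could exploit.
    """
    out_vals = (a.values * a.attr) @ (b.values * b.attr)
    out_attr = ((a.attr.astype(np.int64) @ b.attr.astype(np.int64)) > 0).astype(np.int8)
    return TeSA(out_vals * out_attr, out_attr)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # A block-sparse weight: only the top-left 2x2 block is kept.
    w_attr = np.zeros((4, 4), dtype=np.int8)
    w_attr[:2, :2] = 1
    w = TeSA(rng.standard_normal((4, 4)), w_attr)
    x = TeSA(rng.standard_normal((4, 4)), np.ones((4, 4), dtype=np.int8))
    y = matmul_propagate(x, w)
    print(y.attr)  # columns 2-3 are provably zero and could be skipped entirely
```

In this toy version, propagating the attribute through the operator tells downstream operators which parts of their inputs are guaranteed zero, mirroring (in spirit) how the talk describes sparsity attributes flowing end to end through a model.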