# What can TACO be used for?

# Why should I use TACO?

## TACO is fast

Under the hood, the TACO library employs a novel compiler-based technique to generate kernels that are optimized for the computations you want to perform. This enables TACO to achieve performance that exceeds the MATLAB Tensor Toolbox by up to several orders of magnitude and that is competitive with other high-performance sparse linear and tensor algebra libraries like Eigen, Intel MKL, and SPLATT.

## TACO is versatile

The compiler-based technique that underlies TACO enables it to support a wide variety of linear and tensor algebra operations, ranging from simpler ones like sparse matrix-vector multiplication (SpMV) to more complex ones like the matricized tensor times Khatri-Rao product (MTTKRP) on tensors of any order. Tensors can be stored in a wide range of storage formats, including many commonly used sparse matrix and tensor formats such as compressed sparse row (CSR).
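To make the storage-format point concrete, here is a minimal sketch, in plain Python, of the kind of CSR sparse matrix-vector multiply kernel that TACO can generate. This is not TACO's actual output; the array names `pos`, `crd`, and `vals` are chosen here to loosely follow TACO's level-format terminology.

```python
# Sketch of a CSR sparse matrix-vector multiply (y = A @ x), illustrating
# the kind of kernel TACO generates for this format. Not TACO's actual
# output; the names pos/crd/vals are illustrative.

def spmv_csr(pos, crd, vals, x):
    """pos[i]..pos[i+1] delimits the nonzeros of row i; crd holds their
    column indices and vals holds their values."""
    n_rows = len(pos) - 1
    y = [0.0] * n_rows
    for i in range(n_rows):
        # Only iterate over the stored (nonzero) entries of row i.
        for p in range(pos[i], pos[i + 1]):
            y[i] += vals[p] * x[crd[p]]
    return y

# The 3x3 matrix [[2, 0, 0], [0, 0, 3], [0, 4, 0]] in CSR form:
pos  = [0, 1, 2, 3]
crd  = [0, 2, 1]
vals = [2.0, 3.0, 4.0]

print(spmv_csr(pos, crd, vals, [1.0, 1.0, 1.0]))  # [2.0, 3.0, 4.0]
```

Because the loop structure depends on how each dimension is stored, a different format (say, a dense row dimension over a dense column dimension) would need a different kernel; generating the right kernel for each format combination is exactly what TACO automates.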

## TACO is easy to use

With TACO, you can define even complex tensor algebra computations on dense and sparse tensors in just a few lines of C++ or Python code using tensor index notation. The TACO library then generates the potentially very complicated kernels needed to perform your desired computation.
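Tensor index notation reads much like the underlying math: each dimension is named by an index variable, and summation over indices that appear only on the right-hand side is implicit. As a rough illustration using NumPy's `einsum` (an analogy, not TACO's API):

```python
import numpy as np

# Index-notation analogy via NumPy's einsum (not TACO's API).
# The expression  y(i) = A(i,j) * x(j)  sums implicitly over j:
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([1.0, 1.0])
y = np.einsum("ij,j->i", A, x)
print(y)  # [3. 7.]

# A higher-order example, MTTKRP on a 3rd-order tensor:
#   M(i,r) = B(i,j,k) * C(j,r) * D(k,r),  summing over j and k.
B = np.ones((2, 3, 4))
C = np.ones((3, 5))
D = np.ones((4, 5))
M = np.einsum("ijk,jr,kr->ir", B, C, D)
```

In TACO the same one-line expressions drive code generation for whatever dense or sparse formats the operand tensors use, rather than operating on dense arrays as `einsum` does here.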

# Media

# Acknowledgements

TACO is developed by members of the Commit research group in MIT CSAIL and is built on work supported by the National Science Foundation under Grant No. CCF-1533753, by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research under Award Numbers DE-SC008923 and DE-SC014204, by the Direction Générale de l'Armement (Projet ERE 2016929), and by the Toyota Research Institute.