General Architecture
[Figure: TPU compilation toolchain]

TPU-MLIR is an open-source TPU compiler project. It provides a complete toolchain that converts neural networks pre-trained in different frameworks into bmodel binary files that run efficiently on the TPU.
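As a concrete sketch of the front half of this flow, the snippet below drives model_transform.py, the toolchain entry point that converts a framework model into TPU-MLIR's intermediate MLIR form, from Python. The model name, file paths, and input shape are illustrative placeholders, and the exact flags may vary between TPU-MLIR releases.

```python
# Minimal sketch: convert a pre-trained ONNX model into TPU-MLIR's
# intermediate MLIR form. Model name, paths, and input shape are
# illustrative placeholders; flags may differ between releases.
import subprocess

subprocess.run(
    [
        "model_transform.py",
        "--model_name", "resnet18",
        "--model_def", "resnet18.onnx",        # pre-trained model from a framework
        "--input_shapes", "[[1,3,224,224]]",   # NCHW input shape of the network
        "--mlir", "resnet18.mlir",             # resulting intermediate MLIR file
    ],
    check=True,
)
```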
Features
Open Source
All code of the TPU-MLIR project is open and free to all users. Users can take part in improving the project and jointly build an AI compiler at the forefront of the field.
Easy to Start
After reading the development manual and the examples in the project, users can quickly understand the model conversion process and the principles behind it. In addition, since TPU-MLIR is built on MLIR, the current mainstream compiler infrastructure, it also serves as a hands-on project for users who want to learn and practice MLIR.
Easy to Use
The project provides a complete toolchain, so users can convert models directly through the existing interfaces and save the time and cost of adapting each network by hand.
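Continuing the sketch above, model_deploy.py lowers the MLIR file produced by model_transform.py into a bmodel for a specific chip. The chip name and quantization mode below are placeholders chosen for illustration and should be adjusted to the actual target.

```python
# Minimal sketch: lower the intermediate MLIR file into a bmodel for a
# specific TPU. Chip name and quantization mode are placeholders.
import subprocess

subprocess.run(
    [
        "model_deploy.py",
        "--mlir", "resnet18.mlir",
        "--quantize", "F16",                   # e.g. F32 / F16 / INT8
        "--chip", "bm1684x",                   # target TPU processor
        "--model", "resnet18_f16.bmodel",      # final binary for the TPU runtime
    ],
    check=True,
)
```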
Multi AI Framework Support
TPU-MLIR currently supports the TFLite and ONNX formats, and models in these formats can be converted directly by this project. In addition, thanks to the ONNX ecosystem, models developed in most mainstream deep-learning frameworks (such as PyTorch and Caffe) can first be exported to ONNX and then indirectly converted into bmodel.
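For frameworks that are not ingested directly, the usual route is therefore to export the model to ONNX first and then run it through the toolchain as above. The sketch below uses PyTorch's standard torch.onnx.export; the torchvision network is only an example stand-in for the user's own model.

```python
# Minimal sketch: export a PyTorch model to ONNX so that it can be fed
# to model_transform.py. The torchvision network is only an example.
import torch
import torchvision

model = torchvision.models.resnet18(weights=None).eval()
dummy_input = torch.randn(1, 3, 224, 224)      # example input used for tracing

torch.onnx.export(
    model,
    dummy_input,
    "resnet18.onnx",                           # file later consumed by model_transform.py
    input_names=["input"],
    output_names=["output"],
    opset_version=13,
)
```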
Coexistence of Accuracy and Efficiency
TPU-MLIR supports INT8 symmetric and asymmetric quantization, which greatly improves performance while preserving high model accuracy when combined with the original vendor's Calibration and Tune techniques. In addition, TPU-MLIR applies graph optimization and operator splitting and optimization techniques to ensure that models run efficiently.
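To make the difference between the two INT8 modes concrete, the sketch below computes the scale (and, for the asymmetric case, the zero point) from a calibrated value range using the generic quantization formulas. It illustrates the idea only, not TPU-MLIR's internal implementation, and the example range is made up.

```python
# Minimal sketch of the generic INT8 quantization formulas; this is an
# illustration only, not TPU-MLIR's internal implementation.
import numpy as np

def symmetric_int8(x_min, x_max):
    # Symmetric: zero point fixed at 0; scale taken from the larger magnitude.
    scale = max(abs(x_min), abs(x_max)) / 127.0
    return scale, 0

def asymmetric_int8(x_min, x_max):
    # Asymmetric: use the full [-128, 127] range; the zero point shifts
    # so that [x_min, x_max] maps onto it.
    scale = (x_max - x_min) / 255.0
    zero_point = int(round(-128 - x_min / scale))
    return scale, zero_point

def quantize(x, scale, zero_point):
    return np.clip(np.round(x / scale) + zero_point, -128, 127).astype(np.int8)

# Made-up calibrated activation range for demonstration.
scale, zp = asymmetric_int8(-0.5, 2.0)
print(quantize(np.array([-0.5, 0.0, 1.0, 2.0]), scale, zp))   # -> [-128  -77   25  127]
```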
Rich documentation
A good open-source project needs complete user documentation so that users can get started at minimal cost and participate and contribute together. Good documentation also lets contributors explore the less common modules, maximizing the efficiency of collaboration.