[mlir][Linalg] Break unnecessary dependency through unused `outs` tensor.
LinalgOps whose iterators are all parallel do not use the values in the `outs` tensor; the semantics are that the `outs` tensor is fully overwritten. Using anything other than an `init_tensor` for `outs` can therefore add false dependencies between operations when the operand is only needed for its shape. Add a canonicalization that always uses `init_tensor` in such cases, breaking this false dependence.

Differential Revision: https://reviews.llvm.org/D102561
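As a rough illustration of the rewrite (a minimal sketch, not a test case from the revision; values like `%producer` and `%in` are hypothetical, and op spellings such as `linalg.init_tensor` vary across MLIR versions, with newer MLIR using `tensor.empty`):

```mlir
#map = affine_map<(d0, d1) -> (d0, d1)>

// Before: %producer feeds `outs`, so this all-parallel op appears to depend
// on the values computed by the producer, even though they are never read.
%before = linalg.generic
    {indexing_maps = [#map, #map],
     iterator_types = ["parallel", "parallel"]}
    ins(%in : tensor<4x8xf32>) outs(%producer : tensor<4x8xf32>) {
  ^bb0(%a: f32, %b: f32):
    linalg.yield %a : f32
} -> tensor<4x8xf32>

// After canonicalization: `outs` is an init_tensor of the same shape, so the
// op only carries the shape information and no value dependence on %producer.
%init = linalg.init_tensor [4, 8] : tensor<4x8xf32>
%after = linalg.generic
    {indexing_maps = [#map, #map],
     iterator_types = ["parallel", "parallel"]}
    ins(%in : tensor<4x8xf32>) outs(%init : tensor<4x8xf32>) {
  ^bb0(%a: f32, %b: f32):
    linalg.yield %a : f32
} -> tensor<4x8xf32>
```

For dynamically shaped operands, the `init_tensor` sizes would be derived from the shape of the original `outs` operand, which keeps a shape-only dependence but drops the value dependence.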