Commit e3750caf authored by Nicolas Vasilache

[mlir][Linalg] Add a linalg.reshape op

Summary:
This diff adds a new operation to linalg to allow reshaping of an
existing view into a new view in the same buffer at the same offset.

More specifically:
The `linalg.reshape` op produces a new view whose sizes are a reassociation
of the original `view`. Depending on whether or not the reassociated
MemRefType is contiguous, the resulting memref may require explicit alloc
and copies.

A reassociation is defined as a contiguous grouping of dimensions and is
represented with an affine map array attribute. In the future, non-contiguous
groupings may be allowed (i.e. permutations, reindexings, etc.).

For now, it is assumed that either:
  1. the reassociation produces and consumes contiguous MemRefTypes, or
  2. the reshape op will be folded into its consumers (by changing the shape
     of the computations).
All other cases are undefined behavior and a reshape op may not lower to
LLVM if it cannot be proven statically that it does not require alloc+copy.
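
For instance (an illustrative sketch with assumed static shapes, not one of
the tests added in this diff), case 1 holds when both the source and the
result use the default contiguous layout:

```mlir
   // Collapse of a statically-shaped, contiguous buffer: the result
   // memref<20x6xf32> is also contiguous, so no alloc or copy is required.
   %1 = linalg.reshape %0 [(i, j, k) -> (i, j),
                           (i, j, k) -> (k)] :
     memref<4x5x6xf32> into memref<20x6xf32>
```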

A reshape may either collapse or expand dimensions, depending on the
relationship between source and target memref ranks. The verification rule
is that the reassociation maps are applied to the memref with the larger
rank to obtain the memref with the smaller rank. In the case of a dimension
expansion, the reassociation maps can be interpreted as inverse maps.

Examples:

```mlir
   // Dimension collapse (i, j) -> i' and k -> k'
   %1 = linalg.reshape %0 [(i, j, k) -> (i, j),
                           (i, j, k) -> (k)] :
     memref<?x?x?xf32, stride_spec> into memref<?x?xf32, stride_spec_2>
```

```mlir
   // Dimension expansion i -> (i', j') and k -> k'
   %1 = linalg.reshape %0 [(i, j, k) -> (i, j),
                           (i, j, k) -> (k)] :
     memref<?x?xf32, stride_spec> into memref<?x?x?xf32, stride_spec_2>
```
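
To make the verification rule above concrete, here is a hypothetical
statically-shaped variant of the expansion (shapes chosen for illustration
only): the reassociation maps are applied to the larger-rank (3-D) type,
collapsing 4x5 back to 20 and keeping 6, which must match the 2-D source type.

```mlir
   // Hypothetical static expansion: applying the maps to the 3-D result
   // type yields 20x6, which matches the 2-D source type.
   %1 = linalg.reshape %0 [(i, j, k) -> (i, j),
                           (i, j, k) -> (k)] :
     memref<20x6xf32> into memref<4x5x6xf32>
```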

The relevant invalid and roundtripping tests are added.

Reviewers: AlexEichenberger, ftynse, rriddle, asaadaldien, yangjunpro

Subscribers: kiszk, merge_guards_bot, mehdi_amini, jpienaar, burmako, shauheen, antiagainst, arpith-jacob, mgester, lucyrfox, llvm-commits

Tags: #llvm

Differential Revision: https://reviews.llvm.org/D72168
parent d877229b