Commit 3b7ec85a authored by Jean Perier

[flang] Use unix logical representation for fir.logical

The front-end and the runtime already use the unix logical
representation, but lowering did not. This inconsistency could
cause issues.

The only place in lowering that defines the logical representation is
the translation from FIR to LLVM (FIR itself is agnostic to the
actual representation). More precisely, the LLVM lowering of
`fir.convert` between `i1` and `fir.logical` is what defines the
representation:
- `fir.convert` from `i1` to `fir.logical` defines the `.true.` and `.false.`
canonical representations
- `fir.convert` from `fir.logical` to `i1` decides what the test for
truth is.

The unix representation (see the sketch after this list) is:
- .true. canonical integer representation is 1
- .false. canonical integer representation is 0
- the test for truth is "integer representation != 0"
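
As a minimal illustration (not code from this change), the semantics that the
two `fir.convert` lowerings encode under the unix convention can be modelled
in C; the type and helper names below are hypothetical:

```c
#include <assert.h>
#include <stdint.h>

/* Stand-in for the storage of a !fir.logical<4> value. */
typedef int32_t logical4_t;

/* Models fir.convert i1 -> !fir.logical<4>: produce the canonical value. */
static logical4_t logical_from_i1(_Bool b) {
  return b ? 1 : 0; /* .true. is 1, .false. is 0 */
}

/* Models fir.convert !fir.logical<4> -> i1: the test for truth. */
static _Bool i1_from_logical(logical4_t l) {
  return l != 0; /* any nonzero representation is true */
}

int main(void) {
  assert(logical_from_i1(1) == 1);   /* canonical .true. */
  assert(logical_from_i1(0) == 0);   /* canonical .false. */
  assert(i1_from_logical(1));
  assert(!i1_from_logical(0));
  assert(i1_from_logical(-1));       /* non-canonical nonzero still tests true */
  return 0;
}
```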

For the record, the representation previously used in codegen
(contrasted in the example after this list) was:
- .true. canonical integer representation is -1 (all bits 1)
- .false. canonical integer representation is 0
- the test for truth is "integer representation lowest bit == 1"
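
For illustration only (the sample value is not taken from the commit), the old
and new truth tests disagree on non-canonical bit patterns, which is the kind
of inconsistency mentioned above:

```c
#include <stdio.h>

/* The two truth tests described above, as plain C predicates. */
static int unix_is_true(int l) { return l != 0; }        /* new: representation != 0  */
static int old_is_true(int l)  { return (l & 1) == 1; }  /* old: lowest bit must be 1 */

int main(void) {
  /* A non-canonical pattern such as 2 is true under the unix test
     but false under the old lowest-bit test. */
  int l = 2;
  printf("value %d: unix test -> %d, old test -> %d\n",
         l, unix_is_true(l), old_is_true(l));
  return 0;
}
```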

Differential Revision: https://reviews.llvm.org/D121200
parent ba8ee4a4