    Apply different tokenization rules to linker script expressions. · 731a66ae
    Rui Ueyama authored
    The linker script lexer is context-sensitive. In the regular context,
    arithmetic operator characters are regular characters, but in the
    expression context, they are independent tokens. This affects how the
    lexer tokenizes "3*4", for example. (This kind of expression is real;
    the Linux kernel uses it.)
    
    This patch defines a function, `maybeSplitExpr`, which splits the
    current token into multiple expression tokens if the lexer is in the
    expression context.
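
    For illustration, here is a minimal standalone sketch of the splitting
    idea. It is not the LLD implementation; the name splitExprToken and the
    operator set below are assumptions, and a full version would also need
    to handle multi-character operators such as "<<". In the expression
    context, a token such as "3*4" is broken apart at operator characters
    into separate tokens.

      #include <iostream>
      #include <string>
      #include <vector>

      // Assumed helper, not LLD's ScriptLexer: split one token at
      // single-character operators, as the expression context requires.
      static std::vector<std::string> splitExprToken(const std::string &tok) {
        const std::string ops = "!%&*+-/:<=>?^|~";
        std::vector<std::string> ret;
        std::string cur;
        for (char c : tok) {
          if (ops.find(c) != std::string::npos) {
            if (!cur.empty())
              ret.push_back(cur);             // flush the operand seen so far
            ret.push_back(std::string(1, c)); // the operator is its own token
            cur.clear();
          } else {
            cur += c;
          }
        }
        if (!cur.empty())
          ret.push_back(cur);
        return ret;
      }

      int main() {
        // In the regular context "3*4" would stay a single token; in the
        // expression context it becomes "3", "*", "4".
        for (const std::string &t : splitExprToken("3*4"))
          std::cout << t << "\n";
      }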
    
    Differential Revision: https://reviews.llvm.org/D29963
    
    llvm-svn: 295225