FlexAttention: https://x.com/cHHillee/status/1821253769147118004
Also saw it mentioned via the PyTorch blog post (https://pytorch.org/blog/flexattention/), which is cited by the transformer building blocks tutorial: https://pytorch.org/tutorials/intermediate/transformer_building_blocks.html
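For reference, a minimal sketch of the API as the blog post describes it, assuming PyTorch >= 2.5 where torch.nn.attention.flex_attention is available: flex_attention takes a score_mod callback that edits pre-softmax attention scores, and create_block_mask turns a boolean mask_mod into a block-sparse mask. The rel_bias and causal function names below are my own illustrative choices, not from the links.

```python
import torch
from torch.nn.attention.flex_attention import flex_attention, create_block_mask

device = "cuda" if torch.cuda.is_available() else "cpu"
B, H, S, D = 2, 4, 128, 64  # batch, heads, sequence length, head dim
q, k, v = (torch.randn(B, H, S, D, device=device) for _ in range(3))

# score_mod: called per (batch, head, q_idx, kv_idx) to edit the
# pre-softmax attention score. Here, a toy relative-position bias.
def rel_bias(score, b, h, q_idx, kv_idx):
    return score + 0.01 * (q_idx - kv_idx)

out = flex_attention(q, k, v, score_mod=rel_bias)

# mask_mod: returns True where attention is allowed. create_block_mask
# precomputes block-level sparsity so fully-masked blocks are skipped.
def causal(b, h, q_idx, kv_idx):
    return q_idx >= kv_idx

block_mask = create_block_mask(causal, B=None, H=None, Q_LEN=S, KV_LEN=S, device=device)
out_causal = flex_attention(q, k, v, block_mask=block_mask)

# For fused-kernel performance the blog wraps it in torch.compile:
# flex_attention_compiled = torch.compile(flex_attention)
```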