Bitmask

Super interesting topic that I was introduced to while working on the Poker AI.

Remove the bits you don’t care about.

Masking bits to 0

Just use AND with a mask that has 1s in the positions you want to keep; every other bit gets zeroed out. Ex: a bitmask of 1000 only cares about the fourth bit, so ANDing a value with it clears everything else.
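
A quick sketch in Python (the values are made up just to show the masking):

value = 0b1011            # some flags
mask = 0b1000             # only care about the fourth bit
print(bin(value & mask))  # 0b1000 -> the fourth bit survives, the rest are zeroed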

Use XOR to toggle a bit: 1s in the mask flip the corresponding bits, 0s leave them untouched.
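
A quick sketch of toggling, again with made-up values:

flags = 0b0101
flags ^= 0b0100    # toggle the third bit: 1 -> 0
flags ^= 0b0010    # toggle the second bit: 0 -> 1
print(bin(flags))  # 0b11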

Deep Learning

When you do matrix multiplication with a one-hot vector, you can essentially think of it as a bitmask: the single 1 picks out one row of the matrix. Encountered this idea through Andrej Karpathy.

import torch
import torch.nn.functional as F

C = torch.randn((27, 2))  # embedding table: 27 rows of 2 values
F.one_hot(torch.tensor(5), num_classes=27).float() @ C  # This gives C[5]
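
A quick check that the two are the same (using the C defined above):

row = F.one_hot(torch.tensor(5), num_classes=27).float() @ C
print(torch.allclose(row, C[5]))  # True: the one-hot "mask" selected row 5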