On the Role of Attention Masks and LayerNorm in Transformers