In-Context Learning with Transformers: Softmax Attention Adapts to Function Lipschitzness