A step-by-step walkthrough of scaled dot-product attention, the core mechanism inside every transformer.
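Before diving into the walkthrough, the whole mechanism fits in a few lines of NumPy. This is a minimal sketch of a single attention head, assuming unbatched inputs and no masking; the function name and shapes are illustrative, not from any particular library:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for one head, no mask."""
    d_k = Q.shape[-1]
    # Similarity of each query to each key, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable softmax over the key axis
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted average of the value rows
    return weights @ V, weights

# Tiny example: 2 queries attending over 3 key/value pairs of dimension 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)        # (2, 4): one output vector per query
print(w.sum(axis=-1))   # each row of attention weights sums to 1
```

The `1/sqrt(d_k)` scaling keeps the dot products from growing with the key dimension, which would otherwise push the softmax into a near-one-hot regime with vanishing gradients.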