Tags → #ml
-
Inside an Attention Head
A step-by-step walkthrough of scaled dot-product attention, the core mechanism inside every transformer.
-
Whitman Bot
In which we train an RNN to mimic Walt Whitman's prose style (somewhat).