197. Positional Encoding

In transformer models, a method of injecting information about the position of tokens in a sequence. Because self-attention is permutation-invariant and has no inherent notion of token order, positional encodings are added to (or combined with) the token embeddings so the model can capture order dependencies.
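
As an illustration, here is a minimal sketch of the fixed sinusoidal scheme from "Attention Is All You Need" (Vaswani et al., 2017), in which each position is encoded by sines and cosines at geometrically spaced frequencies. The function name and NumPy implementation are illustrative, not a specific library's API:

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Return a (seq_len, d_model) matrix of sinusoidal position encodings.

    PE[pos, 2i]   = sin(pos / 10000**(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000**(2i / d_model))
    """
    assert d_model % 2 == 0, "this sketch assumes an even model dimension"
    positions = np.arange(seq_len)[:, np.newaxis]    # shape (seq_len, 1)
    dims = np.arange(0, d_model, 2)[np.newaxis, :]   # shape (1, d_model / 2)
    angles = positions / np.power(10000.0, dims / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)  # even dimensions get sine
    pe[:, 1::2] = np.cos(angles)  # odd dimensions get cosine
    return pe

# The encodings are typically summed with the token embeddings
# before the first transformer layer:
# x = token_embeddings + sinusoidal_positional_encoding(seq_len, d_model)
```

The sinusoidal design was chosen so that the encoding of any fixed offset is a linear function of the encoding at each position, which the paper hypothesized would let the model learn to attend by relative position.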
