183. Bidirectional Encoder Representations from Transformers (BERT)

A pre-trained transformer encoder developed by Google that attends to context on both the left and right of each token simultaneously, rather than reading text in a single direction, allowing it to capture a word's meaning from its full surrounding context.
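A minimal sketch of using a pre-trained BERT checkpoint to obtain contextual token embeddings, assuming the Hugging Face `transformers` library and PyTorch are installed; `bert-base-uncased` is one of the publicly released checkpoints.

```python
# Obtain contextual token embeddings from a pre-trained BERT model.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# BERT's self-attention lets every token attend to context on both
# its left and right in a single forward pass.
inputs = tokenizer("The bank raised interest rates.", return_tensors="pt")
outputs = model(**inputs)

# One contextual vector per token (shape: [1, num_tokens, 768] for bert-base).
print(outputs.last_hidden_state.shape)
```

Because the embeddings are context-dependent, the vector for "bank" here would differ from the one produced for "bank" in a sentence about a river.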