Contextual embedding is a technique in natural language processing (NLP) that assigns each word a representation based on its surrounding context, so the same word can receive different vectors in different sentences. Such representations capture how a word's use varies across contexts and can encode knowledge that transfers across languages.
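For example, a contextual model such as BERT produces one vector for "bank" in a financial sentence and a different vector for "bank" in a sentence about a river. The Python sketch below illustrates this under the assumption that the Hugging Face transformers library and the bert-base-uncased checkpoint are available; the specific sentences and helper function are illustrative choices, not part of the original text.

```python
# Minimal sketch: contextual embeddings give the same word different vectors
# in different contexts (assumes the Hugging Face transformers library).
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

sentences = [
    "She deposited the check at the bank.",   # financial sense of "bank"
    "They had a picnic on the river bank.",   # riverside sense of "bank"
]

def bank_vector(sentence):
    """Return the contextual embedding of the token 'bank' in the sentence."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        # last_hidden_state: (batch, seq_len, hidden_dim); take the first example
        hidden = model(**inputs).last_hidden_state[0]
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index("bank")]

v1, v2 = bank_vector(sentences[0]), bank_vector(sentences[1])
similarity = torch.nn.functional.cosine_similarity(v1, v2, dim=0)
print(f"Cosine similarity between the two 'bank' vectors: {similarity.item():.3f}")
```

In contrast to a static word embedding, which would give "bank" a single shared vector, the two vectors printed here differ because each reflects its own sentence context.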