

Learn some of the key concepts involved in language generation and understanding.

  • Tokens are words or parts of words that our models take as input or produce as output.
  • Embeddings are lists of numbers that represent a word, token, or an entire piece of text. Embeddings capture information about the meaning and context of the words or sentences they represent (see the embedding-similarity sketch after this list).
  • Temperature is a value that controls the outputs of a generation model by tuning the degree of randomness involved in picking output tokens (see the temperature-sampling sketch after this list).
  • Likelihood is a measure of how “expected” each token is in a piece of text.
  • Model evaluation: likelihood scores make it possible to evaluate and compare generative models (see the likelihood-scoring sketch after this list).
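
To make the embedding idea concrete, here is a minimal sketch, not tied to any particular model or API, that compares toy embedding vectors with cosine similarity. The vector values and the `cosine_similarity` helper are invented for illustration.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors:
    close to 1.0 for similar meanings, near 0.0 for unrelated ones."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-dimensional embeddings; real model embeddings have hundreds or
# thousands of dimensions, and these numbers are made up for illustration.
cat     = np.array([0.8, 0.1, 0.3, 0.0])
kitten  = np.array([0.7, 0.2, 0.4, 0.1])
invoice = np.array([0.0, 0.9, 0.1, 0.8])

print(cosine_similarity(cat, kitten))   # high: related meanings
print(cosine_similarity(cat, invoice))  # low: unrelated meanings
```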
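
The temperature sketch below assumes the standard softmax-with-temperature formulation: logits are divided by the temperature before being turned into probabilities, so low temperatures concentrate probability on the most likely token and high temperatures spread it out. The logits array is hypothetical.

```python
import numpy as np

def sample_with_temperature(logits: np.ndarray, temperature: float, rng=None) -> int:
    """Scale logits by temperature, apply softmax, and sample one token id."""
    rng = rng or np.random.default_rng()
    scaled = logits / temperature
    probs = np.exp(scaled - scaled.max())  # shift for numerical stability
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))

# Hypothetical logits for a 4-token vocabulary.
logits = np.array([2.0, 1.0, 0.5, -1.0])
print(sample_with_temperature(logits, temperature=0.2))  # almost always token 0
print(sample_with_temperature(logits, temperature=1.5))  # noticeably more varied
```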
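
Finally, a likelihood-scoring sketch: if a model reports the probability it assigned to each token (given the preceding tokens), summing the log-probabilities gives the log-likelihood of the whole text, and perplexity is a common derived metric for evaluating and comparing models. The per-token probabilities below are invented for illustration.

```python
import numpy as np

def score_text(token_probs):
    """Return (log-likelihood, perplexity) for a sequence of per-token
    probabilities. Higher log-likelihood / lower perplexity means the
    model found the text more 'expected'."""
    log_probs = np.log(np.asarray(token_probs))
    return float(log_probs.sum()), float(np.exp(-log_probs.mean()))

# Invented per-token probabilities for two candidate sentences.
fluent    = [0.45, 0.30, 0.25, 0.60, 0.70, 0.50]   # each token fairly expected
disfluent = [0.45, 0.01, 0.02, 0.05, 0.70, 0.003]  # several surprising tokens

print(score_text(fluent))     # higher likelihood, lower perplexity
print(score_text(disfluent))  # lower likelihood, higher perplexity
```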