Detokenize

This endpoint takes a list of tokens produced by byte-pair encoding and returns their text representation. To learn more about tokenization and byte-pair encoding, see the tokens page.

Usage#
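
    Sample Request#

    The sketch below shows one way this endpoint might be called over HTTP from Python using the requests library. The endpoint URL, API key, and example token ids are placeholders rather than values taken from this page; only the tokens request field, the text response field, and the 65536-token limit come from the reference below.

    import requests

    # Placeholder URL and key -- substitute the values for your deployment.
    API_URL = "https://api.example.com/detokenize"
    API_KEY = "YOUR_API_KEY"

    def detokenize(tokens):
        """Send a list of token ids and return their text representation."""
        if len(tokens) > 65536:
            raise ValueError("tokens exceeds the documented 65536-token maximum")
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            json={"tokens": tokens},
        )
        response.raise_for_status()
        return response.json()["text"]

    # The token ids here are illustrative; real ids come from a tokenize call.
    print(detokenize([10002, 2261, 2012, 8, 2792, 43]))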

    Sample Response#

    {
      "text": "detokenized! :D"
    }

    Request#

    tokens#

    list of ints

    The list of tokens to be detokenized. The maximum length is 65536 tokens.

    Response#

    text#

    string

    The text representation of the provided tokens.