Supported LLMs

Valence supports a number of capable and versatile LLMs out of the box. You can use any of these models with the `runLlm` function defined in the Inference API.
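As a rough illustration of the call pattern, here is a hypothetical sketch: the exact `runLlm` signature is defined in the Inference API docs, and the request shape, model name, and stub implementation below are assumptions, not the real API.

```typescript
// Assumed request shape for illustration only; consult the Inference API
// docs for the actual runLlm signature.
type LlmRequest = { model: string; prompt: string };

// Stub standing in for the real Inference API's runLlm, so the sketch runs.
async function runLlm(req: LlmRequest): Promise<string> {
  return `[${req.model}] response to: ${req.prompt}`;
}

// Example usage: pick any supported model name and pass a prompt.
runLlm({ model: "example-supported-model", prompt: "Hello" }).then((reply) => {
  console.log(reply);
});
```

In practice you would swap the stub for the real `runLlm` import and replace `"example-supported-model"` with one of the model identifiers from the list below.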

You can find the list below:
