Supported LLMs

Valence supports a number of capable and versatile LLMs out of the box. You can use any of these models with the `runLlm` function defined in the Inference API.

You can find the list below:

| Model | Model ID |
| --- | --- |
| Llama-3-8B-Instruct | `meta-llama/Meta-Llama-3-8B-Instruct` |
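As a sketch of how a model ID is used, the snippet below builds a request payload carrying one of the supported model IDs. The `runLlm` signature is defined in the Inference API and may differ; the `LlmRequest` shape and `buildLlmRequest` helper here are illustrative assumptions, not part of Valence.

```typescript
// Hypothetical sketch: the real call signature lives in the Inference API docs.
// This only shows where a supported model ID from the table above would go.
interface LlmRequest {
  model: string;  // a model ID from the supported list, e.g. Llama-3-8B-Instruct's
  prompt: string; // the user prompt to send to the model
}

function buildLlmRequest(model: string, prompt: string): LlmRequest {
  return { model, prompt };
}

const req = buildLlmRequest(
  "meta-llama/Meta-Llama-3-8B-Instruct",
  "Summarize this document in one sentence.",
);
console.log(req.model); // prints "meta-llama/Meta-Llama-3-8B-Instruct"
```

You would pass a payload like this (or its fields) to `runLlm` as described in the Inference API documentation.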
