# Supported LLMs

Valence supports a number of capable and versatile LLMs out of the box. You can use any of these models with the `runLlm` function defined in [inference-api](https://docs.vannalabs.ai/build/building-dapps/inference-api "mention").

You can find the list below:

<table><thead><tr><th width="201">Model</th><th width="411">Model ID</th></tr></thead><tbody><tr><td>Llama-3-8B-Instruct</td><td><code>meta-llama/Meta-Llama-3-8B-Instruct</code></td></tr></tbody></table>
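When calling a model, pass its Model ID string exactly as it appears in the table. The sketch below is illustrative only: it assembles a hypothetical request payload around a model ID, since the actual `runLlm` signature and call mechanics are defined in the inference-api page linked above.

```python
# Hypothetical sketch: assembling a payload for an LLM call.
# `build_llm_request` is an illustrative helper, NOT part of the
# Valence API; see the inference-api docs for the real `runLlm`.

def build_llm_request(model_id: str, prompt: str) -> dict:
    """Pair a Model ID from the table above with a prompt (assumed shape)."""
    return {"model": model_id, "prompt": prompt}

request = build_llm_request(
    "meta-llama/Meta-Llama-3-8B-Instruct",
    "Summarize this transaction in one sentence.",
)
print(request["model"])  # → meta-llama/Meta-Llama-3-8B-Instruct
```

The key point is that the Model ID is an opaque string identifier: copy it verbatim, including the organization prefix (e.g. `meta-llama/`).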
