Language Models

Note

This feature was added in version 2.15.0.

It’s now possible to make the full range of supported Large Language Models (LLMs) available on CoCalc Cloud. You can configure this like all other settings in the “Admin panel” → “Site settings”. Click on “AI-LLM” to filter the relevant configuration values.

On top of that, you can limit the choices available to your users by only including selected models in the list called “User Selectable LLMs”.

In particular, running your own Ollama server and registering it with CoCalc might be an interesting option.
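As a sketch, registering a self-hosted Ollama server is done through the corresponding site setting, which takes a JSON value. The key and field names below (`baseUrl`, `model`, `display`) are assumptions for illustration; consult the setting’s description in your Admin panel for the authoritative schema:

```json
{
  "myollama": {
    "baseUrl": "http://ollama.internal:11434",
    "model": "llama3",
    "display": "Llama 3 (self-hosted)"
  }
}
```

The server must be reachable from within the cluster under the given `baseUrl`.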

Note

Just like with all the other admin settings, you can also control their values via your my-values.yaml file, in the globals.settings dict.
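A minimal sketch of such an entry in my-values.yaml could look as follows. The setting keys shown here are hypothetical examples; the authoritative key names are the ones displayed in the Admin panel’s site settings:

```yaml
globals:
  settings:
    # Hypothetical keys -- verify the exact names in "Admin panel" → "Site settings"
    user_selectable_llms: "gpt-4o,claude-3-5-sonnet"
```

Values set this way are applied on deployment and override what was configured interactively in the Admin panel.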