This app can work with multiple LLM models at once. It supports Ollama, which lets you run models locally and for free. Additional setup is required:

- For Ollama, follow the instructions on its website to download and set up your LLM server.

NOTE: This app cannot run without Ollama installed.
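A typical Ollama setup might look like the following sketch. This assumes a Linux or macOS machine; the model name `llama3` is only an example, so substitute whichever model this app expects:

```shell
# Install Ollama (Linux/macOS; see https://ollama.com for other platforms)
curl -fsSL https://ollama.com/install.sh | sh

# Start the Ollama server (listens on localhost:11434 by default)
ollama serve &

# Download a model to run locally (llama3 is an example model name)
ollama pull llama3

# Sanity check: run a one-off prompt from the CLI
ollama run llama3 "Say hello"
```

Once the server is running, the app should be able to reach it at its default address, `http://localhost:11434`.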