
About this app

Ollama + Open WebUI

Description

Ollama makes it easy to get up and running with large language models locally.
Open WebUI is an extensible, feature-rich, and user-friendly self-hosted AI platform designed to operate entirely offline. It supports various LLM runners such as Ollama and OpenAI-compatible APIs, with a built-in inference engine for RAG, making it a powerful AI deployment solution.
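As a quick illustration, here is a minimal Python sketch that sends a prompt to the Ollama instance bundled with this app over its native REST API. It uses only the standard library; 11434 is Ollama's default port, while the hostname nas.local is an assumption you should replace with your NAS address.

# Minimal sketch: ask the bundled Ollama instance for a single completion.
# Assumption: the NAS is reachable as "nas.local"; 11434 is Ollama's default port.
import json
import urllib.request

OLLAMA_URL = "http://nas.local:11434"  # assumption: replace with your NAS address

payload = {
    "model": "gemma3:4b",              # default model pulled by this app
    "prompt": "Say hello in one sentence.",
    "stream": False,                   # request one JSON reply instead of a stream
}

req = urllib.request.Request(
    f"{OLLAMA_URL}/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.loads(resp.read().decode("utf-8"))

print(body["response"])                # the model's reply text

Open WebUI itself is reached through a regular browser; the snippet above only shows that the underlying Ollama API is also available to your own scripts.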


What's new in this version?

Version 0.9.4.r01:

- Initial release for Ollama Docker version 0.9.4 with Open WebUI.
- The app pulls and installs gemma3:4b as the default AI model, so installation takes longer than usual. If installation fails due to a timeout, please install the app again; you can check afterwards whether the model was pulled, as shown in the sketch below.
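The following minimal sketch lists the models known to the Ollama instance and checks whether the default gemma3:4b model finished pulling, for example after an installation timeout. As above, nas.local is an assumed hostname and 11434 is Ollama's default port.

# Minimal sketch: verify that the default gemma3:4b model was pulled.
# Assumption: the NAS is reachable as "nas.local"; 11434 is Ollama's default port.
import json
import urllib.request

OLLAMA_URL = "http://nas.local:11434"  # assumption: replace with your NAS address

with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
    models = json.loads(resp.read().decode("utf-8")).get("models", [])

names = [m["name"] for m in models]
print("Installed models:", names)
print("gemma3:4b present:", any(n.startswith("gemma3:4b") for n in names))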