
App Central


About This App

Ollama + Open WebUI ASUSTOR NAS App


Description

Ollama makes it easy to get up and running with large language models locally.
Open WebUI is an extensible, feature-rich, and user-friendly self-hosted AI platform designed to operate entirely offline. It supports various LLM runners like Ollama and OpenAI-compatible APIs, with built-in inference engine for RAG, making it a powerful AI deployment solution.

Note:
1. The app pulls and installs gemma3:4b as the default AI model, so installation takes additional time. If installation fails with a timeout, please install the app again.
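Since Ollama runs as a Docker container on the NAS, additional models can be pulled with the standard Ollama CLI inside that container. A minimal sketch, run over SSH on the NAS; the container name "ollama" and the model tag "llama3.2" are assumptions for illustration, not taken from the app's documentation (check the actual name with `docker ps`):

```shell
# Assumptions: container name "ollama", example model tag "llama3.2".
MODEL="llama3.2"

# Shown as echoed commands; run them directly on the NAS once the
# container name is confirmed with `docker ps`:
echo "docker exec ollama ollama pull $MODEL"   # download an additional model
echo "docker exec ollama ollama list"          # list installed models
```

Models pulled this way become selectable in the Open WebUI model dropdown.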


What's New

Version 0.9.4.r01:

- Initial release for Ollama Docker version 0.9.4 with Open WebUI.