
App Central


About This App

Ollama + Open WebUI asustor NAS App


Description

Ollama makes it easy to get up and running with large language models locally.
Open WebUI is an extensible, feature-rich, and user-friendly self-hosted AI platform designed to operate entirely offline. It supports various LLM runners such as Ollama and OpenAI-compatible APIs, with a built-in inference engine for RAG, making it a powerful AI deployment solution.
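Because Ollama exposes an OpenAI-compatible HTTP API, existing OpenAI clients can talk to models running on the NAS. The sketch below is a minimal example, assuming Ollama is reachable on its default port 11434 and the gemma3:4b model is already installed; adjust the hostname to your NAS address.

```shell
# Chat with a local model through Ollama's OpenAI-compatible endpoint.
# "localhost" and the model name are assumptions; substitute your NAS IP
# and whichever model you have pulled.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemma3:4b",
    "messages": [
      {"role": "user", "content": "Say hello in one sentence."}
    ]
  }'
```

The same endpoint is what Open WebUI (or any OpenAI SDK pointed at this base URL) uses under the hood, which is why no separate API gateway is needed.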


What's New in This Version

Version 0.9.4.r01:

- Initial release for Ollama Docker version 0.9.4 with Open WebUI.
- The app pulls and installs gemma3:4b as the default AI model, so installation takes longer than usual. If installation fails due to a timeout, please install the app again.
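If the automatic model download times out, the default model can also be fetched manually. This is a sketch assuming you have shell access to the environment where the Ollama service runs (for example via SSH into the NAS or the app's container) and that the `ollama` CLI is on the path.

```shell
# Pull the default model manually; resumes partial downloads if retried.
ollama pull gemma3:4b

# Confirm the model is installed and usable.
ollama list
ollama run gemma3:4b "Reply with one short sentence."
```

`ollama pull` can be re-run safely after an interrupted download, which makes it a practical fallback when the app-level install hits a timeout.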