Installation
Note
The project runs on the GPU by default. To run on the CPU, use
`docker-compose.cpu.yml` instead.
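For example, the CPU-only stack can be built and started by pointing Docker Compose's `-f` flag at the alternate file:

```sh
# Build and start using the CPU-only compose file
docker-compose -f docker-compose.cpu.yml build
docker-compose -f docker-compose.cpu.yml up -d
```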
Clone this repository and navigate to the project folder:
```sh
git clone https://github.com/NotYuSheng/Multimodal-Large-Language-Model.git
cd Multimodal-Large-Language-Model
```
Build the Docker images:
```sh
docker-compose build
```
Start the containers in detached mode:

```sh
docker-compose up -d
```
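To confirm the services came up, you can check their status and follow the logs with standard Docker Compose commands (not specific to this project):

```sh
# List the services and their state
docker-compose ps

# Follow the logs of all services (Ctrl+C to stop following)
docker-compose logs -f
```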
Access the Streamlit web page from a browser on the host:

```
http://<host-ip>:8501
```
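As a quick command-line sanity check, recent Streamlit versions expose a health endpoint (the path below is a Streamlit detail, assumed rather than documented by this project):

```sh
# Should print "ok" if the Streamlit server is up
curl http://<host-ip>:8501/_stcore/health
```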
API calls to the Ollama server can be made at:

```
http://<host-ip>:11434
```
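For example, a minimal generation request against Ollama's REST API might look like the following; the `llava` model name is only a placeholder, so substitute whichever model the server has pulled:

```sh
# Example generate call; "llava" is a hypothetical model name
curl http://<host-ip>:11434/api/generate -d '{
  "model": "llava",
  "prompt": "Describe what you can do."
}'
```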