Installation

Note

The project runs on the GPU by default. To run on the CPU, use docker-compose.cpu.yml instead.
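
For example, to build and start the CPU variant, pass the alternate file to Compose with the standard -f flag (a minimal sketch; docker-compose.cpu.yml is the file named above):

    # select the CPU compose file instead of the default docker-compose.yml
    docker-compose -f docker-compose.cpu.yml up -d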

  1. Clone this repository and navigate to the project folder:

    git clone https://github.com/NotYuSheng/Multimodal-Large-Language-Model.git
    cd Multimodal-Large-Language-Model
    
  2. Build the Docker images:

    docker-compose build
    
  3. Start the containers (a quick status check is sketched after this list):

    docker-compose up -d
    
  4. Access the Streamlit web app from a browser on the host:

    <host-ip>:8501
    

    API calls to the Ollama server can be made at:

    <host-ip>:11434
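
After starting the stack (step 3), you can confirm that both containers are up with the standard Compose status command:

    docker-compose ps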
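
You can then test the Ollama endpoint with a curl call to its generate API. This is a sketch: the model name llava is an assumption and must already be pulled on the server.

    # "llava" is assumed here; substitute any model already pulled on the server
    curl http://<host-ip>:11434/api/generate -d '{
      "model": "llava",
      "prompt": "Why is the sky blue?",
      "stream": false
    }'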