MattGPT

I run a local GPT platform using Open WebUI's bundled-Ollama image.

Here's my Docker Compose configuration.

docker-compose.yaml
services:
  ollama:
    image: ghcr.io/open-webui/open-webui:ollama
    container_name: mattgpt
    volumes:
      - ollama:/root/.ollama
      - open-webui:/app/backend/data
    ports:
      - 8080:8080
    restart: always
    network_mode: bridge
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 2
              capabilities: [gpu]

volumes:
  ollama:
    driver: local # bind-mount ./ollama next to this compose file
    driver_opts:
      type: none
      device: ./ollama
      o: bind

  open-webui:
    driver: local # bind-mount ./open-webui next to this compose file
    driver_opts:
      type: none
      device: ./open-webui
      o: bind
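Since the volumes are bind mounts, the host directories have to exist before the stack starts, and the GPU reservation assumes the NVIDIA Container Toolkit is installed on the host. A rough sketch of bringing it up (the model name `llama3` is just an example, not something the config above mandates):

```shell
# Create the bind-mount targets referenced by the named volumes
mkdir -p ollama open-webui

# Start the stack in the background (uses the compose file in this directory)
docker compose up -d

# Pull a model inside the running container; "mattgpt" is the
# container_name set in the compose file, llama3 is an example model
docker exec mattgpt ollama pull llama3
```

After that, the Open WebUI interface should be reachable at http://localhost:8080.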

Sample screenshot.