2026/1/10

web ui for vlm : live-vlm-webui

live-vlm-webui is a web UI for VLMs (vision language models).
Install it with pip:
pip install live-vlm-webui
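The startup log below shows it running out of a virtualenv; if you want that layout, a minimal sketch (the paths here are just an example, not from the original post):

$ python3 -m venv venv
$ source venv/bin/activate
$ pip install live-vlm-webui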
Then start it:
$ live-vlm-webui
2026-01-10 22:02:23,010 - live_vlm_webui.server - INFO - No model/API specified, auto-detecting local services...
2026-01-10 22:02:23,016 - live_vlm_webui.server - INFO - ✅ Auto-detected Ollama at http://localhost:11434/v1
2026-01-10 22:02:23,016 - live_vlm_webui.server - INFO -    Selected model: llama3.2-vision:latest
2026-01-10 22:02:23,047 - live_vlm_webui.server - INFO - Initialized VLM service:
2026-01-10 22:02:23,047 - live_vlm_webui.server - INFO -   Model: llama3.2-vision:latest
2026-01-10 22:02:23,047 - live_vlm_webui.server - INFO -   API: http://localhost:11434/v1 (Local)
2026-01-10 22:02:23,047 - live_vlm_webui.server - INFO -   Prompt: Describe what you see in this image in one sentence.
2026-01-10 22:02:23,047 - live_vlm_webui.server - INFO - Serving static files from: /home/charles-chang/livevlmwebui/venv/lib/python3.12/site-packages/live_vlm_webui/static/images
2026-01-10 22:02:23,047 - live_vlm_webui.server - INFO - Serving favicon files from: /home/charles-chang/livevlmwebui/venv/lib/python3.12/site-packages/live_vlm_webui/static/favicon
2026-01-10 22:02:23,048 - live_vlm_webui.server - INFO - SSL enabled - using HTTPS
2026-01-10 22:02:23,048 - live_vlm_webui.server - INFO - Starting server on 0.0.0.0:8090
2026-01-10 22:02:23,048 - live_vlm_webui.server - INFO - 
2026-01-10 22:02:23,048 - live_vlm_webui.server - INFO - ======================================================================
2026-01-10 22:02:23,048 - live_vlm_webui.server - INFO - Access the server at:
2026-01-10 22:02:23,048 - live_vlm_webui.server - INFO -   Local:   https://localhost:8090
2026-01-10 22:02:23,049 - live_vlm_webui.server - INFO -   Network: https://192.168.145.77:8090
2026-01-10 22:02:23,049 - live_vlm_webui.server - INFO -   Network: https://172.20.0.1:8090
...
2026-01-10 22:02:23,050 - live_vlm_webui.server - INFO -   Network: https://192.168.94.37:8090
2026-01-10 22:02:23,050 - live_vlm_webui.server - INFO - ======================================================================
2026-01-10 22:02:23,050 - live_vlm_webui.server - INFO - 
2026-01-10 22:02:23,050 - live_vlm_webui.server - INFO - Press Ctrl+C to stop
2026-01-10 22:02:23,069 - live_vlm_webui.gpu_monitor - INFO - Auto-detected NVIDIA GPU (NVML available)
2026-01-10 22:02:23,069 - live_vlm_webui.gpu_monitor - INFO - Detected system: ASUS EX-B760M-V5
2026-01-10 22:02:23,076 - live_vlm_webui.gpu_monitor - INFO - NVML initialized for GPU: NVIDIA TITAN RTX
2026-01-10 22:02:23,076 - live_vlm_webui.server - INFO - GPU monitor initialized
2026-01-10 22:02:23,076 - live_vlm_webui.server - INFO - GPU monitoring task started
2026-01-10 22:02:23,076 - live_vlm_webui.server - INFO - GPU monitoring loop started
======== Running on https://0.0.0.0:8090 ========
(Press CTRL+C to quit)
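Auto-detection found Ollama because it was already listening on its default port (11434). As a sanity check before launching, you can verify the service with standard Ollama commands (assuming your setup uses the defaults):

$ ollama list                              # models available locally
$ curl http://localhost:11434/v1/models    # the OpenAI-compatible endpoint the webui talks to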
Then open https://localhost:8090 in a browser (SSL is enabled with what is presumably a self-signed certificate, so the browser may show a warning you have to accept).
For API BASE URL you can fill in Ollama's endpoint or an sglang one; either works (see the sketch below).
Then pick a VLM model and turn on the camera, and it starts describing the video according to the prompt...
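On the backend side, a minimal sketch of the two options, assuming default ports (the model names and port below are illustrative assumptions, not from the post): pull a vision model into Ollama so auto-detection can find it, or start sglang's OpenAI-compatible server and paste its /v1 URL into API BASE URL.

# Ollama: pull the vision model seen in the log above
$ ollama pull llama3.2-vision

# sglang alternative (model path and port are assumptions):
$ python -m sglang.launch_server --model-path Qwen/Qwen2-VL-7B-Instruct --port 30000
# then set API BASE URL to http://localhost:30000/v1

Either way, the field expects an OpenAI-compatible base URL ending in /v1, like the auto-detected http://localhost:11434/v1 in the log above.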
