id: TC-RUNTIME-002
name: GPU Detection
suite: runtime
priority: 2
timeout: 60000
dependencies:
  - TC-RUNTIME-001
steps:
  - name: Check nvidia-smi inside container
    command: docker exec ollama37 nvidia-smi
  - name: Check CUDA libraries
    command: docker exec ollama37 ldconfig -p | grep -i cuda | head -5
  - name: Check Ollama GPU detection
    command: cd docker && docker compose logs 2>&1 | grep -i gpu | head -10
criteria: |
  Tesla K80 GPU should be detected inside the container.

  Expected:
  - nvidia-smi shows Tesla K80 GPU(s)
  - Driver version 470.x (or compatible)
  - CUDA libraries are available (libcuda, libcublas, etc.)
  - Ollama logs mention GPU detection

  The K80 has 12 GB of VRAM per GPU. Accept variations in reported memory.
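
# --- Verification sketch (YAML comment only, not part of the test schema) ---
# A minimal, hedged example of how the criteria above could be checked by hand.
# The container name "ollama37" is taken from the steps above; the nvidia-smi
# query flags assume a driver recent enough to support them:
#
#   docker exec ollama37 nvidia-smi --query-gpu=name,driver_version,memory.total --format=csv,noheader
#
# Each output line should name a Tesla K80, report a 470.x driver, and show
# roughly 11-12 GiB of memory per GPU (reported values vary slightly by driver).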