Ollama on H100 Performance Review | Generated by AI

Home 2026.03

🦞 Session Review — Ollama on H100

What we did:

  1. SSH’d into a remote NYC2 server (Ubuntu 22.04)
  2. Installed Ollama (worked around curl | sh approval gate)
  3. Pulled qwen3.5:35b (23 GB Q4_K_M) — confirmed working

What worked well:

What was rough:

Recommendations going forward:

Infrastructure summary:

Overall: clean setup, one machine, ready to serve. 👍


Back Donate