We host the application gollama so that it can be run in our online workstations, either with Wine or natively.


Quick description of gollama:

Gollama is a macOS and Linux tool for managing Ollama models through an interactive terminal-based interface. It provides a TUI that lets users list, inspect, sort, filter, edit, run, unload, copy, rename, delete, and push models from one place rather than relying entirely on manual command-line workflows. The project is aimed at developers and local AI users who frequently work with multiple Ollama models and want a more efficient operational layer for everyday maintenance. Beyond standard model management, Gollama can display metadata such as size, quantization level, model family, and modification date, which helps users compare models quickly. One of its more distinctive capabilities is a VRAM estimation system that can calculate memory requirements, estimate context limits, and help users choose quantization settings that fit available hardware.
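To illustrate the kind of calculation the VRAM estimation feature performs, here is a simplified sketch in Go. The function name, the overhead factor, and the bits-per-weight figure are illustrative assumptions, not gollama's actual algorithm: it approximates weight memory as parameters × bits-per-weight ÷ 8, then adds a flat margin for KV cache and runtime buffers.

```go
package main

import "fmt"

// estimateVRAMGiB roughly estimates the VRAM needed to load a quantized
// model: weight bytes = params * bitsPerWeight / 8, inflated by a fixed
// overhead fraction for KV cache and runtime buffers.
// This is an illustrative approximation, not gollama's real estimator.
func estimateVRAMGiB(params, bitsPerWeight, overhead float64) float64 {
	weightBytes := params * bitsPerWeight / 8
	return weightBytes * (1 + overhead) / (1 << 30)
}

func main() {
	// Hypothetical example: a 7B-parameter model at ~4.5 bits/weight
	// (a typical 4-bit quantization), with 20% overhead assumed.
	fmt.Printf("estimated VRAM: %.2f GiB\n", estimateVRAMGiB(7e9, 4.5, 0.20))
	// prints: estimated VRAM: 4.40 GiB
}
```

A real estimator would also account for context length, since KV cache grows with the number of tokens kept in context; gollama's context-limit estimates address exactly that trade-off.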

Features:
  • Interactive TUI for model management
  • Sorting and filtering across model metadata
  • Modelfile editing with external editor support
  • Run and unload operations for models
  • Copy, rename, delete, and registry push actions
  • VRAM estimation for memory and context planning


Programming Language: Go.
Categories:
Large Language Models (LLM)

©2024. Winfy. All Rights Reserved.

By OD Group OU – Registry code: 1609791 – VAT number: EE102345621.