
LLM Model Switcher

A simple web-based tool to switch between different llama.cpp model configurations on a headless AI server, featuring a retro-futuristic terminal interface.

How it works

The tool manages llama.cpp configurations by manipulating symbolic links and interacting with systemd:

  1. Configurations: It scans /etc/llama.cpp.d for .conf files. Each file should contain the environment variables for a specific model.
  2. Active Link: The systemd service for llama.cpp is expected to read its parameters from /etc/llama.cpp.conf.
  3. Switching: When you select a model in the web UI:
    • The tool updates the symlink at /etc/llama.cpp.conf to point to the selected file in /etc/llama.cpp.d.
    • It restarts the llama.cpp systemd service to apply the changes.
  4. Monitoring: The tool provides a real-time view of the last 100 lines of the llama.cpp service logs using journalctl.
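The steps above can be sketched in Python. This is a minimal illustration, not the project's actual code: the function names, the `llama.cpp.service` unit name, and the temporary-link trick are assumptions; the paths come from the description above.

```python
import os
import subprocess
from pathlib import Path

CONFIG_DIR = Path("/etc/llama.cpp.d")      # one .conf file per model
ACTIVE_LINK = Path("/etc/llama.cpp.conf")  # symlink the service reads
SERVICE = "llama.cpp.service"              # assumed systemd unit name

def list_models(config_dir: Path = CONFIG_DIR) -> list[str]:
    """Step 1: scan the config directory for available model configs."""
    return sorted(p.stem for p in config_dir.glob("*.conf"))

def switch_model(name: str) -> None:
    """Steps 2-3: repoint the active symlink, then restart the service."""
    target = CONFIG_DIR / f"{name}.conf"
    if not target.exists():
        raise FileNotFoundError(target)
    # Create the new link under a temporary name, then rename it over the
    # old one so the active config is swapped atomically.
    tmp = ACTIVE_LINK.with_name(ACTIVE_LINK.name + ".tmp")
    tmp.symlink_to(target)
    os.replace(tmp, ACTIVE_LINK)
    subprocess.run(["systemctl", "restart", SERVICE], check=True)

def tail_logs(lines: int = 100) -> str:
    """Step 4: fetch the last N lines of service logs via journalctl."""
    result = subprocess.run(
        ["journalctl", "-u", SERVICE, "-n", str(lines), "--no-pager"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout
```

Swapping the symlink via a rename (rather than deleting and recreating it) avoids a window where `/etc/llama.cpp.conf` is missing while the service restarts.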

Features

Requirements

Usage

Run the script with sudo:

sudo python3 main.py

By default, the server will be available at http://127.0.0.1:7330.

Command Line Options

Project Structure