SwizzLM (Swizzle ’em) is a terminal-based tool that provides access to any LLM from a single pane. Talk to models through the OpenRouter API, through locally hosted models (Ollama/LM Studio), or through externally hosted models via automated browser sessions. This project was created to improve client-side AI workflows and to reduce the friction of benchmarking and discovering new models.
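
The OpenRouter path mentioned above uses an OpenAI-compatible chat-completions endpoint, which is also the schema Ollama and LM Studio expose locally. As a rough sketch of the kind of request such a tool issues under the hood (the model name and prompt here are placeholder assumptions, not SwizzLM's actual defaults):

```python
import json
import os
import urllib.request

# OpenRouter's OpenAI-compatible chat-completions endpoint; swapping this URL
# for a local Ollama/LM Studio server (e.g. http://localhost:11434/v1/...)
# keeps the same payload shape.
API_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(model: str, prompt: str, api_key: str) -> urllib.request.Request:
    """Build a chat-completion request in the OpenAI-style schema."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

if __name__ == "__main__":
    # Only sends a live request when a real key is present.
    key = os.environ.get("OPENROUTER_API_KEY")
    if key:
        # "openai/gpt-4o-mini" is an illustrative model slug, not a recommendation.
        req = build_request("openai/gpt-4o-mini", "Hello!", key)
        with urllib.request.urlopen(req) as resp:
            print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the schema is shared across providers, a single pane can route the same conversation to a hosted or local backend by changing only the URL and credentials.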