vllm-mlx
About
OpenAI- and Anthropic-compatible server for Apple Silicon. Run LLMs and vision-language models (Llama, Qwen-VL, LLaVA) with continuous batching, MCP tool calling, and multimodal support. Native MLX backend, 400+ tok/s. Works with Claude Code.
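Because the server exposes Anthropic-compatible endpoints, Claude Code can be pointed at it via its standard base-URL environment variable. A minimal sketch, assuming the server listens on localhost:8000 (the actual host, port, and model name depend on how you launch vllm-mlx — check the project's docs):

```shell
# Point Claude Code at the local vllm-mlx server instead of Anthropic's API.
# ANTHROPIC_BASE_URL is the standard Claude Code override for the API endpoint;
# localhost:8000 is an assumed default, not confirmed by this listing.
export ANTHROPIC_BASE_URL="http://localhost:8000"

# Many local servers accept any placeholder key; this value is an assumption.
export ANTHROPIC_API_KEY="local"
```

OpenAI-compatible clients can be configured the same way, by setting their base URL to the server's `/v1` endpoint.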
Similar
The agent harness performance optimization system. Skills, instincts, memory, security, and research-first development for Claude Code, Code
🪨 why use many token when few token do trick — Claude Code skill that cuts 65% of tokens by talking like caveman
The complete guide to Claude Code: CLAUDE.md, hooks, skills, MCP servers, and commands
CLI tool for configuring and monitoring Claude Code
A Claude Code plugin that shows what's happening - context usage, active tools, running agents, and todo progress