LM Studio MCP

Run local large language models and expose them via OpenAI-compatible API with LM Studio.

Categories
AI & ML Tools, Local, Local LLM Inference
Source
lmstudio-ai
Author
lmstudio
Repository
https://github.com/lmstudio-ai/lmstudio-mcp

Installation

Install LM Studio and enable its local server mode, which exposes the OpenAI-compatible API on your machine.
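
Once the local server is running, it can be called like any OpenAI-style endpoint. A minimal sketch using only the standard library (assumptions: port 1234 is LM Studio's default; `local-model` is a placeholder for whatever model you have loaded):

```python
import json
import urllib.request

# LM Studio's local server defaults to this OpenAI-compatible base URL;
# the port is configurable in the app.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(prompt: str, model: str = "local-model") -> dict:
    """Build an OpenAI-style /chat/completions payload.

    The model name is a placeholder: LM Studio serves whichever
    model is currently loaded, regardless of the name sent here.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

payload = build_chat_request("Summarize MCP in one sentence.")

# To actually send it, LM Studio's server must be running:
# req = urllib.request.Request(
#     f"{BASE_URL}/chat/completions",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the wire format matches OpenAI's, existing OpenAI client libraries also work by pointing their base URL at the local server.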

Use Cases

  • Model inference
  • Model management
  • Chat completions

Tags

local-llm, inference, openai-compatible
