PowerMCP is an open-source collection of MCP servers for power system software like PowerWorld and OpenDSS. These tools enable LLMs to directly interact with power system applications, facilitating intelligent coordination, simulation, and control in the energy domain.
The Model Context Protocol (MCP) is an open standard that enables AI applications to connect with external tools and data sources. Think of MCP as a universal adapter for AI applications, much as USB-C is for physical devices. It provides:
- Standardized connections to power system software and data sources
- Secure and efficient data exchange between AI agents and power systems
- Reusable components for building intelligent power system applications
- Interoperability between different AI models and power system tools
We're building an open-source community focused on accelerating AI adoption in the power domain through MCP. Our goals are:
- Collaboration: Bring together power system experts, AI researchers, and software developers
- Innovation: Create and share MCP servers for various power system software and tools
- Education: Provide resources and examples for implementing AI in power systems
- Standardization: Develop best practices for AI integration in the energy sector
Check out these demos showcasing PowerMCP in action:
- Contingency Evaluation Demo: An LLM automatically operates power system software, such as PowerWorld and pandapower, to perform contingency analysis and generate professional reports.
- Load Growth Evaluation Demo: An LLM automatically operates power system software, such as PowerWorld, to evaluate different load growth scenarios and generate professional reports with recommendations.
MCP follows a client-server architecture where:
- Hosts are LLM applications (like Claude Desktop or IDEs) that initiate connections
- Clients run inside the host application and maintain 1:1 connections with servers
- Servers provide context, tools, and prompts to clients
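MCP messages travel over JSON-RPC 2.0. As a rough illustration of the client-server exchange (the `tools/call` method name follows the MCP specification, but the tool name and arguments below are hypothetical), a host's client might invoke a server tool like this:

```python
import json

# Client -> server request asking the server to invoke one of its tools.
# "tools/call" is the MCP method for tool invocation; the tool name and
# arguments ("run_power_flow", "case_file") are made-up examples.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "run_power_flow",
        "arguments": {"case_file": "ieee39.raw"},
    },
}

# Server -> client response carrying the tool's result as content blocks.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "content": [
            {"type": "text", "text": "Power flow converged in 4 iterations."}
        ],
    },
}

# Both sides serialize messages as JSON over the transport (e.g. stdio).
wire = json.dumps(request)
print(json.loads(wire)["method"])
```

The host never calls power system software directly; it only exchanges messages like these, and the server translates them into calls against PowerWorld, OpenDSS, pandapower, and so on.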
Check out these helpful tutorials to get started with MCP:
- Getting Started with MCP: Official introduction to the Model Context Protocol fundamentals.
- Core Architecture: Detailed explanation of MCP's client-server architecture.
- Building Your First MCP Server: Step-by-step guide to creating a basic MCP server.
- Anthropic MCP Tutorial: Learn how to use MCP with Claude models.
- Cursor MCP Tutorial: Learn how to use MCP with Cursor.
- Related Protocols: OpenAI Function Calling, an alternative tool-use interface.
To use these MCP tools with an LLM:
- Install the MCP Python SDK:
pip install mcp
- Run your MCP server:
python your_server.py
- Configure your LLM application (e.g., Claude Desktop, Cursor) to use the MCP server:
{
  "mcpServers": {
    "servername": {
      "command": "python",
      "args": ["your_server.py"]
    }
  }
}
For instance, for pandapower, you could configure the server as follows:
{
  "mcpServers": {
    "pandapower": {
      "command": "python",
      "args": ["pandapower/panda_mcp.py"]
    }
  }
}
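Because the configuration is plain JSON, you can also generate or extend it from a script. The sketch below builds on the pandapower entry above and adds a second, hypothetical server entry (the `opendss` name and path are examples, not part of this repository's layout):

```python
import json

# Start from the pandapower entry shown above.
config = {
    "mcpServers": {
        "pandapower": {
            "command": "python",
            "args": ["pandapower/panda_mcp.py"],
        }
    }
}

# Hypothetical second server; point "args" at whatever script you run.
config["mcpServers"]["opendss"] = {
    "command": "python",
    "args": ["opendss/opendss_mcp.py"],
}

# Emit the merged configuration for your host application's config file.
print(json.dumps(config, indent=2))
```

Each key under `mcpServers` becomes a separately launched server in the host, so one LLM application can work with several power system tools at once.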
For detailed documentation about MCP, please refer to the official Model Context Protocol documentation.
We welcome contributions! Please see our Contributing Guidelines for details.
This project is licensed under the MIT License - see the LICENSE file for details.
- All contributors who help make this project better
- The Power and AI Initiative (PAI) at Harvard SEAS