MCP server for AI chat
Perform operations in NextFEM Designer via AI chat
This guide shows you how to set up a local MCP server for:
- Anthropic Claude AI
- GitHub Copilot AI in Visual Studio Code
- GitHub Copilot AI in Visual Studio 2022
- LM Studio (version >= 0.3.17)
- OpenAI ChatGPT Desktop (coming soon)
The MCP server is a simple interface that connects your local NextFEM Designer installation to your favorite AI provider. You don’t need a paid plan for Claude or GitHub Copilot: the server works with the free version of both providers, as well as with the free version of NextFEM Designer.
Version 1.0.0.1 – Release date: 20 October 2025
Version 1.0.0.0 – Release date: 08 October 2025
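Under the hood, an MCP server talks to the AI client over JSON-RPC 2.0 on standard input/output. As a rough sketch (the protocol version string and client name below are illustrative, not NextFEM-specific), the client opens a session with an "initialize" request like this:

```python
import json

# Sketch of the JSON-RPC 2.0 "initialize" request an MCP client sends
# over the server's stdin when the session starts. The protocolVersion
# and clientInfo values here are illustrative placeholders.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1"},
    },
}

# MCP stdio transports exchange one JSON message per line.
wire_message = json.dumps(initialize_request)
print(wire_message)
```

This is only meant to show the shape of the traffic; the clients below (Claude Desktop, VS Code, LM Studio) perform this handshake for you.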
Prerequisite
NextFEM Designer is assumed to be already installed on your system. Be sure to activate the REST server at program startup by enabling the option depicted below.

Installation in Claude Desktop (Windows)
Install Claude Desktop for Windows.
The MCP server is an executable that exposes the NextFEM Designer tools to be connected with the AI. This is supplied by our MCPserver.
1. Download the NextFEM MCP server executable from here, then extract the .exe to a known and reachable folder;
2. Configure Claude Desktop to load the MCP server at startup. Open the folder:
%appdata%\Claude
and double-click claude_desktop_config.json to edit it.
If the file does not exist, do not create it by hand; instead, from inside Claude Desktop, select File / Settings / Developers and click the Change configuration button.
Then change the content of the file to:
{
  "mcpServers": {
    "NextFEM": {
      "command": "C:\\myPath\\NextFEMmcpServer.exe",
      "args": []
    }
  }
}
Remember to replace myPath with the actual MCP server path.
3. That’s all. Restart NextFEM Designer and Claude Desktop. You’ll see a hammer icon with the number of NextFEM Designer tools available in Claude.
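If you prefer to script step 2, a small Python sketch like the following merges the NextFEM entry into claude_desktop_config.json without clobbering other servers you may have configured. The demo writes to a temporary file; in real use you would point it at the %appdata%\Claude file, and the executable path is a placeholder you must change:

```python
import json
import os
import tempfile

def add_nextfem_server(config_path, exe_path):
    """Merge a NextFEM MCP server entry into a Claude Desktop config
    file, preserving any other configured servers."""
    config = {}
    if os.path.exists(config_path):
        with open(config_path, "r", encoding="utf-8") as f:
            config = json.load(f)
    config.setdefault("mcpServers", {})["NextFEM"] = {
        "command": exe_path,
        "args": [],
    }
    with open(config_path, "w", encoding="utf-8") as f:
        json.dump(config, f, indent=2)
    return config

# Demo against a temporary file instead of %appdata%\Claude;
# the path below is a placeholder, not the real install location.
demo = os.path.join(tempfile.mkdtemp(), "claude_desktop_config.json")
result = add_nextfem_server(demo, r"C:\myPath\NextFEMmcpServer.exe")
print(result["mcpServers"]["NextFEM"]["command"])
```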

See it in action
Installation in GitHub Copilot for Visual Studio Code (Windows)
1. Download the NextFEM MCP server executable from here, then extract the .exe to a known and reachable folder;
2. Configure Visual Studio Code to load the MCP server at startup. Edit the file:
%appdata%\Code\User\mcp.json
with the following lines:
{
  "servers": {
    "NextFEM": {
      "type": "stdio",
      "command": "C:\\myPath\\NextFEMmcpServer.exe",
      "args": []
    }
  },
  "inputs": []
}
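Note that the VS Code file uses a top-level "servers" object with an explicit "type": "stdio", while Claude Desktop uses "mcpServers" and no type field. A quick sanity check on the shape VS Code expects might look like this sketch (the validation rules are a minimal assumption, not the full mcp.json schema):

```python
import json

def check_vscode_mcp_config(text):
    """Check the minimal shape expected in a VS Code mcp.json:
    a top-level "servers" object whose entries each have a
    "command" and, for stdio servers, "type": "stdio"."""
    cfg = json.loads(text)
    servers = cfg.get("servers")
    assert isinstance(servers, dict) and servers, 'missing "servers" object'
    for name, entry in servers.items():
        assert "command" in entry, f'{name}: missing "command"'
        assert entry.get("type") == "stdio", f'{name}: expected "type": "stdio"'
    return list(servers)

sample = """
{
  "servers": {
    "NextFEM": {
      "type": "stdio",
      "command": "C:\\\\myPath\\\\NextFEMmcpServer.exe",
      "args": []
    }
  },
  "inputs": []
}
"""
print(check_vscode_mcp_config(sample))  # ['NextFEM']
```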


Installation in GitHub Copilot for Visual Studio 2022 (Windows)
1. Download the NextFEM MCP server executable from here, then extract the .exe to a known and reachable folder;
2. Configure Visual Studio 2022 by adding the MCP server from the tools icon in the Copilot chat.
3. After the addition, enable the NextFEM tools.

Installation in LM Studio (Windows)
1. Download the NextFEM MCP server executable from here, then extract the .exe to a known and reachable folder;
2. Configure LM Studio to use the MCP server together with the LLM you’re using. In the chat window, click the “Program” button in the right sidebar, then click “Edit mcp.json”.

Put the following lines in the JSON file:
{
  "mcpServers": {
    "NextFEM": {
      "command": "C:\\myPath\\NextFEMmcpServer.exe"
    }
  }
}
Remember to replace myPath with the actual MCP server path. Finally, enable the tool that appears in the right sidebar (“mcp/next-fem”).
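When hand-editing any of these config files, a common mistake is leaving single backslashes in the Windows path: "\m" is not a valid JSON escape sequence, so the file fails to parse and the server silently never loads. The backslashes must be doubled, as in the examples above. This sketch demonstrates the difference:

```python
import json

# Single backslashes in a Windows path are invalid JSON escapes;
# they must be doubled in the config file.
bad = '{"mcpServers": {"NextFEM": {"command": "C:\\myPath\\NextFEMmcpServer.exe"}}}'
good = '{"mcpServers": {"NextFEM": {"command": "C:\\\\myPath\\\\NextFEMmcpServer.exe"}}}'

try:
    json.loads(bad)
    raise SystemExit("unexpectedly valid")
except json.JSONDecodeError:
    print("single backslashes: invalid JSON")

cfg = json.loads(good)
print(cfg["mcpServers"]["NextFEM"]["command"])  # C:\myPath\NextFEMmcpServer.exe
```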
Notes
- Use clear and specific prompts – e.g. always refer at least once to NextFEM Designer, in order to push the AI to use the MCP tools
- Be aware that only a few selected commands of the NextFEM API are available as tools. Avoid making requests not covered by these commands.

