{"id":6322,"date":"2025-12-18T09:31:52","date_gmt":"2025-12-18T08:31:52","guid":{"rendered":"https:\/\/www.nextfem.it\/it\/?page_id=6322"},"modified":"2026-02-11T17:30:55","modified_gmt":"2026-02-11T16:30:55","slug":"nextfem-ai-tools","status":"publish","type":"page","link":"https:\/\/www.nextfem.it\/it\/nextfem-ai-tools\/","title":{"rendered":"NextFEM AI Tools"},"content":{"rendered":"<table style=\"width: 100%; border: none; border-collapse: collapse; border-radius: 15px; background-color: #f0cc9e;\" border=\"medium\">\n<tbody>\n<tr>\n<td style=\"width: 30%; border: none;\" rowspan=\"2\"><img decoding=\"async\" class=\"aligncenter wp-image-6256\" src=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/NFmcpServer.png\" alt=\"\" width=\"141\" height=\"141\" srcset=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/NFmcpServer.png 240w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/NFmcpServer-150x150.png 150w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/NFmcpServer-24x24.png 24w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/NFmcpServer-48x48.png 48w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/NFmcpServer-96x96.png 96w\" sizes=\"(max-width: 141px) 100vw, 141px\" \/><\/td>\n<td style=\"width: 70%; border: none;\">\n<h2>NextFEM AI Tools<\/h2>\n<\/td>\n<\/tr>\n<tr>\n<td style=\"width: 70%; border: none;\">\n<h5>AI-DRIVEN DESIGN IN NextFEM PROGRAMS<\/h5>\n<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<h2>AI Assistant v2<\/h2>\n<p><span style=\"font-family: tahoma, arial, helvetica, sans-serif;\"><img decoding=\"async\" class=\" wp-image-6080 alignleft\" src=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/AIassistant.png\" alt=\"\" width=\"60\" height=\"60\" srcset=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/AIassistant.png 32w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/AIassistant-24x24.png 24w\" sizes=\"(max-width: 60px) 100vw, 60px\" \/><\/span><\/p>\n<p>NextFEM Designer v2.7 integrates the new <strong>AI Assistant 
v2<\/strong>, which supports the NextFEM MCP tools (now shipped with the program). It can be started from the <em>Plugins<\/em> tab.<br \/>\nThis plugin still supports the simple modelling AI calls of v1 (see the chapter below); however, using the MCP tools is strongly encouraged. The MCP mode can be activated by selecting the &#8220;<em>MCP tools<\/em>&#8221; checkbox in the main window.<\/p>\n<p><img fetchpriority=\"high\" decoding=\"async\" class=\"aligncenter size-full wp-image-6323\" src=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/AI_Assistant_v2.jpg\" alt=\"\" width=\"317\" height=\"541\" srcset=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/AI_Assistant_v2.jpg 317w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/AI_Assistant_v2-176x300.jpg 176w\" sizes=\"(max-width: 317px) 100vw, 317px\" \/><br \/>\nAfter selecting the checkbox, please wait for the local MCP server to start. Once the number of available MCP tools is displayed, you&#8217;re ready to go.<\/p>\n<p>The providers supported by v1 continue to be supported. 
Generally, the AI client supports any provider exposing the OpenAI v1 chat completions API.<\/p>\n<table>\n<tbody>\n<tr>\n<td><strong>Provider<\/strong><\/td>\n<td><strong>Endpoint<\/strong><\/td>\n<\/tr>\n<tr>\n<td>OpenRouter<\/td>\n<td>https:\/\/openrouter.ai\/api\/v1<\/td>\n<\/tr>\n<tr>\n<td>HuggingFace<\/td>\n<td>https:\/\/router.huggingface.co\/v1<\/td>\n<\/tr>\n<tr>\n<td>Groq<\/td>\n<td>https:\/\/api.groq.com\/openai\/v1<\/td>\n<\/tr>\n<tr>\n<td>LM Studio<\/td>\n<td>http:\/\/localhost:1234\/v1<\/td>\n<\/tr>\n<tr>\n<td>Cohere<\/td>\n<td>https:\/\/api.cohere.com\/v1<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p>Please note that the endpoint addresses to be provided are different from the ones that had to be supplied in v1.<\/p>\n<p>Finally, v2 supports storing the endpoint address, API key and chat model name in a CSV file called &#8220;LLMkeys.csv&#8221; that can be placed in the installation folder of NextFEM Designer.<br \/>\nSample content of the CSV file:<\/p>\n<pre>#Endpoint;API key;Modelname\r\nhttps:\/\/api.yourprovider.com\/v1;apikey;gpt-4.1<\/pre>\n<p>Please contact us for further information.<\/p>\n<h2>AI Assistant v1<\/h2>\n<p>This chapter illustrates how to use the <strong>AI Assistant v1 plugin<\/strong> shipped with NextFEM Designer v2.6.<br \/>\nThe plugin, free for everyone, lets you use AI APIs with NextFEM Designer. Users can interact via chat with their favourite AI providers, while the plugin tells the API to format the reply in a way that can be read by the plugin and converted into commands. 
In that sense, the plugin acts like an <strong>AI agent<\/strong> in NextFEM Designer.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-6081\" src=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/AI_Assistant_window.jpg\" alt=\"\" width=\"317\" height=\"541\" srcset=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/AI_Assistant_window.jpg 317w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/AI_Assistant_window-176x300.jpg 176w\" sizes=\"(max-width: 317px) 100vw, 317px\" \/><\/p>\n<h4>Supported AI providers<\/h4>\n<p>Users must supply their own API key and AI\/LLM server address. The plugin natively supports all LLM APIs that are OpenAI-like, and it has been tested with:<\/p>\n<ul>\n<li><strong><a href=\"https:\/\/platform.openai.com\/api-keys\" target=\"_blank\" rel=\"noopener\">OpenAI<\/a><\/strong><\/li>\n<li><a href=\"https:\/\/console.anthropic.com\/settings\/keys\" target=\"_blank\" rel=\"noopener\"><strong>Claude<\/strong><\/a><\/li>\n<li><a href=\"https:\/\/openrouter.ai\/settings\/keys\" target=\"_blank\" rel=\"noopener\"><strong>OpenRouter<\/strong><\/a><\/li>\n<li><a href=\"https:\/\/huggingface.co\/settings\/tokens\" target=\"_blank\" rel=\"noopener\"><strong>HuggingFace<\/strong><\/a><\/li>\n<li><a href=\"https:\/\/console.groq.com\/keys\" target=\"_blank\" rel=\"noopener\"><strong>Groq<\/strong><\/a><\/li>\n<li><a href=\"https:\/\/lmstudio.ai\/\" target=\"_blank\" rel=\"noopener\"><strong>LM Studio<\/strong><\/a> for running LLM models locally.<\/li>\n<\/ul>\n<p>Each of the links above leads to the page from which you can get your API key. 
Also, please refer to the documentation of each provider to get the server address and the model name.<\/p>\n<p>Please find some examples below:<\/p>\n<p>&nbsp;<\/p>\n<table>\n<tbody>\n<tr>\n<td><strong>Provider<\/strong><\/td>\n<td><strong>Endpoint<\/strong><\/td>\n<td><strong>Tested models<\/strong><\/td>\n<\/tr>\n<tr>\n<td>OpenRouter<\/td>\n<td>https:\/\/openrouter.ai\/api\/v1\/chat\/completions<\/td>\n<td>deepseek\/deepseek-chat-v3-0324:free<br \/>\nqwen\/qwen2.5-vl-32b-instruct:free<br \/>\ngoogle\/gemini-2.5-pro-exp-03-25:free<br \/>\nmistralai\/mistral-small-3.1-24b-instruct:free<br \/>\nopen-r1\/olympiccoder-32b:free<br \/>\ngoogle\/gemma-3-4b-it:free<br \/>\ndeepseek\/deepseek-v3-base:free<\/td>\n<\/tr>\n<tr>\n<td>HuggingFace<\/td>\n<td>https:\/\/router.huggingface.co\/v1\/chat\/completions<\/td>\n<td>Qwen\/Qwen2.5-VL-7B-Instruct<br \/>\ngoogle\/gemma-2-2b-it<br \/>\ndeepseek-ai\/DeepSeek-V3-0324<\/td>\n<\/tr>\n<tr>\n<td>Groq<\/td>\n<td>https:\/\/api.groq.com\/openai\/v1\/chat\/completions<\/td>\n<td>llama-3.3-70b-versatile<br \/>\n<em>and others&#8230;<\/em><\/td>\n<\/tr>\n<tr>\n<td>LM Studio<\/td>\n<td>http:\/\/localhost:1234\/v1\/chat\/completions<\/td>\n<td>claude-3.7-sonnet-reasoning-gemma3-12b<br \/>\n&#8230;<br \/>\n<em>(all LM Studio models supported)<\/em><\/td>\n<\/tr>\n<tr>\n<td>Cohere<\/td>\n<td>https:\/\/api.cohere.com\/v2\/chat<\/td>\n<td>command-a-03-2025<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p>You can also use APIs other than the ones listed above, provided they&#8217;re compatible with the OpenAI SDK.<\/p>\n<p>Paste your API key, server address and model name into the corresponding textboxes, and you&#8217;re good to go!<\/p>\n<p><strong>Agent behaviour and commands<\/strong><\/p>\n<p>Your message is transmitted to the server without the previous context (in order to support even free API providers), and with a system message constraining the response format. 
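<\/p>\n<p>As an illustration, the underlying OpenAI-style chat completions request has roughly the following shape (the endpoint is taken from the table above; the model name, key and message contents below are placeholders, not values shipped with the plugin):<\/p>\n<pre>POST https:\/\/openrouter.ai\/api\/v1\/chat\/completions\r\nAuthorization: Bearer YOUR_API_KEY\r\nContent-Type: application\/json\r\n\r\n{\r\n  \"model\": \"your-model-name\",\r\n  \"messages\": [\r\n    {\"role\": \"system\", \"content\": \"Reply only with NextFEM commands...\"},\r\n    {\"role\": \"user\", \"content\": \"Create a 2-storey frame\"}\r\n  ]\r\n}<\/pre>\n<p>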
This works if the model is capable enough: suggested models should have at least 3B parameters and a quantization of 4 bits or higher. The system message also asks the LLM not to give human-readable explanations, in order to reduce token usage.<\/p>\n<p>The chat is always cleared when the API key, server or model changes. The server is asked to reply with NextFEM commands, that can be:<\/p>\n<ul>\n<li>reverted by undo<\/li>\n<li>executed even partially by the user, by selecting the rows to execute.<\/li>\n<\/ul>\n<p>This helps the user to control what has been changed in the model, for instance by repeating modelling commands manually. Right-click in the chat box to access these commands.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-6091\" src=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/AI_assistant_context__menu.jpg\" alt=\"\" width=\"1104\" height=\"626\" srcset=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/AI_assistant_context__menu.jpg 1302w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/AI_assistant_context__menu-300x170.jpg 300w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/AI_assistant_context__menu-1024x580.jpg 1024w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/AI_assistant_context__menu-768x435.jpg 768w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/AI_assistant_context__menu-600x340.jpg 600w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/AI_assistant_context__menu-128x72.jpg 128w\" sizes=\"(max-width: 1104px) 100vw, 1104px\" \/><\/p>\n<p>If the LLM fails to provide valid nodes and\/or elements, press Undo in the program and try again with a more detailed description of your request in the prompt. 
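<\/p>\n<p>An illustrative example of a well-specified prompt (sections, material and dimensions below are arbitrary choices, not defaults of the plugin):<\/p>\n<pre>In NextFEM Designer, create a 2-storey, single-bay plane frame:\r\nbays 5 m wide, storeys 3 m high, HE200A columns and IPE240 beams\r\nin S275 steel, with fixed base restraints.<\/pre>\n<p>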
Different LLMs behave differently, so the same prompt that works with one LLM model may not work with another.<\/p>\n<p>If you&#8217;re sharing screenshots, remember to hide your API key and server address!<\/p>\n<h2>NextFEM MCP server<\/h2>\n<p>This guide will show you how to use the local NextFEM MCP server with:<\/p>\n<ul style=\"list-style-type: square;\">\n<li><a href=\"https:\/\/www.anthropic.com\/claude\" target=\"_blank\" rel=\"noopener\"><strong>Anthropic Claude<\/strong><\/a> AI<\/li>\n<li>GitHub Copilot AI in <a href=\"https:\/\/code.visualstudio.com\/\" target=\"_blank\" rel=\"noopener\"><strong>Visual Studio Code<\/strong><\/a><\/li>\n<li>GitHub Copilot AI in <a href=\"https:\/\/visualstudio.microsoft.com\/vs\/\" target=\"_blank\" rel=\"noopener\"><strong>Visual Studio 2022<\/strong><\/a><\/li>\n<li><a href=\"https:\/\/lmstudio.ai\/\" target=\"_blank\" rel=\"noopener\"><strong>LM Studio<\/strong><\/a> (version &gt;= 0.3.17)<\/li>\n<li><a href=\"https:\/\/chatgpt.com\/features\/desktop\/\" target=\"_blank\" rel=\"noopener\"><strong>OpenAI ChatGPT Desktop<\/strong><\/a><\/li>\n<\/ul>\n<p>The MCP server is a simple interface that allows you to connect your local NextFEM Designer installation to your favorite AI provider. 
You don&#8217;t need a paid plan for Claude or GitHub Copilot: this server works with the free versions of both providers, as well as with the free version of NextFEM Designer.<\/p>\n<p><em><strong>DEPRECATED (see the note below) &#8211; Version 1.0.0.2 &#8211; Release date: 17 December 2025<br \/>\nDEPRECATED (see the note below) &#8211; Version 1.0.0.1 &#8211; Release date: 20 October 2025<br \/>\nDEPRECATED (see the note below) &#8211; Version 1.0.0.0 &#8211; Release date: 08 October 2025<\/strong><\/em><\/p>\n<p><strong>Note<\/strong>: From version 2.7 onwards, the MCP tools server <span style=\"text-decoration: underline;\">is supplied and updated with NextFEM Designer<\/span> and can be found in the installation folder.<\/p>\n<h4>Prerequisites<\/h4>\n<p>NextFEM Designer is supposed to be already installed on your system. Be sure to activate the REST server at startup of the program, by enabling the option depicted below.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-6039\" src=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/enable_REST_startup.jpg\" alt=\"\" width=\"741\" height=\"726\" srcset=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/enable_REST_startup.jpg 867w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/enable_REST_startup-300x294.jpg 300w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/enable_REST_startup-768x753.jpg 768w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/enable_REST_startup-600x588.jpg 600w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/enable_REST_startup-24x24.jpg 24w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/enable_REST_startup-48x48.jpg 48w\" sizes=\"(max-width: 741px) 100vw, 741px\" \/><\/p>\n<p>&nbsp;<\/p>\n<h4>Installation in Claude Desktop (Windows)<\/h4>\n<p>Install <a href=\"https:\/\/claude.ai\/download\" target=\"_blank\" rel=\"noopener\"><strong>Claude Desktop for Windows.<\/strong><\/a><\/p>\n<p>The MCP server consists of an executable 
exposing the tools to be connected to the AI; it is shipped as <em>NextFEMmcpServer.exe<\/em>.<\/p>\n<p>1.\u00a0Find the NextFEM Designer installation folder (typically, it&#8217;s <em>C:\\Program Files\\NextFEM\\NextFEM Designer 64bit\\<\/em>);<\/p>\n<p>2. Configure Claude Desktop to load the MCP server at startup. Open the folder:<\/p>\n<p><code>%appdata%\\Claude<\/code><\/p>\n<p>and double-click <em>claude_desktop_config.json<\/em> to edit it.<\/p>\n<p>If the file does not exist, please do not create it by hand; instead, from inside Claude Desktop, use the <em>File \/ Settings \/ Developers \/ Change configuration<\/em> button.<\/p>\n<p>Then change the content of the file to:<\/p>\n<div>\n<pre>{\r\n    \"mcpServers\": {\r\n        \"NextFEM\": {\r\n            \"command\": \"C:\\\\myPath\\\\NextFEMmcpServer.exe\",\r\n            \"args\": []\r\n        }\r\n    }\r\n}<\/pre>\n<\/div>\n<p>Remember to replace <em>myPath<\/em> with the MCP server path.<\/p>\n<p>3. That&#8217;s all. Restart NextFEM Designer and Claude Desktop. 
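Under the hood, Claude Desktop launches the executable and talks to it over stdio using JSON-RPC 2.0 messages, as defined by the MCP specification; for example, the available tools are requested with a message like:<\/p>\n<pre>{\"jsonrpc\": \"2.0\", \"id\": 1, \"method\": \"tools\/list\"}<\/pre>\n<p>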
You&#8217;ll see a hammer with the number of NextFEM Designer tools available in Claude.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-6044\" src=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/tools_claude.jpg\" alt=\"\" width=\"716\" height=\"77\" srcset=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/tools_claude.jpg 716w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/tools_claude-300x32.jpg 300w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/tools_claude-600x65.jpg 600w\" sizes=\"(max-width: 716px) 100vw, 716px\" \/><\/p>\n<h4>See it in action<\/h4>\n<p><iframe title=\"NextFEM Designer with MCP server for Claude AI\" width=\"800\" height=\"450\" src=\"https:\/\/www.youtube.com\/embed\/83A3OwJeTPY?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe><\/p>\n<h4>Installation in GitHub Copilot for Visual Studio Code (Windows)<\/h4>\n<p>1.\u00a0Find the NextFEM Designer installation folder (typically, it&#8217;s <em>C:\\Program Files\\NextFEM\\NextFEM Designer 64bit\\<\/em>);<\/p>\n<p>2. Configure Visual Studio Code to load the MCP server at startup. Edit the file:<\/p>\n<p><code><strong>%appdata%\\Code\\User\\mcp.json<\/strong><\/code><\/p>\n<p>with the following lines:<\/p>\n<div>\n<pre>{\r\n\t\"servers\": {\r\n\t\t\"NextFEM\": {\r\n\t\t\t\"type\": \"stdio\",\r\n\t\t\t\"command\": \"C:\\\\myPath\\\\NextFEMmcpServer.exe\",\r\n\t\t\t\"args\": []\r\n\t\t}\r\n\t},\r\n\t\"inputs\": []\r\n}<\/pre>\n<\/div>\n<div>Remember to replace <em>myPath<\/em> with the MCP server path.<\/div>\n<div><\/div>\n<div>3. 
In Visual Studio Code, press F1 and then type <em>MCP<\/em><\/div>\n<div><\/div>\n<div><em><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-6250\" src=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/github1.png\" alt=\"\" width=\"609\" height=\"266\" srcset=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/github1.png 609w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/github1-300x131.png 300w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/github1-600x262.png 600w\" sizes=\"(max-width: 609px) 100vw, 609px\" \/><\/em><\/div>\n<div><\/div>\n<div>then select <strong>MCP: List Servers<\/strong>. Then click on the <em>NextFEM<\/em> line and finally on <em>Start server<\/em>.<\/div>\n<div><\/div>\n<div>You&#8217;re done. Start using Copilot as usual; click on the toolbox in the bottom right corner to check that NextFEM MCP tools are active.<\/div>\n<div><\/div>\n<div><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-6251\" src=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/copilot2.png\" alt=\"\" width=\"848\" height=\"424\" srcset=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/copilot2.png 1028w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/copilot2-300x150.png 300w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/copilot2-1024x512.png 1024w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/copilot2-768x384.png 768w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/copilot2-600x300.png 600w\" sizes=\"(max-width: 848px) 100vw, 848px\" \/><\/div>\n<p>&nbsp;<\/p>\n<h4>Installation in GitHub Copilot for Visual Studio 2022 (Windows)<\/h4>\n<p>1.\u00a0Find the NextFEM Designer installation folder (typically, it&#8217;s <em>C:\\Program Files\\NextFEM\\NextFEM Designer 64bit\\<\/em>);<\/p>\n<p>2. Configure Visual Studio 2022 by adding the MCP server from the tools icon in the Copilot chat.<\/p>\n<p>3. 
After the addition, enable the NextFEM tools.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-6273\" src=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/VS_mcp.png\" alt=\"\" width=\"464\" height=\"361\" srcset=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/VS_mcp.png 464w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/VS_mcp-300x233.png 300w\" sizes=\"(max-width: 464px) 100vw, 464px\" \/><\/p>\n<h4>Installation in LM Studio (Windows)<\/h4>\n<p>1.\u00a0Download the <a href=\"https:\/\/www.nextfem.it\/patches\/NextFEMmcpServer.zip\"><strong>NextFEM MCP server executable from here<\/strong><\/a>. Then decompress the .exe to a known and reachable folder;<\/p>\n<p>2. Configure LM Studio to use the MCP server together with the LLM you&#8217;re using. In the chat window, click the &#8220;Program&#8221; button on the right sidebar; then click on &#8220;Edit mcp.json&#8221;.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-6264\" src=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/lmstudio.png\" alt=\"\" width=\"315\" height=\"257\" srcset=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/lmstudio.png 315w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/lmstudio-300x245.png 300w\" sizes=\"(max-width: 315px) 100vw, 315px\" \/><\/p>\n<p>Put the following lines in the JSON file:<\/p>\n<pre>{\r\n  \"mcpServers\": {\r\n    \"NextFEM\": {\r\n      \"command\": \"C:\\\\myPath\\\\NextFEMmcpServer.exe\"\r\n    }\r\n  }\r\n}\r\n<\/pre>\n<p>Remember to replace <em>myPath<\/em> with the MCP server path. Finally, enable the tool that appears on the right sidebar (&#8220;mcp\/next-fem&#8221;).<\/p>\n<h4>Installation in OpenAI ChatGPT (Windows)<\/h4>\n<p>The paid versions of the ChatGPT desktop client only support the use of remote MCP tools (i.e., those accessible from an internet server). 
The free version does not support the addition of MCP tools.<\/p>\n<p>For the following procedure, you need a free GitHub account:<\/p>\n<p>1. Install Node.js with the command:<\/p>\n<pre>winget install --silent --accept-package-agreements --accept-source-agreements OpenJS.NodeJS<\/pre>\n<p>Also install Microsoft DevTunnel with the command:<\/p>\n<pre>winget install Microsoft.devtunnel<\/pre>\n<p>and log in with your GitHub account with the command:<\/p>\n<pre>devtunnel user login -d -g<\/pre>\n<p>2. You can temporarily publish the local MCP server of NextFEM Designer with the command:<\/p>\n<pre>npx -y supergateway --stdio \"C:\\Program Files\\NextFEM\\NextFEM Designer 64bit\\NextFEMmcpServer.exe\" --outputTransport streamableHttp<\/pre>\n<p>leaving the prompt window active. Start a new command prompt and enter the command:<\/p>\n<pre>devtunnel host -p 8000 --allow-anonymous<\/pre>\n<p>leaving this prompt window active as well.<\/p>\n<p>3. Configure ChatGPT with the address provided by the last command on the line &#8220;Connect via browser:&#8221;, which is usually in the format https:\/\/<em>randomCode<\/em>.devtunnels.ms<br \/>\n&#8211; Enable Developer Mode from <em>Settings \/ Apps<\/em><\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-6409\" src=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/chatGPT_svil.png\" alt=\"\" width=\"532\" height=\"259\" srcset=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/chatGPT_svil.png 532w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/chatGPT_svil-300x146.png 300w\" sizes=\"(max-width: 532px) 100vw, 532px\" \/><\/p>\n<p>&#8211;\u00a0Select &#8220;Create app&#8221; and fill in the fields:<\/p>\n<p>Name: NextFEM MCP<\/p>\n<p>Authentication: no authentication<\/p>\n<p>URL: https:\/\/randomCode.devtunnels.ms\/mcp<\/p>\n<p>NOTE: be sure to add the suffix \/mcp<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-6410\" 
src=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/chatGPT_server.png\" alt=\"\" width=\"447\" height=\"665\" srcset=\"https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/chatGPT_server.png 447w, https:\/\/www.nextfem.it\/it\/wp-content\/uploads\/chatGPT_server-202x300.png 202w\" sizes=\"(max-width: 447px) 100vw, 447px\" \/><\/p>\n<p>Then click Create. Now the NextFEM MCP tools are available in your chats\u2014add the tools from the menu in the message pane to use them.<\/p>\n<h4>Notes<\/h4>\n<ul style=\"list-style-type: square;\">\n<li>Use clear and circumstanced prompts &#8211; e.g. always refer at least once to NextFEM Designer in order to force the AI to use MCP tools<\/li>\n<li>Be aware that only a few selected commands in NextFEM API are available as a tool. Avoid to make requests not covered by the commands.<\/li>\n<\/ul>\n<p>&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>NextFEM AI Tools AI-DRIVEN DESIGN IN NextFEM PROGRAMS AI Assistant v2 NextFEM Designer v2.7 integrates the new AI Assistant v2, which supports the NextFEM MCP tools (now shipped with the 
&#8230;<\/p>\n","protected":false},"author":136,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"_bbp_topic_count":0,"_bbp_reply_count":0,"_bbp_total_topic_count":0,"_bbp_total_reply_count":0,"_bbp_voice_count":0,"_bbp_anonymous_reply_count":0,"_bbp_topic_count_hidden":0,"_bbp_reply_count_hidden":0,"_bbp_forum_subforum_count":0,"footnotes":""},"class_list":["post-6322","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/www.nextfem.it\/it\/wp-json\/wp\/v2\/pages\/6322","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.nextfem.it\/it\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/www.nextfem.it\/it\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/www.nextfem.it\/it\/wp-json\/wp\/v2\/users\/136"}],"replies":[{"embeddable":true,"href":"https:\/\/www.nextfem.it\/it\/wp-json\/wp\/v2\/comments?post=6322"}],"version-history":[{"count":2,"href":"https:\/\/www.nextfem.it\/it\/wp-json\/wp\/v2\/pages\/6322\/revisions"}],"predecessor-version":[{"id":6413,"href":"https:\/\/www.nextfem.it\/it\/wp-json\/wp\/v2\/pages\/6322\/revisions\/6413"}],"wp:attachment":[{"href":"https:\/\/www.nextfem.it\/it\/wp-json\/wp\/v2\/media?parent=6322"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}