Setting up the Azure Terraform MCP Server in Visual Studio Code

If you've been using Copilot or Claude in VS Code to help write Terraform for Azure, you'll have hit the same wall I did: the model is confidently wrong about provider arguments more often than feels acceptable. It'll suggest an azurerm_storage_account argument that was deprecated two versions ago, or it'll hallucinate an attribute on an AzAPI resource that simply doesn't exist. The autocomplete is fast, but you spend more time fact-checking it against the registry docs than you save by using it.
The Azure team published the Azure Terraform MCP Server to fix exactly this problem. It plugs into any MCP-aware client (VS Code, Claude Desktop, Cursor, and others) and gives the model a set of grounded tools it can actually call against. Instead of guessing what arguments azurerm_key_vault accepts, the model can ask the MCP server, get the real provider documentation, and write code based on that.
In this post I'll walk through getting it set up in VS Code, what the tools actually do, and a couple of gotchas that caught me out on first install.
What the MCP server actually gives you
At a high level, the Azure Terraform MCP Server is a Node-based process that exposes a set of tools to your AI client. The tools fall into four buckets:
Documentation lookups for the AzureRM provider, the AzAPI provider, and Azure Verified Modules. The model can ask for the schema of azurerm_virtual_machine, get back the actual argument list, and write code against it.
Resource export through aztfexport. You point it at an existing Azure resource or resource group and it generates the Terraform to bring it under management.
Policy validation through Conftest, including the AVM security policy set. You can validate a workspace or a plan file against Azure-specific Rego policies without setting up the toolchain yourself.
Auto-installation of Conftest and the Azure policy libraries, so you don't have to hand-roll the setup.
Prerequisites
You'll need a few things in place before the install will go anywhere:
Node.js 20 or higher. The server is a TypeScript package distributed via npm. Earlier Node versions will fail at install.
VS Code with MCP support. This is built into recent VS Code releases as part of the GitHub Copilot Chat experience; you don't need a separate extension.
A GitHub Copilot or Copilot Pro licence (or another MCP-aware AI client). The MCP server itself doesn't care which model you use, but you need a client capable of calling MCP tools.
Optional but recommended: aztfexport, terraform, and the Azure CLI logged in to your subscription. None of these are required just to use the documentation tools, but you'll need them if you want the resource export or policy validation features to actually work end-to-end.
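A quick way to check all of this before wiring anything into VS Code is a preflight script. This is a minimal sketch that just reports what's on your PATH; node is the hard requirement, the rest only matter for the export and policy features:

```shell
#!/bin/sh
# Preflight check for the Azure Terraform MCP Server prerequisites.
# Reports each tool as present or missing instead of failing on the first gap.
checked=0
missing=""
for tool in node terraform az aztfexport; do
  checked=$((checked + 1))
  if command -v "$tool" >/dev/null 2>&1; then
    echo "ok:      $tool"
  else
    echo "missing: $tool"
    missing="$missing $tool"
  fi
done
[ -z "$missing" ] && echo "all prerequisites found" || echo "still needed:$missing"
```

Remember that having node on the PATH isn't enough on its own; node -v still needs to report 20 or higher.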
Installing the MCP server
There are two ways to install: from npm globally, or from source. For most people, the npm install is the right choice.
npm install -g @azure/terraform-mcp-server
That gives you the azure-terraform-mcp-server command on your PATH. You can sanity check it by running:
azure-terraform-mcp-server --help
If it prints a help banner (or sits there waiting for MCP traffic on stdio), you're good. If npm complains about the Node version, that's almost always because a node shim from nvm is pointing at an older runtime; check node -v and switch versions if needed.
Wiring it into VS Code
VS Code reads MCP server configuration from one of two places: a workspace-scoped .vscode/mcp.json, or your user-level settings. I prefer the workspace file for anything I'd want a colleague to pick up automatically when they clone the repo, and user settings for tools I always want available regardless of project.
Create .vscode/mcp.json in the root of your Terraform repository:
{
  "servers": {
    "azure-terraform": {
      "command": "azure-terraform-mcp-server",
      "env": {
        "GITHUB_TOKEN": "${env:GITHUB_TOKEN}"
      }
    }
  }
}
A couple of things worth noting in that config:
The GITHUB_TOKEN is technically optional, but I'd treat it as required in practice. Several of the documentation tools fetch from public GitHub repositories (the AVM module index, the AzAPI examples library), and an unauthenticated request hits the GitHub API rate limit of 60 requests per hour. With a token it goes to 5,000. If you're using this server through a working day, you will hit the unauthenticated limit and the model will start getting empty responses. Generate a personal access token with public_repo scope only, export it as GITHUB_TOKEN in your shell profile, and the environment syntax will pick it up.
The environment variable substitution syntax is a VS Code MCP convention. It means "read this from my environment variables at server start time". Don't paste your actual token into the JSON file; you'll commit it to the repo eventually, and I'd rather not be the one explaining that to your security team.
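For completeness, the shell side of that looks like the following. The token value here is a placeholder, not a real credential:

```shell
# Append to ~/.bashrc or ~/.zshrc so VS Code inherits the variable.
# Placeholder value -- substitute the PAT you generated with public_repo scope.
export GITHUB_TOKEN="ghp_xxxxxxxxxxxxxxxxxxxx"
```

One caveat: if you launch VS Code from the Dock or Start menu rather than from a terminal, it may not have sourced your profile, so you might need to restart it (or launch it via the code command) before the variable is visible.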
If you want the export and policy tools to work, add the Azure auth variables too:
{
  "servers": {
    "azure-terraform": {
      "command": "azure-terraform-mcp-server",
      "env": {
        "GITHUB_TOKEN": "${env:GITHUB_TOKEN}",
        "ARM_SUBSCRIPTION_ID": "${env:ARM_SUBSCRIPTION_ID}",
        "ARM_TENANT_ID": "${env:ARM_TENANT_ID}",
        "ARM_CLIENT_ID": "${env:ARM_CLIENT_ID}",
        "ARM_CLIENT_SECRET": "${env:ARM_CLIENT_SECRET}"
      }
    }
  }
}
For local workstation use I'd skip the ARM_CLIENT_SECRET entirely and rely on az login instead. The aztfexport tool will pick up the active CLI session, which is far less of a faff than rotating service principal secrets.
Starting the server in VS Code
Once mcp.json is saved, open the Command Palette and run MCP: List Servers. You should see azure-terraform in the list with a status of Stopped. Hit Start on it and watch the output panel.
To check the tools actually loaded, open the Copilot Chat panel, click the tool picker (the small spanner icon), and you should see the Azure Terraform tools listed alongside any other MCP servers you have configured. You can toggle them on and off here per chat session, which is occasionally useful when you want to force the model to think for itself rather than reach for a tool.
Using it in practice
The way the MCP integration works in VS Code is that the model decides when to call tools based on what you ask. You don't invoke them directly. So the way to use the Azure Terraform MCP Server is just to ask Copilot questions that would benefit from grounded Azure context.
A few prompts that exercise different tools:
Documentation lookup:
"What arguments does the latest azurerm_storage_account resource accept for blob retention policy? Show me the HCL."
AVM module suggestion:
"Is there an Azure Verified Module for an Application Gateway? If so, give me the latest version and a basic usage example."
Resource export:
"I have a storage account at /subscriptions/xxx/resourceGroups/rg-data-prod/providers/Microsoft.Storage/storageAccounts/stdataprod001. Generate the Terraform for it."
Policy validation:
"Run the AVM security policies against the Terraform in the ./infra folder and tell me what's failing."
The thing I'd stress is that the model is still doing the orchestration. Sometimes it'll forget the tool exists and answer from its training data anyway. If that happens, prompt it explicitly: "use the Azure Terraform MCP server to look this up". Once it's used a tool once in a session it tends to lean on it for the rest of the conversation.
A couple of tips
The cache directory. The server caches AzAPI schemas, AVM data, and Conftest policies in ~/.azure-terraform-mcp/. If something looks stale, that's the first place to clear.
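Clearing it is just a directory delete; the server re-fetches schemas and policies on demand, so there's nothing to rebuild by hand:

```shell
# Remove the cached AzAPI schemas, AVM data, and Conftest policies.
# The server repopulates this directory the next time a tool needs it.
rm -rf "$HOME/.azure-terraform-mcp"
```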
Conftest auto-install. The setup_conftest_environment tool will offer to install Conftest for you using your platform's package manager (brew on macOS, apt on Linux, scoop or choco on Windows). It works well, but it'll prompt for confirmation, and the prompt comes back through the chat interface rather than your terminal.
Telemetry. The server sends anonymous usage telemetry to Azure Application Insights by default. There's no personal data collected, but if you'd rather opt out, set TELEMETRY_ENABLED=false in the env block of your mcp.json.
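The opt-out is just another entry in the env block of your mcp.json; a minimal fragment would look like this:

```json
"env": {
  "TELEMETRY_ENABLED": "false"
}
```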
Provider version awareness. The documentation tools fetch the latest provider docs by default. If your project is pinned to an older AzureRM version, the model might suggest arguments that don't exist in your version yet.
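If you are pinned, it helps to keep the constraint explicit in code (and to state it in your prompts) so there's no ambiguity about which docs apply. This is the standard required_providers block; the version number here is illustrative:

```hcl
terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      # Illustrative pin -- match this to whatever your project actually uses.
      version = "~> 3.100"
    }
  }
}
```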
My thoughts
I've been running this for a few weeks now across personal projects, and it's noticeably reduced the amount of time I spend correcting the model on Terraform syntax. The combination of grounded provider docs and AVM lookups means Copilot now writes Azure Terraform that actually compiles on first attempt more often than not, which wasn't true six months ago.
It's not a magic bullet. The model still makes architectural choices I disagree with, and the policy validation results need a human to interpret them properly. But as a piece of plumbing that makes Copilot meaningfully better at Azure-specific Terraform, it's well worth the small setup cost.