terraform-test
by hashicorp

terraform-test is a practical guide for writing and running Terraform tests with .tftest.hcl files, run blocks, assertions, mocks, and CI-friendly workflows. Use it to validate module outputs, resource arguments, conditional logic, and plan or apply behavior before merge.
This skill scores 83/100 because it gives agents a clear, reusable workflow for Terraform testing, with concrete triggers, examples, and CI guidance. For directory users, that means it is worth installing if they need help authoring .tftest.hcl files, running plan/apply tests, or setting up mocks and CI pipelines, though it is still somewhat specialized and version-sensitive.
- Strong triggerability: the description explicitly covers .tftest.hcl files, run blocks, assertions, mocking providers/data sources, and troubleshooting.
- Good operational clarity: the skill body includes core concepts plus linked references for mocks, CI/CD, and full examples.
- High agent leverage: examples and pipeline snippets reduce guesswork for unit, integration, and mock-based Terraform test workflows.
- Some guidance is version-dependent, especially mock providers requiring Terraform 1.7.0+, so users need to check compatibility.
- The skill is test-focused and may not help much outside Terraform testing workflows or for broader infrastructure design questions.
Overview of terraform-test skill
terraform-test is a Terraform testing skill for writing .tftest.hcl scenarios, checking module behavior, and validating infrastructure logic without guessing at syntax or workflow. It is best for engineers who want a practical terraform-test guide for test files, run blocks, assertions, and mocks, especially when the goal is to make Terraform changes safer before merge.
What terraform-test is good for
Use the terraform-test skill when you need to verify outputs, resource arguments, conditional logic, or environment-specific behavior. It is especially useful for module authors, platform teams, and reviewers who want repeatable checks instead of manual plan inspection.
Where it fits in the Terraform workflow
This skill fits after terraform init and terraform validate, and before or alongside CI execution. It helps turn a rough configuration intent into explicit test cases that can run in plan mode or apply mode depending on what you need to prove.
Key differentiators
The main value of terraform-test is that it centers Terraform-native tests rather than generic prompt advice. It covers test structure, assertion patterns, mock provider usage for Terraform 1.7+, and CI-friendly execution, so users can move from “I think this should work” to a concrete test file faster.
How to Use terraform-test skill
Install and open the right files
Install with npx skills add hashicorp/agent-skills --skill terraform-test. Then read SKILL.md first, followed by references/EXAMPLES.md for a full test suite pattern, references/MOCK_PROVIDERS.md for mocked unit tests, and references/CI_CD.md when you need pipeline execution.
Give the skill a testable goal
Strong prompts name the module, the behavior, and the expected outcome. For example: “Write a .tftest.hcl file for a VPC module that checks public subnet count, private subnet routing, and output values in plan mode.” That is better than “add tests,” because the skill can map the request to run blocks and assertions immediately.
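To make that concrete, here is a sketch of the kind of file such a prompt might produce. The module interface is assumed for illustration: a `public_subnet_count` variable, an `aws_subnet.public` resource, and an `aws_vpc.main` resource are hypothetical names, not part of any real module.

```hcl
# tests/vpc_plan.tftest.hcl — sketch; variable and resource names are assumptions
variables {
  cidr_block          = "10.0.0.0/16"
  public_subnet_count = 2
}

run "public_subnet_count_matches_input" {
  command = plan

  assert {
    condition     = length(aws_subnet.public) == var.public_subnet_count
    error_message = "number of public subnets does not match public_subnet_count"
  }
}

run "vpc_cidr_matches_input" {
  command = plan

  assert {
    condition     = aws_vpc.main.cidr_block == var.cidr_block
    error_message = "VPC CIDR block does not match the cidr_block variable"
  }
}
```

Assertions here reference argument values that are known at plan time; computed attributes (IDs, ARNs) may still be unknown in plan mode and are better checked in apply-mode runs.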
Use the right input shape
The skill works best when you provide Terraform version, provider constraints, module inputs, and what must be proven. If you want mock providers, say so and confirm Terraform 1.7+; if you want real integration coverage, include the target cloud and any credentials or CI assumptions.
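When mocks are in scope, a mock_provider block (Terraform 1.7+) replaces real provider calls. The resource name and the mocked `arn` default below are illustrative assumptions, not a real module's interface.

```hcl
# tests/vpc_mock.tftest.hcl — requires Terraform >= 1.7; names are assumptions
mock_provider "aws" {
  mock_resource "aws_subnet" {
    defaults = {
      arn = "arn:aws:ec2:us-east-1:123456789012:subnet/subnet-mock"
    }
  }
}

run "unit_test_with_mocked_aws" {
  # apply runs against the mock, so no real infrastructure is created
  command = apply

  assert {
    condition     = aws_subnet.public[0].arn != ""
    error_message = "mocked subnet arn should be populated"
  }
}
```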
Start with a workflow, not a blank file
A practical terraform-test usage flow is: identify the behavior, choose plan or apply, decide whether mocks are allowed, then write one run block per scenario. Read references/EXAMPLES.md for the overall test layout, then adapt the variables, assertions, and filenames to your repository conventions.
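The "one run block per scenario" step above might look like the following sketch, with a fast plan-mode check alongside a slower apply-mode round trip. The `environment` variable and output are hypothetical.

```hcl
# sketch; variable and output names are assumptions
run "plan_defaults_are_sane" {
  command = plan

  assert {
    condition     = var.environment != ""
    error_message = "environment must be set"
  }
}

run "apply_round_trip" {
  command = apply

  variables {
    environment = "test"
  }

  assert {
    condition     = output.environment == "test"
    error_message = "environment output should echo the input"
  }
}
```

Run blocks execute in file order, so later apply-mode runs can build on state produced by earlier ones.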
terraform-test skill FAQ
Is terraform-test only for module testing?
No. It is strongest for modules, but it also helps with root configurations, output validation, provider behavior checks, and CI test execution. If you need Terraform-native verification, the skill is a good fit.
When should I not use terraform-test?
Skip it if you only need a one-off terraform plan explanation or if your stack cannot run Terraform tests in CI. Also avoid mock-provider patterns when you are on Terraform below 1.7, because that part of the workflow will not apply.
Is terraform-test easier than writing prompts by hand?
Usually yes, because it narrows the task to Terraform’s actual test syntax and file structure. A generic prompt may produce broad advice; the terraform-test skill is aimed at generating usable test cases, especially for run blocks and assertions.
Does terraform-test work for Code Generation tasks?
Yes, terraform-test for Code Generation is useful when you want generated test files that match a module’s interface and expected behavior. The main boundary is that generated tests still need real inputs, realistic assertions, and a clear decision about plan versus apply coverage.
How to Improve terraform-test skill
Provide concrete module facts
Better inputs lead to better tests. Include variable names, required outputs, resource names, provider aliases, and any invariants you care about, such as “public subnets must be 2” or “instance type must default to t3.micro.”
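Invariants stated that precisely translate almost directly into assertions. A sketch, assuming hypothetical resource names `aws_subnet.public` and `aws_instance.app`:

```hcl
# sketch; resource names are assumptions, conditions mirror the stated invariants
run "module_invariants_hold" {
  command = plan

  assert {
    condition     = length(aws_subnet.public) == 2
    error_message = "public subnets must be 2"
  }

  assert {
    condition     = aws_instance.app.instance_type == "t3.micro"
    error_message = "instance type must default to t3.micro"
  }
}
```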
Tell the skill what can be mocked
The most common quality boost is clarifying whether provider calls should be mocked or real. For terraform-test install decisions, this matters because mocks reduce credential needs, speed up unit tests, and let apply-style runs execute without creating real infrastructure, but their fabricated values can hide provider-specific behavior that only real integration runs catch.
Separate unit, integration, and regression cases
Ask for distinct test scenarios instead of one large file when the behaviors differ. A clean terraform-test guide usually divides fast plan-mode checks from slower integration checks, which makes CI simpler and the failure signal easier to read.
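One way to realize that split is by file, so CI can select suites with `terraform test -filter`. The filenames and the assertion below are assumptions about a possible repository layout, not a prescribed convention.

```hcl
# tests/unit.tftest.hcl — fast plan-only checks, safe to run on every push
# (slower apply-backed checks would live in a separate tests/integration.tftest.hcl)
run "defaults_hold" {
  command = plan

  assert {
    condition     = aws_instance.app.instance_type == "t3.micro"
    error_message = "unexpected default instance type"
  }
}
```

The fast CI stage can then run only this file with `terraform test -filter=tests/unit.tftest.hcl`, leaving the integration file to a later, gated stage.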
Iterate from failures, not assumptions
After the first run, refine the assertions that were too weak, too broad, or tied to unstable values. If a test fails on computed attributes, ask for a more stable check; if a module change is intentional, update the expected condition rather than broadening the test until it stops meaning anything.
