
Terraform Modules

divide and rule


Modules — What Are They?

If you’re familiar with any programming language or the principles of software development, you already know the benefits of modular programming — maintainability, reusability, and scalability.

Terraform adopts the same philosophy. By breaking down your infrastructure code into smaller, self-contained modules, you can build complex infrastructure in a clean, organized, and reusable way.

In short: Modular Terraform code is easier to maintain, extend, and test.


Sample Modular Terraform Project

Here’s a sample Terraform project structure showing how modules fit in:

.
├── backend.tf
├── main.tf
├── modules                 <- Container for all Terraform modules
│   ├── dynamodb
│   │   ├── main.tf
│   │   ├── outputs.tf
│   │   └── variables.tf
│   ├── ...
├── outputs.tf
├── terraform.tfstate
├── terraform.tfvars
└── variables.tf

You can reference modules in three main ways:

  1. Terraform Public Registry – for reusable community modules

  2. Private Registry – for internal organization modules

  3. Local Modules – for modules within your own repo
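As a sketch, the `source` argument distinguishes the three cases (the module names and the private-registry path below are placeholders, not real modules):

```hcl
# 1. Public Terraform Registry — <namespace>/<name>/<provider>
module "vpc" {
  source  = "terraform-aws-modules/vpc/aws"
  version = "~> 5.0"
}

# 2. Private registry — prefixed with the registry hostname and organization
module "network" {
  source  = "app.terraform.io/my-org/network/aws"
  version = "1.2.0"
}

# 3. Local module — a relative path; no version argument is allowed here
module "dynamodb" {
  source = "./modules/dynamodb"
}
```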


How to Reference a Module

Example reference in main.tf:

module "my-dynamodb-module" {
  source     = "./modules/dynamodb"
  aws_region = var.aws_region
}

Here, module is a reserved Terraform keyword. You define the module source (local or remote) and pass the variables the module expects — aws_region in this case. Note that the version argument is only valid for registry modules; a local module is always used at whatever version exists on disk, so specifying version for a local path is an error.

Modules can accept multiple inputs and expose outputs that can be reused elsewhere in your Terraform configuration.


A Simple Module Example

Let’s create a simple Terraform module that provisions a DynamoDB table.

Folder Setup

modules/
└── dynamodb/
    ├── main.tf
    ├── outputs.tf
    └── variables.tf

variables.tf

Define input variables for your module:

variable "aws_region" {
  type    = string
  default = "us-east-1"
}

variable "table_name" {
  description = "Name of the DynamoDB table"
  type        = string
}

variable "resource_tags" {
  description = "Project-specific tags for resources"
  type        = map(string)
}

Our module now takes three inputs: aws_region, table_name, and resource_tags.

Terraform supports both primitive types (string, number, bool) and complex types (list, set, map, object, tuple).
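As an illustrative (hypothetical) example of a complex type, several related settings can be grouped into one object-typed variable:

```hcl
# Hypothetical variable grouping table settings into a single object.
variable "table_settings" {
  description = "Structured settings for the table (illustrative only)"
  type = object({
    name         = string
    billing_mode = string
    tags         = map(string)
  })
  default = {
    name         = "messages"
    billing_mode = "PAY_PER_REQUEST"
    tags         = { project = "demo" }
  }
}
```

Callers then pass the whole object at once, and the module reads fields like `var.table_settings.name`.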


main.tf

Define the resource that the module will create:

provider "aws" {
  region = var.aws_region
}

resource "aws_dynamodb_table" "this" {
  name         = var.table_name
  hash_key     = "messageid"
  billing_mode = "PAY_PER_REQUEST"

  attribute {
    name = "messageid"
    type = "S"
  }

  tags = var.resource_tags
}

A note of caution: declaring a provider block inside a reusable module, as shown above, is considered a legacy pattern. A module that contains its own provider block cannot be used with count, for_each, or depends_on. The recommended approach is to declare providers only in the root module and, when needed, pass alternate configurations into a module with the providers meta-argument.
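A minimal sketch of that recommended pattern, assuming the module itself declares no provider block (the `west` alias and region are placeholders):

```hcl
# Default provider configuration for the root module.
provider "aws" {
  region = "us-east-1"
}

# An aliased, alternate configuration.
provider "aws" {
  alias  = "west"
  region = "us-west-2"
}

module "my-dynamodb-module" {
  source = "./modules/dynamodb"

  # Route the module's aws provider requirement to the aliased configuration.
  providers = {
    aws = aws.west
  }
}
```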

This example creates a DynamoDB table with a single string hash key and uses your provided tags.

You can always refer to the AWS provider documentation on the Terraform Registry for all available resource arguments.


outputs.tf

Outputs let you expose key values from a module for use elsewhere:

output "DynamoARN" {
  value = aws_dynamodb_table.this.arn
}

output "BillMode" {
  value = aws_dynamodb_table.this.billing_mode
}

output "TableName" {
  value = aws_dynamodb_table.this.name
}

You can now reference these in your root configuration as module.<name>.<output> — for example, with the module block from earlier:

module.my-dynamodb-module.DynamoARN

This pattern allows you to chain modules and build infrastructure incrementally while keeping your code base modular and reusable.
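As a hedged sketch of that chaining, a root-module snippet might feed the module's ARN output into an IAM policy (the module instance name and policy name here are placeholders):

```hcl
module "mytable" {
  source        = "./modules/dynamodb"
  table_name    = "messages"
  resource_tags = { project = "demo" }
}

# Use the exposed ARN from the module's outputs.tf in another resource.
resource "aws_iam_policy" "table_access" {
  name = "allow-table-access"
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect   = "Allow"
      Action   = ["dynamodb:GetItem", "dynamodb:PutItem"]
      Resource = module.mytable.DynamoARN
    }]
  })
}
```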


Bonus: Special Module Parameters

Terraform modules support some powerful optional arguments:

  • count and for_each — create multiple instances of a module from a whole number (count) or from a map or set of strings (for_each).

  • depends_on — enforce execution order when one module depends on another.

  • providers — pass specific (for example, aliased) provider configurations into a module.

These features give you extra control over complex, multi-module deployments.
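As an illustrative (hypothetical) example, for_each can stamp out one module instance per environment — the environment names below are placeholders:

```hcl
# One DynamoDB module instance per environment.
module "tables" {
  source   = "./modules/dynamodb"
  for_each = toset(["dev", "staging", "prod"])

  table_name    = "messages-${each.key}"
  resource_tags = { environment = each.key }
}

# Each instance's outputs are addressed by key, e.g.:
# module.tables["prod"].DynamoARN
```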


In Summary

Using modules makes infrastructure as code cleaner, reusable, and testable. You can isolate, test, and share modules across projects or teams, improving collaboration and maintainability.
