
Announcing the Public Beta of HCP Packer

The free public beta of HCP Packer lets DevOps teams track and automate build updates across their HashiCorp Packer and Terraform workflows.

Today, we are announcing the public beta of HCP Packer. This offering is free, and lets DevOps teams track and automate build updates across their Packer and Terraform workflows. In this post, we’ll show you some of the ways that HCP Packer can help improve the machine image management process throughout your organization.

HashiCorp Packer enables teams to create identical machine images for multiple clouds from a single source configuration. The single source is important because these images are built on top of base images, such as ones that provide an operating system or system-level security tooling. But as images are provisioned across deployments, it can be difficult to figure out who maintains what, how images have changed over time, where they are deployed, and how to update them. This is especially problematic when performing critical system updates or resolving security issues.

HCP Packer addresses those issues by helping DevOps teams automate build management across provisioning pipelines. To be clear, it’s not just “Packer in the cloud.” HCP Packer lets your team attach custom metadata to your existing Packer images, which is then used to track how those images are used across other images and throughout deployments.

This is a critical part of any multi-cloud environment because it helps you track the lifecycle of images across clouds, and automate image updates across the Packer and Terraform lifecycle.

»Track the Lifecycle of Images Across Clouds

You can’t standardize what you can’t track. By connecting your existing Packer builds to HCP Packer, you will be able to see pertinent metadata about an image, such as who maintains it and its associated version control repos. Critically, HCP Packer also tracks every iteration of that image. With this information available via the UI or API, your team can track the downstream builds that leverage this image iteration across cloud environments. If an image is used by multiple teams within an organization, you’ll know exactly where updates will need to be made.

The image below shows how HCP Packer tracks images and their iterations across clouds.

HCP Packer tracks downstream builds

»Automate Image Updates Across the Packer and Terraform Lifecycle

Understanding what builds are impacted by configuration changes is helpful. The next step is making this information actionable. How can you ensure that all of an image’s downstream builds are appropriately updated, especially in the case of security incidents? If teams want to use the most stable version of an image, how can you ensure that they will always be able to use the appropriate iteration?

HCP Packer comes with a powerful feature called channels. With channels, DevOps teams can automate changes dynamically across child images and entire provisioning pipelines. Simply create channels whose names indicate the stability and nature of a build, then pin a specific iteration of an image to a channel. Any build that references the channel name will use that image iteration. When a more stable version of the image emerges, pin the new iteration to the channel; the next time you run packer build or terraform apply, the build will automatically pick up the new image. Here’s how that works:

»Building Images From a Base Image

Organizations often build all of their images on top of one or a few base images across deployments. These are often called “golden images,” and may include a specific operating system or critical security tools like HashiCorp Vault. HCP Packer introduces a new data source for Packer images, allowing your teams to reference these images dynamically within child images’ Packer configurations. Here is an example of how a child image can be built on top of a golden image that is being tracked in HCP Packer:

data "hcp-packer-iteration" "base-image" {
  bucket_name = "learn-packer-hcp-golden-base-image"
  channel     = "latest"
}

data "hcp-packer-image" "base-image" {
  bucket_name    = data.hcp-packer-iteration.base-image.bucket_name
  iteration_id   = data.hcp-packer-iteration.base-image.id
  cloud_provider = "aws"
  region         = "us-east-2"
}

source "amazon-ebs" "marketing-layer-2" {
  source_ami        = data.hcp-packer-image.base-image.id
  force_deregister  = true
  instance_type     = "t2.small"
  ssh_username      = "ubuntu"
  ami_name          = "custom-secondary-image-redis-server"
}

Referencing golden images with data from HCP Packer ensures a higher level of standardization across your organization’s deployments.
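Because the channel is the only point of indirection, promoting builds from experimental to stable iterations is a one-line change in the data source. (The “production” channel name below is illustrative; channel names are whatever your team defines in HCP Packer.)

```hcl
data "hcp-packer-iteration" "base-image" {
  bucket_name = "learn-packer-hcp-golden-base-image"
  channel     = "production"  # was "latest"; child builds now follow whichever
                              # iteration is pinned to "production"
}
```

No other part of the child image’s configuration needs to change.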

»Integrating with Terraform Provisioning Pipelines

Recently, we announced that Packer templates can now be written in HashiCorp Configuration Language (HCL). With this change, your teams can use the same familiar language and syntax across both the multi-cloud infrastructure you provision and the images built on top of it. HCP Packer takes this a step further, allowing you to reference and update images within your provisioning pipelines, reducing manual work for you and your team.
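For reference, a standalone Packer template written in HCL looks like the following. All names and the source AMI ID here are illustrative placeholders, not values pulled from HCP Packer:

```hcl
# A minimal Packer template in HCL: one builder source plus a build block.
source "amazon-ebs" "ubuntu" {
  ami_name      = "example-ubuntu-image"
  instance_type = "t2.micro"
  region        = "us-east-2"
  source_ami    = "ami-0123456789abcdef0"  # placeholder AMI ID
  ssh_username  = "ubuntu"
}

build {
  sources = ["source.amazon-ebs.ubuntu"]
}
```

The HCP Packer data sources shown below replace the hardcoded source_ami with a dynamic lookup.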

Using the HCP provider for Terraform and HCP Packer channels, your organization can ensure that every provisioning pipeline uses the appropriate images at all times. Simply reference HCP Packer channel names with the provider’s data sources, and use Terraform Cloud to trigger a new Terraform run every time an image is updated. When the following HCL is included in a Terraform configuration, the provisioning pipeline will always use the “production” iteration of an Ubuntu Packer image:

data "hcp_packer_iteration" "ubuntu" {
  bucket_name = "learn-packer-ubuntu"
  channel     = "production"
}

data "hcp_packer_image" "ubuntu" {
  bucket_name    = data.hcp_packer_iteration.ubuntu.bucket_name
  iteration_id   = data.hcp_packer_iteration.ubuntu.ulid
  cloud_provider = "aws"
  region         = "us-east-2"
}

resource "aws_instance" "ubuntu" {
  ami           = data.hcp_packer_image.ubuntu.cloud_image_id
  instance_type = "t2.micro"
  ## ...
}

The configuration above shows these channel names being referenced through the HCP provider for Terraform.
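Note that the hcp_packer_* data sources require the HCP provider itself to be configured. A minimal setup might look like this; here we assume credentials are supplied through the HCP_CLIENT_ID and HCP_CLIENT_SECRET environment variables rather than written into the configuration:

```hcl
terraform {
  required_providers {
    hcp = {
      source = "hashicorp/hcp"
    }
  }
}

# With no explicit arguments, the HCP provider reads credentials from the
# HCP_CLIENT_ID and HCP_CLIENT_SECRET environment variables.
provider "hcp" {}
```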

»Getting Started with HCP Packer

Try HCP Packer today and introduce more automation to your builds. To get started, sign up for HCP Packer for free and check out the Packer tutorial collection on HashiCorp Learn. You will learn to integrate HCP Packer into your existing images and update these base images across child images and your Terraform provisioning pipelines.

If you are a Terraform Cloud Business customer and would like to discuss using HCP Packer at scale, contact your HashiCorp Customer Success Manager.
