Presentation

Vault as a Security Platform - Future Direction

Get an overview of HashiCorp Vault and the problems it solves in the first half of this talk. In the second half, learn about the latest additions to the Vault platform as well as the future roadmap.

Speakers

  • Vinod Muralidhar
    Sr. Director, Product Management, Secure Product Line

Transcript

Hello everyone. My name is Vinod Muralidhar. I'm the director of product management for Vault. First of all, thank you all for taking the time to come watch our presentation today. While I get the privilege to stand in front of you and talk, there are a lot of people from our engineering teams, product teams, education, product marketing, and design — so many different people — who are involved in making this possible. I want to take a moment to acknowledge and thank them before we start.

In today's presentation, I'm going to give you a bit of an overview of what we've been doing in the Vault team over the last six months to a year. Then I'll continue with what's about to come — a bit of the future direction.

Multi-Cloud Security in a “Zero Trust” World

Before I start, I wanted to give you a continuation of what Armon and Mitchell talked about in our keynote earlier today, specifically the concept of multi-cloud security in a Zero Trust world.

The traditional solutions for safeguarding infrastructure are rooted in the need to secure based on IP addresses or hosts. But we are talking about a dynamic infrastructure that makes host-based or IP-based security hard to apply. Applications talking to databases, users accessing hosts and services, servers talking across clouds: these are all different dimensions where security becomes of paramount importance.

Armon and Mitchell — you heard them talk about the four pillars of multi-cloud security in a Zero Trust world: machine authentication and authorization, machine-to-machine access, human-to-machine access, and human authentication and authorization.

But across these four pillars is one consistent theme, which is identity-driven controls. Now that the world has moved to dynamic infrastructure, the concept of IP-based security no longer holds, and identity-based security is the one true way to secure this infrastructure.

At HashiCorp, our security model is predicated on the use of identity to secure access. For any machine or human user to communicate and contact each other, identity serves as the mechanism we use to know whether or not they have access. Today, I'll be talking specifically about Vault and how Vault helps us solve the challenges of machine-to-machine access, as well as data protection in the Zero Trust, multi-cloud world.

HashiCorp Vault enables enterprises to centrally store, access, and distribute dynamic secrets, and also to secure your workloads from a data protection perspective. I'll be talking more about the many different use cases that we have. I'll give you a bit about the origin stories of how Vault came into existence — how some of these use cases came into existence. Hopefully you'll find that useful.

Vault — A Quick Primer

In terms of where Vault is today, this slide helps you understand the scale: trillions of secrets served and millions of open source Vault downloads. By the way, we have our origins in open source and continue to invest heavily in our open source functionality.

70% of the top 20 banks in the U.S. use Vault — 14 out of 20. That says they really place their trust in Vault to deliver security for their mission-critical applications. We're talking about everything from banks running mission-critical applications to insurance companies to stock markets. All of these depend on Vault to provide security for their services.

Secrets Management – Secrets Store

The origin story for Vault — and I always like telling this story — is when Armon and Mitchell, our founders, were building Terraform. That's where Vault originated. They were writing the code for Terraform as the founding team. They realized at some point that there are so many services that need to communicate with each other — and therefore need to authenticate and authorize each other for those communications to be secure.

They also found that a lot of secrets were either being put into config files, written into GitHub, or embedded directly in code — and how do we avoid that? Vault essentially was a tool that was built by our founders as they were writing Terraform. That's exactly the use case that most of our customers and users start off with — Vault as a secrets store; a way to centrally store, access, and distribute your secrets across applications, systems, and infrastructure.

This diagram on your screen shows a very simple example. There's a web server trying to access a database. To do so, the web server either has a username and password, an API key, a certificate — or something — that allows it to say, "I am this web server and I need to talk to you, the database."

At this point, though, rather than storing that secret information in its code or config, Vault allows you to take that information, put it in Vault, and then access it as you need it. I need to call the database for something. I go to Vault and say I need to access DB1. Vault looks at you, the web server, takes your identity, and says does this web server have access and permissions to connect to this database? If it does, it is provided access, which the web server can then use to connect to the database.

This is a simple use case — a secret store; we call this the KV secrets store. But the important part is what goes on behind it: the secrets you put in there are encrypted, stored very securely, and everything is API-accessible and developer-friendly. There are a lot of other features that I'll be talking about as part of the Vault platform. This is the core use case that most of our customers start off with.
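
To make that concrete, here is a minimal sketch using Vault's Go API client (github.com/hashicorp/vault/api). It assumes a KV version 2 engine mounted at `secret/`, a secret previously written at the hypothetical path `webapp/db`, and the Vault address and token supplied through the standard `VAULT_ADDR` and `VAULT_TOKEN` environment variables:

```go
package main

import (
	"fmt"
	"log"

	vault "github.com/hashicorp/vault/api"
)

func main() {
	// DefaultConfig picks up VAULT_ADDR; NewClient picks up VAULT_TOKEN.
	client, err := vault.NewClient(vault.DefaultConfig())
	if err != nil {
		log.Fatal(err)
	}

	// Read a secret from a KV v2 engine mounted at "secret/".
	// KV v2 nests the payload under a "data" key.
	secret, err := client.Logical().Read("secret/data/webapp/db")
	if err != nil {
		log.Fatal(err)
	}
	if secret == nil {
		log.Fatal("no secret found at secret/data/webapp/db")
	}

	data := secret.Data["data"].(map[string]interface{})
	fmt.Println("db username:", data["username"])
}
```

In a real deployment the web server would log in with one of Vault's auth methods to obtain its token rather than having one pre-provisioned, but the read itself looks the same.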

Secrets Management – Dynamic Secrets

But very soon, they also extend us into a use case called dynamic secrets. This is the same scenario of a web server trying to access the database — but rather than having a long-lived username and password or an API key that just sits there (where, if you lose that API key or set of credentials, you're at risk of being exploited), Vault allows you to make this entire transaction an ephemeral one.

The web server goes to Vault and says, I need to access the database. Vault will — at that point — create a dynamic set of credentials for that database and then grant access for that interaction. It's the simple exchange you see in the diagram: the web server requests credentials, Vault creates the credentials and grants access, which then allows the web server to connect to that database. The advantage here is that you reduce the attack surface area because now there are no long-lived credentials.

The second part is that, because Vault manages this, you set a time to live — a lease — for that exchange. As soon as your access is done and the lease has completed, Vault is able to completely stop access. It can revoke the token that was granted to you or delete the credentials that were created on that database.
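
As a rough sketch of that exchange with the Go API client, assuming a database secrets engine mounted at `database/` and a hypothetical role named `webapp-role` that has already been configured against the target database:

```go
package main

import (
	"fmt"
	"log"

	vault "github.com/hashicorp/vault/api"
)

func main() {
	client, err := vault.NewClient(vault.DefaultConfig())
	if err != nil {
		log.Fatal(err)
	}

	// Ask Vault for short-lived credentials from the database secrets
	// engine. Vault creates a fresh user on the database for this request.
	secret, err := client.Logical().Read("database/creds/webapp-role")
	if err != nil {
		log.Fatal(err)
	}

	fmt.Println("username:", secret.Data["username"])
	fmt.Println("password:", secret.Data["password"])

	// Every dynamic secret carries a lease. When the lease expires or is
	// revoked, Vault deletes the credentials it created on the database.
	fmt.Println("lease ID:", secret.LeaseID)
	fmt.Println("lease duration (seconds):", secret.LeaseDuration)
}
```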

Encryption as a Service – Data Protection

Here's a second origin story. Vault was essentially a secret store, and we've been very successful at it. We were called out as a best-in-class secrets management solution by Gartner. But as we were building the secrets management solution, the platform had to do a lot of cryptography — the things that allow us to store these secrets in a very secure fashion.

Once you do that, you have to take care of the keys you're using to encrypt these passwords and secrets. Very soon, the founders' team — or the early Vault team — realized there are lots of other developers trying to do things like this. They're trying to secure things — not necessarily inside Vault — but in other systems. Why don't we surface our complicated crypto and key management solutions to the wider developer world?

That's exactly what we did. We launched the [Transit Secrets Engine](https://www.vaultproject.io/docs/secrets/transit/index.html) as part of Vault Open Source, which allows you to do Encryption as a Service.

In the same example, the web server wants to write data to the database. But this is sensitive information — credit card numbers or Social Security numbers — so the web server would like to write this information in an encrypted fashion. It sends that information to Vault. Vault will encrypt it for you — without storing any of the sensitive data in Vault — and send it back to the web server, which can then take it and store it in the database.

It is pass-through encryption via the Transit engine; Vault takes care of all of the key management and key material needed, as well as the crypto work of encrypting and decrypting this material as you send it through us.
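
Here's what that round trip might look like with the Go client: a sketch assuming a Transit engine mounted at `transit/` with an encryption key named `webapp-key` (a hypothetical name) already created:

```go
package main

import (
	"encoding/base64"
	"fmt"
	"log"

	vault "github.com/hashicorp/vault/api"
)

func main() {
	client, err := vault.NewClient(vault.DefaultConfig())
	if err != nil {
		log.Fatal(err)
	}

	// The transit engine expects plaintext to be base64-encoded.
	plaintext := base64.StdEncoding.EncodeToString([]byte("4111-1111-1111-1111"))

	// Encrypt with the "webapp-key" key. Vault returns ciphertext but
	// never stores the data itself.
	enc, err := client.Logical().Write("transit/encrypt/webapp-key", map[string]interface{}{
		"plaintext": plaintext,
	})
	if err != nil {
		log.Fatal(err)
	}
	ciphertext := enc.Data["ciphertext"].(string)
	fmt.Println("ciphertext:", ciphertext)

	// Decrypt the same value; the result comes back base64-encoded.
	dec, err := client.Logical().Write("transit/decrypt/webapp-key", map[string]interface{}{
		"ciphertext": ciphertext,
	})
	if err != nil {
		log.Fatal(err)
	}
	raw, _ := base64.StdEncoding.DecodeString(dec.Data["plaintext"].(string))
	fmt.Println("plaintext:", string(raw))
}
```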

Encryption as a Service – Key Management

We also help in a data protection workflow when the systems themselves do their own encryption. Vault allows for systems that do their own encryption to offload their key management to Vault.

Same example — web server is writing data to a database. The database does its own encryption — or it could be writing to a disk, and the disk does its own full disk encryption or transparent data encryption.

The problem there is that the crypto part is taken care of, but you're still responsible for the keys themselves. That's where our solution comes in: a secrets engine we call the KMIP Secrets Engine, because it supports the KMIP protocol — the Key Management Interoperability Protocol — to handle key management for external systems. We support a lot of different systems, and I'll talk about that in our roadmap, as well as the things we've done.

Vault Platform Capabilities

I've given you an overview of what Vault is and what Vault does — and our origin from a secrets management solution, all the way to being a dynamic secrets management solution, as well as a data protection solution.

But that's not all. Vault is not a security tool that you use for a small problem. It is a security platform. The Vault platform's core has capabilities that make all of these use cases more secure, available, performant, scalable — and offers things like business continuity.

Some of the examples are laid out here — and like the rest of my talk — everything here is only snippets of information. You can find a lot of information on our platform websites, both our learn platform, as well as vaultproject.io.

Unified Identity

In a multi-cloud scenario, we're talking about a single user having one set of credentials on AWS, another set of credentials on GitHub, and another set on their Active Directory domain. Vault is able to combine all of these into a single entity, and you are able to do very robust policy and permissions management based on that entity. It's important to have this concept of unified identity.

Secure Plugins

Vault also has a very modular architecture — we have the secure plugin architecture. This means that, while we have a core that we built, you're also able to plug in your own secrets engine. We know of customers who built their own secrets engine because they have a custom database or something else that they would like Vault to manage.

Disaster Recovery

We also provide a lot of enterprise capabilities — things like disaster recovery, providing business continuity so that when your Vault cluster — your primary — goes down, you have a secondary to take its place via a failover mechanism.

Performance Replication

You're also able to do things like performance replication. This is horizontal scaling of your infrastructure by providing two Vault clusters, each of which can serve a different region, for example.

Namespaces

Vault in large enterprises is usually deployed as a service: customers will have multiple application teams that are using Vault and a central team that manages Vault. We provide features like Namespaces that allow for both secure isolation and self-governance.
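
As a small illustration, the Go API client can scope its requests to a namespace. This sketch assumes a namespace named `team-a` (hypothetical) has already been created by the central team, with its own KV engine mounted at `secret/`:

```go
package main

import (
	"fmt"
	"log"

	vault "github.com/hashicorp/vault/api"
)

func main() {
	client, err := vault.NewClient(vault.DefaultConfig())
	if err != nil {
		log.Fatal(err)
	}

	// Scope all requests from this client to the "team-a" namespace.
	// Paths, policies, and secrets engines inside a namespace are
	// isolated from other teams' namespaces.
	client.SetNamespace("team-a")

	// This read resolves against team-a's own KV mount, not the root
	// namespace's.
	secret, err := client.Logical().Read("secret/data/app-config")
	if err != nil {
		log.Fatal(err)
	}
	if secret != nil {
		fmt.Println(secret.Data["data"])
	}
}
```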

Updates and Future Direction

From here on, I wanted to tell you about what we've been doing over the last few months and the themes of what they are. Like I was saying, 14 out of the top 20 U.S. banks are using Vault — and this extends to a lot of different verticals, by the way. Customers trust us with their most important sensitive information and their infrastructure.

Improving Ease of Use

We want to continue to focus on the ease of operating Vault and on building that platform to be secure, resilient, and robust. That's going to be a continued theme of how we build our roadmap.

Ecosystem and Native Integrations

We also realize we are one cog in the wheel when it comes to your infrastructure. While we like to think Vault is the center of everything, we also know that as a DevOps admin — as somebody who's running this infrastructure — Vault is only one of the many pieces you have to take care of.

For that reason, we're building up our ecosystem and being first-class citizens on the platforms our customers have standardized on — whether it's cloud providers or runtime environments. We want to make sure that Vault has a native integration and is a first-class citizen on those platforms.

Data Protection Workflows

We also want to focus on these data protection workflows. Like I was saying earlier, our Transit secrets engine has been our data protection mechanism for the longest time. But in the last year, we've introduced an Advanced Data Protection module that offers a lot more in that space. I'll be talking more about that.

Platform

On this slide, there are a lot of features listed. In the platform, for example, we focused on things like monitoring. Again, as an operator who's running Vault, the most important thing is being able to monitor how Vault is operating — we'll talk about that in more detail — as well as providing integrated storage.

We're also improving our replication — the way we do replication and how we surface it to the operator. There's a lot more that we did in the platform, and I'll be talking about some of it in more detail.

Ecosystem

When it comes to the ecosystem, we understand it's not just about being forward- and future-looking. Yes, we want to be the cutting-edge solution for secrets management — we want to work on Kubernetes and things like serverless in the future. But at the same time, knowing that your infrastructure is rooted in an on-prem environment, you have things like Active Directory, OpenLDAP, or Microsoft SQL Server to support. Those are the things we know you have to deal with in reality.

Our roadmap for the ecosystem constantly has one leg in the future, while also making sure the infrastructure suits your needs today and not just tomorrow. We know that will help you transition into the future as you evolve and make your journey toward the cloud.

Data Protection

Lastly, data protection. The ADP module was released last year, and we've been making a lot of improvements to that. I'll be talking in more detail about those.

Highlights and Roadmap Details

Let me start with the platform pieces.

New Splunk App

In Vault 1.5, we introduced a Splunk app for telemetry and audit log data. The Splunk app includes sample dashboards for both DevOps and security use cases. All of the metrics that we surface in dashboards like this are available in our open source offering for anybody to build dashboards on. But we wanted to show the possibilities — and our recommendations — for how to monitor Vault.

You see a list of dashboards there. The summary one will show you the cluster metrics. All of these — by the way — you can filter based on either Namespaces, auth methods, or particular policies and things like that. 

The ops metrics cover cluster health, request handling time, operation failures, leadership changes, memory usage, replication metrics — things like that. The usage metrics cover the number of tokens and entities, the auth methods used to create tokens, and tokens used — again, filtered by those different dimensions I was talking about.

For the storage environment — whether it's Consul or integrated storage — we provide a lot of metrics around that as well.

Vault with Integrated Storage

The next important one is Vault and integrated storage. Traditionally, Vault offers pluggable storage, which means we allow you to use any storage you want — but for Vault Enterprise, we support Consul as a backend storage option. That means you are required to run two systems for your deployment of Vault.

Integrated storage offers storage within Vault and eliminates the need to set up, manage, and monitor a separate storage system. That's an important thing we introduced — it shipped as part of 1.4. But we continue to improve on it by adding things like — in the future — cloud auto-join, which allows for auto-discovery of peers, especially useful when IP addresses are not static anymore.

We will also introduce a snapshot agent — a command that starts a process that takes snapshots and saves them locally or pushes them to external storage. We're not just working on making things easier and simpler for you, but continuing to build on top of that as well.

Vault and Kubernetes

I wanted to jump quickly into the ecosystem. A lot of our customers have adopted Kubernetes. It's something they'd been asking us about. The middle of last year is when we started our journey of making Vault and Kubernetes first-class partners — giving them a native integration.

First, customers asked us how to install and run Vault on Kubernetes. Doing YAML files and custom config was really hard, so we started by building a Helm chart. We followed up with what we call the Vault-K8s binary, which is aimed at Vault-unaware applications. When you have legacy applications being migrated to a Kubernetes Pod environment, how do you ensure that they are able to use Vault without having to change their code to call Vault APIs? Vault-K8s is a sidecar approach to injecting secrets into your Pod without having to change the underlying code of those systems.

Later, in March of this year, we introduced the Helm chart for Vault Enterprise, and then followed it up in the 1.5 release with support for the OpenShift container platform. In the future, we will continue to enhance that by adding support for the CSI driver, as well as a Red Hat-certified container image.

A lot of improvements are being made. But the idea is — and I think we saw this on one of the earlier slides — there were 600,000-plus downloads of this binary in just the last few months. It's something that we've embarked on, and we will continue to build upon.

Data Protection

The last set of slides I want to talk about is our foray into data protection. The initial launch of our Advanced Data Protection module happened with what we call the KMIP Secrets Engine, which is a way for you to do external key management. I'll talk about that in the next slides.

Format Preserving Encryption

In Vault 1.4, in March of this year, we introduced the concept of a Transform Secrets Engine. The scenario here is that you have PII — sensitive information — that needs to be cryptographically protected while preserving essential properties of the data, for example its format. This helps minimize cost, reduce complexity, and maintain compliance with industry standards like PCI DSS, where you want the credit card information to be stored as a 16-digit number but fully encrypted.

Masking

In Vault 1.4, we introduced the Transform Secrets Engine, which offers format-preserving encryption — that's the visual you see there — as well as masking: being able to mask your credit card number as a series of identical characters.
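
As a rough sketch of how an application might call this, assuming Vault Enterprise with the ADP module, a Transform engine mounted at `transform/`, and a hypothetical role `payments` configured with a format-preserving transformation named `ccn-fpe`:

```go
package main

import (
	"fmt"
	"log"

	vault "github.com/hashicorp/vault/api"
)

func main() {
	client, err := vault.NewClient(vault.DefaultConfig())
	if err != nil {
		log.Fatal(err)
	}

	// Encode a credit card number with a format-preserving transformation.
	// The role ("payments") and transformation ("ccn-fpe") are assumed to
	// have been configured on the transform engine ahead of time.
	enc, err := client.Logical().Write("transform/encode/payments", map[string]interface{}{
		"value":          "4111-1111-1111-1111",
		"transformation": "ccn-fpe",
	})
	if err != nil {
		log.Fatal(err)
	}
	encoded := enc.Data["encoded_value"].(string)
	fmt.Println("encoded:", encoded) // still formatted like a card number

	// Decode it back to the original value.
	dec, err := client.Logical().Write("transform/decode/payments", map[string]interface{}{
		"value":          encoded,
		"transformation": "ccn-fpe",
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("decoded:", dec.Data["decoded_value"])
}
```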

Tokenization

In the future, we are introducing tokenization. It's a similar idea — it replaces sensitive values with unique token values — but here the token is a random set of characters unrelated to the original value. This is, again, something a lot of our customers want, especially ones with compliance requirements around the original value not being recoverable, or not being able to use an encryption or crypto format; they would rather use tokenization, which is a more random format.

Enterprise Key Management

I talked to you about this very briefly. For systems that do their own encryption, we offer a way to manage their keys using the KMIP protocol. This was introduced last year, and we did a bunch of certifications this year with NetApp, VMware, MongoDB, and others.

In the future, we plan to extend this to be a key management solution for cloud providers as well. In our upcoming release, you will see support for Azure Key Vault — being able to store the root of trust for your key management needs with Vault while your keys are used in Azure Key Vault.

Summary and Conclusion

I know that was a lot of information. We've talked about a lot today. I wanted to summarize by saying the Vault roadmap — essentially how we think about building this product — is focused entirely on end-to-end workflows and making them simpler and easier for both our operators and our developers.

It's also about building on the ecosystem — again, knowing that we're a cog in the wheel, a small piece in that larger infrastructure — and doubling down on our investment in Advanced Data Protection. And, as you heard in the keynote, it's about offering a first-class managed service experience with Vault on HCP — our HCP Vault launch, which happens today.

That's all I had for you today. Thank you for taking time to listen to me today. I hope that you have a really good conference. Thank you so much.
