HashiCorp solutions engineer Lance Larsen has worked with Vault Enterprise customers that have very low latency requirements for their encryption needs. Hear a story about one company that was able to use Vault encryption-as-a-service at a rate of 20K requests per second.
We had a customer approach us in the credit- and risk-monitoring industry. Their big challenge was finding an encryption-as-a-service platform that could actually meet their scale and their needs.
They're a large Hadoop user. They serialize and deserialize a lot of very sensitive data and use it to make actionable decisions about their users in real time: whether they're at risk. They looked at managed cloud solutions, but given the rate limiting and the latency relative to where their Hadoop clusters were actually running, they weren't confident they could benefit from a remote key solution, where developers don't have access to keys protected by a KMS, combined with file system encryption at the HDFS level.
So they came to us with this challenge and put a pretty big RPS number in front of us: 15,000 to 20,000 requests per second, with an end goal of getting to 50,000 to 60,000 RPS. That seems like a tall order, but Vault is extremely performant in the way it offers encryption services. In the Phase 1 rollout, we worked with them to tune their cluster and worked with their development teams on the best ways to leverage Vault's APIs and the community SDKs in our ecosystem. And in Phase 1 of that deployment, we hit our target of 20,000 requests per second.
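The pattern described here is Vault's transit secrets engine: applications send plaintext to Vault's encrypt endpoint and get ciphertext back, so the encryption keys never leave Vault. A minimal sketch of the request body that endpoint expects — the transit API requires the plaintext to be base64-encoded. The key name "app-key" and the address in the comments are hypothetical, not from the talk:

```python
import base64
import json

# Vault's transit engine encrypts via POST /v1/transit/encrypt/<key-name>,
# and the plaintext field in the JSON body must be base64-encoded.
# The key name "app-key" below is a placeholder for illustration.

def transit_encrypt_payload(plaintext: bytes) -> str:
    """Build the JSON body for a transit encrypt request."""
    return json.dumps({"plaintext": base64.b64encode(plaintext).decode()})

body = transit_encrypt_payload(b"ssn=123-45-6789")

# A client would then send it with the usual token header, e.g.:
#   curl -H "X-Vault-Token: $VAULT_TOKEN" \
#        -d "$BODY" http://127.0.0.1:8200/v1/transit/encrypt/app-key
# Vault returns a ciphertext like "vault:v1:..." -- developers never
# touch the underlying key material.
```

In practice the community SDKs wrap this call, but the shape of the request is the same: base64 in, versioned ciphertext out.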
And we actually worked directly with that customer to help them reach their longer-term roadmap, working arm in arm with them to drive our performance standby node feature, which now allows encryption as a service to be scaled out horizontally across the cluster, in theory infinitely.
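Performance standbys are a Vault Enterprise feature: standby nodes service read-only requests, including transit encrypt and decrypt, instead of forwarding everything to the active node. They are on by default on licensed clusters; a sketch of the relevant server config knob, with placeholder storage and listener values:

```hcl
# Vault Enterprise server config (paths and addresses are placeholders).
storage "raft" {
  path    = "/opt/vault/data"
  node_id = "node-a"
}

listener "tcp" {
  address       = "0.0.0.0:8200"
  tls_cert_file = "/etc/vault/tls/cert.pem"
  tls_key_file  = "/etc/vault/tls/key.pem"
}

# Performance standbys are enabled by default on licensed Enterprise
# clusters; set this to true to opt out.
disable_performance_standby = false
```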
So for them, a huge win. Developers don't have access to the underlying keys, and the data is protected not only at the file system level in Hadoop but also in the actual blob store. And they got the performance to keep driving business value: safely making good real-time decisions for their customers while protecting their data.