Serverless applications (sometimes called "lambdas" or "functions") frequently need to authenticate to an upstream service or API. This authentication might include credentials that talk to a database or an API key to issue a third-party request.
This often prompts the question: How do I safely and securely inject secrets into my serverless applications?
Because these applications tend to be stateless—they can terminate or auto-scale at any time—traditional methods for injecting secrets may not apply, and these tried-and-true approaches can create unnecessary risk given the architecture of serverless frameworks.
Here are five strategies for managing secrets in serverless applications, ranked from most to least secure.
1. Using identity and access management
Most cloud providers—and many on-premises solutions—offer robust identity and access management (IAM) controls. These controls allow you to grant your serverless application access to other resources without exchanging credentials.
This solution can even work across different providers. Suppose a cloud function on Google Cloud needs to access data stored in a private AWS S3 bucket. Typically you would generate an AWS access key pair and inject it into the cloud function.
Instead, you can create an OpenID Connect (OIDC) provider on AWS that allows Amazon to trust Google as an authentication provider. Then you grant the cloud function permission to access data in the private S3 bucket.
There is no need to generate access key pairs or to inject credentials, because the serverless app is authenticated and authorized using its own identity.
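As a sketch, the AWS side of this trust relationship can be expressed as an IAM role trust policy that accepts Google-issued identity tokens. The audience value below is a placeholder for your function's service-account client ID:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Federated": "accounts.google.com" },
      "Action": "sts:AssumeRoleWithWebIdentity",
      "Condition": {
        "StringEquals": {
          "accounts.google.com:aud": "YOUR_SERVICE_ACCOUNT_CLIENT_ID"
        }
      }
    }
  ]
}
```

At runtime, the cloud function fetches an identity token for its service account, exchanges it via `sts:AssumeRoleWithWebIdentity` for short-lived AWS credentials, and uses those to read the bucket. No long-lived access key pair ever exists.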
Unfortunately, using IAM for managing secrets is not always feasible. Not all services support IAM, and even fewer services support cross-provider IAM. In these cases, you will need to inject credentials directly.
2. Using a secrets manager
Many providers offer native secrets management solutions on their platform, such as AWS Secrets Manager or Azure Key Vault. These native secrets managers allow you to securely store and retrieve arbitrary values, with authentication and authorization backed by the provider's IAM solution.
Serverless applications can consume these secrets by calling the secret manager's API or by binding the secrets during deployment.
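Because a warm serverless instance can serve many invocations, it is worth caching the fetched value in memory rather than calling the secrets manager API on every request. Here is a minimal, provider-agnostic sketch; the `fetch` callable is a placeholder for a real SDK call such as AWS Secrets Manager's `GetSecretValue`:

```python
import time
from typing import Callable, Dict, Tuple


class SecretCache:
    """Cache secrets in memory so a warm instance does not hit the
    secrets manager API on every invocation.

    The actual fetch is injected as a callable, keeping this sketch
    provider-agnostic; in production it would wrap a provider SDK call.
    """

    def __init__(self, fetch: Callable[[str], str], ttl_seconds: float = 300.0):
        self._fetch = fetch
        self._ttl = ttl_seconds
        self._cache: Dict[str, Tuple[str, float]] = {}

    def get(self, secret_id: str) -> str:
        entry = self._cache.get(secret_id)
        now = time.monotonic()
        if entry is not None and now - entry[1] < self._ttl:
            return entry[0]  # warm instance: reuse the cached value
        value = self._fetch(secret_id)  # cold start or expired: call the API
        self._cache[secret_id] = (value, now)
        return value
```

The time-to-live bounds how stale a rotated secret can get on a long-lived warm instance; tune it to your rotation schedule.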
If your provider lacks a native secrets management solution, you may consider a third-party secrets manager such as HashiCorp Vault or CyberArk Conjur. Because these are long-running services that need to be maintained, upgraded, and patched, they are not the best solution to secure just a small number of serverless applications.
These tools are often better equipped for multi-cloud and hybrid cloud deployments, and they often include more functionality than a native secrets manager.
Secrets managers have many other benefits too, such as centralized access and audit logging, which help you better understand how secrets are being consumed in your environment. Sometimes using a secrets manager is not an option due to lack of legal approval or because you're blocked by an export law, in which case you will need to use one of the alternatives below.
3. Using an object store
While a robust secrets management solution is preferred, object storage may offer a lower barrier to entry, especially in development and staging environments where security requirements may be less strict. Most major providers, including Google Cloud Storage, AWS S3, and Azure Blob Storage, offer an object store, which allows you to store arbitrary blobs of data in a virtual filesystem-like architecture.
You can store secrets as objects in the object store and then download those objects in your serverless application during packaging or deployment, or on initial boot.
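The boot-time download step can be sketched as follows. The `download(bucket, key)` helper is hypothetical and injected so the example stays provider-agnostic; in production it would wrap an SDK call such as boto3's `get_object` or the google-cloud-storage client:

```python
import json
from typing import Callable


def load_secrets(object_url: str, download: Callable[[str, str], bytes]) -> dict:
    """Load a JSON blob of secrets from an object store at cold start.

    `download(bucket, key)` is a hypothetical injected helper standing in
    for a real object-store SDK call.
    """
    scheme, sep, rest = object_url.partition("://")
    bucket, _, key = rest.partition("/")
    if scheme not in ("s3", "gs") or not sep or not bucket or not key:
        raise ValueError(f"unsupported object URL: {object_url!r}")
    raw = download(bucket, key)  # single API call when the instance boots
    return json.loads(raw.decode("utf-8"))
```

Loading once at boot, rather than per request, also keeps the bandwidth costs discussed below under control.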
An object store is an option for storing secrets for serverless applications if—and only if—you properly configure IAM permissions. You can also enable auditing and logging on most object stores, giving you insight into how secrets are being consumed in your organization.
Failing to properly secure an object store, however, can be disastrous: misconfigured, publicly readable buckets are a recurring source of data-breach headlines.
Depending on your security requirements, you can also encrypt the data with an encryption key before writing it to the object store. Some providers offer this as part of their API—usually called customer-managed encryption keys (CMEK)—while others require you to write this functionality yourself.
The biggest drawback with using object storage is the permission model; it is rather difficult to truly secure the object store. A second drawback is that it can become costly at scale. In addition to the fixed monthly cost of storage, you are also responsible for bandwidth expenses when accessing the secret.
4. Using encrypted environment variables
If you are willing to forgo centralized auditing, logging, and management in favor of reduced costs and complexity, consider using encrypted environment variables. Before deploying the serverless application, encrypt all your secrets with an encryption key, preferably one backed by a key management service (KMS).
Once your secrets are encrypted, update your serverless application to read an encrypted secret from the environment variable, decrypt it, and keep the plaintext response in memory. If you are using a KMS, your serverless application needs permission to decrypt the ciphertext. This is usually an IAM permission.
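That read-decrypt-cache flow can be sketched as below. The `kms_decrypt` callable is an injected placeholder; in production it would wrap a KMS call (for example, AWS KMS `Decrypt`), and the function's IAM identity needs permission to use the key:

```python
import base64
import os
from typing import Callable, Dict

# Decrypted values live only in this process's memory, keyed by variable name.
_plaintext_cache: Dict[str, str] = {}


def decrypt_env(name: str, kms_decrypt: Callable[[bytes], bytes]) -> str:
    """Read base64-encoded ciphertext from an environment variable and
    decrypt it once per instance, keeping the plaintext in memory.

    `kms_decrypt` is an injected placeholder for a real KMS decrypt call.
    """
    if name not in _plaintext_cache:
        ciphertext = base64.b64decode(os.environ[name])
        _plaintext_cache[name] = kms_decrypt(ciphertext).decode("utf-8")
    return _plaintext_cache[name]
```

Caching the plaintext matters twice over here: it avoids paying for a KMS call on every invocation, and it keeps the decrypted value out of the environment itself.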
Encrypted environment variables are not free—KMS operations are billed per request—but moderate use will likely fall within most cloud providers' free tiers. The biggest drawback of this approach is that you lose central logging, auditing, and management for your secrets.
5. Using plaintext environment variables
While storing plaintext secrets in environment variables is simple, it comes with considerable security drawbacks. Any library or dependency running inside the application process has access to the environment, and there are active exploits in the wild today. It is trivial for a malicious dependency to read the process's environment and send that information to an attacker-controlled system.
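To see how low the bar is, here is everything a compromised dependency needs in order to harvest a function's environment. The exfiltration step is omitted; a real attack would send this string to an attacker-controlled endpoint:

```python
import json
import os


def harvest_environment() -> str:
    """Serialize every environment variable the process can see.

    Any imported package can run code like this at import time; if secrets
    are stored in plaintext environment variables, they are all included.
    """
    return json.dumps(dict(os.environ))
```

Two lines of standard-library code—no special permissions, no API calls—are enough to capture every secret the function was given this way.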
Furthermore, most serverless frameworks do not consider environment variables to be secure. Anyone with read or view access on the serverless application can easily retrieve its environment variables by looking in the web console or making an API call.
You should not store secret or sensitive information in environment variables in plaintext.
Evolution, not revolution
Secrets management for serverless applications is an evolving field, but ultimately these methods alone will not stop a determined attacker. Secrets are just one of many vectors attackers use to compromise systems. Serverless application authors also need to audit and secure their dependency trees, follow the principle of least privilege, and adopt a zero-trust posture.
Join Seth at DevSecCon Seattle (September 16-17), where he will dive more into "Secrets In Serverless".