A Low-ish Code Approach to Implementing an OAuth Client Credentials Flow in Salesforce

Photo by Patrick Fore on Unsplash

Before digging into the meat of the post, I’d like to take a moment to ask all of you trapped in self-quarantine today due to the COVID-19 pandemic to lift a virtual pint together in honor of St. Patrick’s day and my first ever blog post!

I recently ran across a situation at work where I was asked to create an integration between Salesforce and the Azure File Storage service.

I had already done a successful proof of concept in Postman using the two-legged client credentials OAuth flow. I assumed a similar approach in Salesforce would be straightforward. I was wrong.

Trying to follow Salesforce best practices of favoring “clicks” over code, I was planning to create a Named Credential to establish the integration. I initially chose OAuth as the authentication protocol, not knowing at the time that the Salesforce Named Credentials OAuth configuration does not support the client credentials flow.

Choosing OAuth as the authentication protocol forced me to specify an Authentication Provider. At that point, my only viable option was to select OpenID Connect, as Azure wasn’t one of the choices listed.

After googling around a bit, I stumbled across a blog post by Mikkel Heisterberg that provided a very detailed explanation of how to integrate Salesforce and Microsoft Azure using an OpenID Connect Authentication Provider in conjunction with a Named Credential. Score!

I tried configuring my Salesforce dev org as outlined in the blog post and, after trying all manner of combinations and permutations of different settings, still no luck. Finally, at my wit’s end, I reached out to Mikkel for help. He was very gracious and extremely responsive and, after providing him with my configuration details, he quickly identified where I was going wrong and turned around a working proof of concept for me in under 24 hours. I was thoroughly impressed.

To make a long story short (although at this point it’s probably too late to say that), I ultimately decided against this approach, primarily because it lacks support for the two-legged OAuth flow, which is designed for the machine-to-machine style of integration that best matched our use case. Instead, it supports only the three-legged flow, which requires an end user to approve access.

To get the integration to work properly using the Authentication Provider required creating a dummy user account in Azure, giving that user permissions to access the File Storage account, and pre-authorizing that dummy user by physically logging into the Azure account as the user when saving the configuration. It also required the additional step of configuring the Authentication Provider to request a refresh token. The refresh token would ensure that if the access token were to ever become invalid or expire, Salesforce would be able to obtain a new one on behalf of that dummy user. This would prevent the integration from failing unexpectedly, and the dummy user would not have to log in again once the initial integration was established.

This all felt a little kludgy to me, but I tried to weigh it against the benefits of a solution that is completely no-code and requires minimal configuration. Still, I had concerns. For example, what would happen if the dummy user’s password were to expire? Would it break the integration? The Microsoft Azure documentation on token revocation seemed to indicate that this would not be an issue if the app was configured as a confidential client. However, I didn’t have time to dig into this further or to test it thoroughly enough to be confident it wouldn’t cause us problems down the road, so I decided to fall back and look for another low/no-code approach that would allow me to use the client credentials flow.

I looked at the other Named Credential authentication protocol options. JWT looked promising, as it provides a more modern implementation of the client credentials OAuth flow, and Azure does support it. However, you can’t use the OOTB Salesforce JWT protocol with Azure, because Azure requires a custom header that the OOTB implementation doesn’t support. Implementing it in Salesforce would therefore require a non-trivial amount of coding.

Still not wanting to give up on the Named Credential, because I really liked the benefits of removing the configuration from the code and having Salesforce manage it for me, I looked at my remaining options: AWS Signature Version 4, Password Authentication, and No Authentication. AWS Signature Version 4 was not applicable to our use case because, well, we were not using AWS. Password Authentication is no longer considered a best practice. And No Authentication? I think it goes without saying that this wouldn’t be the best choice from a security perspective. What to do? What to do?

Well, I created an elegant hack, of course: two Named Credentials. The first uses Password Authentication (don’t judge yet) to handle the first leg of the flow, requesting the access token from Azure; the second uses No Authentication (no, really, don’t judge yet) to handle the actual Azure File Storage REST API calls. Used together, they achieve the client credentials flow I was looking for. There’s still a little code involved, but it’s mostly done through config, and it comes with the built-in benefits of using Named Credentials.

Here’s the basic outline of the config (and code) that I used to get all of this working.

The screenshot below shows my Salesforce config settings for the first Named Credential, i.e., the one that requests the access token from Azure. As I said above, it’s configured to use Password Authentication, when in reality it’s not doing password authentication at all. I configured it this way so that, later in the Apex code, I could leverage the two merge fields Salesforce creates on your behalf to set the client_id and client_secret request body parameters Azure requires to issue the token. It’s also nice because the Password field is hidden, so using it to store the sensitive client secret protects that value by default and keeps everything centralized in the Named Credential configuration. Note that to use this approach you also have to check the “Allow Merge Fields in HTTP Body” checkbox, or else the client id/client secret fields won’t be accessible from Apex. Obtaining the client id and secret values from Azure is a separate exercise left to the reader, but here’s a hint.
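For reference, here’s a rough sketch of what that first Named Credential looks like in Metadata API form. The tenant ID, label, and client id are placeholders rather than real values, and the field names assume the classic Named Credential metadata type:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch of the token-request Named Credential (all values are placeholders) -->
<NamedCredential xmlns="http://soap.sforce.com/2006/04/metadata">
    <label>Azure OAuth Get Token</label>
    <endpoint>https://login.microsoftonline.com/YOUR_TENANT_ID/oauth2/token</endpoint>
    <protocol>Password</protocol>
    <principalType>NamedUser</principalType>
    <!-- Username holds the client id; the hidden Password field (set in the UI) holds the client secret -->
    <username>YOUR_CLIENT_ID</username>
    <!-- Required so {!$Credential.Username} and {!$Credential.Password} resolve in the request body -->
    <allowMergeFieldsInBody>true</allowMergeFieldsInBody>
    <!-- Prevent Salesforce from adding a Basic auth header to the token request -->
    <generateAuthorizationHeader>false</generateAuthorizationHeader>
</NamedCredential>
```

The important bits are `allowMergeFieldsInBody`, which exposes the two merge fields to Apex, and disabling the generated authorization header, since we build the credentials into the body ourselves.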

The next screenshot shows the settings for the second Named Credential. No need to worry that it’s configured to use No Authentication: authorization for the API requests made through this Named Credential is handled in Apex, by adding the access token obtained via the Named Credential above to each request’s Authorization header.
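And a similar sketch of the second Named Credential in Metadata API form, again with a placeholder storage account name:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch of the API-call Named Credential; the storage account name is a placeholder -->
<NamedCredential xmlns="http://soap.sforce.com/2006/04/metadata">
    <label>Azure Blob Storage</label>
    <endpoint>https://YOUR_STORAGE_ACCOUNT.blob.core.windows.net</endpoint>
    <protocol>NoAuthentication</protocol>
    <principalType>Anonymous</principalType>
</NamedCredential>
```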

And now, on to the coding part. At the bottom of the post is the Apex code that I wrote to prove out the concept.

In the first section of the code, I make a request to the Azure token endpoint using the first Named Credential. The key points are that I use the Username/Password merge fields defined on the first Named Credential to pass in the client id and secret values, and that I pull the name of the resource I want access to from the endpoint (URL) specified on the second Named Credential, so I don’t have to hard-code it. (I’m using SOQL for that because it seemed to be the only way to access it from the code.)

In the next code section, I parse the JSON response that I receive back from Azure and pull out the access token. No magic there.
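As an aside, if the access token is the only field you need, the streaming parser could be swapped for something more compact using `JSON.deserializeUntyped`. A sketch:

```apex
// Deserialize the whole token response into a generic map
Map<String, Object> payload =
    (Map<String, Object>) JSON.deserializeUntyped(res.getBody());
// Azure returns the token under the standard OAuth 'access_token' key
String accessToken = (String) payload.get('access_token');
```

The streaming parser avoids materializing the whole response, but for a payload this small either approach is fine.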

And finally, in the third section, I use the second Named Credential to make a test call to the Azure File Storage REST API to get the list of storage containers. And I pass the access token that I got from making the first request in the authorization header of the second request. And voila! It worked.

//Request Azure OAuth token
HttpRequest req = new HttpRequest();
req.setMethod('POST'); // token requests must be POSTs with a form-encoded body
req.setHeader('Content-Type', 'application/x-www-form-urlencoded');
req.setEndpoint('callout:Azure_OAuth_Get_Token');
req.setBody('client_id={!$Credential.Username}'
    + '&client_secret={!$Credential.Password}'
    + '&grant_type=client_credentials'
    + '&resource=' + [SELECT Endpoint FROM NamedCredential WHERE DeveloperName = 'Azure_Blob_Storage'].Endpoint);
HttpResponse res = new Http().send(req);
System.debug(res.getBody());
 
//Parse the OAuth access token out of the JSON response
String accessToken = '';
JSONParser parser = JSON.createParser(res.getBody());
while (parser.nextToken() != null) {
    if (parser.getCurrentToken() == JSONToken.FIELD_NAME && parser.getText() == 'access_token') {
        parser.nextToken();
        accessToken = parser.getText();
    }
}
System.debug(accessToken);
 
//Send Request to Azure Storage to list storage containers
req = new HttpRequest();
req.setMethod('GET');
req.setEndpoint('callout:Azure_Blob_Storage?comp=list');
req.setHeader('Content-Type', 'application/json');
req.setHeader('x-ms-version', '2017-11-09');
req.setHeader('Authorization', 'Bearer ' + accessToken);
res = new Http().send(req);
System.debug(res.getBody());

Anyway, I hope you have found this post informative. And I hope that it helps someone out there to navigate the sometimes murky waters of Salesforce authentication, as well as to find the right balance between “clicks” and code.