In the previous article of our Salesforce Financial Services Cloud series, we covered the basics of the Application in the data model and the need for some custom build. Now that we have an Application, we need somewhere to hold that data, specifically the customer data we have captured, and to understand where it lives.
Primarily we are going to talk about the ingestion of data into your org and the hydration of the platform. There are two main patterns to cover: in the context of applications we are often dealing with customer (or prospect) data, and your pattern will be determined by whether or not Salesforce is the master for that data. There are many considerations around this; however, an industry shape seems to be forming within the UK whereby Corporate banking and Wealth/Investments use Salesforce as the primary source for this data, whereas in Retail banking the customer/prospect is mastered by an incumbent system. This distinction seems to be driven by a couple of factors:
Volume of records – there is usually a much greater volume of personal customers in retail banking, and moving or synchronising that data is costly, particularly where a large tranche of legacy systems all rely on it.
The relative maturity in Retail Banking – volume in itself is hardly a showstopper, but Salesforce's Financial Services Cloud, or FSC, is (outside of the US) a newer player, and most retail banks have significant incumbent systems for holding customer data (some of them decades-old mainframe builds) that are challenging to move away from. So the question is often not "can we?" but "should we?" Given the costs of any such exercise, it may simply not yet be worth the upheaval until the product itself provides a killer use case; and it seems highly unlikely that Salesforce would introduce a feature that required it to be the master of the data, since that would limit its value compared to features available through integrations.
Therefore, we have reached a position where we are highly likely to be integrating with external systems, so we can look at methods for doing so. We will focus on a couple of less well-known (and less used) methods rather than the full spectrum – there is already an excellent article on the Salesforce architect site covering that.
Salesforce have continued to open up the integration tools in the low-code space (not least with MuleSoft Composer), and this trend will continue.
External Services
Allows you to connect your org to an external API using declarative tools, importing an OpenAPI specification via the External Services Wizard. You do need a named credential in place (for authentication purposes), but after that, as long as you know the URL of the schema (or have a JSON copy of it), you can click your way to the integration and start using it in your org. One note specific to security, however, is the use of certificates. When you are connecting to external services (say Experian, Equifax, etc.) you will need to consider how you manage certificates and keys – that is a separate topic, which Salesforce do cover.
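To make concrete what the wizard actually ingests, here is a minimal, entirely hypothetical OpenAPI 3.0 schema of the kind you might register. The service name, path, and fields (a "Credit Check" endpoint returning a score) are invented for illustration; a real bureau API would publish its own specification.

```python
import json

# A minimal, hypothetical OpenAPI 3.0 document. The operationId becomes
# the invocable action name once the schema is registered in the org.
schema = {
    "openapi": "3.0.0",
    "info": {"title": "Credit Check Service", "version": "1.0.0"},
    "paths": {
        "/creditscore/{customerId}": {
            "get": {
                "operationId": "getCreditScore",
                "parameters": [
                    {
                        "name": "customerId",
                        "in": "path",
                        "required": True,
                        "schema": {"type": "string"},
                    }
                ],
                "responses": {
                    "200": {
                        "description": "The customer's credit score",
                        "content": {
                            "application/json": {
                                "schema": {
                                    "type": "object",
                                    "properties": {
                                        "score": {"type": "integer"}
                                    },
                                }
                            }
                        },
                    }
                },
            }
        }
    },
}

# Serialise to the JSON you would paste into the wizard when you hold a
# copy of the schema rather than a live URL.
print(json.dumps(schema, indent=2))
```

Note that the authentication details are deliberately absent from the schema itself: the named credential supplies the endpoint base URL and credentials, keeping secrets out of the specification.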
We've focused on these methods specifically because they are point-and-click, and therefore offer a great combination of speed and security. They are also reusable: once created, you can use them within any of your Flows, with no need to build custom Lightning Web Components to surface data from an API on the glass. That makes them valuable within Financial Services and well positioned to meet our use cases of reading and updating customer or prospect data.
Next time we will get back to core FSC capabilities and look at the Leads and Referrals functionality, as well as options for holding prospect data (specifically Leads vs Accounts).