How financial services can strike a balance with GenAI (2024)

With all the risks at play, it’s understandable for FIs to adopt an overly cautious posture. But like it or not, competitors are already using these tools, and as those tools start delivering tangible value (or savings), your customers may not wait for you to catch up.

For all the risks of early adoption, the risks of not acting are at least as great. It’s time to start training people in a broad-based way, creating opportunities for safe experimentation and use, and demonstrating the capacity to capture value.

To that end, we see five critical steps to getting started in the GenAI marathon, which include practices that can help manage the risks. All are non-negotiable, and some may need repeating.

Step 1: Ensure alignment with enterprise strategy

Whether organisations pursue bottom-up idea generation in a hackathon, top-down directives that emerge from an executive off-site, or a mix, leaders should align their generative AI strategy with the broader business strategy by ensuring they have clear answers to three questions:

  • What are the most important business objectives we wish to achieve through the use of GenAI?
  • What are the boundaries of our risk appetite in achieving those objectives?
  • What additional constraints do we have that derive from our whole-enterprise strategy, including within the areas of environmental, social and governance (ESG); brand; investor and regulatory relations; and alliances and partnerships?

Examples of business objectives include improving productivity, quality, compliance and risk management, or creating a new revenue stream. Your choice will depend on your organisation’s particular context and strategy. Consider, for example, a market leader with saturated share: GenAI applications that leverage scale to maximise efficiency and productivity might be the most attractive low-hanging fruit. Meanwhile, a neobank looking to win customers and make its mark might be more focused on applications that create compelling or distinct customer experiences or services.

The nature of GenAI, however, will often enable you to address many objectives at once. Automating the preparation of credit assessment and loan-verification information, for example, enhances productivity, but also likely improves quality, streamlines the customer and employee experience, and may even increase revenue and market share (depending on the state of the loan market).

Risk appetite, as a strategic consideration, is self-explanatory and, of course, is also context-specific and different for every organisation. However, for financial services, we would expect to see much more scrutiny and caution at this time with any AI that is customer facing or that affects regulatory and legal obligations. Even more caution is warranted for any fully digitised end-to-end process.

Both your objectives and risk appetite will be influenced by your current alliance and partner strategy, even if it was formulated without GenAI in mind. In an area as new and fast-moving as AI, there can be no presumptive partner choices, no matter how deep and long-lasting existing relationships may be, and it’s worth applying extra scrutiny to “sole sourcing” arrangements at this time. In our own firm, GenAI has been the catalyst for new partnerships for applications such as preparing legal briefs, contract review, and the summary and analysis of customer conversations.

No matter the business objective, the risk profile or how outwards facing your generative AI activities are, they should be consistent with all aspects of the enterprise strategy, including your growth strategy, shareholder story, customer brand promise and employee value proposition. That consistency needs to be obvious and explainable not just for specialists in tech, but for all senior leaders and the board, a point we discuss in greater detail in our recent primer on the implications of GenAI for directors.

Step 2: Ready the organisation with training, guard rails and protocols

The applications and use cases we’re seeing today are only the most obvious at this early stage of generative AI. Many will involve the kinds of low-volume tasks that have historically been too complex to automate, too infrequent to justify reengineering away and often too mundane for senior leaders to know much about. These are the grains of sand in the gears across FIs, and the reason that simplification, digitisation and transformation have been so hard to achieve.

This kind of innovation won’t come from the top—it will be led by those closest to the work. FIs must ensure that all employees have access to appropriate universal training, just as they do in other areas that pose risk, such as security, customer protection, data handling and privacy. GenAI can have a role to play in deploying this training and capability enhancement via conversational training systems and support bots that can assist employees with guidance as they go about their day-to-day jobs.

At the same time, the culture at many FIs will need to change to enable GenAI innovation. Around half of CEOs surveyed by PwC said that their culture doesn’t tolerate small-scale failures or encourage dissent and debate—and two-thirds of employees agreed with that dim outlook. Yet trial and error will be an essential part of building these capabilities.

However, we’re not suggesting unrestrained experimentation. Organisations will need to provide rules, frameworks and protocols to guide employees. Such guard rails serve to instil confidence in workers, offering clearly defined spaces that are open for exploration.

Finally, it’s worth thinking about how to prepare for GenAI deployment; the best approach will likely depend on the objectives being pursued. For repeatable, moderate-value use cases deep inside the organisation, we find that small, agile multidisciplinary teams (what we call “pods” in your “AI factory”) can create enormous value incredibly fast. By contrast, mission-critical and differentiating innovation in areas such as identity, digital currencies and embedded finance will require a cross-enterprise and broadly coordinated approach.

Step 3: Build tools for development, integration and operationalisation

Once the strategy, frameworks, training and capabilities are in place, you will need to select several critical tools and platforms on which teams can do the work. Here, it makes sense for FIs to start with sourcing (or building) the foundation models on which those tools and platforms are built. In that selection process, the choice between open-source and closed-source models should be considered, as should such factors as model size, portability, energy and water consumption (there may be reporting requirements in some territories), flexibility, stability, price, security, transparency, traceability and customisability.
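One common way to make this kind of multi-criteria selection explicit is a weighted scoring matrix. The sketch below illustrates the idea using a subset of the criteria named above; the model names, scores and weights are hypothetical placeholders, not recommendations.

```python
# Illustrative sketch: weighted scoring of candidate foundation models
# against selection criteria from the text. All names, 1-5 scores and
# weights are hypothetical placeholders to be replaced by your own.

CRITERIA_WEIGHTS = {
    "security": 0.25,
    "transparency": 0.20,
    "customisability": 0.15,
    "price": 0.15,
    "portability": 0.10,
    "energy_use": 0.15,  # lower consumption scored higher
}

candidates = {
    "open_source_model": {"security": 3, "transparency": 5, "customisability": 5,
                          "price": 4, "portability": 5, "energy_use": 3},
    "closed_source_model": {"security": 4, "transparency": 2, "customisability": 2,
                            "price": 2, "portability": 1, "energy_use": 4},
}

def weighted_score(scores: dict) -> float:
    """Combine the 1-5 criterion scores into a single weighted total."""
    return round(sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items()), 2)

# Rank candidates from highest to lowest weighted score.
ranked = sorted(candidates, key=lambda m: weighted_score(candidates[m]), reverse=True)
for model in ranked:
    print(model, weighted_score(candidates[model]))
```

The weights force an explicit conversation about trade-offs (for example, how much transparency is worth relative to price), which is the real value of the exercise.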

On top of every foundation model will sit the rest of the GenAI development stack. At this stage, teams must decide where to host the model and data development environment, and make choices on the development platform and supporting tools, interfaces with other systems, data protection, mirroring (if appropriate) and storage, as well as access and other controls.

Decisions about data access will be especially important, as a significant obstacle for large-scale AI deployment is the availability of high-quality data. Large “lakes” of unstructured data are both an opportunity and a risk for GenAI. On the one hand, models such as LLMs are adept at making connections and finding structure where it is not obvious. On the other hand, unstructured, unreliable and incomplete data introduces noise and creates gaps, which an LLM may try to fill in ways that could introduce errors and risk. A coherent and consistent generative AI data-management strategy is vital for FIs that want to maximise the potential benefits.
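A simple way to act on that point is to place a quality gate in front of the GenAI pipeline, so incomplete records never reach the model in the first place. The sketch below is a minimal illustration; the field names, thresholds and records are hypothetical.

```python
# Minimal sketch of a data-quality gate that screens unstructured records
# before they feed a GenAI pipeline, so gaps are rejected up front rather
# than filled in (perhaps wrongly) by an LLM. Field names and the length
# threshold are hypothetical examples.

REQUIRED_FIELDS = ["customer_id", "document_text", "source", "timestamp"]

def is_usable(record: dict, min_text_length: int = 50) -> bool:
    """Reject records with missing fields or too little text to be reliable."""
    if any(not record.get(field) for field in REQUIRED_FIELDS):
        return False
    return len(record["document_text"]) >= min_text_length

records = [
    {"customer_id": "C1", "document_text": "x" * 200,
     "source": "crm", "timestamp": "2024-01-05"},
    {"customer_id": "C2", "document_text": "",  # incomplete: empty text
     "source": "crm", "timestamp": "2024-01-06"},
]

usable = [r for r in records if is_usable(r)]
print(len(usable))  # only the complete record passes
```

In practice the gate would live inside the ingestion pipeline and log rejected records for remediation, which turns data quality from an invisible model risk into a measurable operational metric.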

Step 4: Embed responsible AI practices throughout the organisation

Given the rapid evolution of the GenAI landscape, rules, tools and guard rails that work today may become obsolete tomorrow. As a result, FIs need a holistic process and clear framework for establishing those guard rails, which includes overseeing and monitoring them, keeping them up to date, and doing it all in a manner consistent with the organisation’s approach to governance, accountability and transparency. This is necessary for any organisation, but for regulated entities like FIs, it’s absolutely critical not only to get it right but also to be able to demonstrate that it’s right.

Organisations have an abundance of questions to consider, which PwC’s Responsible AI Toolkit can help them navigate, including:

  • When projects are approved, who provides oversight and monitoring?
  • What diligence is required on existing frameworks and protocols to ensure they remain fit for purpose?
  • What additional rules are required for data on which model training might someday be done?
  • When and how should the risk of hidden bias, inaccuracy or private information leaks be assessed?
  • How will we know when external support is needed, and where it should come from?
  • How does all this interact with the overall approach to corporate risk management and governance, and who needs to do what?

Step 5: Select and prioritise use cases

Finally, with so many possible areas in which generative AI can help, such as business and strategic insight or risk management, how do FIs decide where to focus first? There are three vectors to consider: value, complexity and reusability.

Value will be indicated not only by the degree to which any activity or service is accelerated or transformed by GenAI but also by the importance of that activity to customers and the scale at which a GenAI solution can be applied.

Complexity involves the difficulty of developing or deploying the GenAI solution, as well as the ability to manage it safely in production. Activities that have lower inherent risk (e.g., because they are not customer facing, or because the service isn’t strategically critical or entirely new) are probably best to tackle first, at least for organisations in the early stages of AI maturity.

Reusability will be a function of whether delivering a use case supports the accumulation of assets and experience that teams can apply to subsequent use cases. At this time, few organisations have the kinds of libraries and tools that make such things as cloud-based application development much easier and faster than it was a decade ago. Those that build their resources first will create advantages that can grow exponentially—as solutions deployed today deliver savings to fund and tools to facilitate the deployment and scaling of future innovations.
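The three vectors above can be combined into a simple priority score to start a ranking conversation. The sketch below is one hypothetical weighting (favouring value and reusability, penalising complexity), and the use cases and 1-5 ratings are illustrative examples only.

```python
# Illustrative sketch of prioritising GenAI use cases along the three
# vectors from the text: value, complexity and reusability.
# Use cases, ratings and the weighting are hypothetical placeholders.

from dataclasses import dataclass

@dataclass
class UseCase:
    name: str
    value: int        # 1 (low) to 5 (high)
    complexity: int   # 1 (simple) to 5 (hard to build and run safely)
    reusability: int  # 1 (one-off) to 5 (builds reusable assets)

    def priority(self) -> float:
        # Favour high value and reusability; penalise complexity.
        # The 0.5 weight is a placeholder to be tuned per organisation.
        return self.value + self.reusability - 0.5 * self.complexity

use_cases = [
    UseCase("Loan-document summarisation", value=4, complexity=2, reusability=4),
    UseCase("Customer-facing advice chatbot", value=5, complexity=5, reusability=3),
    UseCase("Internal policy Q&A assistant", value=3, complexity=1, reusability=5),
]

# Rank from highest to lowest priority.
for uc in sorted(use_cases, key=UseCase.priority, reverse=True):
    print(f"{uc.name}: {uc.priority():.1f}")
```

Note how the high-value chatbot ranks last here: its complexity penalty reflects the point above that lower-risk, internal use cases are often the better first move for organisations early in their AI maturity.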
