Davos: Geopolitical risk, AI, ‘Basel endgame’ keep bankers up at night

At the 2024 World Economic Forum, business leaders grappled with how to handle flaring geopolitical risks and whether, and how, the risk of AI-driven disinformation can be tempered.

Despite there being a lot to talk about at Davos when it comes to the global economy – indeed, plenty to heave a good sigh of relief over – that conversation was drowned out by rising anxiety about the geopolitical risks looming in 2024 and the uncertainty they add to policymaking.

The risks also carry real, potential consequences for organizations that conduct business with, and rely on, each other as vendors and agents; it is hard to plan when shifting geopolitical factors, and the outcomes of the major elections being held this year in the world's 10 most populous countries, are so uncertain.

Wars are raging in Europe and the Middle East and cold wars are simmering between much of the West and China.

Bombs are hitting cargo ships and forcing vessels to divert from important shipping routes.

In the US, Donald Trump could be elected president in November, an outcome that arguably carries more significance than many other head-of-state elections, given both America's importance on the global stage and his disdain for traditional multinational alliances.

One could also add to this list the fact that climate change is tangibly disrupting transportation, supply chains and the cost of doing business, and that every jurisdiction seems to have a different level of willingness and set of rules for tackling it.

“The majority of respondents (54%) anticipate some instability and a moderate risk of global catastrophes, while another 30% expect even more turbulent conditions.”

The World Economic Forum Global Risks 2024 Report

“We’re starting this year with the longest list I ever recall of potential disruptions,” said Christian Ulbrich, chief executive of global real-estate company JLL. “You really have to run your organization in an extremely agile way so that you can react immediately.” 

Political and social concerns have become business concerns to a degree that seems especially pronounced this year, particularly with inflation proving less of a worry than expected.

Survey results – expect turbulence

Just released this month, the World Economic Forum Global Risks 2024 Report sheds much-needed light on the key risks and issues the global economy is likely to face this year and in the years ahead.

According to the report, “the majority of respondents (54%) anticipate some instability and a moderate risk of global catastrophes, while another 30% expect even more turbulent conditions. The outlook is markedly more gloomy over the 10-year time horizon, with nearly two-thirds of respondents expecting a stormy or turbulent outlook”.

The survey took into account the views of nearly 1,500 leaders across a variety of sectors, including business, academia, civil society and government, as well as more than 200 thematic leaders.

And according to 53% of respondents, AI-generated false information and disinformation ranks among the highest risks over the next two years.

Geopolitical risk is melding with the AI disinformation risk in many ways, too: Censorship in certain countries, cyber intrusions and technological power competitions among countries (and businesses) are escalating worries about the use of AI tools and whether the drive to be first and best is subverting safety considerations.

AI has been used to spread (with social media’s help) misleading images of the wars raging right now and also to try to affect legitimate election processes.

The Forum’s report suggests that governments and businesses join together toward common goals when they can, working jointly on projects and stated goals that benefit society as well as agreeing on safeguards for technology.

And one approach discussed at Davos was regulating the way artificial intelligence works from the start, crafting policies that evaluate and audit algorithms to ensure the algorithms themselves aren’t misusing data in ways that could lead to unlawful outcomes.

In the US, for example, the Consumer Financial Protection Bureau has proposed that quality-control assessments be established for algorithms that evaluate a property’s collateral value in mortgage applications. And last April, four US agencies issued a joint pledge to use their laws and regulations to monitor artificial intelligence (AI) and to uphold the core principles of fairness, equality and justice in the rollout and use of AI products and services.

More traditional Davos concerns

To be sure, bank CEOs meeting in private at the World Economic Forum on Wednesday aired concerns about the competitive risks from fintech firms and private lenders, and complained about onerous regulations, a source familiar with the matter told Reuters.

One CEO, also speaking to Reuters before the meeting, voiced concern that geopolitical risks could negate the benefits of interest rate cuts. According to Reuters, the complaint about overly onerous regulations reflects a broader industry pushback against rules affecting business.

In one example, Wall Street banks urged US banking regulators to overhaul a draft rule hiking bank capital, seeking to water down the “Basel endgame” proposal that bankers say will hurt the economy.

The regulators’ proposal for new bank regulation – dubbed the “Basel endgame” – calls for a 20-25% increase in capital requirements for the largest banks. The comment period on the proposal ended on January 16.

Regulators want to know that banks hold sufficient capital to cover customers’ deposits even if the loans they have made aren’t repaid or their investments drop in value – thereby reducing the risk that a bank failure triggers system-wide financial instability.
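To make the scale of the proposed change concrete, the back-of-the-envelope arithmetic can be sketched as below. The figures for risk-weighted assets and the baseline capital requirement are purely hypothetical placeholders, not actual Basel parameters; only the 20-25% increase comes from the proposal as reported.

```python
# Illustrative only: how a 20-25% rise in required capital would change a
# bank's loss-absorbing buffer. RWA and the 10% baseline are hypothetical.
risk_weighted_assets = 1_000  # $bn, hypothetical large bank
current_requirement = 0.10    # 10% of RWA, illustrative baseline

for increase in (0.20, 0.25):
    new_requirement = current_requirement * (1 + increase)
    extra_capital = (new_requirement - current_requirement) * risk_weighted_assets
    print(f"+{increase:.0%} -> requirement {new_requirement:.1%} of RWA, "
          f"extra capital ${extra_capital:.0f}bn")
```

Under these stylized numbers, a 20% increase means holding $20bn more capital against the same $1tn of risk-weighted assets, and a 25% increase means $25bn more – which is why, as critics note below, the banks argue the money tied up as capital could otherwise fund lending.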

But critics worry that such a sharp increase in capital could slow economic growth by reducing lending, among other concerns.