Machines are listening to your earnings call

Here's how a mining company learned to speak their language.

Carole Cable
Oct 09, 2019

When the Associated Press announced in 2014 that its stories on quarterly earnings would be automatically written by software, the move made headlines. Today, AI-driven reporting is making headlines in a much more literal sense—by writing them. Roughly one out of every three articles posted on Bloomberg News now uses some sort of automated technology, and the practice has become so commonplace among leading news organisations that there’s an award for the best use of big data and AI in the industry. Earnings calls, given their reliance on numbers and fairly predictable format, remain a staple of automated journalism.

Investors are also using AI to scrutinise earnings calls. One way they do so is by analysing a call in real time, producing a “sentiment score” for the words and phrases spoken by the company’s representatives: are the words positive, negative, or neutral? Advanced versions of these algorithms are trained in the nuances of financial language, so rather than simply scoring individual words, they consider them in context (recognising, for example, that a “sharp drop in overhead” is a positive statement despite containing words an average algorithm would read as negative).
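To make that concrete, here is a minimal sketch of context-aware scoring in Python. Everything in it (the word list, the phrase overrides, the function name) is a toy stand-in invented for illustration; the finance-tuned models described above are statistical and far richer.

```python
# Toy, context-aware sentiment scorer for earnings-call language.
# The word list and phrase overrides are illustrative stand-ins,
# not the vocabulary of any real finance-tuned model.

WORD_SCORES = {
    "growth": 1, "strong": 1, "record": 1,
    "drop": -1, "decline": -1, "loss": -1, "weak": -1,
}

# Phrase-level overrides: words that read as negative in isolation
# can be positive in financial context (falling costs, for instance).
PHRASE_SCORES = {
    "drop in overhead": 1,
    "decline in costs": 1,
}

def sentiment_score(text: str) -> int:
    """Score a passage: positive > 0, negative < 0, neutral == 0."""
    text = text.lower()
    score = 0
    # Apply phrase overrides first and blank them out, so their
    # component words are not double-counted below.
    for phrase, value in PHRASE_SCORES.items():
        if phrase in text:
            score += value
            text = text.replace(phrase, " ")
    for word in text.split():
        score += WORD_SCORES.get(word.strip(".,?!"), 0)
    return score

print(sentiment_score("We saw a sharp drop in overhead."))  # 1 (positive)
print(sentiment_score("We saw a sharp drop in revenue."))   # -1 (negative)
```

Production systems use trained language models rather than hand-made lists, but the principle is the same: context, not just keywords, drives the score.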

A 2018 National Investor Relations Institute publication reported that investors are also using lie-detection techniques employed by the CIA to analyse calls, and that AI-driven approaches might soon assess a leader’s tone of voice, intonation, or even pauses in answering a question. You can question the merit of automatically generated data points like these, but not their influence: this kind of data often informs automatically executed trades, which The Wall Street Journal estimates account for 85% of all stock market trades. BarclayHedge, which compiles data for investors, found that 56% of hedge fund managers use AI to inform their trading decisions.

The bottom line: machines are listening to earnings calls, reporting on them, and shaping how stakeholders, especially investors, react to what’s said on them. That makes it surprising that most companies still approach their earnings calls as they did a decade ago, even though their audience has changed drastically.

But some companies are now preparing for those calls with the same tools that newsrooms and investors are using to listen to them: natural language processing, artificial intelligence, and machine learning.

Earlier this year, I worked with a global mining company to incorporate these tools into their earnings call preparation and analysis.

Working with data scientists at Brunswick, the communications consultancy I work for, we studied the company’s past earnings calls as well as those of its peers. This gave us a data-driven lay of the land: we could see not only what the machines had heard, but also how the markets had reacted.

Among the most important pieces of data our analysis generated was an overall “trend line” of the company’s earnings call. This line, in essence, traced the structure of the call and the clarity of its storytelling.
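The details of how such a trend line is built aren’t public, so purely as an illustration: one simple way to draw one is a rolling average of sentence-level scores across the transcript, reusing the toy sentiment_score from the sketch above.

```python
# Illustrative "trend line": rolling average of sentence-level sentiment
# across a transcript, reusing the toy sentiment_score() defined earlier.
# How the actual model builds its trend line is not public.

def trend_line(sentences, window=5):
    """Return one smoothed sentiment value per sentence."""
    scores = [sentiment_score(s) for s in sentences]
    smoothed = []
    for i in range(len(scores)):
        recent = scores[max(0, i - window + 1): i + 1]
        smoothed.append(sum(recent) / len(recent))
    return smoothed

transcript = [
    "We delivered record growth this quarter.",
    "Costs saw a sharp drop in overhead.",
    "One division reported a loss.",
]
print(trend_line(transcript, window=2))  # [2.0, 1.5, 0.0]
```

Plotted over the course of a call, a line like this shows where the story builds, where it sags, and where the tone turns.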

We incorporated this information, along with other research we conducted, into their preparation for an upcoming call. The data helped inform what their prepared remarks should address, how they could be structured, and which words would best communicate their story. We then ran drafts of the prepared remarks through the algorithm to see how they performed: how did the call strategy compare with the call performance? Was the positive story they wanted to tell actually perceived as positive by the machines? The answer was yes.
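As a hypothetical illustration of that drafting loop (the drafts below are invented, not our client’s), the same toy scorer can compare alternative framings of a passage before the call:

```python
# Hypothetical pre-call check: score alternative framings of a passage
# to see which one the machines would read as more positive.
# Drafts are invented; sentiment_score() is the toy scorer above.

drafts = {
    "draft_a": "Margins saw a decline this quarter.",
    "draft_b": "We drove a decline in costs while holding margins.",
}

for name, text in drafts.items():
    print(name, sentiment_score(text))
# draft_a -1  (reads negative)
# draft_b 1   (reads positive)
```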

Obviously, we couldn’t change the numbers and facts they had to report, but we could experiment with structure and language. It wasn’t about letting an algorithm determine what was said, but informing our judgment with a new perspective.

Leaders may be understandably sceptical. What can changes to language or structure really achieve when you have a tough story to tell or tough numbers to deliver? Others might see this approach as almost disingenuous—like you’re trying to game the system.

Both misconstrue what this technology and preparation are designed to do. They can’t transform bad news into good news. But as our client experienced, when you fully understand what your audience (machine and human) is hearing, you can tell a better story. Good news can be told a little more clearly; a rough quarter can be put into context. This kind of preparation arms senior leaders with a data-driven sense of what the investor reaction is likely to be after the call ends. It helps companies catch up in the analytics arms race with the market.

At its core, this is an exercise in modern storytelling, using cutting-edge technology to help ensure that what you say is what your (increasingly digital) audience hears.

Carole Cable is a speaker at the upcoming International Mining and Resources Conference + EXPO (IMARC), taking place 29-31 October in Melbourne. 
