Fintech pros announce stay of robot executioner’s hand on ABS jobs

By GlobalCapital
26 Feb 2019

To a standing-room-only crowd of trembling ABS professionals either hoping for a glimpse of the awesome future that artificial intelligence and machine learning promise for the finance industry, or awaiting the proclamation of a death sentence for their jobs, fintech panellists offered bittersweet consolation: the ABS industry still needs humans. For now, at least.

Our future machine-managers still need a guiding human hand to deliver meaningful benefit to the finance industry, and to securitization in particular, in the areas where they are now being loosed.

At a panel on blockchain technology, T-Rex founder and CEO Benjamin Cohen said that T-Rex, an enterprise solutions provider, has rating agency and issuing clients that no longer need to communicate with one another to structure deals successfully, because the ratings methodologies are built into the structuring methodologies. However, the firm still deals mostly in esoteric ABS, where variations in ratings and structuring methodologies mean human bankers are still needed. Asked whether that variation meant structurers would still be needed in the future, Cohen responded by asking whether the timeline in question was one year or five. In one year, he said, structurers should still have their jobs.

At another panel on artificial intelligence and machine learning, panellists warned that AI acolytes under the impression that more data equates to a more robust model output have been forsaken by their data deities.

Enthusiasm for the quick and sure coming of our glorious fintech future can often lead businesses to implement technologies and fintech strategies before they know what they are using them for, which panellists agreed is forbidden.

“Good data trumps algorithms,” said FICO senior principal scientist Gerald Fahner. Erica Dorfman, head of finance and operations at Tally, agreed, saying that she had seen instances of underwriters putting as much data as possible into their models, sometimes rendering results inaccurate. “You won’t learn anything from whether a person writes their name in cursive or in block letters,” she said. “There are those that have tried out that idea and it is not accurate.”

Fahner emphasised that a black box approach to algorithms is also too proud and may anger Them. Model results must be explainable to end users, he warned. “Humans need to wrap their brains around what the process was that generated the [output] data,” he said. “If you leave the algorithm to the machine, without human oversight, there’s no guarantee it will make sense to the consumer.” The results “need to be acceptable to stakeholders,” he added.

Moderator Eiman Abdelmoneim of Sky Road LLC told of an asset manager who had hired PhD candidates to produce ideas for new technologies. Yet even with the Silicon Valley sacraments of Red Bull and pizza, AI alchemy came there none. Starting with a business problem first is crucial, Abdelmoneim said. “If you’re starting with a bunch of PhD candidates and a room full of pizza, you’re increasing your likelihood of failure,” he said.

At both the blockchain and AI sessions, panellists mercifully agreed not to “boil the ocean.”