
08 Nov 2023

Got GPT? OpenAI Dev Day and what this means for finance


Author

Naré Vardanyan

Co-Founder and CEO

Below is a quick overview of yesterday's developer party, courtesy of OpenAI, and what it means for financial services in particular. Let's go.

  • The increased context length is great for a variety of use cases where RAG was previously hard and you needed a lot of workarounds (a minimal sketch follows this list).
  • The decrease in price and the promised decrease in latency are definite quality-of-life improvements, bringing LLMs closer to deployment-ready tooling for the enterprise, particularly in data-hungry settings.
  • The ability to use vertical-specific data directly does challenge all the shiny vector database companies, though. Let's see how this layer plays out.
  • Agents make me excited. The agent store, not so much. Let me explain.
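To make the first bullet concrete, here is a minimal sketch (assuming the new 128k-context gpt-4-1106-preview model, the v1 openai Python client, and a hypothetical policy_document.txt): where you previously had to chunk a long document, embed it, and retrieve the relevant pieces, you can increasingly just put the whole thing in the prompt.

```python
# Minimal sketch: leaning on the larger context window instead of a RAG pipeline.
# Assumes the openai Python client (v1.x) and a hypothetical local file.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Previously this document would have been chunked, embedded and stored in a
# vector database, with only the top-k chunks retrieved per question.
with open("policy_document.txt") as f:
    document = f.read()  # can now be tens of thousands of tokens

response = client.chat.completions.create(
    model="gpt-4-1106-preview",  # 128k-token context window
    messages=[
        {"role": "system", "content": "Answer questions using only the document provided."},
        {"role": "user", "content": f"Document:\n{document}\n\nQuestion: What is our refund policy for enterprise customers?"},
    ],
)
print(response.choices[0].message.content)
```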

On agents: no-code agents can technically be cool. The barrier to building a conversational agent is very low and getting lower, which means it is going to be very hard to truly monetize them.


Hyper-personal mini-apps vs unicorns?

We are more likely to end up in a world where every person or family customizes and builds their own agents, hyper-personalized to perform certain tasks. Since the majority of the data and reasoning will live on OpenAI's side, building another app, a.k.a. a GPT, that millions of people will use and pay for seems a less likely future to me.

Or these will be indie apps: you will see thousands of similar apps on the store, and OpenAI will need a stringent review process to make it worth an end user's while. An indie store and WordPress more than an App Store is what I foresee happening.

I just saw a Negotiator GPT, which was great, and it helped me negotiate with a customer in a role-play setting. Now, what is it going to cost me to build this specifically for my company, with all the details of our contracts and so on, versus using your Negotiator GPT from the App Store?

I think the era where millions of feature requests become feasible is coming. We have written about this before: hyper-personal GPTs versus a single app that serves everybody. If you want, your Negotiator GPT can throw in awkward jokes from time to time, or give a pep talk. Your friend's might be great at Friends references.


Custom models

Custom models were announced (at Ntropy we have loved this approach for a few years now).

Custom models are super interesting and are going to accelerate enterprise adoption. The cost of building a custom model is, I presume, still prohibitive, even if you get access to OpenAI engineers. How are these models going to be updated and maintained? There are a lot of open questions, but it is definitely a bold move.


The everything tool

At this point, OpenAI is trying to be everything: the Apple and the Microsoft, the Palantir in some cases, and at the same time the AWS of AI. This makes a lot of people scream from the rooftops that millions of startups will die and that starting a company no longer makes sense.

I think quite the opposite is true. Startups always die. 

 OpenAI builds amazing models at a crazy pace. They are not a product company and they are not a vertically integrated enterprise business. If you are building specific tooling for specific domains with a ton of unique data, there are so many opportunities. 

If you are building magical products with these new building blocks, you can build an empire right in front of OpenAI and on top of them, and no GPT is going to crush you.


AI-first or AI-enabled?

And in a lot of cases, we already have magical products that are going to get even better.

Unless building with AI from scratch makes you orders of magnitude faster, cheaper, and simply better than an existing solution with an AI offering, it will be hard for a new GPT to take off in existing verticals.

Let's take Notion.

Technically, I can create an onboarding and employee-handbook GPT that reads all my internal documents and files about the company and then interactively answers questions for new employees. I need to gather all my files first, give the GPT access to them, and then iterate on the types of things it needs to do.
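For the curious, a rough sketch of what that could look like with the Assistants API announced yesterday; the file name, instructions, and question are made up, and the beta API surface may well change.

```python
# Rough sketch: an internal "handbook" assistant answering questions over
# uploaded company files, via the Assistants API beta announced at Dev Day.
from openai import OpenAI

client = OpenAI()

# 1. Upload the internal documents the assistant should be able to read.
handbook = client.files.create(file=open("employee_handbook.pdf", "rb"), purpose="assistants")

# 2. Create an assistant with the built-in retrieval tool over those files.
assistant = client.beta.assistants.create(
    name="Onboarding Assistant",
    instructions="Answer new employees' questions using the uploaded handbook only.",
    model="gpt-4-1106-preview",
    tools=[{"type": "retrieval"}],
    file_ids=[handbook.id],
)

# 3. Each new employee gets a thread; questions go in as messages and a run
#    produces the answers (in practice you would poll the run's status and then
#    list the thread's messages).
thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id, role="user", content="How do I enroll in the pension scheme?"
)
run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=assistant.id)
```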

However, most of this data already lives in my Notion and my employees already have access to it, so Notion's copilot is a beautiful experience that I will keep paying for, stacking up my lifetime value as a customer.

I think the conversational UI is overrated, and end-to-end GUIs with a great user experience can still come out as winners.


What I will pay for

Let's get back to the Negotiator GPT.  

Say you are building a product that teaches people to get better at negotiating or public speaking. You need a specialized GUI and maybe access to live coaches and real calls. Then you can also add endless internal lessons and a place where I can practice with real humans, get thrown into actual situations, have my outcomes evaluated, and climb a leaderboard. This is something I will pay for, and I am sure it is something I will look for.

Unless the chat interface is sufficient and the quickest way to do things, GPT agents are less relevant. But "programmable agents as an API", powering new GUIs and apps that can actually be personalized to "millions of feature requests" and personas, are super powerful.
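As one illustration of what "programmable agents as an API" could mean in practice, here is a hedged sketch of function calling behind your own GUI; the backend function, its schema, and the prompt are invented for illustration, not a prescription.

```python
# Sketch: a "programmable agent" behind your own GUI, using function calling.
# The backend function and its schema are hypothetical.
import json
from openai import OpenAI

client = OpenAI()

def get_contract_terms(customer_id: str) -> dict:
    """Hypothetical backend lookup that your GUI's server would implement."""
    return {"customer_id": customer_id, "discount_cap": 0.15, "payment_terms": "net 45"}

tools = [{
    "type": "function",
    "function": {
        "name": "get_contract_terms",
        "description": "Fetch the negotiated contract terms for a customer.",
        "parameters": {
            "type": "object",
            "properties": {"customer_id": {"type": "string"}},
            "required": ["customer_id"],
        },
    },
}]

messages = [{"role": "user", "content": "Prep me for the renewal call with customer 482."}]
response = client.chat.completions.create(model="gpt-4-1106-preview", messages=messages, tools=tools)

# Assume the model chose to call the tool for this prompt; run it and feed the result back.
call = response.choices[0].message.tool_calls[0]
result = get_contract_terms(**json.loads(call.function.arguments))
messages += [response.choices[0].message, {"role": "tool", "tool_call_id": call.id, "content": json.dumps(result)}]
final = client.chat.completions.create(model="gpt-4-1106-preview", messages=messages, tools=tools)
print(final.choices[0].message.content)  # the agent's prep notes, rendered in your own GUI
```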

An ideal setup is being able to program end-to-end apps written by GPT, with beautiful GUIs, integrated with the App Store.


Now, let's talk about finance. 

From where we are today, the most interesting use cases in finance are the ones where humans are dealing with lots and lots of files: reviewing applications, answering requests, and reconciling ledgers. Currently, we use a network of various models, automations, and other humans as validators to make this work.

This is very costly and very slow. Imagine if you could rebuild the Wells Fargo of the future with these new tools. What would GPT-Fargo look like, and how much of the unit economics could it give back to the users?


One model to rule them all

I was moderating a business lending workshop recently, and I had a slide that said "One model to rule them all". There were many questions about why this is the better option.

The idea is that instead of heterogeneous systems that work on narrow cases and require operational, infrastructure, and human overhead, you could have a more general and unified intelligence stack. This would power everything from customer acquisition to KYC/KYB, underwriting, and even accounting and reconciliation.

This unified intelligence stack is now becoming possible thanks to LLMs, and we are actively working in this space.
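As a toy sketch of what "one model, many workflows" could look like (the prompts, payloads, and task names are invented for illustration, not how we actually build it):

```python
# Toy sketch of a unified intelligence stack: one model, many financial workflows.
# Prompts and payloads are invented for illustration.
from openai import OpenAI

client = OpenAI()

TASK_PROMPTS = {
    "kyb": "You verify businesses. Given registry and web data, flag inconsistencies.",
    "underwriting": "You assess credit risk. Given transaction history, summarize risk drivers.",
    "reconciliation": "You match ledger entries to bank transactions and list unmatched items.",
}

def run_task(task: str, payload: str) -> str:
    """Route every workflow through the same model; only the task prompt changes."""
    response = client.chat.completions.create(
        model="gpt-4-1106-preview",
        messages=[
            {"role": "system", "content": TASK_PROMPTS[task]},
            {"role": "user", "content": payload},
        ],
    )
    return response.choices[0].message.content

# The same stack serves onboarding, risk, and back-office work.
print(run_task("kyb", "Registered name: Acme Ltd. Website claims 200 employees; filing says 12."))
```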


Further applications in finance

Below are some exciting use cases and ideas for finance that are also second-order effects of this new intelligence stack.

Customer acquisition, upselling, and risk can become one unified flow, making user onboarding so much faster, better, and cheaper. There is absolutely no reason why the people you acquire for new products, or as completely new users to your platform, should not already be de-risked while you are in the process of acquiring them. Imagine a world where, by acquiring a new user, you already have a close enough proxy to generate their relevant risk profile without them having to do much. Imagine the mechanics of a "for you page", but for risk. Yes, you would do additional verifications as a final step, but the majority of users would pass with high enough confidence without having to proactively add data points. We have talked to many FIs thinking and acting in this direction.

Why is this possible today? To do this, you need to analyze piles and piles of publicly available unstructured data (in fact, a lot of this data lives in the weights of the model) and correlate it with consumer-permissioned private information (this was very costly and hard to scale before) to create the personas you will compare every newly acquired user against.

Until recently there was no technology that could do this with high enough accuracy at the needed scale.
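To picture the mechanics, here is a hedged sketch rather than a description of any FI's actual setup: embed whatever signals you have on a new user and compare them against pre-computed persona embeddings. The personas, embedding model, and threshold are all assumptions.

```python
# Sketch: a "for you page" for risk. Compare a new user's signals against
# pre-computed persona embeddings. Personas, model, and threshold are assumptions.
import numpy as np
from openai import OpenAI

client = OpenAI()

def embed(text: str) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-ada-002", input=text)
    return np.array(resp.data[0].embedding)

# Pre-computed persona descriptions (in practice, built from public unstructured
# data plus consumer-permissioned data).
personas = {
    "thin-file freelancer": embed("Irregular income, gig platforms, low credit history, stable spending."),
    "established SMB owner": embed("5+ years trading, payroll runs, seasonal revenue, existing credit lines."),
}

def risk_proxy(user_signals: str) -> tuple[str, float]:
    """Return the closest persona and its cosine similarity as a confidence proxy."""
    u = embed(user_signals)
    scores = {name: float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v))) for name, v in personas.items()}
    best = max(scores, key=scores.get)
    return best, scores[best]

persona, score = risk_proxy("New signup, connects a business bank account, weekly marketplace payouts.")
if score < 0.85:  # assumed threshold: fall back to explicit verification
    print("Route to additional verification")
else:
    print(f"Proxy persona: {persona} ({score:.2f})")
```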

Customer service and the NPS scores of banks and FIs are about to get a serious upgrade. You can have millions of agents fed on millions of previously asked questions, specifics, and scenarios. They will not be annoying or slow, or make you push buttons or wait on the line. They will magically resolve the majority of your issues without you even having to talk to a human. Not talking to a human will be the better experience, contrary to our current intuition and everything we have learned about automated bots until now.

As a fun experiment, you could even make a customer service bot that is compliant but takes you in an infinite loop, preventing you from closing an account. It would deliver zero churn and cost very little.

First-party fraud will evolve, and so will the ways of tackling it. You can end up with a GPT calling to dispute a transaction, talking to a GPT on the other end that resolves the case by looking at the data.

As long as no human time or losses are involved, we are good.

Bot-to-bot negotiation is a super exciting philosophical rabbit hole I do not want to go down now, despite being tempted.

Underwriting workflows are going to become orders of magnitude more efficient and automated, especially when we talk about underwriting on large-scale, non-standardized data pools for cohorts of customers you have not seen before or do not yet understand.

I think this is obvious, but wealth management is going to get a serious boost in accessibility and will turn into true wealth creation. We will end up in a situation where, instead of a generic 401(k), you are getting a baked-in "wealth maker GPT" whose sole function is to create wealth for you given certain parameters and constraints.

Work is changing once again. We are about to see an army of indie creators making hundreds of thousands of dollars on this new AI-enabled internet. The concept of full-time employees with payslips will give way to a world of agent-enabled sole traders, at least for white-collar jobs.

"GPTs powering the banks" should learn how to serve these "workers powered by GPTs" and improve their quality of life by automating mortgages, taxes, car financing, and more.

I can go on and on. As you can probably tell, I am excited.

For all of this to exist, we need great data, we need to keep lowering costs, and we need the AI skills gap to shrink for enterprise adoption. The 92% adoption rate of OpenAI among the Fortune 500, alas, is deceiving (referring to Sam's opening keynote): it mostly means having a subscription to ChatGPT. Real deployments are yet to come.


Conclusions

For these models to take off in a highly regulated environment, there also needs to be a certain level of comfort with relying on, and moving into, a probabilistic paradigm from the deterministic one that the current software stack is built on.

Finally, I want to wrap up with this probabilistic generation of conclusions (courtesy of OpenAI), imitating Matt Levine's style:

"In a world where your fridge is smart enough to remind you to buy milk, it's no surprise that OpenAI's latest party tricks include teaching algorithms to play the stock market and babysit your bank account. At their recent developer day, OpenAI trotted out their language models, which are now apparently suave enough to charm both your customers and your spreadsheets. They're serving up AI that can sniff out a fraudulent transaction like a bloodhound or craft a personalized investment plan with the warmth of a Christmas card. But, as always, the devil's in the data. Will these shiny new tools actually be the Wall Street whisperers they claim to be, or is this just another techno-optimistic future where the AI is as trustworthy as a hedge fund manager in a bull market? Only time, and the markets, will tell."
