Company, Podcast

12 Oct 2023

Podcast: Building with financial data with Ntropy co-founder and CEO, Naré Vardanyan


Author

Naré Vardanyan

Co-Founder and CEO

This podcast episode features Naré Vardanyan, co-founder and CEO of Ntropy. Check out the summary and highlights below.

Highlights

What does Ntropy do?

Ntropy builds language models to better understand financial data at scale. We help any business, whether a bank, FI, fintech or other, make sense of its customers' financial data.

We believe that the information living in your transaction data, what you spend your money on, is the biggest lever for creating better financial lives globally.

What is a large language model? What are the use cases?

A language model is a type of probabilistic model that uses statistics to make predictions. A large language model is a subset of language models that has been trained on large data sets. A model with one billion or more parameters is generally considered to be an LLM.

At Ntropy we have developed language models trained to perform one task: understanding financial transactions with superhuman accuracy.

What sort of data sets are your models pre-trained on?

The language models we have built have been trained on billions of transactions for a very specific task: understanding financial data with human-like intuition.

Our language models have been trained on both consumer and business data, which is unique. Most transaction enrichment solutions typically only specialise in one or the other.

Additionally, our models have been trained on data from many different countries and in many different languages, which is why we are able to enrich data from any geography and in any language.

Once you have clean, standardized and enriched data, you can build additional insights on top of it. You can ask questions about the data.
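As a rough illustration of what "enriched" means in practice, here is a minimal sketch in Python. The field names and the `enrich` helper are hypothetical stand-ins, not Ntropy's actual API; they only show how a raw bank-statement line might become structured fields that downstream questions can be asked against.

```python
# Hypothetical illustration of transaction enrichment (not Ntropy's actual API).
# A raw bank-statement string is mapped to structured fields that can be queried.

RAW = "AMZN MKTP US*2K3L09 SEATTLE WA 03/12 CARD 1234"

def enrich(raw_description: str) -> dict:
    """Toy stand-in for an enrichment model: returns structured fields."""
    # In a real system these fields would come from a trained language model.
    return {
        "raw": raw_description,
        "merchant": "Amazon Marketplace",
        "category": "shopping",
        "location": {"city": "Seattle", "region": "WA", "country": "US"},
        "channel": "card",
        "is_recurring": False,
    }

enriched = enrich(RAW)

# Once every transaction carries fields like these, a question such as
# "how much did this customer spend on shopping last month?" becomes a
# simple filter and aggregation instead of brittle string matching.
print(enriched["merchant"], enriched["category"])
```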

Tell me about the cost of training. Where does it factor into what you offer with Ntropy?

Training language models is extremely expensive from a number of different perspectives. Acquiring, cleaning and processing the huge datasets required is expensive and requires ongoing maintenance. We spent millions on this specifically.

Training also takes time and high-spec hardware, as well as electricity and storage costs, which adds another dimension to it.

Our focus is on optimizing the unit economics so that it is feasible for our customers. Processing billions of transactions in real time with GPT-4 does not work from a cost or latency perspective.

Can you talk about latency specifically as a key factor in this process?

Latency really matters in a number of use cases. Transaction authorization needs to be done in 300 milliseconds, whereas for a financial copilot it is acceptable for a consumer to wait a few seconds for a response. This means that the latency, as well as the cost, of some models restricts what they can be used for.

There is always a trade-off between cost, latency and accuracy, and each use case has different priorities.

Can you talk about how some current customers are using Ntropy today?

Our top use case is lending and credit underwriting. Bank data is increasingly becoming a key data source in underwriting. In combination with accounting, sales and other sources, it is a way to build the conviction to underwrite a business and provide it with access to capital.

Payments are another use case, where we're seeing enriched data being used to approve or decline transactions. Fraud is also very difficult and very much an unsolved problem, so looking beyond the metadata, which is what traditional models relied on, into the actual context of a transaction really helps. We've seen that become quite popular.

The third thing we're seeing is enriched data being used to improve UX, which can increase engagement with digital banking apps.

What is the future looking like for Ntropy?

Ntropy's mission is to create better financial lives with this technology and the availability of financial data.

There needs to be an infrastructure company leveraging these language models that is specifically focused on financial services, one that has the intuition, the data sets, the expertise and the framework to go and solve these problems for this specific vertical.

Ntropy has been working on this specific problem for a long time.

We have assembled a top-class team who have specialized in natural language processing and transformers and have been working on these problems for years. There are very few companies in the financial services space with such experience.

Our experience also means we have a proprietary data set of 100 million+ merchants and a smart cache of billions of transactions, which is the foundation of our industry-leading accuracy.

Join hundreds of companies taking control of their transactions

Ntropy is the most accurate financial data standardization and enrichment API. Any data source, any geography.