The case of the Air Canada GPTBot

2 min

Juan José Soto

CEO

Air Canada must honor what a GPT-based bot told a passenger, even though the answer was wrong (link to the story).

Why did this happen? The explosion of ChatGPT and other systems capable of turning knowledge into a conversational interface has spawned hundreds of startups and internal teams at large enterprises that call the OpenAI or Google APIs and connect them to their knowledge bases.

The big problem with this approach is that it leaves the black box of these technologies intact. By using the API, companies hand over their data and receive an answer without knowing how that answer is formed or why the system answers the way it does. Meanwhile, the company needs reliable analytics on what customers are asking and what the bot is answering.
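To make that concrete, here is a minimal sketch of the bare minimum a company should add around a raw API call: logging every question/answer pair so there is something to audit and analyze later. This is our own illustration, not any vendor's product; the model name, function name, and log format are assumptions.

```python
import json
import time
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask_and_log(question: str, log_path: str = "bot_interactions.jsonl") -> str:
    """Call the LLM and persist the full question/answer pair for later analysis."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; any chat model works
        messages=[{"role": "user", "content": question}],
    )
    answer = response.choices[0].message.content

    # Append one JSON record per interaction -- the raw material for
    # analytics on what customers ask and what the bot answers.
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps({
            "timestamp": time.time(),
            "question": question,
            "answer": answer,
            "model": response.model,
        }) + "\n")
    return answer
```

Without even this thin layer, a company has no record of what its bot promised a customer until the customer shows up with a screenshot.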

Unfortunately, these situations will start to multiply. The black box these models entail will bring new regulations addressing cases like Air Canada's, and companies will likely soon be unable to deploy such systems without full visibility into how the answers are produced.

Deep Talk has been addressing this for some time. We have built a complete pipeline in which every step is visible to the company: you can see how responses are formed, which data they are built from, and why the system answers the way it does, and you can adjust this to customer, industry, and legal requirements. In addition, every interaction is recorded for full auditing and analysis.
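As an illustration of the general idea (not Deep Talk's actual implementation), a retrieval-augmented pipeline can attach to every answer the exact documents it was grounded in, so each response is traceable back to its sources. The knowledge base, model name, and file names below are all hypothetical.

```python
import json
import time
from openai import OpenAI

client = OpenAI()

# A toy knowledge base; in production this would be a vector store.
KNOWLEDGE_BASE = {
    "refund-policy": "Bereavement fares must be requested before travel.",
    "baggage-policy": "One carry-on bag up to 10 kg is included.",
}

def retrieve(question: str) -> dict:
    """Naive keyword retrieval; stands in for semantic search."""
    words = question.lower().split()
    return {doc_id: text for doc_id, text in KNOWLEDGE_BASE.items()
            if any(word in text.lower() for word in words)}

def answer_with_sources(question: str) -> dict:
    sources = retrieve(question)
    context = "\n".join(f"[{doc_id}] {text}" for doc_id, text in sources.items())
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model
        messages=[
            {"role": "system",
             "content": "Answer ONLY from the context below. If the context "
                        "does not cover the question, say you don't know.\n"
                        + context},
            {"role": "user", "content": question},
        ],
    )
    record = {
        "timestamp": time.time(),
        "question": question,
        "answer": response.choices[0].message.content,
        "source_ids": list(sources),  # which documents grounded the answer
    }
    # Every interaction lands in an append-only audit log.
    with open("audit_log.jsonl", "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
    return record
```

Because each record carries its source_ids, a wrong answer like the one Air Canada's bot gave can be traced back to the exact policy document, or the absence of one, that produced it.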

This enables companies to understand "what their customers are asking" and improve their business intelligence systems. LLMs are only 10% of the entire pipeline we have implemented for our clients.

LLMs are a breakthrough, but blindly using them as a black box is irresponsible; if you don't believe us, just ask Air Canada! 😉
