May 10, 2023

LLMs Are the Dinosaur-Killing Meteor for Old BI, ThoughtSpot CEO Says

(Kostiantyn Ivanyshen/Shutterstock)

When the dust from the bombardment of the market by ChatGPT and other large language models (LLMs) finally clears, there will be fewer BI and analytics vendors left standing, ThoughtSpot CEO Sudheesh Nair said.

“I think it is like that huge meteor that came in and killed all the dinosaurs,” Nair told Datanami during a briefing on ThoughtSpot Sage, the company’s next-generation, LLM-based product. “This is an event that will actually disrupt the old BI products like never before.”

ThoughtSpot is in Las Vegas this week to host its annual user conference, as are many other BI vendors. Most of them, like ThoughtSpot, are announcing some way of using ChatGPT or other LLMs to build natural language interfaces into their SQL-based products.

While ThoughtSpot is well-prepared for LLMs thanks to its previous work building a natural language search interface, older, more established BI and analytics tool vendors are scrambling, according to Nair.

They’re scrambling, he said, because the nature of how BI and analytics are done is changing right beneath our feet. Done properly, a natural language interface can potentially eliminate the need to have analysts with SQL skills on staff, Nair said. That puts developers of traditional BI and analytics tools in jeopardy, he said.

(Aleutie/Shutterstock)

“If the artifact that I build for is a dashboard, which is how BI usually thinks, and it is built by a specialist who speaks SQL, they have no incentive to open it up to natural language because they get paid just because they speak SQL,” Nair said.

Some BI and analytic tool vendors may layer an LLM on top of their dashboard for the purpose of explaining what’s going on in the dashboard, he said. But that merely shows how “inscrutable” those dashboards really are.

But the biggest source of LLM disruption in the BI and analytics tool market, he said, is the wholesale change it will bring to who uses the tools and what they’re able to do with them.

LLMs have already proven that they can turn plain English queries into SQL, which ostensibly is what BI and analytic tools were originally created to do (nothing is stopping hard-core coders from writing perfectly good SQL queries in Notepad).
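To make that concrete, here is a minimal sketch of the text-to-SQL pattern those demonstrations rely on: wrap the user’s plain-English question in a prompt that carries the table schema, then hand the result to whatever model is available. The schema, the question, and the idea of sending the prompt to a chat-completion endpoint are illustrative assumptions, not any particular vendor’s implementation.

# Sketch of the plain-English-to-SQL pattern. The table schema, the question,
# and the choice of LLM endpoint are all illustrative assumptions, not any
# specific vendor's implementation.

SCHEMA = """CREATE TABLE orders (
    order_id   INTEGER,
    customer   TEXT,
    region     TEXT,
    amount     NUMERIC,
    order_date DATE
);"""

def build_prompt(question: str) -> str:
    # Give the model the schema so it can emit SQL against real column names.
    return (
        "You translate business questions into SQL for the schema below.\n"
        f"{SCHEMA}\n\n"
        f"Question: {question}\n"
        "Return only one SQL query."
    )

# In practice the prompt would be sent to a chat-completion API and the reply
# run against the warehouse; here we just print the prompt itself.
print(build_prompt("What were total sales by region last quarter?"))
# A capable LLM would typically answer with something like:
#   SELECT region, SUM(amount) FROM orders
#   WHERE order_date >= DATE '2023-01-01' AND order_date < DATE '2023-04-01'
#   GROUP BY region;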

But the real magic that LLMs like ChatGPT have displayed to the world is the ability to comprehend nuance. AI companies like OpenAI, Google, and Facebook have done the hard work of training their LLMs, such as GPT, Bard, and LLaMA, on massive amounts of human content pulled from the Internet. Thanks to the combination of huge amounts of data, sophisticated neural networks, and massive numbers of compute nodes, the LLMs have begun to display a human-like ability to comprehend.

While the LLMs don’t have “real” comprehension as humans understand it, the LLMs have nevertheless developed excellent parroting skills. This doesn’t mean we’ve developed artificial general intelligence (AGI), but it does mean we have something that’s useful in certain situations, including as an interface for analytics and BI tools.

As Nair sees it, an LLM interface will reduce the need for SQL-loving BI specialists and open up the tools to a much broader set of business users, who will begin to ask all sorts of wild questions that would never have been allowed if the SQL-loving BI specialists were still serving as the gatekeepers.

“A business user, once you expose it directly, she will ask questions that are completely out of context and you can’t stand in the way,” he said.

Sudheesh Nair is the CEO of ThoughtSpot

ThoughtSpot is well-prepared for this LLM future because of the product’s architecture, Nair said. Unlike traditional BI tools, ThoughtSpot doesn’t require extracts and schemas to be built ahead of time to handle ad hoc queries.

“It’s like a wedding cake,” Nair said of traditional BI. “It operates at the top layer. ThoughtSpot has no layers. We’re able to operate on granular data, which means any question you ask, we’re able to answer….If it’s in the Snowflake or Databricks somewhere, we will be able to find you the answer.”

Nair said ThoughtSpot can answer questions without building those layers ahead of time. Did the company do this knowing that someday there would be a revolution in natural language processing? Not really, Nair admitted.

“We ended up in the right place at the right time,” he said. “I always believed that good companies are built with good people, good tech, and a good market. But if you want to build a great company, sometimes you need to have luck and timing on your side.”

ThoughtSpot has always considered search and AI to be core to its analytics journey, Nair said. But until now, the AI technology hasn’t really been mature enough to do the sorts of predictive and prescriptive things that Nair wanted to do.

By having an AI that can sort of think like a human–or at least learn to parrot what humans would say in a convincing way–we’re now at a point where ThoughtSpot can open things up and begin to deliver the types of analytics experiences that Nair has always believed would eventually be possible.

The big problem, Nair said, has been capturing intent in neural nets. For example, if you ask “How did my article do this week?” the AI has no idea what the word “do” means, Nair said.

“Capturing the intent–that’s been a challenge,” he said. “We built our own neural net and all of that, and the approaches were not quite accurate enough. With Sage, what we have gone through is use large language models and generative AI for what it really meant to do, which is to extract intent from abstract sentiments.”

The new LLMs are sophisticated enough to understand intent, and that changes everything, he said. Users can now express abstract sentiments, and the LLM, thanks to its training on a huge corpus of human data, is able to correctly interpret that intent and translate it into accurate SQL to run on the backend.

That, and a whole bunch of prompt engineering, is what ThoughtSpot is doing with Sage. “So that’s really hard to do and that’s very unique,” Nair said.
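To illustrate the general idea (this is a generic prompt-engineering sketch with made-up table and metric names, not ThoughtSpot’s Sage pipeline), one common pattern is to define the business metrics inside the prompt so an ambiguous word like “do” gets pinned to concrete columns before any SQL is written:

# Generic prompt-engineering sketch for grounding ambiguous intent. The table,
# the metric definitions, and the two-step instruction are hypothetical
# examples, not ThoughtSpot's implementation.

METRICS = {
    "views": "SUM(page_views)",
    "engagement": "AVG(read_time_seconds)",
    "shares": "SUM(share_count)",
}

def intent_prompt(question: str) -> str:
    metric_lines = "\n".join(f"- {name}: {expr}" for name, expr in METRICS.items())
    return (
        "Table: article_stats(article_id, page_views, read_time_seconds, "
        "share_count, day)\n"
        "Known metrics:\n"
        f"{metric_lines}\n\n"
        f'Question: "{question}"\n'
        "Step 1: name the metric(s) the question is really asking about.\n"
        "Step 2: write one SQL query over the last 7 days that answers it."
    )

# The vague word "do" now has to be resolved against the listed metrics
# before the model generates SQL.
print(intent_prompt("How did my article do this week?"))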

Nair is convinced this gives ThoughtSpot a compelling advantage over incumbent BI and analytics tool vendors. In the coming months and years, the company plans to build a next-generation, AI-infused product that will begin to ask exploratory questions and generate predictive insights on behalf of users, without the users asking it to.

“Sage is moving from what happened to why it happened? What will happen? What if I do something? And how could I change the future?” Nair said. “These four buckets need to come together. Today, BI is not there. Our current product is not built for that. That’s where Sage is going. In the next release that we come out with, we are going from what to why. And then in the next couple of months we will actually take it from why to what if and what next.”

None of this would be possible without having a product based on search and language, Nair said. Other BI and analytic vendors will eventually get to where ThoughtSpot is in two to three years, he said. But by then, will it be too late?

“We happen to have the best architecture for large language models,” Nair said. “I really don’t know what other BI companies are going to do to continue their market dominance in the future.”

Related Items:

Which BI and Analytics Vendors Are Incorporating ChatGPT, and How

ChatGPT Dominates as Top In-Demand Workplace Skill: Udemy Report

Like ChatGPT? You Haven’t Seen Anything Yet
