
Generative AI and Starburst: Pioneering natural language interfaces for data exploration

Interesting potential plays in the LLM space for Starburst

Last Updated: April 9, 2024
AI

I was experimenting with an LLM recently and asked it to write me a Python client to connect to Trino. The system did the job quickly, and it did so in a syntactically correct way that probably saved me 20-30 minutes of digging up those details and writing it myself. 
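For context, a minimal sketch of the kind of client it produced is below, using the open source trino-python-client; the host, catalog, schema, and query are placeholders, not the details it actually filled in for me.

```python
# Minimal sketch of a Python client for Trino using trino-python-client.
# Connection details below are placeholders.
import trino

conn = trino.dbapi.connect(
    host="trino.example.com",  # placeholder coordinator hostname
    port=8080,
    user="analyst",
    catalog="tpch",
    schema="tiny",
)

cur = conn.cursor()
cur.execute("SELECT name, regionkey FROM nation LIMIT 5")
for row in cur.fetchall():
    print(row)
```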

This is transformative technology. What is happening now with deep neural networks is incredibly exciting – the future is suddenly rushing at us quickly. And, as I noted in my roundup of our recent Hack-a-Trino, there are some interesting potential plays in the LLM space for Starburst. 

We’ve been prototyping a number of projects in house, including a natural language interface that would allow business users with little or no SQL knowledge to ask direct questions of their organization’s data. 

The results can be amazing. They can also be wrong. 

So, for this and other reasons, we’re resisting jumping on the AI hype train and releasing a customer-facing product right now. 

In this post, I’d like to provide an overview of what we’ve been working on, where we think there could be exciting potential, and what we have planned for the future. 

What do LLMs do well?

These systems are remarkably good at understanding syntax. I’ve played around with using them to convert Snowflake queries to Starburst queries, for example. They produce results that work fairly well in Trino and explain the changes they made along the way.
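To give a flavor of what that rewrite looks like, here’s a hypothetical before-and-after with invented table and column names: Snowflake’s IFF() and DATEADD() map to Trino’s if() and date_add().

```python
# Hypothetical illustration of a Snowflake-to-Trino rewrite (invented schema).
snowflake_sql = """
SELECT id, IFF(amount > 100, 'large', 'small') AS bucket
FROM orders
WHERE order_date >= DATEADD(day, -30, CURRENT_DATE)
"""

# The same query in Trino syntax, as the LLM might translate it.
trino_sql = """
SELECT id, if(amount > 100, 'large', 'small') AS bucket
FROM orders
WHERE order_date >= date_add('day', -30, current_date)
"""
```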

They’re good at data classification and auto-tagging. For example, you could ask ChatGPT to flag tables that contain PII, and it will rip right through them and let you know if any contain personal information you might have missed. This could be very useful for governance and regulatory purposes, especially within large organizations.
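As a rough sketch of the idea, assuming the OpenAI Python SDK, the snippet below sends a table’s column names to a chat model and asks it to flag likely PII; the model name, column list, and prompt wording are illustrative, not what we run internally.

```python
# Rough sketch: ask a chat model to flag columns that likely contain PII.
# Model name, columns, and prompt wording are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

columns = ["customer_id", "email_address", "order_total", "ship_date"]

prompt = (
    "Which of these columns are likely to contain personally identifiable "
    f"information (PII)? Answer with a JSON list of column names: {columns}"
)

resp = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": prompt}],
)
print(resp.choices[0].message.content)  # e.g. ["email_address"]
```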

They’re surprisingly effective when you provide a detailed prompt, and, as I mentioned in my anecdote at the top, they’re fast. They can help you finish lower-level work in a fraction of the time.

Where do LLMs fall short?

As we all know, these LLMs can be prone to errors and hallucinations. 

We want our customers to be able to activate and do more with their data, but the results have to be accurate. You can’t turn data into actionable insights if you’re relying on flawed or false answers to your queries. This is probably the main reason we haven’t released a user-facing natural-language-to-SQL tool yet.

We can’t have our customers making decisions based on bad answers.

My point above about how effective they are when you provide a detailed prompt indirectly reflects another shortcoming. In some cases, you need to be so prescriptive and specific that your supposedly natural language query becomes about as natural as the legalese clauses in an NDA. In our experiments, we’ve found that you get to a point at which it’s actually way easier to just write a boolean clause in SQL. 
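To make that concrete, here’s an invented comparison: by the time a prompt is specific enough to be unambiguous, the equivalent SQL predicate is often shorter and more precise. The table and column names are hypothetical.

```python
# Illustrative only: a prompt precise enough to be unambiguous vs. the
# boolean clause it boils down to (hypothetical columns).
prompt = (
    "Show me all orders placed in the last 30 days by customers in the EMEA "
    "region with lifetime spend over 10,000 dollars, excluding cancelled "
    "orders and orders flagged for review"
)

where_clause = """
WHERE order_date >= date_add('day', -30, current_date)
  AND region = 'EMEA'
  AND lifetime_spend > 10000
  AND status NOT IN ('cancelled', 'under_review')
"""
```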

With Starburst, analysts and other users can query data through familiar BI tools. They don’t have to know SQL to access all their data. At the same time, a fair percentage of Starburst users do write SQL to run complex queries. In the short run, I don’t see these individuals relying on an LLM; it will be easier and more efficient for them to stick with SQL.

I don’t mean to be negative here or derail the Generative AI hype train. Not at all. As I noted above, this is truly exciting and transformative technology. Our focus for now is on finding that sweet spot for our customers. 

What are we doing at Starburst?

My team and I have been experimenting with ChatGPT since it first became available. Our customers have been doing exciting work, too. Some have already deployed natural-language-to-SQL engines. Others are tuning their OpenAI instances in Azure and training them on specific datasets or data products. There is some upfront work required, but they’re seeing powerful results.

Our hackathon was a huge success, as I noted, and we uncovered some interesting new avenues for research. One of them is this natural language interface we’ve been prototyping. The architecture is relatively simple, as presented in this brief demonstration video.

A user interacts with a front-end application (a web interface or Slackbot) and asks a basic question. That natural language question passes through a backend server (FastAPI) along with the metadata for the relevant Starburst table(s). The question then goes to OpenAI’s davinci model, which is pretty solid at translating written English into code, especially SQL. The generated query is picked up by Starburst and the result is passed back to the front-end application.
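As a rough sketch of that flow, assuming the trino-python-client and the OpenAI Python SDK, the backend might look something like the snippet below; the endpoint name, model, prompt wording, and connection details are placeholders rather than the prototype’s actual code.

```python
# Simplified sketch of the prototype's backend flow (placeholders throughout).
from fastapi import FastAPI
from pydantic import BaseModel
from openai import OpenAI
import trino

app = FastAPI()
llm = OpenAI()  # reads OPENAI_API_KEY from the environment

class Question(BaseModel):
    text: str            # the user's natural language question
    table_metadata: str  # column names and types for the relevant table(s)

@app.post("/ask")
def ask(q: Question):
    # 1. Ask the model to translate the question into SQL, given the metadata.
    resp = llm.completions.create(
        model="gpt-3.5-turbo-instruct",  # placeholder; the prototype used a davinci model
        prompt=(
            f"Given this table metadata:\n{q.table_metadata}\n"
            f"Write a Trino SQL query that answers: {q.text}\nSQL:"
        ),
        max_tokens=256,
    )
    sql = resp.choices[0].text.strip()

    # 2. Run the generated SQL against Starburst via the Trino protocol.
    #    A real deployment would add authentication here.
    conn = trino.dbapi.connect(
        host="starburst.example.com",
        port=443,
        user="nl-bot",
        http_scheme="https",
    )
    cur = conn.cursor()
    cur.execute(sql)
    rows = cur.fetchall()

    # 3. Return both the generated SQL and the rows so the front end can
    #    show the translation alongside the result.
    return {"sql": sql, "rows": rows}
```

Returning the generated SQL alongside the rows is what makes the transparency step described next possible.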

Before returning a result, the prototype shows how it translates your natural language query into SQL. This step could be removed, but we thought it might be interesting, or even necessary, for transparency purposes. One of the frustrations with AI in general is the lack of explainability. Our prototype reveals both the query that produced the result and the source(s) of the data queried, functioning as a kind of audit trail.

At this point, the prototype is just that – it’s far from bulletproof and there are some scale limitations. But it does hint at the phenomenal potential here. It’s pretty easy to envision some version of this joining the leading BI apps and data science toolkits Starburst currently connects with as another way for users to activate data inside large organizations. The simplicity of the natural language interface could allow entirely new groups to start using Starburst.

This is just the beginning. ChatGPT has been trained on open source code, so it knows Trino, and it could be an incredible way to generate content and answers for customers quickly. Our support team will always be ready to handle complex issues, but if you have a simple question, such as how to structure a query or how date and time work in Trino, then getting a rapid answer through an LLM could be a great way to demystify the tech stack and accelerate your journey to productivity.

There is so much more to talk about here, and we’re going to have some exciting announcements on this front in the next month.

Stay tuned. 
