Our paper “Enhancing Data Accessibility: Integrating Speech-Based Interfaces with Large Language Models for Intuitive Database Queries” at LWDA 2024 in Würzburg

Bridging the Gap Between Data and Decision-Makers: A Chatbot-Driven Database Query Solution

In today’s business world, quick access to key information (stored in huge databases) is critical for effective decision-making. Unfortunately, existing dashboard tools and query methods are often too rigid and require technical knowledge that most employees do not have. As a result, employees rely heavily on database engineers or data analysts, leading to delays in crucial business decisions.

To address this problem, we developed a solution that simplifies the process by introducing a chatbot interface that lets employees query databases in natural language. The chatbot maps a natural-language question to a database query – no technical expertise is required on the user’s side.

Here’s how it works (a small code sketch of the pipeline follows the list):

  1. Database Query Builder: A large language model (LLM) converts natural language queries into SQL statements, based on the database schema, including vocabulary and synonyms.
  2. Database Connector: The system runs the generated query and returns the result in a structured format, such as JSON.
  3. Processing Results: Results are transformed into meaningful text or visuals.
  4. Output Components: The system provides responses as text or data visualizations like graphs or charts.
  5. Multi-Modal Interface: Users can interact with the chatbot using both voice and text inputs.
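
To make the pipeline above more concrete, here is a minimal sketch in Python of how the query builder, database connector, and result formatting could be wired together. This is not the system from the paper: the LLM call is a stand-in (a stubbed callable here), and names such as `answer_question`, `extract_schema`, and the synonym dictionary are purely illustrative assumptions.

```python
import json
import sqlite3
from typing import Callable, Optional


def extract_schema(conn: sqlite3.Connection) -> str:
    """Read the CREATE TABLE statements so the LLM knows the available tables and columns."""
    rows = conn.execute(
        "SELECT sql FROM sqlite_master WHERE type = 'table' AND sql IS NOT NULL"
    ).fetchall()
    return "\n".join(r[0] for r in rows)


def build_prompt(question: str, schema: str, synonyms: dict) -> str:
    """Combine the user question, the schema, and company-specific vocabulary into one prompt."""
    vocab = "\n".join(f"- '{term}' refers to '{column}'" for term, column in synonyms.items())
    return (
        "Translate the question into a single SQLite SELECT statement.\n"
        f"Schema:\n{schema}\n"
        f"Vocabulary:\n{vocab}\n"
        f"Question: {question}\n"
        "SQL:"
    )


def answer_question(
    question: str,
    conn: sqlite3.Connection,
    llm: Callable[[str], str],          # any text-completion function, e.g. an API wrapper
    synonyms: Optional[dict] = None,
) -> str:
    """Query builder -> database connector -> structured JSON result."""
    prompt = build_prompt(question, extract_schema(conn), synonyms or {})
    sql = llm(prompt).strip().rstrip(";")
    if not sql.lower().startswith("select"):  # crude guard against destructive statements
        raise ValueError(f"Refusing to run non-SELECT statement: {sql}")
    cursor = conn.execute(sql)
    columns = [c[0] for c in cursor.description]
    rows = [dict(zip(columns, row)) for row in cursor.fetchall()]
    return json.dumps({"sql": sql, "rows": rows}, default=str)


if __name__ == "__main__":
    # Tiny demo with an in-memory database and a stubbed "LLM".
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, revenue REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 120.0), (2, 80.5)])
    stub_llm = lambda prompt: "SELECT SUM(revenue) AS total_revenue FROM orders;"
    print(answer_question("What is our total turnover?", conn, stub_llm, {"turnover": "revenue"}))
```

In a real deployment, `llm` would wrap a large language model, and the JSON output would feed the processing and output components that render text answers or charts.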

The approach has been implemented in a web-based application, allowing regular employees to work with the system under real-life conditions. The evaluation results show that simple questions are handled effectively, while more complex queries benefit from additional prompt customization. By incorporating company-specific terminology, the chatbot produces accurate SQL queries and minimizes errors.

We presented the work as a regular paper at the LWDA conference 2024 (co-located with KI 2024) in Würzburg, Germany. At the conference, we had many interesting, fruitful discussions about the great potential of the approach and ideas for future improvements. Looking ahead, we aim to further enhance the system with specialized LLMs, optimize query performance, and refine methods for detecting and addressing hallucinations in responses.

More details about our paper and the conference are available on the LWDA 2024 website.
