Our promising project with NTNU ReLU

How to effectively utilize LLM capabilities on real-world production data? We partnered with NTNU ReLU to start our journey.

Author: Kristoffer Nesland · 4 min read

Inspired by the impressive advances of LLMs since the initial launch of ChatGPT in November 2022, Solution Seeker kicked off an ambitious project in collaboration with the ReLU student organization at NTNU in September 2024. The challenge at hand was to explore how to effectively utilize LLM capabilities on real-world production data.

NTNU ReLU

NTNU ReLU is a student organization that builds machine learning solutions for companies. Their approach combines a research-driven focus on cutting-edge developments with a pragmatic pursuit of business value. They aim to infuse new ideas into the industry while elevating ML expertise at NTNU. ReLU invites all students passionate about machine learning to join and help shape the future of ML, both at NTNU and in the broader industry.

To illustrate the challenge, consider the following two examples: the first query is a general, non-plant-specific question, while the second requires access to data from a specific plant.

The issue with the second query? The LLM simply does not have access to the needed information (existing wells, test results, ...). Fortunately for us, engineers at Anthropic were working on exactly this challenge. In November 2024, the open source Model Context Protocol (MCP) was released. Put simply, MCP functions like an API specifically tailored for LLMs, enabling models to leverage a suite of "tools" to access information needed to answer user queries on topics beyond their training data. This approach effectively extends the model's knowledge to specialized domains without requiring retraining.

MCP

Launched in November 2024, MCP is a new standard for connecting AI assistants to the systems where data lives, including content repositories, business tools, and development environments. Its aim is to help frontier models produce better, more relevant responses.

For those interested in LLM tool use, we recommend Andrej Karpathy's video.

To understand what distinguishes MCP from an API, we recommend IBM's video.
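To make the tool-use idea concrete, here is a minimal sketch of the loop MCP standardizes: the model emits a structured tool call, the host looks up and runs the matching tool, and the serialized result is fed back into the conversation. This is a simplified illustration in plain Python, not the real MCP SDK; the tool names and dummy data are hypothetical.

```python
import json

# Illustrative tool registry. The names and return values are made up
# for this sketch; they are not the actual Solution Seeker tools.
TOOLS = {
    "list_wells": lambda args: ["A-1", "A-2"],
    "mean_rate": lambda args: sum(args["values"]) / len(args["values"]),
}

def handle_tool_call(message: str) -> str:
    """Dispatch a model-issued tool call of the form
    {"tool": <name>, "arguments": {...}} and return a JSON result."""
    call = json.loads(message)
    result = TOOLS[call["tool"]](call.get("arguments", {}))
    # The serialized result is appended to the conversation, letting the
    # model ground its answer in data that was never in its training set.
    return json.dumps({"tool": call["tool"], "result": result})

# Example: the model requests a statistic it cannot know on its own.
print(handle_tool_call('{"tool": "mean_rate", "arguments": {"values": [10, 12, 14]}}'))
```

The key design point is that the model never touches the data directly: it only sees tool names, argument schemas, and serialized results, which is what makes the approach work without retraining.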

With MCP integrated into our system, progress accelerated significantly. We developed custom tools built on top of the Solution Seeker pipeline, transforming the LLM's capabilities. Suddenly, the model could navigate existing sensors, calculate timeseries statistics, and generate insightful data visualizations - all using real production data that wasn't part of its original training.

The screenshot shows how the LLM (in the sidebar) uses the “Get well tests” tool to retrieve test results from the Well Test application, making it possible for the LLM to answer the query with confidence.
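In spirit, a tool like “Get well tests” can be sketched as below. The field names, units, and in-memory data are hypothetical stand-ins; the real tool queries the Well Test application through the Solution Seeker pipeline.

```python
from dataclasses import dataclass, asdict

@dataclass
class WellTest:
    well: str
    date: str        # ISO date of the test
    oil_rate: float  # Sm3/d (illustrative unit)
    gas_rate: float  # Sm3/d (illustrative unit)

# Dummy in-memory records standing in for the Well Test application.
_TESTS = [
    WellTest("A-1", "2024-10-01", 850.0, 120_000.0),
    WellTest("A-1", "2024-11-05", 910.0, 118_000.0),
    WellTest("A-2", "2024-10-12", 430.0, 64_000.0),
]

def get_well_tests(well: str) -> list[dict]:
    """Tool entry point: return test results for one well as plain
    dicts, ready to be serialized into the LLM's context window."""
    return [asdict(t) for t in _TESTS if t.well == well]
```

Returning plain, serializable dicts (rather than application objects) is what lets the host hand the results straight back to the model as tool output.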

Finally, we would like to take the opportunity to thank the ReLU team for their inspiring work on some very interesting topics! It is still early days, but we have already received valuable feedback from end users, and we have plenty of ideas on how to make our software suite "LLM-ready".

Let's see where this brings us 🔮