Well testing by deduction

Showing how Solution Seeker successfully uses a deduction test methodology to achieve high-quality well tests

Value proposition

  • Ability to efficiently test wells that are not connected to a test separator
  • Higher quality test results by continuously monitoring the measurement uncertainty during testing
  • Shorter test periods, by ending tests as soon as the desired test quality is achieved rather than after a fixed test length, limiting the production lost during testing

Situation

Traditionally, well-tests are performed by routing individual wells through a test separator, allowing engineers to isolate a single well’s performance metrics. However, for various reasons, a test separator might not always be available. This can be due to tie-ins, physical restrictions or other constraints in the production setup.

One alternative to the traditional test separator is a deduction test. In this test, engineers try to single out a well’s production by looking at the difference in total production when shutting down one well at a time. As production is temporarily reduced, it’s critical that these tests are finished as soon as possible while still producing high-quality results.
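
To illustrate the principle, here is a minimal sketch in Python; the rates and units are hypothetical examples, not field data:

    # Minimal sketch of the deduction principle (hypothetical numbers, not field data).
    # The tested well's rate is approximated as the drop in total production
    # observed when that well alone is shut in.
    baseline_total_rate = 12400.0  # total production before shut-in [Sm3/d]
    shut_in_total_rate = 9900.0    # total production while the well is shut in [Sm3/d]

    deducted_well_rate = baseline_total_rate - shut_in_total_rate
    print(f"Estimated rate of tested well: {deducted_well_rate:.0f} Sm3/d")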

Solution

To make sure deduction tests are run as efficiently as possible and still produce trustworthy results, we have developed a proprietary deduction test application. The application assists with four main processes before, during, and after testing.

Before testing begins, it’s important to have a stable period with low uncertainty in the rate metrics as a benchmark for the test. The application automatically picks out stable periods and reports on the uncertainty, using our proprietary data-mining framework Squashy.
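
Squashy itself is proprietary and not shown here, so the sketch below is only an illustrative substitute for this step: a period is flagged as stable when the rolling relative standard deviation of the total rate stays below a threshold (the window length and the 2 % threshold are hypothetical):

    # Illustrative substitute for stable-period detection (Squashy itself is
    # proprietary and not shown here). A timestamp is flagged as stable when the
    # rolling relative standard deviation of the total rate stays below a
    # hypothetical threshold.
    import pandas as pd

    def find_stable_periods(total_rate: pd.Series, window: str = "2h",
                            max_rel_std: float = 0.02) -> pd.Series:
        """Flag stable timestamps in a rate series indexed by time."""
        rolling = total_rate.rolling(window)
        rel_std = rolling.std() / rolling.mean()
        return rel_std < max_rel_std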

Once the test has started and the well in question is shut in, the application helps the production engineers monitor the remaining wells that are still producing. The objective is that the remaining wells produce under conditions that resemble their original state, so that we can make a fair comparison and truly isolate the effect of the well being tested.
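
As a rough illustration of this monitoring step (the measurement names and the 5 % tolerance below are hypothetical placeholders, not the application’s actual rules):

    # Hedged sketch: check whether each remaining producing well is operating
    # close to its pre-test baseline. The measurement names and the 5 % tolerance
    # are hypothetical placeholders, not the application's actual rules.
    def wells_near_baseline(baseline: dict, current: dict, tol: float = 0.05) -> dict:
        """Per well: are all monitored measurements within tol of their baseline?"""
        return {
            well: all(
                abs(current[well][name] - value) <= tol * abs(value)
                for name, value in measurements.items()
            )
            for well, measurements in baseline.items()
        }

    baseline = {"A-2": {"whp_bar": 95.0, "choke_pct": 60.0}}
    current = {"A-2": {"whp_bar": 97.1, "choke_pct": 60.0}}
    print(wells_near_baseline(baseline, current))  # {'A-2': True}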

When the test period has stabilized and the producing wells are close to their original state, the application analyses the uncertainties of the test parameters. Once the uncertainties are within acceptable thresholds, the engineers are notified so that they can end the test with high-quality results.
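
One simple way to reason about this is to treat the deducted rate as the difference between two period means and combine their standard errors. The sketch below assumes roughly independent measurements and a hypothetical acceptance threshold; it is a simplification, not the application’s actual uncertainty model:

    # Sketch of an uncertainty check for the deducted rate, assuming roughly
    # independent measurements in the two periods (a simplification; the
    # application's actual uncertainty model is not shown here).
    import numpy as np

    def deduction_with_uncertainty(baseline_rates, test_rates):
        baseline_rates = np.asarray(baseline_rates, dtype=float)
        test_rates = np.asarray(test_rates, dtype=float)
        deducted = baseline_rates.mean() - test_rates.mean()
        # Combine the standard errors of the two period means in quadrature.
        std_err = np.sqrt(baseline_rates.var(ddof=1) / baseline_rates.size
                          + test_rates.var(ddof=1) / test_rates.size)
        return deducted, std_err

    ACCEPTABLE_UNCERTAINTY = 50.0  # hypothetical threshold [Sm3/d]
    rate, unc = deduction_with_uncertainty([12400, 12350, 12420], [9900, 9950, 9880])
    if unc <= ACCEPTABLE_UNCERTAINTY:
        print(f"Test can be ended: {rate:.0f} +/- {unc:.0f} Sm3/d")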

Finally, all historical tests are stored and structured in a well-test database that gives an easily accessible overview of tests. The application also detects historical tests, using the same rules it applies for tracking live tests, and stores them in the same database. For convenience, the results can also be exported to any compatible system.
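
For illustration, a possible shape for such a test record is sketched below; the field names and values are hypothetical, not the application’s actual schema or export format:

    # Hedged sketch of a well-test record as it could be stored in the database
    # and exported to other systems. Field names are illustrative, not the
    # application's actual schema.
    from dataclasses import dataclass, asdict
    from datetime import datetime
    import json

    @dataclass
    class DeductionTestResult:
        well: str
        test_start: datetime
        test_end: datetime
        deducted_rate: float      # e.g. Sm3/d
        rate_uncertainty: float   # same unit as the rate
        detected_from_history: bool

    record = DeductionTestResult("A-2", datetime(2021, 3, 1, 6, 0),
                                 datetime(2021, 3, 1, 14, 0), 2480.0, 29.0, False)
    print(json.dumps(asdict(record), default=str, indent=2))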

Outcome

Within the use case of deduction testing, our partners have experienced several benefits from our solution.

One of the biggest benefits of having so many sensors and so much live data available is the ability to become condition-based in everything we do, including deduction testing. A key value driver of this application is that it enables engineers to run shorter tests based on knowledge of the uncertainty in the test results. Instead of running fixed-length tests, our partners are now able to finish a test when they have an acceptable result, which can be hours earlier than the fixed test length they used previously.

The uncertainty measures in themselves are also valuable, since they act as a live quality measure. This makes it possible to factor in the uncertainty and quality when comparing the results over time, to see how the wells are evolving.

The application also allows the production engineer to customize which measurements they want to look at, and is an efficient workflow tool that gathers all the important information in one place.

Well test data is often the most reliable data we have about our wells’ behavior. Therefore, it is also the most valuable data in all types of model building, especially in data-driven well flow rate estimation. The test optimization module is a crucial part of the “test-to-rate” workflow, automatically providing qualified well-test data to NeuralCompass for rate estimation purposes.