An example of predicting bulk density (RHOB) with Keras and illustrating the impact of normalisation on prediction results
Massive quantities of data are acquired every day from wells around the world. However, the quality of that data can vary significantly, ranging from missing data to data affected by sensor failure and borehole conditions. This can have knock-on consequences for other parts of a subsurface project, such as delays and inaccurate assumptions and conclusions.
As missing data is one of the most common issues we face with well log data quality, numerous methods and techniques have been developed to estimate values and fill in the gaps. This includes the application of machine learning technology, which has grown in popularity over the past few decades with libraries such as TensorFlow and PyTorch.
In this tutorial, we will be using Keras, a high-level neural networks API that runs on top of TensorFlow. We will use it to illustrate the process of building a machine learning model that can predict bulk density (RHOB). This is a commonly acquired logging measurement; however, it can be significantly affected by bad hole conditions, and in some cases tools can fail, resulting in no measurements over key intervals.
We will start with a very simple model that does not normalise the inputs, a common step in the machine learning workflow. We will then build a second model with normalised inputs and illustrate the impact this has on the final prediction result.
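As a quick preview of what normalising the inputs involves, here is a minimal sketch that rescales a couple of example log curves to the 0 to 1 range using scikit-learn's MinMaxScaler. The curve names, values, and choice of scaler are illustrative assumptions and not necessarily what we use later in this tutorial.

import pandas as pd
from sklearn.preprocessing import MinMaxScaler

# Illustrative inputs only: curve names and values are assumed for this sketch
df = pd.DataFrame({"GR": [30.0, 75.0, 120.0],    # gamma ray, API
                   "DTC": [80.0, 95.0, 110.0]})  # sonic, us/ft

# Rescale each column to the range 0 to 1
scaler = MinMaxScaler()
df_scaled = pd.DataFrame(scaler.fit_transform(df), columns=df.columns)
print(df_scaled)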
The first step in this tutorial is to import the libraries we will be working with.
For this tutorial, we need four libraries.
These are imported as follows:
import pandas as pd
from…
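For reference, a plausible complete set of imports, assuming the four libraries are pandas for data handling, scikit-learn for splitting the data into training and testing sets, Keras (running on TensorFlow) for the model, and matplotlib for plotting, might look like the following sketch:

import pandas as pd
import matplotlib.pyplot as plt

# Used to split the data into training and testing sets
from sklearn.model_selection import train_test_split

# Keras model class and layer type for building the neural network
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense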