
Working Toward Data Standardization

Ross Philo - Energistics
President and CEO


Tue, 01/21/2020 - 12:57


From exploration to production, data acquisition and processing are crucial to improving a reservoir’s performance, but data standardization is key to ensuring access to that information, says Ross Philo, President and CEO of Energistics. “A good way to consider data standardization is as a lingua franca between different organizations. Operators will be able to interpret units of measure as data is presented in a very readable form. Through this industry consensus, end-users will know they are receiving the right information.”

Energistics is a global, nonprofit industry consortium that facilitates an inclusive user community for the development, adoption and maintenance of collaborative, open data-exchange standards for the energy industry in general, and specifically for oil and gas exploration and production. It has more than 110 members, including the main IOCs, regulators, service companies, software developers and system integrators. “Our role is to get these groups together to determine what kind of data they need to share and how we can standardize formats in the most efficient way,” says Philo. The organization was founded because operators were tired of working with data arriving in a myriad of formats. “Energistics was established not to create a standard, but to facilitate the discussions that lead to its determination,” Philo adds.

Philo believes that standards are meant to be collaborative, highlighting fiber-optic measurements as an example. When optical fiber was first introduced, many companies integrated the technology in different ways. “Some operators asked Energistics to look at these fiber-optic measurements and their DAS (distributed acoustic sensing) acquisitions, and to unite service companies and operators with disparate solutions to promote data usage,” Philo says. Working with its members, Energistics developed a standard for DAS data that allows it to be shared easily and consistently between all parties involved.

Standardization can also open the door to new interpretations of existing information. Philo says there is an adage in the oil industry that oil can be found where it has been discovered before. “From a data perspective, this means we must analyze data that might be 10, 20 or 30 years old,” he says. “Maybe there is a new interpretation technique that allows operators to go back and evaluate subsurface formations and identify missed reserves.” He adds that it is difficult to read old files, especially if they are in the proprietary format of a service company that is no longer in the market. “Standards ensure the data remains accessible.”

Energistics works with three main sets of standards based on the XML language. The consortium’s oldest standard is WITSML, which is the industry reference for the transmission of data from the rig site to the offices of oilfield companies, integrators and operators. “This standard encompasses well construction, real-time drilling data, well location and other information that can be found in daily reports,” says Philo. PRODML is the broadest set of standards for optimizing producing oil and gas wells, with a focus on data from the reservoir-wellbore boundary to the custody transfer point. “Recently, this standard has been adapted to carry additional data types, such as distributed acoustics, PVT and pressure transient analyses,” he says. Finally, the RESQML standard allows the transfer of highly complex 3D models that describe the subsurface and cover the entire workflow from seismic interpretation through geological mapping to simulation.
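Philo’s point about standardized, readable data and explicit units of measure can be illustrated with a short sketch. The Python example below parses a deliberately simplified, WITSML-style XML snippet; the element and attribute names are illustrative placeholders rather than the actual Energistics schema, but they show how a standardized document carries each value together with its unit of measure so any receiving application can interpret it without guesswork.

```python
import xml.etree.ElementTree as ET

# Simplified, WITSML-style snippet. The tags below are placeholders invented
# for this sketch, not the real Energistics schema.
SAMPLE = """
<wells>
  <well uid="W-001">
    <name>Example Well 1</name>
    <wellbore uid="WB-001">
      <name>Main Bore</name>
      <measuredDepth uom="m">3120.5</measuredDepth>
      <mudDensity uom="g/cm3">1.18</mudDensity>
    </wellbore>
  </well>
</wells>
"""

def read_depths(xml_text):
    """Return (well name, depth value, unit) for every wellbore in the document."""
    root = ET.fromstring(xml_text)
    rows = []
    for well in root.findall("well"):
        well_name = well.findtext("name")
        for bore in well.findall("wellbore"):
            depth = bore.find("measuredDepth")
            if depth is not None:
                # The unit of measure travels with the value, so the consumer
                # never has to guess what the number means.
                rows.append((well_name, float(depth.text), depth.get("uom")))
    return rows

if __name__ == "__main__":
    for well_name, value, uom in read_depths(SAMPLE):
        print(f"{well_name}: measured depth {value} {uom}")
```

Because sender and receiver agree on the same structure, the same small reader works regardless of which service company produced the file.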

The industry is strongly embracing digital transformation, but Philo warns that the results will only be as good as the data fed into it. “There is a strong desire to benefit from the data we acquire but there is a misconception that AI is somehow going to automatically correct errors in underlying data. Whether you are making a human or an AI decision, the quality of the introduced data is going to drive the quality of the outcome,” he says. The consortium’s most recent development is Data Assurance, a program that quantifies the level of confidence in each data set. “This whole concept of Data Assurance was implemented to get ready for the automation that AI and machine learning are going to introduce to the industry. We want to be able to ingest information and process it automatically while ensuring trust in the underlying data and therefore, the output. As we move toward autonomous systems, this is going to become even more important,” Philo says.
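To make the idea of quantifying confidence in a data set concrete, the sketch below is a purely hypothetical illustration: it rolls a few invented quality checks into a single score that an automated workflow could inspect before trusting the data. The checks, weights and threshold are assumptions made for this example and do not come from the Energistics Data Assurance specification.

```python
from dataclasses import dataclass

@dataclass
class Check:
    """One quality check applied to an incoming data set (hypothetical)."""
    name: str
    passed: bool
    weight: float  # relative importance of the check

def confidence(checks):
    """Weighted share of passed checks, between 0.0 and 1.0."""
    total = sum(c.weight for c in checks)
    return sum(c.weight for c in checks if c.passed) / total if total else 0.0

# Example checks, invented for illustration only.
checks = [
    Check("units of measure declared for every curve", True, 3.0),
    Check("timestamps continuous, no gaps", False, 2.0),
    Check("sensor calibration record present", True, 1.0),
]

score = confidence(checks)
print(f"Data set confidence: {score:.2f}")

# A downstream automated workflow could require, say, score >= 0.8 before
# acting on the data without human review.
```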
