Starting point
It’s a time of great change and innovation in the oil and gas industry. IoT and growing technical capabilities mean that data is being generated from hundreds, even thousands, of different sources. And it’s not just energy giants like Shell and BP creating masses of data; it is also happening across small subcontractors running rigs, oil exploration businesses, and even health and safety companies. These increasing volumes of telemetry, customer, usage, and utilization data can understandably be challenging for businesses to manage, let alone make use of. And with the liberalization of energy markets and the turbulence of the past year, which has seen 30 UK energy suppliers cease trading since August 2021, providers need to stay on top of all the data they are generating and use it to their advantage to inform operational and strategic business decisions.
When processed and collated effectively, data can be invaluable, helping to boost efficiency, minimize disruption, reduce costs, and aid resource planning, to name just a handful of benefits. IoT and equipment sensors can provide crucial monitoring and oversight of machinery and processing equipment, with live data tracked to identify potential issues and failures before they happen. This allows companies to prevent costly unexpected failures and downtime, and to plan for maintenance or parts replacements during more convenient periods, for example when demand is lower. Taking a proactive approach usually means that fixes can be completed in a shorter timeframe than if the failure were dealt with reactively and the company had to wait for a site visit to resolve the issue.
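In practice, this kind of condition monitoring often comes down to streaming sensor readings and raising a maintenance flag whenever a value drifts outside its normal operating band. The sketch below illustrates the idea in Python; the sensor names, units, and thresholds are entirely hypothetical and not taken from any particular vendor or operator.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Reading:
    sensor_id: str   # e.g. "pump-7/bearing-temp" (hypothetical)
    value: float     # measurement in the sensor's own unit

# Hypothetical normal operating bands per sensor: (low, high)
OPERATING_BANDS = {
    "pump-7/bearing-temp": (10.0, 85.0),    # degrees C
    "compressor-2/vibration": (0.0, 7.1),   # mm/s RMS
}

def check(reading: Reading) -> Optional[str]:
    """Return a maintenance alert if the reading falls outside its band."""
    band = OPERATING_BANDS.get(reading.sensor_id)
    if band is None:
        return None  # unknown sensor: nothing to compare against
    low, high = band
    if not low <= reading.value <= high:
        return f"ALERT {reading.sensor_id}: {reading.value} outside {band}"
    return None

if __name__ == "__main__":
    stream = [
        Reading("pump-7/bearing-temp", 72.5),
        Reading("pump-7/bearing-temp", 91.3),  # drifting hot: triggers an alert
    ]
    for r in stream:
        alert = check(r)
        if alert:
            print(alert)
```

A production system would typically use rolling statistics or learned models rather than fixed thresholds, but the principle of acting on live readings before a failure occurs is the same.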
Performance data, such as utilization, can also be invaluable for predicting future usage and capacity requirements, for example how much coal or gas needs to be produced to meet demand on a particular day, or for spotting a change in market trends. Again, this proactive approach allows businesses to prepare ahead of the event and maximize profit. But for that insight to be available, the data sources need to be integrated.
This is a challenge that German energy company Uniper was able to overcome through the creation of a data lake: a store for vast volumes of unstructured data in its raw format. As one of the largest global power generators, Uniper had more than 120 different internal and external sources of data, and bringing them together to create real meaning or insight was a convoluted process. The energy provider was able to report on measures like gas storage capacity and utilization in silos, but comparing those figures with, for example, the throughput and output of a gas power station was a lengthy and complex process.
Uniper worked with Talend to integrate its disparate sources of data into a data lake and make them accessible to the applications and reporting functions that needed them. This allowed Uniper to gather real-time information from the sensors in its plants to monitor usage, alter production levels depending on demand, and track the condition and performance of machinery and parts so that maintenance could be planned with minimal downtime.
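To make the integration pattern more concrete: the ingest side of a data lake is often little more than landing each source's records, unchanged, in a partitioned raw zone that downstream reporting and analytics can query. Below is a minimal sketch using Python and pyarrow; the paths, source names, and fields are hypothetical and are not meant to represent Uniper's or Talend's actual pipeline.

```python
from datetime import datetime, timezone

import pyarrow as pa
import pyarrow.dataset as ds

# Hypothetical batch of raw sensor readings from one of many source systems
batch = pa.table({
    "source": ["scada-plant-a"] * 3,
    "sensor_id": ["turbine-1/output-mw", "turbine-1/output-mw", "gas-store/fill-pct"],
    "value": [412.7, 415.2, 63.1],
    "ingested_at": [datetime.now(timezone.utc)] * 3,
})

# Land the data as-is in the lake's raw zone, partitioned by source system,
# so every source keeps its original shape and nothing is lost on ingest.
ds.write_dataset(
    batch,
    base_dir="datalake/raw/sensor_readings",  # hypothetical path
    format="parquet",
    partitioning=["source"],
    existing_data_behavior="overwrite_or_ignore",
)
```

Keeping the raw zone untouched and partitioned by source means new systems can be added without reshaping what is already there, which is a large part of why the approach scales to a hundred-plus sources.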
With the relevant information aggregated and easily accessible in the data lake, its market analysis teams can now provide data and answers to traders either immediately or within just a few days, where previously the necessary research and collation of information would have taken months.
As well as reducing Uniper’s integration costs by 80 percent, the data lake helps to reduce operational risk around post-trade administration by managing and automatically tracking the documentation and reporting of all trading transactions.
When data is integrated in this way and made accessible to key stakeholders throughout the business, performance and operational efficiency can be maximized. Integrating, centralizing, and standardizing this valuable data provides a single source of truth from which informed decisions can be made. It also brings the advantage of speed to market, especially in a competitive and fast-moving energy landscape, allowing employees to self-serve their data requirements quickly, without disparate systems and IT complexity creating a bottleneck.
However, ensuring the necessary governance is in place is key to the success and efficiency of this centralization of data. Establishing good data lineage, through the appointment of data owners and the implementation of data cataloguing or tagging mechanisms, is crucial for complying with security requirements and GDPR. Once the right processes and procedures are in place, a data lake can open the door for business teams and developers to collaborate to solve problems quickly and compliantly. And, with developer skills more stretched and in demand than ever before, this collaboration reduces the need for technical intervention and expedites the entire process.
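As an illustration of what cataloguing and tagging can look like in their simplest form, the sketch below records an accountable owner, tags, and upstream sources for each dataset, so that governance questions become straightforward queries over the catalogue. The dataset names, owners, and tags are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    dataset: str                                   # path or table name in the lake
    owner: str                                     # accountable data owner
    tags: list = field(default_factory=list)       # e.g. ["pii", "gdpr"]
    upstream: list = field(default_factory=list)   # lineage: where the data came from

catalog = [
    CatalogEntry(
        dataset="datalake/raw/sensor_readings",     # hypothetical
        owner="operations-data@example.com",
        tags=["telemetry"],
        upstream=["scada-plant-a"],
    ),
    CatalogEntry(
        dataset="datalake/curated/customer_usage",  # hypothetical
        owner="retail-data@example.com",
        tags=["pii", "gdpr"],
        upstream=["datalake/raw/billing"],
    ),
]

# Governance checks become simple queries over the catalogue,
# e.g. find every dataset that carries personal data and who owns it.
for entry in catalog:
    if "pii" in entry.tags:
        print(f"{entry.dataset} contains personal data; owner: {entry.owner}")
```

Real deployments would use a dedicated catalogue tool rather than a hand-rolled structure, but the essentials, named owners, tags, and recorded lineage, are the same.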
As technical capabilities continue to advance across the industry, and with many more challenges lying ahead for the market in the next 12 months, competitive energy companies will need to capitalize on their real-time analytics and data to make more strategic and informed decisions. And, with unprecedented volumes of data and disparate sources, standardizing and centralizing data is a crucial starting point to create meaningful and actionable insight.
ROB JONES
Rob Jones is CEO at Qbase. In a data-driven world, Qbase helps organizations realize the full value of their data assets. Its services include data management and migration, data quality and governance, business analysis and insights, and marketing automation. It helps companies prepare and build their data infrastructure so that it can be used to provide the insights and analytics that optimize automated customer journeys and orchestration.