The Data Divide: Winning and Losing in Oil Industry Transactions – Geoffrey Cann – Canadian Energy News, Top Headlines, Commentaries, Features & Events – EnergyNow

Capturing Value at Times of Turmoil Requires Preparation

By Geoffrey Cann

TL;DR

  • The wave of oil and gas merger activity in the US looks poised to continue.
  • High data quality about assets is accretive to value; poor data quality destroys value.
  • Modern solutions adopted before deals help permanently fix data quality challenges.

An M&A Frenzy

The US oil and gas industry witnesses a steady stream of mergers and acquisitions every year, but the past 12 months have seen a dramatic step up in consolidation involving the largest companies in the industry. The wave was kicked off when Exxon announced its acquisition of Pioneer Natural Resources, quickly followed by Chevron purchasing Hess. Deals involving Aera Energy (owned by Shell and Exxon) and California Resources, APA Corp and Callon Petroleum, Southwestern Energy and Chesapeake Energy, Occidental and CrownRock, Diamondback Energy and Endeavor Energy Resources, and Chord Energy and Enerplus have all followed.

There are a number of drivers behind the current wave of consolidation.

  • High interest rates are straining leveraged balance sheets, forcing some companies to restructure or sell.
  • Strong oil and gas prices since the start of the Russia-Ukraine war have delivered bumper profits, but opportunities to put that capital to work within the industry are limited.
  • Elevated share prices risked putting attractive acquisition targets out of reach.
  • Acquiring production solves for shortcomings in reserves and production growth, compensating for the limited prospects for large oil and gas companies to achieve their growth targets.
  • Forward demand projections are increasingly murky post-2030, when many markets have signalled their intent to become carbon neutral. Capital is becoming much harder to source for long-dated projects beyond that horizon. Buying existing assets that can be put to work immediately is lower risk.

While these drivers show no signs of reversing anytime soon, other factors on the horizon suggest the timing for transactions is favorable. 2024 is a major election year in democracies globally, and post-election environments often translate into vigorous policy shifts that may constrain future acquisitions. Commitments at COP28 point to a transition away from fossil fuels, which may constrain growth should governments restrict new resource development.


Mergers and acquisitions tend to create waves of subsequent smaller deals as assets are sold to raise cash to pay for the deal, non-core assets are trimmed from the portfolio, and regulators force divestitures to reduce market concentration. Boards and management teams across the industry revisit their strategies, turning predator to buy these available assets, or becoming prey themselves given market interest in further deals.

For very large firms, transaction announcements in the industry are about strategic fit. There isn’t time, nor is the data available, for a prospective buyer to complete detailed due diligence on a target’s asset portfolio before the announcement. Subsequently, sellers stand up a data room where prospective buyers can crawl through the target’s data, carry out some analysis, and sharpen their pricing.

Finalized pricing for acquisitions and divestitures is effectively based on the data that companies have, or present, about their assets. Considerable value is at stake on both sides, as buyers aim for as low a price as possible while sellers strive for as strong a price as can be justified.

How Data Quality Impacts Value

Very frequently, buyers discover to their delight (and sellers to their dismay) that the selling company is plagued by poor data quality. When the quality of data is low, the buyer’s risk rises:

  • The strategic merit behind the deal is weakened, leading to buyer apprehension.
  • The assets take longer to analyze, delaying the time to close and impacting the time to value.
  • There is less certainty in the value calculations, leading to wider range estimates and greater potential for downside.
  • The regulatory risk rises because of the uncertainty.

This leads buyers to demand a discount when data is uncertain, of poor quality, or takes too long to analyze, to compensate for the increased risk. The corollary is buyers’ willingness to pay a premium when the data is high quality, because there is greater confidence and less risk.
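The discount-for-uncertainty dynamic can be made concrete with a toy calculation. The sketch below is purely illustrative (the figures, the discount rule, and the function are invented, not any actual valuation method): a buyer bids the midpoint of its value-estimate range, less a penalty that grows with the width of that range, so poor data that widens the range directly shrinks the bid.

```python
# Toy illustration only: how wider uncertainty in an asset's value
# estimate can translate into a deeper buyer discount. All numbers
# and the discount rule are hypothetical assumptions.

def risk_adjusted_bid(low_estimate: float, high_estimate: float,
                      discount_per_spread: float = 0.5) -> float:
    """Bid the midpoint of the value range, less a penalty
    proportional to the spread between low and high estimates."""
    midpoint = (low_estimate + high_estimate) / 2
    spread = high_estimate - low_estimate
    return midpoint - discount_per_spread * spread / 2

# High-quality data: narrow range around $1.0B
tight = risk_adjusted_bid(0.95e9, 1.05e9)
# Poor data: same midpoint, much wider range
wide = risk_adjusted_bid(0.70e9, 1.30e9)
print(f"tight-range bid: ${tight / 1e9:.3f}B")  # $0.975B
print(f"wide-range bid:  ${wide / 1e9:.3f}B")   # $0.850B
```

Same midpoint, same asset on paper, yet the data-poor seller leaves roughly an eighth of the value on the table in this sketch.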

Strategically, those buyers with superior management systems can then target those companies whose processes are still manual or poorly run. They can more aggressively price the acquisition (pay more) because of their ability to create value by moving the acquired businesses onto their better management systems.

Similarly, sellers telegraph professionalism when their asset data is highly reliable, cloud resident, non-duplicative, current and automated. They can command higher valuations for their assets, a lower risk in their valuations, less discounting in general, faster negotiations, and stronger pricing for their shareholders.

Failing to Recognize

Poor data quality takes many forms. Often the most visible is a proliferation of incompatible and duplicative systems for the same content, such as multiple land systems containing conflicting and incomplete data. High levels of manual data preparation to build aggregated reports and insights are another indicator. Beneficially for the industry, the COVID pandemic revealed the risks behind traditional centralized, error-prone manual processes, and many companies moved to cloud-based record keeping.

Even so, employees can often point out shortcomings in dealing with data:

  • the prevalence of multiple databases for the same topic,
  • dual and triple data entry,
  • Excel spreadsheets that contain mission-critical data,
  • a continued reliance on paper-based legacy land contracts,
  • a history of unexplained data loss,
  • inadequate document versioning,
  • low levels of system integration, and
  • low levels of security over confidential information.

Management choices are often the cause of poor data quality. For example, after a run of successive acquisitions, buyers will often defer the hard merger work of improving data quality by standardizing work, systems, and practices. Initiatives to improve data quality lack the appeal of increasing reserves or growing production, leading to limited investment. Measuring the value from improving data assets is harder than capturing the value from increased production. Shortages in talent can block data initiatives from launch. Over time, the amount of work, and the amount of time, required to move from low quality to high quality becomes overwhelming.

Management may discount staff concerns about data quality, and fail to recognize their poor positioning, until a transaction occurs and the discount is priced in by the buyer. By this point, it is too late for any meaningful action. Hoping for an ignorant buyer is equally risky, as it is only a matter of time before buyers deploy modern AI tools to detect issues in sellers’ data.

Creating Value in Data

Managers seeking to improve their businesses, anticipating transactions as either a buyer or a seller, will spend the time and resources necessary to create value from their data. In our more digital world, with the power of modern tools from companies such as PakEnergy, managers look to deploy lower cost processes that save them time and require fewer hands on to deliver.

Contemporary environments are fully cloud-enabled, so that work can be carried out wherever and whenever is appropriate. Documents are digitized and ingested using character recognition tools to save time in data capture. Audit trails provide evidence for compliance with the most rigorous standards of governance. A single source of truth in data, with one-time capture, removes a needless source of uncertainty. Quality control mechanisms are transparent and automatic, eliminating the need for layers of checkers, auditors, and validators.

Matching our collective experience with consumer technology, the interfaces to these data environments are lessons in intuitive user-friendliness. The few manual data inputs are closely controlled using features like pick lists and drop-down menus to minimize the risk of error and promote maximum consistency. Data rules are automatically applied, so that fields like dates, addresses, and names are always consistent. Clever integrations move data nearly instantly and without loss within the firewall and between related third-party systems, eliminating manual touch points, verifications, and monitoring.
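To make the idea of automatically applied data rules concrete, here is a minimal sketch, assuming nothing about any particular product: free-text dates and operator names are normalized to one canonical form at the point of entry. The accepted formats and the alias table are hypothetical examples.

```python
# Illustrative sketch of automatic data rules: normalize incoming
# dates and operator names to one canonical form. The accepted
# formats and the alias table are hypothetical assumptions.
from datetime import datetime

DATE_FORMATS = ["%Y-%m-%d", "%m/%d/%Y", "%d %b %Y"]
OPERATOR_ALIASES = {"acme oil llc": "Acme Oil", "acme oil": "Acme Oil"}

def normalize_date(raw: str) -> str:
    """Parse a date in any accepted format; store as ISO 8601."""
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(raw.strip(), fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    raise ValueError(f"unrecognized date: {raw!r}")

def normalize_operator(raw: str) -> str:
    """Map known aliases to one canonical operator name."""
    return OPERATOR_ALIASES.get(raw.strip().lower(), raw.strip())

print(normalize_date("03/15/2024"))        # 2024-03-15
print(normalize_operator("ACME OIL LLC"))  # Acme Oil
```

Enforcing rules like these at capture time, rather than reconciling after the fact, is what eliminates the downstream layers of checkers and validators the article describes.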

For buyers, the resulting environment allows for far more efficient onboarding of new deals, the ability to do more deals in less time, and a lower overall staff burden. Having such a software platform in place to streamline and reduce that burden is especially key for smaller operators.

Why PakEnergy?

The suite of solutions from PakEnergy is designed to capture and preserve value in the upstream and midstream sectors. These fully modernized tools are fast to deploy, easy to use, and built to preserve and protect asset data. Best of all, these solutions work equally well in new energy fields such as renewables and carbon capture and storage, future-proofing the business.

Conclusions

As the US oil and gas industry consolidates, the role of high-quality data takes center stage in determining transactional value. By investing in modern data management solutions, companies can not only make their assets more attractive for potential deals but also position themselves to efficiently integrate new acquisitions. Better data management is becoming synonymous with value.

Ready to learn more? Check out this exclusive webinar featuring Land professionals from one of the nation’s largest privately-held oil and gas companies, and me as the host. The team walks through their digital transformation journey from system selection criteria to best practices to ensure a seamless integration and meaningful return on investment.


Artwork is by Geoffrey Cann, and cranked out on an iPad using Procreate.
