I sometimes get pushback on why we need to collect data every second of every minute from each well pad. At Project Canary, we focus on continuous methane measurement for a simple reason: it’s the only way to get reliable data about the size of methane emissions across a wide array of facilities, and to act on mitigation, deploying people and resources, in real time. Most economic studies show that methane detection technology can be cost-neutral or even cost-positive, despite the perception that high-fidelity sensors add cost. But saving our customers money is actually an ancillary benefit of real-time monitoring: simply put, continuous methane measurement catches more leaks. Here’s why I’ve come to this conclusion.
There are two critical parameters for leak detection:
- Frequency in time: how many measurements are collected over a given period.
- Detection limit: the smallest leak size that it’s possible to see.

Frequency is important because:
- most leaks are intermittent,
- small leaks become big leaks, and
- more frequent measurements build trust in the supply chain (more on this later).
A third commonly discussed parameter for methane measurements is spatial resolution, or coverage: how much area a single measurement can cover. Because most oil and gas fields or landfills are heterogeneous, covering a large area with a single measurement is not helpful for operations teams who need to find and fix issues.
Instead, high-speed spatial coverage, allowing you to pinpoint the location of a leak on an oilfield or site, is needed. So, when most folks talk about spatial coverage, what they really mean is not spatial resolution but the speed of coverage.
We agree that these two related factors, spatial resolution and speed of coverage, can be helpful. Still, we don’t count them among our two critical detection parameters, because coverage speed ultimately shows up in the detection limit. Speed of coverage can boost the number of measurements for a given sensor, but for aerial measurements there is a necessary trade-off between speed of coverage and spatial resolution. The physics of measuring methane, and orbital mechanics, drive this trade-off.
One standard way to measure gases remotely is infrared absorption spectroscopy, which measures reflected infrared sunlight to quantify the concentration of a particular gas that absorbs infrared light. Another, more recent approach is LiDAR (you may have heard of this technology from self-driving cars). Rather than measuring reflected sunlight, it sends out its own laser pulse and then measures the amount of light that comes back to a detector.
In both cases, the farther away these measurements are made, the worse the spatial resolution (the larger the pixel size) and the higher the detection limit. That is, from far away you can only see a big leak; from close up, you can see both big and small leaks. LiDAR gets a bit of an advantage here because there’s more energy to work with: a reflected laser pulse returns more information than reflected sunlight, feeding the algorithms a stronger signal. Distance does buy you frequency, since a farther-out satellite covers ground faster, just as switching from a slower drone to a faster plane does, but that speed comes at the cost of resolution and detection limit. While there is a significant focus on “fixing the big leaks fast,” at Project Canary, we think it pays to measure all leaks all the time. Why?
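To make the distance trade-off concrete, here is a toy back-of-envelope sketch. The platforms, optics, and threshold numbers below are illustrative assumptions, not the specs of any real instrument: for a simple imager, the ground footprint of one pixel grows linearly with distance, and under a fixed column-enhancement threshold, the smallest detectable leak grows with the pixel.

```python
def ground_sample_distance(altitude_m, pixel_pitch_m, focal_length_m):
    """Ground footprint of one detector pixel for a simple imager.

    Similar triangles: footprint / altitude = pixel pitch / focal length.
    """
    return altitude_m * pixel_pitch_m / focal_length_m


def min_detectable_rate(gsd_m, threshold_per_m=1.0):
    """Toy model (assumption): with a fixed column-enhancement
    threshold, a plume diluted over a larger pixel must be bigger
    to register, so the detection limit scales with pixel size."""
    return threshold_per_m * gsd_m


# Hypothetical platforms sharing the same 15-micron sensor pitch:
drone = ground_sample_distance(120, 15e-6, 0.05)          # ~0.036 m pixels
plane = ground_sample_distance(3_000, 15e-6, 0.05)        # ~0.9 m pixels
satellite = ground_sample_distance(500_000, 15e-6, 0.25)  # ~30 m pixels

print(min_detectable_rate(satellite) / min_detectable_rate(drone))
```

Under this deliberately simplified model, the satellite pixel is nearly a thousand times larger than the drone pixel, so the smallest leak it can resolve is correspondingly larger; that is the trade-off described above.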
First, big leaks can pop up anywhere. Independent field studies show that leaks follow the 80/20 rule: 80% of emissions come from 20% of sources, which is often given as a reason to scan only for big leaks. One might conclude from this that 80% of emissions come from a fixed 20% of sites, but this isn’t the case. The rule was discovered by leak surveyors visiting individual sites over the course of a few days, and when they revisit those sites, they find that the high-emitting sites are not the same ones as before.
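A minimal Monte Carlo sketch of that revisit finding, under the assumption (hypothetical numbers, for illustration only) that high-emitter status is transient and effectively re-drawn between surveys:

```python
import random

random.seed(0)
N_SITES = 10_000


def high_emitters(frac=0.2):
    """One survey's top-emitting sites, drawn at random.

    Assumption: which sites are high emitters re-rolls between
    visits, rather than being a fixed property of the site.
    """
    return set(random.sample(range(N_SITES), int(frac * N_SITES)))


survey_1 = high_emitters()
survey_2 = high_emitters()

# Fraction of survey 1's high emitters still high in survey 2:
persistence = len(survey_1 & survey_2) / len(survey_1)
print(f"{persistence:.0%} of last survey's top emitters are still top emitters")
```

If the draws are independent, only about 20% of one survey’s top emitters are still top emitters the next time, so a strategy that only rescans last survey’s worst sites misses most of the next survey’s big leaks.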
Second, most leaks are small and intermittent. While most leak volume will come from your largest leaks, most leaks by count are actually small, and these add up over time. Additionally, many major leak sources are intermittent. For example, a dump valve may only operate when a tank needs to “dump”; if this equipment type leaks, the leak is only observable intermittently.
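A back-of-envelope comparison shows why intermittency favors continuous measurement. The duty cycle below is an assumption, and treating observations as independent snapshots is a simplification, but the gap it exposes is the point:

```python
p_active = 0.10              # assumption: leak is venting 10% of the time
surveys_per_year = 12        # e.g., monthly snapshot surveys
samples_per_year = 24 * 365  # continuous monitor taking hourly readings

# Chance that at least one observation catches the leak in a year,
# treating each observation as an independent snapshot (simplified).
p_detect_snapshots = 1 - (1 - p_active) ** surveys_per_year
p_detect_continuous = 1 - (1 - p_active) ** samples_per_year

print(f"monthly surveys: {p_detect_snapshots:.1%}")  # ~72%
print(f"continuous:      {p_detect_continuous:.1%}")
```

Under these assumptions, a monthly snapshot program misses this intermittent leak roughly one year in four, while a continuous monitor is all but certain to catch it.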
Third, small leaks grow into big leaks. I was once at a field office where a valve on processing equipment had a leak that triggered a major gas release. When the team investigated, they found that the leak was initially small, but it continued, and because of weather conditions, gas eventually pooled and concentrations reached near-explosive limits. An existing monitoring alarm captured these high levels. But that alarm can only measure explosive limits, and it triggered an emergency release, venting several thousand pounds of gas into the air. Had a continuous monitor been placed at the site, operators told me, they would have caught the leak when it was still small.
Fourth, and maybe most importantly, real-time monitoring ensures trust in the supply chain and gives policy and decision-makers confidence. This trust has tremendous consequences: the end buyers of oil and gas make decisions every day that determine the future of our energy and petrochemical production. Decision-makers I speak to are pressed every day about sustainability and costs by end consumers, users, and constituents. In the face of insufficient data, they’re increasingly assuming the worst and making decisions accordingly.
At Project Canary, we believe that continuous monitoring is the key to catching leaks of all sizes. We believe that every molecule counts in the pursuit of responsibly sourcing natural resources. Driving real-time, granular data-based measurement across the energy value chain means detecting more leaks. Every molecule makes a difference.