Summary: We carried out several numerical experiments to analyze how different boundary conditions affect the ability to detect small pipeline leaks. Our method is based on determining the soil temperature gradient above a buried district heating (DH) channel. An equivalent thermal conductivity of wet insulation of <i>λ</i><sub>eq</sub> = 0.5 W/(m·K) was used to mimic a small water leakage. To evaluate the heat loss through the channel cross section, a heat conduction model was used for the pipe insulation, the concrete, and the soil, while a convection model was applied within the channel. The following parameters were varied to simulate different operating conditions: heat convection at the soil surface, leakage only from the supply or only from the return pipe, soil height above the channel, soil thermal conductivity, and pipe diameter. Except for the cases of leakage only from the return pipe and of a low soil thermal conductivity of 0.4 W/(m·K), the results showed a doubling of the soil temperature gradient compared with the no-leakage case. This confirms the potential of the method, which is particularly suitable for leak detection in old pipelines that have priority for renovation. A key added value of this research is that the soil temperature gradient-based leak detection technique was found useful in most foreseeable DH operating situations.
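To make the underlying mechanism concrete, the sketch below reduces the idea to a 1D series-resistance approximation: heat flows from the pipe through the insulation and the soil cover and leaves the surface by convection, and wetting the insulation (raising its conductivity toward <i>λ</i><sub>eq</sub> = 0.5 W/(m·K)) increases the heat flux and hence the soil temperature gradient. This is not the paper's 2D conduction-convection model of the channel cross section; all values other than <i>λ</i><sub>eq</sub> = 0.5 W/(m·K) are illustrative assumptions.

```python
# Minimal 1D thermal-resistance sketch of the detection idea: heat flows from
# the pipe through insulation and soil, then leaves the surface by convection.
# All numbers except lambda_eq = 0.5 W/(m.K) are illustrative assumptions,
# not values taken from the study.

T_PIPE = 80.0        # supply-pipe temperature, deg C (assumed)
T_AIR = 10.0         # ambient air temperature, deg C (assumed)
T_INS = 0.05         # insulation thickness, m (assumed)
H_SOIL = 1.0         # soil cover above the channel, m (assumed)
LAMBDA_SOIL = 1.0    # soil thermal conductivity, W/(m.K) (assumed)
H_CONV = 15.0        # surface convection coefficient, W/(m2.K) (assumed)

def soil_gradient(lambda_ins: float) -> float:
    """Steady 1D heat flux through series resistances, then the resulting
    vertical temperature gradient in the soil from q = -lambda * dT/dy."""
    r_total = T_INS / lambda_ins + H_SOIL / LAMBDA_SOIL + 1.0 / H_CONV
    q = (T_PIPE - T_AIR) / r_total          # heat flux density, W/m2
    return q / LAMBDA_SOIL                  # soil temperature gradient, K/m

g_dry = soil_gradient(0.03)   # intact insulation (assumed dry conductivity)
g_wet = soil_gradient(0.5)    # lambda_eq for wet insulation, per the summary

print(f"dry insulation: {g_dry:.1f} K/m")
print(f"wet insulation: {g_wet:.1f} K/m  (ratio {g_wet / g_dry:.2f}x)")
```

With these assumed values the wet case yields roughly a 2.3-fold increase in the soil temperature gradient, qualitatively consistent with the doubling reported in the summary, although the study's quantitative results come from the full cross-sectional model with surface convection and channel-air convection.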