The Effect of Ohmic Cable Losses on Time-Domain Reflectometry Measurements of Electrical Conductivity
P. Castiglione* and P. J. Shouse
In time-domain reflectometry (TDR), the ohmic resistance of the sample medium is related to the amplitude of the reflected signal at long times, once all multiple reflections have died out and a steady state is reached. This relationship, which permits measurement of the sample's electrical conductivity, is exact only when no signal dissipation occurs outside the sample. To account for signal attenuation along the coaxial cable, the sample and cable are generally modeled as two resistors in series. In this work we review the fundamentals of transmission-line theory and demonstrate, both theoretically and experimentally, that this series-resistor formulation is incorrect. We propose a simple new procedure for the analysis of TDR signals based on a difference reflection method, in which the reflection coefficients are scaled with respect to one or more standards of known conductivity. The procedure was tested on 16 CaCl2 solutions, using two different TDR probes and two cable lengths. The experimental results are in excellent agreement with the theory.
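The scaling idea behind the difference reflection method can be illustrated with a short sketch. In the standard Giese-Tiemann analysis, the long-time (steady-state) reflection coefficient rho_inf maps to a load conductance proportional to (1 - rho_inf)/(1 + rho_inf); scaling this quantity against a single standard of known conductivity cancels the probe cell constant and cable characteristic impedance. The function names and numerical values below are illustrative assumptions, not the paper's actual procedure or data.

```python
def giese_tiemann(rho_inf):
    """Load-conductance factor (1 - rho)/(1 + rho) computed from the
    long-time reflection coefficient rho_inf (dimensionless, -1 < rho_inf < 1)."""
    return (1.0 - rho_inf) / (1.0 + rho_inf)

def conductivity_from_standard(rho_sample, rho_standard, sigma_standard):
    """Estimate sample conductivity (S/m) by scaling the sample's
    conductance factor against one standard of known conductivity,
    so probe and cable constants cancel in the ratio."""
    return sigma_standard * giese_tiemann(rho_sample) / giese_tiemann(rho_standard)

# Illustrative values only (not measurements from the paper):
sigma = conductivity_from_standard(rho_sample=0.5, rho_standard=0.2,
                                   sigma_standard=0.1)
print(f"estimated conductivity: {sigma:.4f} S/m")
```

A higher long-time reflection coefficient corresponds to a lower sample conductivity; a probe calibrated with several standards, as in the paper, would replace this one-point scaling with a fit over the set of standards.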
Copyright © 2003.