The Kullback-Leibler divergence, a fundamental concept in information theory and statistics, provides a useful framework for quantifying the difference between two probability distributions. In the field of behavioral analysis, comparing distributions between different operant schedules is paramount for gaining insights into various phenomena, from decision-making processes to learning dynamics. By capturing the relative entropy between distributions, the Kullback-Leibler divergence offers a nuanced understanding of behavioral patterns.
Whether investigating the effectiveness of interventions, assessing the impact of environmental variables, or examining the fidelity of computational models to empirical data, the Kullback-Leibler divergence serves as a versatile tool for comparing distributions and evaluating hypotheses. Its application extends across diverse domains within behavioral science, offering researchers a robust methodology for discerning patterns, elucidating underlying mechanisms, and refining theoretical frameworks.
The Kullback-Leibler divergence formula for continuous variables can be expressed as:
\[ D_{KL}(P \| Q) = \int_{-\infty}^{\infty} p(x) \log \frac{p(x)}{q(x)} \, dx \]
where \(p(x)\) and \(q(x)\) denote the probability density functions of \(P\) and \(Q\), respectively.
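For reference, when both distributions are Gaussian the divergence has a closed form, which is useful for sanity-checking numerical estimates:

\[ D_{KL}\left(\mathcal{N}(\mu_1, \sigma_1^2) \,\|\, \mathcal{N}(\mu_2, \sigma_2^2)\right) = \log \frac{\sigma_2}{\sigma_1} + \frac{\sigma_1^2 + (\mu_1 - \mu_2)^2}{2 \sigma_2^2} - \frac{1}{2} \]

Note that the expression is not symmetric in the two distributions, which is why \(D_{KL}(P \| Q)\) and \(D_{KL}(Q \| P)\) generally differ.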
Here we implement the KL_div() function, which takes two numeric distributions (\(P\) and \(Q\)) and estimates the divergence between them by numerical integration.
The function takes the following parameters:
- `x` a numeric vector for the first distribution.
- `y` a numeric vector for the second distribution.
- `from_a` a numeric value for the lower limit of the integration.
- `to_b` a numeric value for the upper limit of the integration.

First let's generate and plot two random normal distributions with 100 elements and calculate the KL divergence between them:
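The following is a minimal sketch of that first example. The helper `kl_div_sketch()` is a hypothetical stand-in for `YEAB::KL_div()`, written under the assumption that the divergence is estimated by kernel density estimation on a shared grid followed by numerical integration of \(p(x) \log(p(x)/q(x))\); the actual internals of `KL_div()` may differ.

```r
set.seed(123)

# Two random normal samples of 100 elements each
p_sample <- rnorm(100, mean = 0, sd = 1)
q_sample <- rnorm(100, mean = 1, sd = 2)

# Plot the two estimated densities for visual comparison
plot(density(p_sample), main = "Two random normal samples")
lines(density(q_sample), lty = 2)

# Hypothetical stand-in for YEAB::KL_div(): estimate both densities on a
# common grid, then approximate the integral of p(x) * log(p(x)/q(x))
# with a Riemann sum restricted to [from_a, to_b].
kl_div_sketch <- function(x, y, from_a, to_b) {
  grid <- range(c(x, y))
  p_dens <- density(x, from = grid[1], to = grid[2], n = 512)
  q_dens <- density(y, from = grid[1], to = grid[2], n = 512)
  # Keep grid points inside the integration limits where both densities
  # are positive, so the log-ratio is defined
  keep <- p_dens$x >= from_a & p_dens$x <= to_b &
    p_dens$y > 0 & q_dens$y > 0
  dx <- diff(p_dens$x[1:2])
  sum(p_dens$y[keep] * log(p_dens$y[keep] / q_dens$y[keep])) * dx
}

kl_div_sketch(p_sample, q_sample, -Inf, Inf)
```

Because the samples differ in both mean and spread, the estimate should be clearly positive; feeding the same sample twice should return (numerically) zero, which is a quick way to check any density-based KL estimator.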

Now let's use real data from two different subjects' performance during 5 Peak Interval sessions:
```r
data("gauss_example_1", package = "YEAB", envir = environment())
data("gauss_example_2", package = "YEAB", envir = environment())
P <- gauss_example_1
Q <- gauss_example_2
DKL_real <- KL_div(P$Response_Average, Q$Response_Average, -Inf, Inf)
print(DKL_real)
## [1] 0.237648
```