Trading the Breaking

[QUANT LECTURE] Mutual information as evidence of structure

Market Inefficiencies - Information Theoretic Approach

Quant Beckman
Feb 06, 2026 ∙ Paid

Mutual information as evidence of structure

This chapter treats quantitative trading research as a sequence of testable dependence claims, not as an automatic story about why markets move. The goal is to separate what the data actually supports (a conditional advantage linking a feature X to a return R) from what remains underdetermined (any single structural narrative we might want to attach to it). The workflow moves from defining what it means to have information, to checking that claim against the raw data, to identifying when structure appears (state conditions S), and finally to compressing the feature set without introducing leakage or manufacturing evidence through selection.

What’s inside:

  1. What mutual information is for in trading research. Mutual information is used to detect and quantify dependence without committing to a particular functional form (a minimal estimation sketch follows this list).

  2. Separating dependence from functional form. Different decision rules can exploit the same underlying dependence. Dependence is the object; the specific rule is only a vehicle.

  3. Quantifying the strategy’s implied information. Realized performance can be read as an information claim per decision (see the hit-rate example after the list).

  4. Orthogonality to utility. Dependence is not monetization. A pattern can be real and still fail to generate PnL under a given execution scheme.

  5. Dependence is the object, not the model. This reverses the usual pipeline where a backtest creates belief and justification is constructed afterward.

  6. Information is not edge. The section distinguishes layers: (i) existence of dependence, (ii) robustness under refuters, (iii) translation into executable edge.

  7. Incremental information. The practical question is whether X adds anything beyond a baseline information set Z that is already available, already priced, or already captured by your stack (see the conditional-MI sketch after the list).

  8. Conditional dependence, sample size, and non-circular boundaries. Dependence is not uniform; it often concentrates when the market enters a configuration S.

  9. Why feature sets behave differently than single features. Weak variables can become informative in combination (gating/synergy), and apparently strong variables can become redundant once others are known.

  10. Dependence can move across time scales. A pattern can exist at one horizon and disappear at another, or shift with regime.

  11. The smallest information set that preserves structure. Compression is an active process: reduce while verifying that the evidence you care about remains intact and that the compression introduces no leakage.

  12. Mutual information is a target, not a number you trust by default. I(X;R) depends on estimation choices, regularization, and temporal dependence (a circular-shift null sketch closes the examples below).
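
To ground item 1, here is a minimal, illustrative sketch of a model-free dependence check on synthetic data. It uses scikit-learn's k-NN mutual information estimator; the variable names and the data-generating process are assumptions for illustration, not the chapter's setup.

```python
# Minimal sketch for item 1: estimate I(X; R) between a feature and the
# next-period return without assuming a functional form. The data here are
# synthetic; x and r stand in for an aligned feature / forward-return pair.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
n = 5_000

# Even (sign-free) dependence: Pearson correlation is ~0, yet R depends on X.
x = rng.normal(size=n)
r = 0.2 * (x**2 - 1.0) + rng.normal(scale=0.5, size=n)

# k-NN (Kraskov-style) estimator; scikit-learn returns nats, convert to bits.
mi_bits = mutual_info_regression(x.reshape(-1, 1), r,
                                 n_neighbors=5, random_state=0)[0] / np.log(2)
print(f"Pearson corr: {np.corrcoef(x, r)[0, 1]:+.3f}")
print(f"estimated I(X;R): {mi_bits:.3f} bits")
```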
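
For item 3, one simple reading of realized performance as an information claim: under the simplifying assumptions of a balanced binary direction and a symmetric hit rate p, each call carries roughly 1 - H_b(p) bits, where H_b is the binary entropy.

```python
# Sketch for item 3: translate a directional hit rate into an implied
# information rate per decision, assuming a balanced binary outcome and a
# symmetric hit rate p. Under those assumptions, I = 1 - H_b(p) bits per call.
import numpy as np

def binary_entropy(p: float) -> float:
    """Binary entropy H_b(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

for hit_rate in (0.51, 0.53, 0.55, 0.60):
    implied_bits = 1.0 - binary_entropy(hit_rate)
    print(f"hit rate {hit_rate:.2f} -> ~{implied_bits:.4f} bits per decision")
```

Even a 55% hit rate implies only about 0.007 bits per decision, which is one way to see why weak edges need many independent decisions before they register as evidence.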
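
For item 7 (and the state conditioning in item 8), a rough way to estimate incremental information is to discretize the baseline Z into quantile bins and average per-bin MI, i.e. I(X;R|Z) ≈ Σ_z p(z) I(X;R|Z=z). Everything below (variable names, bin counts, the synthetic redundancy between x and z) is an illustrative assumption.

```python
# Illustrative sketch for item 7: incremental information of X beyond a
# baseline Z, via I(X;R|Z) ~= sum_z p(z) * I(X;R | Z in bin z) after
# discretizing Z into quantile bins. Binning Z is a coarse approximation;
# finer bins trade bias for variance.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

def mi_bits(x, r, seed=0):
    """k-NN estimate of I(X;R) in bits for 1-D x and r."""
    return mutual_info_regression(x.reshape(-1, 1), r,
                                  n_neighbors=5, random_state=seed)[0] / np.log(2)

def conditional_mi_bits(x, r, z, n_bins=8, min_samples=300):
    """Estimate I(X;R|Z) in bits by averaging per-bin MI over quantile bins of Z."""
    edges = np.quantile(z, np.linspace(0.0, 1.0, n_bins + 1))
    bins = np.clip(np.searchsorted(edges, z, side="right") - 1, 0, n_bins - 1)
    total = 0.0
    for b in range(n_bins):
        mask = bins == b
        if mask.sum() < min_samples:          # skip bins too thin to estimate
            continue
        total += (mask.sum() / len(x)) * mi_bits(x[mask], r[mask])
    return total

rng = np.random.default_rng(1)
n = 20_000
z = rng.normal(size=n)                        # baseline already in the stack
x = 0.8 * z + 0.6 * rng.normal(size=n)        # candidate feature, largely redundant
r = 0.3 * z + 0.5 * rng.normal(size=n)        # "returns" driven by z, not by x's own noise

print(f"I(X;R)   ~ {mi_bits(x, r):.3f} bits   (looks informative on its own)")
print(f"I(X;R|Z) ~ {conditional_mi_bits(x, r, z):.3f} bits   (little left once Z is known)")
```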
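
For item 12, one standard refuter is to compare the observed estimate against a circular-shift null that preserves each series' own temporal structure while breaking the alignment between them. The shift scheme and synthetic data below are illustrative, not the chapter's protocol.

```python
# Illustrative sketch for item 12: calibrate an MI estimate against a
# circular-shift null. Shifting x by a large random offset (a multiple of a
# block length) keeps its autocorrelation but destroys any real alignment
# with r, so the null distribution shows what the estimator reports on
# structureless data of the same shape.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

def mi_bits(x, r, seed=0):
    return mutual_info_regression(x.reshape(-1, 1), r,
                                  n_neighbors=5, random_state=seed)[0] / np.log(2)

def circular_shift_null(x, r, n_perm=200, block=50, seed=0):
    """Null MI distribution from circular shifts of x by random multiples of `block`."""
    rng = np.random.default_rng(seed)
    shifts = rng.integers(1, len(x) // block, size=n_perm) * block
    return np.array([mi_bits(np.roll(x, int(s)), r, seed=i)
                     for i, s in enumerate(shifts)])

rng = np.random.default_rng(2)
n = 2_000
x = rng.normal(size=n)
r = 0.25 * np.tanh(2 * x) + rng.normal(scale=0.5, size=n)

observed = mi_bits(x, r)
null = circular_shift_null(x, r)
p_value = (np.sum(null >= observed) + 1) / (len(null) + 1)
print(f"observed I(X;R): {observed:.3f} bits")
print(f"null 95th pct:   {np.quantile(null, 0.95):.3f} bits, p ~ {p_value:.3f}")
```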

Check out a sample of what you will find inside:

Sample (PDF, 1.94 MB) ∙ Download

This post is for paid subscribers
