Trading the Breaking
Alpha Lab

[WITH CODE] Model: Gamma hedging

When volatility spikes and market assumptions crumble, an overlooked error in risk models like gamma hedging can trigger a cascade of financial collapse that shatters entire portfolios

Quant Beckman
Mar 02, 2025

Table of contents:

  1. Introduction.

  2. Theoretical background of the Hessian matrix.

  3. Estimating function behavior locally.

  4. Taming the curvature risk.

  5. Integrating numerical methods into Gamma hedging.


Introduction

It's Saturday and you're playing Super Mario Bros at the speed of light, except the levels aren't Mario's: they're stocks, bonds, or magical unicorn tokens. To win, you need a superpower: math. Algorithmic trading is that game, and math is your cheat code.

Today we will explore three commonly used quantitative tools. While these tools promise to illuminate the hidden dynamics of financial markets, a closer look reveals that when misapplied or overrelied upon, they can lead to catastrophic losses running into thousands of dollars:

  • The Hessian matrix, aka the curvature detective:
    The trading algorithm calibrates a pricing model to market data. The Hessian matrix is computed to understand the curvature of the pricing function, informing risk assessment.

  • Quadratic approximations, aka the math microscope:
    A quadratic approximation provides a local estimate of how small changes in market parameters affect portfolio value. This serves as a basis for real-time risk monitoring.

  • Gamma hedging, aka the sensitivity stabilizer:
    Based on the computed gamma and delta, the algorithm dynamically adjusts its hedges to remain neutral to small price changes; a sketch of the idea follows this list.
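
To make that last bullet concrete, here is a minimal sketch of a delta-gamma neutral hedge, assuming Black-Scholes greeks and a hypothetical book that is short 100 at-the-money calls; the strikes, maturities, and the bs_greeks helper are illustrative, not a production recipe. The idea: first cancel the book's gamma with a liquid option, then cancel the leftover delta with the underlying.

import numpy as np
from scipy.stats import norm

def bs_greeks(S, K, T, r, sigma):
    """Black-Scholes delta and gamma of a European call (illustrative helper)."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    delta = norm.cdf(d1)
    gamma = norm.pdf(d1) / (S * sigma * np.sqrt(T))
    return delta, gamma

S, r, sigma = 100.0, 0.02, 0.20

# Hypothetical book: short 100 calls struck at 100, expiring in 3 months
unit_delta, unit_gamma = bs_greeks(S, 100.0, 0.25, r, sigma)
port_delta = -100 * unit_delta
port_gamma = -100 * unit_gamma

# Liquid hedge instrument: a call struck at 105, expiring in 6 months
hedge_delta, hedge_gamma = bs_greeks(S, 105.0, 0.50, r, sigma)

# Step 1: buy enough hedge options to cancel the book's gamma
n_options = -port_gamma / hedge_gamma

# Step 2: trade the underlying to cancel the remaining delta
n_shares = -(port_delta + n_options * hedge_delta)

print(f"Hedge options: {n_options:.2f}, underlying shares: {n_shares:.2f}")

Because gamma itself drifts as spot, volatility, and time change, these quantities have to be recomputed and rebalanced as the market moves.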

Even at this introductory stage, it’s critical to understand that while these tools provide powerful insights, overreliance on historical data or static approximations can be dangerous.

Theoretical background of the Hessian matrix

Think of a function as a hill. The Hessian matrix tells you whether the hill is a bowl (good for rolling marbles) or a saddle (great for sliding down, but watch your back!).

For a function of n variables

\(f(\theta_1, \theta_2, \dots, \theta_n) \)

the Hessian matrix H(f) is a grid of second derivatives defined as:

\(H(f) = \begin{bmatrix} \frac{\partial^2 f}{\partial \theta_1^2} & \frac{\partial^2 f}{\partial \theta_1 \partial \theta_2} & \cdots & \frac{\partial^2 f}{\partial \theta_1 \partial \theta_n} \\ \frac{\partial^2 f}{\partial \theta_2 \partial \theta_1} & \frac{\partial^2 f}{\partial \theta_2^2} & \cdots & \frac{\partial^2 f}{\partial \theta_2 \partial \theta_n} \\ \vdots & \vdots & \ddots & \vdots \\ \frac{\partial^2 f}{\partial \theta_n \partial \theta_1} & \frac{\partial^2 f}{\partial \theta_n \partial \theta_2} & \cdots & \frac{\partial^2 f}{\partial \theta_n^2} \end{bmatrix} \)

If you’re standing on the hill, the Hessian answers: How curvy is it under your feet?

In trading, f represents an option pricing model or a portfolio risk measure, and the variables θi are market parameters like asset prices, volatility, or interest rates.

Let’s compute the Hessian for a simple two-variable function. Consider

\(f(x,y) = 3x^2 + 2xy + y^2. \)

Its second-order partial derivatives are:

  • \(f_{xx} = 6\)

  • \(f_{xy} = f_{yx} = 2\)

  • \(f_{yy} = 2\)

The code below assembles this constant Hessian, checks its eigenvalues, and plots the level curves of f, like elevation lines on a map:

import numpy as np
import matplotlib.pyplot as plt

# Define the function f(x, y)
def f(x, y):
    return 3 * x**2 + 2 * x * y + y**2

# The Hessian is constant for this quadratic function:
H = np.array([[6, 2],
              [2, 2]])

# Compute eigenvalues of H
eigvals, eigvecs = np.linalg.eig(H)

print("Hessian matrix H:\n", H)
print("Eigenvalues:", eigvals)

# Plotting level curves of f(x,y)
x_vals = np.linspace(-5, 5, 400)
y_vals = np.linspace(-5, 5, 400)
X, Y = np.meshgrid(x_vals, y_vals)
Z = f(X, Y)

plt.figure(figsize=(8, 6))
cs = plt.contour(X, Y, Z, levels=20, cmap='viridis')
plt.clabel(cs, inline=1, fontsize=10)
plt.title("Level curves of f(x, y) = 3x² + 2xy + y²")
plt.xlabel("x")
plt.ylabel("y")
plt.grid(True)
plt.show()

It's like it's trying to hypnotize you! The contour plot shows the level curves of f; the curvature in different directions is implicitly represented by how closely these curves are spaced.


The Hessian here is constant ([[6, 2], [2, 2]]) and both of its eigenvalues are positive, so the hill's shape never changes: it's a perfect bowl!
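
For contrast, the saddle mentioned earlier shows up as eigenvalues of mixed sign. A quick check with f(x, y) = x² − y², whose Hessian is also constant:

import numpy as np

# Hessian of f(x, y) = x^2 - y^2: curves up along x, down along y
H_saddle = np.array([[2.0, 0.0],
                     [0.0, -2.0]])
print("Saddle eigenvalues:", np.linalg.eigvalsh(H_saddle))  # one positive, one negative

# The bowl from above, for comparison: both eigenvalues positive
H_bowl = np.array([[6.0, 2.0],
                   [2.0, 2.0]])
print("Bowl eigenvalues:", np.linalg.eigvalsh(H_bowl))      # about 6.83 and 1.17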

However, even if the Hessian is computed perfectly, pitfalls exist:

  • A Hessian based on historical data can’t capture a sudden change in market dynamics.

  • Eigenvalues computed at today's market point describe the curvature only locally; in non-quadratic models the curvature itself shifts as conditions change, so directions that look benign can hide instabilities.
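
In real models the pricing function is rarely available in closed form, so the Hessian is typically estimated numerically. Here is a minimal sketch using central finite differences on a Black-Scholes call, with spot and volatility as the two parameters; the step size and market inputs below are illustrative assumptions:

import numpy as np
from scipy.stats import norm

def bs_call(S, sigma, K=100.0, T=0.25, r=0.02):
    """Black-Scholes price of a European call, viewed as f(S, sigma)."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def numerical_hessian(f, x, h=1e-3):
    """Central-difference estimate of the Hessian of f at point x."""
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            x_pp = x.copy(); x_pp[i] += h; x_pp[j] += h
            x_pm = x.copy(); x_pm[i] += h; x_pm[j] -= h
            x_mp = x.copy(); x_mp[i] -= h; x_mp[j] += h
            x_mm = x.copy(); x_mm[i] -= h; x_mm[j] -= h
            H[i, j] = (f(*x_pp) - f(*x_pm) - f(*x_mp) + f(*x_mm)) / (4 * h**2)
    return H

x0 = np.array([100.0, 0.20])            # current spot and implied volatility
H = numerical_hessian(bs_call, x0)
print("Numerical Hessian over (S, sigma):\n", H)
print("H[0, 0] is the option's gamma:", H[0, 0])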

Now that we’ve mapped the hill, let’s zoom in with our math microscope!

Estimating function behavior locally

A quadratic approximation is like using a microscope to study a function’s behavior locally. Using a Taylor series expansion, we approximate a function f(θ) near a point θ0 as:

\(f(\theta) \approx f(\theta_0) + \nabla f(\theta_0)^T (\theta - \theta_0) + \frac{1}{2} (\theta - \theta_0)^T H(\theta_0) (\theta - \theta_0). \)

This approximation is crucial for understanding local changes in asset prices or portfolio values. It’s like a tiny microscope helping traders anticipate small price moves.

Let’s illustrate with a simple single-variable function:

\(f(x) = \sin(x) + 0.5x^2. \)

We’ll compute its quadratic approximation at x0=0.

import numpy as np
import matplotlib.pyplot as plt

# Define the function
def f_complex(x):
    return np.sin(x) + 0.5 * x**2

# Derivatives at x0 = 0:
x0 = 0.0
f0 = f_complex(x0)        # f(0) = 0
df0 = np.cos(x0) + x0     # f'(x) = cos(x) + x, so f'(0) = 1
d2f0 = -np.sin(x0) + 1    # f''(x) = -sin(x) + 1, so f''(0) = 1

# Define the quadratic approximation function
def quadratic_approx(x, x0, f0, df0, d2f0):
    return f0 + df0 * (x - x0) + 0.5 * d2f0 * (x - x0)**2

# Generate data for plotting
x_values = np.linspace(-3, 3, 400)
f_values = f_complex(x_values)
quad_values = quadratic_approx(x_values, x0, f0, df0, d2f0)

plt.figure(figsize=(8, 6))
plt.plot(x_values, f_values, label='Original function')
plt.plot(x_values, quad_values, '--', label='Quadratic approximation at x0=0')
plt.xlabel('x')
plt.ylabel('f(x)')
plt.title('Quadratic approximation of f(x) = sin(x) + 0.5x²')
plt.legend()
plt.grid(True)
plt.show()

Near x = 0 the approximation hugs the real function; they’re totally in sync. Far away? It’s like trying to fit a square peg in a round hole.

While quadratic approximations work well locally:

  • They are only accurate for small deviations. If the market moves dramatically, the approximation fails.

  • Higher-order terms become important during large moves, and ignoring them can understate risk; the comparison below makes this concrete.
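
To see both limitations in numbers, here is a minimal sketch comparing the delta-gamma (quadratic) estimate of an option's P&L against full repricing as the underlying moves; the Black-Scholes inputs are hypothetical. Near the current spot the two agree almost exactly, and the gap grows with the size of the move:

import numpy as np
from scipy.stats import norm

def bs_call(S, K=100.0, T=0.25, r=0.02, sigma=0.20):
    """Black-Scholes price of a European call."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

S0 = 100.0
price0 = bs_call(S0)

# Delta and gamma at the current spot
d1 = (np.log(S0 / 100.0) + (0.02 + 0.5 * 0.20**2) * 0.25) / (0.20 * np.sqrt(0.25))
delta = norm.cdf(d1)
gamma = norm.pdf(d1) / (S0 * 0.20 * np.sqrt(0.25))

for dS in [1.0, 5.0, 10.0, 20.0]:
    true_pnl = bs_call(S0 + dS) - price0                 # full repricing
    quad_pnl = delta * dS + 0.5 * gamma * dS**2          # quadratic (delta-gamma) estimate
    print(f"dS={dS:5.1f}: repriced {true_pnl:7.3f} | delta-gamma {quad_pnl:7.3f} "
          f"| gap {true_pnl - quad_pnl:7.3f}")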

Now that we’ve tamed tiny moves, let’s tackle big swings with gamma hedging!

This post is for paid subscribers
