Probability And Statistics 2 May 2026

They ran a Gibbs sampler (a type of MCMC) overnight. By dawn, the chains had converged. The posterior distribution revealed that the Drift switched states every 3.2 days on average. Now they could build a real-time predictor. For the next hour’s Drift speed, they used a Kalman filter, a recursive algorithm that updates predictions as new data arrives.
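The recursive predict-then-update loop can be sketched as a one-dimensional Kalman filter. This is an illustrative sketch, not the guild's actual model: the random-walk state, the process and observation variances, and the name `kalman_1d` are all assumptions for the example.

```python
import numpy as np

def kalman_1d(observations, process_var=1.0, obs_var=2.0):
    """Recursive 1D Kalman filter: random-walk state, noisy observations."""
    mean, var = 0.0, 1e6  # start with a vague belief about the state
    means = []
    for z in observations:
        # Predict: the state drifts, so uncertainty grows by the process variance.
        var += process_var
        # Update: blend the prediction with the new observation.
        k = var / (var + obs_var)        # Kalman gain
        mean = mean + k * (z - mean)
        var = (1 - k) * var
        means.append(mean)
    return np.array(means)

rng = np.random.default_rng(0)
truth = np.cumsum(rng.normal(0.0, 1.0, 200))        # the true drifting quantity
obs = truth + rng.normal(0.0, np.sqrt(2.0), 200)    # noisy measurements of it
est = kalman_1d(obs)
print(np.mean(np.abs(est - truth)), np.mean(np.abs(obs - truth)))
```

Because each step only needs the previous mean and variance plus the newest observation, the filter runs in constant memory, which is what makes it suitable for real-time prediction.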

A debate ensued. Elara stepped in. “In Stat 1, you compare point estimates. In Stat 2, you compare entire distributions of belief.”
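One way to make "comparing distributions of belief" concrete is to draw from two posteriors and ask how often one exceeds the other. A minimal sketch, with hypothetical Beta posteriors and made-up success counts purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Two guilds' beliefs about the same rate, as Beta posteriors.
# The counts (30/50 successes vs 45/90) are invented for this example.
a = rng.beta(1 + 30, 1 + 20, 100_000)   # draws from guild A's posterior
b = rng.beta(1 + 45, 1 + 45, 100_000)   # draws from guild B's posterior

# Stat 1 would compare the point estimates (0.60 vs 0.50).
# Stat 2 compares the whole distributions, e.g. P(theta_A > theta_B).
print((a > b).mean())
```

The answer is a probability rather than a verdict: the distributions overlap, so neither guild can claim certainty from point estimates alone.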

This was the key. They stopped using a single normal distribution and started using a mixture model. They realized the daily catch was a mixture of two regimes: calm days (low variance) and stormy days (high variance). Stat 2 gave them Expectation-Maximization to figure out, from past data, which days were which.

The Convergence of Opinions

A rival guild from the mountains arrived, claiming their own model was superior. Both guilds had different prior beliefs about the Drift’s behavior. The mountain guild thought the Drift was periodic (tides). The coastal guild thought it was a random walk.
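The calm/stormy mixture described above can be fit with a hand-rolled EM loop. A minimal sketch, assuming two Gaussian components; the regime means, variances, and sample sizes are invented for the example:

```python
import numpy as np

def em_two_gaussians(x, n_iter=200):
    """EM for a 2-component Gaussian mixture over 1D data."""
    # Crude initialisation: put the component means at the data extremes.
    w = np.array([0.5, 0.5])
    mu = np.array([x.min(), x.max()])
    var = np.array([x.var(), x.var()])
    for _ in range(n_iter):
        # E-step: each component's responsibility for each point.
        dens = np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        r = w * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances from responsibilities.
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

rng = np.random.default_rng(1)
calm = rng.normal(100.0, 5.0, 700)    # calm days: low variance
storm = rng.normal(60.0, 25.0, 300)   # stormy days: high variance
catch = np.concatenate([calm, storm])
w, mu, var = em_two_gaussians(catch)
print(w, mu, np.sqrt(var))
```

The responsibilities from the E-step are exactly the "which days were which" answer: a posterior probability, per day, of belonging to the stormy regime.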

The Kalman filter, now robustified, predicted the Drift would reverse direction in 20 minutes. The fleet turned back. The mountain guild, still using their old periodic model, sailed into the surge. They survived, but their nets were shredded. That night, Elara addressed the city.

The city’s sage, Elara, had studied probability and statistics.

The Random Walk to Nowhere

Elara began by modeling a single fishing boat’s position over time. In Stat 1, you’d say: the boat’s position after t hours is normally distributed with mean 0 and variance tσ². But Elara knew better. The Drift meant each step’s variance was random itself.
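The difference between a fixed step variance and a random one shows up in the tails. A minimal simulation sketch, assuming (purely for illustration) that the random variance is exponentially distributed with mean σ²:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
sigma = 1.0

# Stat 1 step: fixed variance sigma^2.
fixed = rng.normal(0.0, sigma, n)

# Elara's step: the variance itself is random (illustrative assumption:
# exponential with mean sigma^2, which makes the step Laplace-distributed).
v = rng.exponential(sigma**2, n)
random_var = rng.normal(0.0, np.sqrt(v))

# Both have the same average variance...
print(fixed.var(), random_var.var())
# ...but the variance-mixture steps land beyond 3 sigma far more often.
print((np.abs(fixed) > 3).mean(), (np.abs(random_var) > 3).mean())
```

So two models can agree on the variance tσ² of the position yet disagree badly about extreme excursions, which is exactly where a fleet gets into trouble.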