Markov Chain Generator
Simplifying a complex thing means breaking it down into its parts and looking at each of them. A crank spins a gear whose teeth pull a chain, and what does that chain drive, and so on forever. This is how I need to understand everything.
I was inspired by Irritant, a book by Darby Larson: a 624-page single paragraph generated from roughly 70 words. It felt like a simple version of modern AI, so I wanted to dig deeper.
The math underneath it is called a Markov chain, and it turns out to be one of the foundational ideas behind modern AI. Each word looks back at the ones before it (the n-gram lookback) and asks: what usually follows this? Not what means something. Just what tends to come next.
P(wₙ | wₙ₋₁, wₙ₋₂)
The probability of the next word, given the last two. No memory beyond that window. No understanding of meaning. Frequency, doing an impression of intention.
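Concretely, with made-up counts: if the pair "the cat" shows up four times in the source text, followed by "sat" three times and "ran" once, then P(sat | the, cat) = 3/4 and P(ran | the, cat) = 1/4. The generator just rolls those dice, over and over.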
Every large language model running today is asking the same question — just across billions of parameters instead of a word list. This tool is the simple version. A way to get your hands on the idea and see what's actually happening under the hood.
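Here is a minimal sketch of the idea in Python. This is my own illustrative version, not necessarily the code running the demo below: build a table mapping every pair of consecutive words to the words observed right after it, then walk the table, sampling as you go.

import random
from collections import defaultdict

def build_chain(text, order=2):
    # map each run of `order` words to every word seen right after it
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def generate(chain, order=2, length=120):
    # start at a random state, then repeatedly sample what tends to come next
    state = random.choice(list(chain))
    out = list(state)
    while len(out) < length:
        followers = chain.get(state)
        if not followers:  # dead end: this state only appeared at the very end
            break
        out.append(random.choice(followers))  # duplicates in the list = frequency weighting
        state = tuple(out[-order:])
    return " ".join(out)

seed = "the cat sat on the mat and the cat ran off the mat"
print(generate(build_chain(seed), order=2, length=20))

Keeping duplicates in the follower lists means random.choice already samples by frequency, which is the whole trick. The "chain order" control below corresponds to order here, and "output (words)" to length.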
Try it
seed text: pick one or type your own. More words give a better response.
chain order (lookback): 2
output (words): 120
output: waiting for input...