What You'll Build

A character-level language model that predicts the next character based only on the current character. No neural network, no gradients, just counting. A "bigram" is a pair of consecutive tokens (like 'e' followed by 'm'), so a bigram model is one that learns which pairs occur most often.

Notice that this model uses plain double arrays, not Value objects. That's because there's no gradient-based training here: the model is fit purely by counting, so automatic differentiation has nothing to do.
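As a rough sketch of the idea, the whole model can fit in a few dozen lines: a 2-D array of counts indexed by (current character, next character), a training pass that tallies consecutive pairs, and a prediction that picks the most frequent follower. The class and method names below are illustrative, not taken from the original code, and it assumes ASCII input for simplicity.

```java
// Minimal bigram character model: count consecutive character pairs,
// then predict the most frequent follower of a given character.
public class Bigram {
    static final int VOCAB = 128;  // ASCII character set (assumption for this sketch)
    final double[][] counts = new double[VOCAB][VOCAB];

    // Tally each consecutive character pair in the training text.
    void train(String text) {
        for (int i = 0; i + 1 < text.length(); i++) {
            counts[text.charAt(i)][text.charAt(i + 1)] += 1.0;
        }
    }

    // Return the character that most often followed c in the training text.
    char predictNext(char c) {
        double[] row = counts[c];
        int best = 0;
        for (int j = 1; j < VOCAB; j++) {
            if (row[j] > row[best]) best = j;
        }
        return (char) best;
    }

    public static void main(String[] args) {
        Bigram model = new Bigram();
        model.train("hello hello hello help");
        System.out.println(model.predictNext('h')); // 'e' follows 'h' most often here
    }
}
```

The counts are stored as doubles rather than ints so the same table can later be normalized row-by-row into probabilities without a separate conversion step.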