Every Node.js service I've worked on hits the same caching wall, and it always starts the same way. You add an in-memory cache. It's fast. Life is good. Then you scale to multiple instances, and now each server has its own view of the data. Stale reads start showing up in production. So you add Redis. Now all your instances share the same cache. Problem solved, until you realize every single request is paying a network round trip to Redis.
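The first step of that story, the naive in-memory cache, usually looks something like this. A minimal sketch, not any particular library: the class name `SimpleCache` and the lazy-expiry-on-read design are illustrative assumptions, but they capture why this approach is fast (a `Map` lookup, no network hop) and why it breaks with multiple instances (each process holds its own `Map`).

```javascript
// Minimal in-memory TTL cache: the first-pass cache described above.
// Illustrative sketch; SimpleCache is not a real library.
class SimpleCache {
  constructor(ttlMs = 60_000) {
    this.ttlMs = ttlMs;
    this.store = new Map(); // key -> { value, expiresAt }
  }

  get(key) {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      // Expired: evict lazily on read rather than running a sweep timer.
      this.store.delete(key);
      return undefined;
    }
    return entry.value;
  }

  set(key, value) {
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }
}

const cache = new SimpleCache(1000);
cache.set('user:1', { name: 'Ada' });
console.log(cache.get('user:1')); // served from local memory, no network hop
```

Every read here is a local `Map` lookup, which is exactly the speed that makes this so tempting. But because `store` lives inside one process, two instances of the service can hold different values for the same key, which is where the stale reads come from.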