
Magic of Synchronized States: Building a 1 Million Checkbox Real-Time Grid


It was through Twitter that I first stumbled across the absolute chaotic beauty of One Million Checkboxes. I checked several boxes myself during its brief one-week run. Thousands of tiny squares toggled on and off like a living, breathing digital organism. It felt like magic: an invisible, telepathic link connecting strangers across the globe through the most mundane UI element imaginable, the <input type="checkbox">.

Afterwards, my curiosity shifted into overdrive: how do you even build this stuff? I started digging into the architecture, poring over eieio.games' brilliant post-mortem, studying a Golang clone someone shared on Reddit, admiring low-poly renders at grid.lowpoly.dk, and consuming blog posts full of JS performance teardowns and tech stack analyses.

Now, I couldn't foot the server bill for a literal million checkboxes without my wallet bursting into flames; even after optimizing, eieio's bill came to around $800, with roughly 650 million checkbox states pushed out to thousands of users. This is the story of how I built my own clone of One Million Checkboxes with WebSockets, Redis Bitfields, sliding windows, and Docker containers.


The Naive Expectation: Arrays and Sockets

I expected the beginning to be simple: render an array of inputs, add an event listener, and push changes over WebSockets using socket.io. With nodemon running in the background so I didn't lose my mind restarting the server, I wrote a basic event handler.

// The initial naive client-side handler in index.html
socket.on('server:checkbox:change', (data) => {
    console.log('Socket Server Event', data);
    const { index, checked } = data;

    const input = document.getElementById(`checkbox-${index}`);
    if (input) {
        input.checked = checked;
    }
});

The Twist: It worked perfectly... for active events. If two users were on the page at the same time, clicking a box worked. But the moment I opened a fresh tab, the new window was entirely blank.

The Problem: Sockets only transmit what is happening right now. They have zero memory of what has already happened. I had solved the relay issue, but I had a Source of Truth issue. Who actually knows which boxes are checked?

The Insight: State is ephemeral unless anchored. The clients cannot be trusted to hold the state. I realized I needed a centralized memory bank.

I created a simple in-memory state array on the Node backend, pre-filled with false, and exposed an API endpoint so new connections could catch up.
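On the server, that looked roughly like this. A minimal sketch: the `/checkboxes` endpoint path matches the client fetch, but the helper name `applyChange` and the validation details are my illustration, not code lifted from the repo.

```javascript
// Minimal sketch of the in-memory source of truth (helper names illustrative).
const TOTAL_CHECKBOXES = 10000;
const checkboxes = new Array(TOTAL_CHECKBOXES).fill(false);

// Called for every 'client:checkbox:change' socket event before broadcasting.
function applyChange(state, index, checked) {
    if (!Number.isInteger(index) || index < 0 || index >= state.length) {
        return false; // ignore out-of-range or malformed indices
    }
    state[index] = Boolean(checked);
    return true;
}

// Catch-up endpoint for fresh tabs (Express-style):
// app.get('/checkboxes', (req, res) => res.json({ checkboxes }));
```

On the client, the load handler fetches this snapshot once, then relies on socket events for everything after.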

window.addEventListener('load', async () => {
    // Fetch the full state once so a fresh tab starts in sync
    const response = await fetch('/checkboxes');
    const data = await response.json();

    if (data && data.checkboxes) {
        const container = document.getElementById('checkbox-container'); // grid wrapper element
        const fragment = document.createDocumentFragment();

        data.checkboxes.forEach((serverValue, index) => {
            const input = document.createElement('input');
            input.type = 'checkbox';
            input.id = `checkbox-${index}`;
            input.checked = serverValue;

            input.addEventListener('change', (event) => {
                const checked = event.target.checked;
                socket.emit('client:checkbox:change', { index, checked });
            });
            fragment.appendChild(input);
        });
        container.appendChild(fragment);
    }
});

Hitting the Scaling Wall: The Reverse Proxy Revelation

This worked beautifully for one server running on localhost. But as my curiosity drifted toward production, I hit a massive conceptual wall.

While reading a fascinating article on Kinsta about reverse proxies, it hit me: to handle hundreds of users, I would need multiple Node.js instances sitting behind a load balancer.

What happens when hundreds of users connect? One Node.js server cannot handle the load; I needed to run multiple server instances. In other words, horizontal scaling.

But here’s the reality: if Server A holds the memory of the checkboxes, and User 2 connects to Server B, Server B has no idea what Server A knows.

The Problem: isolated memory.
The Solution: extract the brain out of the server.

Enter Valkey (the open-source Redis drop-in).

Instead of Node holding the state, the state had to be externalized, and the servers needed a way to whisper to each other. I spun up a Valkey instance via Docker and wired up a Pub/Sub (Publish/Subscribe) architecture.

services:
  valkey:
    image: valkey/valkey
    ports:
      - "6379:6379"

When a user toggles a box, the Node server updates Valkey, then publishes a message to a Valkey channel. Every other Node server is subscribed to that channel.
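In code, the write path looks something like this sketch. The channel name matches the subscriber, but the key names (`checkboxes`, `stats:total-clicks`) and the injected client objects are assumptions for illustration:

```javascript
// Sketch of the write path: persist the bit, bump the global click counter,
// then publish so every other Node instance relays the change to its clients.
// Key names ('checkboxes', 'stats:total-clicks') are illustrative.
async function handleToggle(redis, publisher, index, checked) {
    // SETBIT stores the checkbox as a single bit inside one Valkey key
    await redis.setbit('checkboxes', index, checked ? 1 : 0);
    const totalClicks = await redis.incr('stats:total-clicks');
    await publisher.publish(
        'internal-server:checkbox:change',
        JSON.stringify({ index, checked, totalClicks })
    );
    return totalClicks;
}
```

Note that the server handling the click also hears its own publish via the subscription, so every instance, including the origin, broadcasts from the same message.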

// The server listening for the Valkey Pub/Sub whisper
await subscriber.subscribe('internal-server:checkbox:change');
subscriber.on('message', (channel, message) => {
    if (channel === 'internal-server:checkbox:change') {
        const { index, checked, totalClicks } = JSON.parse(message);
        io.emit('server:checkbox:change', { index, checked, totalClicks });
    }
});

The Mind-Bending Realization: Servers shouldn't actually know anything. They should act as dumb, lightning-fast messengers passing data between a highly optimized central brain (Valkey) and the client browsers.

To make this even more elegant, I didn't store a massive 10,000-item array in Valkey. I used Redis Bitfields (SETBIT). 10,000 checkboxes stored as literal bits (1 or 0) take up a microscopic 1.25 kilobytes of RAM, the same trick One Million Checkboxes used.
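The arithmetic is easy to sanity-check, and decoding the raw buffer back into booleans takes only a few lines. One gotcha worth a comment: SETBIT treats bit 0 as the most significant bit of the first byte.

```javascript
// 10,000 checkboxes as raw bits: 10000 / 8 = 1250 bytes, i.e. 1.25 KB
const TOTAL = 10000;
const BYTES_NEEDED = Math.ceil(TOTAL / 8);

// Decode the buffer returned by a binary GET of the bitfield key.
// Redis/Valkey order bits MSB-first within each byte.
function bitAt(buffer, index) {
    const byte = buffer[Math.floor(index / 8)];
    return ((byte >> (7 - (index % 8))) & 1) === 1;
}
```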

Human Nature: Rate Limiting and Identity

With the grid alive, I realized humans are inherently chaotic. People will spam click.

Initially, I implemented a strict 2-second timeout per click. It was functional, but brutally restrictive. It ruined the joy of scribbling across the grid. The problem wasn't clicking; the problem was bursts of machine-gun clicking.

The Twist: I ditched the simple timeout and built a windowed rate limiter directly inside Valkey using INCR and EXPIRE, allowing a burst of 3 clicks per 2 seconds.

If they violated the law of the grid, I didn't just drop their connection; I wanted to add some fun. I stopped the socket emission and sent back one of a set of randomly selected, slightly condescending error messages:

const FUNNY_MESSAGES = [
    "yo damn chill out fam, wait 2 secs to continue",
    "Deep breath in, check, deep breath out, check, wait 2 secs to continue",
    "Inhale, check, exhale, check, wait 2 secs to continue",
    "There are other people who want to check this box, wait 2 secs to continue",
    "By the time you read this message, you'll be able to click boxes again.",
    "Error: Impatient human clicker detected. 2 second penalty applied.",
    "One for one, three for three, wait two seconds to continue your spree.",
    "One small step for a man, two big seconds to continue."
];

// ... inside socket.on('client:checkbox:change') ...
const rlKey = `ratelimit:socket:${userId}`;
const clicks = await redis.incr(rlKey);
if (clicks === 1) await redis.expire(rlKey, 2); // start the 2 second window

if (clicks > 3) { // allow a burst of 3 clicks per window
    const randomMsg = FUNNY_MESSAGES[Math.floor(Math.random() * FUNNY_MESSAGES.length)];
    socket.emit('server:error', { data: { error: randomMsg } });
    return;
}
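Strictly speaking, the INCR/EXPIRE counter is a fixed window rather than a true sliding one: the whole allowance resets at once when the key expires. A stricter sliding window can be built on a sorted set, with one timestamped entry per click. This is a hedged sketch under that approach, not the code I deployed; the key naming is illustrative:

```javascript
// Sliding window on a sorted set: each click is stored with its timestamp
// as the score, stale entries are trimmed, and the remainder is counted.
async function allowClick(redis, userId, now, limit = 3, windowMs = 2000) {
    const key = `ratelimit:sliding:${userId}`;
    await redis.zremrangebyscore(key, 0, now - windowMs); // drop old clicks
    const count = await redis.zcard(key);
    if (count >= limit) return false;
    await redis.zadd(key, now, `${now}:${Math.random()}`); // unique member
    return true;
}
```

The trade-off is one extra round trip and slightly more memory per user, in exchange for a limit that never lets a burst straddle two windows.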

To accurately track who was clicking, I integrated OIDC (OpenID Connect) using express-openid-connect. Now, users could log in, establish an identity, and their rate limits (and my newly added global click analytics) were tied securely to their session rather than a spoofable socket ID.
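Wiring that up with express-openid-connect is mostly configuration. A minimal sketch, assuming `app` is the Express app and the environment variables come from your identity provider:

```javascript
const { auth } = require('express-openid-connect');

// Session-backed OIDC middleware; authRequired: false keeps the grid
// viewable without logging in.
app.use(auth({
    authRequired: false,
    secret: process.env.SESSION_SECRET,
    baseURL: process.env.BASE_URL,
    clientID: process.env.OIDC_CLIENT_ID,
    issuerBaseURL: process.env.OIDC_ISSUER,
}));

// Once authenticated, req.oidc.user.sub gives a stable identifier
// to key the rate limiter on instead of the spoofable socket ID.
```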

Escaping Localhost: The Deployment Pivot

The final boss of any project is deployment. I confidently pushed my code to Render, expecting seamless magic. It failed. Render's architecture didn't play nicely with my multi-container Valkey + Node setup on the free tier.

This naturally led me to look for an environment that natively understood containers. Enter Vercel.

Because I had containerized my Valkey instance and Node app using docker-compose, Vercel spun it up instantly. There is a profound satisfaction in realizing that Docker isn't just a buzzword; it's a promise. If it runs on your laptop, it will run exactly the same way in the cloud.

The "Aha!" Paradigm Shift

Looking back at the completed grid, watching the total click counter tick upwards as friends tested the deployed link, I realized something profound.

A checkbox on a screen is just a pixelated illusion. The real magic—the actual system—is the invisible, highly orchestrated flow of bytes. It is the bitwise operations happening deep in Valkey's RAM. It is the Pub/Sub network broadcasting whispers across isolated server instances. It is the sliding windows mathematically calculating human patience.

Technology at this scale is never magic. It is just highly engineered logic. And once you understand the architecture beneath the surface, you don't just see a grid of checkboxes anymore. You see the matrix.