Aloe Wright / Essays

Notes for building outside the feed.

Two pieces moved out of rented publishing space and into a static page I control: one about Cloudflare-native video infrastructure, one about machines, labor, and ownership.

01

May 4, 2026

Building Spooool as a Central Cmd for Video

A YouTube alternative on Cloudflare-only infrastructure

Spooool is a video host where every piece of the runtime lives on Cloudflare: frontend, API, storage, encoding pipeline, rate limiting, fan-out. No origin server, no Postgres, no Redis, no S3. One wrangler deploy is the whole release.

A diversified content infrastructure has become more important of late: algorithmic censorship, AI-driven account-level enforcement errors, and the difficulty of ever reaching customer support can each cut a creator off from their audience overnight. Cloudflare’s platform lends itself to building a central command for streaming and publishing video content to various platforms.

My Solution

Let’s walk through how the interesting parts actually work, because most “build X on Cloudflare” posts stop at a Worker that returns JSON. The interesting bits here are the seams: chunked uploads to R2, a queue-driven encoding handoff to Stream, a Durable Object per channel for subscriber fan-out, and a token-bucket rate limiter that’s also a DO.

The bindings, in one place

Everything starts in wrangler.toml. The platform surface is the architecture:

[[r2_buckets]]            binding = "VIDEOS"
[[d1_databases]]          binding = "DB"
[[kv_namespaces]]         binding = "CACHE"      # hot reads
[[kv_namespaces]]         binding = "SESSIONS"   # multipart upload state
[[queues.producers]]      binding = "VIDEO_ENCODING"
[[analytics_engine_datasets]] binding = "ANALYTICS"
[[durable_objects.bindings]]  name = "CHANNEL_SUBSCRIBER_DO"
[[durable_objects.bindings]]  name = "RATE_LIMITER"
[triggers] crons = ["0 2 * * *"]   # GDPR sweep

A single Worker (src/workers/index.ts) owns fetch, queue, and scheduled handlers and routes via Hono. Static assets are served from [assets] with run_worker_first for /api/*, /watch/* (so OG tags can be injected via HTMLRewriter), and SEO endpoints.
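As a sketch of that shape (binding and type names here are assumptions; the Hono router is elided to a stub), one script exporting all three event handlers looks roughly like this:

```typescript
// Hypothetical sketch: one Worker script, three event types. Binding names
// mirror the wrangler.toml above; EnvBindings and the stub bodies are
// illustrative, not the real implementation.
interface EnvBindings {
  VIDEOS: unknown; // R2 bucket
  DB: unknown; // D1 database
  CACHE: unknown; // KV namespace (hot reads)
  VIDEO_ENCODING: { send(msg: unknown): Promise<void> }; // queue producer
}

type MessageBatch<T> = {
  messages: { body: T; ack(): void; retry(): void }[];
};

const worker = {
  // HTTP: API routes, /watch/* OG-tag injection, SEO endpoints (Hono in the real code)
  async fetch(_request: Request, _env: EnvBindings): Promise<Response> {
    return new Response("routed by Hono in src/workers/index.ts");
  },
  // Queue consumer: encoding handoff, same script as the producer
  async queue(batch: MessageBatch<{ videoId: string; r2Key: string }>, _env: EnvBindings) {
    for (const message of batch.messages) message.ack();
  },
  // Cron: nightly GDPR sweep
  async scheduled(
    _event: { cron: string },
    _env: EnvBindings,
    ctx: { waitUntil(p: Promise<unknown>): void },
  ) {
    ctx.waitUntil(Promise.resolve());
  },
};
// In the real file this object is the module's default export.
```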

Chunked uploads straight to R2 from the Worker

The non-obvious bit: spooool doesn’t use pre-signed URLs.

Chunks are POSTed as multipart/form-data to the Worker, which proxies into R2’s multipart upload API. State for an in-progress upload lives in KV (SESSIONS), not D1 — there’s no row until the upload completes.

The flow in src/workers/videos.ts:

// chunkIndex === 0: open the multipart upload, stash uploadId in KV
const multipart = await env.VIDEOS.createMultipartUpload(r2Key, {
  httpMetadata: { contentType: rawFile.type },
});
const firstPart = await multipart.uploadPart(1, rawFile.stream());
await env.SESSIONS.put(mpidKey, multipart.uploadId, { expirationTtl: 86400 });
// chunkIndex > 0: resume and upload by part number
const multipart = env.VIDEOS.resumeMultipartUpload(uploadMeta.r2Key, multipartUploadId);
const uploadedPart = await multipart.uploadPart(chunkIndex + 1, rawFile.stream());
// last chunk: complete and enqueue encoding
await multipart.complete(completedParts);
await env.VIDEO_ENCODING.send({ videoId, r2Key });

Three KV keys per session — :mpid, :meta, :parts — all 24h TTL’d so abandoned uploads garbage-collect themselves. Cumulative byte counting on each chunk enforces MAX_VIDEO_BYTES without reading R2.
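The bookkeeping can be sketched as pure logic. The key suffixes follow the post; the helper names and the size-cap value are illustrative assumptions:

```typescript
// Hypothetical sketch of the per-session KV layout and the cumulative byte
// check. MAX_VIDEO_BYTES' value is assumed; the real cap may differ.
const MAX_VIDEO_BYTES = 2 * 1024 * 1024 * 1024; // assumed 2 GiB cap
const SESSION_TTL_SECONDS = 86_400; // 24h, as in the post

const sessionKeys = (sessionId: string) => ({
  mpid: `${sessionId}:mpid`, // R2 multipart uploadId
  meta: `${sessionId}:meta`, // r2Key, contentType, running byte count
  parts: `${sessionId}:parts`, // completed R2UploadedPart list
});

// Enforce the size cap from running state alone -- no R2 read needed.
function acceptChunk(
  bytesSoFar: number,
  chunkBytes: number,
): { ok: boolean; bytesSoFar: number } {
  const next = bytesSoFar + chunkBytes;
  return { ok: next <= MAX_VIDEO_BYTES, bytesSoFar: next };
}
```

Because the counter rides along in the `:meta` value, an over-limit upload is rejected on the offending chunk without ever listing or reading the R2 parts.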

A subtle detail: the per-user rate limit only fires on chunkIndex === 0. Subsequent chunks are unbounded — they’re already paying for an established session, and rate-limiting them would just turn slow networks into failed uploads.

Encoding: a queue, not a request

Once multipart.complete() returns, the Worker sends a message to the VIDEO_ENCODING queue and replies 201 to the client. The same Worker is also the queue consumer — declared via export default containing a queue handler:

async queue(batch: MessageBatch<unknown>, env: EnvBindings) {
  for (const message of batch.messages) {
    try { await handleEncodingMessage(env, message.body); message.ack(); }
    catch { message.retry(); }
  }
}

handleEncodingMessage either submits the R2 object to Cloudflare Stream (POST /accounts/:id/stream with url: "r2://...") or marks it pending_encode for a future R2-only encoding path. Stream then calls back into /api/webhooks/stream, which updates the row’s stream_video_id and status. The API surface and the encoder are the same Worker — but they’re different events on the same script, so a slow encode never blocks an upload response.
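A hedged sketch of that decision (the endpoint path is the one quoted above; the `streamEnabled` flag, the field names, and the `planEncoding` helper are assumptions for illustration):

```typescript
// Illustrative: turn a queue message into either a Stream submission plan
// or a pending_encode fallback. Not the real handleEncodingMessage.
type EncodingMessage = { videoId: string; r2Key: string };

function planEncoding(msg: EncodingMessage, accountId: string, streamEnabled: boolean) {
  if (!streamEnabled) {
    // Future R2-only path: just mark the D1 row pending_encode.
    return { action: "pending_encode" as const, videoId: msg.videoId };
  }
  return {
    action: "submit_to_stream" as const,
    // Endpoint and body shape as described in the post.
    endpoint: `https://api.cloudflare.com/client/v4/accounts/${accountId}/stream`,
    body: { url: `r2://${msg.r2Key}`, meta: { videoId: msg.videoId } },
  };
}
```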

A Durable Object per channel, for fan-out

When a creator uploads, every subscriber needs an inbox row. With 10k subscribers and concurrent uploads from the same creator, naïve fan-out stampedes D1.

ChannelSubscriberDO is keyed by channel:${userId} — one instance per creator — and uses blockConcurrencyWhile to serialise fan-out for that creator. It pages subscribers in batches of 200 with cursor pagination on the subscriber_user_id index, and writes to subscription_inbox with ON CONFLICT DO NOTHING so retries are safe:

const id = ns.idFromName(`channel:${payload.channelUserId}`);
const stub = ns.get(id);
await stub.fetch('https://channel-do/fan-out', { method: 'POST', body: ... });

Note triggerFanOut is best-effort — its errors are logged, not surfaced to the upload response. If the DO is down, the upload still succeeds; fan-out can be re-triggered later.
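The batching loop itself can be sketched independently of the DO plumbing. The function names, the injected `fetchPage`/`writeBatch` callbacks, and the column names are assumptions based on the description above:

```typescript
// Illustrative fan-out loop: page subscriber ids in batches of 200 and emit
// idempotent inserts. fetchPage stands in for the D1 cursor query on the
// subscriber_user_id index; writeBatch stands in for a D1 batch statement.
const BATCH_SIZE = 200;

const INSERT_SQL =
  "INSERT INTO subscription_inbox (subscriber_user_id, video_id) " +
  "VALUES (?, ?) ON CONFLICT DO NOTHING"; // retries are safe

async function fanOut(
  videoId: string,
  fetchPage: (afterId: string | null, limit: number) => Promise<string[]>,
  writeBatch: (sql: string, rows: [string, string][]) => Promise<void>,
): Promise<number> {
  let cursor: string | null = null;
  let written = 0;
  for (;;) {
    const page = await fetchPage(cursor, BATCH_SIZE);
    if (page.length === 0) break;
    await writeBatch(INSERT_SQL, page.map((sub) => [sub, videoId] as [string, string]));
    written += page.length;
    cursor = page[page.length - 1]; // cursor pagination, no OFFSET scans
    if (page.length < BATCH_SIZE) break;
  }
  return written;
}
```

Because the insert is `ON CONFLICT DO NOTHING`, the DO can safely re-run the whole loop after a crash without duplicating inbox rows.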

Rate limiting, also a Durable Object

Token-bucket rate limiting on Cloudflare is a great DO use case: you need a single source of truth per (bucket, identity) and you need atomic decrement. KV is eventually consistent and would let two concurrent requests both observe a full bucket.

RateLimiterDO stores { tokens, lastRefillMs } and computes the refill on read:

const refilled = Math.min(capacity, startTokens + (elapsedMs / 1000) * refillPerSecond);
const allowed = refilled >= cost;
const tokens = allowed ? refilled - cost : refilled;

blockConcurrencyWhile serializes take() per identity. Capacity and refill rate are passed in from the caller, so changing policy ("auth writes are now 30/min, not 10/min") doesn’t require a DO migration — only the Hono middleware in rate-limit.ts changes.
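Putting the refill math above into a complete take() step looks something like this. Storage is modeled as a plain value here; in the real DO it would be persisted state mutated inside blockConcurrencyWhile, and the function name is an assumption:

```typescript
// Sketch of one token-bucket step: refill-on-read, then atomic decrement.
// The three lines in the middle are the same math as in the post.
type BucketState = { tokens: number; lastRefillMs: number };

function take(
  state: BucketState,
  nowMs: number,
  capacity: number,
  refillPerSecond: number,
  cost = 1,
): { allowed: boolean; state: BucketState } {
  const elapsedMs = Math.max(0, nowMs - state.lastRefillMs);
  const refilled = Math.min(capacity, state.tokens + (elapsedMs / 1000) * refillPerSecond);
  const allowed = refilled >= cost;
  const tokens = allowed ? refilled - cost : refilled;
  return { allowed, state: { tokens, lastRefillMs: nowMs } };
}
```

Computing the refill lazily on each take() means the DO never needs an alarm or timer to top the bucket up.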

Caching: KV with a version bump, not invalidation

Trending videos are cached in KV. Instead of deleting cache keys when a video is uploaded or deleted, spooool bumps a version counter:

const version = await getTrendingCacheVersion(c.env.CACHE);
const cacheKey = trendingCacheKey(version, limit);

Old keys remain and expire on TTL; new requests miss against the new version and repopulate. This dodges the eventual-consistency problem of trying to delete a KV key globally and then immediately reading it back.
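A minimal sketch of the version-bump pattern over a KV-like store. The function names match those used above; the version key name and the Map-backed stub are assumptions:

```typescript
// Illustrative version-bump caching: never delete, just stop reading old keys.
interface KvLike {
  get(key: string): Promise<string | null>;
  put(key: string, value: string): Promise<void>;
}

const VERSION_KEY = "trending:version"; // assumed key name

async function getTrendingCacheVersion(kv: KvLike): Promise<number> {
  return Number((await kv.get(VERSION_KEY)) ?? "0");
}

// Called on upload/delete: old keys stop being read and expire by TTL.
async function bumpTrendingCacheVersion(kv: KvLike): Promise<number> {
  const next = (await getTrendingCacheVersion(kv)) + 1;
  await kv.put(VERSION_KEY, String(next));
  return next;
}

const trendingCacheKey = (version: number, limit: number) =>
  `trending:v${version}:limit${limit}`;
```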

What the cron does

The scheduled handler runs runDeletionSweep (hard-deletes users past their 30-day GDPR grace window) and runDmcaRestoreSweep daily at 02:00 UTC. Wrapped in ctx.waitUntil so a slow sweep doesn’t block the next tick.
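A sketch of that handler, with the sweep functions stubbed out and injected (the names come from the paragraph above; the injection is an assumption for testability):

```typescript
// Illustrative scheduled handler: fire both sweeps, hand the work to
// ctx.waitUntil so the invocation returns immediately.
type Ctx = { waitUntil(p: Promise<unknown>): void };

async function scheduled(
  _event: { cron: string },
  sweeps: { runDeletionSweep(): Promise<void>; runDmcaRestoreSweep(): Promise<void> },
  ctx: Ctx,
) {
  ctx.waitUntil(
    Promise.allSettled([sweeps.runDeletionSweep(), sweeps.runDmcaRestoreSweep()]),
  );
}
```

Promise.allSettled (rather than Promise.all) keeps one failing sweep from masking the other; whether the real code does this is an assumption.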

What’s worth stealing from this design

Three things I’d lift into other Cloudflare projects:

  1. Workers as both producer and consumer of their own queue. Same script, separate event types — no second deployment, no service binding indirection. Treats encoding as a different event on the same code, not a different service.

  2. DO-per-entity for any “serialize per X” problem. Channel fan-out and rate limiting look unrelated; both are “atomic op against a per-X state machine” and both are 100 lines of DO code.

  3. KV state for in-progress workflows. D1 rows only exist for completed entities. Half-finished uploads, encoding sessions, and other transient state belong in KV with a TTL — your relational schema stays clean.

The whole stack — frontend, API, encoder, fan-out, rate limiter, scheduled jobs — is one TypeScript codebase, one deploy, and (per individual cost analysis) somewhere between $0.50 and $15/month for a non-trivial test load. That’s the actual sell of building on a single platform: not free tier forever, but the architectural seams are bindings, not network calls.

Repo: Spooool

02

Apr 22, 2026

The Curious Expectation That Machines Will Improve Your Circumstances

What History Tells Us About the Last Major Economic Revolution & How It Might Guide Us

There seems to be a rather fashionable anxiety about whether artificial intelligence will impoverish or enrich you, rather than about the considerably less poetic matter of who, precisely, will own the means of its production, regulate it, and profit from its unyielding exertions.

It is a seductive misdirection.

The current debate assures you that wages must either fall, because the lower rungs of employment will be quietly sawn off beneath you, or rise, because productivity will ascend and carry you with it like an obliging lift; and you are encouraged to feel gleeful at the assuredly forthcoming labor-less existence we all will enjoy.

But the presumption is flawed: what AI is, or could be, has little to do with how it will impact most people. People shouldn’t worry about AI; they should worry about who controls it.

Economists who write in the WSJ or FT believe that, in the long run, things will settle. They keep implying that the bruise of progress, however dark, fades into something resembling prosperity. It is, if you sit with the past a little while, a strange faith, considering the reality of what comes with tectonic change.

Friction

Economists have a word for the trouble that comes when a new technology breaks across the old world. They call it friction — as though it were nothing more than a hand passing over wood, a splinter, a small heat soon gone. It is a soft word, almost an apology. But the kind of friction they’re referring to feels more like life and death on the Alaskan tundra. And to understand what kind of winter these new machines might bring, it helps to remember the last one: the one our great-great-grandparents lived through when there was a stretch of land at the edge of every village where a family could graze a goat or cut peat or set their sheep loose among the others. It wasn’t much. It was, in a quiet way, everything. It was the difference between belonging to a place and being a guest in it.

Then came the laws of enclosure. Hedges erected where there had been open ground. The commons, parcel by parcel, were stitched into the holdings of the few, and the families who had lived by them for generations found themselves without a foothold in the soil that had fed their grandparents.

Dispossessed of the land, masses of disenfranchised farmers were pushed into burgeoning town centers where factories billowed smoke. The factories needed the workers, but exploited their newfound vulnerability to set the conditions in which they lived. In the match factories, where workers handled white phosphorus, they developed a thing called phossy jaw — a slow vanishing of the jawbone, the body itself dissolving in tribute to the cheap, flickering light of someone else’s evening.

This was not a season. This was a hundred and fifty years of child labor in which people regularly died without healthcare or advocacy. Across the ocean, in the newly minted United States, railroads and bridges built an empire on the backs of those with no other option. You either worked enslaved to a titan of this new industrial age, or you perished. Lives began and ended inside nearly two hundred years of friction. Whole generations were born into the smoke and laid down beneath it without ever seeing the other side.

One must admire the delicacy of the term, which manages to reduce centuries of deprivation to the social equivalent of a misplaced glove.

The more serious difficulty, which you are not expected to dwell upon for long, is that technological progress does not distribute itself with any particular sense of fairness; it accumulates, rather conspicuously, around those already in possession of capital, assets, and influence, leaving the rest to negotiate their access to its benefits through wages that are, in turn, determined by a person’s replaceability, bargaining power, and ability to endure.

The 15-hour Work Week

You may, if you are feeling optimistic, recall the prediction of John Maynard Keynes, who rather famously imagined that increased productivity would permit a fifteen-hour work week, a prospect so civilized that it has remained, with admirable consistency, entirely unrealized.

There hasn’t been a miscalculation in our growth rate, or a failure of our technologies to enable abundance, but rather a more persistent arrangement in which the gains of that abundance are retained by those who can acquire the machinery and the land it’s operated upon, and then dictate the terms of consumption. Meanwhile, the rest of the population must work, in whatever way “the market” deems they must, in order to afford the privilege of existing within reach of that abundance.

Ownership, in other words, is the quiet author of your future.

In theoretical models, those delicate constructions in which everyone possesses a modest share of resources and may therefore choose between leisure and income with philosophical composure, technology appears liberating, even benevolent. In practice, however, where housing, energy, food, education, and credit are owned and managed by an increasingly small class of individuals, the majority find themselves obliged to work not because productivity demands it, but because their survival does. It’s uncanny how history seems to repeat itself.

As an unfriendly reminder, this is the arrangement. The rich get richer and the poor get poorer. For a brief window in human history, roughly the last century or so, things were a little less bleak. Yes, the baby boomers had it good in comparison to practically every other generation in known history. This was thanks to hard-fought battles by laborers, unions, and leaders like FDR, who redistributed the industrialized economy’s resources through structural mechanisms embedded in the economic and political systems of the United States. That reset expectations between the working class and the ruling class as to what a civilized society ought to aspire to be for the average person. But now, roughly 80 years on, the social contract is back up for negotiation.

According to the UBS Global Wealth Report, in 2023 the world’s richest 1 percent, those with more than $1 million, owned 47.5 percent of all the world’s wealth – equivalent to roughly $214 trillion. Adults with less than $10,000 make up nearly 40 percent of the world’s population, but hold less than 1 percent of the world’s wealth. UBS defines “wealth” as the value of financial assets plus real assets (primarily housing) of an individual, minus their debts. (Source: Inequality.org)

Rapid economic growth in Asia (particularly China and India) has lifted many people out of extreme poverty. But the global richest 0.1 percent and 1 percent have reaped a much greater share of the economic gains, according to the World Inequality Report. In 2025, the richest 1 percent pocketed 20.3 percent of global income, up 3.4 percentage points since 1980. The top 0.1 percent pocketed 8.2 percent in 2025, up 2.5 percentage points since 1980. These ultra-rich individuals did take a hit in the 2008 financial crisis, but the richest top 0.1 percent have nearly regained the global income share they enjoyed in 2007.

One argument I hear sometimes is that wealth is better held in the hands of those with enough knowledge and wisdom to rear it into its best use. And yet… studies overwhelmingly show that the more inequality grows in any social construct, the greater the likelihood of instability, suffering, and eventually, collapse.

You may be tempted to believe that history, having once been so uncouth, has since acquired better manners, and that modern societies will distribute the benefits of AI with a generosity absent in earlier eras; yet the only periods in which living standards rose meaningfully for the average person were those in which labor movements—strikes, unions, and rather impolite demands—forced a redistribution of wealth that technology alone had declined to provide. Since Reagan, inequality has only grown.

Progress, it seems, required assistance.

Now

Which brings us to the present moment, in which you are invited to speculate whether AI will be kind or cruel, as though it possessed a temperament, when the more pertinent question should always be whether the structures governing its use will demand its gains be shared at all.

You may expect, if precedent is allowed any voice, that productivity will increase, that wealth will concentrate unabated, and that calculated rationalizations will follow.

The question I ask myself whenever I hear someone sharing an unsolicited opinion on the subject is the same: what stake do they have in the outcome? How much do the consequences of unfettered inequality actually impact them personally?

The truth is that whether your circumstances will improve in our new economic reality depends upon something far less elegant than theory; like our ancestors before us, we’ll have to demand it if we want any share of its abundance at all.

So, will AI make the average person’s life better? Statistically, it is unlikely to make much measurable improvement in the daily life of the average person unless strict regulation governs its ownership and use. On its current trajectory of governance, it is much likelier to degrade the average person’s quality of life.

What will AI likely mean for our world? Without question, we know it will enable heightened scrutiny, surveillance, and enforcement of sovereign law. The greater the ability to monitor any situation or person, the more monitoring there will be. This will be good in some arenas and bad in others; it simply means oversight of that monitoring is essential. As you most likely understand by now, it will also empower a technocratic class that maintains relative supremacy over every facet of society: economic, political, and social. While the degree to which this will hold is unknown, those closest to the power consolidating today stand to gain the most tomorrow.

So, the average person is essentially screwed? Not necessarily, and not yet. History shows that change requires organized demand, and there are plenty of ways to engage in a positive way. It is up to you, and everyone you know, to demand that if the means of production are to be guarded (as current movement in the global mining and chip sectors suggests they may be), then the terms on which AI may be consumed, and on which producers may profit from it, must be controlled through powerful, effective, corruption-resistant frameworks. The EU AI Act uses a risk-based classification system that mandates specific technical and organizational controls for high-risk AI systems, such as those used in employment, credit scoring, or law enforcement. UNESCO’s recommendations insist on greater environmental sustainability and gender equality. When reviewing any framework, keep an eye out for human oversight, safety systems, accountability mechanisms, transparency in production and use, data protection, proportionality, and other aims that mitigate bias and inequity. And if you’re currently evaluating AI for yourself or your company, you should also be referencing NIST’s recommendations on AI cybersecurity to protect your assets against the widening attack surface.

The most important function of any governance solution, whichever people eventually coalesce around, must come with clear consequences, fairly assigned to bad actions, regardless of origin. If we can organize meaningful demand for distributed gains, hopefully we can also organize meaningful agreement around how AI can serve all of humanity, rather than the select few lucky enough to find themselves at the center of its development.

Keep track of AI governance through resources like this one and stay engaged in conversations like this one, where everyday people are attempting to organize around the forthcoming reality that our economic system is never going back to the way it was unless we take the necessary steps now to ensure it does. And, selfishly, subscribe to blogs like this one.