Aloe Wright / Essays

01

Apr 7, 2026

The Library of Babel

How AI Systems Are Redrawing the Map of What We Know & Believe

Jorge Luis Borges’ story The Library of Babel conceives of the universe as a vast, infinite library composed of hexagonal galleries that contain every possible book that could exist. Although the vast majority of these books are pure gibberish, the laws of probability dictate that the library must also contain every coherent book ever written, or that could ever be written, including biographies of every person, predictions of the future, and most notably a perfect index of the library itself.

The story explores the existential despair and religious heresies of the librarians who inhabit this chaotic universe, as they struggle to find meaning in a place where all knowledge that has ever been or will be is buried beneath an overwhelming sea of nonsense…

The tragedy is not their failure to find this master index; it’s that they don’t realize the catalog they seek would itself need cataloging, creating an infinite regress of competing and contingent systems, each one claiming authority over the last.

We are living in that library now. Except the catalogers are not human, and what they choose to index determines not just what we find, but what we believe to be true.

In 2025, SEMrush conducted an experiment that exposed the scaffolding behind our new reality. They fed 2,500 ordinary queries, questions like “What’s a good project management tool?” or “Which laptop is best for video editing?”, into ChatGPT and Google’s AI Mode, tracking which brands surfaced in the answers and which sources were cited as evidence. What they discovered was a fault line running through the consolidating digital landscape: the companies dominating traditional search rankings often evaporated entirely when an AI system answered the same question.

The technical term for this is visibility divergence.

The Geography of Mentions

Consider Microsoft and Google, which appear together in 78% of ChatGPT’s responses about digital technology and 82% of Google’s. This is not collusion, but rather gravitational pull. When two objects are massive enough, they warp the space around them, making it nearly impossible to discuss the territory without referencing both landmarks.

But mass alone does not guarantee presence. Zapier—a company most people would struggle to describe at a dinner party—ranks as the number one cited source in technology search queries while placing 44th in brand mentions.

Zapier is everywhere and nowhere: the ductwork in the walls, essential but unnoticed. Their How-To articles and integration guides make them useful to machines constructing answers, even as humans rarely think to recommend them.

This is the new cartography.

There are two maps of the same territory, and they barely overlap.

SEMrush found that 65% of the top 100 mentioned brands appear in both ChatGPT and Google AI Mode. That seems like consensus until you examine their citations. Only 32% of the top 100 cited sources overlap. In financial services, that number drops to 16%. Imagine two historians writing about the same war, mentioning the same generals, but footnoting entirely different archives. One consults letters; the other, battle reports. Both claim accuracy. Neither is wrong, per se. But the version of events they construct, the truth they transmit, diverges in the details of every citation.
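The arithmetic behind that divergence is simple to sketch. Here is a toy comparison in Python; the brand and source lists are hypothetical stand-ins, not SEMrush’s actual data:

```python
# Toy sketch of "visibility divergence": two systems can agree on which
# brands matter while disagreeing on which evidence supports them.
# All names below are illustrative, not SEMrush's real top-100 lists.

def overlap_pct(a, b):
    """Percentage of items in list a that also appear in list b."""
    return 100 * len(set(a) & set(b)) / len(a)

chatgpt_brands = ["Microsoft", "Google", "Apple", "Zapier", "Slack"]
google_brands  = ["Microsoft", "Google", "Apple", "Notion", "Slack"]

chatgpt_sources = ["reddit.com", "wikipedia.org", "quora.com"]
google_sources  = ["bankrate.com", "nerdwallet.com", "wikipedia.org"]

print(overlap_pct(chatgpt_brands, google_brands))    # brand overlap: high
print(overlap_pct(chatgpt_sources, google_sources))  # citation overlap: low
```

The same asymmetry SEMrush measured, in miniature: the brand lists overlap heavily while the evidence lists barely touch.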

The Two-Chambered Heart

SEMrush’s researchers identified what they call a “two-stage AI decision process,” which invites a circulatory metaphor.

Stage One is the atrium: open, receptive, gathering fluid from everywhere. Here, AI systems behave like anthropologists in a foreign land, listening to what locals say in markets and cafés. They scan Reddit threads, Amazon reviews, Quora debates: the messy, contradictory chorus of user-generated opinion. This is where brands become mentioned.

Patagonia dominates 21.2% of fashion queries not through advertising, but through cultural saturation: the internet has reached a rough consensus that Patagonia signals ethical consumption in the same way Kleenex means tissue.

Stage Two is the ventricle: selective, pressurized, pushing only certain elements forward. Here, AI shifts from listening to fact-checking. It wants Wikipedia entries, official documentation, structured data—information that can be verified and parsed without ambiguity. This is where citations live, where Bankrate (86.6% citation rate in Google AI Mode for finance) exists as infrastructure rather than recommendation.

Most brands live in only one chamber. The rare few that circulate through both are the ones the system keeps pumping through its networks. Only 27 brands out of the top 100 achieve this dual presence.

The rest are either invisible because people discuss them but machines can’t verify them, or machines cite them but people never talk about them.

The Garden of Forking Paths

In Borges’ The Garden of Forking Paths, the protagonist encounters a novel in which every possible outcome of every decision actually occurs, each spawning its own timeline… The book is incomprehensible because it refuses to choose.

Our AI systems have made the opposite choice: they choose constantly, ruthlessly, invisibly. And their choices differ.

When you ask ChatGPT about financial services, the top source it consults is Reddit, appearing in 176.9% of responses (the percentage exceeds 100 because sources can appear multiple times), followed by Wikipedia and Investopedia. When you ask Google AI Mode the same question, the hierarchy is completely different: Bankrate (86.6%), NerdWallet (75.1%), and Investopedia.

It’s the same question. Different answers.

YMYL

The divergence is most extreme in domains Google has long termed YMYL—”Your Money or Your Life”—subjects where getting the answer wrong can ruin someone.

Here, the disagreement over what constitutes a trustworthy source becomes existential. Is community consensus (Reddit) more reliable than professional aggregation (NerdWallet)? Is the wisdom of crowds wiser than the curation of experts?

The systems have answered differently, and so the reality they construct, the financial landscape visible to their users, forks into parallel versions of reality.

In one, SoFi barely registers (12.7% weighted share of voice, the only fintech in the top mentions). In both, legacy institutions like Fidelity (33.7%) and Vanguard (29.28%) dominate, but the evidentiary trail leading to their dominance branches through entirely separate ecosystems.

We used to worry about filter bubbles, online spaces where people saw different information based on their preferences. We are now seeing something stranger: people asking neutral questions and receiving definitive, diverging answers constructed from different epistemologies, opening wide gaps of misunderstanding.

The Accessibility Problem

There is a technical requirement buried in SEMrush’s appendix that would have seemed mundane in 2015: ensure your content uses static HTML, is crawlable, includes structured markup, etc.

Many modern websites, especially those built on JavaScript frameworks, render content dynamically. For human visitors, this creates smooth, app-like experiences. For the crawlers that feed AI training datasets, it creates a kind of opacity, like trying to read a book through frosted glass.

If an AI model cannot parse your product specifications, your warranty terms, your FAQ, then it cannot cite you. You may have the most comprehensive information in your industry, formatted beautifully for human eyes, and be completely invisible to the systems mediating discovery.
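What “structured markup” means in practice is, for example, schema.org data embedded as JSON-LD, which a crawler can parse without executing any JavaScript. A minimal sketch in Python, with the product details invented purely for illustration:

```python
import json

# A minimal schema.org Product description, serialized as JSON-LD.
# Embedded in a page inside <script type="application/ld+json">...</script>,
# it stays readable to crawlers even when the rest of the page is
# rendered client-side. The product details here are hypothetical.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",  # invented product name
    "description": "A sample product used to illustrate structured markup.",
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD",
    },
}

snippet = json.dumps(product, indent=2)
print(snippet)
```

The point is not this particular schema but the property it has: every fact is a labeled key-value pair, legible to a machine that never renders the page.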

This is a new form of literacy: not how well humans can read you, but how well machines can. And unlike traditional literacy, which develops gradually through education, machine-legibility often requires retroactive restructuring of existing infrastructure, and cuts faster and harder at the edges.

Companies that spent millions optimizing for Google’s traditional search algorithms in 2015 may find themselves invisible in 2025 because the medium through which content is accessed shifted beneath them. It’s like discovering that your entire library was cataloged in a language the new librarians can’t read, and they’ve decided that anything not in their catalog doesn’t exist.

The Commons Before Enclosure

If every company now engineers for AI visibility the way they once engineered for SEO, the signals will degrade. Reddit already struggles with astroturfing—corporate accounts posing as enthusiastic users. Imagine that scaled to an industry mandate. Wikipedia’s volunteer editors wage constant war against promotional editing; now imagine that every brand assigns teams to “improve” their Wikipedia presence.

We may be witnessing the brief window when user-generated content still functions as reliable signal—a digital Commons before the Enclosure Acts, that historical moment when shared land was fenced and privatized, and the landscape itself was restructured to serve commercial interests.

The alternative future is already visible in outline: a Goodhart’s-law dynamic for brand presence, where the act of measuring and optimizing for AI visibility changes what AI visibility means… until the metrics decay into noise and new ones must be found, in an infinite regress of cataloging systems, each claiming authority over the last.

The Test

There is a diagnostic available now, in this transitional moment.

If you are mentioned but not cited, you exist in culture but not in the machinery of fact. People talk about you; machines don’t trust you enough to quote you. You are folklore.

If you are cited but not mentioned, you are infrastructure—useful, invisible, easily replaced when something more useful appears. You are plumbing.
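The whole diagnostic compresses into a two-bit classification. A minimal sketch; the function name and labels are my shorthand, not SEMrush’s terminology:

```python
# The mentioned/cited diagnostic as a two-bit classifier.
# Names and labels are illustrative shorthand, not SEMrush's terms.

def visibility_diagnosis(mentioned: bool, cited: bool) -> str:
    if mentioned and cited:
        return "dual presence"  # the rare 27-of-100 position
    if mentioned:
        return "folklore"       # people talk; machines don't quote
    if cited:
        return "plumbing"       # machines quote; people don't talk
    return "invisible"

print(visibility_diagnosis(mentioned=True, cited=False))  # folklore
```

Two booleans, four fates: a crude map, but it is the map the systems are currently drawing.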

It’s not yet clear which audience matters more: the humans deciding what’s worth discussing, or the machines deciding what’s worth remembering.

The uncomfortable possibility is that the question is already obsolete—that “human” and “machine” audiences have become so entangled, so mutually constitutive, that distinguishing between them is like asking whether it’s the map or the territory that determines where you can go.

The answer, as with most things Borges understood better than we do, is both, and neither; the question itself is the territory we’re trying to map.

02

Apr 22, 2026

The Curious Expectation That Machines Will Improve Your Circumstances

What History Tells Us About the Last Major Economic Revolution & How It Might Guide Us

There seems to be a rather fashionable anxiety about whether artificial intelligence will impoverish or enrich you, rather than about the considerably less poetic matter of who, precisely, will own the means of its production, regulate it, and profit from its unyielding exertions.

It is a seductive misdirection.

The current debate assures you that wages must either fall, because the lower rungs of employment will be quietly sawn off beneath you, or rise, because productivity will ascend and carry you with it like an obliging lift; and you are encouraged to feel gleeful about the laborless existence we are all assured is forthcoming.

But the presumption is flawed, because what AI is, or could be, has little to do with how it will impact most people. People shouldn’t worry about AI; they should worry about who controls it.

Economists who write in the WSJ or FT believe that in the long run, things will settle. They keep implying that the bruise of progress, however dark, fades into something resembling prosperity. It is, if you sit with the past a little while, a strange faith, considering the reality of what comes with tectonic change.

Friction

Economists have a word for the trouble that comes when a new technology breaks across the old world. They call it friction — as though it were nothing more than a hand passing over wood, a splinter, a small heat soon gone. It is a soft word, almost an apology. But the kind of friction they’re referring to feels more like life and death on the Alaskan tundra. And to understand what kind of winter these new machines might bring, it helps to remember the last one: the one our great-great-grandparents lived through when there was a stretch of land at the edge of every village where a family could graze a goat or cut peat or set their sheep loose among the others. It wasn’t much. It was, in a quiet way, everything. It was the difference between belonging to a place and being a guest in it.

Then came the laws of enclosure. Hedges erected where there had been open ground. The commons, parcel by parcel, were stitched into the holdings of the few, and the families who had lived by them for generations found themselves without a foothold in the soil that had fed their grandparents.

Dispossessed of the land, the masses of disenfranchised farmers were pushed into burgeoning town centers where factories billowed smoke. These factories needed the workers, but they exploited their newfound vulnerability to set the conditions in which they lived. In the match factories, where they handled white phosphorus, workers developed a thing called phossy jaw — a slow vanishing of the jawbone, the body itself dissolving in tribute to the cheap, flickering light of someone else’s evening.

This was not a season. This was a hundred and fifty years of child labor, in which people regularly died without healthcare or advocacy. Across the ocean in the newly minted United States, railroads and bridges built an empire on the backs of those with no other option. You either worked, enslaved to a titan of this new industrial age, or you perished. Lives began and ended inside that century and a half of friction. Whole generations were born into the smoke and laid down beneath it without ever seeing the other side.

One must admire the delicacy of the term, which manages to reduce centuries of deprivation to the social equivalent of a misplaced glove.

The more serious difficulty, which you are not expected to dwell upon for long, is that technological progress does not distribute itself with any particular sense of fairness; it accumulates, rather conspicuously, around those already in possession of capital, assets, and influence, leaving the rest to negotiate their access to its benefits through wages that are, in turn, determined by a person’s replaceability, bargaining power, and ability to endure.

The 15-hour Work Week

You may, if you are feeling optimistic, recall the prediction of John Maynard Keynes, who rather famously imagined that increased productivity would permit a fifteen-hour work week, a prospect so civilized that it has remained, with admirable consistency, entirely unrealized.

It wasn’t a miscalculation in our growth rate, nor a failure of our technologies to enable abundance, but rather a more persistent arrangement in which the gains of that abundance are retained by those who can acquire the machinery and the land it’s operated upon, and then dictate the terms of consumption. Meanwhile, the rest of the population must work, in whatever way “the market” deems they must, in order to afford the privilege of existing within reach of that abundance.

Ownership, in other words, is the quiet author of your future.

In theoretical models, those delicate constructions in which everyone possesses a modest share of resources and may therefore choose between leisure and income with philosophical composure, technology appears liberating, even benevolent. In practice, however, where housing, energy, food, education, and credit are owned and managed by an ever-smaller class of individuals, the majority find themselves obliged to work not because productivity demands it, but because their survival does. It’s uncanny how history seems to repeat itself.

As an unfriendly reminder, this is the arrangement. The rich get richer and the poor get poorer. For a brief window in human history, roughly the last century or so, things were a little less bleak. Yes, the baby boomers had it good, compared to practically every other generation in known history. This was thanks to hard-fought battles by laborers, unions, and leaders like FDR, who redistributed the industrialized economy’s resources through structural mechanisms embedded in the economic and political systems of the United States. This reset expectations between the working class and the ruling class as to what a civilized society ought to aspire to be for the average person. But now, roughly 80 years on, the social contract is back up for negotiation.

According to the UBS Global Wealth Report, in 2023 the world’s richest 1 percent, those with more than $1 million, owned 47.5 percent of all the world’s wealth – equivalent to roughly $214 trillion. Adults with less than $10,000 make up nearly 40 percent of the world’s population, but hold less than 1 percent of the world’s wealth. UBS defines “wealth” as the value of financial assets plus real assets (primarily housing) of an individual, minus their debts. (Source: Inequality.org)

Rapid economic growth in Asia (particularly China and India) has lifted many people out of extreme poverty. But the global richest 0.1 percent and 1 percent have reaped a much greater share of the economic gains, according to the World Inequality Report. In 2025, the richest 1 percent pocketed 20.3 percent of global income, up 3.4 percentage points since 1980. The top 0.1 percent pocketed 8.2 percent in 2025, up 2.5 percentage points since 1980. These ultra-rich individuals did take a hit in the 2008 financial crisis, but the richest top 0.1 percent have nearly regained the global income share they enjoyed in 2007.

One argument I hear sometimes is that wealth is better held in the hands of those with enough knowledge and wisdom to steward it toward its best use. And yet… studies overwhelmingly show that the more inequality grows in any society, the greater the likelihood of instability, suffering, and eventually, collapse.

You may be tempted to believe that history, having once been so uncouth, has since acquired better manners, and that modern societies will distribute the benefits of AI with a generosity absent in earlier eras; yet the only periods in which living standards rose meaningfully for the average person were those in which labor movements—strikes, unions, and rather impolite demands—forced a redistribution of wealth that technology alone had declined to provide. Since Reagan, inequality has only grown.

Progress, it seems, required assistance.

Now

Which brings us to the present moment, in which you are invited to speculate whether AI will be kind or cruel, as though it possessed a temperament, when the more pertinent question should always be whether the structures governing its use will demand its gains be shared at all.

You may expect, if precedent is allowed any voice, that productivity will increase, that wealth, unabated, will concentrate, and that calculated rationalizations will follow.

The question I ask myself whenever I hear someone sharing an unsolicited opinion on the subject is the same: what stake do they have in the outcome? How much do the consequences of unfettered inequality actually impact them personally?

The truth is that whether your circumstances will improve in our new economic reality depends upon something far less elegant than theory; like our ancestors before us, we’ll have to demand it if we want any share of its abundance at all.

So, will AI make the average person’s life better? Statistically, it is highly unlikely to make much measurable improvement in the daily life of the average person unless strict regulation governs its ownership and use. On its current trajectory of governance, it is much likelier to degrade the average person’s quality of life.

What will AI likely mean for our world? Without question, we know it will enable heightened scrutiny, surveillance, and enforcement of sovereign law. The greater the ability to monitor any situation or person, the more monitoring there will be. This will be good in some arenas and bad in others; it means oversight of that monitoring is essential. As you most likely understand by now, it will also empower a technocratic class that maintains relative supremacy over every facet of society: economic, political, and social. While the degree to which this is true is unknown, those with proximity to the power being emboldened today stand to gain the most tomorrow.

So, the average person is essentially screwed? Not necessarily, and not yet. History shows change requires organized demand, and there are plenty of ways you can engage in a positive way. It’s up to you, and everyone you know, to demand that if the means of production are to be guarded, as current movement in the global mining and chip sectors suggests they will be, then the specifics of how AI may be consumed, and how its producers may profit from it, must be controlled through powerful, corruption-resistant frameworks. The EU AI Act uses a risk-based classification system that mandates specific technical and organizational controls for high-risk AI systems, such as those used in employment, credit scoring, or law enforcement. UNESCO’s recommendations insist on greater environmental sustainability and gender equality. When reviewing any framework, keep an eye out for human oversight, safety systems, accountability mechanisms, transparency in production and use, data protection, proportionality, and other aims that mitigate bias and inequity. And if you’re currently evaluating AI for yourself or your company, you should also be referencing NIST’s recommendations on AI cybersecurity to protect your assets against the widening attack surface.

Whichever governance solution people eventually coalesce around, its most important function must be clear consequences, fairly assigned to bad actions, regardless of origin. If we can organize meaningful demand for distributed gains, hopefully we can also organize meaningful agreement around how AI can serve all of humanity, rather than the select few lucky enough to find themselves at the center of its development.

Keep track of AI governance through resources like this one and stay engaged in conversations like this one where everyday people are attempting to organize around the forthcoming reality that our economic system is never going back to the way it was unless we take the necessary steps now to ensure it does. And, selfishly, subscribe to blogs like this one.