TL;DR
  • Punch the macaque passed ten rounds of algorithmic auditions to achieve his viral status this week.

  • Your feed stopped being social media years ago. It's a publisher now, but it hasn't been held accountable like one.

  • The machine that brought you a baby monkey is the same machine that's been radicalizing people for a decade.

Last week, thirty million people watched a six-month-old Japanese macaque named Punch drag an IKEA stuffed orangutan across a concrete enclosure at a zoo outside Tokyo.

He'd been abandoned by his mother at birth, handed a plush toy as a surrogate, then released into a troop of sixty monkeys who wanted nothing to do with him. Visitors posted videos as #HangInTherePunch went viral across four continents.

I watched the videos. I forwarded one to two friends without thinking about it. The sharing was just reflexive, like the internet equivalent of announcing 'horse' on a road trip.

Then I got curious about the machine underneath it.

Auditions

Earlier this month I wrote about how refusing to engage with the online outrage machine is the most radical thing you can do in 2026. I meant that. But refusal works better when you understand what you're refusing: you can't defend against something you don't understand.

So this week I went looking for the actual mechanics.

When a video gets posted on TikTok, Instagram, or Facebook, the platform shows it to a small test group, usually a few dozen to a few hundred people, often strangers who've never heard of the account. In the first fifteen to sixty minutes, every behavior gets logged: did people watch past the first three seconds, did they finish it, did they rewatch, did they DM it. Clear the threshold and the content goes to a larger test group. Clear that one, another wave.

Punch passed roughly ten rounds of increasingly large automated auditions, each completed before most of us had ever heard his name.

This is what virality actually is: a series of algorithmic promotions, each triggered by real human behavior from the previous round. Millions of individual people made micro-decisions, and the machine kept amplifying the signal.
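The test-and-expand loop can be sketched in a few lines. To be clear about what is and isn't sourced: every number here is invented for illustration. The seed pool size, the promotion threshold, the growth factor, and the noise model are assumptions, not disclosed platform values; only the shape of the loop (test, measure, promote or stop) comes from the description above.

```python
import random

def run_auditions(appeal, rounds=10, seed_pool=200, threshold=0.12, growth=6):
    """Toy model of test-and-expand distribution.

    appeal    -- the content's underlying engagement rate (illustrative)
    threshold -- observed engagement needed to pass a round (made up)
    growth    -- audience multiplier per round passed (made up)
    Returns total impressions accumulated before the content stalls.
    """
    pool, impressions = seed_pool, 0
    for _ in range(rounds):
        impressions += pool
        # Observed engagement = underlying appeal plus sampling noise.
        observed = appeal + random.gauss(0, 0.02)
        if observed < threshold:
            break          # failed the audition: promotion stops here
        pool *= growth     # passed: show it to a much bigger crowd
    return impressions

random.seed(42)
print(run_auditions(appeal=0.05))  # dull content dies near the seed pool
print(run_auditions(appeal=0.30))  # Punch-grade content compounds for ten rounds
```

The point of the sketch is the shape of the curve: survival is multiplicative, so clearing every round turns a two-hundred-person test into an audience in the hundreds of millions, while missing a single threshold ends distribution entirely.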

Beyond the cursory information available online about how these algorithms work, I have no real idea what the machine is optimizing for. Nobody does, outside of a handful of engineers operating under non-disclosure agreements.

Signal Hierarchy

The specific behaviors that trigger expansion carry very different weight, and we know this only because platforms occasionally let details slip.

On Instagram, the single most powerful signal for reaching new audiences is a DM share. Sending a post to a friend carries three to five times more algorithmic weight than a like, confirmed publicly by Meta in January 2025. Instagram logs 694,000 Reels sent via DM every minute.

When I forwarded that Punch video to two friends without thinking about it, I was functioning as the algorithm's quality control department. My vote counted more than a thousand passive likes from users who scrolled past.
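That hierarchy can be made concrete with a weighted score. Only the relative ordering is taken from the reporting above (a DM share worth roughly three to five times a like); the specific weights, and the comment and impression values, are invented for the sketch.

```python
# Illustrative signal weights. Only the DM-share-to-like ratio (roughly
# 3-5x, per the Meta disclosure discussed above) is sourced; every other
# number here is an assumption for illustration.
SIGNAL_WEIGHTS = {
    "impression": 0.0,  # scrolled past: no positive signal (assumed)
    "like": 1.0,
    "comment": 2.0,     # assumed to sit between a like and a share
    "dm_share": 4.0,    # private endorsement: the high-value signal
}

def engagement_score(events):
    """Weighted sum of signals for one post. `events` maps signal -> count."""
    return sum(SIGNAL_WEIGHTS.get(sig, 0.0) * n for sig, n in events.items())

print(engagement_score({"like": 250}))                  # 250.0
print(engagement_score({"like": 250, "dm_share": 40}))  # 410.0
```

Under a scheme like this, a modest number of DM shares moves a post's score as much as a far larger pile of likes, which is why the quiet act of forwarding is the one the ranker cares about most.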

TikTok applies a brutal filter before any of that: viewers decide whether to keep watching in under two seconds. Every piece of content that goes viral has cleared a threshold most humans don't consciously register. Punch cleared it because a baby monkey dragging a stuffed animal is legible before your brain has finished loading. The emotional payload arrives instantly.

Facebook's system, now running on what Meta internally calls the Andromeda update, has shifted from a social graph to a discovery engine. Roughly half your Facebook feed in 2026 is content from people you don't follow, surfaced entirely by AI prediction models.

These are the breadcrumbs the platforms have chosen to share. The actual architecture (the specific weights, the ranking logic, the precise inputs that determine why Punch showed up on your screen and not someone else's) remains completely hidden.

Algorithm interpretability is the ability to understand, explain, and trace the decision-making process of an AI system, making its internal logic transparent to humans. The major platforms have built zero of it for their users. The feed you scroll every morning is produced by a system you cannot interrogate, cannot audit, and cannot appeal.

Black Box

You open Instagram and your feed loads. You have no idea whether the first post appeared because the algorithm predicts you'll share it, because the account paid for promotion, because someone in your network engaged with it four hours ago, or because a model trained on a billion data points determined that this particular image at this particular moment will hold your attention for an additional 1.3 seconds.

You will never know and the platform is not required to tell you. There is no disclosure and no audit trail. The system that shapes your attention more than almost anything else in your daily life is a complete black box.

[Image: Input: your behavior. Output: your feed. Middle part: not your business. Via Data Science Archive.]

Jonah Berger and Katherine Milkman at Wharton spent years studying what makes content spread. Their finding: virality is driven by physiological arousal. Awe, anger, and anxiety spread reliably. Sadness barely moves. The algorithm is indifferent to moral valence: it reads arousal signals and acts on them.

If users understood that anger and anxiety were being systematically amplified because arousal drives engagement, the reaction would be severe. The opacity protects the product. Punch generates awe and anxiety simultaneously: we watch him drag the stuffed orangutan and something lifts in our chests, then he gets swatted by an older monkey and our heart rates spike. Both responses generate identical engagement signals; to the machine, Punch registers the same as a clip of political outrage.

We are voting daily in elections we don’t even realize we are participating in, for candidates we can’t see, governed by rules we were never given.

Publishers vs Distributors

Section 230 of the Communications Decency Act, passed in 1996, is the law that made modern social media legally possible. Its core protection: platforms cannot be treated as the publisher or speaker of content their users create. The early-internet logic was: if you're just the pipe, you shouldn't be liable for what flows through it.

That framing is now almost completely disconnected from reality.

In 2009, Facebook replaced the chronological feed with an algorithmic News Feed, and every platform soon followed. By the mid-2010s, the feeds we were scrolling were curated selections from engagement prediction models, optimized entirely for time-in-app. The platforms had become editors. Fast, automated, operating at a scale no human editor could approach, but editors nonetheless, making active decisions about what gets amplified and what disappears.
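The editorial shift is visible in miniature if you compare the two ranking rules side by side. The posts, timestamps, and model scores below are invented; the point is that the same inbox produces a different front page once a predicted-engagement score, rather than recency, decides the order.

```python
# Three posts: (posted_at, model_predicted_engagement, text).
# All values are invented for illustration.
posts = [
    ("09:00", 0.02, "cousin's vacation photos"),
    ("11:30", 0.91, "outrage bait from a page you don't follow"),
    ("12:15", 0.40, "friend's new job announcement"),
]

# The old feed: an inert pipe, newest first.
chronological = sorted(posts, key=lambda p: p[0], reverse=True)

# The post-2009 feed: an editor, ordered by what a model predicts
# will keep you in the app.
engagement_ranked = sorted(posts, key=lambda p: p[1], reverse=True)

print(chronological[0][2])      # the newest post leads
print(engagement_ranked[0][2])  # the highest-scoring post leads
```

Same three posts, two different front pages. Choosing the second sort over the first is the editorial decision, even though no human makes it per-post.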

Today, roughly half of your Facebook feed is from people you don't follow. TikTok was built from day one around strangers, ranked entirely by algorithmic prediction. These platforms are making consequential editorial decisions at a billion-person scale while the legal system is just barely catching up.

In August 2024, the Third Circuit Court of Appeals ruled in Anderson v. TikTok that the For You Page algorithm is TikTok's own expressive activity, meaning Section 230 doesn't shield them from liability for what the algorithm recommends. The case was brought by the mother of a ten-year-old girl who died after TikTok's recommendation engine served her a blackout challenge video. When the algorithm makes the editorial choice, the court found, the platform owns the outcome.

The platforms spent years hiding behind a legal framework designed for a version of the internet that no longer exists. The interpretability problem made that hiding easier: if no one can see how the algorithm works, no one can prove it made a choice.

Bluesky’s Experiment

In 2023, Bluesky launched what it called a "marketplace of algorithms." The concept was straightforward: instead of one proprietary black box controlled by a single company, developers and users could build, publish, and subscribe to their own feed generators. A feed for cat photos, one for your mutuals only, one for posts from accounts within your social graph. Each feed's logic made visible, with its priorities declared upfront.
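A feed generator of that kind can be almost trivially small, which is the point: the selection logic is plain code a subscriber could read. This is a sketch of the idea, not the actual AT Protocol feed-generator API; the types, field names, and ranking rule here are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    likes: int

def cat_photos_feed(posts, following):
    """A transparent feed rule: posts mentioning cats, from accounts
    I follow, most-liked first. Priorities declared upfront, in code
    anyone subscribing to the feed could inspect."""
    selected = [p for p in posts
                if "cat" in p.text.lower() and p.author in following]
    return sorted(selected, key=lambda p: p.likes, reverse=True)

posts = [
    Post("alice", "my cat discovered the printer", 12),
    Post("bob", "hot take about the news cycle", 540),
    Post("carol", "cat tax attached", 7),
]
for p in cat_photos_feed(posts, following={"alice", "carol"}):
    print(p.author, "-", p.text)
```

Nothing here is hidden: the filter, the ordering, and their consequences (bob's high-engagement hot take never appears) are all legible in a dozen lines. That legibility is exactly what the proprietary rankers lack.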

Bluesky had 25 million users as of December 2024, which sounds significant until you set it against Meta's three billion. Its marketplace of algorithms is only meaningful if users understand what they're choosing between, and most don't. Algorithmic literacy is not a skill most people have been given reason to develop, because the dominant platforms have spent fifteen years ensuring they never needed it.

But the model proves something important: opacity is a choice. Interpretability is buildable. The major platforms simply decided not to build it, because transparency would expose what they've actually been optimizing for.

The Monkey in the Machine

The algorithm that has spent the last decade radicalizing people, amplifying outrage, and quietly making everyone more exhausted is the same algorithm that brought Punch to thirty million people: same signal hierarchy, same test-and-expand mechanics, same arousal threshold. A baby monkey and a culture war post score similarly when the signals are counted.

The difference is purely content. The machine doesn't know the difference between awe at a baby animal and rage at a political enemy. It just knows that both kept us scrolling.

You could argue that's fine when the output is Punch. You'd be right, mostly. But the machine that delivered him is indistinguishable from the machine that delivered everything else. It has no values. It has objective functions we cannot read, producing outputs we cannot trace, at a scale that shapes public consciousness.

The platforms built that system while claiming to be neutral infrastructure. The algorithm was always making choices, we just didn't have language for it yet. Plus they had very good lawyers.

But the machine that brought him to us is the real story, one the platforms have spent fifteen years and a lot of legal fees ensuring we never get to read, because to tell that story we’d need to see inside, and they've made sure we can't.

Up and to the right.
