
If you logged into X on October 9 or 10, 2025, you probably expected the usual feed: a few memes, your cousin’s dog video, maybe one of those cryptic subtweets from someone who still thinks “vaguebooking” works in 280 characters. Instead, what you got was a carnival barker’s megaphone of right-wing political content—accounts you never followed, takes you never asked for, rants you thought you blocked years ago. Suddenly, your feed was less “personalized timeline” and more “Fox News broke into your house and rearranged the furniture.”
This wasn’t your imagination. Axios and other outlets flagged what users themselves noticed: a sudden algorithmic surge, an unsolicited parade of unfollowed right-wing content. Frustration followed, as blocking became the new cardio. The problem wasn’t simply political lean—it was the structural reality of a platform that prioritizes “engagement” over sanity, “relevance” over consent, and “personalization” over the very basic idea that your feed should reflect… well, you.
The Algorithm’s Greatest Hits
Let’s trace this latest performance art from the engineers’ codebase straight into your scrolling thumb.
- October 9–10, 2025: Users across X reported their feeds flooded with posts from right-wing accounts they hadn’t followed. The theme: unsolicited content featuring election conspiracies, culture war memes, and posts hyping the administration’s “war on woke.”
- Blocking bonanza: In response, users began mass-blocking, only to watch new unfollowed content appear like weeds in a parking lot crack.
- The data backdrop: Studies from 2021 through 2025 repeatedly documented algorithmic favoritism toward right-leaning content, especially after Elon Musk publicly endorsed Donald Trump in mid-2024. Researchers found “boosts” that mysteriously coincided with Musk’s political statements, though some users did report feeds skewed the other way.
What unites all of it is the unpredictability. The algorithm isn’t showing you what you asked for; it’s showing you what it thinks will keep you engaged. Engagement, of course, is a polite euphemism for rage.
A Timeline of Tweaked Reality
The arc is clear enough that we can put it in dates and numbers, like a scientific study of a black hole swallowing common sense.
- 2021–2022: Early academic work notices that X’s recommendation engine disproportionately amplifies right-leaning material compared to neutral baselines. Company spin: “We prioritize high-engagement content.” Translation: outrage travels fastest.
- October 2022: Elon Musk purchases Twitter (later renamed X). Installs himself as Chief Algorithm Officer by vibe alone.
- July 2024: Musk endorses Trump. Within months, researchers notice measurable bumps in right-leaning engagement metrics. The company denies intentional bias. Musk posts that the algorithm is “fair,” which in practice means “fair to whoever I like this morning.”
- Late 2024–early 2025: Users on both ends complain. Some left-leaning users notice their feeds look like campaign flyers; right-leaning users occasionally claim shadow bans. In reality, the algorithm is playing both ends, seeding doubt as its primary product.
- September–October 2025: With an election year countdown clock ticking, feeds erupt with uninvited political content. Frustrated users share screenshots of “suggested posts” that look more like targeted ads for the RNC.
The Neutrality Mirage
X insists it is committed to “neutrality.” That word, in Silicon Valley, functions the way “thoughts and prayers” do after a mass shooting. A ritualized incantation meant to deflect responsibility while nothing changes.
Neutrality is impossible when your business model is engagement. Engagement is driven by friction. Friction is sustained most efficiently by outrage. And outrage, in American politics circa 2025, tends to arrive in bulk shipments from the right-wing outrage-industrial complex.
Thus the algorithm doesn’t need to be maliciously biased to be structurally biased. It just needs to reward whatever makes people angriest fastest. Which means your aunt’s recipe blog post sinks to the bottom, and a video of someone burning a Pride flag to “own the libs” climbs to the top.
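The structural-bias point can be made in a few lines of toy code. This is a hypothetical sketch, not X’s actual ranker; the post data, the `predicted_engagement` numbers, and the account names are all invented for illustration. The key detail is what the code does *not* contain: no political test, no check of who you follow, nothing but an engagement score.

```python
# Hypothetical feed ranker: scores purely on predicted engagement.
# No explicit political bias appears anywhere in the logic -- yet the
# highest-friction post wins, whether or not you follow its author.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    predicted_engagement: float  # modeled clicks + replies + quote-dunks

def rank_feed(posts, followed):
    # Note: `followed` never enters the score. Consent is not a signal.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

posts = [
    Post("aunt_recipes", "casserole", 0.02),    # followed, low friction
    Post("rage_account", "culture war", 0.91),  # unfollowed, high friction
    Post("cousin", "dog video", 0.15),          # followed, mild
]
feed = rank_feed(posts, followed={"aunt_recipes", "cousin"})
print([p.author for p in feed])  # the unfollowed outrage post leads
```

Nothing in that function is “maliciously biased.” It simply rewards whatever the model predicts will hold attention, and in practice the angriest content holds it best.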
The Personalization Illusion
Personalization used to mean: “We’ll show you more of what you like.” In practice, on X in 2025, it means: “We’ll show you what will keep you clicking, regardless of whether you like it.”
Your personal taste doesn’t matter. Your outrage does. The algorithm is like a waiter who hears you order a salad, then returns with a bucket of fried chicken because “trust me, you’ll eat more of this.”
Blocking accounts is the new whack-a-mole. Users block ten, twenty, fifty accounts, only to see fresh ones surface because the algorithm has infinite supply. Personalization has become a casino rigged against the gambler, one where the house wins by feeding you posts designed to make you angry enough to pull the lever again.
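Why blocking never wins the game is also easy to sketch. In this hypothetical model (again, invented for illustration, not X’s real pipeline), blocks filter the feed *after* candidate selection, and the recommender simply backfills from an enormous pool of interchangeable high-engagement accounts, so the feed stays exactly as full as before.

```python
# Hypothetical whack-a-mole model: blocking removes authors from view,
# but the recommender tops the feed back up from its candidate pool.
import itertools

# A pool large enough to be effectively infinite from the user's side.
CANDIDATE_POOL = [f"outrage_acct_{i}" for i in range(10_000)]

def build_feed(blocked, size=5):
    # Skip blocked authors, then take the next `size` candidates anyway.
    visible = (a for a in CANDIDATE_POOL if a not in blocked)
    return list(itertools.islice(visible, size))

blocked = set()
for _ in range(3):           # three rounds of mass-blocking
    feed = build_feed(blocked)
    blocked.update(feed)     # block every account you just saw
print(len(blocked))          # 15 blocks later...
print(build_feed(blocked))   # ...five brand-new accounts appear anyway
```

Each round of blocking costs the user effort and costs the pool nothing, which is exactly the casino math described above.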
Why October’s Surge Matters
This wasn’t just an annoying bug. It’s a glimpse of the information architecture heading into 2026. When so many Americans report they are “unsure” about basic scientific claims (see: Tylenol and autism), it’s because the platforms where they encounter information are designed to seed doubt, not clarity.
So when X users logged in on October 9–10 and saw their feeds hijacked by political content they didn’t choose, what they saw wasn’t an error. It was the system working as designed. Engagement maximization dressed up as freedom of expression.
What the Research Shows
The studies aren’t ambiguous.
- Academic analyses 2021–2025: Multiple peer-reviewed papers show disproportionate amplification of right-leaning material, relative to its baseline presence.
- Company defense: X claims these are artifacts of higher engagement. If right-leaning posts draw more shares, clicks, and arguments, they’ll show up more, end of story.
- Alternative findings: Some researchers note left-leaning surges in certain contexts, often around social justice protests. But the overall tilt, especially post-2023, trends rightward.
It’s less conspiracy, more gravitational pull. Rage is the heaviest object in the room, and the algorithm orbits accordingly.
The Consequences: Democracy as Clickbait
- User Experience: People leave platforms when they feel spammed by politics they didn’t sign up for. Or worse, they stay, simmering, letting rage become the baseline.
- Elections: Algorithmic favoritism doesn’t need to be absolute to tilt reality. If undecided voters are nudged a few percentage points by seeing one kind of post more often, the outcome isn’t neutrality—it’s engineered momentum.
- Public Discourse: When people assume the platform itself is picking sides, trust erodes further. At that point, every feed becomes not a conversation but a suspicion machine.
The Satirical Kernel: X, the World’s Loudest Megaphone
X isn’t a social media platform anymore. It’s an algorithmic slot machine, except instead of cherries and sevens, the reels spin “Hunter Biden’s laptop,” “the border invasion,” and “cancel culture.” Musk, like every casino owner, insists the game is fair. But the house always wins.
The irony is that neutrality has become its own performance. Every time the algorithm burps up a new round of unsolicited right-wing screeds, X insists the code is impartial, the way a crooked referee insists he just happened to call twelve penalties on one team.
And the personalization pitch is almost tender in its gaslighting. “We’re showing you what you want,” X says, as if the flood of content from accounts you never followed is somehow evidence of your deepest desires. The system insists it knows you better than you know yourself. Which is true, but only in the way a stalker knows your daily commute.
The Algorithm Will See You Now
The October surge of right-wing sludge wasn’t an accident, wasn’t a bug, wasn’t a glitch. It was the logical conclusion of a system built not to inform you but to provoke you.
Americans already live in an information ecosystem where truth competes with performance, and performance always wins. X has simply automated the process, turning your feed into a reality TV confessional booth where the contestants are politicians and pundits you never invited.
You can block. You can mute. You can log off. But the algorithm doesn’t care. It will continue to churn, to spit, to flood, because rage pays the bills.
And when the next election cycle heats up, the platform will pretend again to be a neutral arbiter. But the truth is simpler, darker, and less flattering: X isn’t biased left or right. It’s biased toward chaos. It doesn’t want you informed. It wants you inflamed.
Because inflamed people don’t walk away. They keep scrolling. They keep fighting. They keep pulling the lever.
The algorithm doesn’t need your approval. It just needs your attention. And it will have it, whether you asked for it or not.