Social Media: “the technology of the century.” I want to talk about an unnerving pattern I’ve been observing over the past few years: our collective sense of “what’s really happening” has fractured. I’m sure you’ve experienced this before. You tune in to a breaking news story, check social media, and realize that half your friends or family seem to be talking about a totally different event, even though it’s the same headline. It’s not merely about differing opinions anymore; it’s about each side accepting an entirely different set of “facts.” This disconnect feels disorienting, especially when we’re looking for solid grounding on what reality actually is.
I’ve had a growing sense that more than just opinions or biases are at play: a whole new ecosystem of information, not always anchored in reality, is showing up in our separate streams. It leads us down different paths of outrage, confirmation, or skepticism, creating pockets of understanding that rarely intersect. In many ways, I feel like we’re all living in distinct bubbles, shaped by the social media feeds we follow, the new independent media we trust, and, of course, our own ingrained habits of attention.
What we’ll be exploring today is how these “different worlds” arise, why social media algorithms drive us toward increasingly narrow viewpoints, and how we might reclaim our ability to see beyond our customized feeds. We’ll delve into the consequences of a fragmented media landscape, both old-school and newly independent outlets, and consider ways we can begin bridging the divide. Ultimately, this piece aims to offer a path forward that doesn’t involve choosing sides so much as rediscovering the shared reality that underpins them all.
We’re not Truth Seeking Machines
I want to start with an observation someone I watch made that really rang true for me: a big part of the problem is that humans are not naturally wired to be truth-seeking machines. We lean toward information that reaffirms what we already believe; we prefer comfortable understanding. Confronting facts that challenge our worldview is uncomfortable, and overcoming that discomfort takes practice. As we’ll discover, the thing that used to keep us in check, open dialogue between opposing views, has broken down.
The Three Worlds: Left, Right, and Reality
More and more, our social environment is fracturing into at least three distinct “worlds.” Two of them, call them “Left World” and “Right World,” are shaped by separate news sources, independent media personalities, and opinion platforms that reinforce their respective viewpoints. Then there’s the third world: reality, where only verifiable facts exist, but which has very few mechanisms for pushing those facts quickly into the other two. This is the space where objective truths about events and data-based evidence are supposed to hold sway.
Getting to this “real world” is becoming surprisingly difficult. Why? The answer largely lies in how information is served to us, and how we, as humans, respond to it.
Social Media Algorithms
To fully understand the three worlds, we need to discuss something I’m sure you’ve all heard about before: social media algorithms. This buzzword has spread like wildfire, and I think for good reason; we’re all interacting with these systems in some way or another. Algorithms are incredibly good at serving you information they predict you’ll interact with. It’s like having a personal tailor who only makes you clothes in your favorite style, color, and fabric. Sounds great, until you realize it’s narrowing your wardrobe. In the same way, algorithms narrow the range of content you see.
Why do they do this? Because they want your attention. Humans are wired to pay closer attention to things that rile us up, especially if there’s a threat or conflict. Outrage and controversy keep us glued to our screens far more effectively than civil discourse or balanced reporting. The algorithm is simply a feedback loop: the more you click on outraged opinions, the more outraged opinions you’ll be shown. Over time, you’re stuck in a silo, hearing only one side and seeing only certain facts, all while believing this is exactly how the entire world works.
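The feedback loop described above can be sketched in a few lines of code. This is a deliberately tiny toy model, not any platform’s actual ranking system: the category names, click probabilities, and reward multiplier are all invented for illustration. The point it demonstrates is that a ranker which only rewards engagement, with no notion of truth, drifts toward whatever the user clicks most.

```python
import random

random.seed(42)  # deterministic run for illustration

# The ranker's only "belief" is how engaging each category has been so far.
weights = {"outrage": 1.0, "balanced": 1.0}

def serve(weights):
    """Pick a story category with probability proportional to its weight."""
    total = sum(weights.values())
    r = random.uniform(0, total)
    for category, w in weights.items():
        r -= w
        if r <= 0:
            return category
    return category  # guard against float rounding

def user_clicks(category):
    """Hypothetical user: clicks outrage 80% of the time, balanced 40%."""
    p = 0.8 if category == "outrage" else 0.4
    return random.random() < p

for _ in range(1000):
    shown = serve(weights)
    if user_clicks(shown):
        weights[shown] *= 1.01  # reward whatever got engagement

share = weights["outrage"] / sum(weights.values())
print(f"share of feed now outrage-leaning: {share:.0%}")
```

Nothing in the loop ever asks whether a story is true; the small asymmetry in click rates is enough to tilt the whole feed over time.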
We have to understand that echo chambers are not a bug; they are a feature. When we talk about “echo chambers,” it’s tempting to blame social media alone. But the truth is, our social splits, Left vs. Right, have been around for ages. Social platforms didn’t create this divide; they merely supercharged it.
The result? Two (or more) isolated bubbles of information. You open your feed and see exactly what reinforces your existing beliefs and biases. It isn’t necessarily a grand conspiracy; it’s often just how the attention economy functions. Combine that with our natural tendency to seek “like-minded” individuals and voilà: echo chambers that rarely get challenged from the inside.
Rise of Independent Media, and Decline of Standards
Adding to this complexity is the explosion of independent media outlets. On the surface, this seems great: more voices, more viewpoints, more freedom. The downside is that many of these outlets don’t have to follow the same journalistic standards that traditional media once held.
Traditional mainstream media is run by companies that, at least in principle, must maintain certain standards to protect their reputation and avoid legal issues. Independent media often has no such constraints, which makes it easier for misinformation or outright lies to spread. In a world where truth can be easily distorted, any platform that punishes lies is at a disadvantage because it risks losing its outraged, hyper-engaged audience.
Think about it for a second: mainstream media (MSM) outlets are often conglomerates that can be held accountable for things like defamation, inaccurate reporting, and government interference. This isn’t to say that all MSM reporting is accurate; it certainly isn’t, and our trust in these institutions is at an all-time low, often for good reason. However, the issues we have with MSM apply tenfold to independent media, which is often heavily protected under the First Amendment umbrella. Recently, CNN was sued for defamation by a US Navy veteran and ended up settling. That sort of consequence for bad reporting is not something the new media needs to worry about, as it rarely faces such challenges. These outlets are free to propagate whatever “opinion” they want, and it is often taken as fact by their audiences.
Nowhere is this phenomenon more blatant than on X. The platform’s looser rules around speech encourage people to say almost anything that grabs attention, and the opportunity to make money based on engagement only raises the stakes. If a single post can rack up massive engagement, it can translate into real dollars for the creator. We’re talking potentially thousands of dollars a month if you have hundreds of thousands, or millions, of followers.
Consider an account like Libs of TikTok (just one prominent example). If a single post gets, say, a million views, that alone can generate a chunk of cash, depending on ad revenue sharing. Multiply that by daily posts and it becomes a very profitable business model. The point is: there’s a monetary reward for incendiary posts. Combine that with a platform that has minimal fact-checking before something goes viral, and you’ve got a perfect storm for misinformation to spread widely.
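The economics above are easy to check with back-of-the-envelope arithmetic. The payout rate below is a hypothetical assumption for illustration, not X’s actual (undisclosed) revenue-sharing figure; the point is just how quickly daily viral posts compound into real money.

```python
# Hypothetical assumption: $0.50 paid per 1,000 monetizable impressions.
# This is NOT a published rate; it is a placeholder for the calculation.
PAYOUT_PER_1K_IMPRESSIONS = 0.50

def monthly_payout(views_per_post, posts_per_day, rate=PAYOUT_PER_1K_IMPRESSIONS):
    """Estimate a creator's monthly revenue-share income (30-day month)."""
    monthly_views = views_per_post * posts_per_day * 30
    return monthly_views / 1000 * rate

# One viral post per day at ~1M views each:
print(f"${monthly_payout(1_000_000, 1):,.0f} per month")  # → $15,000 per month
```

Even if the real rate is several times lower, the incentive to post something incendiary every single day is obvious.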
Let’s Quickly Reflect
Bringing this back to the Three Worlds that we find ourselves in now, if we apply everything we’ve gone over here so far:
Seeking only comfortable information
Algorithms feeding you exactly what you want to see
Lack of consequences for new networks propagating lies
The dollars incentivize everything
it’s becoming very easy to see how we’ve ended up with our Three Worlds today. There’s actually another factor in all of this that I think hasn’t helped: fact checkers, the arbiters of Truth!
Fact-Checking Is on Life Support
At this point, the concept of fact-checking is almost dead in the eyes of many. People have lost trust in institutions that used to serve as gatekeepers, journalistic outlets, scientific bodies, and so on. This distrust is understandable; it’s been building for years. X’s “Community Notes” is a step toward real-time fact-checking by peers, but it’s a tiny patch over a massive tear. Without broad societal trust, fact-checking doesn’t resonate with people who already suspect every “official” source of having an agenda.
As if our own divisions weren’t enough, foreign actors see this fragmented landscape as an opportunity. They can easily blend into social media communities, fanning the flames and injecting misinformation that serves their strategic interests. Their ultimate goal is to destabilize our nation’s influence and strengthen their own. This intensifies the mistrust and confusion, making it even harder to figure out what’s real and what’s fake.
How Do We Overcome This?
There are quite a few things we can start working toward to make a difference, both in our personal lives and in the ecosystems we find ourselves in.
First things first, the obvious one, which I’m sure you’ve all felt before: take a break. Spending hours scrolling through curated feeds can warp your perspective. Set some boundaries. Stop letting the algorithm feed you endless drips of outrage.
Next, actively follow voices from the other side. You don’t have to buy into everything they say, but it’s vital to see what’s shaping their perspective, and to start training our minds to consume disagreeable information no matter how uncomfortable it feels. A more balanced feed helps you cross-reference information, spot inconsistencies, and lower your resistance to other perspectives, some of which may turn out to be the truth.
Another way to address the same issue: many platforms have settings that limit how aggressively the algorithm tailors content to your preferences. Consider turning personalization off or dialing it down. It might feel uncomfortable at first, but it helps prevent a one-sided stream of content.
There’s a tricky moral dilemma I found myself considering while writing this post. Should we do this for our loved ones, turning off their personalization settings without telling them? It’s a tough ethical call. On the one hand, you might genuinely help them see a broader reality and lessen their doomscrolling. On the other, you’re making a choice for them that they’re unaware of. This isn’t a question with a simple yes or no answer; I certainly don’t have one, so you’ll have to take this dilemma away with you to consider.
Future Solutions on The Horizon
I think eventually we might lean on AI to solve the post-truth problem, at least in part. AI can scan massive amounts of information and cross-reference sources for consistency and reliability in a fraction of a second, something no human fact-checker can do. The tricky part of this solution, I think, is going to be how it’s framed. We don’t want AI to declare “This Is The Truth.” Instead, it needs to add context, flag potential inaccuracies, and offer more data points for us to consider, rather than become an authority in the “information accuracy space.”
Here’s my vision: imagine a browser plugin that gently informs you, “We found these conflicting sources about this claim; here’s the extra context you might want to consider.” It would encourage you to think critically rather than dictating “truth” from on high. Ideally, we could provide feedback to train the AI further, or inform others looking at the same source, creating a constantly improving system that’s less susceptible to bias and manipulation.
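The core design principle of that plugin, summarize conflicting coverage instead of issuing a verdict, can be sketched in a few lines. Everything here is hypothetical: the outlet names, stances, and data structure are invented, and a real system would query live sources and a trained model rather than a hand-built list.

```python
from dataclasses import dataclass

@dataclass
class SourceReport:
    """One outlet's stance on a claim (hypothetical schema)."""
    outlet: str
    stance: str  # "supports", "disputes", or "unclear"

def build_context_note(claim, reports):
    """Summarize conflicting coverage; never declare the claim true or false."""
    supports = [r.outlet for r in reports if r.stance == "supports"]
    disputes = [r.outlet for r in reports if r.stance == "disputes"]
    if supports and disputes:
        return (f"Sources conflict on this claim: {', '.join(supports)} "
                f"support it; {', '.join(disputes)} dispute it. "
                f"Read both before sharing.")
    if disputes and not supports:
        return (f"No source we checked supports this claim; "
                f"{', '.join(disputes)} dispute it.")
    return "No conflicting coverage found."

reports = [SourceReport("Outlet A", "supports"),
           SourceReport("Outlet B", "disputes")]
print(build_context_note("Example claim", reports))
```

Notice the function never returns “true” or “false”; the strongest thing it says is that sources disagree, which keeps the judgment with the reader.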
It’s a hopeful vision. But even AI isn’t immune to human biases, especially if it’s trained on flawed data. We’ll have to build transparency into these systems to ease any distrust in them.
With that said, until an effective, system-wide solution arrives, we all have a role to play. We can’t wait for governments, platforms, or AI to fix everything for us. We have to become the “white blood cells” of the online ecosystem, challenging misinformation and injecting reason into conversations. It’s not always fun, and it often leads to friction, but it’s vital to keeping the broader system healthy.
Concluding Thoughts
We live in an era where reality is up for grabs, a time when different factions occupy entirely separate “worlds” within the same World Wide Web. Personally, I’ve found it unsettling and, to be honest, sometimes exhausting. But it’s also an opportunity. We can choose to leave our bubbles, or at least poke a few holes in them. We can learn to pause and question instead of scrolling and nodding. We can challenge ourselves, and each other, to see the world not just as we want it to be, but as it truly is.
I’m reminded of one of my favorite quotes from Founding Father John Adams:
“Facts are stubborn things; and whatever may be our wishes, our inclinations, or the dictates of our passions, they cannot alter the state of facts and evidence.” - John Adams, 1770
In these fractured times, making that choice becomes an almost radical act. Yet it might be the only way to bridge our divides and piece together our shared reality. After all, if we don’t do it, who will?