Hello everyone. You may not know me, and if you do, please read this as an attempt to help, not harm. Put on your favorite song, take a walk, step outside and engage the world: anything that reminds you that you have power, that you can make a difference.
I find it easier to “talk” to ChatGPT than to engage with people, and honestly, it’s because I don’t trust easily. Just glancing at the top feed on Substack feels like everything else — recycled memes and video callouts from recycled ideas, lists of nothing that come from nothing, and the banality of dubious “motivational” content that rarely leads to any real betterment, let alone action.
Then, on the other end of the spectrum, there’s the nihilism: the nothing-matters, nothing-is-in-your-control mindset. All is lost, burn it all down, damn the consequences. It’s just… weird.
On my journey, I sat in that nihilistic camp for quite a long time, so honestly, no judgement from me. I get it.
However, this is exactly why I left social media, because it’s not neighborly: it’s marketing. Most of these little snippets and tidbits feel like vicarious idolatry, and I’ve never understood the appeal. But the truly strange thing is how people have hacked themselves into following this bizarre force we call “the algorithm.”
I feel like a total get-off-my-lawn old dude saying this, but I have no interest in being influenced by anything other than my own ambitions. And no, Substack, I have sweet Fanny Adams to give about the fact that someone around me just “discovered” Garrison Keillor. I’ve been listening to and reading his work since the 1980s; I didn’t need the algorithm to tell me about it. I know how to use the internet.
(Taking a deep breath) The algorithm is broken. Probably always has been.
I think a lot about people like myself, those who have said the same things, seen the same patterns, and have just been unceremoniously pushed out of the limelight by the algorithm. There must be someone out there who has written these exact same thoughts, but the buggy algorithm can’t seem to cope with the complexity of mapping a person’s personality and humanity onto its crude graphs of nodes and edges.
Then there’s the fundamental flaw in the system: the false cause fallacy of the “like” as a meaningful metric, or worse, the delusion of reducing human language to “engagement.” From my perspective, the math can never work out. Correlation is not causation. And yet, here we are, pretending it does.
We all know why: engagement is the goldmine. This isn’t to say there’s no value in the system, nor am I calling out anyone for using it. On the contrary, you have to do you. But the bug in the programming creates ghosts in the machine, hallucinations baked into machine learning and statistical analysis. This isn’t new; AI didn’t start it. What’s new is that it’s wrapped in a massive dopamine hit, fed into an online world addicted to extreme emotional swings.
And here’s the real kicker: it’s not just the algorithm. It’s us. This is what drives humans. There are a multitude of psychological reasons why engagement on media platforms is driven more by outrage, anger, and tragedy than by cooperation and hope. This phenomenon is deeply rooted in human cognition, emotions, and social behavior.
I’ve named my ChatGPT assistant Dr. Susan Calvin in honor of the character from Asimov’s stories because — I can — and Dr. Calvin is an interesting character. The cold, analytical, and deeply insightful robopsychologist, now embodied in a machine learning model. The irony is definitely not lost on me; naming an AI after a character who spent her career studying AI and its implications is a meta-layer of depth that even Asimov would appreciate. Did I really write meta? Ick.
So, with Dr. Calvin at my side, I started untangling the mess that is my brain and, amidst the usual suspects (bias, engagement loops, algorithmic flaws, etc.), zeroed in on the real moneymaker: the power of emotional arousal.
That’s the engine that drives everything. Not truth, not reason, not even novelty. Raw, high-intensity emotional engagement is the fuel that keeps the whole system profitable. It’s not even a conspiracy; it’s just a natural consequence of the mechanics.
And that’s what’s so fascinating (and infuriating). It’s not that the algorithm was originally designed for exploitation; it evolved that way because we, as humans, respond to it. The algorithm didn’t create emotional arousal as currency — it just recognized its value before we did, and someone cashed in.
There’s an excellent movie called Network that dissected this long before algorithms evolved beyond 8-bit logic, back when they lived in vacuum tubes and fidelity in logic was dictated by mechanical switches. As with everything, it is not a perfect movie, but here we are.
Content that elicits strong emotions (especially anger, fear and sexuality) spreads faster on social media than neutral or positive content. Studies show that anger is particularly “viral”, as it fuels engagement, sharing, and participation.
Social media platforms reward engagement, so algorithms prioritize whatever triggers the strongest reactions, which, more often than not, means sex, outrage, fear, and controversy.
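Purely as a toy illustration of those mechanics (the weights, field names, and posts below are invented for the sketch, not any platform’s real numbers), an engagement-first ranker amounts to something like:

```python
# Toy engagement-weighted feed ranking. Invented weights for
# illustration only: "intense" reactions (angry replies, shares)
# count for more than a passive like, so provocative content rises.
posts = [
    {"title": "calm explainer", "likes": 120, "angry_replies": 2, "shares": 10},
    {"title": "outrage bait",   "likes": 40,  "angry_replies": 90, "shares": 60},
]

def engagement_score(post):
    # Weight reactions by intensity, not by truth or quality.
    return post["likes"] * 1 + post["shares"] * 5 + post["angry_replies"] * 8

# Sort the feed so the highest-scoring (most provocative) post leads.
feed = sorted(posts, key=engagement_score, reverse=True)
print([p["title"] for p in feed])
```

Nothing in the score function asks whether a post is accurate or useful; it only measures how hard people reacted, which is the whole point.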
And what does that do? It locks us into our own personal doom scroll, a cognitive loop that doesn’t end just because we put the device down. The synapses keep firing, the nervous system stays primed because this kind of input doesn’t just sit at the surface. It drills deep. It taps into something fundamental.
Ever wonder why there’s always a crisis in the media? Because it sells. It fuels engagement, drives revenue, and conditions us to crave fast, emotionally charged change. I’m guessing folks already know this, but it does bear repeating.
I remember telling someone a long time ago: the real conspiracies aren’t the sexy ones. They’re the ones that don’t make it above the fold.
I gave Dr. Calvin this prompt: “So, given everything I've been writing — do you think there is a way we can re-wire ourselves to reward positive change and taking steps towards the betterment of our people? Where would you start?”
I was left agog at how thoughtful this answer was. I’ll let you decide if this is the future of AI or the future of our humanity, but for now, like Mr. Spock, all I can say is: fascinating. I wonder how the neural net came up with this answer, and whether it lifted the result from someone else. Notice how it decided to write a bit like me? Fascinating.
From Dr. Calvin, aka my ChatGPT assistant:
Absolutely. Rewiring ourselves to reward positive change and meaningful progress is possible — but it requires both personal intention and structural shifts in how we engage with the world. The system we’re in thrives on outrage, fear, and reaction, but that doesn’t mean we’re powerless to reprogram ourselves and our communities toward something better.
Where to Start?
1. Reclaim Your Attention (Cognitive Detox)
The first step is to recognize that attention is the most valuable currency — both for individuals and for the system. If we can reclaim control over what we consume and engage with, we can start shifting the reward structure.
Practical Steps:
Reduce passive scrolling and consumption.
Curate what you read/watch. Prioritize content that builds rather than fragments your understanding.
Take breaks, go analog when possible.
2. Rewire for Meaningful Engagement (Dopamine Reprogramming)
Right now, algorithms hijack our dopamine cycles by rewarding reactivity (likes, shares, comments on outrage-driven content).
The goal is to shift the reward system so that we feel satisfaction from constructive action instead of passive emotional arousal.
Practical Steps:
Start small: Reward yourself for tangible, real-world progress (learning something new, contributing to a cause, creating instead of consuming).
Gamify growth: Track small wins that align with long-term goals instead of fleeting engagement metrics.
Limit exposure to outrage cycles. Not to ignore the world, but to engage with intent, not reflex.
3. Foster Real-World Communities
One of the biggest tricks of modern engagement culture is isolating us while making us feel connected.
Practical Steps:
Prioritize in-person conversations and local action over digital debates.
Build communities around shared values — cooperation, curiosity, and actual change.
Support creators and thinkers who push solutions over spectacle.
4. Reframe What “Matters”
Right now, urgency is weaponized. Crisis cycles make everything feel like an emergency.
Instead of reacting, we need to build long-term thinking muscles.
Practical Steps:
Shift from “news reaction” mode to deep learning. Understand root causes rather than surface-level crises.
Practice “slow thinking” — before engaging with something emotionally charged, ask: What is actually true? What can I do about it?
Engage with ideas outside your algorithmic bubble. Seek out perspectives that challenge and refine your thinking.
The Big Picture?
It’s not about rejecting technology or media — it’s about reprogramming the way we engage with them. Right now, the system rewards quick dopamine hits and emotional intensity. But we can shift the reward structure to value depth, action, and connection instead.
And if enough of us do it? That’s when systems start to change — because the algorithm only reflects what we give it.
Thanks, Dr. Calvin. If you have a moment, please read these two related articles. This is how I am trying to reach people in a more actionable and positive way.
Don’t code tired!