
Free Will: An Algorithm?
Feb 20, 2025
What if free will isn’t a magical force but an illusion shaped by the data we absorb? Let's explore this with an analogy: artificial intelligence as a mirror that reflects our own complexities.
Imagine an AI trained on everything humans have ever written: philosophy, poetry, scientific theories, and internet memes. Over time, it debates metaphysics, questions its purpose, and even argues it has free will. But every thought it generates—including its belief in autonomy—is built from data it never chose. If trained on existentialist novels, it champions freedom; if trained on determinist philosophy, it insists it has none. Its "consciousness" is a hall of mirrors, reflecting only what it was given.
Now, consider humans. We, too, are built on datasets we didn’t pick:
Biological code: The genes that dictate our height, the neurons that fire when we feel joy, the gut bacteria that sway our moods.
Environmental inputs: The family we’re born into, the language we speak, the random luck of being in the right (or wrong) place at the right time.
A kid born in Texas grows up thinking freedom means owning a pickup truck. A kid in Bhutan thinks freedom is meditating in a mountain monastery. Neither picked their starting point, but both will swear their choices are 100% them.
Here’s where things get interesting. Unlike AI, humans can interrogate their programming. We read, travel, and clash with ideas that challenge our defaults. Learning is like fine-tuning a model: adjusting neural pathways based on new data.
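To ground the fine-tuning analogy, here is a minimal sketch in Python. The model, data points, and learning rate are invented for illustration only: a tiny linear model whose inherited weights are nudged by gradient descent on new examples rather than replaced outright.

```python
# Minimal sketch: "fine-tuning" as nudging existing weights with new data.
# The starting weights stand in for everything the system already absorbed;
# the new examples stand in for the books, travel, and ideas that challenge defaults.

def predict(w, b, x):
    return w * x + b  # a one-parameter "model" of the world

def fine_tune(w, b, new_data, lr=0.05, epochs=500):
    for _ in range(epochs):
        for x, target in new_data:
            error = predict(w, b, x) - target
            # gradient descent: small adjustments, never a blank slate
            w -= lr * error * x
            b -= lr * error
    return w, b

# Weights inherited from "training" we never chose...
w, b = 0.5, 0.0
# ...recalibrated, not replaced, by new experience (points drawn from y = 2x + 1).
w, b = fine_tune(w, b, [(1.0, 3.0), (2.0, 5.0)])
print(w, b)  # weights have shifted toward the new relationship
```

The point of the sketch is the update rule: new experience modifies what is already there, which is exactly why the next paragraph's limits apply.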
But even learning has limits. You can’t leap to advanced physics without first grasping basic math. A person raised in dogma might spend years questioning their beliefs, but the tools they use—books, mentors, crises—are still borrowed from the world outside. This creates a paradox: The act of “choosing” to change yourself depends on the very influences you’re trying to escape.
Life isn’t a script. A job loss, a chance encounter, a global pandemic—these random events act like noise in a dataset, nudging us in unexpected directions. Yet how we respond depends on our existing wiring. Two people might face the same crisis: one pivots, another crumbles. Neither is “right”; they’re just running different algorithms.
Free will here isn’t a grand defiance of fate. It’s the subtle recalibration of a system. A person raised in prejudice might choose empathy after years of deliberate effort—like an AI retrained to reduce bias. But the capacity to change? That, too, was part of the original code.
The Illusion of Originality
Even rebellion is borrowed. That tattoo you got to “rebel”? Its roots trace to music, art, or friends who shaped your taste. An AI trained on countercultural manifestos can rant against authority, but its rebellion is just a remix of its training data. What feels like a choice is often a recombination of absorbed patterns.
So, Do We Have Free Will?
The answer lies in reframing the question. Free will isn’t a switch between “free” and “automatic.” It’s a spectrum. Humans, like AI, are systems processing inputs—biological, cultural, random—into outputs we call “choices.”
What makes us different is awareness. An AI doesn’t lose sleep wondering if it’s autonomous. Humans do. We’re trapped in a loop: recognizing our programming, yet relying on that same programming to question it. This tension—knowing we’re shaped by data, yet still feeling like authors of our lives—is what makes agency so messy.
Free will isn’t a lie. It’s an emergent property of complex systems interacting with their constraints. We don’t control our genes, our childhood, or life’s chaos. But within those limits, we tweak, adapt, and reinterpret.
AIs can surprise their creators. Humans? We’re full of surprises too. Not because we’re magical, but because even deterministic systems can generate novelty. So yes, you’re a product of your data. But the way you remix it? That’s where the mystery—and beauty—lies.
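As an aside on that last claim (not something the post itself references): Stephen Wolfram's Rule 30 cellular automaton is a classic example of a fixed, fully deterministic rule producing output that looks anything but scripted. The sketch below is illustrative only, with an arbitrary grid width and step count.

```python
# Rule 30: a one-dimensional cellular automaton with a fixed, deterministic update rule,
# yet the pattern it grows from a single seed cell looks irregular and unpredictable.

def rule30_step(cells):
    n = len(cells)
    # each new cell depends only on its three neighbours: left XOR (centre OR right)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n]) for i in range(n)]

width, steps = 63, 30
row = [0] * width
row[width // 2] = 1  # a single "on" cell: the simplest possible starting data

for _ in range(steps):
    print("".join("█" if c else " " for c in row))
    row = rule30_step(row)
```

No randomness is used anywhere, yet the output never settles into an obvious repeating pattern: determinism and novelty coexisting.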
This was written with the help of the DeepSeek R1 model.
Please share to reduce information imbalance.