Why Technology Is Not Morally Neutral
And into this Ring he poured his cruelty, his malice and his will to dominate all life.
-Galadriel, about Sauron and the Making of the Great Rings
Galadriel’s words about Sauron’s Ring of Power in The Lord of the Rings highlight a universal truth about creation: tools are never just tools. Like Sauron’s Ring, technology carries the intent and values of its creator. It is shaped by design choices that encode priorities, biases, and purposes. Technology may seem like a neutral entity, but it is anything but.
We often fall into the trap of thinking that technology is neither good nor bad—that morality emerges only in its use. But this ignores a critical reality: technology shapes how we live, act, and think. Its moral weight isn’t an afterthought; it’s embedded in its very nature.
The Embedded Should
The most effective way of helping people remember is to make it unnecessary.
-Donald A. Norman, The Design of Everyday Things
At its core, technology is about solving problems. Every design has a purpose, which implies a “should.” The chair designer assumes you should sit, and the creator of a hammer implies you should drive nails. These “shoulds” may seem simple, but they are not neutral. They embed decisions about how we act, think, and interact with the world.
For example, consider the difference between a bicycle and a car. Both are forms of transportation, but their embedded “shoulds” diverge drastically. A bicycle implies self-reliance, physical exertion, and environmental consciousness. A car implies speed, convenience, and the prioritization of individual over collective movement. The design of each technology shapes not just what we can do but what we are encouraged—or discouraged—from doing.
This embedded “should” becomes even more consequential with complex technologies like social media platforms or artificial intelligence. Social media doesn’t just allow communication; it incentivizes specific behaviors, like frequent engagement and emotional responses. These design choices reflect priorities—profit, control, or influence—that may not align with the user’s best interests.
Technology, then, isn’t just a passive tool. It’s an active participant, nudging users toward certain actions and values. To believe it’s morally neutral is to ignore the power of design to embed priorities into our lives.
The Implied Should
Technology is neither good nor evil. We can use it for either end. It acquires moral weight according to its use. The Baby Mutilator is a piece of technology. Therefore, the Baby Mutilator is good just as long as we use it responsibly.
-Catholic Moral Theology 101
This quote, absurd as it sounds, underscores the flaw in the argument for technological neutrality. If technology is neutral, even the most morally abhorrent tools could theoretically be “good” if used responsibly. However, this logic ignores a critical truth: technology design inherently suggests how it will be used.
Every piece of technology implies a purpose, explicitly stated or subtly embedded within its design. A knife implies cutting; a gun implies shooting; and, yes, a Baby Mutilator implies harm. The claim that such tools are morally neutral until used is a comforting fiction—it absolves us from questioning the morality embedded in the tools themselves.
Take social media algorithms as a modern example. These systems are ostensibly neutral, built simply to show users content they find engaging. However, the algorithms are designed with a specific goal: maximizing time spent on the platform. They prioritize emotionally charged or polarizing content to achieve this, often amplifying outrage and division. While the technology isn’t forcing anyone’s hand, its design nudges users toward behaviors and experiences with profound moral implications.
Social media's implied “should” is clear: you should stay engaged, interact, and keep coming back. This isn’t a neutral suggestion—it’s a directive born from the priorities of its creators, typically profit-driven corporations.
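The value judgment hidden in an engagement-maximizing ranker can be made concrete with a toy sketch. The field names and weights below are invented for illustration—this is not any real platform’s algorithm—but the structure shows where the “should” lives: in the weights a designer chooses.

```python
# Hypothetical sketch: a toy feed-ranking function illustrating how an
# engagement-maximizing objective encodes a value judgment. The field
# names and weights are invented, not taken from any real platform.

def rank_feed(posts):
    """Order posts by predicted engagement, highest first."""
    def engagement_score(post):
        # The weights ARE the moral choice: outrage-laden content is
        # rewarded because it reliably drives clicks and comments.
        return (
            2.0 * post["outrage"]        # polarizing content amplified
            + 1.5 * post["novelty"]      # fear of missing out
            + 1.0 * post["social_proof"] # likes beget likes
            + 0.0 * post["accuracy"]     # truthfulness carries no weight
        )
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "calm-news", "outrage": 0.1, "novelty": 0.5,
     "social_proof": 0.6, "accuracy": 0.9},
    {"id": "hot-take",  "outrage": 0.9, "novelty": 0.7,
     "social_proof": 0.4, "accuracy": 0.2},
]
ranked = rank_feed(posts)
print([p["id"] for p in ranked])  # the outrage-heavy post ranks first
```

Nothing in this sketch is malicious line by line; the harm is in the objective itself. Changing one coefficient—say, weighting accuracy instead of outrage—would produce a different feed and a different world, which is exactly the point.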
Technology, then, is not a blank slate. Its design and implied purpose direct its moral trajectory. Even when creators claim neutrality, the embedded priorities of their designs betray them. Ignoring this reality only allows harmful “shoulds” to flourish unchecked.
The Junkie Jangle
When you give a heroin addict a needle, she’s going to ask for another hit.
-From the rejected children’s book, When You Give a Bitch a Slap
Addiction isn’t just a possibility with certain technologies—it’s often the intended outcome. Many tools are deliberately designed to exploit human psychology, creating cycles of dependency and compulsion. This is what we might call the “junkie jangle,” the insidious lure of technologies that hook us and keep us coming back for more.
Social media platforms are a prime example. The endless scroll, the dopamine hit of a “like,” and the anxiety of missing out are not accidents. They are design features crafted to maximize user engagement. Each notification is a little jangle, drawing users back to the platform like a junkie to their next fix. The moral implications of such a design are staggering.
Consider the ethical failure here: technology that deliberately fosters addiction undermines autonomy. It isn’t just neutral infrastructure waiting to be used responsibly; it’s a predator preying on human vulnerabilities. The designers know this, and they exploit it anyway, cloaking their intent in the language of “connection” or “efficiency.”
This isn’t limited to social media. Gambling apps, video games, and even subscription services are increasingly built with mechanisms designed to create dependency. Loot boxes in games, streak rewards in apps, and auto-renewals in streaming services—these features manipulate users into repetitive engagement, often at a cost to their mental health or financial stability.
The junkie jangle exposes the fallacy of neutrality. Technology that feeds addiction is not a tool—it’s a trap. Its moral weight isn’t determined by how users interact with it but by the intent embedded in its design. To ignore this is to allow exploitation to masquerade as innovation.
The Tradeoff & Design Argument
This deal is getting worse all the time.
-Lando Calrissian, The Empire Strikes Back
Every piece of technology is the result of design, and every design requires tradeoffs. These tradeoffs are not random—they arise because creators make value judgments about what is prioritized, what is sacrificed, and what is deemed acceptable.
Consider the smartphone. It’s a marvel of modern engineering, combining communication, entertainment, and productivity into a pocket-sized device. But its design reflects countless tradeoffs: portability is prioritized over repairability, convenience over privacy, and sleekness over durability. None of these tradeoffs are neutral. They are deliberate choices driven by the values of the designers, manufacturers, and corporations behind them.
Tradeoffs occur because no product can do everything equally well. A chair designed for comfort may sacrifice portability, while a car designed for speed may compromise safety. Each decision reflects what the creator values most. This is where morality enters the equation—tradeoffs reveal the ethical priorities embedded in the design process.
Take self-driving cars as an example. Developers must grapple with ethical dilemmas like the “trolley problem”: should a car prioritize the safety of its passengers or pedestrians? The technology doesn’t make this decision; the designers do. Their judgment determines how the car will act in moments of crisis, embedding a moral framework into the code.
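How a designer’s judgment becomes literal code can be sketched in a few lines. Everything here is hypothetical—the policy flag, the scenario fields, the risk numbers—no real autonomous-vehicle system is being quoted. The point is that the moral framework is a human-made constant, fixed at design time, not something the car decides.

```python
# Hypothetical sketch: a designer's ethical priorities baked into a
# crash-avoidance routine. The policy constant and scenario fields are
# invented for illustration only.

# The "moral framework" is a single human choice made at design time.
PRIORITIZE_PASSENGERS = True  # the designers decided this, not the car

def choose_maneuver(options):
    """Pick between bad options in an unavoidable-collision scenario."""
    if PRIORITIZE_PASSENGERS:
        # Value hierarchy A: minimize expected harm to occupants,
        # whatever the cost to those outside the vehicle.
        return min(options, key=lambda o: o["passenger_risk"])
    # Value hierarchy B: minimize total expected harm to everyone.
    return min(options, key=lambda o: o["passenger_risk"] + o["pedestrian_risk"])

scenario = [
    {"name": "swerve", "passenger_risk": 0.7, "pedestrian_risk": 0.1},
    {"name": "brake",  "passenger_risk": 0.2, "pedestrian_risk": 0.6},
]
print(choose_maneuver(scenario)["name"])  # "brake": passengers win by design
```

Flip the constant and the same car, in the same crisis, acts differently. The ethics were never in the machine; they were in the commit.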
Even seemingly mundane tradeoffs carry weight. A social media app that prioritizes engagement over user well-being makes a moral statement: profit is more important than mental health. A delivery service that values speed over fair labor practices sends a similar message.
The reality is this: design is never neutral. Every tradeoff reflects a value hierarchy that influences how technology shapes our lives. Ignoring these judgments doesn’t make them disappear; it simply allows them to operate unchecked.
Technology, then, isn’t just a tool. It’s a reflection of human priorities, flaws, and biases. The tradeoffs inherent in every design are moral decisions, whether we acknowledge them or not.
The Opportunity Cost
I saw my life branching out before me like the green fig tree in the story. From the tip of every branch, like a fat purple fig, a wonderful future beckoned and winked…I wanted each and every one of them, but choosing one meant losing all the rest, and, as I sat there, unable to decide, the figs began to wrinkle and go black, and, one by one, they plopped to the ground at my feet.
-Sylvia Plath, The Bell Jar
Every tool we use and technology we adopt has an opportunity cost. To choose one path is to forsake others. Each tool requires us to invest time, resources, and attention—finite commodities that might otherwise be directed elsewhere.
Consider the smartphone again. Its utility is undeniable, but it also consumes hours of our day that could be spent on face-to-face interaction, deep work, or leisure untouched by screens. The opportunity cost isn’t just the time it takes; it’s the alternatives we’ve abandoned in favor of constant connectivity.
Similarly, reliance on a car means prioritizing convenience over walking or cycling. Each trip taken by car erases the potential benefits of slower, more physically engaging transportation, from improved health to environmental sustainability. This isn’t to say cars are inherently bad, but the tradeoff they represent is significant.
Yet, rejecting technology altogether isn’t a solution to the problems it creates. For example, refusing to use a car doesn’t erase the infrastructure prioritizing highways over bike lanes or public transit. Opting out of social media doesn’t undo its grip on cultural, political, and economic systems. Avoidance is not a panacea; it merely shifts the burden elsewhere.
The real issue is how technology reshapes our lives, often without us fully understanding the costs. Before adopting any tool, we should ask: What am I giving up in exchange? What values am I prioritizing, and what am I leaving behind?
Technology forces us to make choices—choices that carry moral weight. The branches of Plath’s fig tree are not infinite. We must decide carefully, knowing that each decision defines how we live and what we lose.
The Consequences of Ignoring Technology’s Moral Dimensions
When we pretend that technology is morally neutral, we invite a host of unintended consequences. Tools designed without consideration for their moral implications often amplify harm, exacerbate inequality, and erode the fabric of society. The cost of this ignorance is far-reaching, affecting individuals, communities, and even the planet.
Take social media, for example. Platforms that prioritize engagement have created environments rife with misinformation, polarization, and addiction. These are not accidental side effects—they are consequences of design decisions to maximize profits. Ignoring the moral dimensions of these technologies has allowed them to shape public discourse in profoundly damaging ways.
Similarly, the unchecked development of surveillance technology has led to widespread privacy violations. Cameras on every street corner, algorithms analyzing our online behavior, and biometric tracking tools are marketed as conveniences or security measures. But their design reflects a troubling moral calculus: sacrificing individual freedom for the illusion of safety. By failing to challenge the morality embedded in these tools, we accept a world where privacy becomes a luxury instead of a right.
Environmental consequences also loom large. Technologies prioritizing short-term convenience—like disposable plastics or fossil-fuel-powered machinery—ignore long-term sustainability. These tools reflect a value system that prizes immediacy over stewardship, leaving future generations to bear the burden of our choices.
Ignoring the moral weight of technology doesn’t make these consequences disappear—it ensures they grow unchecked. When we fail to question the values embedded in our tools, we allow harm to proliferate under the guise of progress.
The solution isn’t to reject technology but to engage with it critically. We must ask difficult questions about its design, tradeoffs, and consequences. Who benefits from this tool? Who is harmed? What values does it promote, and what does it erode? Only by confronting these questions can we begin to shape a future where technology serves humanity, not vice versa.
Conclusion
Many that live deserve death. And some that die deserve life. Can you give it to them? Then do not be too eager to deal out death in judgement. For even the very wise cannot see all ends.
-Gandalf, about Gollum
Galadriel’s warning about Sauron’s Ring speaks to a truth that resonates far beyond Middle-earth: technology carries the intent of its creator. Like the Ring, our tools are never neutral. They are imbued with the values, priorities, and judgments of those who design them. Whether good or ill, they shape our choices, behavior, and society.
To claim that technology is morally neutral is to ignore its power. Every tool directs us toward certain actions and away from others. Every design carries tradeoffs born from value judgments that have moral weight. Every choice we make about technology—adopting, rejecting, or reforming it—reshapes our world.
Recognizing this reality is not an exercise in cynicism but an opportunity for accountability. We cannot afford to accept the technologies we are given passively. Instead, we must critically engage with their design and intent, questioning who benefits, who suffers, and what is truly gained.
The future of technology is not predetermined. It reflects human choices, priorities, and morality. By understanding the values embedded in our tools, we can begin to shape a world in which technology enhances our humanity rather than diminishes it.
Technology is never just a tool. It’s a mirror. The question is: What do we want it to reflect?


