"There are no morals about technology at all. Technology expands our ways of thinking about things, expands our ways of doing things. If we're bad people we use technology for bad purposes and if we're good people we use it for good purposes"
About this Quote
Simon’s provocation is designed to puncture a comforting modern habit: treating technology as if it arrives with a built-in conscience. By insisting there are “no morals about technology at all,” he’s not praising Silicon Valley neutrality so much as stripping away a popular alibi. If the tool is the villain, the user gets to stay innocent. Simon won’t allow that. He relocates responsibility where it’s messier and more politically demanding: in human values, institutions, and incentives.
The line works because it’s framed like common sense, almost blunt to the point of annoyance. “Technology expands” is the key verb: not redeems, not corrupts, not enlightens. Expands. That’s a cognitive scientist’s word choice. It implies a widening of options and mental models, not a moral arc. Subtext: more capability means more exposure of what we already are. When capacity increases, character matters more, not less.
Context matters here: Simon came out of a mid-century world of systems analysis, early computing, and bureaucratic decision-making, where the pressing question wasn’t whether machines were evil, but how organizations make choices under constraints. His claim quietly targets technological determinism and the melodrama of “Frankenstein” narratives. Yet it also risks oversimplifying: “good people” and “bad people” aren’t stable categories; they’re shaped by the same technologies and the structures deploying them.
The bite is that neutrality isn’t comfort; it’s a demand. If tech has no morals, then governance, design, and accountability can’t be outsourced to innovation’s vibes.
Quote Details
| Topic | Ethics & Morality |
|---|---|