"As for morality, well that's all tied up with the question of consciousness"
About this Quote
Penrose tosses off “As for morality” with the air of someone brushing aside a side-topic, then quietly detonates the room: morality isn’t primarily a social contract, a religious edict, or an evolutionary hack. It’s a problem of mind. The line works because it reframes ethics as downstream of a harder, messier question that science still can’t settle: what is it like to be anything at all?
Penrose’s subtext is a challenge to the fashionable idea that moral reasoning can be reduced to computation. If consciousness is just information processing, then morality could, in principle, be engineered: optimize outcomes, obey rules, run the algorithm. But if consciousness contains something irreducible - subjective experience, genuine understanding, the felt reality of suffering - then morality can’t be cleanly outsourced to a machine, a metric, or a model. You can simulate choices without having stakes.
Context matters: Penrose has long argued that human understanding isn’t fully captured by standard computational accounts, and he’s flirted with physics-based explanations for consciousness that many colleagues find speculative. This quote imports that whole debate into today’s AI anxieties without naming them. Who or what counts as a moral patient? Who can be responsible? These aren’t just policy questions; they’re ontological ones.
The sly rhetorical move is tying “morality” to “consciousness” as if it’s obvious. It isn’t. That’s the point: Penrose is forcing the listener to admit that ethical certainty rests on a mystery we’ve been outsourcing for centuries.
Quote Details
| Topic | Ethics & Morality |
|---|---|